This document provides an overview of an upcoming course on inverse problems and regularization. The course covers three topics: inverse problems, compressed sensing, and sparsity with L1 regularization. Inverse problems involve recovering an unknown signal x0 from noisy observations; regularization incorporates prior information to make the recovery well-posed. Compressed sensing allows signals to be sampled below the Nyquist rate when they are sparse. The L1 norm serves as a convex relaxation of the sparsity prior, so that sparse recovery problems can be solved as convex programs.
10–13. Inverse Problem Regularization
Observations: y = Φ x0 + w ∈ R^P.
Estimator: x(y) depends only on the observations y and a parameter λ.
Example: variational methods
    x(y) ∈ argmin_{x ∈ R^N} (1/2)||y − Φx||² + λ J(x)
where (1/2)||y − Φx||² is the data fidelity term and J(x) the regularity term.
Choice of λ: tradeoff between the noise level ||w|| and the regularity J(x0) of x0.
No noise: as λ → 0+, minimize
    x* ∈ argmin_{x ∈ R^Q, Φx = y} J(x)
This course: performance analysis and fast computational schemes.
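As a concrete instance of the variational method above, here is a minimal sketch with the quadratic regularizer J(x) = ||x||², a choice made here only for illustration (it yields a closed-form minimizer; the course treats general J). The operator Φ, the signal x0, the noise level, and λ are all illustrative assumptions.

```python
import numpy as np

# Variational regularization with the quadratic prior J(x) = ||x||^2:
# the minimizer of (1/2)||y - Phi x||^2 + lam * ||x||^2 satisfies
# Phi^T (Phi x - y) + 2*lam*x = 0, i.e. x = (Phi^T Phi + 2*lam*I)^{-1} Phi^T y.

rng = np.random.default_rng(0)
P, N = 50, 100                                   # fewer observations than unknowns
Phi = rng.standard_normal((P, N)) / np.sqrt(P)   # forward operator
x0 = rng.standard_normal(N)
w = 0.05 * rng.standard_normal(P)                # noise w, small ||w||
y = Phi @ x0 + w                                 # observations

lam = 0.1                                        # tradeoff parameter lambda
x_hat = np.linalg.solve(Phi.T @ Phi + 2 * lam * np.eye(N), Phi.T @ y)

# Larger lam favors regularity (small J), smaller lam favors data fidelity.
residual = np.linalg.norm(y - Phi @ x_hat)
```

Since x = 0 is feasible, the minimizer always achieves a residual no larger than ||y||; tuning λ against ||w|| is exactly the tradeoff discussed above.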
18–19. CS Acquisition Model
CS is about designing hardware: input signals f̃ ∈ L²(R²).
Physical hardware resolution limit: target resolution f ∈ R^N.
[Figure: acquisition pipeline. The analog input in L²(R²) is discretized by a micro-mirror array at resolution N, giving x0 ∈ R^N; the CS hardware then outputs measurements y ∈ R^P.]
The CS hardware implements an operator Φ: each entry of y correlates x0 with one mirror configuration, so y = Φ x0.
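The acquisition model can be sketched numerically. A random Gaussian Φ stands in here for the physical measurement hardware (an illustrative assumption, as are the sizes); the point is that each of the P entries of y is an inner product of x0 with one row of Φ, with P < N.

```python
import numpy as np

# CS acquisition model y = Phi x0: each measurement is the inner product
# of the high-resolution signal x0 in R^N with one row of Phi (one mirror
# configuration). A random Gaussian Phi stands in for the hardware.

rng = np.random.default_rng(1)
N, P = 256, 64                                   # P < N: fewer measurements than samples
x0 = rng.standard_normal(N)
Phi = rng.standard_normal((P, N)) / np.sqrt(P)

y = Phi @ x0                                     # the P recorded measurements

# Equivalent measurement-by-measurement view:
y_check = np.array([Phi[i] @ x0 for i in range(P)])
```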
27–29. Sparse Priors
Ideal sparsity: for most m, xm = 0.
    J0(x) = #{m : xm ≠ 0}
Sparse approximation: f = Ψx where
    x ∈ argmin_{x ∈ R^N} ||f0 − Ψx||² + T² J0(x)
[Figure: image f0 and its coefficients x.]
Orthogonal Ψ (Ψ Ψ* = Ψ* Ψ = Id_N): the solution is given coefficient by coefficient,
    xm = ⟨f0, ψm⟩ if |⟨f0, ψm⟩| > T, and xm = 0 otherwise,
i.e. f = S_T^Ψ(f0), where S_T is the hard thresholding operator.
Non-orthogonal Ψ: NP-hard.
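In the orthogonal case the ℓ0 problem decouples per coefficient, so it is solved exactly by hard thresholding. A minimal sketch, taking Ψ = Id for simplicity (so the coefficients are the entries of f0 themselves); the vector f0 and threshold T are illustrative.

```python
import numpy as np

# For orthogonal Psi, argmin ||f0 - Psi x||^2 + T^2 J0(x) is solved by
# hard thresholding the coefficients <f0, psi_m>: keeping a coefficient
# costs T^2, dropping it costs |<f0, psi_m>|^2, so keep it iff |.| > T.
# Here Psi = Id, so the coefficients are simply the entries of f0.

def hard_threshold(c, T):
    """Hard thresholding S_T: zero out coefficients with |c_m| <= T."""
    return np.where(np.abs(c) > T, c, 0.0)

f0 = np.array([3.0, -0.2, 0.5, -2.5, 0.1])
T = 1.0
x = hard_threshold(f0, T)        # sparse coefficient vector
J0 = np.count_nonzero(x)         # J0(x): number of nonzero coefficients
# x is [3.0, 0.0, 0.0, -2.5, 0.0], so J0(x) = 2
```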
37. L1 Regularization
    x0 ∈ R^N (coefficients) → f0 = Ψ x0 ∈ R^Q (image) → K → y = K f0 + w ∈ R^P (observations)
    Φ = K Ψ ∈ R^{P×N}
Sparse recovery: f = Ψx where x solves
    min_{x ∈ R^N} (1/2)||y − Φx||² + λ ||x||₁
with (1/2)||y − Φx||² the fidelity term and λ||x||₁ the regularization.
40. Noiseless Sparse Regularization
Noiseless measurements: y = Φ x0.
L1 recovery:
    x* ∈ argmin_{Φx = y} Σm |xm|
compare with the L2 solution:
    x* ∈ argmin_{Φx = y} Σm |xm|²
[Figure: the L1 ball meets the affine constraint set {x : Φx = y} at a sparse point, whereas the L2 ball meets it at a dense one.]
The L1 problem can be recast as a linear program, hence convex. Algorithms:
    Interior points, cf. [Chen, Donoho, Saunders] “basis pursuit”.
    Douglas–Rachford splitting, see [Combettes, Pesquet].
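The linear-program formulation behind basis pursuit can be sketched directly: min ||x||₁ s.t. Φx = y becomes min Σ u subject to −u ≤ x ≤ u and Φx = y, solvable with a generic LP solver. Here scipy's `linprog` is used as a stand-in for a dedicated interior-point code; the sizes and the 3-sparse x0 are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit as a linear program: introduce u >= |x| entrywise,
# minimize sum(u) subject to x - u <= 0, -x - u <= 0, and Phi x = y.

rng = np.random.default_rng(2)
N, P = 40, 20
Phi = rng.standard_normal((P, N))
x0 = np.zeros(N)
x0[[3, 17, 31]] = [1.5, -2.0, 0.7]            # 3-sparse signal
y = Phi @ x0                                   # noiseless measurements

# LP variables z = (x, u) in R^{2N}
c = np.concatenate([np.zeros(N), np.ones(N)])  # objective: sum of u
A_ub = np.block([[np.eye(N), -np.eye(N)],      #  x - u <= 0
                 [-np.eye(N), -np.eye(N)]])    # -x - u <= 0
b_ub = np.zeros(2 * N)
A_eq = np.hstack([Phi, np.zeros((P, N))])      # Phi x = y
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * N)
x_bp = res.x[:N]                               # the basis pursuit solution
```

Since x0 itself is feasible, the L1 norm of the LP solution can never exceed ||x0||₁; when P is large enough relative to the sparsity, basis pursuit typically recovers x0 exactly.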
42–43. Noisy Sparse Regularization
Noisy measurements: y = Φ x0 + w.
Penalized form:
    x* ∈ argmin_{x ∈ R^Q} (1/2)||y − Φx||² + λ||x||₁
with (1/2)||y − Φx||² the data fidelity and λ||x||₁ the regularization.
Equivalence with the constrained form:
    x* ∈ argmin_{||Φx − y|| ≤ ε} ||x||₁
[Figure: the L1 ball meets the tube {x : ||Φx − y|| ≤ ε} at a sparse point.]
Algorithms:
    Iterative soft thresholding / forward-backward splitting, see [Daubechies et al], [Pesquet et al], etc.
    Nesterov multi-step schemes.
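The iterative soft thresholding algorithm listed above can be sketched in a few lines: a forward (gradient) step on the data-fidelity term alternates with a backward (prox) step, the prox of λ||·||₁ being soft thresholding. The problem sizes, λ, and iteration count are illustrative assumptions, not values from the course.

```python
import numpy as np

# Iterative soft thresholding (forward-backward splitting) for
#     min_x (1/2)||y - Phi x||^2 + lam * ||x||_1

def soft_threshold(c, t):
    """Prox of t*||.||_1: shrink each entry toward 0 by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def ista(Phi, y, lam, n_iter=500):
    L = np.linalg.norm(Phi, 2) ** 2                    # Lipschitz const. of the gradient
    tau = 1.0 / L                                      # step size
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ x - y)                   # forward (gradient) step
        x = soft_threshold(x - tau * grad, tau * lam)  # backward (prox) step
    return x

rng = np.random.default_rng(3)
N, P = 60, 30
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
x0 = np.zeros(N)
x0[[5, 22, 40]] = [2.0, -1.5, 1.0]                     # sparse ground truth
y = Phi @ x0 + 0.01 * rng.standard_normal(P)           # noisy measurements
x_hat = ista(Phi, y, lam=0.02)
```

With the step size 1/L, each iteration decreases the objective, which is why the scheme is a reliable (if slow) baseline; Nesterov multi-step variants such as FISTA accelerate the same two steps.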