Model Selection with Piecewise Regular Gauges

Gabriel Peyré
www.numerical-tours.com

Joint work with: Samuel Vaiter, Charles Deledalle, Jalal Fadili, Joseph Salmon
Overview
• Inverse Problems
• Gauge Decomposition and Model Selection
• L2 Stability Performances
• Model Stability Performances
Inverse Problems

Recovering $x_0 \in \mathbb{R}^N$ from noisy observations $y = \Phi x_0 + w \in \mathbb{R}^P$.

Examples: inpainting, super-resolution, compressed sensing.
Estimators

Observations: $y = \Phi x_0 + w \in \mathbb{R}^P$.

Regularized inversion:
$$x^\star(y) \in \underset{x \in \mathbb{R}^N}{\operatorname{argmin}}\ \frac{1}{2}\|y - \Phi x\|^2 + \lambda J(x) \qquad (\mathcal{P}_\lambda(y))$$
where the quadratic term is the data fidelity and $J$ promotes regularity.

Goal: performance analysis. Criteria on $(x_0, \|w\|, \lambda)$ to ensure:
• $L^2$ error stability: $\|x^\star(y) - x_0\| = O(\|w\|)$;
• stability of the promoted subspace (the "model").
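For $J = \|\cdot\|_1$ this estimator is the Lasso, which can be computed by iterative soft-thresholding (ISTA). The numpy sketch below is illustrative and not part of the slides; the random $\Phi$, sparsity level, noise scale and iteration count are assumptions.

import numpy as np

def ista_lasso(Phi, y, lam, n_iter=2000):
    """Solve min_x 0.5*||y - Phi x||^2 + lam*||x||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2                  # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = x - Phi.T @ (Phi @ x - y) / L            # gradient step on the data fidelity
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft thresholding
    return x

# illustrative compressed-sensing instance
rng = np.random.default_rng(0)
P, N = 50, 200
Phi = rng.standard_normal((P, N)) / np.sqrt(P)
x0 = np.zeros(N); x0[rng.choice(N, 5, replace=False)] = rng.standard_normal(5)
w = 0.01 * rng.standard_normal(P)
x_star = ista_lasso(Phi, Phi @ x0 + w, lam=0.02)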
Gauge Decomposition and Model Selection
Union of Linear Models for Data Processing

Union of models: $T \in \mathcal{T}$, a family of linear spaces.

• Synthesis sparsity: sparse coefficients $x$, image $\Psi x$.
• Structured sparsity: block-sparse coefficients.
• Analysis sparsity: image $x$ with sparse gradient $D^* x$.
• Low-rank: multi-spectral imaging, $x_{i,\cdot} = \sum_{j=1}^{r} A_{i,j} S_{j,\cdot}$.
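The multi-spectral mixing model is the matrix factorization $X = AS$, so the observed matrix has rank at most $r$; a tiny numpy check (sizes are illustrative assumptions):

import numpy as np

# x_{i,.} = sum_j A_{i,j} S_{j,.} is the product X = A @ S, hence rank(X) <= r.
rng = np.random.default_rng(1)
n_pixels, n_bands, r = 64, 8, 3
A = rng.standard_normal((n_pixels, r))   # mixing weights
S = rng.standard_normal((r, n_bands))    # source spectra
X = A @ S
assert np.linalg.matrix_rank(X) <= r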
Gauges for Union of Linear Models

Gauge: a convex function $J : \mathbb{R}^N \to \mathbb{R}^+$ that is positively homogeneous: $\forall\, \alpha \in \mathbb{R}^+,\ J(\alpha x) = \alpha J(x)$.

Equivalently, $J$ is the gauge of its unit ball $C = \{x : J(x) \leq 1\}$ (assuming $0 \in C$):
$$J(x) = \gamma_C(x) = \inf\{\rho > 0 : x \in \rho C\}.$$

Piecewise regular ball $\Longleftrightarrow$ union of linear models $(T)_{T \in \mathcal{T}}$:

• $J(x) = \|x\|_1$: $T$ = sparse vectors.
• $J(x) = |x_1| + \|x_{2,3}\|$: $T$ = block sparse vectors.
• $J(x) = \|x\|_*$: $T$ = low-rank matrices.
• $J(x) = \|x\|_\infty$: $T$ = anti-sparse vectors.
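A quick numeric sanity check of the gauge formula: for the $\ell^1$ unit ball, $\gamma_C$ recovers the $\ell^1$ norm. The bisection routine below is an illustrative sketch; only a membership test for $C$ is assumed.

import numpy as np

def gauge(x, in_C, hi=1e6, tol=1e-9):
    """gamma_C(x) = inf{rho > 0 : x in rho*C}, by bisection on rho,
    for a convex set C containing 0, given through a membership test."""
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if in_C(x / mid):          # x in mid*C  <=>  x/mid in C
            hi = mid
        else:
            lo = mid
    return hi

x = np.array([1.0, -2.0, 0.5])
in_l1_ball = lambda z: np.abs(z).sum() <= 1.0
print(gauge(x, in_l1_ball), np.abs(x).sum())   # both ~= 3.5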
Subdifferentials and Models

$$\partial J(x) = \left\{\eta \in \mathbb{R}^N : \forall\, y,\ J(y) \geq J(x) + \langle \eta,\, y - x \rangle\right\}$$

Example: $J(x) = \|x\|_1$, with support $I = \operatorname{supp}(x) = \{i : x_i \neq 0\}$:
$$\partial \|x\|_1 = \left\{\eta : \eta_I = \operatorname{sign}(x_I),\ \forall\, j \notin I,\ |\eta_j| \leq 1\right\}$$

Definition: $T_x = \operatorname{VectHull}(\partial J(x))^\perp$ and $e_x = \operatorname{Proj}_{T_x}(\partial J(x))$.

For $J = \|\cdot\|_1$: $T_x = \{\eta : \operatorname{supp}(\eta) \subset I\}$ and $e_x = \operatorname{sign}(x)$.
Examples

• $\ell^1$ sparsity: $J(x) = \|x\|_1$; $e_x = \operatorname{sign}(x)$, $T_x = \{z : \operatorname{supp}(z) \subset \operatorname{supp}(x)\}$.
• Structured sparsity: $J(x) = \sum_b \|x_b\|$; $e_x = (N(x_b))_{b \in B}$ with $N(a) = a/\|a\|$, $T_x = \{z : \operatorname{supp}(z) \subset \operatorname{supp}(x)\}$.
• Nuclear norm: $J(x) = \|x\|_*$ with SVD $x = U \Lambda V^*$; $e_x = U V^*$, $T_x = \{z : U_\perp^* z V_\perp = 0\}$.
• Anti-sparsity: $J(x) = \|x\|_\infty$ with $I = \{i : |x_i| = \|x\|_\infty\}$; $e_x = |I|^{-1} \operatorname{sign}(x)$, $T_x = \{y : y_I \propto \operatorname{sign}(x_I)\}$.
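For the $\ell^1$ and nuclear-norm cases the pair $(e_x, T_x)$ is directly computable; a hedged numpy sketch (the helper names are mine, not the slides'):

import numpy as np

def model_l1(x, tol=1e-12):
    """(e_x, orthogonal projector onto T_x) for J = ||.||_1."""
    mask = np.abs(x) > tol                   # support of x
    e = np.sign(x) * mask
    proj_T = lambda z: z * mask              # T_x: vectors supported on supp(x)
    return e, proj_T

def model_nuclear(X, r):
    """(e_X, orthogonal projector onto T_X) for J = ||.||_*, X of rank r."""
    U, s, Vh = np.linalg.svd(X)
    U, V = U[:, :r], Vh[:r, :].T
    e = U @ V.T                              # e_X = U V^*
    def proj_T(Z):
        # T_X = {Z : U_perp^* Z V_perp = 0}: remove the component supported
        # on the orthogonal complements of the column and row spaces.
        Pu, Pv = U @ U.T, V @ V.T
        return Z - (np.eye(Pu.shape[0]) - Pu) @ Z @ (np.eye(Pv.shape[0]) - Pv)
    return e, proj_T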
L2 Stability Performances
Dual Certificate and L2 Stability

Noiseless recovery: $\min_{\Phi x = \Phi x_0} J(x) \qquad (\mathcal{P}_0)$

Dual certificates: $\mathcal{D} = \operatorname{Im}(\Phi^*) \cap \partial J(x_0)$.

Proposition: $\exists\, \eta \in \mathcal{D} \iff x_0$ is a solution of $(\mathcal{P}_0)$.

Tight dual certificates: $\bar{\mathcal{D}} = \operatorname{Im}(\Phi^*) \cap \operatorname{ri}(\partial J(x_0))$.

Theorem [Fadili et al. 2013]: If $\exists\, \eta \in \bar{\mathcal{D}}$ and $\ker(\Phi) \cap T_{x_0} = \{0\}$, then for $\lambda \sim \|w\|$ one has $\|x^\star - x_0\| = O(\|w\|)$.
→ The constants depend on $N$ . . .

Related results: [Grasmair, Haltmeier, Scherzer 2010] for $J = \|\cdot\|_1$; [Grasmair 2012]: $J(x^\star - x_0) = O(\|w\|)$.
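The restricted injectivity condition $\ker(\Phi) \cap T_{x_0} = \{0\}$ is easy to test numerically: with $B$ any matrix whose columns span $T_{x_0}$, it holds iff $\Phi B$ has full column rank. A small sketch (the basis $B$ is assumed given):

import numpy as np

def restricted_injectivity(Phi, B):
    """ker(Phi) ∩ span(B) = {0}  <=>  Phi @ B has full column rank."""
    return np.linalg.matrix_rank(Phi @ B) == B.shape[1]

# e.g. for J = ||.||_1 and supp(x0) = I, take B = np.eye(N)[:, I]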
Model Stability Performances
Minimal-norm Certificate

We assume $\ker(\Phi) \cap T = \{0\}$ and $J$ piecewise regular, where $T = T_{x_0}$ and $e = e_{x_0}$. Then
$$\eta \in \mathcal{D} \iff \eta = \Phi^* q,\ \ \eta_T = e\ \text{ and }\ J^\circ(\eta) = 1.$$

Minimal-norm pre-certificate:
$$\eta_0 = \underset{\eta = \Phi^* q,\ \eta_T = e}{\operatorname{argmin}} \|q\|$$

Proposition: One has $\eta_0 = \Phi^* (\Phi_T^+)^* e$.

Theorem [Vaiter et al. 2013]: If $\eta_0 \in \bar{\mathcal{D}}$, $\|w\| = O(\nu_{x_0})$ and $\lambda \sim \|w\|$, the unique solution $x^\star$ of $\mathcal{P}_\lambda(y)$ for $y = \Phi x_0 + w$ satisfies
$$T_{x^\star} = T_{x_0} \quad \text{and} \quad \|x^\star - x_0\| = O(\|w\|).$$

Special cases: [Fuchs 2004]: $J = \|\cdot\|_1$. [Bach 2008]: $J = \|\cdot\|_{1,2}$ and $J = \|\cdot\|_*$. [Vaiter et al. 2011]: $J = \|D^* \cdot\|_1$.
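For $J = \|\cdot\|_1$ the proposition gives $\eta_0$ in closed form from the support $I$ of $x_0$, and $\eta_0 \in \bar{\mathcal{D}}$ reduces to the Fuchs-type criterion $\|\eta_{0,I^c}\|_\infty < 1$. A numpy sketch under these assumptions:

import numpy as np

def eta0_l1(Phi, x0, tol=1e-12):
    """Minimal-norm pre-certificate eta_0 = Phi^* (Phi_T^+)^* e for J = ||.||_1,
    where T is the support model of x0 and e = sign(x0)."""
    I = np.flatnonzero(np.abs(x0) > tol)
    q = np.linalg.pinv(Phi[:, I]).T @ np.sign(x0[I])   # (Phi_T^+)^* e
    eta0 = Phi.T @ q
    Ic = np.setdiff1d(np.arange(Phi.shape[1]), I)
    # criterion < 1  <=>  eta_0 is a tight certificate, so (by the theorem above)
    # model selection holds for small enough ||w|| and lambda ~ ||w||.
    return eta0, np.abs(eta0[Ic]).max()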
Example: 1-D Sparse Deconvolution

$\Phi x = \sum_i x_i\, \varphi(\cdot - \Delta i)$, with $J(x) = \|x\|_1$ and $I = \{j : x_0(j) \neq 0\}$.

Increasing the spacing $\Delta$:
• reduces correlation;
• reduces resolution.

Support recovery: $\eta_0 \in \bar{\mathcal{D}}(x_0) \iff \|\eta_{0,I^c}\|_\infty < 1$.

[Figure: $\|\eta_{0,I^c}\|_\infty$ as a function of the spacing; the criterion drops below 1, and support recovery kicks in, once the spikes are sufficiently separated.]
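A sketch of this experiment, assuming a Gaussian filter of fixed width and, as a proxy for the spacing, two opposite spikes separated by delta samples; all parameters are illustrative.

import numpy as np

def deconv_criterion(delta, N=256, sigma=4.0):
    """||eta_{0,I^c}||_inf for 1-D deconvolution with J = ||.||_1:
    the columns of Phi are a Gaussian filter shifted to each grid position."""
    t = np.arange(N)
    Phi = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * sigma ** 2))
    I = np.array([N // 2, N // 2 + delta])                    # two spikes, delta apart
    q = np.linalg.pinv(Phi[:, I]).T @ np.array([1.0, -1.0])   # (Phi_T^+)^* e
    eta0 = Phi.T @ q
    Ic = np.setdiff1d(t, I)
    return np.abs(eta0[Ic]).max()

for d in (2, 4, 8, 16, 32):
    print(d, round(deconv_criterion(d), 3))   # drops below 1 as the spikes separate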
()
J(x) = ||rx||1 (rx)i = xi xi 1
= Id I = {i  (rx0)i 6= 0}
8 j /2 I, ( ↵0)j = 0⌘0 = div(↵0) where
Example: 1-D TV Denoising
x0
+1
1
I
J
Support stability.
J(x) = ||rx||1 (rx)i = xi xi 1
= Id I = {i  (rx0)i 6= 0}
8 j /2 I, ( ↵0)j = 0⌘0 = div(↵0) where
||↵0,Ic || < 1
Example: 1-D TV Denoising
x0
x0
+1
1
I
J
`2
stability onlySupport stability.
x0
J(x) = ||rx||1 (rx)i = xi xi 1
= Id I = {i  (rx0)i 6= 0}
8 j /2 I, ( ↵0)j = 0⌘0 = div(↵0) where
||↵0,Ic || < 1 ||↵0,Ic || = 1
Example: 1-D TV Denoising
+1
1
J
x0
x0
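Under the slide's conventions, $\alpha_0$ equals the jump signs on $I$ and has zero discrete Laplacian off $I$, hence is affine between jumps. The sketch below interpolates towards $0$ at virtual endpoints, an assumed natural boundary condition, and illustrates the two regimes.

import numpy as np

def alpha0_tv(x0, tol=1e-12):
    """Pre-certificate eta_0 = div(alpha_0) for 1-D TV denoising (Phi = Id):
    alpha_0 = sign((grad x0)_j) on the jump set I, affine off I."""
    d = np.diff(x0)
    I = np.flatnonzero(np.abs(d) > tol)
    knots = np.concatenate(([-1.0], I, [float(len(d))]))
    vals = np.concatenate(([0.0], np.sign(d[I]), [0.0]))
    alpha = np.interp(np.arange(len(d)), knots, vals)   # piecewise-affine alpha_0
    Ic = np.setdiff1d(np.arange(len(d)), I)
    return alpha, np.abs(alpha[Ic]).max()

x_alt   = np.repeat([0.0, 1.0, 0.0], 20)   # alternating jump signs
x_stair = np.repeat([0.0, 1.0, 2.0], 20)   # staircase: consecutive jumps of same sign
print(alpha0_tv(x_alt)[1])                 # < 1: support stability
print(alpha0_tv(x_stair)[1])               # = 1: only l2 stability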
Conclusion

Gauges: encode linear models as singular points.

Tight dual certificates: enable $L^2$ stability.

Piecewise smooth gauges: enable model recovery $T_{x^\star} = T_{x_0}$.

Open problems:
– Approximate model recovery $T_{x^\star} \approx T_{x_0}$.
– Infinite-dimensional problems (measures, TV, etc.).