Presented at ICML 2016, New York, USA, on June 20, 2016.
We propose a novel Riemannian manifold preconditioning approach for the tensor completion problem with rank constraint. A novel Riemannian metric or inner product is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry that exists in Tucker decomposition. The specific metric allows us to use the versatile framework of Riemannian optimization on quotient manifolds to develop preconditioned nonlinear conjugate gradient and stochastic gradient descent algorithms for the batch and online setups, respectively. Concrete matrix representations of various optimization-related ingredients are listed. Numerical comparisons suggest that our proposed algorithms robustly outperform state-of-the-art algorithms across different synthetic and real-world datasets.
1. Introduction Metric proposal Tucker manifold geometry Numerical Comparisons
Low-rank tensor completion:
a Riemannian manifold preconditioning approach
Hiroyuki Kasai † and Bamdev Mishra††
†The University of Electro-Communications, Japan
††Amazon Development Centre India, India
ICML2016, New York, USA, June 20, 2016
Low-rank tensor completion: a Riemannian manifold preconditioning approach (ICML2016) (copyrights by Kasai & Mishra) 1
Tensor completion problem
▶ Goal: estimate unknown or missing entries of a tensor.
[Figure: matrix completion problem vs. tensor completion problem]
▶ Applications: collaborative filtering, signal recovery, etc.
Our contributions (Summary)
▶ Adopt a Riemannian optimization framework.
▶ Propose a novel Riemannian metric by exploiting the
symmetry of constraints and the structure of the cost.
(Riemannian manifold preconditioning)
▶ Develop concrete matrix expressions and algorithms with this
novel metric.
▶ Show superior performance of our proposed algorithms on
low-sampling and ill-conditioned large-scale data (up to 10^12 entries).
Mathematical problem definition
▶ 3rd-order tensors X ∈ R^(n1×n2×n3) are addressed hereafter.
Definition (Tensor completion problem with fixed-rank)
min_{X ∈ R^(n1×n2×n3)}  (1/|Ω|) ∥P_Ω(X) − P_Ω(X⋆)∥_F^2   subject to rank(X) = r,   (1)
▶ PΩ(·) extracts only elements in the observed set Ω.
▶ rank(X) = r = (r1, r2, r3) is multilinear rank of X.
▶ Higher scalability to huge data than nuclear norm
regularization approaches since NO expensive SVD is required.
▶ Assume a low-rank structure by Tucker decomposition.
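As a rough numerical illustration of the objective in (1), a boolean mask can play the role of P_Ω; the following numpy sketch uses our own function and variable names, not the paper's:

```python
import numpy as np

def masked_cost(X, X_star, omega):
    """Objective of (1): (1/|Omega|) * ||P_Omega(X) - P_Omega(X_star)||_F^2."""
    residual = (X - X_star)[omega]           # keep only the observed entries
    return np.sum(residual ** 2) / omega.sum()

rng = np.random.default_rng(0)
X_star = rng.standard_normal((4, 5, 6))      # ground-truth tensor
omega = rng.random((4, 5, 6)) < 0.3          # ~30% of the entries are observed
print(masked_cost(X_star, X_star, omega))    # 0.0 at the true tensor
```

The mask-then-sum pattern avoids forming the sparse residual tensor explicitly, which is what makes the per-iteration cost scale with |Ω| rather than n1 n2 n3.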
Tucker decomposition and Tucker manifold
▶ Tucker decomposition of rank r (= (r1, r2, r3)) is
[Kolda and Bader, 2009]
X = G ×1 U1 ×2 U2 ×3 U3 (∈ R^(n1×n2×n3))
▶ Ud ∈ St(rd, nd), i.e., the Stiefel manifold of nd × rd matrices
with orthonormal columns, and G ∈ R^(r1×r2×r3).
▶ Tucker manifold M is defined as
M := St(r1, n1) × St(r2, n2) × St(r3, n3) × Rr1×r2×r3 .
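The three mode products above can be written as a single contraction; a minimal numpy sketch (names are ours), assuming dense arrays:

```python
import numpy as np

def tucker(G, U1, U2, U3):
    """X = G x_1 U1 x_2 U2 x_3 U3 for a core G (r1 x r2 x r3), factors Ud (nd x rd)."""
    return np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

rng = np.random.default_rng(1)
r, n = (2, 3, 4), (5, 6, 7)
G = rng.standard_normal(r)
# Factors with orthonormal columns, i.e., points on St(rd, nd), via thin QR
U1, U2, U3 = [np.linalg.qr(rng.standard_normal((ni, ri)))[0] for ni, ri in zip(n, r)]
X = tucker(G, U1, U2, U3)
print(X.shape)  # (5, 6, 7)
```

Each mode-d unfolding of X then has rank at most rd, which is exactly the multilinear-rank constraint in (1).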
The quotient structure of Tucker decomposition
▶ Solutions are not isolated, but form equivalence classes due to
the invariance under Od ∈ O(rd):
[(U1, U2, U3, G)] := {(U1O1, U2O2, U3O3, G ×1 O1^T ×2 O2^T ×3 O3^T) : Od ∈ O(rd)}.
▶ Quotient manifold M/∼
M/∼ := M/(O(r1) × O(r2) × O(r3)),
⇓
▶ Solve (1) as an unconstrained optimization problem on the
Riemannian quotient manifold by endowing M/∼ with a suitable metric.
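The invariance can be checked numerically: rotating the factors by Od and counter-rotating the core leaves the represented tensor X unchanged. A small numpy sketch with our own naming:

```python
import numpy as np

def tucker(G, U1, U2, U3):
    """X = G x_1 U1 x_2 U2 x_3 U3."""
    return np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

rng = np.random.default_rng(2)
r, n = (2, 2, 3), (4, 5, 6)
G = rng.standard_normal(r)
U = [np.linalg.qr(rng.standard_normal((ni, ri)))[0] for ni, ri in zip(n, r)]
O = [np.linalg.qr(rng.standard_normal((ri, ri)))[0] for ri in r]  # Od in O(rd)

X = tucker(G, *U)
# Counter-rotated core: G x_1 O1^T x_2 O2^T x_3 O3^T
G_rot = np.einsum('abc,ad,be,cf->def', G, O[0], O[1], O[2])
X_rot = tucker(G_rot, U[0] @ O[0], U[1] @ O[1], U[2] @ O[2])
print(np.allclose(X, X_rot))  # True: both tuples represent the same tensor
```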
The least-squares structure of the cost function
▶ Problem (1) is convex and quadratic in X and
(U1, U2, U3, G) individually.
⇓
▶ Address the block diagonal approximation of ∥X − X⋆∥_F^2 for
a new metric:
((G1G1^T) ⊗ I_n1, (G2G2^T) ⊗ I_n2, (G3G3^T) ⊗ I_n3, I_(r1r2r3)).
▶ As a result, exploit second-order information in first-order
algorithms.
Propose a novel Riemannian metric
▶ Propose a metric from symmetry and least-squares structure.
Definition (New Riemannian metric)
gx : TxM × TxM → R is defined as
gx(ξx, ηx) = ⟨ξU1, ηU1(G1G1^T)⟩ + ⟨ξU2, ηU2(G2G2^T)⟩ + ⟨ξU3, ηU3(G3G3^T)⟩ + ⟨ξG, ηG⟩,
where ξx = (ξU1, ξU2, ξU3, ξG) and ηx = (ηU1, ηU2, ηU3, ηG) ∈ TxM are tangent vectors.
▶ gx(ξx, ηx) is invariant to [x] (= [(U1, U2, U3, G)]).
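A direct numerical transcription of gx, with Gd the mode-d unfolding of G (all function and variable names below are our own, not from the paper):

```python
import numpy as np

def unfold(G, d):
    """Mode-d unfolding G_(d), of size r_d x (product of the other mode sizes)."""
    return np.moveaxis(G, d, 0).reshape(G.shape[d], -1)

def metric(xi, eta, G):
    """g_x(xi, eta) = sum_d <xi_Ud, eta_Ud (Gd Gd^T)> + <xi_G, eta_G>."""
    val = np.sum(xi[3] * eta[3])                       # <xi_G, eta_G>
    for d in range(3):
        Gd = unfold(G, d)
        val += np.sum(xi[d] * (eta[d] @ (Gd @ Gd.T)))  # <xi_Ud, eta_Ud (Gd Gd^T)>
    return val

rng = np.random.default_rng(3)
r, n = (2, 3, 4), (5, 6, 7)
G = rng.standard_normal(r)
xi = [rng.standard_normal((ni, ri)) for ni, ri in zip(n, r)] + [rng.standard_normal(r)]
eta = [rng.standard_normal((ni, ri)) for ni, ri in zip(n, r)] + [rng.standard_normal(r)]
print(np.isclose(metric(xi, eta, G), metric(eta, xi, G)))  # True: symmetric bilinear form
```

Since GdGd^T is positive definite for a full-multilinear-rank core, gx is indeed a valid inner product on each tangent space.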
Geometry of quotient manifold
Geometry of Tucker manifold
▶ Show concrete mathematical expressions.
▶ Propose a preconditioned nonlinear conjugate gradient.
Numerical comparisons
▶ We compare with a number of state-of-the-art algorithms under
various scenarios.
▶ Tucker decomposition based algorithms;
▶ TOpt [Filipović and Jukić, 2013], geomCG
[Kressner et al., 2014]
▶ Nuclear norm minimization algorithms;
▶ HaLRTC [Liu et al., 2013], Latent [Tomioka et al., 2011],
Hard [Signoretto et al., 2014]
Conclusions
▶ Tackled the low-rank tensor completion problem with a
Riemannian manifold preconditioning approach.
▶ Proposed a novel Riemannian metric that exploits the
fundamental structures of symmetry, due to the non-uniqueness of
Tucker decomposition, and the least-squares form of the cost function.
▶ Use the versatile Riemannian optimization framework.
▶ Proposed a preconditioned nonlinear conjugate gradient
algorithm.
▶ Also proposed a stochastic gradient descent algorithm for the online setup.
▶ Concrete matrix expressions are worked out.
▶ Numerical comparisons suggest that our proposed algorithms
have superior performance on different benchmarks.
Thank you for your attention
▶ Paper
▶ ICML paper:
http://jmlr.org/proceedings/papers/v48/kasai16.pdf
▶ ICML supplementary paper:
http://jmlr.org/proceedings/papers/v48/kasai16-supp.pdf
▶ Software (Matlab codes)
▶ Independent project:
http://bamdevmishra.com/codes/tensorcompletion/
▶ Built-in project in Manopt:
http://www.manopt.org/fixedrankfactory_tucker_preconditioned
References I
▶ Absil, P.-A., Mahony, R., and Sepulchre, R. (2008).
Optimization Algorithms on Matrix Manifolds.
Princeton University Press.
▶ Boumal, N., Mishra, B., Absil, P.-A., and Sepulchre, R. (2014).
Manopt: a Matlab toolbox for optimization on manifolds.
J. Mach. Learn. Res., 15(1):1455–1459.
▶ Filipović, M. and Jukić, A. (2013).
Tucker factorization with missing data with application to low-n-rank tensor
completion.
Multidim. Syst. Sign. P.
Doi: 10.1007/s11045-013-0269-9.
▶ Foster, D. H., Nascimento, S. M. C., and Amano, K. (2007).
Information limits on neural identification of colored surfaces in natural scenes.
Visual Neurosci., 21(3):331–336.
▶ Kolda, T. G. and Bader, B. W. (2009).
Tensor decompositions and applications.
SIAM Review, 51(3):455–500.
References II
▶ Kressner, D., Steinlechner, M., and Vandereycken, B. (2014).
Low-rank tensor completion by Riemannian optimization.
BIT Numer. Math., 54(2):447–468.
▶ Liu, J., Musialski, P., Wonka, P., and Ye, J. (2013).
Tensor completion for estimating missing values in visual data.
IEEE Trans. Pattern Anal. Mach. Intell., 35(1):208–220.
▶ Ring, W. and Wirth, B. (2012).
Optimization methods on Riemannian manifolds and their application to shape
space.
SIAM J. Optim., 22(2):596–627.
▶ Sato, H. and Iwai, T. (2015).
A new, globally convergent Riemannian conjugate gradient method.
Optimization, 64(4):1011–1031.
▶ Signoretto, M., Dinh, Q. T., Lathauwer, L. D., and Suykens, J. A. K. (2014).
Learning with tensors: a framework based on convex optimization and spectral
regularization.
Mach. Learn., 94(3):303–351.
References III
▶ Tomioka, R., Hayashi, K., and Kashima, H. (2011).
Estimation of low-rank tensors via convex optimization.
Technical report, arXiv preprint arXiv:1010.0789.
Proposal: Invariant property of new Riemannian metric
Proposition (Invariant property of new Riemannian metric)
Let (ξU1, ξU2, ξU3, ξG) and (ηU1, ηU2, ηU3, ηG) be tangent
vectors to the quotient manifold at (U1, U2, U3, G), and
(ξU1O1, ξU2O2, ξU3O3, ξG×1O1^T×2O2^T×3O3^T) and
(ηU1O1, ηU2O2, ηU3O3, ηG×1O1^T×2O2^T×3O3^T) be the
corresponding tangent vectors at (U1O1, U2O2, U3O3, G ×1 O1^T ×2 O2^T ×3 O3^T).
The new metric (2) is invariant along the equivalence class (6), i.e.,
g_(U1,U2,U3,G)((ξU1, ξU2, ξU3, ξG), (ηU1, ηU2, ηU3, ηG))
= g_(U1O1,U2O2,U3O3,G×1O1^T×2O2^T×3O3^T)((ξU1O1, ξU2O2, ξU3O3, ξG×1O1^T×2O2^T×3O3^T),
(ηU1O1, ηU2O2, ηU3O3, ηG×1O1^T×2O2^T×3O3^T)).
Quotient manifold and horizontal lift
▶ x and y belong to the same equivalence class and they
represent a single point [x] := {y ∈ M : y ∼ x} on M/∼.
▶ Tangent space is decomposable into TxM = Vx ⊕ Hx, where
▶ the vertical space Vx is the tangent space of the equivalence
class [x], and
▶ the horizontal space Hx is the orthogonal subspace to Vx.
▶ Vx does NOT induce a displacement along the equivalence
class [x].
▶ Horizontal lift
▶ the unique element ξx ∈ Hx that represents ξ[x] ∈ T[x](M/∼) at [x].
Tangent space projection operator (1)
Normal space NxM
▶ The projection of an ambient vector onto TxM is obtained by
removing its component along the normal space NxM.
▶ The following lemma characterizes the normal space Nx;
Lemma (Normal space)
The quotient manifold endowed with the Riemannian metric
(2) has the matrix characterization of the normal space NxM
defined as
{(U1SU1(G1G1^T)^(-1), U2SU2(G2G2^T)^(-1), U3SU3(G3G3^T)^(-1), 0)
: SUd ∈ R^(rd×rd), SUd^T = SUd, for d ∈ {1, 2, 3}},
where SUd for all d ∈ {1, 2, 3} are symmetric matrices.
Tangent space projection operator (2)
Tangent space projector Ψx
Proposition (Tangent space projection operator)
The quotient manifold endowed with the Riemannian metric (2)
admits a tangent projector Ψx : R^(n1×r1) × R^(n2×r2) ×
R^(n3×r3) × R^(r1×r2×r3) → TxM defined as
Ψx(YU1, YU2, YU3, YG) = (YU1 − U1SU1(G1G1^T)^(-1),
YU2 − U2SU2(G2G2^T)^(-1),
YU3 − U3SU3(G3G3^T)^(-1), YG),
where SUd is the solution to the Lyapunov equation below;
SUd GdGd^T + GdGd^T SUd = GdGd^T(YUd^T Ud + Ud^T YUd)GdGd^T
▶ These are solved efficiently with Matlab's lyap routine.
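Outside Matlab, the small rd × rd Lyapunov equation can also be solved via its Kronecker-product linearization; a numpy-only sketch under our own naming, fine for small ranks:

```python
import numpy as np

def solve_lyap(A, Q):
    """Solve X A + A X = Q by vectorizing: (A (x) I + I (x) A^T) vec(X) = vec(Q)."""
    r = A.shape[0]
    M = np.kron(A, np.eye(r)) + np.kron(np.eye(r), A.T)
    return np.linalg.solve(M, Q.reshape(-1)).reshape(r, r)

rng = np.random.default_rng(4)
rd, nd = 3, 8
Gd = rng.standard_normal((rd, 10))                   # stand-in for a core unfolding
Ud = np.linalg.qr(rng.standard_normal((nd, rd)))[0]
Y = rng.standard_normal((nd, rd))                    # ambient component to project

GG = Gd @ Gd.T
rhs = GG @ (Y.T @ Ud + Ud.T @ Y) @ GG
S = solve_lyap(GG, rhs)                              # S GG + GG S = rhs
print(np.allclose(S @ GG + GG @ S, rhs))             # True
```

The Kronecker system is O(rd^6), which is negligible here because rd is the (small) rank, not a tensor dimension; dedicated solvers such as lyap scale better when needed.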
Horizontal space projection operator (1)
Vertical space Vx and Horizontal space Hx
▶ This is obtained in two steps:
▶ Derive the condition (Lemma below) that characterizes the
horizontal space via orthogonality to the vertical space.
▶ Remove the component along the vertical space, so that the
remainder satisfies the condition above.
▶ The vertical space Vx has the matrix characterization
{(U1Ω1, U2Ω2, U3Ω3, −(G ×1 Ω1 + G ×2 Ω2 + G ×3 Ω3))
: Ωd ∈ R^(rd×rd), Ωd^T = −Ωd for d ∈ {1, 2, 3}},
▶ where Ωd for all d ∈ {1, 2, 3} are skew-symmetric matrices.
Lemma (Horizontal space)
A horizontal vector ξx = (ξU1, ξU2, ξU3, ξG) ∈ Hx must satisfy that
(GdGd^T)ξUd^T Ud + ξGd Gd^T is symmetric for d ∈ {1, 2, 3}.
Horizontal space projection operator (2)
Vertical space Vx and Horizontal space projector Πx
Proposition (Horizontal space projection operator)
The quotient manifold endowed with the Riemannian metric (2)
admits a horizontal projector Πx : TxM → Hx : ηx ↦ Πx(ηx) defined as
Πx(ηx) = (ηU1 − U1Ω1, ηU2 − U2Ω2, ηU3 − U3Ω3,
ηG + (G ×1 Ω1 + G ×2 Ω2 + G ×3 Ω3)),
where ηx = (ηU1, ηU2, ηU3, ηG) ∈ TxM and Ωd is a skew-
symmetric matrix of size rd × rd that is the solution to the
coupled Lyapunov equations below;
(continue to the next page.)
Horizontal space projection operator (3)
Horizontal space projector Πx
Proposition (Horizontal space projection operator)
(continued from the previous page.)
G1G1^T Ω1 + Ω1 G1G1^T − G1(I_r3 ⊗ Ω2)G1^T − G1(Ω3 ⊗ I_r2)G1^T
= Skew(U1^T ηU1 G1G1^T) + Skew(G1 ηG1^T),
G2G2^T Ω2 + Ω2 G2G2^T − G2(I_r3 ⊗ Ω1)G2^T − G2(Ω3 ⊗ I_r1)G2^T
= Skew(U2^T ηU2 G2G2^T) + Skew(G2 ηG2^T),
G3G3^T Ω3 + Ω3 G3G3^T − G3(I_r2 ⊗ Ω1)G3^T − G3(Ω2 ⊗ I_r1)G3^T
= Skew(U3^T ηU3 G3G3^T) + Skew(G3 ηG3^T).
▶ Skew(·) extracts the skew-symmetric part of a square matrix, i.e.,
Skew(D) = (D − D^T)/2.
▶ The coupled Lyapunov equations are solved efficiently with
Matlab's pcg routine combined with a specific preconditioner
resulting from the Gauss-Seidel approximation.
Retraction Rx(ξx)
▶ Maps vectors in the horizontal space to points on the search
space M and satisfies the local rigidity condition
[Absil et al., 2008, Definition 4.1].
▶ Provides a natural way to move on the manifold along a
search direction.
▶ Due to the product nature of M, we can choose a retraction
by combining retractions on the individual manifolds, i.e.,
Rx(ξx) = (uf(U1 + ξU1), uf(U2 + ξU2), uf(U3 + ξU3), G + ξG),
where ξx ∈ Hx and uf(·) extracts the orthogonal factor of a
full column-rank matrix, i.e., uf(A) = A(A^T A)^(-1/2).
Vector transport Tηx ξx
▶ Smooth mapping that transports a tangent vector ξx ∈ TxM
at x ∈ M to a vector in the tangent space at Rx(ηx)
[Absil et al., 2008, Section 8.1.4].
▶ Generalize the classical concept of translation of vectors in the
Euclidean space to manifolds.
▶ The horizontal lift of the abstract vector transport Tη[x]ξ[x] on
M/∼ has the matrix characterization
ΠRx(ηx)(Tηx ξx) = ΠRx(ηx)(ΨRx(ηx)(ξx)),
where ξx and ηx are the horizontal lifts in Hx of ξ[x] and η[x]
that belong to T[x](M/∼).
Riemannian gradient computations (1)
Horizontal lift (1)
▶ Finally, from Riemannian submersion theory
[Absil et al., 2008, Section 3.6.2], the horizontal lift of
grad[x]f is given by gradxf = Ψx(egradxf).
▶ Subsequently, we obtain the following;
the horizontal lift of grad[x]f =
(S1(U3 ⊗ U2)G1^T(G1G1^T)^(-1) − U1BU1(G1G1^T)^(-1),
S2(U3 ⊗ U1)G2^T(G2G2^T)^(-1) − U2BU2(G2G2^T)^(-1),
S3(U2 ⊗ U1)G3^T(G3G3^T)^(-1) − U3BU3(G3G3^T)^(-1),
S ×1 U1^T ×2 U2^T ×3 U3^T),
Riemannian gradient computations (2)
Horizontal lift (2)
▶ where BUd for d ∈ {1, 2, 3} are the solutions to the Lyapunov
equations
BU1 G1G1^T + G1G1^T BU1 = 2Sym(G1G1^T U1^T S1(U3 ⊗ U2)G1^T),
BU2 G2G2^T + G2G2^T BU2 = 2Sym(G2G2^T U2^T S2(U3 ⊗ U1)G2^T),
BU3 G3G3^T + G3G3^T BU3 = 2Sym(G3G3^T U3^T S3(U2 ⊗ U1)G3^T),
which are solved efficiently with Matlab's lyap routine.
Sym(·) extracts the symmetric part of a square matrix, i.e.,
Sym(D) = (D + D^T)/2.
Preconditioned Conjugate Gradient (CG) Algorithm
▶ Off-the-shelf conjugate gradient implementation of Manopt 1
[Boumal et al., 2014] for any smooth cost function.
▶ The convergence analysis of the Riemannian conjugate
gradient method follows from
[Sato and Iwai, 2015, Ring and Wirth, 2012].
▶ Cost-function-specific ingredients are needed;
▶ Compute the Riemannian gradient.
▶ Compute an initial guess for the step-size for CG.
▶ The total computational cost per iteration of our algorithm is
O(|Ω|r1r2r3), where |Ω| is the number of known entries.
1 http://www.manopt.org/
Experiment conditions
▶ Comparisons with state-of-the-art algorithms that include
▶ Tucker decomposition based algorithms;
▶ TOpt [Filipović and Jukić, 2013], geomCG
[Kressner et al., 2014]
▶ Nuclear norm minimization algorithms;
▶ HaLRTC [Liu et al., 2013], Latent [Tomioka et al., 2011],
Hard [Signoretto et al., 2014]
▶ Problem instances
▶ Case S2: small-scale instances.
▶ Case S3: large-scale instances.
▶ Case S5: influence of ill-conditioning and low sampling.
▶ Case S7: asymmetric instances.
▶ Case R1: hyperspectral image “Ribeira” [Foster et al., 2007].
▶ Case R2: MovieLens-10M.
▶ Note that only geomCG is compared against for large-scale
instances, i.e., all cases except Case S2.
Synthetic dataset cases
▶ Case S2: small-scale instances.
▶ Size 100 × 100 × 100, 150 × 150 × 150, and 200 × 200 × 200
and rank r = (10, 10, 10) are considered. OS is {10, 20, 30}.
▶ Case S3: large-scale instances.
▶ Tensors of size 3000 × 3000 × 3000, 5000 × 5000 × 5000, and
10000 × 10000 × 10000 and ranks r = (5, 5, 5) and
(10, 10, 10). OS is 10.
▶ Case S5: influence of ill-conditioning and low sampling.
▶ Case S4 with OS = 5. We impose a diagonal core G with
exponentially decaying positive values of condition numbers
(CN) 5, 50, and 100.
▶ Case S7: asymmetric instances.
▶ (a): tensors of size 20000 × 7000 × 7000, 30000 × 6000 × 6000,
and 40000 × 5000 × 5000 with rank r = (5, 5, 5).
▶ (b): a tensor of size 10000 × 10000 × 10000 with ranks (7, 6, 6),
(10, 5, 5), and (15, 4, 4).
Real-world dataset cases
▶ Case R1: hyperspectral image “Ribeira”.
[Foster et al., 2007]
▶ The tensor size is 203 × 268 × 33.
▶ Compare all the algorithms, and perform five random samplings
of the pixels based on the OS values 11 and 22, corresponding
to the rank r=(15, 15, 6) adopted in [Kressner et al., 2014].
▶ While OS = 22 corresponds to the observation ratio of 10%
studied in [Kressner et al., 2014], OS = 11 considers a
challenging scenario with the observation ratio of 5%.
▶ Case R2: MovieLens-10M2.
▶ This dataset contains 10000054 ratings corresponding to
71567 users and 10681 movies.
▶ Splitting the timestamps into 7-day-wide bins finally gives a
tensor of size 71567 × 10681 × 731.
▶ The fraction of known entries is less than 0.002%.
2 http://grouplens.org/datasets/movielens/.