Tensor Decompositions and Applications
Tamara G. Kolda and Brett W. Bader
Part I
Jiguang Shen
September 22, 2015
Jiguang Shen Tensor Decompositions and Applications
What is a tensor?
An N-th order tensor is an element of the tensor product of N vector spaces, each of
which has its own coordinate system.

a = a_i e_i,  b = b_j ẽ_j,  c = c_k ê_k   (vector spaces)
A = a ∘ b = a_i b_j e_i ∘ ẽ_j = A_ij e_i ∘ ẽ_j   (matrix, second-order tensor)
X = a ∘ b ∘ c = a_i b_j c_k e_i ∘ ẽ_j ∘ ê_k = X_ijk e_i ∘ ẽ_j ∘ ê_k   (third-order tensor)
......
An N-th order tensor has N ways or modes (dimensions).
From matrix to higher-order tensor

Table: Matrix vs. higher-order tensor

  Matrix: columns A_{:j}, rows A_{i:}
  Tensor: fibers x_{ij:}, x_{i:k}, x_{:jk} (mode-3, -2, -1); slices X_{i::}, X_{:j:}, X_{::k} (horizontal, lateral, frontal)

  Matrix: ⟨A, B⟩ = Σ_i Σ_j A_ij B_ij
  Tensor: ⟨X, Y⟩ = Σ_i Σ_j Σ_k X_ijk Y_ijk

  Matrix: ‖A‖_F = √(Σ_i Σ_j A_ij²)
  Tensor: ‖X‖ = √(Σ_i Σ_j Σ_k X_ijk²)

  Matrix: (rank one) A = a bᵀ, A_ij = a_i b_j
  Tensor: (rank one) X = a ∘ b ∘ c, X_ijk = a_i b_j c_k

  Matrix: (symmetric) A = Aᵀ
  Tensor: (supersymmetric: cubical + symmetric) X ∈ R^{I×I×I}, X_ijk constant when permuting i, j, k

  Matrix: (identity) A = I, A_ij = δ_ij
  Tensor: (identity) X = I, X_ijk = δ_ijk
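The rank-one constructions and norms in the table are easy to check numerically. A minimal sketch (the deck's code is MATLAB; this is an equivalent NumPy version, with made-up small vectors):

```python
import numpy as np

a, b, c = np.array([1., 2.]), np.array([3., 4.]), np.array([5., 6.])

# Rank-one matrix A = a b^T, i.e. A_ij = a_i * b_j
A = np.outer(a, b)

# Rank-one third-order tensor X = a ∘ b ∘ c, i.e. X_ijk = a_i * b_j * c_k
X = np.einsum('i,j,k->ijk', a, b, c)

# The norm from the table is the square root of the sum of squared entries;
# for a rank-one tensor it factors as ||a|| * ||b|| * ||c||
assert np.isclose(np.sqrt((X**2).sum()),
                  np.linalg.norm(a) * np.linalg.norm(b) * np.linalg.norm(c))
```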
Matricization (unfolding, flattening)
Vectorization of a matrix: stack all of its columns into a single column vector.

[1 3; 2 4] ⇒ [1; 2; 3; 4]

Mode-n matricization of a tensor: arrange all mode-n fibers as the columns of the
resulting matrix. (Let's skip the formal definition.)

Frontal slices:
X_{::1} = [12 4 7; 2 10 6; 1 16 9],   X_{::2} = [13 3 73; 21 0 26; 11 27 19]

Mode-1 matricization:
X_(1) = [12 4 7 13 3 73; 2 10 6 21 0 26; 1 16 9 11 27 19]
Mode-2 matricization:
X_(2) = [12 2 1 13 21 11; 4 10 16 3 0 27; 7 6 9 73 26 19]

Mode-3 matricization:
X_(3) = [12 2 1 4 10 16 ⋯ 6 9; 13 21 11 3 0 27 ⋯ 26 19]

How do the results change if we have an additional slice?
X_{::3} = [−1 −1 −1; −1 −1 −1; −1 −1 −1]
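The unfoldings above can be reproduced in a few lines. A NumPy sketch (the Fortran, i.e. column-major, reshape order matches the slide's fiber-ordering convention; `unfold` is an illustrative name, not a library function):

```python
import numpy as np

def unfold(X, n):
    """Mode-n matricization: mode-n fibers become the columns of X_(n)."""
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1), order='F')

# The two frontal slices from the slide, stored as X[i, j, k]
X = np.zeros((3, 3, 2))
X[:, :, 0] = [[12, 4, 7], [2, 10, 6], [1, 16, 9]]
X[:, :, 1] = [[13, 3, 73], [21, 0, 26], [11, 27, 19]]

print(unfold(X, 0))  # mode-1: the frontal slices side by side
print(unfold(X, 1))  # mode-2: rows of the slices become columns
print(unfold(X, 2))  # mode-3: one row per frontal slice
```

Appending a third slice only appends columns to X_(1) and X_(2) and a row to X_(3).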
Tensor multiplication: the n-mode product
The n-mode product of a tensor X ∈ R^{I1×I2×⋯×IN} with a matrix U ∈ R^{J×In} is
denoted by X ×_n U.
• (X ×_n U)_{i1⋯i(n−1) j i(n+1)⋯iN} = Σ_{in=1}^{In} x_{i1⋯in⋯iN} u_{j in}
• Y = X ×_n U ⇐⇒ Y_(n) = U X_(n)
• Related to a change of basis, when the tensor defines a multilinear operator.

The n-mode product of a tensor X ∈ R^{I1×I2×⋯×IN} with a vector v ∈ R^{In} is denoted
by X ×̄_n v.
• (X ×̄_n v)_{i1⋯i(n−1) i(n+1)⋯iN} = Σ_{in=1}^{In} x_{i1⋯iN} v_{in}
• Precedence matters; this is a contraction that lowers the order of the tensor by one.
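The identity Y = X ×_n U ⇔ Y_(n) = U X_(n) suggests an implementation: unfold, multiply, fold back. A NumPy sketch under the same column-major unfolding convention (`fold` and `mode_n_product` are illustrative names, not library functions):

```python
import numpy as np

def unfold(X, n):
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1), order='F')

def fold(M, n, shape):
    """Inverse of unfold: reshape a matricization back into a tensor."""
    full = [shape[n]] + [s for i, s in enumerate(shape) if i != n]
    return np.moveaxis(np.reshape(M, full, order='F'), 0, n)

def mode_n_product(X, U, n):
    """Y = X x_n U, computed via Y_(n) = U X_(n)."""
    shape = list(X.shape)
    shape[n] = U.shape[0]
    return fold(U @ unfold(X, n), n, shape)

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4, 5))
U = rng.standard_normal((2, 4))
Y = mode_n_product(X, U, 1)   # contracts the second mode: (3, 4, 5) -> (3, 2, 5)
# agrees with the direct contraction Y_ijk = sum_m U_jm X_imk
assert np.allclose(Y, np.einsum('jm,imk->ijk', U, X))
```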
Matrix Kronecker, Khatri–Rao, and Hadamard products
Given matrices A and B:
• Kronecker product: A ⊗ B = [a_11 B ⋯ a_1J B; ⋮ ⋱ ⋮; a_I1 B ⋯ a_IJ B]
• Khatri–Rao product: A ⊙ B = [a_1 ⊗ b_1  a_2 ⊗ b_2  ⋯  a_K ⊗ b_K] (column matching; A and B must have the same number of columns)
• Hadamard product: (A ∗ B)_ij = A_ij B_ij (entrywise; A and B must have the same size)

Examples:
A = [1 2 3; 4 5 6; 7 8 9],  B = [1 2 3; 4 5 6]
A ⊗ B?  A ⊙ B?  A ∗ B?  A ∗ A?

MATLAB:
kron(A,B);               % Kronecker
C = zeros(6,3);          % Khatri-Rao, column by column
for i = 1:3
    C(:,i) = kron(A(:,i),B(:,i));
end
A.*A                     % Hadamard (note: A.*B errors here, since A and B differ in size)
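The same three products, sketched in NumPy alongside the MATLAB above (note that A ∗ B is undefined for this pair, since A is 3×3 and B is 2×3):

```python
import numpy as np

A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
B = np.array([[1, 2, 3], [4, 5, 6]])

K = np.kron(A, B)   # Kronecker product: every a_ij scales a copy of B, giving 6 x 9

# Khatri-Rao: column-wise Kronecker, requires equal column counts
C = np.column_stack([np.kron(A[:, i], B[:, i]) for i in range(3)])   # 6 x 3

H = A * A           # Hadamard product: entrywise, same-size operands only
```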
Tensor Rank and the CP Decomposition
The CP decomposition (CANDECOMP/PARAFAC decomposition) factorizes a tensor
into a sum of rank-one component tensors. It can be viewed as a
generalization of the SVD to higher-order tensors.

A = Σ_{r=1}^R σ_r u_r ∘ v_r, with σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_R > 0   (SVD)

X ∈ R^{I×J×K}:  X ≈ Σ_{r=1}^R λ_r a_r ∘ b_r ∘ c_r = ⟦λ; A, B, C⟧

The factor matrices A, B, and C collect the vectors of the rank-one components as
columns, normalized to unit length.
In matricized form:
X_(1) ≈ A Λ (C ⊙ B)ᵀ,  X_(2) ≈ B Λ (C ⊙ A)ᵀ,  X_(3) ≈ C Λ (B ⊙ A)ᵀ
where Λ = diag(λ).
The decomposition extends to N-th order tensors.
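The matricized identities can be verified directly for a small Kruskal tensor. A NumPy sketch (factor sizes and λ are made up; `unfold` and `khatri_rao` follow the conventions used earlier in the deck):

```python
import numpy as np

def unfold(X, n):
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1), order='F')

def khatri_rao(A, B):
    # columns are kron(a_r, b_r)
    return np.column_stack([np.kron(A[:, r], B[:, r]) for r in range(A.shape[1])])

rng = np.random.default_rng(1)
R = 2
lam = np.array([3.0, 1.5])
A = rng.standard_normal((4, R))
B = rng.standard_normal((3, R))
C = rng.standard_normal((2, R))

# X = [[lambda; A, B, C]] = sum_r lambda_r a_r ∘ b_r ∘ c_r
X = np.einsum('r,ir,jr,kr->ijk', lam, A, B, C)

# X_(1) = A Lambda (C ⊙ B)^T, and similarly for the other two modes
L = np.diag(lam)
assert np.allclose(unfold(X, 0), A @ L @ khatri_rao(C, B).T)
assert np.allclose(unfold(X, 1), B @ L @ khatri_rao(C, A).T)
assert np.allclose(unfold(X, 2), C @ L @ khatri_rao(B, A).T)
```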
Tensor Rank
The rank of a tensor X, denoted rank(X), is the smallest number of rank-one tensors
that generate X as their sum (the smallest number of components in an exact "="
CP decomposition; in other words, the smallest possible R). An exact CP decomposition with
R = rank(X) components is called the rank decomposition.
• The definition is an exact analogue of matrix rank (the dimension of the vector
space spanned by the columns).
• The properties, however, are quite different:
(1) Field dependent: the rank may differ over R and C.
(2) There is no straightforward algorithm to compute the rank of a specific tensor
(the problem is NP-hard).
• Matrix decompositions are not unique (why?), whereas higher-order tensor
decompositions are often unique.
Low-Rank Approximations
Matrix: a best rank-k approximation is given by the leading k factors of the SVD. The
rank-k approximation B that minimizes ‖A − B‖ is

B = Σ_{r=1}^k σ_r u_r ∘ v_r, with σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_k > 0

This is not true for higher-order tensors: a best rank-k approximation may not even exist!
(degeneracy)
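For matrices this is the Eckart–Young theorem, easy to check numerically. A sketch with a random matrix: the Frobenius error of the truncated SVD equals the energy in the discarded singular values.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
k = 2

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Best rank-k approximation: keep the leading k singular triplets
B = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Frobenius error equals sqrt of the sum of the discarded squared singular values
err = np.linalg.norm(A - B, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:]**2)))
```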
Computing the CP Decomposition
Suppose R is fixed. For a third-order tensor X, we look for a CP decomposition X̂:

min_{X̂} ‖X − X̂‖  with  X̂ = ⟦λ; A, B, C⟧

The alternating least squares (ALS) method fixes all but one factor matrix at a time:
• Fix C and B (initialized in any way) and rewrite (how?) with Â = AΛ:
  min_Â ‖X_(1) − Â(C ⊙ B)ᵀ‖_F
• Solution: Â = X_(1)[(C ⊙ B)ᵀ]†, or a better version:
  Â = X_(1)(C ⊙ B)(CᵀC ∗ BᵀB)†, where † denotes the Moore–Penrose pseudoinverse
  (MATLAB: B = pinv(A)).
• Normalize the columns of Â to get A, storing the norms in λ.
• Similarly, fix A and C to update B, then fix A and B to update C; ⋯, repeat until
  convergence (i.e. the error stops decreasing).
Implementability? Convergence? Efficiency?
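The ALS loop fits in a few lines of NumPy. A minimal sketch, omitting the normalization step and the convergence checks a real implementation would have (`cp_als` here is an illustrative toy, not the toolbox function of the same name):

```python
import numpy as np

def unfold(X, n):
    return np.reshape(np.moveaxis(X, n, 0), (X.shape[n], -1), order='F')

def khatri_rao(A, B):
    return np.column_stack([np.kron(A[:, r], B[:, r]) for r in range(A.shape[1])])

def cp_als(X, R, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, R)) for s in X.shape)
    for _ in range(n_iter):
        # A-hat = X_(1) (C ⊙ B) (C^T C * B^T B)^†, then the same for B and C
        A = unfold(X, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = unfold(X, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = unfold(X, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Sanity check: recover an exact rank-2 tensor
rng = np.random.default_rng(42)
A0 = rng.standard_normal((4, 2))
B0 = rng.standard_normal((3, 2))
C0 = rng.standard_normal((5, 2))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(X, R=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
```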
Tensor Toolbox for MATLAB
Sandia National Laboratories provides a MATLAB Tensor Toolbox that includes an
implementation of the CP decomposition.
• The cp_als function implements the ALS algorithm for a CP decomposition with fixed
rank, e.g. P = cp_als(X, 2);
• Other approaches are available: cp_apr, cp_nmu, cp_opt, etc.
• Special treatment for sparse tensors: a "greedy" CP.
Applications:
• signal/image processing
• neuroscience
• data mining (user × keyword × time: chatroom tensors)
• stochastic PDEs
• ⋯
THANK YOU!