1 Tensors. Lek-Heng Lim. Statistics Department Retreat. October 27, 2012. Thanks: NSF DMS and DMS.
2 tensors on one foot
a tensor is a multilinear functional f : V_1 × ⋯ × V_d → ℂ
if we give f coordinates, we get a hypermatrix A = (a_{j_1⋯j_d}) ∈ ℂ^{n_1 × ⋯ × n_d}, where n_1 = dim V_1, …, n_d = dim V_d
a d-dimensional hypermatrix represents a d-tensor the same way a matrix represents a 2-tensor (i.e. linear operators, bilinear forms, bivectors)
for more info: P. McCullagh, Tensor Methods in Statistics, Chapman and Hall, London.
plug: L.-H. Lim, Tensors, in L. Hogben (Ed.), Handbook of Linear Algebra, 2nd Ed., CRC Press, Boca Raton, FL.
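In coordinates this is just a d-way array contraction; here is a minimal NumPy sketch (my own illustration, not from the slides) that stores a 3-tensor as a hypermatrix and evaluates the multilinear functional by contracting one vector per mode.

```python
# A sketch: a 3-tensor as a hypermatrix, with the multilinear functional
# evaluated by contracting one vector into each of the three modes.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3 = 2, 3, 4
A = rng.standard_normal((n1, n2, n3))          # hypermatrix of a 3-tensor
x, y, z = rng.standard_normal(n1), rng.standard_normal(n2), rng.standard_normal(n3)

# f(x, y, z) = sum_{i,j,k} a_ijk x_i y_j z_k
f_xyz = np.einsum('ijk,i,j,k->', A, x, y, z)

# multilinearity check: f(2x + 3x', y, z) = 2 f(x,y,z) + 3 f(x',y,z)
x2 = rng.standard_normal(n1)
lhs = np.einsum('ijk,i,j,k->', A, 2.0*x + 3.0*x2, y, z)
rhs = 2.0*f_xyz + 3.0*np.einsum('ijk,i,j,k->', A, x2, y, z)
assert np.isclose(lhs, rhs)
```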
3 where do we find tensors?
higher-order derivatives: f(x) ∈ ℝ, ∇f(x) ∈ ℝⁿ, ∇²f(x) ∈ ℝ^{n×n}, ∇³f(x) ∈ ℝ^{n×n×n}, ∇⁴f(x) ∈ ℝ^{n×n×n×n}, …
multivariate moments and cumulants [Fisher-Wishart, 1932]:
log E(exp(i⟨t, x⟩)) = Σ_{|α|≥1} i^{|α|} κ_α(x) t^α / α!
the coefficients are symmetric tensors: (κ_α(x))_{|α|=1} ∈ ℂ^p, (κ_α(x))_{|α|=2} ∈ ℂ^{p×p}, (κ_α(x))_{|α|=3} ∈ ℂ^{p×p×p}, (κ_α(x))_{|α|=4} ∈ ℂ^{p×p×p×p}, …
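A sketch of these cumulant tensors in practice (toy data of my own choosing, not from the slides): for a centered sample, the second and third cumulants coincide with the central moments, so both can be estimated by averaging outer products.

```python
# A sketch: estimate the second and third cumulant tensors of a sample.
# For mean-zero data, cumulants up to order 3 equal the central moments.
import numpy as np

rng = np.random.default_rng(1)
N, p = 10_000, 3
X = rng.standard_normal((N, p))**3            # a deliberately non-Gaussian sample
Xc = X - X.mean(axis=0)                       # center the sample

K2 = (Xc.T @ Xc) / N                          # covariance, a symmetric 2-tensor
K3 = np.einsum('ni,nj,nk->ijk', Xc, Xc, Xc) / N   # third cumulant, a symmetric 3-tensor

# symmetry check: kappa_ijk is invariant under permuting indices
assert np.allclose(K3, K3.transpose(1, 0, 2))
```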
4 where do we find tensors?
quantum mechanics: H_1, …, H_k state spaces; the state space of the unified system is H = H_1 ⊗ ⋯ ⊗ H_k
H contains the factorizable states ψ_1 ⊗ ⋯ ⊗ ψ_k but also mixed states αψ_1 ⊗ ⋯ ⊗ ψ_k + ⋯ + βϕ_1 ⊗ ⋯ ⊗ ϕ_k
with H_j : H_j → H_j the Hamiltonian of the jth system and I the identity operator,
H_1 ⊗ I ⊗ ⋯ ⊗ I + I ⊗ H_2 ⊗ I ⊗ ⋯ ⊗ I + ⋯ + I ⊗ ⋯ ⊗ I ⊗ H_k
is the Hamiltonian of the unified system, provided the systems do not interact
self-concordance in convex optimization: ∇³f(x) ⊗ ∇³f(x) ⪯ 4 ∇²f(x) ⊗ ∇²f(x) ⊗ ∇²f(x)
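The non-interacting Hamiltonian is a Kronecker sum; below is a small sketch (assumed toy dimensions and random Hermitian H_j, my own illustration) that builds it with np.kron and verifies the textbook fact that its spectrum consists of sums of one-system eigenvalues.

```python
# A sketch: the non-interacting Hamiltonian H1⊗I⊗I + I⊗H2⊗I + I⊗I⊗H3.
import numpy as np

def kron_all(mats):
    out = mats[0]
    for M in mats[1:]:
        out = np.kron(out, M)
    return out

rng = np.random.default_rng(2)
dims = [2, 3, 2]
Hs = [rng.standard_normal((d, d)) for d in dims]
Hs = [(H + H.T) / 2 for H in Hs]               # real symmetric one-system Hamiltonians

terms = []
for j, Hj in enumerate(Hs):
    factors = [np.eye(d) for d in dims]
    factors[j] = Hj                            # H_j sits in slot j, identities elsewhere
    terms.append(kron_all(factors))
H_total = sum(terms)

# eigenvalues of the Kronecker sum are sums of one-system eigenvalues
eig_sum = sorted(a + b + c for a in np.linalg.eigvalsh(Hs[0])
                            for b in np.linalg.eigvalsh(Hs[1])
                            for c in np.linalg.eigvalsh(Hs[2]))
assert np.allclose(sorted(np.linalg.eigvalsh(H_total)), eig_sum)
```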
5 what can we do with a single tensor?
- rank
- hyperdeterminant
- various decompositions
- systems of multilinear equations
- multilinear programming
- multilinear least squares
- eigenvalues and eigenvectors
- singular values and singular vectors
- Gaussian elimination and QR factorization
- nonnegative tensors and Perron-Frobenius theory
- spectral, operator, Hölder, Schatten, Ky Fan norms
- symmetric positive definite tensors and Cholesky decomposition
- linear preservers of rank, hyperdeterminant, singular values, and eigenvalues
6 why study tensors?
a rich source of new problems:
- hypermatrix analogues of matrix notions
- problems trivial for matrices become non-trivial
a rich source of tools for known applications:
- quantum systems
- holographic algorithms
- algebraic complexity of matrix multiplication and inversion
a rich source of tools for new applications:
- causal inference
- phylogenetic inference
- higher order optimization theory
- principal components of higher order moments and cumulants
- spectral hypergraph theory
- encoding NP-hard and #P-hard problems
- multiarray signal processing
- diffusion MRI
caveat: there will be obstacles
7 tensor rank
the rank of A ∈ ℂ^{l×m×n} [Hitchcock, 1927] is
rank(A) := min { r : A = Σ_{i=1}^r σ_i u_i ⊗ v_i ⊗ w_i }
computational complexity: the exponent of Strassen matrix multiplication/inversion,
inf { ω : rank( Σ_{i,j,k=1}^n ϕ_{ik} ⊗ ϕ_{kj} ⊗ E_{ij} ) = O(n^ω) } = 2?
quantum computing: an algebraic measure of entanglement, e.g. |GHZ⟩ = (|000⟩ + |111⟩)/√2 ∈ ℂ² ⊗ ℂ² ⊗ ℂ²
machine learning: naïve Bayes model with latent class h,
Pr(x, y, z) = Σ_{h∈H} Pr(h) Pr(x | h) Pr(y | h) Pr(z | h),
i.e. the graphical model H → X, Y, Z
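A sketch of both rank-r constructions above (illustrative sizes and distributions of my own choosing): a tensor assembled from its CP factors, and the naïve Bayes joint as the same kind of sum of rank-1 terms.

```python
# A sketch: rank-r tensors as sums of rank-1 terms.
import numpy as np

rng = np.random.default_rng(3)
l, m, n, r = 4, 5, 6, 2
U, V, W = rng.standard_normal((l, r)), rng.standard_normal((m, r)), rng.standard_normal((n, r))
sigma = rng.standard_normal(r)

# A = sum_i sigma_i u_i ⊗ v_i ⊗ w_i, so rank(A) <= r by construction
A = np.einsum('r,ir,jr,kr->ijk', sigma, U, V, W)

# naive Bayes: columns are conditional distributions given the latent class h
prior = np.array([0.3, 0.7])                     # Pr(h)
Px = rng.dirichlet(np.ones(4), size=2).T         # Pr(x|h), shape (4, 2)
Py = rng.dirichlet(np.ones(5), size=2).T         # Pr(y|h)
Pz = rng.dirichlet(np.ones(6), size=2).T         # Pr(z|h)
P = np.einsum('h,ih,jh,kh->ijk', prior, Px, Py, Pz)
assert np.isclose(P.sum(), 1.0)                  # a joint distribution of rank <= |H|
```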
8 example: phylogenetic invariants
Markov model for the evolution of a 3-taxon tree [Allman-Rhodes, 2006]
the probability distribution is given by a table with, for i, j, k ∈ {A, C, G, T},
p_{ijk} = π_A ρ_{Ai} σ_{Aj} θ_{Ak} + π_C ρ_{Ci} σ_{Cj} θ_{Ck} + π_G ρ_{Gi} σ_{Gj} θ_{Gk} + π_T ρ_{Ti} σ_{Tj} θ_{Tk},
i.e. the model is P = π_A ρ_A ⊗ σ_A ⊗ θ_A + π_C ρ_C ⊗ σ_C ⊗ θ_C + π_G ρ_G ⊗ σ_G ⊗ θ_G + π_T ρ_T ⊗ σ_T ⊗ θ_T
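A sketch of the 3-taxon table (assumed toy parameters: a root distribution π and row-stochastic edge matrices ρ, σ, θ, not taken from the slides), showing that P is a sum of four rank-1 terms.

```python
# A sketch: build the 4×4×4 table p_ijk = sum_s pi_s rho_si sigma_sj theta_sk,
# where rows of rho, sigma, theta are conditional distributions given root state s.
import numpy as np

rng = np.random.default_rng(4)
pi = rng.dirichlet(np.ones(4))                   # root distribution over {A,C,G,T}
rho, sigma, theta = (rng.dirichlet(np.ones(4), size=4) for _ in range(3))

# a sum of 4 rank-1 terms, so the table P has tensor rank at most 4
P = np.einsum('s,si,sj,sk->ijk', pi, rho, sigma, theta)
assert np.isclose(P.sum(), 1.0)                  # P is a probability distribution
```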
9 multilinear systems and hyperdeterminants
the hyperdeterminant of A = (a_{ijk}) ∈ ℝ^{2×2×2} [Cayley, 1845] is
Det_{2,2,2}(A) = ¼ [ det( [a₀₀₀ a₀₁₀; a₀₀₁ a₀₁₁] + [a₁₀₀ a₁₁₀; a₁₀₁ a₁₁₁] ) − det( [a₀₀₀ a₀₁₀; a₀₀₁ a₀₁₁] − [a₁₀₀ a₁₁₀; a₁₀₁ a₁₁₁] ) ]² − 4 det[a₀₀₀ a₀₁₀; a₀₀₁ a₀₁₁] det[a₁₀₀ a₁₁₀; a₁₀₁ a₁₁₁]
a result that parallels the matrix case: the system of bilinear equations
a₀₀₀x₀y₀ + a₀₁₀x₀y₁ + a₁₀₀x₁y₀ + a₁₁₀x₁y₁ = 0,
a₀₀₁x₀y₀ + a₀₁₁x₀y₁ + a₁₀₁x₁y₀ + a₁₁₁x₁y₁ = 0,
a₀₀₀x₀z₀ + a₀₀₁x₀z₁ + a₁₀₀x₁z₀ + a₁₀₁x₁z₁ = 0,
a₀₁₀x₀z₀ + a₀₁₁x₀z₁ + a₁₁₀x₁z₀ + a₁₁₁x₁z₁ = 0,
a₀₀₀y₀z₀ + a₀₀₁y₀z₁ + a₀₁₀y₁z₀ + a₀₁₁y₁z₁ = 0,
a₁₀₀y₀z₀ + a₁₀₁y₀z₁ + a₁₁₀y₁z₀ + a₁₁₁y₁z₁ = 0
has a non-trivial solution iff Det_{2,2,2}(A) = 0
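The slice formula above translates directly into code; a sketch (my own, using the formula as reconstructed) that evaluates Det_{2,2,2} and checks that it vanishes on a rank-1, hence degenerate, tensor.

```python
# A sketch: Cayley's 2×2×2 hyperdeterminant from the two slices of A.
import numpy as np

def hyperdet222(A):
    """Det_{2,2,2} of A with entries a[i,j,k], indices in {0,1}."""
    M0 = np.array([[A[0,0,0], A[0,1,0]], [A[0,0,1], A[0,1,1]]])  # slice i = 0
    M1 = np.array([[A[1,0,0], A[1,1,0]], [A[1,0,1], A[1,1,1]]])  # slice i = 1
    d = np.linalg.det
    return 0.25*(d(M0 + M1) - d(M0 - M1))**2 - 4*d(M0)*d(M1)

# a rank-1 tensor u⊗v⊗w admits non-trivial solutions of the bilinear system,
# so its hyperdeterminant vanishes
u, v, w = np.array([1., 2.]), np.array([3., -1.]), np.array([0.5, 4.])
A = np.einsum('i,j,k->ijk', u, v, w)
assert np.isclose(hyperdet222(A), 0.0)
```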
10 eigenvalues and singular values of tensors
eigenvalues and singular values are Lagrange multipliers
eigenvalues/vectors of S = (s_{ijk}) ∈ S³(ℂⁿ): critical values/points of the cubic Rayleigh quotient [LHL, 2005; Qi, 2005]
S(x, x, x) = Σ_{i,j,k=1}^n s_{ijk} x_i x_j x_k constrained to the unit l³-sphere ‖x‖₃ = 1
singular values/vectors of A = (a_{ijk}) ∈ ℂ^{l×m×n}: critical values/points of the trilinear Rayleigh quotient [LHL, 2005]
A(x, y, z) = Σ_{i,j,k=1}^{l,m,n} a_{ijk} x_i y_j z_k constrained to a product of unit l³-spheres ‖x‖₃ = ‖y‖₃ = ‖z‖₃ = 1
Perron-Frobenius theorem for nonnegative tensors [LHL, 2005], [Chang-Pearson-Zhang, 2010], [Friedland-Gaubert-Han, 2012]
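A sketch of a power-type iteration for the Perron eigenpair of a nonnegative symmetric 3-tensor, in the spirit of the Perron-Frobenius results cited above (the iteration and toy tensor are my own illustration, not an algorithm from the slides).

```python
# A sketch: with the l^3 constraint, an eigenpair satisfies
# sum_jk s_ijk x_j x_k = lam * x_i^2; iterate x <- (Sxx)^(1/2), renormalized.
import numpy as np

rng = np.random.default_rng(5)
n = 4
S = rng.random((n, n, n))
S = (S + S.transpose(0,2,1) + S.transpose(1,0,2) +
     S.transpose(1,2,0) + S.transpose(2,0,1) + S.transpose(2,1,0)) / 6  # symmetrize

x = np.ones(n) / n**(1/3)                       # positive start with ||x||_3 = 1
for _ in range(200):
    y = np.einsum('ijk,j,k->i', S, x, x)        # (Sxx)_i, stays positive
    x = y**0.5                                  # invert the x -> x^2 on the right side
    x /= np.linalg.norm(x, 3)                   # renormalize onto the l^3 sphere

lam = np.einsum('ijk,i,j,k->', S, x, x, x)      # eigenvalue = S(x,x,x) since ||x||_3 = 1
residual = np.einsum('ijk,j,k->i', S, x, x) - lam * x**2
print(lam, np.max(np.abs(residual)))            # residual ~ 0 at convergence
```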
11 tensor norms
operator norm of A ∈ ℂ^{l×m×n}:
‖A‖_{2,2,2} = max_{x,y,z≠0} |A(x, y, z)| / (‖x‖ ‖y‖ ‖z‖) = σ_max(A),
i.e. it equals the largest singular value of A
Schatten and Ky Fan norms [LHL-Comon, 2012]:
‖A‖_{∗,p} := inf { (Σ_{i=1}^r |λ_i|^p)^{1/p} : A = Σ_{i=1}^r λ_i u_i ⊗ v_i ⊗ w_i, ‖u_i‖ = ‖v_i‖ = ‖w_i‖ = 1, r ∈ ℕ }
one interesting property [LHL-Comon, 2012]: ‖A‖_{∗,1} ≤ rank(A) ‖A‖_{∗,∞},
the analogue of ‖v‖₁ ≤ ‖v‖₀ ‖v‖_∞ for v ∈ ℂⁿ and ‖M‖_∗ ≤ rank(M) ‖M‖₂ for M ∈ ℂ^{m×n}
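Since the operator norm is a maximization over unit vectors, alternating updates give a lower bound; here is a sketch of that standard ALS-style heuristic (my own illustration, not claimed from the slides).

```python
# A sketch: alternate over x, y, z; each update is the maximizing unit vector
# with the other two fixed, so |A(x,y,z)| is nondecreasing.
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((3, 4, 5))
x = rng.standard_normal(3); x /= np.linalg.norm(x)
y = rng.standard_normal(4); y /= np.linalg.norm(y)
z = rng.standard_normal(5); z /= np.linalg.norm(z)

for _ in range(100):
    x = np.einsum('ijk,j,k->i', A, y, z); x /= np.linalg.norm(x)
    y = np.einsum('ijk,i,k->j', A, x, z); y /= np.linalg.norm(y)
    z = np.einsum('ijk,i,j->k', A, x, y); z /= np.linalg.norm(z)

sigma = abs(np.einsum('ijk,i,j,k->', A, x, y, z))
print(sigma)   # a lower bound on ||A||_{2,2,2}; the exact value is NP-hard in general
```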
12 most tensor problems are NP-hard
[figure: complexity-class diagram (P, NP, NP-complete, NP-hard) with matrix problems placed in P and tensor problems among the NP-hard]
- some have no FPTAS
- some are NP-hard even to approximate
- some are #P-hard
- some are undecidable
C.J. Hillar and L.-H. Lim, Most tensor problems are NP-hard, J. Assoc. Comput. Mach., to appear.
13 3-coloring encoded as a tensor problem
[figure: graph on vertices 1-4 with edges {1,2}, {1,3}, {1,4}, {2,3}, {3,4}]
colorings of the graph at left can be encoded as nonzero real solutions of the following square set of n = 35 quadratic polynomials in 35 real unknowns a_i, b_i, c_i, d_i (i = 1, …, 4), u, w_i (i = 1, …, 18):
a_i c_i − b_i d_i − u², b_i c_i + a_i d_i, c_i u − a_i² + b_i², d_i u − 2a_i b_i, a_i u − c_i² + d_i², b_i u − 2c_i d_i, for i = 1, …, 4,
a₁² − b₁² + a₁a₃ − b₁b₃ + a₃² − b₃², a₁² − b₁² + a₁a₄ − b₁b₄ + a₄² − b₄², a₁² − b₁² + a₁a₂ − b₁b₂ + a₂² − b₂², a₂² − b₂² + a₂a₃ − b₂b₃ + a₃² − b₃², a₃² − b₃² + a₃a₄ − b₃b₄ + a₄² − b₄²,
2a₁b₁ + a₁b₂ + a₂b₁ + 2a₂b₂, 2a₂b₂ + a₂b₃ + a₃b₂ + 2a₃b₃, 2a₁b₁ + a₁b₃ + a₃b₁ + 2a₃b₃, 2a₁b₁ + a₁b₄ + a₄b₁ + 2a₄b₄, 2a₃b₃ + a₃b₄ + a₄b₃ + 2a₄b₄,
w₁² + w₂² + ⋯ + w₁₈²
this is equivalent to checking whether the bilinear system
yᵀA_k z = 0, xᵀB_k z = 0, xᵀC_k y = 0, k = 1, …, n,
has a non-trivial solution
14 spectral hypergraph theory
G = (V, E) a 3-hypergraph, V its vertices, E its hyperedges
adjacency hypermatrix A ∈ ℂ^{n×n×n} with a_{ijk} = 1 if [i, j, k] ∈ E and 0 otherwise
Lemma (LHL, 2007). Let G be an m-regular 3-hypergraph and A its adjacency hypermatrix. Then
1. m is an eigenvalue of A;
2. if λ is an eigenvalue of A, then |λ| ≤ m;
3. m has multiplicity 1 if and only if G is connected.
Lemma (LHL, 2007). Let G be a connected m-regular k-partite k-hypergraph on n vertices. Then
1. if k ≡ 1 mod 4, every eigenvalue of A occurs with multiplicity a multiple of k;
2. if k ≡ 3 mod 4, the spectrum of A is symmetric, i.e. λ is an eigenvalue iff −λ is.
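A sketch of the adjacency hypermatrix for a toy 3-regular 3-hypergraph (the complete 3-uniform hypergraph on 4 vertices, my own example); with the 1/(k−1)! normalization common in spectral hypergraph theory, the all-ones vector exhibits the eigenvalue m from the first lemma.

```python
# A sketch: adjacency hypermatrix of a 3-hypergraph, entries symmetric in all
# orderings of each hyperedge.
import numpy as np
from itertools import permutations

n = 4
edges = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]   # every vertex lies in m = 3 edges

A = np.zeros((n, n, n))
for e in edges:
    for i, j, k in permutations(e):        # a_ijk = 1 for every ordering of a hyperedge
        A[i, j, k] = 1.0

# degree of vertex i: each edge containing i contributes 2 ordered pairs (j, k)
deg = A.sum(axis=(1, 2)) / 2
print(deg)                                  # [3. 3. 3. 3.], so G is 3-regular

# with the 1/(k-1)! = 1/2 normalization, Axx = m x^2 at the all-ones vector,
# exhibiting m as an eigenvalue as in the first lemma
x = np.ones(n)
print(np.einsum('ijk,j,k->i', A, x, x) / 2) # each entry = m = 3
```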
15 higher order optimization
first and second order conditions for a local minimum:
necessary: ∇f(x) = 0, ∇²f(x) ⪰ 0; sufficient: ∇f(x) = 0, ∇²f(x) ≻ 0
for a local minimum at x with singular Hessian, wlog
∇²f(x) = [A 0; 0 0], A = diag(a₁₁, …, a_pp), a₁₁, …, a_pp > 0
A ∈ ℝ^{p×p}: (1,1)-block of ∇²f(x)
B̃ ∈ ℝ^{(n−p)×(n−p)×(n−p)}: (2,2,2)-block of ∇³f(x)
B ∈ ℝ^{p×(n−p)×(n−p)}: (1,2,2)-block of ∇³f(x)
C ∈ ℝ^{(n−p)×(n−p)×(n−p)×(n−p)}: (2,2,2,2)-block of ∇⁴f(x)
third and fourth order conditions for a local minimum:
necessary: B̃ = 0, 4C − ⟨A⁻¹, B ⊗ B⟩ ⪰ 0; sufficient: B̃ = 0, 4C − ⟨A⁻¹, B ⊗ B⟩ ≻ 0
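A sketch of the situation these conditions address (my own toy example, not from the slides): at x = 0 the function f(x) = x₁² + x₂⁴ passes the first-order test, its Hessian is PSD but singular, and the third- and fourth-order blocks decide the minimum.

```python
# A sketch: check the first- and second-order conditions numerically at a point
# where the Hessian is singular, so higher-order information is needed.
import numpy as np

def f(x):
    return x[0]**2 + x[1]**4

x0 = np.zeros(2)
h = 1e-5

# central-difference gradient and Hessian
grad = np.array([(f(x0 + h*e) - f(x0 - h*e)) / (2*h) for e in np.eye(2)])
H = np.zeros((2, 2))
for i, ei in enumerate(np.eye(2)):
    for j, ej in enumerate(np.eye(2)):
        H[i, j] = (f(x0 + h*ei + h*ej) - f(x0 + h*ei - h*ej)
                   - f(x0 - h*ei + h*ej) + f(x0 - h*ei - h*ej)) / (4*h*h)

print(grad)                      # ~ 0: first-order condition holds
print(np.linalg.eigvalsh(H))     # ~ (0, 2): PSD but singular, second order is
                                 # inconclusive; here the (2,2,2)-block of grad^3 f
                                 # is 0 and the (2,2,2,2)-block of grad^4 f is
                                 # 24 > 0, so x0 is indeed a local minimum
```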
16 mapping the connectome
identify neural fibers as accurately as possible from diffusion MRI data
[figure: fODF maxima extracted by the Schultz-Seidel method]
17 mapping the connectome
after preprocessing, we may regard the signal as a function f : S² → ℝ
f is a homogeneous polynomial of even degree p,
f(x) = Σ_{j₁,…,j_p=1}^n a_{j₁⋯j_p} x_{j₁} ⋯ x_{j_p} ∈ ℝ[x₁, …, x_n]_p
its coefficients form a hypermatrix A = (a_{j₁⋯j_p}) ∈ ℝ^{n×⋯×n}
the model mandates that f be a sum of powers of linear forms,
f(x) = Σ_{i=1}^r (v_iᵀ x)^p, or equivalently that A have a "Cholesky decomposition" A = Σ_{i=1}^r v_i^{⊗p}
v_i gives the direction of the ith fiber in a voxel
[Schultz-Seidel, 2008], [Schultz-Fuster-Ghosh-Florack-Deriche-LHL, 2012], [LHL-Schultz, 2012]
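A sketch (assumed toy sizes, not real diffusion data) of the "Cholesky decomposition" above: a symmetric 4th-order hypermatrix built as A = Σ v_i^{⊗4}, with the identity f(x) = A(x, x, x, x) = Σ (v_iᵀx)⁴ checked numerically.

```python
# A sketch: a symmetric order-4 hypermatrix as a sum of 4th powers of linear forms.
import numpy as np

rng = np.random.default_rng(7)
n, r, p = 3, 2, 4
V = rng.standard_normal((r, n))                 # rows are the fiber directions v_i

A = np.einsum('ri,rj,rk,rl->ijkl', V, V, V, V)  # A = sum_i v_i ⊗ v_i ⊗ v_i ⊗ v_i

x = rng.standard_normal(n)
fx_tensor = np.einsum('ijkl,i,j,k,l->', A, x, x, x, x)
fx_powers = np.sum((V @ x)**p)                  # sum_i (v_i^T x)^4
assert np.isclose(fx_tensor, fx_powers)
```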
18 principal components for higher-order cumulants
[figure: two scatter plots of component 1 vs component 2; left panel: PCA right singular vectors; right panel: principal kurtosis components]
Figure: variables 17 and 39 non-Gaussian, all others Gaussian; left: 1st vs 2nd principal components; right: 1st vs 2nd principal kurtosis components [LHL-Morton, 2012]
19 multiarray signal processing
the ith sensor, i = 1, …, l, is impinged by r narrowband waves transmitted by independent radiating sources through a linear stationary medium
assumption: the arrays may overlap but differ only by translations
[figure: panels (a), (b), (c) showing translated sensor arrays]
signal received by the ith sensor in the jth array, j = 1, …, m:
s_{i,j}(k) = Σ_{p=1}^r σ_p(t_k) ε_{i,j}(θ_p)
the assumption implies that i and j decouple [LHL-Comon, 2010, 2012]: ε_{i,j}(θ_p) = ε_{i,1}(θ_p) ϕ(j, p)
one may then identify the individual signals using a low-rank tensor approximation
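Under the decoupling assumption the samples s[i, j, k] = Σ_p ε_{i,1}(θ_p) ϕ(j, p) σ_p(t_k) form a rank-r tensor; a sketch with hypothetical data (my own sizes and distributions) confirming the low-rank structure that the identification exploits.

```python
# A sketch: the sensor-by-array-by-time samples as a rank-r tensor.
import numpy as np

rng = np.random.default_rng(8)
l, m, K, r = 6, 5, 50, 3
eps1 = rng.standard_normal((l, r))     # eps_{i,1}(theta_p): reference-array response
phi  = rng.standard_normal((m, r))     # phi(j, p): translation factor of the jth array
sig  = rng.standard_normal((K, r))     # sigma_p(t_k): source waveforms over time

S = np.einsum('ip,jp,kp->ijk', eps1, phi, sig)

# a flattening of a rank-r tensor has matrix rank at most r
assert np.linalg.matrix_rank(S.reshape(l, m*K)) == r
```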
20 more plugs
C.J. Hillar and L.-H. Lim, Most tensor problems are NP-hard, J. Assoc. Comput. Mach., to appear.
L.-H. Lim, Tensors, in L. Hogben (Ed.), Handbook of Linear Algebra, 2nd Ed., CRC Press, Boca Raton, FL.
L.-H. Lim and J. Morton, Principal components of cumulants, preprint, 2012.
L.-H. Lim and P. Comon, Multisensor signal processing: tensor decomposition meets compressed sensing, C. R. Acad. Sci. Paris, 338 (2010), no. 6.
L.-H. Lim and P. Comon, Separable identification, preprint, 2012.
T. Schultz, A. Fuster, A. Ghosh, L. Florack, R. Deriche, and L.-H. Lim, Higher-order tensors in diffusion imaging, in B. Burgeth, A.V. Bartroli, and C.-F. Westin (Eds.), Visualization and Processing of Tensors and Higher Order Descriptors for Multi-Valued Data, Springer-Verlag, Berlin.