Uncorrelated Multilinear Principal Component Analysis through Successive Variance Maximization


Haiping Lu (1), K. N. Plataniotis (1), A. N. Venetsanopoulos (1, 2)
(1) Department of Electrical & Computer Engineering, University of Toronto
(2) Ryerson University
The 25th International Conference on Machine Learning (ICML 2008)

Outline
1. Motivation
2. The Proposed UMPCA Algorithm
   - Tensor-to-Vector Projection
   - Uncorrelated Multilinear PCA (UMPCA)
3. Experimental Evaluations
   - Experimental Setup
   - Experimental Results
4. Conclusions

Tensorial Data. A tensor is a multidimensional array, generalizing the vector (first-order) and the matrix (second-order); a tensor with N modes is an Nth-order tensor. Tensors arise in a wide range of applications: images, video sequences, streaming and mining data.
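To make the terminology concrete, here is a minimal NumPy sketch (the example data is hypothetical, not from the talk): an Nth-order tensor is simply an N-dimensional array, so a grayscale video clip is a natural third-order example.

```python
import numpy as np

# A third-order tensor: a hypothetical grayscale video clip with
# mode 1 = height, mode 2 = width, mode 3 = frame index.
video = np.random.rand(80, 80, 30)

print(video.ndim)   # 3 -> third-order tensor (N = 3 modes)
print(video.shape)  # (80, 80, 30) -> mode dimensions I_1, I_2, I_3
```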

Dimensionality Reduction Problem. Tensor objects are usually high-dimensional, so they suffer from the curse of dimensionality: they are computationally expensive to handle, and many classifiers perform poorly in high-dimensional spaces given a small number of training samples. At the same time, a class of tensor objects is typically highly constrained to a subspace, a manifold of intrinsically low dimension. Dimensionality reduction (feature extraction) therefore seeks a transformation to a low-dimensional space that retains most of the underlying structure.


Focus: Unsupervised Dimensionality Reduction with PCA.
Linear method: principal component analysis (PCA) produces uncorrelated features and retains as much of the variation as possible. However, it reshapes tensors into vectors, which incurs high computational and memory demands and breaks the natural structure of the original data.
Multilinear methods extract features directly from tensors:
- Tensor rank-one decomposition (TROD) [Shashua & Levin, 2001]
- Two-dimensional PCA (2DPCA) [Yang et al., 2004]
- Generalized low rank approximation of matrices (GLRAM) [Ye, 2005] and Generalized PCA (GPCA) [Ye et al., 2004]
- Concurrent subspaces analysis (CSA) [Xu et al., 2005]
- Multilinear PCA (MPCA) [Lu et al., 2008]


Question to be Answered. Uncorrelated features result in minimum redundancy, ensure linear independence among features, and simplify the classification task. Question: can we extract uncorrelated features directly from tensor objects in an unsupervised way?


Notations.
- Vector: lowercase boldface, e.g., $\mathbf{x}$. Matrix: uppercase boldface, e.g., $\mathbf{U}$. Tensor: calligraphic letter, e.g., $\mathcal{A}$.
- Subscript p: the feature index. Subscript m: the training-sample index.
- Superscript (n): the n-mode. Superscript T: the transpose.
- Operation $\times_n$: the n-mode multiplication.

Elementary Multilinear Projection (EMP). An EMP projects a tensor $\mathcal{X}$ to a scalar through one unit projection vector per mode:
$y = \mathcal{X} \times_1 \mathbf{u}^{(1)T} \times_2 \mathbf{u}^{(2)T} \cdots \times_N \mathbf{u}^{(N)T}$
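As a concrete illustration, the EMP can be sketched in NumPy by contracting one mode at a time. The helper emp() below is an assumed name introduced here for clarity, not code from the talk.

```python
import numpy as np

def emp(X, us):
    """Elementary multilinear projection of tensor X to a scalar:
    y = X x_1 u^(1)T x_2 u^(2)T ... x_N u^(N)T, one unit vector per mode."""
    y = X
    for u in us:
        # Contracting the current first mode with u applies u^(n)T;
        # the next mode then becomes the new first mode.
        y = np.tensordot(y, u, axes=([0], [0]))
    return float(y)
```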

Tensor-to-Vector Projection (TVP). A TVP consists of P EMPs, $\{\mathbf{u}_p^{(n)T}, n = 1, \ldots, N\}_{p=1}^{P}$, which map a tensor $\mathcal{X}$ to a P-dimensional vector $\mathbf{y}$ entry by entry:
$\mathbf{y}(p) = \mathcal{X} \times_1 \mathbf{u}_p^{(1)T} \times_2 \mathbf{u}_p^{(2)T} \cdots \times_N \mathbf{u}_p^{(N)T}$
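Building on the emp() sketch above, a TVP is then just P EMPs stacked into one feature vector (tvp() is again a hypothetical helper, not the authors' code).

```python
import numpy as np

def tvp(X, emps):
    """Tensor-to-vector projection: emps is a list of P EMPs, each a list
    of N mode vectors; entry p of the output is the pth EMP of X."""
    return np.array([emp(X, us) for us in emps])
```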


The UMPCA Problem Formulation.
Input: tensorial training samples $\{\mathcal{X}_m \in \mathbb{R}^{I_1 \times \cdots \times I_N}, m = 1, \ldots, M\}$ and the desired subspace dimensionality P.
Objective:
$\{\mathbf{u}_p^{(n)}, n = 1, \ldots, N\} = \arg\max S_{T_p}^y$, where $S_{T_p}^y = \sum_{m=1}^{M} (y_{m_p} - \bar{y}_p)^2$,
subject to $\mathbf{u}_p^{(n)T} \mathbf{u}_p^{(n)} = 1$ and $\frac{\mathbf{g}_p^T \mathbf{g}_q}{\|\mathbf{g}_p\| \|\mathbf{g}_q\|} = \delta_{pq}$, $p, q = 1, \ldots, P$,
where $\mathbf{g}_p$ is the pth coordinate vector: $\mathbf{g}_p(m) = y_{m_p} = \mathcal{X}_m \times_1 \mathbf{u}_p^{(1)T} \cdots \times_N \mathbf{u}_p^{(N)T}$.
Output: the TVP $\{\mathbf{u}_p^{(n)T}, n = 1, \ldots, N\}_{p=1}^{P}$ satisfying the objective criterion.

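To connect the formulation to code, here is a small sketch (hypothetical helpers, reusing emp() from above) that forms the coordinate vectors $\mathbf{g}_p$ over a training set and checks the zero-correlation constraint exactly as stated on this slide.

```python
import numpy as np

def coordinate_matrix(samples, emps):
    """G in R^{M x P}: column p is the coordinate vector g_p,
    with g_p(m) the pth EMP applied to sample m."""
    return np.array([[emp(X, us) for us in emps] for X in samples])

def satisfies_zero_correlation(G, tol=1e-8):
    """Check g_p^T g_q / (||g_p|| ||g_q||) = delta_pq for all p, q."""
    Gn = G / np.linalg.norm(G, axis=0, keepdims=True)
    return np.allclose(Gn.T @ Gn, np.eye(G.shape[1]), atol=tol)
```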

The Approach of Successive Maximization.
1. Determine the first EMP $\{\mathbf{u}_1^{(n)T}, n = 1, \ldots, N\}$ by maximizing $S_{T_1}^y$ without any constraint.
2. Determine the second EMP $\{\mathbf{u}_2^{(n)T}, n = 1, \ldots, N\}$ by maximizing $S_{T_2}^y$ subject to the constraint $\mathbf{g}_2^T \mathbf{g}_1 = 0$.
3. Determine the third EMP $\{\mathbf{u}_3^{(n)T}, n = 1, \ldots, N\}$ by maximizing $S_{T_3}^y$ subject to the constraints $\mathbf{g}_3^T \mathbf{g}_1 = 0$ and $\mathbf{g}_3^T \mathbf{g}_2 = 0$.
4. Continue in this way until the Pth EMP is determined.

The Alternating Projection Method.
The need for an iterative solution: simultaneous determination of the N sets of projection vectors is infeasible. Alternating projection solves for one set with all the other sets fixed, then iterates (cf. alternating least squares).
The method, for mode $n^*$ (see the sketch after this list):
1. Assume $\{\mathbf{u}_p^{(n)}, n \neq n^*\}$ is given.
2. Project $\{\mathcal{X}_m\}$ in these $N - 1$ modes to obtain the vectors $\{\tilde{\mathbf{y}}_{m_p}^{(n^*)}\}$.
3. Determine the $\mathbf{u}_p^{(n^*)}$ that projects $\{\tilde{\mathbf{y}}_{m_p}^{(n^*)}\}$ onto a line with maximum variance, subject to the zero-correlation constraint. This is PCA with input $\{\tilde{\mathbf{y}}_{m_p}^{(n^*)}\}$ and total scatter matrix $\tilde{\mathbf{S}}_{T_p}^{(n^*)}$.

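Step 2 of the method (projecting each sample in all modes but $n^*$) can be sketched as follows; partial_projection is a hypothetical helper under the same NumPy conventions as before.

```python
import numpy as np

def partial_projection(X, us, n_star):
    """Project X in every mode except n_star, leaving a vector of length
    I_{n_star}. us[n] is the mode-n vector; us[n_star] is ignored."""
    y = X
    removed = 0
    for n, u in enumerate(us):
        if n == n_star:
            continue
        # Original mode n currently sits at axis (n - removed), since
        # 'removed' earlier modes have already been contracted away.
        y = np.tensordot(y, u, axes=([n - removed], [0]))
        removed += 1
    return y
```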

The UMPCA Solution.
For p = 1: $\mathbf{u}_1^{(n^*)}$ is the unit eigenvector of $\tilde{\mathbf{S}}_{T_1}^{(n^*)}$ associated with the largest eigenvalue.
For p > 1:
1. Let $\tilde{\mathbf{Y}}_p^{(n^*)} = [\tilde{\mathbf{y}}_{1_p}^{(n^*)}, \tilde{\mathbf{y}}_{2_p}^{(n^*)}, \ldots, \tilde{\mathbf{y}}_{M_p}^{(n^*)}]$.
2. Reformulate: $\mathbf{u}_p^{(n^*)} = \arg\max \mathbf{u}_p^{(n^*)T} \tilde{\mathbf{S}}_{T_p}^{(n^*)} \mathbf{u}_p^{(n^*)}$, subject to $\mathbf{u}_p^{(n^*)T} \mathbf{u}_p^{(n^*)} = 1$ and $\mathbf{u}_p^{(n^*)T} \tilde{\mathbf{Y}}_p^{(n^*)} \mathbf{g}_q = 0$, $q = 1, \ldots, p - 1$.

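For p = 1 this is ordinary PCA on the partially projected vectors, so a sketch only needs the mode-wise total scatter matrix and its leading unit eigenvector. Both helpers below are assumed names, with samples stacked as columns.

```python
import numpy as np

def mode_scatter(Y):
    """Total scatter of the columns of Y (I_n x M):
    S = sum_m (y_m - mean)(y_m - mean)^T."""
    Yc = Y - Y.mean(axis=1, keepdims=True)
    return Yc @ Yc.T

def leading_eigvec(S):
    """Unit eigenvector of the symmetric matrix S with largest eigenvalue."""
    _, V = np.linalg.eigh(S)  # eigh sorts eigenvalues in ascending order
    return V[:, -1]
```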

Solution for p > 1.
Theorem. The solution to the UMPCA problem for p > 1 is the unit-length eigenvector corresponding to the largest eigenvalue of the eigenvalue problem
$\Psi_p^{(n^*)} \tilde{\mathbf{S}}_{T_p}^{(n^*)} \mathbf{u} = \lambda \mathbf{u}$,
where
$\Psi_p^{(n^*)} = \mathbf{I}_{I_{n^*}} - \tilde{\mathbf{Y}}_p^{(n^*)} \mathbf{G}_{p-1} \Phi_p^{-1} \mathbf{G}_{p-1}^T \tilde{\mathbf{Y}}_p^{(n^*)T}$,
$\Phi_p = \mathbf{G}_{p-1}^T \tilde{\mathbf{Y}}_p^{(n^*)T} \tilde{\mathbf{Y}}_p^{(n^*)} \mathbf{G}_{p-1}$,
$\mathbf{G}_{p-1} = [\mathbf{g}_1\ \mathbf{g}_2 \ldots \mathbf{g}_{p-1}] \in \mathbb{R}^{M \times (p-1)}$,
and $\mathbf{I}_{I_n}$ is the $I_n \times I_n$ identity matrix.

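A direct sketch of the theorem, reusing mode_scatter() from above (hedged helper, not the reference implementation): build $\Psi_p$, then take the leading eigenvector of $\Psi_p \tilde{\mathbf{S}}_{T_p}$, which is generally non-symmetric.

```python
import numpy as np

def constrained_mode_vector(Y_tilde, G_prev):
    """u_p^(n*) for p > 1: leading unit eigenvector of Psi_p S_p, with
    Y_tilde = [y_1 ... y_M] (I_n x M) and G_prev = [g_1 ... g_{p-1}]."""
    I = np.eye(Y_tilde.shape[0])
    YG = Y_tilde @ G_prev                      # I_n x (p-1)
    Phi = YG.T @ YG                            # (p-1) x (p-1)
    Psi = I - YG @ np.linalg.solve(Phi, YG.T)  # deflates constrained directions
    S = mode_scatter(Y_tilde)
    w, V = np.linalg.eig(Psi @ S)              # Psi @ S is not symmetric
    u = np.real(V[:, np.argmax(np.real(w))])
    return u / np.linalg.norm(u)
```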


Experimental Setup.
Data: a subset of the FERET face database.
- Maximum pose variation: 15 degrees.
- Minimum number of face images per subject: 8.
- 721 face images from 70 subjects.
- Preprocessing: manually cropped and aligned, normalized to 80 × 80 pixels with 256 gray levels.
Classification: nearest-neighbor classifier with the Euclidean distance measure. Performance metric: rank-1 identification rate.

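The evaluation protocol is simple enough to sketch directly; rank1_id_rate is a hypothetical helper implementing the nearest-neighbor classifier with Euclidean distance described above.

```python
import numpy as np

def rank1_id_rate(gallery_feats, gallery_ids, probe_feats, probe_ids):
    """Rank-1 identification rate: each probe is assigned the identity of
    its Euclidean nearest neighbor in the gallery."""
    correct = 0
    for f, true_id in zip(probe_feats, probe_ids):
        d = np.linalg.norm(gallery_feats - f, axis=1)  # distances to gallery
        correct += int(gallery_ids[int(np.argmin(d))] == true_id)
    return correct / len(probe_ids)
```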


Recognition Results for L = 1. Here L is the number of training samples per subject, so L = 1 is an extreme small-sample-size scenario. UMPCA outperforms the other three methods.

Recognition Results for L = 7. UMPCA again outperforms the other three methods.

Examination of Variation Captured (panels: L = 1 and L = 7). The variation captured by UMPCA is much lower than that of the other methods, due to the zero-correlation constraint and the TVP. When the captured variation becomes too low, the corresponding features contribute little to recognition.

Examination of Correlations Among Features (panels: L = 1 and L = 7). PCA and UMPCA produce uncorrelated features; MPCA and TROD produce correlated features.

Summary
- UMPCA: uncorrelated feature extraction directly from tensor objects through a TVP.
- The solution to UMPCA: successive variance maximization combined with the alternating projection method.
- Evaluation: UMPCA outperforms PCA, MPCA, and TROD in an unsupervised face recognition task, especially at lower feature dimensions.

Future Work
- Application to other unsupervised learning tasks, e.g., clustering.
- Investigation of design issues: initialization, projection order, and termination.
- Combination of UMPCA features with PCA, MPCA, or TROD features.

Backup Slides: Basic Operations.
n-mode product: $(\mathcal{A} \times_n \mathbf{U})(i_1, \ldots, i_{n-1}, j_n, i_{n+1}, \ldots, i_N) = \sum_{i_n} \mathcal{A}(i_1, \ldots, i_N) \cdot \mathbf{U}(j_n, i_n)$.
Scalar product: $\langle \mathcal{A}, \mathcal{B} \rangle = \sum_{i_1} \cdots \sum_{i_N} \mathcal{A}(i_1, \ldots, i_N) \cdot \mathcal{B}(i_1, \ldots, i_N)$.
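Both operations have short NumPy counterparts; mode_n_product below is a hypothetical helper matching the definition above (tensordot places the new mode last, so it is moved back to position n).

```python
import numpy as np

def mode_n_product(A, U, n):
    """(A x_n U): contract mode n of A with axis 1 of U, so the result's
    mode n has size U.shape[0], per the definition above."""
    return np.moveaxis(np.tensordot(A, U, axes=([n], [1])), -1, n)

def scalar_product(A, B):
    """<A, B>: entrywise product summed over all modes."""
    return float(np.sum(A * B))
```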

Backup Slides: Design Issues in UMPCA.
- Initialization: the all-ones vector 1, normalized to unit length.
- Projection order: from mode 1 to mode N.
- Termination: a maximum number of iterations K.
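Putting the pieces together, here is a compact end-to-end sketch of UMPCA under these design choices, reusing the hypothetical helpers defined earlier (emp, partial_projection, mode_scatter, leading_eigvec, constrained_mode_vector). It follows the slides' formulation and is a simplification, not the authors' reference implementation.

```python
import numpy as np

def umpca(samples, P, K=10):
    """Extract P EMPs from the training tensors by successive variance
    maximization with alternating projection."""
    N, dims, M = samples[0].ndim, samples[0].shape, len(samples)
    emps, G = [], np.empty((M, 0))
    for p in range(P):
        # Initialization: normalized all-ones vector in each mode.
        us = [np.ones(I) / np.sqrt(I) for I in dims]
        for _ in range(K):             # terminate after K iterations
            for n in range(N):         # projection order: mode 1 to N
                Y = np.stack([partial_projection(X, us, n) for X in samples],
                             axis=1)   # I_n x M partial projections
                us[n] = (leading_eigvec(mode_scatter(Y)) if p == 0
                         else constrained_mode_vector(Y, G))
        g = np.array([emp(X, us) for X in samples])  # pth coordinate vector
        G = np.column_stack([G, g])
        emps.append(us)
    return emps  # feed to tvp() to extract features from a new tensor
```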