Eigenfaces: Face Recognition Using Principal Components Analysis


Eigenfaces: Face Recognition Using Principal Components Analysis. M. Turk and A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, 3(1), pp. 71-86, 1991. Slides: George Bebis, UNR.

Principal Component Analysis (PCA) Pattern recognition in high-dimensional spaces: Problems arise when performing recognition in a high-dimensional space (the curse of dimensionality). Significant improvements can be achieved by first mapping the data into a lower-dimensional subspace. The goal of PCA is to reduce the dimensionality of the data while retaining as much of the information in the original dataset as possible, with no redundancy.

Principal Component Analysis (PCA) Dimensionality reduction: PCA allows us to compute a linear transformation that maps data from a high-dimensional space to a lower-dimensional subspace.

Principal Component Analysis (PCA) Lower-dimensional basis: Approximate vectors by finding a basis in an appropriate lower-dimensional space. (1) Higher-dimensional space representation: $x = a_1 v_1 + a_2 v_2 + \cdots + a_N v_N$, where $v_1, v_2, \ldots, v_N$ is a basis of the $N$-dimensional space. (2) Lower-dimensional space representation: $\hat{x} = b_1 u_1 + b_2 u_2 + \cdots + b_K u_K$, where $u_1, u_2, \ldots, u_K$ is a basis of the $K$-dimensional subspace ($K \ll N$).

Principal Component Analysis (PCA) Information loss: Dimensionality reduction implies information loss! We want to preserve as much information as possible, that is, to minimize the error introduced by the approximation. How do we determine the best lower-dimensional subspace?

Principal Component Analysis (PCA) Methodology: Suppose $x_1, x_2, \ldots, x_M$ are $N \times 1$ vectors.
Step 1: compute the sample mean $\bar{x} = \frac{1}{M} \sum_{i=1}^{M} x_i$.
Step 2: subtract the mean: $\Phi_i = x_i - \bar{x}$.
Step 3: form the $N \times M$ matrix $A = [\Phi_1\ \Phi_2\ \cdots\ \Phi_M]$ and compute the covariance matrix $C = \frac{1}{M} \sum_{i=1}^{M} \Phi_i \Phi_i^T = \frac{1}{M} A A^T$.
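A minimal NumPy sketch of steps 1-3, assuming the data are stacked as columns of an $N \times M$ matrix (all variable names here are illustrative, not from the slides):

```python
import numpy as np

# Toy data: M sample vectors of dimension N, stacked as columns (N x M).
N, M = 64, 100
rng = np.random.default_rng(0)
X = rng.standard_normal((N, M))

x_bar = X.mean(axis=1, keepdims=True)  # Step 1: sample mean (N x 1)
A = X - x_bar                          # Step 2: mean-subtracted vectors Phi_i
C = (A @ A.T) / M                      # Step 3: N x N covariance, 1/M normalization
```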

Principal Component Analysis (PCA) Methodology cont.
Step 4: compute the eigenvalues of $C$: $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_N$.
Step 5: compute the eigenvectors of $C$: $u_1, u_2, \ldots, u_N$. Since $C$ is symmetric, $u_1, \ldots, u_N$ form a basis, so any vector can be expanded in it: $x - \bar{x} = \sum_{i=1}^{N} b_i u_i$.
Step 6 (dimensionality reduction): keep only the terms corresponding to the $K$ largest eigenvalues: $\hat{x} - \bar{x} = \sum_{i=1}^{K} b_i u_i$, where $K \ll N$.

Principal Component Analysis (PCA) Linear transformation implied by PCA: The linear transformation $R^N \to R^K$ that performs the dimensionality reduction is $y = [b_1\ b_2\ \cdots\ b_K]^T = U^T (x - \bar{x})$, where $U = [u_1\ u_2\ \cdots\ u_K]$, i.e., we are simply computing the coefficients $b_i = u_i^T (x - \bar{x})$ of the linear expansion. The above expression assumes that each $u_i$ has unit length (i.e., is normalized).
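Continuing the sketch above, the eigen-decomposition and the projection $y = U^T(x - \bar{x})$ might look like this (again, names are illustrative):

```python
# Eigen-decomposition of the symmetric covariance matrix.
eigvals, eigvecs = np.linalg.eigh(C)      # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]         # re-sort: largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

K = 10
U = eigvecs[:, :K]                        # u_1..u_K, unit-length columns
x = X[:, [0]]                             # an example vector to project
y = U.T @ (x - x_bar)                     # K x 1 coefficients b_i = u_i^T (x - x_bar)
```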

Principal Component Analysis (PCA) Geometric interpretation: PCA projects the data along the directions where the data varies the most. These directions are determined by the eigenvectors of the covariance matrix corresponding to the largest eigenvalues. The magnitude of the eigenvalues corresponds to the variance of the data along the eigenvector directions.

Principal Component Analysis (PCA) How to choose the principal components? To choose $K$, use the following criterion: $\frac{\sum_{i=1}^{K} \lambda_i}{\sum_{i=1}^{N} \lambda_i} > T$, where $T$ is a threshold such as 0.9 or 0.95. In this case, we say that we preserve 90% or 95% of the information (variance) in our data. If $K = N$, then we preserve 100% of the information in our data.
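As a sketch, the smallest $K$ that preserves, say, 95% of the variance can be read off the cumulative eigenvalue sum:

```python
# Smallest K whose leading eigenvalues cover 95% of the total variance.
ratio = np.cumsum(eigvals) / eigvals.sum()
K = int(np.searchsorted(ratio, 0.95)) + 1
```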

Principal Component Analysis (PCA) Error due to dimensionality reduction: The original vector $x$ can be reconstructed using its principal components: $\hat{x} - \bar{x} = \sum_{i=1}^{K} b_i u_i$. It can be shown that the low-dimensional basis based on principal components minimizes the reconstruction error $e = \|x - \hat{x}\|$. It can also be shown that the mean squared error equals the sum of the discarded eigenvalues: $e = \sum_{i=K+1}^{N} \lambda_i$.
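A sketch of the reconstruction and a numerical check of the error identity, continuing the running example (the identity holds with the $\frac{1}{M}$ covariance normalization used above):

```python
U = eigvecs[:, :K]                       # basis for the chosen K
x_hat = x_bar + U @ (U.T @ (x - x_bar))  # reconstruction of x

# Mean squared reconstruction error over the training set
# equals the sum of the discarded eigenvalues.
mse = np.mean(np.sum((A - U @ (U.T @ A)) ** 2, axis=0))
assert np.isclose(mse, eigvals[K:].sum())
```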

Principal Component Analysis (PCA) Standardization: The principal components are dependent on the units used to measure the original variables as well as on the range of values they assume. We should always standardize the data prior to using PCA. A common standardization method is to transform all the data to have zero mean and unit standard deviation: $\frac{x_i - \mu}{\sigma}$, where $\mu$ and $\sigma$ are the mean and standard deviation of the variable.
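A per-variable standardization sketch, assuming (as above) that each row of `X` is one variable measured $M$ times:

```python
# Transform each variable (row) to zero mean and unit standard deviation.
mu = X.mean(axis=1, keepdims=True)
sigma = X.std(axis=1, keepdims=True)
X_std = (X - mu) / sigma
```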

Application to Faces Computation of the low-dimensional basis (i.e., the eigenfaces):
Step 1: obtain training face images $I_1, I_2, \ldots, I_M$ (centered and of the same size).
Step 2: represent each image $I_i$ as a vector $\Gamma_i$ (by stacking its rows or columns).

Application to Faces Computation of the eigenfaces cont.
Step 3: compute the average face vector $\Psi = \frac{1}{M} \sum_{i=1}^{M} \Gamma_i$.
Step 4: subtract the mean face: $\Phi_i = \Gamma_i - \Psi$.

Application to Faces Computation of the eigenfaces cont.
Step 5: form the $N^2 \times M$ matrix $A = [\Phi_1\ \Phi_2\ \cdots\ \Phi_M]$ and consider the covariance matrix $C = A A^T$ ($N^2 \times N^2$). Computing the eigenvectors of $A A^T$ ($A A^T u_i = \lambda_i u_i$) directly is impractical because the matrix is huge. IMPORTANT! Instead, compute the eigenvectors $v_i$ of the much smaller $M \times M$ matrix $A^T A$: $A^T A v_i = \mu_i v_i$. Then $u_i = A v_i$ and $\lambda_i = \mu_i$, i.e., $A A^T$ and $A^T A$ share their nonzero eigenvalues, and the eigenvectors are related by $u_i = A v_i$.
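A self-contained sketch of this small-matrix trick: diagonalize the $M \times M$ matrix $A^T A$ and map its eigenvectors back through $A$ (toy data and names are my own; in practice $A$ holds the mean-subtracted face columns $\Phi_i$):

```python
import numpy as np

# Toy "face" data: M mean-subtracted image vectors of dimension N2 >> M,
# stacked as columns of A (in practice N2 = image width * height).
N2, M = 4096, 20
rng = np.random.default_rng(0)
A = rng.standard_normal((N2, M))
A -= A.mean(axis=1, keepdims=True)

L = A.T @ A                               # M x M instead of N2 x N2
mu_vals, V = np.linalg.eigh(L)
order = np.argsort(mu_vals)[::-1]         # largest eigenvalue first
mu_vals, V = mu_vals[order], V[:, order]

keep = mu_vals > 1e-10                    # drop near-zero eigenvalues (rank <= M-1)
U = A @ V[:, keep]                        # u_i = A v_i, eigenvectors of A A^T
U /= np.linalg.norm(U, axis=0)            # normalize so ||u_i|| = 1
```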

Application to Faces Computation of the eigenfaces cont.
Step 6: compute the eigenvectors of $A A^T$ as $u_i = A v_i$ and normalize them so that $\|u_i\| = 1$.
Step 7: keep only the $K$ eigenvectors corresponding to the $K$ largest eigenvalues; these are the eigenfaces.

Eigenfaces example: Training images

Eigenfaces example: Top eigenvectors $u_1, \ldots, u_K$. Mean face: $\mu$.

Application to Faces Representing faces in this basis: each normalized face $\Phi_i$ can be represented as a linear combination of the $K$ eigenfaces, $\hat{\Phi}_i = \sum_{j=1}^{K} w_j u_j$, where $w_j = u_j^T \Phi_i$. Face reconstruction: $\hat{\Gamma}_i = \Psi + \sum_{j=1}^{K} w_j u_j$.
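A sketch of projecting a face onto the eigenface basis and reconstructing it, assuming `psi` is the mean face vector, `gamma` a face vector, and `U` the matrix of the top $K$ eigenfaces (all names are hypothetical):

```python
import numpy as np

def project_and_reconstruct(gamma, psi, U):
    """Project a face vector onto the eigenface basis and reconstruct it."""
    phi = gamma - psi            # normalize: subtract the mean face
    w = U.T @ phi                # weights w_j = u_j^T Phi
    gamma_hat = psi + U @ w      # reconstruction Gamma_hat
    return w, gamma_hat
```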

Eigenfaces Case Study: Eigenfaces for Face Detection/Recognition. M. Turk and A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, no. 1, pp. 71-86, 1991. Face Recognition: The simplest approach is to treat it as a template matching problem. Problems arise when performing recognition in a high-dimensional space. Significant improvements can be achieved by first mapping the data into a lower-dimensional space.

Face Recognition Using Eigenfaces: Given an unknown face image $\Gamma$ (centered and of the same size as the training faces), normalize it: $\Phi = \Gamma - \Psi$. Project $\Phi$ onto the eigenspace: $\hat{\Phi} = \sum_{i=1}^{K} w_i u_i$, where $w_i = u_i^T \Phi$ (and $\|u_i\| = 1$). Represent the face by the weight vector $\Omega = [w_1, w_2, \ldots, w_K]^T$. Find the closest stored face class $l$ by minimizing $e_r = \min_l \|\Omega - \Omega^l\|$, where $\|\Omega - \Omega^l\|^2 = \sum_{i=1}^{K} (w_i - w_i^l)^2$; recognize $\Gamma$ as face $l$ if $e_r$ is below a threshold.

Eigenfaces Face Recognition Using Eigenfaces cont. The distance $e_r$ is called the distance within face space (difs). The Euclidean distance can be used to compute $e_r$; however, the Mahalanobis distance has been shown to work better: $\|\Omega - \Omega^l\|^2 = \sum_{i=1}^{K} \frac{1}{\lambda_i} (w_i - w_i^l)^2$.
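A sketch of the two difs variants, where `w` and `w_l` stand for the weight vectors $\Omega$ and $\Omega^l$ and `lam` holds the top $K$ eigenvalues (names are my own):

```python
import numpy as np

def difs_euclidean(w, w_l):
    """Squared Euclidean distance within face space."""
    return np.sum((w - w_l) ** 2)

def difs_mahalanobis(w, w_l, lam):
    """Mahalanobis distance: weight each component by 1/lambda_i."""
    return np.sum((w - w_l) ** 2 / lam)
```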

[Figure: face detection locates the faces in an image; face recognition then assigns an identity (e.g., "Sally").]

Face Detection Using Eigenfaces: Given an unknown image $\Gamma$, compute $\Phi = \Gamma - \Psi$ and its projection onto face space, $\hat{\Phi} = \sum_{i=1}^{K} w_i u_i$ (where $\|u_i\| = 1$). The distance $e_d = \|\Phi - \hat{\Phi}\|$ is called the distance from face space (dffs); $\Gamma$ is classified as a face if $e_d$ is below a threshold.
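A sketch of the dffs computation under the same assumptions as the earlier snippets (`psi` the mean face, `U` the top $K$ unit-norm eigenfaces; names are hypothetical):

```python
import numpy as np

def dffs(gamma, psi, U):
    """Distance from face space: norm of the residual after projection."""
    phi = gamma - psi
    phi_hat = U @ (U.T @ phi)    # projection onto the eigenface subspace
    return np.linalg.norm(phi - phi_hat)
```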