The Mathematics of Facial Recognition


William Dean Gowin, Graduate Student, Appalachian State University, July 26, 2007


EigenFaces: Deconstruct a known face into an N-dimensional face space, where N is the number of faces in our data set. Deconstruct new faces and check how close they lie in face space to the known faces.
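To make the matching step concrete, here is a minimal NumPy sketch of nearest-face matching in face space. The names faces, eigenfaces, and mean_face are illustrative placeholders (the basis itself would be computed as in the PCA slides below), not code from the original talk:

```python
import numpy as np

def match_face(new_face, faces, eigenfaces, mean_face):
    """Project a new face into face space and return the index of the
    closest known face (smallest Euclidean distance)."""
    # Coordinates of every known face in face space (one column each).
    known = eigenfaces.T @ (faces - mean_face[:, None])
    # Coordinates of the new face in the same space.
    new = eigenfaces.T @ (new_face - mean_face)
    # Distance from the new face to each known face.
    return int(np.argmin(np.linalg.norm(known - new[:, None], axis=0)))
```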

Objectives: Explain Principal Component Analysis (PCA) and the Singular Value Decomposition (SVD). Test for visual differences between the PCA and SVD face bases and compare their computational efficiency.


PCA is utilized in genomic studies, biochemistry, neurology, and biometrics. The goal of PCA is to find the most important basis for representing a large data set, one that filters out redundant or extraneous information.

PCA reduces the dimensionality of the data set and filters out redundant data. Calculate the eigenvalues and eigenvectors of the covariance matrix AAᵀ. The k largest eigenvalues and their corresponding eigenvectors represent the most important patterns or characteristics in the data; that is, the original faces can be reproduced reasonably well with only these k eigenvectors.
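A hedged sketch of these steps in NumPy (names are illustrative; `data` holds one observation per column):

```python
import numpy as np

def pca_basis(data, k):
    """Return the k eigenvectors of the covariance matrix AA^T with the
    largest eigenvalues, plus the mean that was subtracted."""
    mean = data.mean(axis=1, keepdims=True)
    A = data - mean                              # mean-centered data
    eigvals, eigvecs = np.linalg.eigh(A @ A.T)   # eigh: symmetric matrices
    order = np.argsort(eigvals)[::-1][:k]        # k largest eigenvalues first
    return eigvecs[:, order], mean
```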

A Simple Example of PCA

Subtract the means, then calculate the eigenvalues and eigenvectors of the covariance matrix. The original points (x, y) and the points restored from the first principal component are:

x     y        x_restore  y_restore
2.6   2.3      2.3623     2.5080
0.6   0.7      0.7019     0.6108
2.2   2.8      2.4366     2.5929
1.9   2.1      1.9596     2.0478
3.2   3.2      3.0686     3.3150
2.3   2.7      2.4305     2.5858
2.0   1.6      1.7552     1.8143
1.0   1.2      1.1232     1.0922
1.4   1.6      1.4949     1.5169
1.1   0.8      0.9683     0.9153
2.8   3.2      2.8951     3.1167
1.3   1.1      1.2037     1.1842
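The restored values above can be reproduced with a short script: subtract the means, diagonalize the covariance matrix, keep only the leading eigenvector, and rebuild each point from its single coordinate along it. A sketch, not the author's original code:

```python
import numpy as np

xy = np.array([[2.6, 2.3], [0.6, 0.7], [2.2, 2.8], [1.9, 2.1],
               [3.2, 3.2], [2.3, 2.7], [2.0, 1.6], [1.0, 1.2],
               [1.4, 1.6], [1.1, 0.8], [2.8, 3.2], [1.3, 1.1]]).T

mean = xy.mean(axis=1, keepdims=True)
A = xy - mean
eigvals, eigvecs = np.linalg.eigh(A @ A.T)
u = eigvecs[:, np.argmax(eigvals)]       # leading principal direction
restored = mean + np.outer(u, u @ A)     # project onto u, then restore
print(restored.T)                        # close to the restored columns above
```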

Let A be an n² × m matrix, where m is the number of images in the data set, each image is n × n, and n² ≫ m. AAᵀ has n² eigenvalues λᵢ and eigenvectors uᵢ, but n² is much too large to compute, making the problem intractable. AᵀA has only m eigenvalues and eigenvectors vᵢ, and its eigenvalues are the m largest eigenvalues of AAᵀ. The eigenvectors of AAᵀ are related to those of AᵀA by uᵢ = Avᵢ. References: M. Turk and A. Pentland, Eigenfaces for Recognition, Journal of Cognitive Neuroscience, Vol. 3, No. 1, pp. 71-86, 1991.
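A sketch of this trick in NumPy (function and variable names are illustrative):

```python
import numpy as np

def eigenfaces_small(A, k):
    """Diagonalize the small m x m matrix A^T A instead of the huge
    n^2 x n^2 matrix AA^T, then lift the eigenvectors via u_i = A v_i."""
    eigvals, V = np.linalg.eigh(A.T @ A)     # m x m problem only
    order = np.argsort(eigvals)[::-1][:k]    # k largest eigenvalues
    U = A @ V[:, order]                      # u_i = A v_i
    return U / np.linalg.norm(U, axis=0)     # normalize each eigenface
```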


SVD: Every m × n matrix can be factored into a singular value decomposition (SVD) of the form A = UΣVᵀ, where the columns of U and V are the left and right (respectively) singular vectors of A, both U and V are orthogonal matrices (Uᵀ = U⁻¹, Vᵀ = V⁻¹), and Σ = diag(σ₁, σ₂, ..., σₙ), where the σᵢ are the non-negative singular values of A. References: Carl Meyer, Matrix Analysis and Applied Linear Algebra, SIAM, Philadelphia, 2000. Jonathon Shlens, A Tutorial on Principal Component Analysis, http://www.snl.salk.edu/~shlens/notes.html
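In NumPy this factorization is a single call; the checks below confirm the properties just listed (a sketch with random data, assuming m ≥ n as in the text):

```python
import numpy as np

A = np.random.rand(6, 4)                       # any m x n matrix, m >= n
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)     # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(4))         # columns of U are orthonormal
assert np.allclose(Vt @ Vt.T, np.eye(4))       # V is orthogonal
assert np.all(s >= 0)                          # singular values non-negative
```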

To see how the singular values relate to the eigenvalues, the SVD of A can be rewritten as AV = UΣ and AᵀU = VΣᵀ, which leads to:

Avᵢ = σᵢuᵢ
AᵀAvᵢ = Aᵀσᵢuᵢ = σᵢAᵀuᵢ = σᵢ²vᵢ
AᵀAvᵢ = λᵢvᵢ

where vᵢ is the i-th column vector of V, uᵢ is the i-th column vector of U, and λᵢ is the i-th eigenvalue of AᵀA. This demonstrates that the squared singular values of A are the eigenvalues of AᵀA, that the column vectors of V are the eigenvectors of AᵀA, and that the column vectors of U are the corresponding eigenvectors of AAᵀ.
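The relationship is easy to verify numerically (again a sketch with random data):

```python
import numpy as np

A = np.random.rand(6, 4)
sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
lam = np.linalg.eigvalsh(A.T @ A)[::-1]      # eigenvalues of A^T A, descending
assert np.allclose(sigma**2, lam)            # sigma_i^2 = lambda_i
```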


Figure: Faces in the data set.

Figure: PCA face basis.

Figure: SVD face basis.

Figure: Recreating with the top 5 PCA basis faces. Figure: Recreating with the top 5 SVD basis faces.

There was no visual difference between PCA and SVD. On average, PCA computed about twice as fast as SVD. Using only 5 eigenfaces produced results similar to using 20. Most faces could be identified, even ones quite different from the originals.

Figure: Top 5 closest faces in face space using PCA. Figure: Top 5 closest faces in face space using SVD.

Figure: Reconstructing a face that is smiling and has eyes closed. Figure: Recreating a face with glasses, smiling, and further away.

Figure: Recreating a face that is smiling. Figure: Recreating a face that is smiling.

Figure: Recreating a face that is turned and smiling. Figure: Recreating a face that is smiling.

Figure: Recreating face with only 5 eigenfaces. Figure: Recreating face with only 5 eigenfaces.


Improvements: Use a larger data set. Crop closer to the face. Subtract off the background. Use eigenfaces first as a check, and then use face proportions (eye width, ...) instead of human input or a threshold.

Difficulties: Aligning the faces (getting the camera centered and perpendicular to the face) can be quite difficult. Other factors: distance from the camera, head tilt, lighting, image resolution, ... Cropping without including hair, ears, or neck. Working in RGB space is more difficult and computationally expensive.