Face Recognition and Biometric Systems


The Eigenfaces method

Plan of the lecture:
- Principal Components Analysis: main idea
- Feature extraction by PCA, face recognition
- Eigenfaces: training, feature extraction

Literature: M. A. Turk, A. P. Pentland, "Face Recognition Using Eigenfaces".

Phases of recognition:
1. Detection
2. Normalisation
3. Feature extraction
4. Feature vector comparison


PCA main issues:
- dimensionality of the input space and of the data
- a coordinate system suited to the data set
- dimensionality reduction and the resulting reduction error
Principal Components Analysis (PCA) is a statistical method that addresses these issues.

PCA main issues:
- the input data set is a set of vectors (points) in the input space
- the space has an orthonormal basis: each vector in the input space can be expressed as a linear combination of the basis vectors
- each basis vector defines one dimension

PCA main issues:
- the dimensions are sorted: each basis vector is assigned a weight
- PCA finds the main directions of variance in the input set; each direction generates one dimension
- the importance of a dimension is proportional to the variance along it

Input face space:
- a normalised image is described by its pixel values
- the image is a point in a space: e.g. 64x75 pixels gives 4800 dimensions
- information excess: far too many dimensions
- redundant information must be eliminated while the relevant information remains: feature extraction

PCA for feature extraction:
- input vectors: a set of normalised images (a.k.a. the training set)
- PCA finds a new orthogonal basis; the number of dimensions can be reduced, creating the face space
- feature extraction by PCA: find the corresponding point in the new space; this works for any face image, not only images from the training set
- the Eigenfaces method is PCA applied to face images

The Eigenfaces method. Training:
1. Create the covariance matrix of the training set.
2. Calculate its eigenvalues and eigenvectors. The eigenvectors form an orthonormal basis and define the face space; each eigenvalue is associated with an eigenvector and is proportional to the variance along it.
3. Select eigenvectors.
Feature extraction: a face image is mapped to the new space (its coordinates must be found); any face image may be processed.

Eigenfaces: training. Input vectors u_1, ..., u_M, where each u_i is an N-dimensional vector and M is the number of vectors in the set. Average vector:

    μ = (1/M) Σ_{i=1}^{M} u_i

Covariance matrix:

    C = (1/M) Σ_{i=1}^{M} (u_i − μ)(u_i − μ)^T,  or equivalently  C = (1/M) A A^T,

where the columns of A are the centred vectors u_i − μ.

Training example. Training set (M = 4, N = 3): [1, 0, 2], [0, 3, 1], [4, 1, 2], [3, 0, -1]. Compute the average vector and the covariance matrix.
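The computation from this example can be sketched in NumPy (note that `np.cov` defaults to the 1/(M−1) convention, so the 1/M covariance of the slides is formed explicitly here):

```python
import numpy as np

# Training set from the slide: M = 4 vectors, each N = 3 dimensional (one per row).
U = np.array([[1, 0, 2],
              [0, 3, 1],
              [4, 1, 2],
              [3, 0, -1]], dtype=float)

mu = U.mean(axis=0)            # average vector
A = (U - mu).T                 # N x M matrix whose columns are u_i - mu
C = (A @ A.T) / U.shape[0]     # covariance with the 1/M convention of the slides

print(mu)   # [2. 1. 1.]
print(C)
```

This gives μ = [2, 1, 1] and C = [[2.5, −1, −0.25], [−1, 1.5, 0.25], [−0.25, 0.25, 1.5]].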

Eigenfaces: training. The characteristic equation det(C − λI) = 0 yields the eigenvalues λ. For each eigenvalue there is one eigenvector v satisfying C v = λ v. In practice the eigenproblem is solved numerically, e.g. with the Jacobi method (the cvEigenVV function in OpenCV).
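In place of the OpenCV routine named on the slide, a minimal sketch using NumPy's symmetric eigensolver on the covariance matrix from the training example above:

```python
import numpy as np

# Covariance matrix of the 3-D training example (1/M convention).
C = np.array([[2.5, -1.0, -0.25],
              [-1.0, 1.5, 0.25],
              [-0.25, 0.25, 1.5]])

# eigh is for symmetric matrices; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(C)

# Each column v of eigvecs satisfies C v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(C @ v, lam * v)
```

The returned eigenvectors are already orthonormal, matching the properties required by the method.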

Training optimisation. Problem: the covariance matrix is large (N×N, e.g. 4800×4800). Trick: work with the small M×M matrix A^T A instead of A A^T. If

    A^T A v' = λ' v',

then multiplying both sides by A gives

    A A^T (A v') = λ' (A v'),

so λ = λ' and v = A v' are the desired eigenvalues and eigenvectors of A A^T (up to normalisation).
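A minimal NumPy sketch of this trick, on random data standing in for the training images (the N×N matrix is never formed):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 4800, 10                       # pixel count >> number of training images
A = rng.standard_normal((N, M))
A -= A.mean(axis=1, keepdims=True)    # subtract the average face from each column

# Solve the small M x M eigenproblem A^T A v' = lambda' v'.
lam, Vp = np.linalg.eigh(A.T @ A)

# Map back: v = A v' are eigenvectors of A A^T, then renormalise to unit length.
V = A @ Vp
V /= np.linalg.norm(V, axis=0)

# Check the eigen relation (A A^T) v = lambda v for the largest eigenpair,
# again without ever materialising the N x N matrix.
v, l = V[:, -1], lam[-1]
assert np.allclose(A @ (A.T @ v), l * v)
```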

Eigenfaces: training. Properties of the eigenvectors:
- dimensionality equal to that of the input vectors
- orthonormal (length = 1)
- sorted by their corresponding eigenvalues
- may be scaled to the pixel value range
Eigenfaces are the eigenvectors transformed back to images. Keeping only the leading eigenvectors gives a new space with fewer dimensions: dimensionality reduction.

Eigenfaces: training. (Slide illustration: normalised images -> covariance matrix C_00 ... C_nn -> eigenfaces.)

Eigenfaces: feature extraction. Input: the set of eigenvectors and eigenvalues delivered by training, and a normalised image. Projection:

    x' = ψ^T x,

where ψ is the matrix whose columns are the eigenvectors, x is the normalised image after subtraction of the average face, and x' is the transformed vector.

Feature extraction example. 2-dimensional space with eigenvectors [√2/2; √2/2] and [−√2/2; √2/2]. Project the vectors [3; 1], [−2; −2], [10; 9] onto this basis; dropping the second coordinate then gives a dimensionality reduction.
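The example can be worked through numerically; the eigenvectors are partly garbled on the slide, so the 45-degree unit basis [√2/2; √2/2], [−√2/2; √2/2] below is an assumed reconstruction:

```python
import numpy as np

# Assumed 2-D eigenvector basis (reconstructed from the slide).
s = np.sqrt(2) / 2
Psi = np.array([[s, -s],
                [s,  s]])        # columns are the eigenvectors

# Vectors to project, one per row.
X = np.array([[3, 1], [-2, -2], [10, 9]], dtype=float)

# Projection x' = Psi^T x, applied to every row at once.
Xp = X @ Psi
print(np.round(Xp, 3))
```

For [−2; −2] and [10; 9] the second coordinate comes out as 0 and about −0.707, so discarding it loses little: that is the dimensionality reduction the slide points at.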

Eigenfaces: feature extraction. The matrix ψ can be cut to reduce dimensions: only the eigenvectors with the largest eigenvalues are kept, giving a truncated matrix ψ'. Each element of the feature vector is a scalar product, w_i = v_i^T x, so the whole (shortened) feature vector is

    w = ψ'^T x.

Eigenfaces: feature extraction. (Slide illustration: the feature vector is built from the scalar products between the normalised image and the successive eigenfaces.)

Back projection: face image. The feature vector is a description of the face with reduced information. Back projection recovers a face image from the feature vector; the information removed by the reduction is lost. The projection error depends on how similar the image is to the training set. (Slide: 2D example and face images.)
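Projection, back projection and the projection error can be sketched end to end; random vectors stand in for the training images here, and the eigenfaces are taken from an SVD of the centred data (equivalent to the eigenvectors of A A^T):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 100, 20, 5             # pixels, training images, kept eigenfaces

faces = rng.standard_normal((N, M))   # stand-in for normalised face images
mu = faces.mean(axis=1, keepdims=True)
A = faces - mu                        # centred data

# Left singular vectors of A = eigenvectors of A A^T, sorted by variance.
Psi, _, _ = np.linalg.svd(A, full_matrices=False)
Psi_k = Psi[:, :K]                    # truncated eigenface matrix psi'

x = faces[:, [0]]
w = Psi_k.T @ (x - mu)                # feature vector: projection
x_rec = Psi_k @ w + mu                # back projection

err = np.linalg.norm(x - x_rec)       # projection error
print(err)
```

For an image similar to the training set `err` is small; thresholding it is exactly the face/non-face verifier described on the next slide.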

Back projection: detection. Back-projecting a face image yields a slightly modified face image; back-projecting a non-face image (e.g. a flower) yields an image that merely resembles a face. The back-projection error is higher for non-face images, so it can be used as a verifier: a threshold on the accepted projection error.

Summary. Eigenfaces is a basic face recognition method from which many other methods are derived. It covers both training and feature extraction, takes a holistic approach, and is fast. Its effectiveness is average to low, but it may be improved.

Thank you for your attention!