
System 2: Modelling & Recognising Classes of Shape — PDM & PCA

Advanced Vision Lecture 4
toby.breckon@ed.ac.uk
Computer Vision Lab, Institute for Perception, Action & Behaviour, School of Informatics

(All the same shape?)

- System 1 (last lecture): limited to rigidly structured shapes.
- System 2: recognition of a class of varying shapes.
- Pipeline: thresholding (IVR); boundary tracking (from System 1); corner finding (from System 1); orientation to standard position (PCA); training: Point Distribution Model (PDM); recognition: likelihood calculation.

Motivation

- Not all objects are made equal, grow equally, or appear equally (e.g. fruit classification): variations arise due to capture problems, natural variation, movement, and configurations.
- But if they all originate from the same class of object, yet have large variations in shape, we want to be able to recognize them as such! All T parts belong to the same shape class.
- How do we recognize classes of shape?

Modelling Classes of Shapes

Suppose you have a class of shapes with a range of variations for a given shape class. Need to:
1. Identify common modes (i.e. directions) of variation, i.e. within bounds of variation of a common structure.
2. Represent the shape class as statistical variation over these modes of variation.
3. Use statistical recognition based on comparison to the shape class representation.

System 2 Overview

- Vision concepts (this lecture): Principal Components Analysis (PCA); Point Distribution Models (PDM).
- Model learning and data classification (next lecture).
- Task specifics: rotate T's to standard orientation using PCA; represent T's using a PDM; recognize unseen examples with statistical classification.

Principal Component Analysis 1

Given a set of D-dimensional points {xi} with mean position m, e.g. 2D points from an image (could also be 3D points). Find a new set of D perpendicular coordinate axes {aj} such that point xi can be represented as the mean plus a weighted sum of the axis directions:

    xi = m + sum_j wij aj

(a1 = 1st principal component, a2 = 2nd principal component.)

Principal Component Analysis 2

Transform points to the new PCA representation: we need to solve for the weights wij. As all PCA axes are perpendicular, we know (dot products): ak . aj = 0 for k ≠ j, and ak . ak = 1. So we can identify the individual weight wik of point xi for axis ak:

    wik = (xi - m) . ak

Practical PCA (Technique 1)

1. Choose axis a1 as the direction of most variation in the dataset.
2. Project each point xi onto the (D-1)-dimensional subspace perpendicular to a1 to give xi', removing the component of variation in direction a1.
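The representation above (xi = m + sum_j wij aj, with weights wik = (xi - m) . ak) can be sketched in a few lines of NumPy. This is our own illustrative sketch, not the lecture's code: the names (X, A, W) and the synthetic 2D data are ours, and the orthonormal axes are obtained via SVD, which is equivalent to the eigenanalysis technique described later in the lecture.

```python
import numpy as np

# Toy data (ours, not the lecture's): N = 100 points in D = 2 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])

m = X.mean(axis=0)                       # mean position m

# Orthonormal PCA axes {a_j}, obtained here via SVD of the centred data.
_, _, Vt = np.linalg.svd(X - m, full_matrices=False)
A = Vt.T                                 # columns are the axes a_1, a_2

# Weight of point x_i for axis a_k: w_ik = (x_i - m) . a_k
W = (X - m) @ A

# Reconstruction x_i = m + sum_k w_ik a_k is exact when all D axes are kept.
X_rec = m + W @ A.T
print(np.allclose(X, X_rec))             # True
```

Because the axes are perpendicular unit vectors, projecting onto all D of them loses nothing; dimensionality reduction comes from dropping the low-variation columns of A.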

Practical PCA (Technique 1, continued)

3. Calculate the axis direction a2 as the direction of most remaining variation in {xi'}.
4. Project each xi' onto a (D-2)-dimensional subspace.
5. Continue like this until all D new axes ai are found.

Why use PCA?

Many possible axis sets {ai} exist. PCA chooses the axis directions ai in order of largest remaining variation:
- gives an ordering of the dimensions of variation from most to least significant;
- allows us to omit low-significance axes (e.g. omitting axis a2 in the 2D example gives a reduction to one dimension along a1).

How to do PCA (Technique 2)

Via eigenanalysis. Given N D-dimensional points {xi} with mean m:
1. Produce the scatter matrix S.
2. Perform Singular Value Decomposition (SVD): S = U D V', where D is a diagonal matrix and U, V are matrices such that U'U = V'V = I. (N.B. ' = transpose.)
3. PCA: the ith column of V is axis ai (i.e. the ith eigenvector of S); dii (the ith diagonal element of D) is a measure of its significance (i.e. the ith eigenvalue of S).

Mid-lecture problem: PCA

If you had a 3D dataset like this (figure): how many principal components does it have? What is the most likely direction of the 1st PC?
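The eigenanalysis technique can be sketched directly in NumPy. This is an illustrative sketch with synthetic data; the variable names are ours. For the symmetric scatter matrix, the SVD coincides with the eigendecomposition, so the columns of V are the eigenvectors and the singular values are the eigenvalues, already sorted from most to least significant.

```python
import numpy as np

# Toy data (ours): N = 200 points in D = 3 dimensions.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
m = X.mean(axis=0)
Xc = X - m

# 1. Scatter matrix S (D x D): sum over i of (x_i - m)(x_i - m)'.
S = Xc.T @ Xc

# 2. SVD of the (symmetric) scatter matrix: S = U D V'.
U, d, Vt = np.linalg.svd(S)

# 3. The ith column of V is axis a_i (ith eigenvector of S);
#    d[i] is its significance (ith eigenvalue of S), largest first.
axes = Vt.T
print(d[0] >= d[1] >= d[2])              # True
```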

Point Distribution Models

Given a set of points from the same class: a set of point positions {xi} for each object instance.
- Assume the point positions have systematic structural variation.
- Point position variations are correlated (structurally and statistically).
Aim: construct a model that captures both the structural and the statistical position variation, and use the model for recognition.

Gaussian Noise Distribution

Noise with a probability density function (PDF) following the normal (or Gaussian) distribution:
- Each measured value (pixel/point) will have changed from its original value by a small random amount.
- The distribution of these changes follows a Gaussian distribution around the original value (i.e. the mean); the distribution of the changes itself has zero mean.
- A Gaussian noise distribution is a good estimation of most sensor noise.
- The width of the Gaussian, and hence the level of variation, is estimated by the standard deviation.
(Figures: original image; image with Gaussian noise, std. dev = 8.)

Example: hands

A family of objects with shape variations (point outlines of a hand): how do we represent them?

Point Distribution Models (PDM)

Given: a set of N observations (i.e. images), each with P boundary points {(rik, cik)}, k = 1..P (points in the boundary), i = 1..N (observations).
Key trick: rewrite {(rik, cik)} as a new 2P-vector xi:

    xi = (ri1, ci1, ri2, ci2, ri3, ci3, ..., riP, ciP)'

giving N vectors {xi} of dimension 2P.
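The "key trick" of flattening each boundary into a single 2P-vector is a one-line reshape in NumPy. A minimal sketch, with hypothetical boundary data of our own invention:

```python
import numpy as np

# Hypothetical boundary data: N observations, each with P (row, col) points.
N, P = 10, 30
rng = np.random.default_rng(2)
boundaries = rng.uniform(0, 128, size=(N, P, 2))   # boundaries[i, k] = (r_ik, c_ik)

# Key trick: flatten each point set into a single 2P-vector
# x_i = (r_i1, c_i1, r_i2, c_i2, ..., r_iP, c_iP)'.
X = boundaries.reshape(N, 2 * P)
print(X.shape)                                     # (10, 60)
```

For this to be meaningful, point k must correspond to the same physical boundary location across all N observations.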

Point Distribution Models (PDM, continued)

- If the shape variations are random, then the components of {xi} will be uncorrelated.
- If there is systematic shape variation, then the components of {xi} will be correlated.
- Use PCA to find the correlated variations.

PDM: The Structural Model

PCA over the set {xi} gives 2P axes such that:
- the 2P axes give a complete representation for {xi};
- but we can approximate {xi} with a subset of the M most significant axes (select the number of axes based on the eigenvalues from PCA): a smaller, generalised representation of {xi}, as M < 2P;
- we can represent an individual xi using wi = (wi1, ..., wiM)', approximate the full shape using wi, and vary its components to vary the shape.

Structural Model: varying PCA weights

Original capture (point outlines of a hand) and variations from PCA weight space (varying the weights along the PCA axes) [Morris '04]. Provides a visualisation of the main components of structural variation.

PDM: The Statistical Model

If we have a good structural model, the component values wi should be random and independent. Why? Because the structural variation has been extracted; only statistical variation remains.
Statistical model: given the set of N component projection vectors {wi}, i = 1..N, each of length M, resulting from the first M principal component axes of the observation set, compute:
- the mean vector of the weights;
- the covariance matrix (captures the statistical variation from the mean of PCA weight space).
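The structural model (keep the M most significant axes) and the statistical model (mean and covariance of the resulting weight vectors) fit together as below. This is our own sketch on synthetic shape vectors; the names are assumptions, not the lecture's notation.

```python
import numpy as np

# Toy shape vectors (ours): N observations of dimension 2P; keep M axes.
N, P, M = 40, 30, 5
rng = np.random.default_rng(3)
X = rng.normal(size=(N, 2 * P))

mean_shape = X.mean(axis=0)
_, sv, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)

# Structural model: the M most significant PCA axes (M << 2P).
A = Vt[:M].T                              # 2P x M, columns are the axes

# Project each shape into PCA weight space: w_i = A'(x_i - mean).
W = (X - mean_shape) @ A                  # N x M weight vectors

# Statistical model: mean and covariance of the weight vectors.
# (The mean is zero by construction when the weights come from the
# same set the PCA was trained on.)
w_mean = W.mean(axis=0)
C = np.cov(W, rowvar=False)               # M x M covariance matrix

# Approximate a full shape from its M weights.
x0_approx = mean_shape + W[0] @ A.T
```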

PDM: Classification / Recognition

Structural model: M axes from PCA. Statistical model: mean & covariance of PCA weight space.
Given: an unknown observation (point set {p}), and a set of means {ti} and covariance matrices {Ci} for K classes.
For i in 1..K classes:
1. Project {p} onto the structural model of class i to get w.
2. Compute the Mahalanobis distance of w from the statistical model of class i.
Select the class with the smallest distance, or reject if the distance is too large.
Application to our T parts? Next lecture...

PCA-based face recognition

Eigenfaces [Turk & Pentland '91]:
- a representation of faces using PCA directly on images;
- one of the most famous uses of PCA in computer vision;
- a seminal reference on the problem of face recognition.
Key principle: if D-dimensional points can be represented as a weighted sum of D axes, then images can be represented as a weighted sum of other images (eigenpictures).

EigenPictures [Sirovitch/Kirby '87]

- Treat the RxC images as vectors of dimension RC! Perform PCA via eigenanalysis.
- The PCA axis vectors can be displayed as RxC images: eigenpictures (difficult to characterise the variations visually).
- A single image can be reconstructed from a weighted sum of the mean plus N basis images.
- When applied to compression of a face database: 40 eigenpictures represented 115 (128x128) images with 3% error [Sirovitch/Kirby '87]. (Eigenface images: [Morris '04].)
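The classification step (smallest Mahalanobis distance, with rejection) can be sketched as follows. The class models, threshold value, and function name here are hypothetical, chosen only to illustrate the decision rule.

```python
import numpy as np

def mahalanobis_sq(w, t, C):
    """Squared Mahalanobis distance of weight vector w from class mean t."""
    diff = w - t
    return float(diff @ np.linalg.solve(C, diff))

# Hypothetical statistical models for K = 2 classes: means {t_i}, covariances {C_i}.
means = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
covs = [np.eye(2), np.eye(2)]

w = np.array([0.4, -0.2])                 # projection of the unknown point set
dists = [mahalanobis_sq(w, t, C) for t, C in zip(means, covs)]
best = int(np.argmin(dists))

threshold = 9.0                           # assumed; task-dependent in practice
label = best if dists[best] < threshold else None   # None = rejected / unknown
print(label)                              # 0
```

Unlike plain Euclidean distance, the Mahalanobis distance weights each direction of PCA weight space by how much that class actually varies along it.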

EigenFaces [Turk & Pentland '91]

Learning:
- Collect a set of pictures of K people (varying capture conditions).
- Use PCA eigenanalysis to compute the eigenfaces of the complete set (with a special trick for large RxC matrices).
- Represent each person i = 1..K as a corresponding vector, wi, in PCA weight space, i.e. as a weighted sum of eigenfaces.
- Relies on common alignment of the subject in the images!
(Figure: mean face and a subset of the principal component axes/images [Morris '04].)

EigenFaces in detail

Recognition, given an unknown face image F:
- Compute the projection of F onto PCA weight space; result: a weight vector wf.
- Measure the distance, d(), between wf and the stored vectors w1...wK.
- If d(wf, wi) < threshold, identify the face as person i; else return it as an unknown face.

EigenFaces Performance

On a 2,500-image 128x128 database [Turk & Pentland '91], with varied face lighting, orientation and size:
- 96% successful recognition over lighting variation;
- 85% successful recognition over orientation variation;
- 64% successful recognition over size variation.
Issues for discussion:
- Face position, orientation, scale variance & occlusion are big problems.
- Face position/orientation identification is still a major research topic.
- 96% still means 4 failures per 100 people (arriving at a busy airport!).

Summary

- Principal Component Analysis (PCA): constructing a structural model of variation; PCA via eigenanalysis.
- Point Distribution Models (PDM): constructing a statistical model of variation; learning and recognition using PDM & PCA.
- EigenPictures / EigenFaces: PCA extended to raw images.
Next lecture: PCA/PDM applied to the T models.
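The full learn-then-recognize eigenfaces loop can be sketched end to end. This is a toy sketch on tiny synthetic "images" (all sizes, names, and the threshold are our assumptions); real systems need aligned face images and a threshold tuned on held-out data.

```python
import numpy as np

# Tiny synthetic "faces" (ours): K people, one flattened RxC image each.
R = C = 16
K, M = 5, 4                               # K known people, M eigenfaces kept
rng = np.random.default_rng(4)
faces = rng.normal(size=(K, R * C))

# Learning: mean face and eigenfaces via SVD of the centred image matrix.
mean_face = faces.mean(axis=0)
_, _, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
eigenfaces = Vt[:M]                       # M x RC basis images

W = (faces - mean_face) @ eigenfaces.T    # known weight vectors w_1..w_K

# Recognition: project an unknown image F and find the nearest weight vector.
F = faces[2] + 0.01 * rng.normal(size=R * C)   # noisy copy of person 2
w_f = (F - mean_face) @ eigenfaces.T
d = np.linalg.norm(W - w_f, axis=1)
i = int(np.argmin(d))

threshold = 1.0                           # assumed; tuned on real data
print(i if d[i] < threshold else "unknown")    # 2
```

Note the efficiency point from the slides: with N images of RC pixels each, PCA is done on the small N-row matrix (the "special trick"), never on an RC x RC covariance matrix.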