Modeling Classes of Shapes


System 2 Overview

Suppose you have a class of shapes with a range of variations. How do we represent the whole class? With statistical shape models.

Previous systems: thresholding, boundary tracking, corner finding (but here with a better threshold).

This system:
- orientation to standard position
- training: Point Distribution Model calculation
- recognition: likelihood calculation

Motivation

Not all objects are:
- made equal: variations in production (all TEE parts belong to the same shape class, but have large variations in shape in common ways)
- grown equally: e.g. fruit classification
- appearing equal: movement, configurations

But if objects belong to the same class, we want to recognize them as such. We need to:
1. Identify the common modes (i.e. directions) of variation for each class.
2. Represent the shape class as statistical variation over these modes.
3. Use statistical recognition based on comparison to the statistical shape representation.

Lecture Set Overview

1. Principal Component Analysis
2. Point Distribution Models
3. Model learning and data classification
4. Rotating TEEs to standard position
5. Representing TEEs using Point Distribution Models
6. Recognizing new examples with a statistical classifier

Principal Component Analysis

Given a set of D-dimensional points {x_i} with mean m, find a new set of D perpendicular coordinate axes {a_j} such that

    x_i = m + \sum_j w_{ij} a_j

i.e. each point x_i is represented as the mean plus a weighted sum of the axis directions. (Figure: a 2D point cloud with its 1st and 2nd principal axes.)

Transforming Points to the New Representation

Transforming points is easy because the axes are orthonormal: a_k \cdot a_j = 0 for k \neq j, and a_k \cdot a_k = 1. Computing w_{ik} is easy:

    a_k \cdot (x_i - m) = a_k \cdot \sum_j w_{ij} a_j = \sum_j w_{ij} (a_k \cdot a_j) = w_{ik}

How to Do PCA I

1. Choose axis a_1 as the direction of the most variation in the dataset.
2. Project each x_i onto the (D-1)-dimensional subspace perpendicular to a_1 (i.e. removing the component of variation in direction a_1) to give x_i'.
3. Calculate the axis a_2 as the direction of the most remaining variation in {x_i'}.
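To make the representation and the projection formula concrete, here is a minimal Matlab sketch (illustrative, not from the course code): with orthonormal axes the weights are just inner products, and the full set of axes reconstructs each point exactly.

    X = randn(10,3);                 % 10 points of dimension D = 3
    [A,~] = qr(randn(3));            % any orthonormal axis set {a_j} (columns)
    m = mean(X);                     % the mean point
    W = (X - ones(10,1)*m) * A;      % weights: w_ik = a_k . (x_i - m)
    Xrec = ones(10,1)*m + W * A';    % x_i = m + sum_j w_ij a_j
    max(abs(Xrec(:) - X(:)))         % ~1e-15: a complete basis reconstructs exactly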

Why PCA?

There are many possible axis sets {a_i}. PCA chooses the axis directions a_i in order of largest remaining variation. This gives an ordering on the dimensions from most to least significant, which allows us to omit the low-significance axes. (Figure: e.g. projecting onto a_1.)

How to Do PCA I (continued)

4. Project each x_i onto a (D-2)-dimensional subspace.
5. Continue like this until all D new axes a_i are found.

How to Do PCA II: Via Eigenanalysis

Given N D-dimensional points {x_i}:

1. Compute the mean m = (1/N) \sum_i x_i.
2. Compute the scatter matrix S = \sum_i (x_i - m)(x_i - m)^T.
3. Compute the Singular Value Decomposition (SVD): S = U D V^T, where D is a diagonal matrix and U^T U = V^T V = I.
4. PCA: the i-th column of V is the axis a_i (the i-th eigenvector of S); d_{ii} of D is a measure of its significance (the i-th eigenvalue).

Mid-lecture Problem

If you had a 2D dataset like this, how many principal components does it have?
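The same recipe in a few lines of Matlab (an illustrative sketch on random data, not the course code):

    X = randn(100,4);                % N = 100 points of dimension D = 4
    N = size(X,1);
    m = mean(X);                     % step 1: mean
    dev = X - ones(N,1)*m;
    S = dev'*dev;                    % step 2: scatter matrix
    [U,D,V] = svd(S);                % step 3: S = U*D*V'
    A = V;                           % step 4: column i of V is axis a_i
    sig = diag(D);                   % d_ii: significance (eigenvalues)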

Point Distribution Models

Given:
- a family of objects with shape variations (a set of objects from the same class)
- a set of point positions {x_i} for each object instance

Assume: point positions have a systematic structural variation plus Gaussian noise in the point distribution; thus the point position variations are correlated.

Goals:
1. Construct a model that captures the structural as well as the statistical position variation.
2. Use the model for recognition.

Data Example

(Figure: a family of example shapes.) How do we represent them?

Point Distribution Models (PDMs)

Given a set of N observations, each with P boundary points {(r_{ik}, c_{ik})}, k = 1..P, i = 1..N, in corresponding positions.

Key trick: rewrite {(r_{ik}, c_{ik})} as a new 2P-vector

    x_i = (r_{i1}, c_{i1}, r_{i2}, c_{i2}, ..., r_{iP}, c_{iP})^T

This gives N vectors {x_i} of dimension 2P.

PDMs II

If the shape variations are random, then the components of {x_i} will be uncorrelated. If there is a systematic variation, then the components will be correlated. Use PCA to find the correlated variations.
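A small Matlab illustration of the stacking trick (toy data, with assumed variable names): interleave the row and column coordinates of P boundary points into a single 2P-vector.

    P = 8;
    r = randn(P,1); c = randn(P,1);   % P corresponding boundary points
    x = reshape([r c]', 1, 2*P);      % (r1,c1,r2,c2,...,rP,cP)
    % stacking N such rows gives an N x 2P data matrix ready for PCA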

PDM II: The Structural Model

PCA over the set {x_i} gives a set of 2P axes such that

    x_i = m + \sum_{j=1}^{2P} w_{ij} a_j

The 2P axes give a complete representation for {x_i}. Approximate the shapes using a subset M of the most significant axes (based on the eigenvalue sizes from the PCA):

    x_i \approx m + \sum_{j=1}^{M} w_{ij} a_j    (1)

Represent x_i using the weight vector w_i = (w_{i1}, ..., w_{iM}) - a smaller representation, since M << 2P. The goal is to capture the essential structural variations. The full shape can be approximately reconstructed from w_i using (1), and we can vary w_i to vary the shape.

Structural Model: Varying the Weights

(Figure: each row varies one of the top eigenvectors of a model learned from hand outlines - a visualisation of the main modes of structural variation.)

PDM III: The Statistical Model

If we have a good structural model, then the component weights should characterise the shape. A family of shapes should have a distribution that characterises normal shapes (abnormal shapes are outliers). We assume that the distribution of normal shape weights is Gaussian.

Statistical model: given a set of N component projection vectors {w_i},

    mean vector t = (1/N) \sum_i w_i
    covariance matrix C = (1/N) \sum_i (w_i - t)(w_i - t)^T
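A small Matlab sketch of the equations above on toy data (illustrative; the variable names are not from the course code): build the M-axis model, project a shape, reconstruct it with (1), and synthesise a new shape by moving along the first mode.

    P = 8; M = 5; N = 20;
    X = randn(N, 2*P);                       % toy shape vectors, one per row
    m = mean(X);
    dev = X - ones(N,1)*m;
    [U,D,V] = svd(dev'*dev);                 % PCA via the scatter matrix
    A = V(:,1:M);                            % top M axes a_j as columns
    lambda = diag(D); lambda = lambda(1:M)/(N-1);  % variances along the axes
    w = (X(1,:) - m) * A;                    % weights w_1 of the first shape
    xapprox = m + w * A';                    % equation (1) reconstruction
    xnew = m + 2*sqrt(lambda(1)) * A(:,1)';  % mean + 2 std devs along mode 1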

Statistical Model: Matlab Code

The classifier uses the inverse of C (Invcor):

    % Vecs(N,D) is N observations of a D-dimensional vector
    Mean = mean(Vecs);
    diffs = Vecs - ones(N,1)*Mean;
    Invcor = inv(diffs'*diffs/(N-1));

Classification / Recognition

Given:
- an unknown sample x
- the structural model: mean m plus M variation axes a_j
- the statistical model: class means {t_i} and associated covariance matrices {C_i} for i = 1..K classes

For each class i:
1. Project x onto the a_j to get the weights w (an M-dimensional vector).
2. Compute the Mahalanobis distance

       d_i(w) = ((w - t_i)^T C_i^{-1} (w - t_i))^{1/2}

Select the class i with the smallest distance d_i(w); reject if even the smallest distance is too large.

Algorithm Pre-processing

Load the image and convert it to binary (e.g. as in IVR). Get the boundary and find the corners (System 1). Problem: poor segmentation gives varying numbers of corners. Exploit the problem constraints:
- the lines are long and meet at given angles: delete short, badly placed segments and extend the rest to their intersections
- the lines are long: search for lines directly
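A compact sketch of this decision rule (illustrative names and toy numbers; not the course code):

    M = 5; K = 2;
    T = {zeros(1,M), ones(1,M)};         % class mean weight vectors t_i
    IC = {eye(M), eye(M)};               % class inverse covariances inv(C_i)
    w = 0.1*randn(1,M);                  % weights of the unknown sample
    d = zeros(K,1);
    for i = 1:K
        dw = w - T{i};
        d(i) = sqrt(dw * IC{i} * dw');   % Mahalanobis distance d_i(w)
    end
    [dmin, c] = min(d);                  % nearest class...
    if dmin > 5, c = 0; end              % ...unless too far: reject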

Model Learning Summary

1. Load image, threshold, get boundary and find corners (cf. TASK 1, except with a better corner-finding threshold).
2. Point Distribution Model learning method:
   (a) Rotate the TEEs to standard position
   (b) Get the vertices into a vector in a standard order
   (c) Construct the structural model using PCA
   (d) Project the examples into PCA weight space
   (e) Estimate the statistical model of the projections

Recognition Summary

1. Load image, threshold, get boundary and find corners (cf. TASK 1, except with a better corner-finding threshold).
2. Point Distribution Model classification method:
   (a) Constraint: reject if there are not 8 vertices
   (b) Rotate the TEE to standard position for the PDM (constraint: reject if there are not 2 sets of 4 nearly parallel lines)
   (c) Get the vertices into a vector in a standard order
   (d) Project the vector into PCA weight space (structural model)
   (e) Evaluate the statistical likelihood of the projection (statistical model)

Rotating the TEE to Standard Position

Assumes 8 lines in a rough TEE shape. Uses a heuristic algorithm (about 60 lines of code); a sketch of step 1 follows this list.

1. Sort the 8 lines into 2 sets of 4 mutually nearly parallel lines (reject if not possible): find the direction of one line, and sort all the others by whether their angle with this line is less than pi/4 or not.
2. Find which set is the head of the TEE (reject if neither or both sets satisfy the criteria). Also sort into positional order: if the longest line is sufficiently longer than the next, and the 2 shortest are about the same length as the longest, the longest line is the head of the TEE.
3. Estimate the transformation of the TEE to standard position, with the top of the TEE head parallel to the column axis and the center of the TEE at the origin.
4. Apply the transformation to the TEE.

(Figure: an example segmented boundary and its standard alignment.)
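A minimal Matlab sketch of step 1 (illustrative, with made-up angles; the course's actual code is not shown): classify the 8 line directions by whether their undirected angle to a reference line is below pi/4.

    theta = [0.02 -0.04 0.01 0.03 1.60 1.55 1.58 1.52]';  % 8 line angles (rad)
    ref = theta(1);                                % direction of one line
    d = abs(mod(theta - ref + pi/2, pi) - pi/2);   % undirected angle to ref line
    set1 = d < pi/4;                               % nearly parallel to reference
    ok = (sum(set1) == 4) && (sum(~set1) == 4);    % otherwise reject the shape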

Training Code

    function train
    % Training phase.  NB: some numeric constants (empty argument slots
    % below) were lost when the slides were transcribed.
    datacount = 0;
    maximages = ;                                  % number of training images
    imagestem = input('Training image file stem\n?', 's');
    for imagenum = 1 : maximages
      currentimagergb = imread([imagestem, ...
          int2str(imagenum), '.jpg'], 'jpg');
      currentimage = rgb2gray(currentimagergb);
      % get corners using TASK 1's method
      Image = getbinary(currentimage,,,,,);        % threshold
      [H,W] = size(Image);
      [r,c] = find(bwperim(Image,4) == 1);         % perimeter
      [sr,sc] = removespurs(r,c,H,W,);             % clean
      [tr,tc] = boundarytrack(sr,sc,H,W,);         % track
      datalines = zeros(,4);                       % space for results
      numlines = 0;
      findcorners(tr,tc,H,W,9,6);                  % segment
      % process the boundary, assuming it is a TEE
      if numlines == 8
        % rotate datalines to standard position
        [newlines,flag] = standard_position(...
            datalines(1:numlines,:),numlines,);
        % get vertices
        if flag
          % sort the vertices into a standard order
          sortvertices = sortvert(newlines(1:numlines,:));
          % add to the data matrix
          [Vnum,Vcoord] = size(sortvertices);
          if Vnum == 8
            % turn the points into one long row vector
            datacount = datacount + 1;
            allvertices(datacount,:) = ...
                reshape(sortvertices,1,Vnum*Vcoord);
          end
        end
      end
    end
    % Create the model
    meanvertex = mean(allvertices);
    vertexdev = allvertices - ones(datacount,1)*meanvertex;
    scatter = vertexdev'*vertexdev;
    [U,D,V] = svd(scatter);
    modeldev = V(:,1:5)';            % use only the first 5 components
    % get the projections of the data
    vecs = vertexdev*modeldev';
    % get the class mean vector and covariance matrix
    [Mean,Invcor] = buildmodel(vecs,maximages);
    % save the training data
    save modelfile modeldev meanvertex Mean Invcor
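The helper buildmodel is course code that is not shown on the slides. Given the statistical-model fragment earlier, a plausible reconstruction (an assumption, not the original file) is:

    function [Mean, Invcor] = buildmodel(Vecs, N)
    % Assumed reconstruction of the course helper: class mean and inverse
    % covariance of the projection vectors in Vecs (one observation per
    % row), exactly as in the statistical-model code above.
    Mean = mean(Vecs);
    diffs = Vecs - ones(size(Vecs,1),1)*Mean;
    Invcor = inv(diffs'*diffs/(N-1));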

Mahalanobis Distance

A vector distance measure that takes account of the different ranges of values at different positions in the vector. Given vectors a, b from a set with covariance C, the Euclidean distance between the vectors is

    ||a - b|| = ((a - b)^T (a - b))^{1/2}

The Mahalanobis distance is

    ((a - b)^T C^{-1} (a - b))^{1/2}

i.e. scaled differences.

Recognition Code

Same as the training code up to the point where the 8 sorted points are reshaped into a 16-vector:

    % turn 8 2D points into a 16-vector
    tmp = reshape(sortvertices,1,2*Vnum);
    % project onto the eigenvectors
    vec = (tmp - meanvertex)*modeldev';
    % get the class distance
    dist = mahalanobis(vec,Mean,Invcor);

Representing the TEEs using PDMs

We have 8 2D points for each TEE in standard position, and N training instances with variations. Can we make a model of the TEEs? YES!

Mid-lecture Problem

What might the first few principal components encode for the TEE data?
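The mahalanobis helper is also course code not shown on the slides; from the definition above it plausibly looks like this (an assumed sketch, not the original):

    function d = mahalanobis(vec, Mean, Invcor)
    % Assumed reconstruction: Mahalanobis distance of row vector vec from
    % Mean, given the precomputed inverse covariance Invcor.
    diff = vec - Mean;
    d = sqrt(diff * Invcor * diff');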

Representing the TEEs using PDMs II

Each corner point in the TEE model has a standard position, modified by the shape variations. Use a Point Distribution Model (mean + PCA-based main variation vectors) to represent the structural variations, and a statistical model (mean + covariance matrix) to represent the in-class variation.

Some of the Training Data

(Figure: examples of the training TEEs.)

Correlation of x_1 and x_2

(Figure: scatter plot of two of the stacked coordinates; note the strong correlation.)

Eigenvalues of the Scatter Matrix for the TEE Corners

(Figure: the eigenvalue spectrum; use the first five components.)

First Four Modes of Variation

(Figure: each column varies one mode from negative to positive standard deviations.)

Correlation of c_1 and c_2 (of the projected components)

(Figure: * = good data, diamond = bad data. Note: 1) the components are decorrelated; 2) bad data tends to lie further from the mean.)

Good TEE Results

All TEEs recognized. Plot of the Mahalanobis distances for the training data: under the Gaussian assumption the squared distance is chi-squared distributed (mean Dim, std. dev. sqrt(2 Dim)), which gives a 3-sigma test threshold.

Bad TEE Shapes

(Figure: corner-detection and rotation failures.)
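A concrete way to set that threshold (a sketch under the slide's assumption that the M weight components are Gaussian, so the squared Mahalanobis distance is chi-squared with M degrees of freedom; M = 5 as used for the TEEs):

    M = 5;                        % number of PCA components used
    mu = M;                       % chi-squared mean
    sd = sqrt(2*M);               % chi-squared standard deviation
    thresh2 = mu + 3*sd;          % 3-sigma cutoff on the squared distance d^2
    thresh = sqrt(thresh2);       % equivalent cutoff on the distance d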

Invalid TEE Shapes Aligned and Classified

(Figure: invalid shapes after alignment; the values shown are Mahalanobis distances.)

Problems

- Getting the same number of points (some failures on good data)
- Getting points in the same positions on smooth curves
- Alignment when the shape is too extreme

Discussion

- Applicable to smooth curves
- Applicable to other data besides 2D points (e.g. 3D, ultrasound)
- The covariance matrix links correlations between pairs of features. What about triples, ...?
- Can be extended to include greylevel variations
- Training samples need not be in order (batch mode)

What We Have Learned

- The Point Distribution Model (PDM) method for representing classes of objects
- Principal Component Analysis (PCA) decomposition for estimating the dominant modes of variation

PCA-based Face Recognition: Eigenfaces (Turk & Pentland 1991)

Representation of faces using PCA directly on the image intensities. One of the most famous uses of PCA in computer vision, and a seminal reference for face recognition (though it would work better if we modeled shape variation rather than lightness variation).

Key principle: turn the image array into a long vector, and represent a sample image (face) as a weighted sum of eigenimages (eigenfaces).

Eigenfaces

1. Given a set of K registered face images (R x C) with varying capture conditions.
2. Represent each as an R*C long vector.
3. Do PCA (using a special trick for large matrices).
4. Represent person i by the projection weights w_i.
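The "special trick for large matrices" in step 3 is usually the Turk-Pentland observation that, with K << R*C images, one can eigendecompose the small K x K matrix A'*A instead of the huge (R*C) x (R*C) scatter matrix A*A'. A sketch under that assumption (toy data):

    % If A'*A v = lambda v, then (A*A')(A v) = lambda (A v), so the
    % eigenvectors of the huge scatter matrix are A times those of A'*A.
    K = 20; RC = 64*64;
    F = randn(RC, K);                        % columns: vectorised face images
    m = mean(F, 2);
    A = F - m*ones(1,K);                     % mean-subtracted data matrix
    [Vs, D] = eig(A'*A);                     % small K x K eigenproblem
    [~, idx] = sort(diag(D), 'descend');
    E = A * Vs(:, idx(1:10));                % top 10 eigenfaces (unnormalised)
    E = E ./ (ones(RC,1)*sqrt(sum(E.^2)));   % normalise each column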

Eigenface Recognition

Given an unknown face image F_u:
1. Subtract the mean face and project onto the eigenfaces to get w_u.
2. Given the database of projections {w_i}, i = 1..K, find the class c with the smallest Mahalanobis distance d_c to w_u.
3. If d_c is small enough, return c as the identity.

Eigenface Results

On an image database with varied lighting: 96% successful recognition over lighting variations, 85% over orientation variations, and 64% over size variations.

Eigenface Discussion

Variations in position, orientation, scale and occlusion cause problems; these remain research topics. A 4-6% failure rate is a problem at busy airports.
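A sketch of the recognition steps under assumed names (Fu: unknown face as an RC x 1 vector; m: mean face; E: RC x M eigenface matrix; Wdb: K x M database of stored projections; tau: reject cutoff). For brevity this uses Euclidean rather than the Mahalanobis distance of step 2.

    RC = 64*64; M = 10; K = 20;
    E = orth(randn(RC,M));                    % stand-in eigenfaces
    m = randn(RC,1); Wdb = randn(K,M);        % stand-in mean face and database
    Fu = randn(RC,1); tau = 100;              % unknown face and reject cutoff
    wu = (Fu - m)' * E;                       % 1. subtract mean face, project
    d = sqrt(sum((Wdb - ones(K,1)*wu).^2, 2));  % distance to each stored w_i
    [dmin, c] = min(d);                       % 2. nearest stored projection
    if dmin > tau, c = 0; end                 % 3. reject if not close enough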
