Lecture 17: Face Recognition
1 Lecture 17: Face Recognition Dr. Juan Carlos Niebles, Stanford AI Lab Professor Fei-Fei Li, Stanford Vision Lab
2 What we will learn today Introduction to face recognition Principal Component Analysis (PCA) The Eigenfaces Algorithm Linear Discriminant Analysis (LDA) Turk and Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 1997.
3 Faces in the brain Courtesy of Johannes M. Zanker
4 Faces in the brain Kanwisher et al., 1997
5-10 Face Recognition: applications include digital photography, surveillance, album organization, person tracking/identification, emotions and expressions, security/warfare, tele-conferencing, etc.
11-12 The Space of Faces An image is a point in a high-dimensional space: if represented in grayscale intensity, an N x M image is a point in R^(NM), e.g. a 100x100 image is a 10,000-dimensional point. However, relatively few high-dimensional vectors correspond to valid face images; we want to effectively model the subspace of face images. Slide credit: Chuck Dyer, Steve Seitz, Nishino
13 Image space vs. face space: compute the n-dimensional subspace such that the projection of the data points onto the subspace has the largest variance among all n-dimensional subspaces, i.e. maximize the scatter of the training images in face space.
14 Key Idea So, compress faces to a low-dimensional subspace that captures key appearance characteristics of the visual DOFs. Use PCA for estimating the subspace (dimensionality reduction). Compare two faces by projecting the images into the subspace and measuring the Euclidean distance between them.
15 What we will learn today Introduction to face recognition Principal Component Analysis (PCA) The Eigenfaces Algorithm Linear Discriminant Analysis (LDA) Turk and Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 1997.
16 technical note PCA Formulation Basic idea: if the data lives in a subspace, it is going to look very flat when viewed from the full space, e.g. a 1D subspace in 2D, or a 2D subspace in 3D. This means that if we fit a Gaussian to the data, the equiprobability contours are going to be highly skewed ellipsoids. Slide inspired by N. Vasconcelos
17 technical note PCA Formulation If x is Gaussian with covariance Σ, the equiprobability contours are ellipses whose principal components φ_i are the eigenvectors of Σ and whose principal lengths λ_i are the eigenvalues of Σ. [Figure: an ellipse in the (x_1, x_2) plane with axes φ_1, φ_2 of lengths λ_1, λ_2.] By computing the eigenvalues we know the data is not flat if λ_1 ≈ λ_2, and flat if λ_1 >> λ_2. Slide inspired by N. Vasconcelos
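To make this flatness test concrete, here is a small NumPy sketch (my own illustration, not from the slides): it fits a Gaussian to 2D data that lies near a 1D subspace and inspects the eigenvalues of the sample covariance.

```python
import numpy as np

rng = np.random.default_rng(0)
# 2D data lying near a 1D subspace: strong spread along one direction,
# small noise off it, so the data should look "flat".
X = rng.normal(size=(500, 1)) @ np.array([[2.0, 1.0]])
X += 0.05 * rng.normal(size=(500, 2))

Sigma = np.cov(X, rowvar=False)          # sample covariance (2 x 2)
lam, phi = np.linalg.eigh(Sigma)         # eigenvalues ascending, eigenvectors as columns
lam, phi = lam[::-1], phi[:, ::-1]       # sort so lam[0] >= lam[1]

print("principal lengths:", lam)         # lam[0] >> lam[1]  =>  data is flat
print("principal components:\n", phi)    # columns phi[:, i] are the ellipse axes
```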
18 technical note PCA Algorithm (training) Slide inspired by N. Vasconcelos
19 technical note PCA Algorithm (testing) Slide inspired by N. Vasconcelos
20 technical note PCA by SVD An alternative manner to compute the principal components, based on singular value decomposition. Quick reminder on SVD: any real n x m matrix A (n > m) can be decomposed as A = M Π N^T, where M is an (n x m) column-orthonormal matrix of left singular vectors (the columns of M), Π is an (m x m) diagonal matrix of singular values, and N^T is an (m x m) row-orthonormal matrix of right singular vectors (the columns of N). Slide inspired by N. Vasconcelos
21 technical note PCA by SVD To relate this to PCA, we consider the data matrix X = [x_1, ..., x_n], with one example per column. The sample mean is µ = (1/n) Σ_i x_i. Slide inspired by N. Vasconcelos
22 technical note PCA by SVD Center the data by subtracting the mean from each column of X. The centered data matrix is X_c = [x_1 - µ, ..., x_n - µ]. Slide inspired by N. Vasconcelos
23 technical note PCA by SVD The sample covariance matrix is Σ = (1/n) Σ_i x_i^c (x_i^c)^T, where x_i^c is the i-th column of X_c. This can be written as Σ = (1/n) X_c X_c^T. Slide inspired by N. Vasconcelos
24 technical note PCA by SVD The matrix (1/√n) X_c^T is real (n x d). Assuming n > d, it has the SVD decomposition (1/√n) X_c^T = M Π N^T, and therefore Σ = (1/n) X_c X_c^T = N Π² N^T. Slide inspired by N. Vasconcelos
25 technical note PCA by SVD Note that N is (d x d) and orthonormal, and Π² is diagonal; Σ = N Π² N^T is just the eigenvalue decomposition of Σ. It follows that the eigenvectors of Σ are the columns of N, and the eigenvalues of Σ are the squared singular values λ_i = π_i². This gives an alternative algorithm for PCA. Slide inspired by N. Vasconcelos
26 technical note PCA by SVD In summary, computation of PCA by SVD: given X with one example per column, create the centered data matrix X_c, compute its SVD (1/√n) X_c^T = M Π N^T; the principal components are the columns of N, and the eigenvalues are λ_i = π_i². Slide inspired by N. Vasconcelos
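As a minimal sketch of this recipe (my own illustration; the slides contain no code), with `X` holding one example per column as above:

```python
import numpy as np

def pca_by_svd(X, k):
    """PCA via SVD. X: (d, n) data matrix, one example per column.
    Returns the sample mean, the top-k principal components, and their eigenvalues."""
    d, n = X.shape
    mu = X.mean(axis=1, keepdims=True)              # sample mean (d, 1)
    Xc = X - mu                                     # centered data matrix
    # SVD of (1/sqrt(n)) Xc^T = M Pi N^T; rows of Vt are the right singular vectors
    _, s, Vt = np.linalg.svd(Xc.T / np.sqrt(n), full_matrices=False)
    components = Vt[:k].T                           # columns of N: principal components (d, k)
    eigenvalues = s[:k] ** 2                        # eigenvalues of Sigma are pi_i^2
    return mu, components, eigenvalues
```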
27 technical note Rule of thumb for finding the number of PCA components A natural measure is to pick the eigenvectors that explain p% of the data variability. This can be done by plotting the ratio r_k = (Σ_{i=1}^k λ_i) / (Σ_{i=1}^d λ_i) as a function of k. E.g. we need 3 eigenvectors to cover 70% of the variability of this dataset. Slide inspired by N. Vasconcelos
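The ratio r_k is just a cumulative sum over the sorted eigenvalues; a short sketch under the same assumptions as above:

```python
import numpy as np

def num_components(eigenvalues, p=0.7):
    """Smallest k such that the top-k eigenvalues explain a fraction p of the variance."""
    lam = np.sort(eigenvalues)[::-1]                # sort descending
    r = np.cumsum(lam) / np.sum(lam)                # r_k for k = 1, ..., d
    return int(np.searchsorted(r, p) + 1)
```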
28 What we will learn today Introduction to face recognition Principal Component Analysis (PCA) The Eigenfaces Algorithm Linear Discriminant Analysis (LDA) Turk and Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 1997.
29 Eigenfaces: key idea Assume that most face images lie on a low-dimensional subspace determined by the first k (k << d) directions of maximum variance. Use PCA to determine the vectors, or "eigenfaces", that span that subspace. Represent all face images in the dataset as linear combinations of eigenfaces. M. Turk and A. Pentland, Face Recognition using Eigenfaces, CVPR 1991
30 technical note Eigenface algorithm Training 1. Align training images x_1, x_2, ..., x_N. Note that each image is reshaped into a long vector! 2. Compute the average face µ = (1/N) Σ_i x_i. 3. Compute the difference images x_i^c = x_i - µ (the centered data matrix).
31 technical note Eigenface algorithm 4. Compute the covariance matrix Σ. 5. Compute the eigenvectors φ_i of the covariance matrix Σ. 6. Compute each training image x_i's projections as x_i → (x_i^c · φ_1, x_i^c · φ_2, ..., x_i^c · φ_K) = (a_1, a_2, ..., a_K). 7. Visualize the estimated training face x_i ≈ µ + a_1 φ_1 + a_2 φ_2 + ... + a_K φ_K.
32 technical note Eigenface algorithm [Figure: a face reconstructed as the mean face plus the weighted eigenface components a_1 φ_1, a_2 φ_2, ..., a_K φ_K, illustrating steps 6 and 7 above.]
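The training steps above condense into a short NumPy sketch (my own illustration; `faces` is an assumed (N, d) array of aligned, flattened grayscale images, and the eigenvectors are obtained by SVD rather than by forming Σ explicitly):

```python
import numpy as np

def train_eigenfaces(faces, K):
    """faces: (N, d) array, one flattened aligned face per row.
    Returns the average face, the top-K eigenfaces, and the training projections."""
    mu = faces.mean(axis=0)                  # step 2: average face
    Xc = faces - mu                          # step 3: centered data matrix
    # steps 4-5: eigenvectors of the covariance via SVD of the centered data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    eigenfaces = Vt[:K]                      # (K, d): each row is an eigenface phi_i
    A = Xc @ eigenfaces.T                    # step 6: projections (a_1, ..., a_K) per image
    return mu, eigenfaces, A

# step 7: reconstruct training face i from its coefficients
# x_hat_i = mu + A[i] @ eigenfaces
```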
33 technical note Eigenface algorithm Testing 1. Take a query image t. 2. Project into eigenface space and compute the projection t → ((t - µ) · φ_1, (t - µ) · φ_2, ..., (t - µ) · φ_K) = (w_1, w_2, ..., w_K). 3. Compare the projection w with all N training projections. Simple comparison metric: Euclidean distance. Simple decision: k-nearest neighbor (note: this k refers to the k-NN algorithm and is different from the previous K, which is the number of principal components).
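A matching test-time sketch (again my own illustration, continuing from the hypothetical `train_eigenfaces` above), using the Euclidean metric with a 1-nearest-neighbor decision:

```python
import numpy as np

def recognize(t, mu, eigenfaces, A, labels):
    """t: flattened query image (d,). Returns the label of the nearest
    training projection in eigenface space (1-NN under the Euclidean metric)."""
    w = (t - mu) @ eigenfaces.T              # step 2: project query -> (w_1, ..., w_K)
    dists = np.linalg.norm(A - w, axis=1)    # step 3: distances to all N training projections
    return labels[np.argmin(dists)]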
34 Visualization of eigenfaces Eigenfaces look somewhat like generic faces.
35 Reconstruction and Errors [Figure: reconstructions with K = 4, K = 200, K = 400.] Selecting only the top K eigenfaces reduces the dimensionality; fewer eigenfaces result in more information loss, and hence less discrimination between faces.
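Continuing the hypothetical `train_eigenfaces` sketch above, the effect of K on reconstruction can be probed directly (train once with a large K, then truncate):

```python
import numpy as np

# x: one flattened face (d,); mu, eigenfaces, _ = train_eigenfaces(faces, K=400)
def reconstruction_error(x, mu, eigenfaces, K):
    a = (x - mu) @ eigenfaces[:K].T          # top-K coefficients
    x_hat = mu + a @ eigenfaces[:K]          # reconstruction from K eigenfaces
    return np.linalg.norm(x - x_hat)         # residual shrinks as K grows

# for K in (4, 200, 400): print(K, reconstruction_error(x, mu, eigenfaces, K))
```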
36 Summary for Eigenfaces Pros: non-iterative, globally optimal solution. Limitations: the PCA projection is optimal for reconstruction from a low-dimensional basis, but may NOT be optimal for discrimination. See supplementary materials for Linear Discriminant Analysis, aka Fisherfaces.
37 What we will learn today Introduction to face recognition Principal Component Analysis (PCA) The Eigenfaces Algorithm Linear Discriminant Analysis (LDA) Turk and Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 1997.
38 Fisher's Linear Discriminant Analysis Goal: find the best separation between two classes. Slide inspired by N. Vasconcelos
39 Basic intuition: PCA vs. LDA
40 technical note Linear Discriminant Analysis (LDA) We have two classes, with means µ_0, µ_1 and covariances Σ_0, Σ_1. We want to find the projection line z = w^T x that best separates them. One possibility would be to maximize the separation of the projected means, (w^T µ_1 - w^T µ_0)². Slide inspired by N. Vasconcelos
41 technical note Linear Discriminant Analysis (LDA) However, this difference can be made arbitrarily large by simply scaling w. We are only interested in the direction, not the magnitude, so some type of normalization is needed. Fisher suggested the ratio J(w) = (w^T µ_1 - w^T µ_0)² / (w^T Σ_1 w + w^T Σ_0 w), i.e. the squared separation of the projected means normalized by the total scatter of the projections. Slide inspired by N. Vasconcelos
42 technical note Linear Discriminant Analysis (LDA) We have already seen that the projected mean of class i is w^T µ_i; also, the projected variance of class i is w^T Σ_i w. Slide inspired by N. Vasconcelos
43 technical note Linear Discriminant Analysis (LDA) And so the criterion becomes J(w) = (w^T µ_1 - w^T µ_0)² / (w^T Σ_1 w + w^T Σ_0 w), which can be written as J(w) = (w^T S_B w) / (w^T S_W w), where S_B = (µ_1 - µ_0)(µ_1 - µ_0)^T is the between-class scatter and S_W is the within-class scatter. Slide inspired by N. Vasconcelos
44 technical note Visualization [Figure: two class clusters in the (x_1, x_2) plane, with individual within-class scatters S_1 and S_2.] S_W = S_1 + S_2 is the within-class scatter; S_B is the between-class scatter.
45 technical note Linear Discriminant Analysis (LDA) Maximizing the ratio J(w) = (w^T S_B w) / (w^T S_W w) is equivalent to maximizing the numerator while keeping the denominator constant, i.e. max_w w^T S_B w subject to w^T S_W w = K. This can be accomplished using Lagrange multipliers, where we define the Lagrangian as L(w, λ) = w^T S_B w - λ (w^T S_W w - K), and maximize with respect to both w and λ. Slide inspired by N. Vasconcelos
46 technical note Linear Discriminant Analysis (LDA) Setting the gradient of L(w, λ) with respect to w to zero, we get 2 S_B w - 2 λ S_W w = 0, or S_B w = λ S_W w. This is a generalized eigenvalue problem. The solution is easy when S_W^{-1} exists. Slide inspired by N. Vasconcelos
47 technical note Linear Discriminant Analysis (LDA) In this case S_W^{-1} S_B w = λ w. Using the definition of S_B, this is S_W^{-1} (µ_1 - µ_0)(µ_1 - µ_0)^T w = λ w. Noting that (µ_1 - µ_0)^T w = α is a scalar, this can be written as α S_W^{-1} (µ_1 - µ_0) = λ w, and since we don't care about the magnitude of w, the solution is w* ∝ S_W^{-1} (µ_1 - µ_0). Slide inspired by N. Vasconcelos
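A minimal sketch of this closed-form two-class solution (my own illustration; `X0` and `X1` are assumed arrays holding one example per row for each class):

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Two-class Fisher LDA: returns the unit-norm direction w* ~ S_W^{-1}(mu_1 - mu_0)."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc0, Xc1 = X0 - mu0, X1 - mu1
    Sw = Xc0.T @ Xc0 + Xc1.T @ Xc1           # within-class scatter S_W = S_1 + S_2
    w = np.linalg.solve(Sw, mu1 - mu0)       # S_W^{-1} (mu_1 - mu_0)
    return w / np.linalg.norm(w)             # only the direction matters
```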
48 PCA vs. LDA Eigenfaces exploit the maximum scatter of the training images in face space. Fisherfaces attempt to maximize the between-class scatter while minimizing the within-class scatter.
49 Basic intuition: PCA vs. LDA
50 Results: Eigenface vs. Fisherface (1) Input: 160 images of 16 people. Train: 159 images; test: 1 image. Variation in facial expression, eyewear, and lighting: with glasses / without glasses, 3 lighting conditions, 5 expressions.
51 Eigenface vs. Fisherface (2)
52 What we have learned today Introduction to face recognition Principal Component Analysis (PCA) The Eigenfaces Algorithm Linear Discriminant Analysis (LDA) Turk and Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991. P. Belhumeur, J. Hespanha, and D. Kriegman, "Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection", IEEE Transactions on Pattern Analysis and Machine Intelligence 19(7), 1997.