System 2: Modelling & Recognising Classes of Shape: PDM & PCA

Advanced Vision Lecture 4
toby.breckon@ed.ac.uk
Computer Vision Lab, Institute for Perception, Action & Behaviour, School of Informatics
Modelling & Recognising Classes of Shape 1

Modelling and Recognising Classes of Shapes

All the same shape? All T parts belong to the same shape class. How do we recognize classes of shape?
- System 1 (last lecture): limited to rigidly structured shapes
- System 2: recognition of a class of varying shapes:
  - thresholding (IVR)
  - boundary tracking (from System 1)
  - corner finding (from System 1)
  - orientation to standard position (PCA)
  - Training: Point Distribution Model (PDM)
  - Recognition: likelihood calculation

Motivation

Not all objects appear equally / grow equally / are made equal, e.g. fruit classification, due to capture problems / natural variation / movement / configurations. But if they all originate from the same class of object and have large variations in shape in common directions, i.e. within the bounds of variation of a common structure ... we want to be able to recognize them as such!

Shape class representation. Need to:
1. Identify the common modes (i.e. directions) of variation for a given shape class
2. Represent the shape class as statistical variation over these modes of variation
3. Use statistical recognition based on comparison to the shape class representation
System 2 Overview

Vision concepts (this lecture):
- Principal Components Analysis (PCA)
- Point Distribution Models (PDM)
- Model learning and data classification

Task specifics (next lecture):
- rotate Ts to standard orientation using PCA
- represent Ts using PDM
- recognize unseen examples with statistical classification

Principal Component Analysis 1

Given a set of D-dimensional points {x_i} with mean position m, e.g. 2D points from an image (could also be 3D points).

[Figure: 2D point cloud with mean m and axes a1 (1st principal component) and a2 (2nd principal component)]

Find a new set of D perpendicular co-ordinate axes {a_j} such that:

    x_i = m + sum_j w_ij a_j

i.e. point x_i can be represented as the mean plus a weighted sum of the axis directions.

Principal Component Analysis 2

1. Choose axis a1 as the direction of most variation in the dataset.
2. Project each point x_i onto a (D-1)-dimensional subspace perpendicular to a1 to give x_i', removing the component of variation in direction a1.

Practical PCA Technique 1

Transform points to the new PCA representation: we need to solve for the w_ij. As all PCA axes are perpendicular and of unit length, we know (dot products):

    a_k . a_j = 0 for k != j
    a_k . a_k = 1

so we can identify the individual weight w_ik of point x_i for axis a_k:

    w_ik = a_k . (x_i - m)
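As a minimal sketch (not the lecture's own code, and with invented variable names), the projection and reconstruction formulas above can be exercised in NumPy: because the axes are perpendicular and unit length, each weight is just a dot product with the mean-centred point, and keeping all D axes reconstructs the data exactly.

```python
# Sketch: x_i = m + sum_k w_ik a_k, with w_ik = a_k . (x_i - m).
# Data and names here are illustrative, not from the lecture.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.5], [0.5, 1.0]])  # 2D points
m = X.mean(axis=0)                       # mean position m

# Obtain perpendicular unit axes a1, a2 (here via SVD of the centred data)
_, _, Vt = np.linalg.svd(X - m)
a = Vt                                   # rows are axes a1, a2 (a @ a.T = I)

# Weight of point x_i for axis a_k is a dot product: w_ik = a_k . (x_i - m)
W = (X - m) @ a.T

# Reconstruction x_i = m + sum_k w_ik a_k is exact when all D axes are kept
X_rec = m + W @ a
assert np.allclose(X, X_rec)
```

Dropping the second row of `a` (and second column of `W`) before reconstructing gives the lossy, low-significance-axis reduction discussed next.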
Practical PCA Technique 1 (continued)

3. Calculate the axis direction a2 as the direction of the most remaining variation in {x_i'}.
4. Project each x_i' onto a (D-2)-dimensional subspace.
5. Continue like this until all D new axes a_i are found.

Why use PCA?

Many axis sets {a_i} are possible. PCA chooses the axis directions a_i in order of largest remaining variation:
- gives an ordering of the dimensions of variation from most to least significant
- allows us to omit low-significance axes, e.g. if we omit axis a2 we get the following reduction [Figure: 2D points projected onto axis a1 alone]

How to do PCA Technique 2

Via eigenanalysis. Given N D-dimensional points {x_i} with mean m:

1. Produce the scatter matrix:

    S = sum_i (x_i - m)(x_i - m)'

2. Perform Singular Value Decomposition (SVD): S = U D V', where D is a diagonal matrix and U, V are matrices such that U'U = V'V = I. (N.B. ' = transpose)
3. PCA: the ith column of V is axis a_i (i.e. the ith eigenvector of S); the diagonal element d_ii of D is a measure of significance (i.e. the ith eigenvalue of S).

Mid-lecture problem: PCA

If you had a 3D dataset like this [figure]:
- how many principal components does it have?
- what is the most likely direction of the 1st PC?
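The eigenanalysis recipe above (scatter matrix, then SVD) can be sketched directly in NumPy. This is an illustrative example with synthetic data, not the lecture's code: one coordinate is given deliberately tiny variance so that dropping the least significant axis loses little.

```python
# Sketch of "PCA via eigenanalysis": S = sum (x_i - m)(x_i - m)', then SVD.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3)) * np.array([5.0, 2.0, 0.1])  # 3D points, one weak axis
m = X.mean(axis=0)

S = (X - m).T @ (X - m)          # scatter matrix (D x D, symmetric)
U, d, Vt = np.linalg.svd(S)      # S = U diag(d) V'

axes = Vt.T                      # ith column of V is axis a_i (ith eigenvector of S)
significance = d                 # d_ii measures variation along a_i (ith eigenvalue)

# Axes come ordered most-to-least significant, so low-significance axes
# can be omitted; here we keep M = 2 of the D = 3 axes:
M = 2
X_approx = m + ((X - m) @ axes[:, :M]) @ axes[:, :M].T
```

Because the third coordinate carries almost no variance, `X_approx` differs from `X` only slightly despite using one fewer dimension.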
Point Distribution Models

Given:
- a set of points from the same class
- a set of point positions {x_i} for each object instance

Assume the point positions have systematic structural variation: point position variations are correlated (structurally and statistically).

Aim: construct a model that captures both structural and statistical position variation, and use the model for recognition.

Gaussian Noise Distribution

Noise with a probability density function (PDF) following the normal (or Gaussian) distribution:
- each measured value (pixel/point) will have changed from its original value by a small random amount
- the distribution of these changes follows a Gaussian distribution around the original value (i.e. the mean); the Gaussian distribution of the changes has zero mean
- a good estimation of most sensor noise
- the width of the Gaussian, and hence the level of variation, is estimated by the standard deviation

[Figure: original point distribution vs. Gaussian noise point distribution (std. dev = 8)]

Example: hands

A family of objects with shape variations: how do we represent them? [Figure: hand outlines]

Point Distribution Models (PDM)

Given: a set of N observations (i.e. images), each with P boundary points {(r_ik, c_ik)}, k = 1..P (points in the boundary), i = 1..N (observations).

Key trick: rewrite {(r_ik, c_ik)} as a new 2P-vector x_i:

    x_i = (r_i1, c_i1, r_i2, c_i2, r_i3, c_i3, ..., r_iP, c_iP)'

giving N vectors {x_i} of dimension 2P.
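The "key trick" of flattening each boundary into a single 2P-vector can be sketched as follows. The boundary data here is synthetic (noisy ellipses standing in for traced object outlines); only the reshaping step is the point of the example.

```python
# Sketch: N observations of P boundary points (r, c) -> N vectors of length 2P,
# interleaved as (r_1, c_1, r_2, c_2, ..., r_P, c_P). Data is synthetic.
import numpy as np

N, P = 10, 32                        # N observations, P boundary points each
rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, P, endpoint=False)
boundaries = np.stack([
    np.stack([50 + 20 * np.cos(t),                    # row coordinates r_ik
              50 + (20 + rng.normal()) * np.sin(t)],  # col coordinates c_ik
             axis=1)
    for _ in range(N)
])                                   # shape (N, P, 2)

# Row-major reshape interleaves r and c exactly as the slide's x_i vector
X = boundaries.reshape(N, 2 * P)
assert X.shape == (N, 2 * P)
```

These `X` rows are the vectors the PDM's PCA step operates on.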
Point Distribution Models (PDM)

- If shape variations are random, then the components of {x_i} will be uncorrelated.
- If there is systematic shape variation, then the components of {x_i} will be correlated.
- Use PCA to find the correlated variations.

PDM: The Structural Model

PCA over the set {x_i} gives 2P axes such that:

    x_i = m + sum_j w_ij a_j

The 2P axes give a complete representation for {x_i}, but we can approximate {x_i} with the subset of the M most significant axes:
- select the number of axes based on the eigenvalues from PCA
- a smaller, generalised representation of {x_i}, as M < 2P
- can represent an individual x_i using w_i = (w_i1, ..., w_iM)'
- approximate the full shape using w_i, and vary its components to vary the shape

Structural Model: varying PCA weights

[Figure: original capture (point outlines of a hand) and variations from PCA weight space (varying weights along the PCA axes) [Morris '04]]

Provides visualisation of the main components of structural variation.

PDM: The Statistical Model

If we have a good structural model, the component values w_i should be random and independent. Why? Because the structural variation has been extracted; only statistical variation remains.

Statistical Model: given the set of N component projection vectors {w_i}, i.e. the set of weight vectors 1...N, each of length M, resulting from the first M principal component axes of the observation set:

- Mean vector of weights: t = (1/N) sum_i w_i
- Covariance matrix: C = (1/N) sum_i (w_i - t)(w_i - t)'

(captures the statistical variation from the mean of PCA weight space)
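Putting the structural and statistical models together, training a PDM can be sketched as one small function. This is an illustrative reading of the slides, not the lecture's code; the function name and interface are mine, and for simplicity the covariance uses NumPy's sample estimator rather than the 1/N form.

```python
# Sketch: structural model = M most significant PCA axes of the 2P-vectors;
# statistical model = mean and covariance of the projection weights.
import numpy as np

def build_pdm(X, M):
    """X: (N, 2P) flattened shape vectors; M: number of PCA axes to keep."""
    m = X.mean(axis=0)
    S = (X - m).T @ (X - m)              # scatter matrix (2P x 2P)
    _, d, Vt = np.linalg.svd(S)
    axes = Vt[:M].T                      # structural model: M axes (2P x M)
    W = (X - m) @ axes                   # weight vectors w_i, one per observation
    t = W.mean(axis=0)                   # statistical model: mean of weights
    C = np.cov(W, rowvar=False)          # ... and covariance of weights
    return m, axes, t, C
```

One such (m, axes, t, C) tuple would be learned per shape class, which is exactly what the classification step on the next slides consumes.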
PDM: Classification / Recognition

- Structural Model: M axes from PCA
- Statistical Model: mean & covariance of PCA weight space

Given:
- an unknown observation (point set {p})
- a set of means {t_i} and covariance matrices {C_i} for K classes

For i in 1...K classes:
1. Project {p} onto the structural model of class i to give w.
2. Compute the Mahalanobis distance from the statistical model:

    d_i = (w - t_i)' C_i^{-1} (w - t_i)

Select the class with the smallest distance, or reject if the distance is too large.

Application to our T parts?

Next lecture...

PCA-based face recognition

EigenFaces [Turk & Pentland '91]:
- representation of faces using PCA directly on images
- one of the most famous uses of PCA in computer vision
- a seminal reference on the problem of face recognition

EigenPictures [Sirovitch/Kirby '87]. Key principle: if D-dimensional points can be represented as a weighted sum of D axes, then images can be represented as a weighted sum of other images (eigenpictures).

EigenPictures [Sirovitch/Kirby '87]

Treat the RxC images as vectors of dimension RC(!) and perform PCA via eigenanalysis:
- the PCA axis vectors can be displayed as RxC images: eigenpictures
- difficult to characterise the variations visually
- a single image can be reconstructed from a weighted sum of the mean + N basis images

When applied to compression of a face database: 40 eigenpictures represent 115 (128x128) images with 3% error [Sirovitch/Kirby '87].

[Figure: EigenFaces [Morris '04]]
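The two-step recognition loop (project, then score by Mahalanobis distance) can be sketched as below. This is a hedged illustration: the function name, the class-model tuple layout, and the rejection threshold value are all invented for the example.

```python
# Sketch: for each class, project the unknown shape vector onto that class's
# structural model, then compute d_i = (w - t_i)' C_i^{-1} (w - t_i).
import numpy as np

def classify(p, classes, reject_thresh=25.0):
    """p: flattened 2P shape vector; classes: list of (m, axes, t, C) models."""
    best, best_d = None, np.inf
    for i, (m, axes, t, C) in enumerate(classes):
        w = (p - m) @ axes                       # 1. project onto structural model
        diff = w - t
        d = diff @ np.linalg.inv(C) @ diff       # 2. Mahalanobis distance
        if d < best_d:
            best, best_d = i, d
    # select smallest distance, or reject if even the best is too large
    return best if best_d < reject_thresh else None
```

The Mahalanobis distance weights each axis by its observed variance, so directions in which the training shapes varied a lot are penalised less than directions in which they barely varied.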
EigenFaces [Turk & Pentland '91]

Learning: collect a set of pictures of K people (varying capture conditions):
- use PCA eigenanalysis to compute the eigenfaces of the complete set (there is a special trick for large RxC matrices)
- represent each person i = 1..K as the corresponding vector w_i in PCA weight space, i.e. as a weighted sum of eigenfaces
- relies on common alignment of the subject in the images!

EigenFaces in detail

[Figure: mean face and a subset of principal component axes/images [Morris '04]]

Recognition: given an unknown face image F:
- compute the projection of F onto PCA weight space, i.e. as a weighted sum of eigenfaces; result: weight vector w_f
- measure the distance d() between vector w_f and the vectors w_1...w_K
- if d(w_f, w_i) < threshold, identify the face as person i; else return it as an unknown face

EigenFaces Performance

2,500-image (128x128) database [Turk & Pentland '91] with varied face lighting, orientation and size:
- 96% successful recognition over lighting variation
- 85% successful recognition over orientation variation
- 64% successful recognition over size variation

Issues for discussion:
- face position, orientation, scale variance & occlusion are big problems
- face position/orientation identification is still a major research topic
- 96% still means 4 failures per 100 people (arriving at a busy airport!)

Summary

- Principal Component Analysis (PCA): constructing a structural model of variation; PCA via eigenanalysis
- Point Distribution Models (PDM): constructing a statistical model of variation; learning and recognition using PDM & PCA
- EigenPictures / EigenFaces: PCA extended to raw images

Next lecture: PCA/PDM to T models
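As a closing sketch tying the eigenface slides together: treat each RxC image as an RC-vector, compute eigenfaces via the small N x N matrix trick, store one weight vector per person, and recognise by nearest distance in weight space. The data here is random noise standing in for face images, and the threshold value is invented; this is an illustration of the pipeline, not the paper's implementation.

```python
# Sketch of the eigenface pipeline on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
N, R, C = 8, 16, 16
faces = rng.normal(size=(N, R * C))      # stand-in for N training face images
mean_face = faces.mean(axis=0)
A = faces - mean_face                    # (N, RC); rank N-1 after mean removal

# "Special trick" for large RxC images: eigenvectors of the small N x N
# matrix A A' map (via A') to eigenfaces of the huge RC x RC scatter matrix.
evals, evecs = np.linalg.eigh(A @ A.T)
order = np.argsort(evals)[::-1][:N - 1]  # keep top N-1, drop the ~zero eigenvalue
eigenfaces = (A.T @ evecs[:, order]).T   # rows are displayable RxC eigenfaces
eigenfaces /= np.linalg.norm(eigenfaces, axis=1, keepdims=True)

W = A @ eigenfaces.T                     # one weight vector w_i per training face

def recognise(F, threshold=1e3):
    """Project unknown image F into weight space; return nearest person or None."""
    wf = (F - mean_face) @ eigenfaces.T
    dists = np.linalg.norm(W - wf, axis=1)
    i = int(np.argmin(dists))
    return i if dists[i] < threshold else None
```

As the performance slide notes, this only works when the subjects are commonly aligned; with real images, normalising position, orientation, and scale first is essential.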