You Are How You Walk: Gait Recognition from Motion Capture Data


Slide 1: You Are How You Walk: Gait Recognition from Motion Capture Data
Michal Balazia, Faculty of Informatics, Masaryk University, Brno, Czech Republic
(Michal Balazia, FI MU; Gait Recognition from MoCap Data; May 2nd; 26 slides)

Slide 2: Human Identification by Gait
- Data captured by a system of multiple cameras or a depth camera
- Large tracking space
- Multiple samples of a single walker
- High variance in encounter conditions
- Database of a large number of biometric samples
- Identification in real time

Slide 3: Motion Capture Data (MoCap)
- Structural motion data: a skeleton of joints and bones
- Data = 3D positions of joints in time
- Can be collected by a system of multiple cameras (Vicon) or a depth camera (Microsoft Kinect)

Slide 4: Raw MoCap Gait Data
- The model of the human body has J joints; a measured gait cycle has a length of T video frames
- A raw MoCap gait sample is the tensor

      g = [ γ_1(1) ⋯ γ_1(T) ]
          [   ⋮    ⋱    ⋮   ]
          [ γ_J(1) ⋯ γ_J(T) ]

  where γ_j(t) ∈ R³ are the 3D coordinates of joint j ∈ {1, …, J} at frame t ∈ {1, …, T}
- Dimensionality 3JT; sample space {g}
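As a sketch, the raw gait sample can be held as a NumPy tensor and flattened into its 3JT-dimensional vector form; the joint and frame counts below are illustrative, not the ones used in the talk:

```python
import numpy as np

# A raw MoCap gait sample: J joints tracked over T frames, each a 3D position.
# J and T are placeholder values (a CMU-style skeleton, one gait cycle).
J, T = 31, 120
rng = np.random.default_rng(0)
gamma = rng.standard_normal((J, T, 3))   # gamma[j, t] = 3D coordinates of joint j at frame t

# As a feature vector, the sample g has dimensionality 3*J*T.
g = gamma.reshape(-1)
assert g.shape == (3 * J * T,)
```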

Slide 5: Geometric Features
- Geometric gait features are signals over video frames
- Examples of geometric gait features: joint angles (angle in shoulder-elbow-wrist), inter-joint distances (feet distance), joint velocity or acceleration, areas of joint polygons (upper body span), …
- Examples of distance functions on feature signals: Dynamic Time Warping, Minkowski distances, …
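A minimal sketch of one geometric feature (feet distance) and of Dynamic Time Warping as a distance between feature signals; the joint indices and function names are placeholders, not taken from any published implementation:

```python
import numpy as np

def feet_distance(gamma, lfoot=7, rfoot=8):
    """Per-frame Euclidean distance between the two feet.
    The joint indices are placeholders; real indices depend on the skeleton."""
    return np.linalg.norm(gamma[lfoot] - gamma[rfoot], axis=-1)  # shape (T,)

def dtw(a, b):
    """Plain dynamic-time-warping distance between two 1D feature signals."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of a match, an insertion, or a deletion step.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Unlike a Minkowski distance, DTW tolerates signals of different lengths and local tempo changes between two gait cycles.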

Slide 6: Linearly Learned Latent Features
- Labeled learning sample space {(g_n, l_n)}, n = 1, …, N_L; l_n is a label of one of the identity classes {I_c}, c = 1, …, C_L; I_c has an a-priori probability p_c
- Consider an optimization criterion J
- Feature extraction is given by a feature matrix Φ ∈ R^{D×D̂} mapping the D-dimensional sample space {g_n} to the D̂-dimensional feature space {ĝ_n}
- Transform gait samples g_n into gait templates ĝ_n = Φ^T g_n
- Examples of distance functions: Mahalanobis distance, Minkowski distances, …
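A small sketch of the template transform ĝ = Φ^T g and of the Mahalanobis distance on templates; all names here are illustrative:

```python
import numpy as np

def to_templates(Phi, samples):
    """Map flattened gait samples g_n to templates g_hat_n = Phi^T g_n.
    Phi is a (D x D_hat) feature matrix; samples is an (N x D) array."""
    return np.asarray(samples) @ Phi                  # (N, D) @ (D, D_hat)

def mahalanobis(a, b, cov_inv):
    """Mahalanobis distance between two templates, given an inverse covariance."""
    d = np.asarray(a) - np.asarray(b)
    return float(np.sqrt(d @ cov_inv @ d))
```

With the identity covariance, the Mahalanobis distance reduces to the Euclidean distance; in practice the covariance is estimated from the learned template space.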

Slide 7: Learning by MMC
- Optimize the class separability of the feature space
- The margin of two classes is the Euclidean distance of their means μ_c minus both their variances Σ_c
- Maximum Margin Criterion (the margin concept used by Support Vector Machines):

      J = (1/2) Σ_{c,c'=1}^{C_L} p_c p_{c'} ( (μ_c − μ_{c'})^T (μ_c − μ_{c'}) − tr(Σ_c + Σ_{c'}) ) = … = tr(Σ_B − Σ_W)

- Between-class scatter matrix Σ_B, within-class scatter matrix Σ_W
- Criterion for a feature matrix Φ:

      J(Φ) = tr(Φ^T (Σ_B − Σ_W) Φ)

- Solution: solve the generalized eigenvalue problem (Σ_B − Σ_W) Φ = Λ Φ
- Mahalanobis distance function on templates
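The MMC solution above can be sketched in a few lines of NumPy: the columns of Φ are the top eigenvectors of Σ_B − Σ_W, which maximize J(Φ). This is our illustrative reading of the criterion, not the author's code:

```python
import numpy as np

def mmc_feature_matrix(X, y, d):
    """Learn a (D x d) MMC feature matrix Phi from flattened samples X (N x D)
    and identity labels y (N,). Variable names are ours."""
    X, y = np.asarray(X, float), np.asarray(y)
    mu = X.mean(axis=0)
    D = X.shape[1]
    S_B, S_W = np.zeros((D, D)), np.zeros((D, D))
    for c in np.unique(y):
        Xc = X[y == c]
        p_c = len(Xc) / len(X)                  # a-priori class probability
        d_c = (Xc.mean(axis=0) - mu)[:, None]
        S_B += p_c * (d_c @ d_c.T)              # between-class scatter
        S_W += p_c * np.cov(Xc, rowvar=False, bias=True)  # within-class scatter
    w, V = np.linalg.eigh(S_B - S_W)            # symmetric, so eigh applies
    return V[:, np.argsort(w)[::-1][:d]]        # top-d eigenvectors as columns
```

Because Σ_B − Σ_W is symmetric, the generalized problem reduces to an ordinary symmetric eigendecomposition here.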

Slide 8: Learning by PCA+LDA
- Two-stage feature extraction technique: Principal Component Analysis followed by Linear Discriminant Analysis
- Total scatter matrix Σ_T = Σ_B + Σ_W
- Criteria for the feature matrices:

      J(Φ_PCA) = tr(Φ_PCA^T Σ_T Φ_PCA)
      J(Φ_LDA) = tr( (Φ_LDA^T Φ_PCA^T Σ_B Φ_PCA Φ_LDA) (Φ_LDA^T Φ_PCA^T Σ_W Φ_PCA Φ_LDA)^{-1} )

- Solution: solve the generalized eigenvalue problems

      Σ_T Φ_PCA = Λ Φ_PCA
      (Φ_PCA^T Σ_W Φ_PCA)^{-1} (Φ_PCA^T Σ_B Φ_PCA) Φ_LDA = Λ Φ_LDA

- Mahalanobis distance function on templates
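A hedged NumPy sketch of the two-stage extraction: PCA keeps the top directions of the total scatter Σ_T, then LDA solves its eigenproblem inside the PCA subspace. Regularization, priors, and numerical safeguards are simplified, and the function name is ours:

```python
import numpy as np

def pca_lda(X, y, d_pca, d_lda):
    """Two-stage PCA+LDA feature extraction (sketch). Returns the combined
    (D x d_lda) feature matrix Phi_pca @ Phi_lda."""
    X = np.asarray(X, float)
    Xc = X - X.mean(axis=0)
    # Stage 1: PCA, i.e. top eigenvectors of the total scatter matrix Sigma_T.
    S_T = np.cov(Xc, rowvar=False, bias=True)
    w, V = np.linalg.eigh(S_T)
    Phi_pca = V[:, np.argsort(w)[::-1][:d_pca]]
    Z = Xc @ Phi_pca
    # Stage 2: LDA inside the PCA subspace.
    S_B = np.zeros((d_pca, d_pca)); S_W = np.zeros((d_pca, d_pca))
    for c in np.unique(y):
        Zc = Z[np.asarray(y) == c]
        p_c = len(Zc) / len(Z)
        m = Zc.mean(axis=0)[:, None]            # Z is centered, so global mean ~ 0
        S_B += p_c * (m @ m.T)
        S_W += p_c * np.cov(Zc, rowvar=False, bias=True)
    # Generalized eigenproblem S_B v = lambda S_W v, via S_W^{-1} S_B.
    w2, V2 = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(w2.real)[::-1][:d_lda]
    Phi_lda = V2[:, order].real
    return Phi_pca @ Phi_lda
```

The initial PCA stage makes Φ_PCA^T Σ_W Φ_PCA invertible when the raw dimensionality 3JT exceeds the number of learning samples, which is the usual reason for the two-stage design.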

Slide 9: Identity Classification Pipeline
[Figure: a spotted walker passes through four phases. Phase I: acquiring MoCap data (MoCap data); Phase II: detecting gait (gait sample); Phase III: extracting gait features (gait template); Phase IV: identifying walkers (identified walker).]

Slide 10: Identity Classification Pipeline
[Figure: example sightings of an identified walker at different GPS coordinates and times (2017/06/21 07:55, 2017/06/23 13:24:19, 2017/06/24 22:49:38).]

Slide 11: The Classification Problem
- Identity: a label of a registered identity class

Slide 12: But in a Video Surveillance Environment…
- How can we represent identity in a video surveillance environment?

Slide 13: The Class Discovery Problem
- Identity: the content of a discovered identity class = a movement history
- You are how you walk.
[Figure: timeline from a crime to the present time.]

Slide 14: Universal Gait Features
- Data need to be acquired without the walker's consent
- New identities can appear on the fly
- Labels for all encountered people may not always be available
- Universal gait features: features with a high power in recognizing all people, not only those they were learned on
- We learn the universal gait features by MMC or by PCA+LDA on an auxiliary dataset
- The dataset is assumed to be rich in covariate conditions (aspects of walk that people differ in)
- These features create an unsupervised environment particularly suitable for uncooperative person identification

Slide 15: Evaluation: Database
- ASF/AMC format of MoCap data
- CMU MoCap database obtained from the CMU Graphics Lab
- The extracted database contains only gait cycles (motions of two steps)
- Normalization: position, walk direction and skeleton
- 7 extracted databases, varying in the number of identities and gait cycles

Slide 16: Evaluation: Data Separation
- Samples are separated into learning and evaluation sets
- Normal evaluation: learning and evaluation sets are drawn from the same identities
- Cross-identity evaluation: learning and evaluation identities are disjoint
- Classification performance is estimated by nested cross-validation
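The cross-identity separation above can be sketched as a split by identity rather than by sample, so that features learned on one group must generalize to people never seen during learning; the function name is ours:

```python
import numpy as np

def cross_identity_split(labels, n_learning_ids, seed=0):
    """Split samples so that the learning and evaluation sets share no
    identities. Returns two boolean masks over the samples."""
    labels = np.asarray(labels)
    rng = np.random.default_rng(seed)
    ids = np.unique(labels)
    rng.shuffle(ids)
    learn_ids = ids[:n_learning_ids]            # identities used for learning
    learn_mask = np.isin(labels, learn_ids)
    return learn_mask, ~learn_mask
```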

Slide 17: Evaluation: Metrics
- Class separability coefficients: Davies-Bouldin Index (DBI), Dunn Index (DI), Silhouette Coefficient (SC), Fisher's Discriminant Ratio (FDR)
- Classification-based metrics: Cumulative Match Characteristic (CMC), False Accept/Reject Rate, Receiver Operating Characteristic (ROC), Recall/Precision Rate, Correct Classification Rate (CCR), Equal Error Rate (EER), Area Under ROC Curve (AUC), Mean Average Precision (MAP)
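Two of the listed metrics, CCR and EER, are simple enough to sketch; this is an illustrative implementation, not the framework's evaluation code:

```python
import numpy as np

def ccr(y_true, y_pred):
    """Correct Classification Rate: the fraction of correctly labeled samples."""
    return float(np.mean(np.asarray(y_true) == np.asarray(y_pred)))

def eer(genuine, impostor):
    """Equal Error Rate: the error at the threshold where the false-accept
    rate (impostors accepted) meets the false-reject rate (genuines rejected).
    Scores are similarities: higher means a more likely genuine match."""
    genuine = np.asarray(genuine, float)
    impostor = np.asarray(impostor, float)
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    far = np.array([np.mean(impostor >= t) for t in thresholds])
    frr = np.array([np.mean(genuine < t) for t in thresholds])
    i = int(np.argmin(np.abs(far - frr)))       # closest FAR/FRR crossing
    return float((far[i] + frr[i]) / 2)
```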

Slide 18: Evaluation: Results
[Table: class separability coefficients (DBI, DI, SC, FDR), classification-based metrics (CCR, EER, AUC, MAP) and scalability (DCT, TD) for all compared methods: Ahmed, Ali, Andersson, Ball, Dikovski, Gavrilova, Jiang, Krzeszowski, Kumar, Kwolek, Preis, Sedmidubsky, Sinha, MMC (BR), MMC (JC), PCA+LDA (BR), PCA+LDA (JC), Random, Raw (BR) and Raw (JC).]

Slide 19: Evaluation: Results
[Figure]

Slide 20: Evaluation: Results
[Figure: DBI and CCR for the homogeneous and heterogeneous set-ups with C_L = C_E ∈ {2, …, 27}.]

Slide 21: Evaluation: Results
[Figure: DBI and CCR for the heterogeneous set-up with C_L ∈ {2, …, 27} and C_E = 27, and for the heterogeneous set-up with C_L ∈ {2, …, 52} and C_E = 54.]

Slide 22: Evaluation: Results
[Table 2: classification metrics (CCR, EER, AUC, MAP) for all implemented methods in the homogeneous setup, evaluated on the full database with 10-fold cross-validation. Methods ordered by their CCR score: RawJC, MMCBR, RawBR, MMCJC, PCA+LDABR, KwolekB, KrzeszowskiT, PCA+LDAJC, DikovskiB, AhmedF, AnderssonVO, NareshKumarMS, JiangS, BallA, SinhaA, AhmedM, SedmidubskyJ, AliS, PreisJ, Random.]
[Figure: Cumulative Match Characteristic, i.e. cumulative correct classification rate vs. rank.]

Slide 23: Evaluation: Results
[Figure 3: simulations with 31 different (C_L, C_E) configurations (horizontal axes) on four evaluation metrics (vertical axes): (a) DBI, (b) SC, (c) AUC, (d) MAP. Compared methods: Ahmed, Ali, Andersson, Ball, Dikovski, Kwolek, Preis, Sinha, MMC, PCA+LDA.]

Slide 24: Evaluation: Results
[Figure 4: four evaluated metrics of the MMC method with incomplete data. In each column a subset of joints is systematically excluded from the input: the first 31 columns (root … head) exclude a single joint and the last 14 columns (pelvis … all but legs) exclude multiple joints. Structure of the human body: head; pelvis = {root, lhipjoint, rhipjoint}; left leg = {lfemur, ltibia, lfoot, ltoes}; right leg = {rfemur, rtibia, rfoot, rtoes}; left arm = {lhumerus, lradius, lwrist, lhand, lfingers, lthumb}; right arm = {rhumerus, rradius, rwrist, rhand, rfingers, rthumb}; torso = {lowerback, upperback, thorax, lowerneck, upperneck, lclavicle, rclavicle}. Configuration (9, 55).]
[Figure 5: ROC and PR of all methods with noisy data, configuration (9, 55). (a) x% noise simulated by multiplying each measured value by a random number in the interval (1 − x/100, 1 + x/100); (b) x% noise simulated by substituting each measured value with a random value with probability x%.]

Slide 25: Evaluation Framework and Database
- Available online
- Database extraction drive
- Implementations of all 20 methods
- Classifier learning and classification mechanism
- Evaluation mechanism and 12 performance metrics

Slide 26: Summary
- Universal gait features learned on an auxiliary dataset
- Our approach is based on MMC and PCA+LDA
- Broad evaluation on normal and cross-identity setups
- MMC learned on 17 identities recognizes 37 identities with 95% CCR
- MMC learned on as few as 7 identities best recognizes 57 identities
- Evaluation framework and database

Thank you for your attention. Questions?


More information

Image Analysis & Retrieval. Lec 14. Eigenface and Fisherface

Image Analysis & Retrieval. Lec 14. Eigenface and Fisherface Image Analysis & Retrieval Lec 14 Eigenface and Fisherface Zhu Li Dept of CSEE, UMKC Office: FH560E, Email: lizhu@umkc.edu, Ph: x 2346. http://l.web.umkc.edu/lizhu Z. Li, Image Analysis & Retrv, Spring

More information

Distance Metric Learning in Data Mining (Part II) Fei Wang and Jimeng Sun IBM TJ Watson Research Center

Distance Metric Learning in Data Mining (Part II) Fei Wang and Jimeng Sun IBM TJ Watson Research Center Distance Metric Learning in Data Mining (Part II) Fei Wang and Jimeng Sun IBM TJ Watson Research Center 1 Outline Part I - Applications Motivation and Introduction Patient similarity application Part II

More information

PCA and LDA. Man-Wai MAK

PCA and LDA. Man-Wai MAK PCA and LDA Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: S.J.D. Prince,Computer

More information

Basics of Multivariate Modelling and Data Analysis

Basics of Multivariate Modelling and Data Analysis Basics of Multivariate Modelling and Data Analysis Kurt-Erik Häggblom 2. Overview of multivariate techniques 2.1 Different approaches to multivariate data analysis 2.2 Classification of multivariate techniques

More information

Face Recognition. Lecture-14

Face Recognition. Lecture-14 Face Recognition Lecture-14 Face Recognition imple Approach Recognize faces (mug shots) using gray levels (appearance). Each image is mapped to a long vector of gray levels. everal views of each person

More information

Using Tobit Kalman filtering in order to improve the motion recorded by Microsoft Kinect

Using Tobit Kalman filtering in order to improve the motion recorded by Microsoft Kinect International Workshop on Applied Probabilities 8th International Workshop on Applied Probabilities IWAP2016 Using Tobit Kalman filtering in order to improve the motion recorded by Microsoft Kinect K.

More information

Dimension Reduction (PCA, ICA, CCA, FLD,

Dimension Reduction (PCA, ICA, CCA, FLD, Dimension Reduction (PCA, ICA, CCA, FLD, Topic Models) Yi Zhang 10-701, Machine Learning, Spring 2011 April 6 th, 2011 Parts of the PCA slides are from previous 10-701 lectures 1 Outline Dimension reduction

More information

Motivating the Covariance Matrix

Motivating the Covariance Matrix Motivating the Covariance Matrix Raúl Rojas Computer Science Department Freie Universität Berlin January 2009 Abstract This note reviews some interesting properties of the covariance matrix and its role

More information

Class 4: Classification. Quaid Morris February 11 th, 2011 ML4Bio

Class 4: Classification. Quaid Morris February 11 th, 2011 ML4Bio Class 4: Classification Quaid Morris February 11 th, 211 ML4Bio Overview Basic concepts in classification: overfitting, cross-validation, evaluation. Linear Discriminant Analysis and Quadratic Discriminant

More information

What is Principal Component Analysis?

What is Principal Component Analysis? What is Principal Component Analysis? Principal component analysis (PCA) Reduce the dimensionality of a data set by finding a new set of variables, smaller than the original set of variables Retains most

More information

Data Mining. Dimensionality reduction. Hamid Beigy. Sharif University of Technology. Fall 1395

Data Mining. Dimensionality reduction. Hamid Beigy. Sharif University of Technology. Fall 1395 Data Mining Dimensionality reduction Hamid Beigy Sharif University of Technology Fall 1395 Hamid Beigy (Sharif University of Technology) Data Mining Fall 1395 1 / 42 Outline 1 Introduction 2 Feature selection

More information

Unsupervised Anomaly Detection for High Dimensional Data

Unsupervised Anomaly Detection for High Dimensional Data Unsupervised Anomaly Detection for High Dimensional Data Department of Mathematics, Rowan University. July 19th, 2013 International Workshop in Sequential Methodologies (IWSM-2013) Outline of Talk Motivation

More information

Notation. Pattern Recognition II. Michal Haindl. Outline - PR Basic Concepts. Pattern Recognition Notions

Notation. Pattern Recognition II. Michal Haindl. Outline - PR Basic Concepts. Pattern Recognition Notions Notation S pattern space X feature vector X = [x 1,...,x l ] l = dim{x} number of features X feature space K number of classes ω i class indicator Ω = {ω 1,...,ω K } g(x) discriminant function H decision

More information

Example: Face Detection

Example: Face Detection Announcements HW1 returned New attendance policy Face Recognition: Dimensionality Reduction On time: 1 point Five minutes or more late: 0.5 points Absent: 0 points Biometrics CSE 190 Lecture 14 CSE190,

More information

Dimensionality Reduction

Dimensionality Reduction Dimensionality Reduction Le Song Machine Learning I CSE 674, Fall 23 Unsupervised learning Learning from raw (unlabeled, unannotated, etc) data, as opposed to supervised data where a classification of

More information

Chemometrics: Classification of spectra

Chemometrics: Classification of spectra Chemometrics: Classification of spectra Vladimir Bochko Jarmo Alander University of Vaasa November 1, 2010 Vladimir Bochko Chemometrics: Classification 1/36 Contents Terminology Introduction Big picture

More information

Machine Learning 11. week

Machine Learning 11. week Machine Learning 11. week Feature Extraction-Selection Dimension reduction PCA LDA 1 Feature Extraction Any problem can be solved by machine learning methods in case of that the system must be appropriately

More information

Fisher Linear Discriminant Analysis

Fisher Linear Discriminant Analysis Fisher Linear Discriminant Analysis Cheng Li, Bingyu Wang August 31, 2014 1 What s LDA Fisher Linear Discriminant Analysis (also called Linear Discriminant Analysis(LDA)) are methods used in statistics,

More information

Principal Component Analysis, A Powerful Scoring Technique

Principal Component Analysis, A Powerful Scoring Technique Principal Component Analysis, A Powerful Scoring Technique George C. J. Fernandez, University of Nevada - Reno, Reno NV 89557 ABSTRACT Data mining is a collection of analytical techniques to uncover new

More information

Assignment 3. Introduction to Machine Learning Prof. B. Ravindran

Assignment 3. Introduction to Machine Learning Prof. B. Ravindran Assignment 3 Introduction to Machine Learning Prof. B. Ravindran 1. In building a linear regression model for a particular data set, you observe the coefficient of one of the features having a relatively

More information

Overview of Statistical Tools. Statistical Inference. Bayesian Framework. Modeling. Very simple case. Things are usually more complicated

Overview of Statistical Tools. Statistical Inference. Bayesian Framework. Modeling. Very simple case. Things are usually more complicated Fall 3 Computer Vision Overview of Statistical Tools Statistical Inference Haibin Ling Observation inference Decision Prior knowledge http://www.dabi.temple.edu/~hbling/teaching/3f_5543/index.html Bayesian

More information

Image Analysis & Retrieval Lec 14 - Eigenface & Fisherface

Image Analysis & Retrieval Lec 14 - Eigenface & Fisherface CS/EE 5590 / ENG 401 Special Topics, Spring 2018 Image Analysis & Retrieval Lec 14 - Eigenface & Fisherface Zhu Li Dept of CSEE, UMKC http://l.web.umkc.edu/lizhu Office Hour: Tue/Thr 2:30-4pm@FH560E, Contact:

More information

High Dimensional Discriminant Analysis

High Dimensional Discriminant Analysis High Dimensional Discriminant Analysis Charles Bouveyron LMC-IMAG & INRIA Rhône-Alpes Joint work with S. Girard and C. Schmid ASMDA Brest May 2005 Introduction Modern data are high dimensional: Imagery:

More information

Eigenimaging for Facial Recognition

Eigenimaging for Facial Recognition Eigenimaging for Facial Recognition Aaron Kosmatin, Clayton Broman December 2, 21 Abstract The interest of this paper is Principal Component Analysis, specifically its area of application to facial recognition

More information

Template-based Recognition of Static Sitting Postures

Template-based Recognition of Static Sitting Postures Template-based Recognition of Static Sitting Postures Manli Zhu 1, Aleix M. Martínez 1 and Hong Z. Tan 2 1 Dept. of Electrical Engineering The Ohio State University 2 School of Electrical and Computer

More information

Linear Classifiers as Pattern Detectors

Linear Classifiers as Pattern Detectors Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2013/2014 Lesson 18 23 April 2014 Contents Linear Classifiers as Pattern Detectors Notation...2 Linear

More information

Distance Metric Learning

Distance Metric Learning Distance Metric Learning Technical University of Munich Department of Informatics Computer Vision Group November 11, 2016 M.Sc. John Chiotellis: Distance Metric Learning 1 / 36 Outline Computer Vision

More information

Technical Report. Results from 200 billion iris cross-comparisons. John Daugman. Number 635. June Computer Laboratory

Technical Report. Results from 200 billion iris cross-comparisons. John Daugman. Number 635. June Computer Laboratory Technical Report UCAM-CL-TR-635 ISSN 1476-2986 Number 635 Computer Laboratory Results from 200 billion iris cross-comparisons John Daugman June 2005 15 JJ Thomson Avenue Cambridge CB3 0FD United Kingdom

More information

Table of Contents. Multivariate methods. Introduction II. Introduction I

Table of Contents. Multivariate methods. Introduction II. Introduction I Table of Contents Introduction Antti Penttilä Department of Physics University of Helsinki Exactum summer school, 04 Construction of multinormal distribution Test of multinormality with 3 Interpretation

More information

Dimensionality Reduction

Dimensionality Reduction Lecture 5 1 Outline 1. Overview a) What is? b) Why? 2. Principal Component Analysis (PCA) a) Objectives b) Explaining variability c) SVD 3. Related approaches a) ICA b) Autoencoders 2 Example 1: Sportsball

More information

Unsupervised machine learning

Unsupervised machine learning Chapter 9 Unsupervised machine learning Unsupervised machine learning (a.k.a. cluster analysis) is a set of methods to assign objects into clusters under a predefined distance measure when class labels

More information

Recognition Using Class Specific Linear Projection. Magali Segal Stolrasky Nadav Ben Jakov April, 2015

Recognition Using Class Specific Linear Projection. Magali Segal Stolrasky Nadav Ben Jakov April, 2015 Recognition Using Class Specific Linear Projection Magali Segal Stolrasky Nadav Ben Jakov April, 2015 Articles Eigenfaces vs. Fisherfaces Recognition Using Class Specific Linear Projection, Peter N. Belhumeur,

More information

Relationship between identification metrics: Expected Confusion and Area Under a ROC curve

Relationship between identification metrics: Expected Confusion and Area Under a ROC curve Relationship between identification metrics: Expected Confusion and Area Under a ROC curve Amos Y. Johnson Aaron F. obick Electrical and Computer Engineering VU Center/College of Computing eorgia Tech

More information

Robotics 2 Target Tracking. Giorgio Grisetti, Cyrill Stachniss, Kai Arras, Wolfram Burgard

Robotics 2 Target Tracking. Giorgio Grisetti, Cyrill Stachniss, Kai Arras, Wolfram Burgard Robotics 2 Target Tracking Giorgio Grisetti, Cyrill Stachniss, Kai Arras, Wolfram Burgard Linear Dynamical System (LDS) Stochastic process governed by is the state vector is the input vector is the process

More information

A Tutorial on Data Reduction. Principal Component Analysis Theoretical Discussion. By Shireen Elhabian and Aly Farag

A Tutorial on Data Reduction. Principal Component Analysis Theoretical Discussion. By Shireen Elhabian and Aly Farag A Tutorial on Data Reduction Principal Component Analysis Theoretical Discussion By Shireen Elhabian and Aly Farag University of Louisville, CVIP Lab November 2008 PCA PCA is A backbone of modern data

More information

Dimensionality Reduction:

Dimensionality Reduction: Dimensionality Reduction: From Data Representation to General Framework Dong XU School of Computer Engineering Nanyang Technological University, Singapore What is Dimensionality Reduction? PCA LDA Examples:

More information

Dimensionality Reduction and Principal Components

Dimensionality Reduction and Principal Components Dimensionality Reduction and Principal Components Nuno Vasconcelos (Ken Kreutz-Delgado) UCSD Motivation Recall, in Bayesian decision theory we have: World: States Y in {1,..., M} and observations of X

More information

STA 414/2104: Lecture 8

STA 414/2104: Lecture 8 STA 414/2104: Lecture 8 6-7 March 2017: Continuous Latent Variable Models, Neural networks With thanks to Russ Salakhutdinov, Jimmy Ba and others Outline Continuous latent variable models Background PCA

More information

Course 495: Advanced Statistical Machine Learning/Pattern Recognition

Course 495: Advanced Statistical Machine Learning/Pattern Recognition Course 495: Advanced Statistical Machine Learning/Pattern Recognition Deterministic Component Analysis Goal (Lecture): To present standard and modern Component Analysis (CA) techniques such as Principal

More information

Machine Learning. CUNY Graduate Center, Spring Lectures 11-12: Unsupervised Learning 1. Professor Liang Huang.

Machine Learning. CUNY Graduate Center, Spring Lectures 11-12: Unsupervised Learning 1. Professor Liang Huang. Machine Learning CUNY Graduate Center, Spring 2013 Lectures 11-12: Unsupervised Learning 1 (Clustering: k-means, EM, mixture models) Professor Liang Huang huang@cs.qc.cuny.edu http://acl.cs.qc.edu/~lhuang/teaching/machine-learning

More information