Machine Learning for Data Science (CS4786) Lecture 9: Principal Component Analysis
Course Webpage: http://www.cs.cornell.edu/courses/cs4786/2017fa/
DIM REDUCTION: LINEAR TRANSFORMATION

- Pick a low dimensional subspace
- Project linearly to this subspace
- Subspace retains as much information as possible

\[
y_t^\top = x_t^\top W, \qquad \text{i.e. } Y = X W, \text{ where } W \text{ is a } d \times K \text{ matrix.}
\]
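A minimal numpy sketch of this projection (the toy data, the sizes n = 100, d = 5, K = 2, and the random orthonormal W are all illustrative assumptions, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # toy data: n = 100 points in d = 5 dimensions

# W: a d x K matrix with orthonormal columns spanning the low-dimensional
# subspace (a random subspace here; PCA will pick a specific one later).
W, _ = np.linalg.qr(rng.normal(size=(5, 2)))

Y = X @ W                        # row t of Y is y_t^T = x_t^T W
print(Y.shape)                   # (100, 2): each point is now K = 2 numbers
```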
Example: Students in a classroom

[3-D scatter plot with axes x, y, z]
PCA: VARIANCE MAXIMIZATION

[scatter plot of 2-D data, shown over three animation frames with different candidate projection directions]
DIM REDUCTION: LINEAR TRANSFORMATION

Prelude: reducing to dimension 1

- Pick a low dimensional subspace spanned by a unit vector w
- Project linearly to this subspace
- Subspace retains as much information as possible

\[
y_t = w^\top x_t = \|x_t\| \cos\big(\angle(w, x_t)\big)
\]

[diagram: points x_1, ..., x_4 projected onto the line through w]
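The one-dimensional projection formula can be checked numerically (the particular vectors below are made up for illustration):

```python
import numpy as np

x = np.array([3.0, 4.0])
w = np.array([1.0, 0.0])                     # unit-norm direction

y = w @ x                                    # y = w^T x
cos_angle = w @ x / (np.linalg.norm(w) * np.linalg.norm(x))
assert np.isclose(y, np.linalg.norm(x) * cos_angle)  # y = ||x|| cos(angle(w, x))
print(y)  # 3.0
```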
PCA: VARIANCE MAXIMIZATION

Pick directions along which the data varies the most.

First principal component:

\[
\text{Variance}
= \frac{1}{n}\sum_{t=1}^{n}\Big(y_t - \frac{1}{n}\sum_{s=1}^{n} y_s\Big)^2
= \frac{1}{n}\sum_{t=1}^{n}\Big(w^\top x_t - \frac{1}{n}\sum_{s=1}^{n} w^\top x_s\Big)^2
= \frac{1}{n}\sum_{t=1}^{n}\big(w^\top(x_t - \mu)\big)^2
\]

= average squared inner product with the centered data
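A quick numerical check of the identity above: the variance of the projections equals the average squared inner product with the centered data (toy data and an arbitrary unit direction, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                # toy data
w = np.array([1.0, 0.0, 0.0])                # some unit direction
mu = X.mean(axis=0)

y = X @ w
var_direct = np.mean((y - y.mean()) ** 2)    # (1/n) sum_t (y_t - ybar)^2
var_centered = np.mean(((X - mu) @ w) ** 2)  # (1/n) sum_t (w^T (x_t - mu))^2
assert np.isclose(var_direct, var_centered)
```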
Which Direction?

[scatter plot: a direction that is on average parallel to the centered data vs. one that is on average orthogonal]

\[
\frac{1}{n}\sum_{t=1}^{n}\big(w^\top(x_t - \mu)\big)^2
= \frac{1}{n}\sum_{t=1}^{n}\|x_t - \mu\|^2 \cos^2\big(\angle(w,\, x_t - \mu)\big)
\]
PCA: VARIANCE MAXIMIZATION

Pick directions along which the data varies the most.

First principal component:

\[
w_1 = \arg\max_{\|w\|_2 = 1} \frac{1}{n}\sum_{t=1}^{n}\big(w^\top(x_t - \mu)\big)^2
= \arg\max_{\|w\|_2 = 1} w^\top \Big(\frac{1}{n}\sum_{t=1}^{n}(x_t - \mu)(x_t - \mu)^\top\Big) w
= \arg\max_{\|w\|_2 = 1} w^\top \Sigma\, w
\]

where \(\Sigma\) is the covariance matrix.
Review

- Review covariance
- Review eigenvectors
PCA: VARIANCE MAXIMIZATION

Covariance matrix:
\[
\Sigma = \frac{1}{n}\sum_{t=1}^{n}(x_t - \mu)(x_t - \mu)^\top
\]

It's a \(d \times d\) matrix; \(\Sigma[i,j]\) measures the covariance of features i and j:
\[
\Sigma[i,j] = \frac{1}{n}\sum_{t=1}^{n}\big(x_t[i] - \mu[i]\big)\big(x_t[j] - \mu[j]\big)
\]
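The covariance matrix on this slide can be computed directly (toy data, assumed for illustration; `np.cov` with `bias=True` uses the same 1/n convention as the formula):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))                # toy data: n = 500, d = 4
mu = X.mean(axis=0)

Xc = X - mu
Sigma = Xc.T @ Xc / len(X)                   # (1/n) sum_t (x_t - mu)(x_t - mu)^T

# Sigma[i, j] is the covariance of features i and j:
i, j = 0, 2
assert np.isclose(Sigma[i, j], np.mean(Xc[:, i] * Xc[:, j]))
# Matches numpy's covariance with the 1/n (biased) convention:
assert np.allclose(Sigma, np.cov(X.T, bias=True))
```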
What are Eigenvectors?

[plot: the unit circle and its image under the linear map \(x \mapsto Ax\)]

Eigenvectors are the directions that the map only scales:
\[
A x = \lambda x
\]
Which Direction?

[scatter plot: a direction on average parallel vs. on average orthogonal to the centered data]

Answer: the top eigenvector of the covariance matrix.
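A sketch verifying this claim numerically: the top eigenvector of the covariance matrix beats any other unit direction in projected variance (anisotropic toy data, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
# Toy data stretched along the first axis, so most variance lies there.
X = rng.normal(size=(1000, 2)) @ np.diag([3.0, 0.5])
Sigma = np.cov(X.T, bias=True)

eigvals, eigvecs = np.linalg.eigh(Sigma)     # eigenvalues in ascending order
w1 = eigvecs[:, -1]                          # top eigenvector

var_top = w1 @ Sigma @ w1                    # equals the largest eigenvalue
assert np.isclose(var_top, eigvals.max())

# No random unit direction does better.
for _ in range(100):
    u = rng.normal(size=2)
    u /= np.linalg.norm(u)
    assert u @ Sigma @ u <= var_top + 1e-9
```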
What if we want more than one number for each data point? That is, we want to reduce to K > 1 dimensions?

[3-D scatter plot with axes x, y, z]
PCA: VARIANCE MAXIMIZATION

How do we find the K components?

Ans: Maximize the sum of spreads in the K directions. We are looking for orthogonal directions that maximize the total spread in each direction.

Find orthonormal W that maximizes
\[
\frac{1}{n}\sum_{t=1}^{n}\sum_{j=1}^{K} y_t[j]^2, \qquad y_t[j] = w_j^\top x_t,
\]
subject to
\[
w_i \perp w_j \;\Rightarrow\; \sum_{k=1}^{d} w_i[k]\, w_j[k] = 0 \quad (i \ne j),
\qquad \|w_i\|^2 = \sum_{k=1}^{d} w_i[k]^2 = 1.
\]

This solution is given by W = top K eigenvectors of \(\Sigma\).
PRINCIPAL COMPONENT ANALYSIS

Eigenvectors of the covariance matrix are the principal components. The top K principal components are the eigenvectors with the K largest eigenvalues.

1. \(\Sigma = \mathrm{cov}(X)\)
2. \(W = \mathrm{eigs}(\Sigma, K)\)
3. \(Y = (X - \mathbf{1}\mu^\top)\, W\)   (Projection = centered data × top K eigenvectors)

Reconstruction = Projection × transpose of the top K eigenvectors.

Independently discovered by Pearson in 1901 and Hotelling in 1933.
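The numbered steps above can be sketched in a few lines of numpy (the helper `pca` and its toy input are illustrative assumptions, not the course's reference code):

```python
import numpy as np

def pca(X, K):
    """PCA via eigendecomposition of the covariance matrix (steps 1-3)."""
    mu = X.mean(axis=0)
    Sigma = np.cov(X.T, bias=True)            # 1. Sigma = cov(X)
    eigvals, eigvecs = np.linalg.eigh(Sigma)  # ascending eigenvalues
    W = eigvecs[:, ::-1][:, :K]               # 2. W = top K eigenvectors
    Y = (X - mu) @ W                          # 3. project the centered data
    return Y, W, mu

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 5))                 # toy data
Y, W, mu = pca(X, 2)
assert Y.shape == (300, 2)
assert np.allclose(W.T @ W, np.eye(2))        # components are orthonormal
```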
An Alternative View of PCA
PCA: MINIMIZING RECONSTRUCTION ERROR

[plot: a data point x and its projection \(\hat{x}\) onto a line]

\[
\frac{1}{n}\sum_{t=1}^{n}\|\hat{x}_t - x_t\|^2
\]
Maximize Spread ⇔ Minimize Reconstruction Error
ORTHONORMAL PROJECTIONS

- Think of \(w_1, \ldots, w_K\) as a coordinate system for PCA (in a K dimensional subspace); the y values provide the coefficients in this system.
- Without loss of generality, \(w_1, \ldots, w_K\) can be orthonormal, i.e.
\[
w_i \perp w_j \;\Rightarrow\; \sum_{k=1}^{d} w_i[k]\, w_j[k] = 0,
\qquad \|w_i\|_2^2 = \sum_{k=1}^{d} w_i[k]^2 = 1.
\]
- Reconstruction: \(\hat{x}_t = \sum_{j=1}^{K} y_t[j]\, w_j\).
- If we take all \(w_1, \ldots, w_d\), then \(x_t = \sum_{j=1}^{d} y_t[j]\, w_j\). To reduce dimensionality we only consider the first K vectors of the basis.
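These orthonormal-basis facts are easy to check numerically (a random orthonormal basis built with QR, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
d, K = 4, 2
# Columns of W form a full orthonormal basis w_1, ..., w_d.
W, _ = np.linalg.qr(rng.normal(size=(d, d)))

x = rng.normal(size=d)
y = W.T @ x                                  # coefficients y[j] = w_j^T x

# All d basis vectors reconstruct x exactly.
assert np.allclose(W @ y, x)

# Keeping only the first K gives the projection onto their span;
# the residual is orthogonal to every kept direction.
x_hat = W[:, :K] @ y[:K]
assert np.allclose(W[:, :K].T @ (x - x_hat), 0.0)
```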
CENTERING DATA

[two plots: compressing these data points is the same as compressing these, i.e. the same points shifted so the mean sits at the origin]
ORTHONORMAL PROJECTIONS

(Centered) data points as a linear combination of some orthonormal basis:
\[
x_t = \mu + \sum_{j=1}^{d} y_t[j]\, w_j
\]
where \(w_1, \ldots, w_d \in \mathbb{R}^d\) are the orthonormal basis and \(\mu = \frac{1}{n}\sum_{t=1}^{n} x_t\).

Represent the data as a linear combination of just K orthonormal basis vectors:
\[
\hat{x}_t = \mu + \sum_{j=1}^{K} y_t[j]\, w_j
\]
PCA: MINIMIZING RECONSTRUCTION ERROR

Goal: find the basis that minimizes the reconstruction error.

\[
\|\hat{x}_t - x_t\|_2^2
= \Big\|\sum_{j=1}^{K} y_t[j]\, w_j + \mu - x_t\Big\|_2^2
= \Big\|\sum_{j=K+1}^{d} y_t[j]\, w_j\Big\|_2^2
\]
\[
= \sum_{j=K+1}^{d} y_t[j]^2\, \|w_j\|_2^2
  + 2 \sum_{j=K+1}^{d} \sum_{i=j+1}^{d} y_t[j]\, y_t[i]\, w_j^\top w_i
\qquad \text{(but } \|a+b\|_2^2 = \|a\|_2^2 + \|b\|_2^2 + 2\, a^\top b\text{)}
\]
\[
= \sum_{j=K+1}^{d} y_t[j]^2\, \|w_j\|_2^2
\qquad \text{(last step because } w_j \perp w_i\text{)}
\]
PCA: MINIMIZING RECONSTRUCTION ERROR

\[
\|\hat{x}_t - x_t\|_2^2
= \sum_{j=K+1}^{d} y_t[j]^2\, \|w_j\|_2^2
= \sum_{j=K+1}^{d} y_t[j]^2 \qquad \text{(but } \|w_j\| = 1\text{)}
\]
\[
= \sum_{j=K+1}^{d} \big(w_j^\top (x_t - \mu)\big)^2 \qquad \text{(now } y_t[j] = w_j^\top(x_t - \mu)\text{)}
= \sum_{j=K+1}^{d} w_j^\top (x_t - \mu)(x_t - \mu)^\top w_j
\]

Averaging over the data points:
\[
\frac{1}{n}\sum_{t=1}^{n} \|\hat{x}_t - x_t\|_2^2 = \sum_{j=K+1}^{d} w_j^\top \Sigma\, w_j
\]
PCA: MINIMIZING RECONSTRUCTION ERROR

Minimize with respect to \(w_1, \ldots, w_K\) that are orthonormal:
\[
\operatorname*{argmin}_{\forall j,\; \|w_j\|_2 = 1} \; \sum_{j=K+1}^{d} w_j^\top \Sigma\, w_j
\]

Solution: the (discarded) \(w_{K+1}, \ldots, w_d\) are the bottom \(d - K\) eigenvectors. Hence \(w_1, \ldots, w_K\) are the top K eigenvectors.
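This conclusion can be verified numerically: with W = the top K eigenvectors, the average reconstruction error equals the sum of the d − K smallest eigenvalues (toy data, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(400, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
mu = X.mean(axis=0)
Sigma = np.cov(X.T, bias=True)
eigvals, eigvecs = np.linalg.eigh(Sigma)     # ascending order

K = 2
W = eigvecs[:, ::-1][:, :K]                  # top K eigenvectors
X_hat = mu + (X - mu) @ W @ W.T              # reconstruct from K components

avg_err = np.mean(np.sum((X_hat - X) ** 2, axis=1))
assert np.isclose(avg_err, eigvals[:-K].sum())   # sum of the d-K smallest
```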
PRINCIPAL COMPONENT ANALYSIS

Eigenvectors of the covariance matrix are the principal components. The top K principal components are the eigenvectors with the K largest eigenvalues.

1. \(\Sigma = \mathrm{cov}(X)\)
2. \(W = \mathrm{eigs}(\Sigma, K)\)
3. \(Y = (X - \mathbf{1}\mu^\top)\, W\)   (Projection = centered data × top K eigenvectors)
4. \(\hat{X} = Y W^\top + \mathbf{1}\mu^\top\)   (Reconstruction = projection × transpose of the top K eigenvectors, plus the mean)

Independently discovered by Pearson in 1901 and Hotelling in 1933.
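A round-trip sanity check of the reconstruction step (toy data, assumed for illustration; keeping all d components makes the reconstruction exact):

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(50, 3))                 # toy data
mu = X.mean(axis=0)
Sigma = np.cov(X.T, bias=True)
_, eigvecs = np.linalg.eigh(Sigma)

W = eigvecs[:, ::-1]             # all d eigenvectors (K = d)
Y = (X - mu) @ W                 # projection
X_hat = Y @ W.T + mu             # reconstruction: X_hat = Y W^T + mu
assert np.allclose(X_hat, X)     # exact when no components are discarded
```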