Pattern Recognition for Vision — Fall 2004
1 New Schedule
[Schedule table: lecture dates through Dec. 8 and Oct. 21; most entries unchanged ("same"), three shifted by 2 weeks, 1 week, and 1 week.]
Fall 2004 — Pattern Recognition for Vision

2 9.913 Pattern Recognition for Vision — Classification
Bernd Heisele, Fall 2004

3 Overview
- Introduction
- Linear Discriminant Analysis
- Support Vector Machines
- Literature & Homework

4 Introduction — Classification
- Linear and non-linear separation
- Two-class and multi-class problems
- Two approaches:
  - Density estimation, then classify with the Bayes decision rule: Linear Discriminant Analysis (LDA), Quadratic Discriminant Analysis (QDA)
  - Without density estimation: Support Vector Machines (SVM)

5 LDA — Bayes Rule
p(x, \omega_i) = p(x | \omega_i) P(\omega_i) = P(\omega_i | x) p(x)

P(\omega_i | x) = p(x | \omega_i) P(\omega_i) / p(x)

posterior = likelihood × prior / evidence
x: random vector, \omega_i: class i

6 LDA — Bayes Decision Rule
Decide \omega_1 if P(\omega_1 | x) > P(\omega_2 | x); otherwise decide \omega_2.

Likelihood ratio: p(x | \omega_1) P(\omega_1) > p(x | \omega_2) P(\omega_2)

Log-likelihood ratio: \ln\frac{p(x|\omega_1)}{p(x|\omega_2)} + \ln\frac{P(\omega_1)}{P(\omega_2)} > 0

7 LDA — Two Classes, Identical Covariance
Gaussian: p(x|\omega_i) = \frac{1}{(2\pi)^{d/2}|\Sigma_i|^{1/2}} e^{-\frac{1}{2}(x-\mu_i)^T \Sigma_i^{-1}(x-\mu_i)}

Assume identical covariance matrices \Sigma_1 = \Sigma_2 = \Sigma:

\ln\frac{p(x|\omega_1)}{p(x|\omega_2)} + \ln\frac{P(\omega_1)}{P(\omega_2)}
= -\frac{1}{2}(x-\mu_1)^T\Sigma^{-1}(x-\mu_1) + \frac{1}{2}(x-\mu_2)^T\Sigma^{-1}(x-\mu_2) + \ln\frac{P(\omega_1)}{P(\omega_2)}
= x^T\Sigma^{-1}(\mu_1-\mu_2) - \frac{1}{2}(\mu_1+\mu_2)^T\Sigma^{-1}(\mu_1-\mu_2) + \ln\frac{P(\omega_1)}{P(\omega_2)}
= x^T w + b

Linear decision function: decide \omega_1 if x^T w + b > 0.

8 LDA — Two Classes, Identical Covariance
Whitening transform: \Sigma = U D U^T, \Sigma^{-1} = U D^{-1} U^T = \Gamma\Gamma^T with \Gamma = U D^{-1/2}; rotation U^T followed by scaling D^{-1/2} gives \tilde{x} = \Gamma^T x, \tilde\mu_i = \Gamma^T \mu_i.

f(x) = x^T\Sigma^{-1}(\mu_1-\mu_2) - \frac{1}{2}(\mu_1+\mu_2)^T\Sigma^{-1}(\mu_1-\mu_2) + \ln(P(\omega_1)/P(\omega_2))
     = \tilde{x}^T(\tilde\mu_1-\tilde\mu_2) - \frac{1}{2}(\tilde\mu_1+\tilde\mu_2)^T(\tilde\mu_1-\tilde\mu_2) + \ln(P(\omega_1)/P(\omega_2))

In the whitened space (and with equal priors) this is a nearest-mean classifier: the boundary f(x) = 0 is the perpendicular bisector of \tilde\mu_1 and \tilde\mu_2.

9 LDA — Computation
Density estimation:
\hat\mu_i = \frac{1}{N_i}\sum_{n=1}^{N_i} x_{i,n}, \qquad \hat\Sigma = \frac{1}{N-2}\sum_{i=1}^{2}\sum_{n=1}^{N_i}(x_{i,n}-\hat\mu_i)(x_{i,n}-\hat\mu_i)^T

f(x) = \mathrm{sgn}(x^T w + b), \quad w = \hat\Sigma^{-1}(\hat\mu_1-\hat\mu_2), \quad b = -\frac{1}{2}(\hat\mu_1+\hat\mu_2)^T\hat\Sigma^{-1}(\hat\mu_1-\hat\mu_2) + \ln\frac{P(\omega_1)}{P(\omega_2)}

Approximate the priors by P(\omega_i) \approx N_i / N.
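The plug-in LDA computation on this slide can be sketched in a few lines of NumPy. This is a minimal illustration, not from the lecture: the two Gaussian classes, their means, and the sample sizes are hypothetical; the pooled covariance, w, and b follow the slide's formulas.

```python
import numpy as np

# Hypothetical synthetic data: two Gaussian classes with a shared covariance.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.3], [0.3, 1.0]])
X1 = rng.multivariate_normal([2.0, 0.0], cov, size=200)   # class omega_1
X2 = rng.multivariate_normal([-2.0, 0.0], cov, size=200)  # class omega_2

mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
n1, n2 = len(X1), len(X2)
# Pooled covariance estimate (N - 2 degrees of freedom, as on the slide).
S = ((X1 - mu1).T @ (X1 - mu1) + (X2 - mu2).T @ (X2 - mu2)) / (n1 + n2 - 2)

Si = np.linalg.inv(S)
w = Si @ (mu1 - mu2)
# Priors approximated by the class frequencies N_i / N.
b = -0.5 * (mu1 + mu2) @ Si @ (mu1 - mu2) + np.log(n1 / n2)

def lda_predict(x):
    """Decide omega_1 (+1) if x.w + b > 0, else omega_2 (-1)."""
    return np.sign(x @ w + b)

# Training accuracy (correct label is +1 for class 1, -1 for class 2).
acc = (np.concatenate([lda_predict(X1), -lda_predict(X2)]) > 0).mean()
```

With means this far apart the linear rule separates almost all training points.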

10 QDA — Two Classes, Different Covariance Matrices
Decide \omega_1 if f(x) > 0, with
f(x) = \ln p(x|\omega_1) + \ln P(\omega_1) - \ln p(x|\omega_2) - \ln P(\omega_2)

where \ln p(x|\omega_i) = -\frac{1}{2}\ln|\Sigma_i| - \frac{1}{2}(x-\mu_i)^T\Sigma_i^{-1}(x-\mu_i) + \mathrm{const}.

Quadratic Discriminant Analysis: f(x) = x^T A x + w^T x + w_0 — quadratic in x, with
A = -\frac{1}{2}(\Sigma_1^{-1}-\Sigma_2^{-1})  — a matrix
w = \Sigma_1^{-1}\mu_1 - \Sigma_2^{-1}\mu_2  — a vector
w_0 = "well, the rest of it"  — a scalar
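The QDA discriminant can be checked numerically: build A, w, and the scalar w_0 ("the rest of it") from two hypothetical Gaussians and verify that x^T A x + w^T x + w_0 equals the log-posterior difference it is derived from. The means, covariances, and priors below are made up for illustration.

```python
import numpy as np

# Two hypothetical Gaussians with different covariances and priors.
mu1, mu2 = np.array([1.0, 0.0]), np.array([-1.0, 0.0])
S1 = np.array([[2.0, 0.0], [0.0, 0.5]])
S2 = np.array([[0.5, 0.0], [0.0, 2.0]])
P1, P2 = 0.6, 0.4

S1i, S2i = np.linalg.inv(S1), np.linalg.inv(S2)
A = -0.5 * (S1i - S2i)                      # quadratic term (matrix)
w = S1i @ mu1 - S2i @ mu2                   # linear term (vector)
w0 = (-0.5 * mu1 @ S1i @ mu1 + 0.5 * mu2 @ S2i @ mu2
      - 0.5 * np.log(np.linalg.det(S1) / np.linalg.det(S2))
      + np.log(P1 / P2))                    # scalar ("the rest of it")

def f(x):
    return x @ A @ x + w @ x + w0

def log_ratio(x):
    """ln p(x|w1) + ln P1 - ln p(x|w2) - ln P2 (shared constants cancel)."""
    q1 = -0.5 * (x - mu1) @ S1i @ (x - mu1) - 0.5 * np.log(np.linalg.det(S1))
    q2 = -0.5 * (x - mu2) @ S2i @ (x - mu2) - 0.5 * np.log(np.linalg.det(S2))
    return q1 - q2 + np.log(P1 / P2)

x = np.array([0.3, -0.7])
err = abs(f(x) - log_ratio(x))
```

The two expressions agree to machine precision, confirming the expansion of the quadratic forms.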

11 LDA — Multiclass, Identical Covariance
Find the linear decision boundaries for k classes. For two classes we had f(x) = x^T w + b; decide \omega_1 if f(x) > 0.

In the multi-class case we have k-1 decision functions:
f_{1,2}(x) = x^T w_{1,2} + b_{1,2}, \quad f_{1,3}(x) = x^T w_{1,3} + b_{1,3}, \;\ldots,\; f_{1,k}(x) = x^T w_{1,k} + b_{1,k}

We have to determine (k-1)(p+1) parameters, where p is the dimension of x.

12 LDA — Multiclass, Identical Covariance
[Figure: class means \mu_1, \mu_2, \mu_3 in the input space X and their projections y(\mu_i) onto the discriminant directions w_1, w_2.]

Find the (k-1)-dimensional subspace that gives the best linear discrimination between the k classes:
y = (w_1\; w_2\; \ldots\; w_{k-1})^T x

Also known as the Fisher Linear Discriminant.

13 LDA — Multiclass, Identical Covariance: Computation
- Compute the d × k matrix M = (\mu_1\; \mu_2\; \ldots\; \mu_k) and the covariance matrix \Sigma.
- Whiten the means: \tilde{M} = \Gamma^T M, where \Sigma^{-1} = \Gamma\Gamma^T.
- Compute the covariance matrix \tilde{B} of the whitened means \tilde\mu_i.
- Compute the eigenvectors \tilde{v}_i of \tilde{B}, ranked by eigenvalues.
- Calculate y_i by projecting x into the whitened space and then onto the eigenvectors: y_i = \tilde{v}_i^T \Gamma^T x, i.e. w_i = \Gamma \tilde{v}_i.

14 LDA — Fisher's Approach
Find w such that the ratio of between-class to within-class variance is maximized when the data is projected onto w (y = w^T x):
\max_w \frac{w^T B w}{w^T \Sigma w}, \quad B = M M^T \text{ (the covariance of the } \mu_i\text{'s)}

This can be written as:
\max_w w^T B w \quad \text{subject to } w^T \Sigma w = 1

A generalized eigenvalue problem; the solutions are the ranked eigenvectors of \Sigma^{-1} B ... the same as in the previous derivation.
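Fisher's criterion can be sketched directly as the eigenproblem above: take the top eigenvector of \Sigma^{-1} B. The two-class, two-dimensional data here is hypothetical; with isotropic within-class covariance the Fisher direction should line up with the axis joining the class means.

```python
import numpy as np

# Hypothetical 2-class data, means at (+-3, 0), unit within-class variance.
rng = np.random.default_rng(1)
X1 = rng.normal([3.0, 0.0], [1.0, 1.0], size=(300, 2))
X2 = rng.normal([-3.0, 0.0], [1.0, 1.0], size=(300, 2))
X = np.vstack([X1, X2])

mu = X.mean(axis=0)
mus = np.stack([X1.mean(axis=0), X2.mean(axis=0)])
B = (mus - mu).T @ (mus - mu)              # between-class scatter of the means
S = np.cov(X1.T) / 2 + np.cov(X2.T) / 2    # within-class covariance

# Ranked eigenvectors of Sigma^-1 B solve the generalized eigenproblem.
vals, vecs = np.linalg.eig(np.linalg.inv(S) @ B)
w_fisher = np.real(vecs[:, np.argmax(np.real(vals))])

# Cosine of the angle to the means' difference axis (1, 0).
cos = abs(w_fisher @ np.array([1.0, 0.0])) / np.linalg.norm(w_fisher)
```

The recovered direction is nearly parallel to the mean-difference axis, as the criterion predicts for isotropic within-class scatter.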

15 The Coffee Problem: LDA vs. PCA
[Image removed due to copyright considerations. See: R. Gutierrez-Osuna.]

16 LDA/QDA — Summary
Advantages:
- LDA is the Bayes classifier for multivariate Gaussian distributions with common covariance.
- LDA creates linear boundaries, which are simple to compute.
- LDA can be used for representing multi-class data in low dimensions.
- QDA is the Bayes classifier for multivariate Gaussian distributions.
- QDA creates quadratic boundaries.
Problems:
- LDA is based on a single prototype per class (the class center), which is often insufficient in practice.

17 Variants of LDA
- Nonparametric LDA (Fukunaga): removes the unimodal assumption by computing the scatter matrix using local information. More than k-1 features can be extracted.
- Orthonormal LDA (Okada & Tomita): computes projections that maximize separability and are pair-wise orthonormal.
- Generalized LDA (Lowe): incorporates a cost function similar to Bayes risk minimization.
...and many, many more (see The Elements of Statistical Learning, Hastie, Tibshirani, Friedman).

18 SVM — Linear, Separable
Find the separating function f(x) = \mathrm{sgn}(x^T w + b) which maximizes the margin M = 2d on the training data: the maximum margin classifier.

19 SVM — Primal, Linearly Separable
Training data consists of pairs (x_1, y_1), \ldots, (x_N, y_N), y_i \in \{-1, 1\}. The problem of maximizing the margin d,
\max_{w,b} d \quad \text{subject to } y_i(x_i^T w + b) \ge d, \; \|w\| = 1,
can be reformulated with \tilde{w} = w/d, \tilde{b} = b/d as:
\min_{w,b} \frac{1}{2}\|w\|^2 \quad \text{subject to } y_i(x_i^T w + b) \ge 1, \quad \text{where } d = 1/\|w\|.

A convex optimization problem with a quadratic objective function and linear constraints.

20 SVM — Dual, Linearly Separable
Multiply the constraint equations by positive Lagrange multipliers \alpha_i and subtract them from the objective function:
L_P = \frac{1}{2}\|w\|^2 - \sum_{i=1}^{N}\alpha_i\left[y_i(x_i^T w + b) - 1\right]

Minimize L_P w.r.t. w and b and maximize w.r.t. \alpha_i, subject to \alpha_i \ge 0. Setting the derivatives \partial L_P/\partial w and \partial L_P/\partial b to zero gives
w = \sum_i \alpha_i y_i x_i, \qquad \sum_i \alpha_i y_i = 0.

Substituting into L_P we get the so-called Wolfe dual:
\max_\alpha L_D = \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_k \alpha_i\alpha_k y_i y_k\, x_i^T x_k \quad \text{subject to } \alpha_i \ge 0, \; \sum_i \alpha_i y_i = 0

Solve for \alpha, then compute w = \sum_i \alpha_i y_i x_i and b from \alpha_i\left[y_i(x_i^T w + b) - 1\right] = 0.

21 SVM — Primal vs. Dual (Linearly Separable)
Primal: \min_{w,b} \frac{1}{2}\|w\|^2 subject to y_i(x_i^T w + b) \ge 1.
Dual: \max_\alpha \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_k \alpha_i\alpha_k y_i y_k\, x_i^T x_k subject to \alpha_i \ge 0, \sum_i \alpha_i y_i = 0.

The primal has a dense inequality constraint for every point in the training set. The dual has a single dense equality constraint and a set of box constraints, which makes it easier to solve than the primal.

22 SVM — Optimality Conditions (Linearly Separable)
Optimality conditions for the linearly separable data:
\sum_i \alpha_i y_i = 0, \quad \alpha_i \ge 0 \;\forall i, \quad y_i(x_i^T w + b) - 1 \ge 0 \;\forall i, \quad \alpha_i\left(y_i(x_i^T w + b) - 1\right) = 0 \;\forall i

with w = \sum_i \alpha_i y_i x_i and f(x) = \sum_i \alpha_i y_i\, x_i^T x + b.

\alpha_i = 0 for points which are not on the boundary of the margin; points with \alpha_i > 0 are support vectors.
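These optimality conditions can be verified on a toy problem whose hard-margin solution is known in closed form (a hypothetical example, not from the slides): x_1 = (1, 0) with y_1 = +1 and x_2 = (-1, 0) with y_2 = -1 give w = (1, 0), b = 0, and \alpha = (0.5, 0.5).

```python
import numpy as np

# Two-point toy problem with a known analytic maximum-margin solution.
X = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, -1.0])
alpha = np.array([0.5, 0.5])

w = (alpha * y) @ X          # stationarity: w = sum_i alpha_i y_i x_i
b = 0.0
margins = y * (X @ w + b)    # y_i (x_i . w + b)

stationarity = np.allclose(w, [1.0, 0.0])
dual_feasible = bool(np.all(alpha >= 0)) and np.isclose((alpha * y).sum(), 0.0)
primal_feasible = bool(np.all(margins >= 1.0 - 1e-12))
complementarity = np.allclose(alpha * (margins - 1.0), 0.0)
kkt_ok = stationarity and dual_feasible and primal_feasible and complementarity
```

All four conditions hold, and both points sit exactly on the margin, so both have \alpha_i > 0: they are support vectors.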

23 SVM — Linear, Non-Separable
[Figure: hyperplane f(x) = 0 with margin planes f(x) = 1 and f(x) = -1, margin d = M/2, and slack \xi_i for points inside the margin.]

\min_{w,b} \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{N}\xi_i \quad \text{subject to } y_i(x_i^T w + b) \ge 1 - \xi_i \;\forall i, \quad \xi_i \ge 0 \;\forall i

The \xi_i are called slack variables; the constant C penalizes errors.

24 SVM — Dual (Linear, Non-Separable)
Same procedure as in the separable case:
\max_\alpha L_D = \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_k \alpha_i\alpha_k y_i y_k\, x_i^T x_k \quad \text{subject to } 0 \le \alpha_i \le C, \; \sum_i \alpha_i y_i = 0

Solve for \alpha, then compute w and b from y_i(x_i^T w + b) - 1 = 0 for any sample x_i for which 0 < \alpha_i < C.

25 SVM — Optimality Conditions (Linear, Non-Separable)
With f(x) = \sum_i \alpha_i y_i\, x_i^T x + b:
- \alpha_i = 0: \xi_i = 0, y_i f(x_i) > 1 — points outside the margin
- 0 < \alpha_i < C: \xi_i = 0, y_i f(x_i) = 1 — unbounded support vectors
- \alpha_i = C: \xi_i \ge 0, y_i f(x_i) \le 1 — bounded support vectors

26 SVM o-lear (L) x put space o-lear mappg: x =F( x ) x feature space x x Project to feature space, apply SVM procedure Fall 004 Patter Recogto for Vso

27 SVM — Kernel Trick
\max_\alpha \sum_i \alpha_i - \frac{1}{2}\sum_i\sum_k \alpha_i\alpha_k y_i y_k\, \Phi(x_i)^T\Phi(x_k) \quad \text{subject to } 0 \le \alpha_i \le C, \; \sum_i \alpha_i y_i = 0

Only the inner product of the samples appears in the objective function. If we can write
K(x_i, x_k) = \Phi(x_i)^T\Phi(x_k)
we can avoid any computations in the feature space. Using w = \sum_i \alpha_i y_i \Phi(x_i), the solution f(x) = \Phi(x)^T w + b can be written as:
f(x) = \sum_i \alpha_i y_i\, \Phi(x_i)^T\Phi(x) + b = \sum_i \alpha_i y_i\, K(x_i, x) + b
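The kernel form of the decision function can be sketched with a hypothetical trained state (the support vectors, multipliers, and bias below are made up, not the output of a real training run). For the linear kernel K(u, v) = u^T v the expansion must agree exactly with the primal form w^T x + b.

```python
import numpy as np

# Hypothetical "trained" state: support vectors, labels, multipliers, bias.
X = np.array([[1.0, 2.0], [-1.0, 0.5], [0.0, -1.0]])
y = np.array([1.0, -1.0, -1.0])
alpha = np.array([0.4, 0.3, 0.1])   # assumed Lagrange multipliers
b = -0.2

def k_lin(u, v):
    return u @ v                    # linear kernel K(u, v) = u.v

def f_kernel(x):
    # f(x) = sum_i alpha_i y_i K(x_i, x) + b
    return sum(a * yi * k_lin(xi, x) for a, yi, xi in zip(alpha, y, X)) + b

# Primal form: w = sum_i alpha_i y_i x_i, then f(x) = w.x + b.
w = (alpha * y) @ X
x_new = np.array([0.7, -0.3])
gap = abs(f_kernel(x_new) - (w @ x_new + b))
```

Swapping `k_lin` for any Mercer kernel turns the same expansion into a non-linear classifier without ever forming \Phi(x) explicitly.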

28 SVM — Kernels
When is a kernel K(u, v) an inner product in a Hilbert space? Mercer's condition:
K(u, v) = \sum_i \lambda_i \varphi_i(u)\varphi_i(v) with positive coefficients \lambda_i \;\Leftrightarrow\; \int\!\!\int K(u, v)\, g(u) g(v)\, du\, dv \ge 0 for any g \in L_2.

Some examples of commonly used kernels:
- Linear kernel: u^T v
- Polynomial kernel: (1 + u^T v)^d
- Gaussian kernel (RBF): \exp(-\|u - v\|^2), shift invariant
- MLP: \tanh(\kappa\, u^T v - \theta)

29 SVM — Polynomial Kernel, Shift-Invariant Kernels
Polynomial second-degree kernel, u, v \in R^2:
K(u, v) = (1 + u^T v)^2 = 1 + 2u_1v_1 + 2u_2v_2 + 2u_1u_2v_1v_2 + u_1^2v_1^2 + u_2^2v_2^2
= (1, \sqrt{2}u_1, \sqrt{2}u_2, \sqrt{2}u_1u_2, u_1^2, u_2^2)(1, \sqrt{2}v_1, \sqrt{2}v_2, \sqrt{2}v_1v_2, v_1^2, v_2^2)^T

so \Phi(x) = (1, \sqrt{2}x_1, \sqrt{2}x_2, \sqrt{2}x_1x_2, x_1^2, x_2^2).

A shift-invariant kernel K(u, v) = K(u - v) defined on [0, 1] can be written as the Fourier series of K:
K(u - v) = \sum_{k=-\infty}^{\infty} \lambda_k e^{j2\pi k u} e^{-j2\pi k v}, \quad \varphi_k(t) = e^{j2\pi k t}

Mercer's condition then requires \lambda_k \ge 0 \;\forall k.
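The second-degree identity is easy to confirm numerically: the kernel value (1 + u^T v)^2 and the explicit inner product \Phi(u)^T\Phi(v) must match for any u, v. The test points below are arbitrary.

```python
import numpy as np

def phi(x):
    """Explicit feature map of the 2nd-degree polynomial kernel on R^2."""
    s = np.sqrt(2.0)
    return np.array([1.0, s * x[0], s * x[1], s * x[0] * x[1],
                     x[0] ** 2, x[1] ** 2])

def k_poly(u, v):
    return (1.0 + u @ v) ** 2

# Arbitrary test points.
u = np.array([0.3, -1.2])
v = np.array([2.0, 0.5])
gap = abs(k_poly(u, v) - phi(u) @ phi(v))
```

The gap is zero up to floating-point error: one kernel evaluation replaces a 6-dimensional inner product, which is the point of the kernel trick.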

30 SVM — Uniqueness
Are the \alpha_i in the solution unique? No. Example (four points on the margin, with f(x) = \sum_i \alpha_i y_i\, x_i^T x + b and w = (1, 0)): the two multiplier vectors
\alpha = (0.5, 0.5, 0.5, 0.5) and \alpha = (0.5, 0.5, 0, 0)
both satisfy the constraints \alpha_i \ge 0, \sum_i \alpha_i y_i = 0 and yield the same w.

More important: is the solution f(x) unique? Yes — the solution is unique and global if the objective function is strictly convex.

31 SVM — Multiclass
Bottom-up (tree of pairwise classifiers: "A or B" vs. "C or D", then "A vs. B", "C vs. D"):
training: k(k-1)/2 classifiers; classification: k-1 evaluations.

One-vs-all ("A vs. all (B, C, D)", "B vs. all (A, C, D)", ...):
training: k classifiers; classification: k evaluations.
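The one-vs-all scheme can be sketched as k binary scorers with an argmax decision. The linear scorers below are hypothetical stand-ins for k trained SVMs; only the counting and the decision rule come from the slide.

```python
import numpy as np

# One-vs-all: one binary scorer per class; the largest score wins.
k = 4
# Hypothetical linear scorers (one row per class) standing in for trained SVMs.
W = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
b = np.zeros(k)

def predict(x):
    scores = W @ x + b          # k binary decision values f_c(x)
    return int(np.argmax(scores))

pred = predict(np.array([0.2, 0.9]))   # best aligned with class 2's direction

n_one_vs_all = k                # one-vs-all trains k classifiers
n_pairwise = k * (k - 1) // 2   # a pairwise tree trains k(k-1)/2
```

For k = 4 this is 4 classifiers for one-vs-all versus 6 for the pairwise scheme; the gap grows quadratically with k.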

32 SVM — Choosing the Kernel
How to choose the kernel?
- Linear SVMs are simple to compute and fast at run-time, but often not sufficient for complex tasks.
- SVMs with Gaussian kernels have shown excellent performance in many applications (after some tuning of sigma). Slow at run-time.
- Polynomial kernels with d = 2 are commonly used in computer vision applications: a good trade-off between classification performance and computational complexity.

33 SVM — Example
Face detection with linear and 2nd-degree polynomial SVMs & LDA.

34 SVM — Choosing C
How to choose the C-value in \min_{w,b} \frac{1}{2}\|w\|^2 + C\sum_i \xi_i?
- The C-value penalizes points within the margin.
- A large C-value can lead to poor generalization performance (over-fitting).
- From our own experience in object detection tasks: find a kernel and C-value which give you zero errors on the training set.

35 SVM — Computation during Classification
In computer vision applications fast classification is usually more important than fast training. Two ways of computing the decision function f(x):
a) f(x) = w^T\Phi(x) + b
b) f(x) = \sum_i \alpha_i y_i\, K(x_i, x) + b

Which one is faster?
- For a linear kernel: a).
- For a polynomial 2nd-degree kernel: a) needs on the order of the feature-space dimension (roughly n(n+3)/2 for n input dimensions) multiplications; b) needs roughly (n+1)s, where s is the number of support vectors.
- Gaussian kernel: only b), since the dimension of \Phi(x) is infinite.

36 Learning Theory — Problem Formulation
From a given set of training examples \{x_i, y_i\} learn the mapping x \mapsto y. The learning machine is defined by a set of possible mappings x \mapsto f(x, \alpha), where \alpha is the adjustable parameter of f. The goal is to minimize the expected risk R:
R(\alpha) = \int V(f(x, \alpha), y)\, dP(x, y)

V is the loss function; P is the probability distribution function. We can't compute R(\alpha) since we don't know P(x, y).

37 Learning Theory — Empirical Risk Minimization
To solve the problem, minimize the "empirical risk" R_emp over the training set:
R_{emp}(\alpha) = \frac{1}{N}\sum_{i=1}^{N} V(f(x_i, \alpha), y_i)

Common loss functions:
- V(f(x), y) = (y - f(x))^2 — least squares
- V(f(x), y) = (1 - y f(x))_+ — hinge loss, where (x)_+ \equiv \max(x, 0)
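The two losses and the empirical risk are one-liners; here is a minimal sketch with hypothetical predictor outputs f(x_i) (the labels and scores below are made up for illustration).

```python
import numpy as np

def square_loss(fx, y):
    """Least-squares loss (y - f(x))^2."""
    return (y - fx) ** 2

def hinge_loss(fx, y):
    """Hinge loss (1 - y f(x))_+  with (x)_+ = max(x, 0)."""
    return np.maximum(1.0 - y * fx, 0.0)

# Hypothetical labels and predictor outputs f(x_i).
y = np.array([1.0, -1.0, 1.0, -1.0])
fx = np.array([2.0, -0.5, -0.3, 1.0])

# Empirical risk R_emp = (1/N) sum_i V(f(x_i), y_i).
r_emp_hinge = hinge_loss(fx, y).mean()
```

Note the hinge loss is zero for the first point (correct with margin y f(x) = 2 > 1) but positive for the second (correct yet inside the margin), which is exactly what drives the SVM's margin maximization.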

38 Learning Theory & SVM
Bound on the expected risk: for a loss function with 0 \le V(f(x), y) \le 1, with probability 1 - \eta, 0 \le \eta \le 1, the following bound holds:
R(\alpha) \le R_{emp}(\alpha) + \sqrt{\frac{h\left(\ln(2N/h) + 1\right) - \ln(\eta/4)}{N}}

R_{emp}(\alpha): empirical risk; N: number of training examples; h: Vapnik-Chervonenkis (VC) dimension.

The bound is independent of the probability distribution P(x, y). Keeping all parameters in the bound fixed except one: increasing the confidence 1 - \eta increases the bound, increasing N decreases the bound, and increasing h increases the bound.

39 Learning Theory — VC Dimension
The VC dimension is a property of a set of functions \{f(\alpha)\}. If for a set of N points labeled in all 2^N possible ways one can always find a function f(\alpha) which separates the points correctly, one says that the set of points is shattered by \{f(\alpha)\}. The VC dimension of \{f(\alpha)\} is the maximum number of points that can be shattered by \{f(\alpha)\}.

The VC dimension of the linear functions f: w^T x + b = 0 in n dimensions is n + 1.
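The n + 1 claim can be illustrated for n = 2 with a small exact construction (an illustration I am adding, not from the slides): three affinely independent points can be labeled in all 2^3 = 8 ways, and for each labeling the 3×3 linear system [x_1, x_2, 1] a = y is solvable exactly, so sign(w^T x + b) reproduces the labels.

```python
import numpy as np

# Three affinely independent points in R^2.
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
A = np.hstack([P, np.ones((3, 1))])   # rows [x1, x2, 1]; invertible 3x3

shattered = True
for bits in range(8):                 # all 2^3 labelings
    y = np.array([1.0 if (bits >> i) & 1 else -1.0 for i in range(3)])
    a = np.linalg.solve(A, y)         # a = (w1, w2, b): A a = y exactly
    if not np.all(np.sign(A @ a) == y):
        shattered = False
```

This shows the VC dimension of lines in the plane is at least 3; showing that no 4 points can be shattered (e.g. the XOR labeling of 4 points) gives exactly n + 1 = 3.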

40 Learning Theory — SVM
For the optimal hyperplanes, the expected risk E[R(N)] satisfies
E[R(N)] \le \frac{E[D^2/M^2]}{N}
where D is the diameter of the smallest sphere containing the data, M is the margin, and the expectation is over all training sets of size N.

"Algorithms that maximize the margin have better generalization performance."

41 Bounds
Most bounds on the expected risk are very loose; instead compute:
- Cross-validation error: the error on a cross-validation set which is different from the training set.
- Leave-one-out error: leave one training example out of the training set, train the classifier, and test on the example which was left out. Do this for all N examples. For SVMs it is upper bounded by the number of support vectors divided by N.
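The leave-one-out procedure is a short loop; here is a minimal sketch on a tiny hypothetical dataset, with a nearest-class-mean rule standing in for whatever classifier is being evaluated (the slide's procedure applies to any trainable classifier).

```python
import numpy as np

# Hypothetical 1-D data: two well-separated classes.
X = np.array([[0.0], [0.2], [0.4], [2.0], [2.2], [2.4]])
y = np.array([-1, -1, -1, 1, 1, 1])

errors = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i          # leave example i out
    Xtr, ytr = X[mask], y[mask]
    m_neg = Xtr[ytr == -1].mean(axis=0)    # "train": class means
    m_pos = Xtr[ytr == 1].mean(axis=0)
    # "test" on the held-out point with the nearest-mean rule
    pred = 1 if np.linalg.norm(X[i] - m_pos) < np.linalg.norm(X[i] - m_neg) else -1
    errors += int(pred != y[i])

loo_error = errors / len(X)
```

On this well-separated toy set every held-out point is classified correctly, so the leave-one-out error is 0.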

42 Regularization Theory
Given N examples (x_i, y_i), x_i \in R^n, y_i \in \{0, 1\}, solve:
\min_{f \in H} \frac{1}{N}\sum_{i=1}^{N} V(f(x_i), y_i) + \gamma\|f\|_K^2

H is a Reproducing Kernel Hilbert Space (RKHS) with reproducing kernel K; \gamma is the regularization parameter; \|f\|_K^2 is the norm in the RKHS and can be interpreted as a smoothness constraint.

Under rather general conditions the solution can be written as:
f(x) = \sum_{i=1}^{N} c_i K(x, x_i)

43 Regularization — Reproducing Kernel Hilbert Space (RKHS)
Reproducing property: f(x) = \langle K(x, y), f(y) \rangle_H.

With positive numbers \lambda_i and an orthonormal set of functions \varphi_i(x), \int \varphi_n(x)\varphi_m(x)\, dx = 0 for n \ne m and 1 otherwise:
K(x, y) \equiv \sum_i \lambda_i \varphi_i(x)\varphi_i(y)

For f(x) = \sum_i a_i\varphi_i(x) with a_i = \int f(x)\varphi_i(x)\, dx, and g(x) = \sum_i b_i\varphi_i(x):
\langle f, g \rangle_H = \sum_i \frac{a_i b_i}{\lambda_i}, \qquad \|f\|_H^2 = \langle f, f \rangle_H = \sum_i \frac{a_i^2}{\lambda_i}

The \lambda_i are nonnegative eigenvalues of K.

44 Regularization — Simple Example of an RKHS
The kernel is a one-dimensional Gaussian with \sigma = 1: K(x, y) = \exp(-(x-y)^2), x, y \in [0, 1]. Write K(x, y) as a Fourier expansion using the shift theorem:
K(x, y) = \sum_k \lambda_k \exp(j2\pi k x)\exp(-j2\pi k y)

where the \lambda_k are the Fourier coefficients of \exp(-x^2): \lambda_k = A\exp(-k^2/2). \lambda_k decreases with higher frequencies (increasing k); this is a property of most kernels.

The regularization term \|f\|_H^2 = \sum_k a_k^2/\lambda_k, where the a_k are the Fourier coefficients of f(x), therefore penalizes high frequencies more than low frequencies \Rightarrow smoothness!

45 Regularization — SVM
For the hinge loss function V(f(x), y) = (1 - y f(x))_+ it can be shown that the regularization problem is equivalent to the SVM problem:
\min_{f \in H} \frac{1}{N}\sum_{i=1}^{N}(1 - y_i f(x_i))_+ + \lambda\|f\|_K^2

Introducing slack variables \xi_i = (1 - y_i f(x_i))_+ we can rewrite:
\min_f \frac{1}{N}\sum_{i=1}^{N}\xi_i + \lambda\|f\|_K^2 \quad \text{subject to } y_i f(x_i) \ge 1 - \xi_i \text{ and } \xi_i \ge 0 \;\forall i

It can be shown that this is equivalent (up to b) to the SVM problem:
\min_{w,b} \frac{1}{2}\|w\|^2 + C\sum_i \xi_i, \quad C = 1/(2N\lambda)

46 SVM — Summary
- SVMs are maximum margin classifiers. Only training points close to the boundary (support vectors) occur in the SVM solution.
- The SVM problem is convex; the solution is global and unique.
- SVMs can handle non-separable data.
- Non-linear separation in the input space is possible by projecting the data into a feature space. All calculations can be done in the input space (kernel trick).
- SVMs are known to perform well in high-dimensional problems with few examples.
- Depending on the kernel, SVMs can be slow during classification.
- SVMs are binary classifiers; they are not efficient for problems with a large number of classes.

47 Literature
- T. Hastie, R. Tibshirani, J. Friedman: The Elements of Statistical Learning, Springer, 2001: LDA, QDA, extensions to LDA, SVM & regularization.
- C. Burges: A Tutorial on SVM for Pattern Recognition, 1999: learning theory, SVM.
- R. Rifkin: Everything Old is New Again: A Fresh Look at Historical Approaches in Machine Learning, 2002: SVM training, SVM multiclass.
- T. Evgeniou, M. Pontil, T. Poggio: Regularization Networks and SVMs, 1999: SVM & regularization.
- V. Vapnik: The Nature of Statistical Learning, 1995: statistical learning theory, SVM.

48 Homework
Classification problem on the MNIST handwritten digits data involving PCA, LDA and SVMs. PCA code will be posted today.


More information

Block-Based Compact Thermal Modeling of Semiconductor Integrated Circuits

Block-Based Compact Thermal Modeling of Semiconductor Integrated Circuits Block-Based Compact hermal Modelg of Semcoductor Itegrated Crcuts Master s hess Defese Caddate: Jg Ba Commttee Members: Dr. Mg-Cheg Cheg Dr. Daqg Hou Dr. Robert Schllg July 27, 2009 Outle Itroducto Backgroud

More information

Bounds on the expected entropy and KL-divergence of sampled multinomial distributions. Brandon C. Roy

Bounds on the expected entropy and KL-divergence of sampled multinomial distributions. Brandon C. Roy Bouds o the expected etropy ad KL-dvergece of sampled multomal dstrbutos Brado C. Roy bcroy@meda.mt.edu Orgal: May 18, 2011 Revsed: Jue 6, 2011 Abstract Iformato theoretc quattes calculated from a sampled

More information

X ε ) = 0, or equivalently, lim

X ε ) = 0, or equivalently, lim Revew for the prevous lecture Cocepts: order statstcs Theorems: Dstrbutos of order statstcs Examples: How to get the dstrbuto of order statstcs Chapter 5 Propertes of a Radom Sample Secto 55 Covergece

More information

LECTURE 21: Support Vector Machines

LECTURE 21: Support Vector Machines LECURE 2: Support Vector Maches Emprcal Rsk Mmzato he VC dmeso Structural Rsk Mmzato Maxmum mar hyperplae he Laraa dual problem Itroducto to Patter Aalyss Rcardo Guterrez-Osua exas A&M Uversty Itroducto

More information

Model Fitting, RANSAC. Jana Kosecka

Model Fitting, RANSAC. Jana Kosecka Model Fttg, RANSAC Jaa Kosecka Fttg: Issues Prevous strateges Le detecto Hough trasform Smple parametrc model, two parameters m, b m + b Votg strateg Hard to geeralze to hgher dmesos a o + a + a 2 2 +

More information

Department of Agricultural Economics. PhD Qualifier Examination. August 2011

Department of Agricultural Economics. PhD Qualifier Examination. August 2011 Departmet of Agrcultural Ecoomcs PhD Qualfer Examato August 0 Istructos: The exam cossts of sx questos You must aswer all questos If you eed a assumpto to complete a questo, state the assumpto clearly

More information

Vapnik Chervonenkis classes

Vapnik Chervonenkis classes Vapk Chervoeks classes Maxm Ragsky September 23, 2014 A key result o the ERM algorthm, proved the prevous lecture, was that P( f ) L (F ) + 4ER (F (Z )) + 2log(1/δ) wth probablty at least 1 δ. The quatty

More information

Maximum Likelihood Estimation

Maximum Likelihood Estimation Marquette Uverst Maxmum Lkelhood Estmato Dael B. Rowe, Ph.D. Professor Departmet of Mathematcs, Statstcs, ad Computer Scece Coprght 08 b Marquette Uverst Maxmum Lkelhood Estmato We have bee sag that ~

More information

QR Factorization and Singular Value Decomposition COS 323

QR Factorization and Singular Value Decomposition COS 323 QR Factorzato ad Sgular Value Decomposto COS 33 Why Yet Aother Method? How do we solve least-squares wthout currg codto-squarg effect of ormal equatos (A T A A T b) whe A s sgular, fat, or otherwse poorly-specfed?

More information

Lecture 3. Sampling, sampling distributions, and parameter estimation

Lecture 3. Sampling, sampling distributions, and parameter estimation Lecture 3 Samplg, samplg dstrbutos, ad parameter estmato Samplg Defto Populato s defed as the collecto of all the possble observatos of terest. The collecto of observatos we take from the populato s called

More information

Homework 1: Solutions Sid Banerjee Problem 1: (Practice with Asymptotic Notation) ORIE 4520: Stochastics at Scale Fall 2015

Homework 1: Solutions Sid Banerjee Problem 1: (Practice with Asymptotic Notation) ORIE 4520: Stochastics at Scale Fall 2015 Fall 05 Homework : Solutos Problem : (Practce wth Asymptotc Notato) A essetal requremet for uderstadg scalg behavor s comfort wth asymptotc (or bg-o ) otato. I ths problem, you wll prove some basc facts

More information

MA/CSSE 473 Day 27. Dynamic programming

MA/CSSE 473 Day 27. Dynamic programming MA/CSSE 473 Day 7 Dyamc Programmg Bomal Coeffcets Warshall's algorthm (Optmal BSTs) Studet questos? Dyamc programmg Used for problems wth recursve solutos ad overlappg subproblems Typcally, we save (memoze)

More information

Numerical Analysis Formulae Booklet

Numerical Analysis Formulae Booklet Numercal Aalyss Formulae Booklet. Iteratve Scemes for Systems of Lear Algebrac Equatos:.... Taylor Seres... 3. Fte Dfferece Approxmatos... 3 4. Egevalues ad Egevectors of Matrces.... 3 5. Vector ad Matrx

More information

hp calculators HP 30S Statistics Averages and Standard Deviations Average and Standard Deviation Practice Finding Averages and Standard Deviations

hp calculators HP 30S Statistics Averages and Standard Deviations Average and Standard Deviation Practice Finding Averages and Standard Deviations HP 30S Statstcs Averages ad Stadard Devatos Average ad Stadard Devato Practce Fdg Averages ad Stadard Devatos HP 30S Statstcs Averages ad Stadard Devatos Average ad stadard devato The HP 30S provdes several

More information

best estimate (mean) for X uncertainty or error in the measurement (systematic, random or statistical) best

best estimate (mean) for X uncertainty or error in the measurement (systematic, random or statistical) best Error Aalyss Preamble Wheever a measuremet s made, the result followg from that measuremet s always subject to ucertaty The ucertaty ca be reduced by makg several measuremets of the same quatty or by mprovg

More information

A conic cutting surface method for linear-quadraticsemidefinite

A conic cutting surface method for linear-quadraticsemidefinite A coc cuttg surface method for lear-quadratcsemdefte programmg Mohammad R. Osoorouch Calfora State Uversty Sa Marcos Sa Marcos, CA Jot wor wth Joh E. Mtchell RPI July 3, 2008 Outle: Secod-order coe: defto

More information

ECON 5360 Class Notes GMM

ECON 5360 Class Notes GMM ECON 560 Class Notes GMM Geeralzed Method of Momets (GMM) I beg by outlg the classcal method of momets techque (Fsher, 95) ad the proceed to geeralzed method of momets (Hase, 98).. radtoal Method of Momets

More information

Ordinary Least Squares Regression. Simple Regression. Algebra and Assumptions.

Ordinary Least Squares Regression. Simple Regression. Algebra and Assumptions. Ordary Least Squares egresso. Smple egresso. Algebra ad Assumptos. I ths part of the course we are gog to study a techque for aalysg the lear relatoshp betwee two varables Y ad X. We have pars of observatos

More information

Chapter 9 Jordan Block Matrices

Chapter 9 Jordan Block Matrices Chapter 9 Jorda Block atrces I ths chapter we wll solve the followg problem. Gve a lear operator T fd a bass R of F such that the matrx R (T) s as smple as possble. f course smple s a matter of taste.

More information

Lecture 3. Least Squares Fitting. Optimization Trinity 2014 P.H.S.Torr. Classic least squares. Total least squares.

Lecture 3. Least Squares Fitting. Optimization Trinity 2014 P.H.S.Torr. Classic least squares. Total least squares. Lecture 3 Optmzato Trt 04 P.H.S.Torr Least Squares Fttg Classc least squares Total least squares Robust Estmato Fttg: Cocepts ad recpes Least squares le fttg Data:,,,, Le equato: = m + b Fd m, b to mmze

More information

Nonparametric Density Estimation Intro

Nonparametric Density Estimation Intro Noarametrc Desty Estmato Itro Parze Wdows No-Parametrc Methods Nether robablty dstrbuto or dscrmat fucto s kow Haes qute ofte All we have s labeled data a lot s kow easer salmo bass salmo salmo Estmate

More information

1 Solution to Problem 6.40

1 Solution to Problem 6.40 1 Soluto to Problem 6.40 (a We wll wrte T τ (X 1,...,X where the X s are..d. wth PDF f(x µ, σ 1 ( x µ σ g, σ where the locato parameter µ s ay real umber ad the scale parameter σ s > 0. Lettg Z X µ σ we

More information

Relations to Other Statistical Methods Statistical Data Analysis with Positive Definite Kernels

Relations to Other Statistical Methods Statistical Data Analysis with Positive Definite Kernels Relatos to Other Statstcal Methods Statstcal Data Aalyss wth Postve Defte Kerels Kej Fukuzu Isttute of Statstcal Matheatcs, ROIS Departet of Statstcal Scece, Graduate Uversty for Advaced Studes October

More information

Solving Constrained Flow-Shop Scheduling. Problems with Three Machines

Solving Constrained Flow-Shop Scheduling. Problems with Three Machines It J Cotemp Math Sceces, Vol 5, 2010, o 19, 921-929 Solvg Costraed Flow-Shop Schedulg Problems wth Three Maches P Pada ad P Rajedra Departmet of Mathematcs, School of Advaced Sceces, VIT Uversty, Vellore-632

More information

Chapter 14 Logistic Regression Models

Chapter 14 Logistic Regression Models Chapter 4 Logstc Regresso Models I the lear regresso model X β + ε, there are two types of varables explaatory varables X, X,, X k ad study varable y These varables ca be measured o a cotuous scale as

More information

UNIT 2 SOLUTION OF ALGEBRAIC AND TRANSCENDENTAL EQUATIONS

UNIT 2 SOLUTION OF ALGEBRAIC AND TRANSCENDENTAL EQUATIONS Numercal Computg -I UNIT SOLUTION OF ALGEBRAIC AND TRANSCENDENTAL EQUATIONS Structure Page Nos..0 Itroducto 6. Objectves 7. Ital Approxmato to a Root 7. Bsecto Method 8.. Error Aalyss 9.4 Regula Fals Method

More information

α1 α2 Simplex and Rectangle Elements Multi-index Notation of polynomials of degree Definition: The set P k will be the set of all functions:

α1 α2 Simplex and Rectangle Elements Multi-index Notation of polynomials of degree Definition: The set P k will be the set of all functions: Smplex ad Rectagle Elemets Mult-dex Notato = (,..., ), o-egatve tegers = = β = ( β,..., β ) the + β = ( + β,..., + β ) + x = x x x x = x x β β + D = D = D D x x x β β Defto: The set P of polyomals of degree

More information

Gender Classification from ECG Signal Analysis using Least Square Support Vector Machine

Gender Classification from ECG Signal Analysis using Least Square Support Vector Machine Amerca Joural of Sgal Processg, (5): 45-49 DOI:.593/.asp.5.8 Geder Classfcato from ECG Sgal Aalyss usg Least Square Support Vector Mache Raesh Ku. rpathy,*, Ashutosh Acharya, Sumt Kumar Choudhary Departmet

More information

Bayesian Classification. CS690L Data Mining: Classification(2) Bayesian Theorem: Basics. Bayesian Theorem. Training dataset. Naïve Bayes Classifier

Bayesian Classification. CS690L Data Mining: Classification(2) Bayesian Theorem: Basics. Bayesian Theorem. Training dataset. Naïve Bayes Classifier Baa Classfcato CS6L Data Mg: Classfcato() Referece: J. Ha ad M. Kamber, Data Mg: Cocepts ad Techques robablstc learg: Calculate explct probabltes for hypothess, amog the most practcal approaches to certa

More information

STK4011 and STK9011 Autumn 2016

STK4011 and STK9011 Autumn 2016 STK4 ad STK9 Autum 6 Pot estmato Covers (most of the followg materal from chapter 7: Secto 7.: pages 3-3 Secto 7..: pages 3-33 Secto 7..: pages 35-3 Secto 7..3: pages 34-35 Secto 7.3.: pages 33-33 Secto

More information

Ideal multigrades with trigonometric coefficients

Ideal multigrades with trigonometric coefficients Ideal multgrades wth trgoometrc coeffcets Zarathustra Brady December 13, 010 1 The problem A (, k) multgrade s defed as a par of dstct sets of tegers such that (a 1,..., a ; b 1,..., b ) a j = =1 for all

More information

CS 2750 Machine Learning. Lecture 8. Linear regression. CS 2750 Machine Learning. Linear regression. is a linear combination of input components x

CS 2750 Machine Learning. Lecture 8. Linear regression. CS 2750 Machine Learning. Linear regression. is a linear combination of input components x CS 75 Mache Learg Lecture 8 Lear regresso Mlos Hauskrecht mlos@cs.ptt.edu 539 Seott Square CS 75 Mache Learg Lear regresso Fucto f : X Y s a lear combato of put compoets f + + + K d d K k - parameters

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Revew for the prevous lecture: Theorems ad Examples: How to obta the pmf (pdf) of U = g (, Y) ad V = g (, Y) Chapter 4 Multple Radom Varables Chapter 44 Herarchcal Models ad Mxture Dstrbutos Examples:

More information

Simulation Output Analysis

Simulation Output Analysis Smulato Output Aalyss Summary Examples Parameter Estmato Sample Mea ad Varace Pot ad Iterval Estmato ermatg ad o-ermatg Smulato Mea Square Errors Example: Sgle Server Queueg System x(t) S 4 S 4 S 3 S 5

More information

Comparison of SVMs in Number Plate Recognition

Comparison of SVMs in Number Plate Recognition Comparso of SVMs Number Plate Recogto Lhog Zheg, Xaga He ad om Htz Uversty of echology, Sydey, Departmet of Computer Systems, {lzheg, sea, htz}@t.uts.edu.au Abstract. Hgh accuracy ad hgh speed are two

More information

3D Geometry for Computer Graphics. Lesson 2: PCA & SVD

3D Geometry for Computer Graphics. Lesson 2: PCA & SVD 3D Geometry for Computer Graphcs Lesso 2: PCA & SVD Last week - egedecomposto We wat to lear how the matrx A works: A 2 Last week - egedecomposto If we look at arbtrary vectors, t does t tell us much.

More information

Lecture 7. Confidence Intervals and Hypothesis Tests in the Simple CLR Model

Lecture 7. Confidence Intervals and Hypothesis Tests in the Simple CLR Model Lecture 7. Cofdece Itervals ad Hypothess Tests the Smple CLR Model I lecture 6 we troduced the Classcal Lear Regresso (CLR) model that s the radom expermet of whch the data Y,,, K, are the outcomes. The

More information

Principal Components. Analysis. Basic Intuition. A Method of Self Organized Learning

Principal Components. Analysis. Basic Intuition. A Method of Self Organized Learning Prcpal Compoets Aalss A Method of Self Orgazed Learg Prcpal Compoets Aalss Stadard techque for data reducto statstcal patter matchg ad sgal processg Usupervsed learg: lear from examples wthout a teacher

More information

CS286.2 Lecture 4: Dinur s Proof of the PCP Theorem

CS286.2 Lecture 4: Dinur s Proof of the PCP Theorem CS86. Lecture 4: Dur s Proof of the PCP Theorem Scrbe: Thom Bohdaowcz Prevously, we have prove a weak verso of the PCP theorem: NP PCP 1,1/ (r = poly, q = O(1)). Wth ths result we have the desred costat

More information

1 Review and Overview

1 Review and Overview CS9T/STATS3: Statstcal Learg Teory Lecturer: Tegyu Ma Lecture #7 Scrbe: Bra Zag October 5, 08 Revew ad Overvew We wll frst gve a bref revew of wat as bee covered so far I te frst few lectures, we stated

More information

Lecture Note to Rice Chapter 8

Lecture Note to Rice Chapter 8 ECON 430 HG revsed Nov 06 Lecture Note to Rce Chapter 8 Radom matrces Let Y, =,,, m, =,,, be radom varables (r.v. s). The matrx Y Y Y Y Y Y Y Y Y Y = m m m s called a radom matrx ( wth a ot m-dmesoal dstrbuto,

More information

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 2 STATISTICAL INFERENCE

THE ROYAL STATISTICAL SOCIETY 2009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE 2 STATISTICAL INFERENCE THE ROYAL STATISTICAL SOCIETY 009 EXAMINATIONS SOLUTIONS GRADUATE DIPLOMA MODULAR FORMAT MODULE STATISTICAL INFERENCE The Socety provdes these solutos to assst caddates preparg for the examatos future

More information

PROJECTION PROBLEM FOR REGULAR POLYGONS

PROJECTION PROBLEM FOR REGULAR POLYGONS Joural of Mathematcal Sceces: Advaces ad Applcatos Volume, Number, 008, Pages 95-50 PROJECTION PROBLEM FOR REGULAR POLYGONS College of Scece Bejg Forestry Uversty Bejg 0008 P. R. Cha e-mal: sl@bjfu.edu.c

More information

Cubic Nonpolynomial Spline Approach to the Solution of a Second Order Two-Point Boundary Value Problem

Cubic Nonpolynomial Spline Approach to the Solution of a Second Order Two-Point Boundary Value Problem Joural of Amerca Scece ;6( Cubc Nopolyomal Sple Approach to the Soluto of a Secod Order Two-Pot Boudary Value Problem W.K. Zahra, F.A. Abd El-Salam, A.A. El-Sabbagh ad Z.A. ZAk * Departmet of Egeerg athematcs

More information

ECONOMETRIC THEORY. MODULE VIII Lecture - 26 Heteroskedasticity

ECONOMETRIC THEORY. MODULE VIII Lecture - 26 Heteroskedasticity ECONOMETRIC THEORY MODULE VIII Lecture - 6 Heteroskedastcty Dr. Shalabh Departmet of Mathematcs ad Statstcs Ida Isttute of Techology Kapur . Breusch Paga test Ths test ca be appled whe the replcated data

More information