Statistical Paradigm


1 EE 6882 Overview of Statistical Models for Video Indexing. Prof. Shih-Fu Chang, Columbia University. TA: Eric Zavesky. Fall 2007, Lecture 4. Course web site: http://

Statistical Paradigm
- Many problems can be posed as pattern recognition:
  - Image classification: indoor vs. outdoor? Face?
  - Shot boundary detection, story segmentation: is the current point a boundary?
- Statistical models to handle uncertainty and provide flexibility
- Image processing tools available (e.g., homework #)
- Rich tools for learning and prediction (see course web site)
- Increasing data available: NIST TREC Video (3+ hours), consumer and YouTube videos

2 A Very High-Level Statistical Pattern Recognition Architecture (from Jain, Duin, and Mao, SPR Review, 2000)

Important issues (1)
- Image/video processing: what is the adequate quality, resolution, etc.?
- Feature extraction: color, texture, motion, region, shape, interest points, audio, speech, text, etc.
- Feature representation: histogram, bag, graph, etc.
  - Invariance to scale, rotation, translation, view, illumination, ...
  - How to reduce dimensions?

3 Important issues (2)
- Distance measurement: how to measure similarity between images/videos?
  - L1, L2, Mahalanobis, Earth Mover's distance, vector/graph matching
- Classification models
  - Generative vs. discriminative
  - Multimodal fusion, early fusion vs. late fusion (e.g., how to use joint audio-visual features to detect events such as dancing or wedding)
- Efficiency issues: how to speed up the training and testing processes? How to rapidly build a model for new domains?
- Validation and evaluation: how to measure performance? Are models generalizable to new domains?

Three related problems
- Retrieval, ranking: given a query image, find relevant ones; may apply a rank threshold to decide relevance
- Classification, categorization, detection: given an image x, predict class label y
- Clustering, grouping: group images/videos into clusters of distinct attributes

4 An example: news story segmentation using multimodal, multiscale features

First understand the data types and explore their unique characteristics.
[Figure: percentage of content by segment type]
(a) regular anchor segment; (b) different anchor; (c) multi-story in an anchor segment; (d) continuous sports briefings; (e) continuous short briefings; (f) separated by music or animation; (g) weather report; (h) anchor lead-in before commercial; (i) commercial after sports. Legend: story, weather, commercial, misc./animation, visual anchors.

5 News Story Segmentation

Objective: is there a story boundary at time τ_k?
- τ = {shot boundaries or significant pauses}; the observation at each candidate τ_k is drawn from {video, audio} over time ... τ_{k-1}, τ_k, τ_{k+1} ...
- An anchor face? Motion changes? A change from music to speech? A speech segment? Do {cue words}_i or {cue words}_j appear?

Need to decide how to formulate the features. Challenge: diverse features.

Modality | Raw feature | Data type | Value
Video | motion | segment | continuous
Video | shot boundary | point | binary
Video | face | segment | continuous
Video | commercial | segment | binary
Audio | pause | point | continuous
Audio | pitch jump | point | continuous
Audio | significant pause | point | continuous
Audio | music/speech discrimination | segment | binary
Audio | speech segment/rapidity | segment | continuous
Text | ASR cue terms | point | binary
Text | VOCR cue terms | point | binary
Text | text segmentation score | point | continuous
Misc. | combinatorial | point | binary
Misc. | sports | segment | binary

[Figure: feature streams (text segmentation score, music, commercial, significant pause, face, shot, motion) plotted around a candidate point]

One way is to use a binary predicate: if x > threshold, then predict a segment boundary (b = 1).

6 Example Predicates

Predicate | Raw feature set
- The surrounding observation window has a pause with duration larger than 0.5 second. | Pause
- A speech segment starts in the surrounding observation window. | Speech segment
- A commercial starts 5 to ... seconds after the candidate point. | Commercial
- A speech segment ends after the candidate point. | Speech segment
- A speech segment before the candidate point. | Speech segment
- An anchor face segment occupies at least ...% of the next window. | Anchor face
- An audio pause with duration larger than ... second appears after the boundary point. | Pause
- An anchor face segment just starts after the candidate point. | Anchor face
- The surrounding observation window has a significant pause with pitch jump intensity larger than the normalized pitch threshold, and pause duration larger than 0.5 second. | Significant pause
- A significant pause within the non-commercial section appears in the surrounding observation window. | Significant pause & non-commercial

Predicates → collect features from training samples. In one training sample, each row represents one predicate f_i(x, b): face, motion, significant pause, speech segment, commercial, text segmentation score, ASR cue terms, etc.

7 Choose Model: Maximum Entropy Model

$q_\lambda(b \mid x) = \frac{1}{Z_\lambda(x)} \exp\Big(\sum_i \lambda_i f_i(x, b)\Big)$, where $f_i(x, b) \in \{0, 1\}$ and $b \in \{0, 1\}$.

For example, let $f_1$ = 'anchor face' and $f_2$ = 'significant pause'. For the current observation face = YES, pause = NO:
$q(b = \text{YES} \mid x) = e^{\lambda_1} / (e^{\lambda_1} + 1)$ and $q(b = \text{NO} \mid x) = 1 / (e^{\lambda_1} + 1)$.
Classification: if $q(b = \text{YES} \mid x) > 0.5$, then predict YES.

Background: Entropy
- Entropy (bits): $H = -\sum_i p_i \log_2 p_i$

Kullback-Leibler (KL) distance
- A measure of distance between distributions $p(x)$ and $q(x)$:
  $D_{KL}(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}$, or $\int p(x) \log \frac{p(x)}{q(x)}\,dx$ in the continuous case
- $D_{KL} = 0$ iff $p = q$
- Not necessarily symmetric; may not satisfy the triangle inequality
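The two-predicate posterior above can be sketched in a few lines of pure Python. This is a minimal illustration, not the lecture's implementation; the weights λ = [1.2, 0.8] are hypothetical, and the convention that each feature fires only jointly with b = 1 is an assumption.

```python
import math

def maxent_posterior(features, lambdas):
    """q(b|x) = exp(sum_i lambda_i f_i(x,b)) / Z(x) for b in {0, 1}.
    Assumption: f_i(x,b) fires only when predicate i is on AND b = 1,
    so the b = 0 score is always 0."""
    score = {0: 0.0, 1: sum(l * f for l, f in zip(lambdas, features))}
    z = math.exp(score[0]) + math.exp(score[1])
    return {b: math.exp(s) / z for b, s in score.items()}

# Hypothetical weights for f1 = 'anchor face', f2 = 'significant pause'
lam = [1.2, 0.8]
q = maxent_posterior([1, 0], lam)   # observation: face = YES, pause = NO
# q[1] = e^{1.2} / (1 + e^{1.2}); predict YES since q[1] > 0.5
```

With only the face predicate firing, the posterior reduces to the slide's $e^{\lambda_1}/(e^{\lambda_1}+1)$ form, as the test below checks.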

8 How to Determine the Weights in the Model?

Estimate $q_\lambda(b \mid x)$ from training data $T = \{(x_k, b_k)\}$ by minimizing the Kullback-Leibler divergence
$D(\tilde p \,\|\, q_\lambda) = \sum_{x,b} \tilde p(b, x) \log \frac{\tilde p(b \mid x)}{q_\lambda(b \mid x)} = -\sum_{x,b} \tilde p(x, b) \log q_\lambda(b \mid x) + \text{constant}(\tilde p)$,
where $\tilde p$ is the empirical distribution from the data and $q$ is the estimated model. Equivalently, find $\lambda$ to maximize the log-likelihood $L_{\tilde p}(q_\lambda) = \sum_{x,b} \tilde p(x, b) \log q_\lambda(b \mid x)$.

Iteratively update $\lambda_i \leftarrow \lambda_i + \Delta\lambda_i$, with
$\Delta\lambda_i = \frac{1}{M} \log \frac{\sum_{x,b} \tilde p(x, b)\, f_i(x, b)}{\sum_{x,b} \tilde p(x)\, q_\lambda(b \mid x)\, f_i(x, b)}$.
The objective function is convex, so the iterative process can reach the optimum.

The same model is used to select features
- Input: collection of candidate features, training samples, and the desired model size
- Output: optimal subset of features and their corresponding exponential weights
- Current model $q$ augmented with feature $h$ with weight $\alpha$:
  $q_{\alpha,h}(b \mid x) = \frac{q(b \mid x)\, e^{\alpha h(x, b)}}{Z_\alpha(x)}$
- In each iteration, select the candidate which improves the current model the most:
  $h^* = \arg\max_{h \in C} \sup_\alpha \big[ D(\tilde p \| q) - D(\tilde p \| q_{\alpha,h}) \big] = \arg\max_{h \in C} \sup_\alpha \big[ L_{\tilde p}(q_{\alpha,h}) - L_{\tilde p}(q) \big]$
  (reduction of divergence = increase of log-likelihood)
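The weight estimation above can be sketched with plain gradient ascent on the log-likelihood rather than the iterative-scaling update in the slide (an explicit simplification: same objective, different update rule). The toy data below are hypothetical; because the objective is convex, both updates reach the same optimum.

```python
import math

def q_pos(x, lam):
    """q(b=1|x) for binary features that fire only with b=1: a logistic form."""
    s = sum(l * f for l, f in zip(lam, x))
    return 1.0 / (1.0 + math.exp(-s))

def train(data, n_feat, lr=0.5, iters=500):
    """Gradient ascent on L(q_lambda) (equivalently, minimizing D(p~ || q_lambda)).
    The gradient per feature is empirical count minus expected count."""
    lam = [0.0] * n_feat
    for _ in range(iters):
        grad = [0.0] * n_feat
        for x, b in data:
            err = b - q_pos(x, lam)
            for i, f in enumerate(x):
                grad[i] += err * f
        lam = [l + lr * g / len(data) for l, g in zip(lam, grad)]
    return lam

# Hypothetical toy data: (predicate vector [anchor_face, sig_pause], boundary b)
data = ([([1, 0], 1)] * 8 + [([1, 0], 0)] * 2 +
        [([0, 1], 1)] * 3 + [([0, 1], 0)] * 7)
lam = train(data, n_feat=2)
# q(b=1 | face fires) approaches the empirical boundary rate 8/10
```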

9 Optimal Features (from CNN news video)

The first 10 A+V features automatically discovered for the CNN channel:

No. | Raw feature set | Interpretation
1 | Anchor face | An anchor face segment just starts after the candidate point.
2 | Significant pause & non-commercial | A significant pause within the non-commercial section appears in the surrounding observation window.
3 | Pause | An audio pause with duration larger than ... second appears after the boundary point.
4 | Significant pause | The surrounding observation window has a significant pause with pitch jump intensity larger than the normalized pitch threshold, and pause duration larger than 0.5 second.
5 | Speech segment | A speech segment before the candidate point.
6 | Speech segment | A speech segment starts in the surrounding observation window.
7 | Commercial | A commercial starts 5 to ... seconds after the candidate point.
8 | Speech segment | A speech segment ends after the candidate point.
9 | Anchor face | An anchor face segment occupies at least ...% of the next window.
10 | Pause | The surrounding observation window has a pause with duration larger than 0.5 second.

Every modality helps: anchor face, prosody, and speech segment.

Issues of this model (discussion)
- Features: are binary predicates (if x > threshold, then predict segment boundary, b = 1) reasonable? Do they capture the unique characteristics?
- Models: are exponential models with linear weights, $q_\lambda(b \mid x) = \frac{1}{Z_\lambda(x)} e^{\sum_i \lambda_i f_i(x, b)}$, adequate?
- How about the learning algorithm? Enough data to learn the probability models?
- Speed and complexity

10 A Broader Perspective: Classification Paradigms

Generative: model the class likelihoods and compare them, e.g. is $P(x \mid C = 1) > P(x \mid C = 2)$? [Figure: two class-conditional likelihood curves over feature x, with a decision threshold]

Discriminative: learn a decision boundary directly; predict class 1 if $f(x) > 0$ and class 2 if $f(x) < 0$, where $f(x)$ is a discriminant function. [Figure: decision boundary in the (x1, x2) feature plane]

Which one does the previous model fall into?

Generative Models: Gaussian Mixture Model

11 One common issue is to learn probability models

Gaussian distribution
$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$
- Given the same mean and variance, the Gaussian has the maximum entropy
- The sum of a large number of small, independent random variables approaches a Gaussian
- $\Pr[\,|x - \mu| \le \sigma\,] \approx 0.68$, $\Pr[\,|x - \mu| \le 2\sigma\,] \approx 0.95$, $\Pr[\,|x - \mu| \le 3\sigma\,] \approx 0.997$
- Mahalanobis distance from $x$ to $\mu$: $r = |x - \mu| / \sigma$
- Entropy of a Gaussian: $H_{gau} = \tfrac{1}{2} + \tfrac{1}{2}\log(2\pi\sigma^2)$ (compare with the uniform distribution)

Multivariate Gaussian $N(\mu, \Sigma)$:
$p(x) = \frac{1}{(2\pi)^{D/2} |\Sigma|^{1/2}} \exp\Big(-\tfrac{1}{2}(x - \mu)^T \Sigma^{-1} (x - \mu)\Big)$
where $x$ and $\mu$ are D-dimensional vectors, $\Sigma$ is a $D \times D$ matrix, and $|\Sigma|$ is the determinant of $\Sigma$. In general,
$\Sigma(i, j) = \mathrm{cov}(x(i), x(j)) = E[(x(i) - \mu(i))(x(j) - \mu(j))]$.
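The density formula and the 68/95/99.7 probabilities above are easy to check empirically. A small sketch (the mean, variance, and sample size are arbitrary choices):

```python
import math
import random

def gaussian_pdf(x, mu, sigma):
    """1-D Gaussian density p(x) = exp(-(x-mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

random.seed(0)
mu, sigma = 5.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]

# Fraction of samples within 1, 2, and 3 standard deviations of the mean
within = [sum(abs(x - mu) <= k * sigma for x in xs) / len(xs) for k in (1, 2, 3)]
# within is approximately [0.68, 0.95, 0.997]
```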

12 Effect of Linear Transformation

Linear transformation of a Gaussian: if $y = A^T x$ with $x \sim N(\mu, \Sigma)$, then $y \sim N(A^T \mu,\; A^T \Sigma A)$, where $y$ is $k \times 1$, $A$ is $d \times k$, and $x$ is $d \times 1$.

Whitening transform
- $\Sigma = \Phi \Lambda \Phi^T$ (eigendecomposition; $\Phi = [\phi_1\, \phi_2 \ldots \phi_d]$, columns are orthonormal eigenvectors; $\Lambda$: diagonal matrix of eigenvalues)
- $A_w = \Phi \Lambda^{-1/2}$, so that $A_w^T \Sigma A_w = A_w^T \Phi \Lambda \Phi^T A_w = I$
- The whitening transform is also a PCA transform: $y = A_w^T x \sim N(A_w^T \mu,\; I)$

Mahalanobis distance
- In 1-D: $r = |x - \mu| / \sigma$, and $\Pr[r \le 1] \approx 0.68$, $\Pr[r \le 2] \approx 0.95$, $\Pr[r \le 3] \approx 0.997$
- In the multi-dimensional case:
  $p(x) = \frac{1}{(2\pi)^{D/2} |\Sigma|^{1/2}} \exp\big(-\tfrac{1}{2}(x - \mu)^T \Sigma^{-1} (x - \mu)\big)
        = \frac{1}{(2\pi)^{D/2} |\Sigma|^{1/2}} \exp\big(-\tfrac{1}{2}(x - \mu)^T \Phi \Lambda^{-1} \Phi^T (x - \mu)\big)
        = \frac{1}{(2\pi)^{D/2} |\Sigma|^{1/2}} \exp\big(-\tfrac{1}{2}\big(A_w^T (x - \mu)\big)^T \big(A_w^T (x - \mu)\big)\big)$
- $r^2 = (x - \mu)^T \Sigma^{-1} (x - \mu)$ is the squared Mahalanobis distance; it is also the squared Euclidean distance in the PCA (whitened) space
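The identity above (Mahalanobis distance = Euclidean distance after whitening) can be checked in the simplest setting: a diagonal covariance, where $\Phi = I$ and $A_w = \Lambda^{-1/2}$ is just per-axis rescaling. A sketch with made-up numbers:

```python
import math

def mahalanobis_diag(x, mu, var):
    """Mahalanobis distance for diagonal Sigma: r^2 = sum_i (x_i - mu_i)^2 / var_i."""
    return math.sqrt(sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mu, var)))

def whiten_diag(x, mu, var):
    """Whitening for diagonal Sigma: A_w = Lambda^{-1/2}, y = A_w^T (x - mu)."""
    return [(xi - mi) / math.sqrt(vi) for xi, mi, vi in zip(x, mu, var)]

x, mu, var = [3.0, 4.0], [1.0, 1.0], [4.0, 9.0]
r = mahalanobis_diag(x, mu, var)
y = whiten_diag(x, mu, var)
euclid = math.sqrt(sum(v * v for v in y))
# r equals euclid: the Mahalanobis distance is the Euclidean distance
# in the whitened space
```

Here $r^2 = 2^2/4 + 3^2/9 = 2$, so $r = \sqrt{2}$ either way.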

13 [Figure: Mahalanobis distance from a point x to the mean μ of a Gaussian]

Gaussian Used in Classification
- MAP classifier: $x \in C_i$ if $p(C_i \mid x) \ge p(C_j \mid x)$ for all $j \ne i$
- $p(C_j \mid x) = \dfrac{p(x \mid C_j)\, p(C_j)}{p(x)}$ (posterior = likelihood × prior / evidence)
- ML classification: if the prior is uniform, $C^* = \arg\max_C p(x \mid C)$
- $p(x \mid C_j)$ can be modeled by Gaussians
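The MAP rule above is a one-liner once the class-conditional Gaussians are fixed; the evidence $p(x)$ cancels in the argmax. A minimal sketch with hypothetical class parameters:

```python
import math

def gauss(x, mu, sigma):
    """1-D Gaussian likelihood p(x | C)."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def map_classify(x, params, priors):
    """MAP: argmax_j p(x | C_j) p(C_j); the evidence p(x) cancels.
    With uniform priors this reduces to the ML classifier."""
    scores = [gauss(x, mu, s) * pr for (mu, s), pr in zip(params, priors)]
    return max(range(len(scores)), key=lambda j: scores[j])

# Hypothetical class-conditional Gaussians and equal priors
params = [(0.0, 1.0), (4.0, 1.0)]
priors = [0.5, 0.5]
map_classify(1.0, params, priors)  # -> 0 (closer to the class-0 mean)
map_classify(3.0, params, priors)  # -> 1
```

With equal priors and variances, the decision boundary sits midway between the means (here at x = 2).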

14 How to Estimate Gaussian Model Parameters?

Log-likelihood for $\theta = (\mu, \sigma^2)$:
$l = \sum_k \ln P(x_k \mid \theta)$, with $\ln P(x_k \mid \theta) = -\tfrac{1}{2}\ln(2\pi\sigma^2) - \frac{(x_k - \mu)^2}{2\sigma^2}$.
Setting $\nabla_\theta l = 0$ gives the ML estimates
$\hat\mu = \frac{1}{n}\sum_k x_k$, $\quad \hat\sigma^2 = \frac{1}{n}\sum_k (x_k - \hat\mu)^2$.
Multi-dimensional case, $\theta = (\mu, \Sigma)$:
$\hat\mu = \frac{1}{n}\sum_k x_k$, $\quad \hat\Sigma = \frac{1}{n}\sum_k (x_k - \hat\mu)(x_k - \hat\mu)^T$.
ML estimator: mean → sample mean; variance → biased sample variance.

Mixture of Gaussians
Real distributions seldom follow a single Gaussian → mixture of Gaussians:
$p(x) = \sum_z p(x, z) = \sum_z p(z)\, p(x \mid z) = \sum_z \pi_z N(x \mid \mu_z, \Sigma_z)$
Given data $x_1, \ldots, x_N$, define the log-likelihood
$l = \sum_n \log p(x_n) = \sum_n \log\big( \pi_1 N(x_n \mid \mu_1, \Sigma_1) + \pi_2 N(x_n \mid \mu_2, \Sigma_2) \big)$.
Responsibility of component $z$: the posterior probability of $x_n$ being generated by a specific component,
$\tau_n = p(z = 1 \mid x_n, \theta)$, $\theta = \{\mu_1, \Sigma_1, \mu_2, \Sigma_2\}$.
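The 1-D ML estimates above (sample mean and the biased, divide-by-n variance) take a few lines; the data below are made up for illustration.

```python
def ml_gaussian(xs):
    """ML estimates for a 1-D Gaussian: sample mean and the
    biased (divide-by-n, not n-1) sample variance."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

mu, var = ml_gaussian([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
# mu = 5.0, var = 4.0
```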

15 EM Optimization Method

Maximization of the log-likelihood $l(\theta) = \sum_{n=1}^N \log \sum_z p(x_n, z \mid \theta)$ directly is hard due to the log-of-sum. Instead, look at $\Delta l(\theta) = l(\theta) - l(\theta_t)$, where $\theta_t$ is the current estimate of $\theta$.

Jensen's inequality
- If $f$ is concave (e.g., $f = \log$): $f(E\{x\}) \ge E(f\{x\})$, and more generally $f(E\{g(x)\}) \ge E(f\{g(x)\})$
- If $f$ is convex: $f(E\{x\}) \le E(f\{x\})$

Auxiliary function in EM
$\Delta l(\theta) = l(\theta) - l(\theta_t) = \sum_n \log p(x_n \mid \theta) - \sum_n \log p(x_n \mid \theta_t)$
$= \sum_n \log \frac{\sum_z p(x_n, z \mid \theta)}{p(x_n \mid \theta_t)}$  (marginalization)
$= \sum_n \log \sum_z p(z \mid x_n, \theta_t)\, \frac{p(x_n, z \mid \theta)}{p(z \mid x_n, \theta_t)\, p(x_n \mid \theta_t)}$
$\ge \sum_n \sum_z p(z \mid x_n, \theta_t) \log \frac{p(x_n, z \mid \theta)}{p(x_n, z \mid \theta_t)}$  (Jensen's inequality, using $p(z \mid x_n, \theta_t)\, p(x_n \mid \theta_t) = p(x_n, z \mid \theta_t)$)

Note there is no log-of-sum in this lower bound, so taking derivatives is easier.

16 EM Improves Likelihood

The auxiliary function derived from Jensen's inequality is
$Q(\theta \mid \theta_t) = \sum_{n=1}^N \sum_z p(z \mid x_n, \theta_t) \log p(x_n, z \mid \theta) + \text{const}$,
an expectation over $z$ with the current $\theta_t$ of the joint likelihood of the observed and hidden variables. Now estimate $\theta_{t+1}$ by maximizing $Q$: $\theta_{t+1} = \arg\max_\theta Q(\theta \mid \theta_t)$.
- In the expectation step, compute $\tau_n$, the responsibility of component $z$ for sample $x_n$
- In the maximization step, take the derivative of $Q$ over $\theta$ and find the new estimate of $\theta$ (note only a sum-of-log is involved)

Why does EM always improve $l(\theta)$?
$\Delta l(\theta_{t+1}) = l(\theta_{t+1}) - l(\theta_t) \ge Q(\theta_{t+1} \mid \theta_t) - Q(\theta_t \mid \theta_t) = \max_\theta Q(\theta \mid \theta_t) - Q(\theta_t \mid \theta_t) \ge 0$,
so $l(\theta_{t+1}) \ge l(\theta_t)$.

General steps of EM:
- Define the likelihood model with parameters $\theta$
- Identify the hidden variables $z$
- Derive the auxiliary function and the E and M equations
- In each iteration, estimate the posteriors of the hidden variables, then re-estimate the model parameters; repeat until a stopping condition is met

17 Expectation-Maximization (EM) Solution of GMM

EM for estimating $\theta$ and $\tau$. Follow the divide-and-conquer principle: softly divide the data into groups, then compute the mean and variance of each group. In iteration $t$:

Expectation (weight of component $i$ for sample $x_n$):
$\tau_n^{(t)}(i) = \dfrac{\pi_i^{(t)}\, N(x_n \mid \mu_i^{(t)}, \Sigma_i^{(t)})}{\sum_j \pi_j^{(t)}\, N(x_n \mid \mu_j^{(t)}, \Sigma_j^{(t)})}$

Maximization:
$\mu_i^{(t+1)} = \dfrac{\sum_n \tau_n^{(t)}(i)\, x_n}{\sum_n \tau_n^{(t)}(i)}$, $\quad
\Sigma_i^{(t+1)} = \dfrac{\sum_n \tau_n^{(t)}(i)\, (x_n - \mu_i^{(t+1)})(x_n - \mu_i^{(t+1)})^T}{\sum_n \tau_n^{(t)}(i)}$, $\quad
\pi_i^{(t+1)} = \dfrac{1}{N}\sum_n \tau_n^{(t)}(i)$

Discriminative Models
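The E and M updates above can be written out directly for the simplest case: a 1-D, two-component mixture. This is a sketch, not production code; the synthetic data, the min/max initialization, and the fixed iteration count are all arbitrary choices.

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(xs, iters=100):
    """EM for a two-component 1-D GMM (means, variances, mixing weights)."""
    mu = [min(xs), max(xs)]          # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities tau[n][i] of each component for each sample
        tau = []
        for x in xs:
            w = [pi[i] * normal_pdf(x, mu[i], var[i]) for i in (0, 1)]
            s = sum(w)
            tau.append([wi / s for wi in w])
        # M-step: responsibility-weighted means, variances, and mixing weights
        for i in (0, 1):
            ni = sum(t[i] for t in tau)
            mu[i] = sum(t[i] * x for t, x in zip(tau, xs)) / ni
            var[i] = sum(t[i] * (x - mu[i]) ** 2 for t, x in zip(tau, xs)) / ni
            pi[i] = ni / len(xs)
    return mu, var, pi

random.seed(1)
xs = ([random.gauss(0.0, 1.0) for _ in range(300)] +
      [random.gauss(6.0, 1.0) for _ in range(300)])
mu, var, pi = em_gmm_1d(xs)
# mu recovers roughly (0, 6) and pi roughly (0.5, 0.5)
```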

18 Simple Discriminative Classifier

Decision tree: find the most opportunistic dimension at each step.
- Selection criterion: entropy or variance before/after the split
- Stop criterion: avoid overfitting
[Figure: a two-level tree, x(1) > TH1? then x(2) > TH2?, partitioning the (x(1), x(2)) plane into class regions]

Parametric Discriminant Analysis
Examples of discriminant functions, with the decision rule $g(x) > 0$ vs. $g(x) < 0$:
- Linear: $g(x) = a x_1 + b x_2$ (linear decision boundary)
- Quadratic: $g(x) = a x_1^2 + b x_2^2 + c x_1 x_2$ (quadratic decision boundary)

19 Linear Discriminant Classifiers

$g(x) = w^T x + w_0$. Using the augmented vectors
$y = [1, x_1, \ldots, x_d]^T$ and $a = [w_0, w_1, \ldots, w_d]^T$ (weights and bias $w_0$),
$g(x) = g(y) = a^T y$. Map $y$ to class $\omega_1$ if $g(y) > 0$, otherwise to class $\omega_2$.

Design objective: find a weight vector $a$ such that $a^T y_i > b$ if $y_i$ is in class $\omega_1$, and $a^T y_i < -b$ if $y_i$ is in class $\omega_2$, for all $i$ and some margin $b > 0$. Each $y_i$ defines a half-plane in the weight space (the a-space); note we search for weight solutions in the a-space.

Support Vector Machine (tutorial by Burges, '98)
Look for the separation plane with the highest margin.
- Decision boundary $H$: $w^T x + b = 0$
- Linearly separable case: $w^T x_i + b \ge +1$ for $x_i$ in class $\omega_1$ (i.e., $y_i = +1$); $w^T x_i + b \le -1$ for $x_i$ in class $\omega_2$ (i.e., $y_i = -1$)
- Inequality constraints (label × score): $y_i (w^T x_i + b) \ge 1$ for all $i$
- Two parallel hyperplanes define the margin: $H_1$: $w^T x + b = +1$ and $H_2$: $w^T x + b = -1$
- Margin (sum of distances of the closest points on each side to the separation plane): margin $= 2 / \|w\|$
- The best plane is defined by $w$ and $b$

20 Finding the Maximal Margin

Minimize $\tfrac{1}{2}\|w\|^2$ subject to the inequality constraints $y_i (w^T x_i + b) \ge 1$, $i = 1, \ldots, l$ ($y_i$ is the label). Use the Lagrange multiplier technique for this constrained optimization problem (quadratic programming).

Primal problem: minimize w.r.t. $w$ and $b$
$L_p = \tfrac{1}{2}\|w\|^2 - \sum_{i=1}^l \alpha_i \big[ y_i (w^T x_i + b) - 1 \big]$, $\quad \alpha_i \ge 0$
$\frac{\partial L_p}{\partial w} = 0 \Rightarrow w = \sum_i \alpha_i y_i x_i$; $\quad \frac{\partial L_p}{\partial b} = 0 \Rightarrow \sum_i \alpha_i y_i = 0$

Dual problem: maximize w.r.t. $\alpha$
$L_D = \sum_{i=1}^l \alpha_i - \tfrac{1}{2} \sum_{i=1}^l \sum_{j=1}^l \alpha_i \alpha_j y_i y_j\, x_i^T x_j$, with conditions $\alpha_i \ge 0$ and $\sum_i \alpha_i y_i = 0$.

The primal and dual have the same solutions of $w$ and $b$. KKT conditions: $\alpha_i \big[ y_i (w^T x_i + b) - 1 \big] = 0$.
- $w^* = \sum_i \alpha_i y_i x_i$; the weight sum from the positive class equals the weight sum from the negative class
- Direction of $w$: roughly from the negative support vectors to the positive ones
- If $\alpha_i > 0$, $x_i$ is on $H_+$ or $H_-$ and is a support vector
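A quick numerical check of the pieces above: given a hand-picked $(w, b)$ and hypothetical toy data (this is not a QP solver), verify the separable-case constraints and the margin formula $2/\|w\|$.

```python
import math

def margin(w):
    """Geometric margin between the hyperplanes w.x + b = +1 and w.x + b = -1."""
    return 2.0 / math.sqrt(sum(wi * wi for wi in w))

def satisfies_constraints(data, w, b):
    """Check the separable-case constraints y_i (w.x_i + b) >= 1 for all i."""
    return all(y * (sum(wi * xi for wi, xi in zip(w, x)) + b) >= 1 - 1e-9
               for x, y in data)

# Hypothetical 2-D toy data, separable along the first axis; candidate w = (1, 0), b = 0
data = [([2.0, 0.5], +1), ([3.0, -1.0], +1), ([-1.0, 0.3], -1), ([-2.5, 2.0], -1)]
satisfies_constraints(data, [1.0, 0.0], 0.0)  # True: all constraints hold
margin([1.0, 0.0])                            # 2.0, since ||w|| = 1
```

Here the closest points on each side sit exactly on $H_+$ and $H_-$, so this $(w, b)$ attains the margin $2/\|w\| = 2$.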

21 Non-separable: not every sample can be correctly classified

Add slack variables $\xi_i \ge 0$: $y_i (w^T x_i + b) \ge 1 - \xi_i$. If $\xi_i > 1$, then $x_i$ is misclassified (i.e., a training error).

New objective function (constrained optimization via Lagrange multipliers, with constraints ensuring positivity of the $\xi_i$): minimize
$\tfrac{1}{2}\|w\|^2 + C \sum_i \xi_i$.

KKT conditions for the non-separable solutions
- If $0 < \alpha_i < C$, then $\xi_i = 0$: $x_i$ is on $H_1$ or $H_2$
- If $\alpha_i = C$, then either $\xi_i > 0$ ($x_i$ is inside the margin or on the wrong side) or $\xi_i = 0$ ($x_i$ is on $H_1$ or $H_2$)

22 All the points located in the margin gap or on the wrong side get $\alpha_i = C$

What if $C$ increases? [Figure: $b$ and the $\xi_i$ both change; some points with $0 < \alpha_i < C$ become $\alpha_i = C$ after $C$ increases]
When $C$ increases, incorrect samples get more weight, so the optimizer tries harder to reduce their errors: better training accuracy, but a smaller margin, hence less generalization performance.

Generalized Linear Discriminant Functions
Include more than just the linear terms:
$g(x) = w_0 + \sum_{i=1}^d w_i x_i + \sum_{i=1}^d \sum_{j=1}^d w_{ij}\, x_i x_j = w_0 + w^T x + x^T W x$
In general, $g(x) = \sum_i a_i \phi_i(x) = a^T \phi$. Examples:
$g(x) = a_1 + a_2 x + a_3 x^2 = [a_1\ a_2\ a_3]\,[1\ \ x\ \ x^2]^T$
$g(x) = a_1 x_1^2 + a_2 x_2^2 + a_3 x_1 x_2 = [a_1\ a_2\ a_3]\,[x_1^2\ \ x_2^2\ \ x_1 x_2]^T$
Shape of the decision boundary: ellipsoid, hyper-hyperboloid, lines, etc.
Data become separable in the higher-dimensional space, but learning parameters in high dimension is hard (curse of dimensionality); instead, try to maximize margins → SVM.

23 Non-Linear Space

Map to a high-dimensional space $\Phi$ to make the data separable, then find the SVM in the high-dimensional space:
$g(x) = \sum_{i=1}^{N_s} \alpha_i y_i\, \Phi(s_i)^T \Phi(x) + b$, where the $s_i$ are the support vectors.
Luckily, we don't have to find $\Phi(s_i)$ nor $\sum_i \alpha_i y_i \Phi(s_i)$ explicitly. We can use the same method to maximize $L_D$:
$L_D = \sum_{i=1}^l \alpha_i - \tfrac{1}{2}\sum_{i=1}^l \sum_{j=1}^l \alpha_i \alpha_j y_i y_j\, \Phi(x_i)^T \Phi(x_j) = \sum_{i=1}^l \alpha_i - \tfrac{1}{2}\sum_{i=1}^l \sum_{j=1}^l \alpha_i \alpha_j y_i y_j\, K(x_i, x_j)$
Instead, we define the kernel $K(s, x) = \Phi(s)^T \Phi(x)$, so that
$g(x) = \sum_{i=1}^{N_s} \alpha_i y_i\, K(s_i, x) + b$.
Some popular kernels: polynomial, Gaussian radial basis function (RBF), sigmoidal neural network.
[Figure: a cubic-polynomial boundary separating data that are not linearly separable]
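The three popular kernels named above are one-liners; the hyperparameter values (degree, c, gamma, kappa, theta) below are arbitrary illustrations.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def poly_kernel(u, v, degree=2, c=1.0):
    """Polynomial kernel: K(u, v) = (u.v + c)^degree."""
    return (dot(u, v) + c) ** degree

def rbf_kernel(u, v, gamma=0.5):
    """Gaussian RBF kernel: K(u, v) = exp(-gamma * ||u - v||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def tanh_kernel(u, v, kappa=1.0, theta=0.0):
    """Sigmoidal ('neural network') kernel: K(u, v) = tanh(kappa * u.v + theta)."""
    return math.tanh(kappa * dot(u, v) + theta)

u, v = [1.0, 2.0], [2.0, 0.0]
poly_kernel(u, v)   # (2 + 1)^2 = 9.0
rbf_kernel(u, u)    # 1.0: an RBF kernel of a point with itself is always 1
```

Each implicitly computes an inner product $\Phi(u)^T\Phi(v)$ in some feature space without ever forming $\Phi$.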

24 See the SVM demos (Eric)

Evaluation
Let $V_n = 1$ if item $n$ is "relevant" and $0$ if "irrelevant", $n = 1, \ldots, N$; the test returns the top $K$ detected results.

              | Relevant (ground truth) | Irrelevant
Returned      | A                       | B
Not returned  | C                       | D

- Detections (hits): $A = \sum_{n \le K} V_n$;  Recall $R = A / (A + C)$
- False alarms: $B = \sum_{n \le K} (1 - V_n)$;  Precision $P = A / (A + B)$
- Misses: $C = \sum_{n > K} V_n$
- Correct dismissals: $D = \sum_{n > K} (1 - V_n)$;  Fallout $F = B / (B + D)$
- $F_1 = 2PR / (P + R)$
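The contingency counts and measures above map directly onto set operations; the item IDs below are made up for illustration.

```python
def pr_f1(returned, relevant):
    """Contingency counts from the slide: A hits, B false alarms, C misses."""
    ret, rel = set(returned), set(relevant)
    A = len(ret & rel)          # returned and relevant
    B = len(ret - rel)          # returned but irrelevant (false alarms)
    C = len(rel - ret)          # relevant but not returned (misses)
    P = A / (A + B) if A + B else 0.0
    R = A / (A + C) if A + C else 0.0
    F1 = 2 * P * R / (P + R) if P + R else 0.0
    return P, R, F1

P, R, F1 = pr_f1(returned=[1, 2, 3, 4], relevant=[2, 4, 5, 6, 7])
# A = 2, B = 2, C = 3  ->  P = 0.5, R = 0.4, F1 = 4/9
```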

25 Evaluation Measures

- Precision-recall curve: precision $P$ vs. recall $R$
- Receiver operating characteristic (ROC) curve: detection rate $A$ vs. false-alarm rate $B$

Evaluation Metric: Average Precision
Ranked list of data in response to a query (e.g., D5, D8, D63, ...), with the ground-truth relevant items at ranks 1, 3, and 4, giving precision values 1/1, 1/2, 2/3, 3/4, 3/5, 3/6, 3/7 at ranks 1 through 7.
$AP = \frac{1}{R} \sum_j \frac{R_j}{j}\, I_j$,
where $I_j = 1$ if the item at rank $j$ is relevant, $R_j$ is the number of relevant items in the top $j$, and $R$ is the total number of relevant data. AP measures the average of the precision values at the $R$ relevant data points.
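The AP formula above can be sketched directly over a list of 1/0 relevance flags down the ranking:

```python
def average_precision(ranked_relevance, R=None):
    """AP = (1/R) * sum over relevant ranks j of precision@j.
    ranked_relevance: 1/0 relevance flags down the ranked list;
    R defaults to the number of relevant items observed in the list."""
    if R is None:
        R = sum(ranked_relevance)
    hits, total = 0, 0.0
    for j, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            total += hits / j   # precision at a relevant rank
    return total / R if R else 0.0

# Relevant items at ranks 1, 3, 4, as in the slide's example:
average_precision([1, 0, 1, 1])  # (1/1 + 2/3 + 3/4) / 3
```

Note how ranking all relevant items first gives AP = 1, while interleaving them with irrelevant ones drives AP down, which is the point of the next slide.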

26 Evaluation Metric: Average Precision (2)

AP depends on the rankings of the relevant data and the size of the relevant data set. E.g.:
- Case I: all relevant items ranked at the top → precisions 1/1, 2/2, ... → AP = 1
- Case II: relevant items interleaved with irrelevant ones → precision 1/2 at each relevant rank → AP = 1/2
- Case III: relevant items ranked low → AP ≈ 0.3

Example: SVM for News Story Segmentation
[Figure: precision-recall curves for the SVM-based, Maximum Entropy, and BST approaches]
- SVM with 95 binary features performs the best
- SVM has excellent feature fusion capability
- Predicate binarization shields noise in the features

27 Training / Validation / Testing

- Training: use this set to optimize parameters
- Validation: select optimal models through validation
- Testing: evaluate performance over the test data
Appropriate if the same distribution is followed over the different sets.

Training / Validation / Testing (cont.)
Cross validation, leave-one-out: partition the data into K parts, rotate the choice of the test set, and average the performance over the runs.
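The rotation described above is easy to sketch: build K folds, hold each out once as the test set, and train on the rest (the interleaved fold assignment below is one arbitrary choice; shuffling first is also common).

```python
def k_fold_splits(n, k):
    """Rotate the choice of the test fold; yield (train_idx, test_idx) pairs.
    Fold i takes every k-th index starting at i."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test_idx = folds[i]
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train_idx, test_idx

splits = list(k_fold_splits(10, 5))
# 5 rotations; every sample appears in exactly one test fold
```

Averaging the evaluation metric over the K runs gives the cross-validated performance estimate.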

28 Curse of Dimensionality and Overtraining

[Figure: a case of overfitting — an overly complex boundary in the (x(1), x(2)) plane; test performance diverges from training performance as model complexity grows]
Very rough rule of thumb: (# of training samples per class) / (feature dimension) > 10.

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 4

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 4 CS434a/54a: Paern Recognon Prof. Olga Veksler Lecure 4 Oulne Normal Random Varable Properes Dscrmnan funcons Why Normal Random Varables? Analycally racable Works well when observaon comes form a corruped

More information

An introduction to Support Vector Machine

An introduction to Support Vector Machine An nroducon o Suppor Vecor Machne 報告者 : 黃立德 References: Smon Haykn, "Neural Neworks: a comprehensve foundaon, second edon, 999, Chaper 2,6 Nello Chrsann, John Shawe-Tayer, An Inroducon o Suppor Vecor Machnes,

More information

Clustering (Bishop ch 9)

Clustering (Bishop ch 9) Cluserng (Bshop ch 9) Reference: Daa Mnng by Margare Dunham (a slde source) 1 Cluserng Cluserng s unsupervsed learnng, here are no class labels Wan o fnd groups of smlar nsances Ofen use a dsance measure

More information

CHAPTER 10: LINEAR DISCRIMINATION

CHAPTER 10: LINEAR DISCRIMINATION CHAPER : LINEAR DISCRIMINAION Dscrmnan-based Classfcaon 3 In classfcaon h K classes (C,C,, C k ) We defned dscrmnan funcon g j (), j=,,,k hen gven an es eample, e chose (predced) s class label as C f g

More information

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance

Bayes rule for a classification problem INF Discriminant functions for the normal density. Euclidean distance. Mahalanobis distance INF 43 3.. Repeon Anne Solberg (anne@f.uo.no Bayes rule for a classfcaon problem Suppose we have J, =,...J classes. s he class label for a pxel, and x s he observed feaure vecor. We can use Bayes rule

More information

Machine Learning 2nd Edition

Machine Learning 2nd Edition INTRODUCTION TO Lecure Sldes for Machne Learnng nd Edon ETHEM ALPAYDIN, modfed by Leonardo Bobadlla and some pars from hp://www.cs.au.ac.l/~aparzn/machnelearnng/ The MIT Press, 00 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/mle

More information

Advanced Machine Learning & Perception

Advanced Machine Learning & Perception Advanced Machne Learnng & Percepon Insrucor: Tony Jebara SVM Feaure & Kernel Selecon SVM Eensons Feaure Selecon (Flerng and Wrappng) SVM Feaure Selecon SVM Kernel Selecon SVM Eensons Classfcaon Feaure/Kernel

More information

( ) [ ] MAP Decision Rule

( ) [ ] MAP Decision Rule Announcemens Bayes Decson Theory wh Normal Dsrbuons HW0 due oday HW o be assgned soon Proec descrpon posed Bomercs CSE 90 Lecure 4 CSE90, Sprng 04 CSE90, Sprng 04 Key Probables 4 ω class label X feaure

More information

Lecture 11 SVM cont

Lecture 11 SVM cont Lecure SVM con. 0 008 Wha we have done so far We have esalshed ha we wan o fnd a lnear decson oundary whose margn s he larges We know how o measure he margn of a lnear decson oundary Tha s: he mnmum geomerc

More information

Introduction to Boosting

Introduction to Boosting Inroducon o Boosng Cynha Rudn PACM, Prnceon Unversy Advsors Ingrd Daubeches and Rober Schapre Say you have a daabase of news arcles, +, +, -, -, +, +, -, -, +, +, -, -, +, +, -, + where arcles are labeled

More information

Outline. Probabilistic Model Learning. Probabilistic Model Learning. Probabilistic Model for Time-series Data: Hidden Markov Model

Outline. Probabilistic Model Learning. Probabilistic Model Learning. Probabilistic Model for Time-series Data: Hidden Markov Model Probablsc Model for Tme-seres Daa: Hdden Markov Model Hrosh Mamsuka Bonformacs Cener Kyoo Unversy Oulne Three Problems for probablsc models n machne learnng. Compung lkelhood 2. Learnng 3. Parsng (predcon

More information

Machine Learning Linear Regression

Machine Learning Linear Regression Machne Learnng Lnear Regresson Lesson 3 Lnear Regresson Bascs of Regresson Leas Squares esmaon Polynomal Regresson Bass funcons Regresson model Regularzed Regresson Sascal Regresson Mamum Lkelhood (ML)

More information

Robust and Accurate Cancer Classification with Gene Expression Profiling

Robust and Accurate Cancer Classification with Gene Expression Profiling Robus and Accurae Cancer Classfcaon wh Gene Expresson Proflng (Compuaonal ysems Bology, 2005) Auhor: Hafeng L, Keshu Zhang, ao Jang Oulne Background LDA (lnear dscrmnan analyss) and small sample sze problem

More information

Normal Random Variable and its discriminant functions

Normal Random Variable and its discriminant functions Noral Rando Varable and s dscrnan funcons Oulne Noral Rando Varable Properes Dscrnan funcons Why Noral Rando Varables? Analycally racable Works well when observaon coes for a corruped snle prooype 3 The

More information

Lecture 6: Learning for Control (Generalised Linear Regression)

Lecture 6: Learning for Control (Generalised Linear Regression) Lecure 6: Learnng for Conrol (Generalsed Lnear Regresson) Conens: Lnear Mehods for Regresson Leas Squares, Gauss Markov heorem Recursve Leas Squares Lecure 6: RLSC - Prof. Sehu Vjayakumar Lnear Regresson

More information

Department of Economics University of Toronto

Department of Economics University of Toronto Deparmen of Economcs Unversy of Torono ECO408F M.A. Economercs Lecure Noes on Heeroskedascy Heeroskedascy o Ths lecure nvolves lookng a modfcaons we need o make o deal wh he regresson model when some of

More information

Robustness Experiments with Two Variance Components

Robustness Experiments with Two Variance Components Naonal Insue of Sandards and Technology (NIST) Informaon Technology Laboraory (ITL) Sascal Engneerng Dvson (SED) Robusness Expermens wh Two Varance Componens by Ana Ivelsse Avlés avles@ns.gov Conference

More information

CHAPTER 2: Supervised Learning

CHAPTER 2: Supervised Learning HATER 2: Supervsed Learnng Learnng a lass from Eamples lass of a famly car redcon: Is car a famly car? Knowledge eracon: Wha do people epec from a famly car? Oupu: osve (+) and negave ( ) eamples Inpu

More information

Lecture VI Regression

Lecture VI Regression Lecure VI Regresson (Lnear Mehods for Regresson) Conens: Lnear Mehods for Regresson Leas Squares, Gauss Markov heorem Recursve Leas Squares Lecure VI: MLSC - Dr. Sehu Vjayakumar Lnear Regresson Model M

More information

Pattern Classification (III) & Pattern Verification

Pattern Classification (III) & Pattern Verification Preare by Prof. Hu Jang CSE638 --4 CSE638 3. Seech & Language Processng o.5 Paern Classfcaon III & Paern Verfcaon Prof. Hu Jang Dearmen of Comuer Scence an Engneerng York Unversy Moel Parameer Esmaon Maxmum

More information

CHAPTER 5: MULTIVARIATE METHODS

CHAPTER 5: MULTIVARIATE METHODS CHAPER 5: MULIVARIAE MEHODS Mulvarae Daa 3 Mulple measuremens (sensors) npus/feaures/arbues: -varae N nsances/observaons/eamples Each row s an eample Each column represens a feaure X a b correspons o he

More information

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms

Introduction ( Week 1-2) Course introduction A brief introduction to molecular biology A brief introduction to sequence comparison Part I: Algorithms Course organzaon Inroducon Wee -2) Course nroducon A bref nroducon o molecular bology A bref nroducon o sequence comparson Par I: Algorhms for Sequence Analyss Wee 3-8) Chaper -3, Models and heores» Probably

More information

Lecture 2 L n i e n a e r a M od o e d l e s

Lecture 2 L n i e n a e r a M od o e d l e s Lecure Lnear Models Las lecure You have learned abou ha s machne learnng Supervsed learnng Unsupervsed learnng Renforcemen learnng You have seen an eample learnng problem and he general process ha one

More information

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering CS 536: Machne Learnng Nonparamerc Densy Esmaon Unsupervsed Learnng - Cluserng Fall 2005 Ahmed Elgammal Dep of Compuer Scence Rugers Unversy CS 536 Densy Esmaon - Cluserng - 1 Oulnes Densy esmaon Nonparamerc

More information

Computing Relevance, Similarity: The Vector Space Model

Computing Relevance, Similarity: The Vector Space Model Compung Relevance, Smlary: The Vecor Space Model Based on Larson and Hears s sldes a UC-Bereley hp://.sms.bereley.edu/courses/s0/f00/ aabase Managemen Sysems, R. Ramarshnan ocumen Vecors v ocumens are

More information

Variants of Pegasos. December 11, 2009

Variants of Pegasos. December 11, 2009 Inroducon Varans of Pegasos SooWoong Ryu bshboy@sanford.edu December, 009 Youngsoo Cho yc344@sanford.edu Developng a new SVM algorhm s ongong research opc. Among many exng SVM algorhms, we wll focus on

More information

January Examinations 2012

January Examinations 2012 Page of 5 EC79 January Examnaons No. of Pages: 5 No. of Quesons: 8 Subjec ECONOMICS (POSTGRADUATE) Tle of Paper EC79 QUANTITATIVE METHODS FOR BUSINESS AND FINANCE Tme Allowed Two Hours ( hours) Insrucons

More information

( t) Outline of program: BGC1: Survival and event history analysis Oslo, March-May Recapitulation. The additive regression model

( t) Outline of program: BGC1: Survival and event history analysis Oslo, March-May Recapitulation. The additive regression model BGC1: Survval and even hsory analyss Oslo, March-May 212 Monday May 7h and Tuesday May 8h The addve regresson model Ørnulf Borgan Deparmen of Mahemacs Unversy of Oslo Oulne of program: Recapulaon Counng

More information

Notes on the stability of dynamic systems and the use of Eigen Values.

Notes on the stability of dynamic systems and the use of Eigen Values. Noes on he sabl of dnamc ssems and he use of Egen Values. Source: Macro II course noes, Dr. Davd Bessler s Tme Seres course noes, zarads (999) Ineremporal Macroeconomcs chaper 4 & Techncal ppend, and Hamlon

More information

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6)

Econ107 Applied Econometrics Topic 5: Specification: Choosing Independent Variables (Studenmund, Chapter 6) Econ7 Appled Economercs Topc 5: Specfcaon: Choosng Independen Varables (Sudenmund, Chaper 6 Specfcaon errors ha we wll deal wh: wrong ndependen varable; wrong funconal form. Ths lecure deals wh wrong ndependen

More information

FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES

FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES FACIAL IMAGE FEATURE EXTRACTION USING SUPPORT VECTOR MACHINES H. Abrsham Moghaddam K. N. Toos Unversy of Technology, P.O. Box 635-355, Tehran, Iran moghadam@saba.knu.ac.r M. Ghayoum Islamc Azad Unversy,

More information

Learning Objectives. Self Organization Map. Hamming Distance(1/5) Introduction. Hamming Distance(3/5) Hamming Distance(2/5) 15/04/2015

Learning Objectives. Self Organization Map. Hamming Distance(1/5) Introduction. Hamming Distance(3/5) Hamming Distance(2/5) 15/04/2015 /4/ Learnng Objecves Self Organzaon Map Learnng whou Exaples. Inroducon. MAXNET 3. Cluserng 4. Feaure Map. Self-organzng Feaure Map 6. Concluson 38 Inroducon. Learnng whou exaples. Daa are npu o he syse

More information

Chapter 6 DETECTION AND ESTIMATION: Model of digital communication system. Fundamental issues in digital communications are

Chapter 6 DETECTION AND ESTIMATION: Model of digital communication system. Fundamental issues in digital communications are Chaper 6 DEECIO AD EIMAIO: Fundamenal ssues n dgal communcaons are. Deecon and. Esmaon Deecon heory: I deals wh he desgn and evaluaon of decson makng processor ha observes he receved sgnal and guesses

More information

CHAPTER 7: CLUSTERING

CHAPTER 7: CLUSTERING CHAPTER 7: CLUSTERING Semparamerc Densy Esmaon 3 Paramerc: Assume a snge mode for p ( C ) (Chapers 4 and 5) Semparamerc: p ( C ) s a mure of denses Mupe possbe epanaons/prooypes: Dfferen handwrng syes,

More information

John Geweke a and Gianni Amisano b a Departments of Economics and Statistics, University of Iowa, USA b European Central Bank, Frankfurt, Germany

John Geweke a and Gianni Amisano b a Departments of Economics and Statistics, University of Iowa, USA b European Central Bank, Frankfurt, Germany Herarchcal Markov Normal Mxure models wh Applcaons o Fnancal Asse Reurns Appendx: Proofs of Theorems and Condonal Poseror Dsrbuons John Geweke a and Gann Amsano b a Deparmens of Economcs and Sascs, Unversy

More information

Linear Classification, SVMs and Nearest Neighbors

Linear Classification, SVMs and Nearest Neighbors 1 CSE 473 Lecture 25 (Chapter 18) Lnear Classfcaton, SVMs and Nearest Neghbors CSE AI faculty + Chrs Bshop, Dan Klen, Stuart Russell, Andrew Moore Motvaton: Face Detecton How do we buld a classfer to dstngush

More information

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method 10 h US Naonal Congress on Compuaonal Mechancs Columbus, Oho 16-19, 2009 Sngle-loop Sysem Relably-Based Desgn & Topology Opmzaon (SRBDO/SRBTO): A Marx-based Sysem Relably (MSR) Mehod Tam Nguyen, Junho

More information

In the complete model, these slopes are ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL. (! i+1 -! i ) + [(!") i+1,q - [(!

In the complete model, these slopes are ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL. (! i+1 -! i ) + [(!) i+1,q - [(! ANALYSIS OF VARIANCE FOR THE COMPLETE TWO-WAY MODEL The frs hng o es n wo-way ANOVA: Is here neracon? "No neracon" means: The man effecs model would f. Ths n urn means: In he neracon plo (wh A on he horzonal

