Hidden Markov Models for Speech Recognition. Bhiksha Raj and Rita Singh


1 Hidden Markov Models for Speech Recognition. Bhiksha Raj and Rita Singh

2 Recap. (Figure: a chain of states with self-transition probabilities T11, T22, T33 and transition probabilities T12, T23, T13.) This structure is a generic representation of a statistical model for processes that generate time series. The segments in the time series are referred to as states. The process passes through these states to generate time series. The entire structure may be viewed as one generalization of the DTW models we have discussed thus far.

3 The HMM Process. The HMM models the process underlying the observations as going through a number of states. For instance, in producing the sound W, it first goes through a state where it produces the sound UH, then goes into a state where it transitions from UH to AH, and finally to a state where it produces AH. (Figure: W decomposed as UH followed by AH.) The true underlying process here is the vocal tract, which roughly goes from the configuration for UH to the configuration for AH.

4 States are abstractions. The states are not directly observed. Here the states of the process are analogous to configurations of the vocal tract that produce the signal. We only hear the speech; we do not see the vocal tract, i.e. the states are hidden. The interpretation of states is not always obvious: the vocal tract actually goes through a continuum of configurations, and the model represents all of these using only a fixed number of states. The model abstracts the process that generates the data: the system goes through a finite number of states; when in any state it can either remain at that state or go to another with some probability; when at any state it generates observations according to a distribution associated with that state.

5 Hidden Markov Models. A Hidden Markov Model consists of two components: a state/transition backbone that specifies how many states there are and how they can follow one another, and a set of probability distributions, one for each state, which specifies the distribution of all vectors in that state. This can be factored into two separate probabilistic entities: a probabilistic Markov chain with states and transitions, and a set of data probability distributions associated with the states.

6 HMMs as statistical models. An HMM is a statistical model for a time-varying process. The process is always in one of a countable number of states at any time. When the process visits any state, it generates an observation by a random draw from a distribution associated with that state. The process constantly moves from state to state, and the probability that the process will move to any state is determined solely by the current state, i.e. the dynamics of the process are Markovian. The entire model represents a probability distribution over the sequence of observations: it has a specific probability of generating any particular sequence, and the probabilities of all possible observation sequences sum to 1.

7 How an HMM models a process. (Figure: an HMM assumed to be generating data, showing the state sequence, the state distributions, and the observation sequence.)

8 HMM Parameters. The topology of the HMM: the number of states and the allowed transitions. E.g. here we have 3 states and cannot go from the blue state to the red one. The transition probabilities, often represented as a matrix as here: Tij is the probability that when in state i the process will move to j. The probability of beginning at a particular state. The state output distributions. (Figure: an example transition matrix T with entries such as 0.6.)

9 HMM state output distributions. The state output distribution represents the distribution of data produced from any state. In the previous lecture we assumed the state output distribution to be Gaussian, albeit largely in a DTW context: P(v) = Gaussian(v; m, C) = (1 / sqrt((2π)^d |C|)) exp(−0.5 (v − m)^T C^{−1} (v − m)). In reality the distribution of vectors for any state need not be Gaussian; in the most general case it can be arbitrarily complex, and the Gaussian is only a coarse representation of this distribution. If we model the output distributions of states better, we can expect the model to be a better representation of the data.

10 Gaussian Mixtures. A Gaussian mixture is literally a mixture of Gaussians: a weighted combination of several Gaussian distributions. P(v) = Σ_{i=0}^{K−1} w_i Gaussian(v; m_i, C_i). Here v is any data vector and P(v) is the probability given to that vector by the Gaussian mixture; K is the number of Gaussians being mixed; w_i is the mixture weight of the i-th Gaussian; m_i is its mean and C_i its covariance. The Gaussian mixture is also a distribution: it is positive everywhere, and the total volume under it is 1.0. Constraint: the mixture weights w_i must all be positive and sum to 1.

11 Generating an observation from a Gaussian mixture state distribution. First draw the identity of the Gaussian from the a priori probability distribution over the Gaussians (the mixture weights). Then draw a vector from the selected Gaussian.
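As a concrete sketch of this two-step draw, the snippet below samples from a hypothetical 1-D two-component mixture; the weights, means, and standard deviations are made-up illustrative values, not from the lecture.

```python
import random

# Hypothetical 1-D mixture: (weight, mean, std) per Gaussian.
components = [(0.3, -2.0, 1.0), (0.7, 4.0, 0.5)]

def sample_from_gmm(rng=random):
    # Step 1: draw the Gaussian's identity from the mixture weights.
    r, acc = rng.random(), 0.0
    for w, mean, std in components:
        acc += w
        if r <= acc:
            # Step 2: draw a value from the selected Gaussian.
            return rng.gauss(mean, std)
    return rng.gauss(components[-1][1], components[-1][2])  # rounding guard
```

Averaged over many draws, samples concentrate around the weighted mean of the components, 0.3·(−2) + 0.7·4 = 2.2.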

12 Gaussian Mixtures. A Gaussian mixture can represent data distributions far better than a simple Gaussian. The two panels show the histogram of an unknown random variable: the first panel shows how it is modeled by a simple Gaussian, and the second models the histogram by a mixture of two Gaussians. Caveat: it is hard to know the optimal number of Gaussians in a mixture distribution for any random variable.

13 HMMs with Gaussian mixture state distributions. The parameters of an HMM with Gaussian mixture state distributions are: π, the set of initial state probabilities for all states; T, the matrix of transition probabilities; and a Gaussian mixture distribution for every state in the HMM. The Gaussian mixture for the i-th state is characterized by: K_i, the number of Gaussians in the mixture; the set of mixture weights w_ij, 0 ≤ j < K_i; the set of Gaussian means m_ij, 0 ≤ j < K_i; and the set of covariance matrices C_ij, 0 ≤ j < K_i.

14 Three Basic HMM Problems. Given an HMM: (1) What is the probability that it will generate a specific observation sequence? (2) Given an observation sequence, how do we determine which observation was generated from which state (the state segmentation problem)? (3) How do we learn the parameters of the HMM from observation sequences?

15 Computing the Probability of an Observation Sequence. Two aspects to producing the observation: progressing through a sequence of states, and producing observations from these states.

16 Progressing through states. (Figure: an HMM assumed to be generating data; state sequence.) The process begins at some state (red, here). From that state it makes an allowed transition, to arrive at the same or any other state. From that state it makes another allowed transition, and so on.

17 Probability that the HMM will follow a particular state sequence: P(s_1, s_2, s_3, ...) = P(s_1) P(s_2|s_1) P(s_3|s_2) ... Here P(s_1) is the probability that the process will initially be in state s_1, and P(s_i|s_{i−1}) is the transition probability of moving to state s_i at the next time instant when the system is currently in s_{i−1} (also denoted by Tij earlier).

18 Generating Observations from States. (Figure: state sequence, state distributions, observation sequence.) At each time the process generates an observation from the state it is in at that time.

19 Probability that the HMM will generate a particular observation sequence, given a state sequence (i.e. the state sequence is known): P(o_1, o_2, o_3, ... | s_1, s_2, s_3, ...) = P(o_1|s_1) P(o_2|s_2) P(o_3|s_3) ... Each term is computed from the Gaussian or Gaussian mixture for the corresponding state: P(o_i|s_i) is the probability of generating observation o_i when the system is in state s_i.

20 Progressing through States and Producing Observations. (Figure: state sequence, state distributions, observation sequence.) At each time the process produces an observation and makes a transition.

21 Probability that the HMM will generate a particular state sequence, and from it a particular observation sequence: P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) = P(o_1, o_2, o_3, ... | s_1, s_2, s_3, ...) P(s_1, s_2, s_3, ...) = P(o_1|s_1) P(o_2|s_2) P(o_3|s_3) ... P(s_1) P(s_2|s_1) P(s_3|s_2) ...

22 Probability of Generating an Observation Sequence. If only the observation is known, the precise state sequence followed to produce it is not known, so all possible state sequences must be considered: P(o_1, o_2, o_3, ...) = Σ over all possible state sequences of P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) = Σ over all possible state sequences of P(o_1|s_1) P(o_2|s_2) P(o_3|s_3) ... P(s_1) P(s_2|s_1) P(s_3|s_2) ...

23 Computing it Efficiently. Explicit summing over all state sequences is not efficient: there is a very large number of possible state sequences, and for long observation sequences it may be intractable. Fortunately, we have an efficient algorithm for this: the forward algorithm. At each time, for each state, compute the total probability of all state sequences that generate the observations until that time and end at that state.

24 Illustrative Example. Consider a generic HMM with 5 states and a terminating state. We wish to find the probability of the best state sequence for an observation sequence, assuming it was generated by this HMM. P(s_i) = 1 for state 1 and 0 for the others. The arrows represent transitions for which the probability is not 0: P(s_j|s_i) = a_ij. We sometimes also represent the state output probability of s_i as P(o_t|s_i) = b_i(t) for brevity.

25 Diversion: The Trellis. (Figure: state index vs. feature vector index (time), with α_s(t) marked.) The trellis is a graphical representation of all possible paths through the HMM to produce a given observation; it is analogous to the DTW search graph/trellis. The Y-axis represents HMM states and the X-axis represents observations. Every edge in the graph represents a valid transition in the HMM over a single time step; every node represents the event of a particular observation being generated from a particular state.

26 The Forward Algorithm. α_s(t) = P(x_1, x_2, ..., x_t, state(t) = s | λ). α_s(t) is the total probability of ALL state sequences that end at state s at time t, and of all observations up to x_t.

27 The Forward Algorithm. α_s(t) = P(x_1, x_2, ..., x_t, state(t) = s | λ) can be recursively estimated, starting from the first time instant (the forward recursion): α_s(t) = Σ_{s'} α_{s'}(t−1) P(s|s') P(x_t|s). That is, α_s(t) is computed in terms of α_{s'}(t−1), the forward probabilities at time t−1.

28 The Forward Algorithm. TotalProb = Σ_s α_s(T). At the final observation, the alpha at each state gives the probability of all state sequences ending at that state; the total probability of the observation is the sum of the alpha values at all states.
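The forward recursion above can be sketched in a few lines. The toy HMM below uses a discrete output alphabet instead of Gaussians to keep the arithmetic visible; all numbers are illustrative, not from the lecture.

```python
# Toy discrete-output HMM: alpha[t][s] = P(x_1..x_t, state(t) = s).
pi = [1.0, 0.0]                    # initial state probabilities P(s)
T  = [[0.6, 0.4], [0.0, 1.0]]      # T[i][j] = P(j | i)
B  = [[0.9, 0.1], [0.2, 0.8]]      # B[s][x] = P(x | s), 2 output symbols

def forward(obs):
    n = len(pi)
    alpha = [[pi[s] * B[s][obs[0]] for s in range(n)]]
    for x in obs[1:]:
        prev = alpha[-1]
        alpha.append([sum(prev[sp] * T[sp][s] for sp in range(n)) * B[s][x]
                      for s in range(n)])
    return alpha

def total_probability(obs):
    # Slide 28: the total probability is the sum of the final alphas.
    return sum(forward(obs)[-1])
```

Summing total_probability over every possible observation sequence of a fixed length gives 1, as slide 6 requires.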

29 Problem 2: The state segmentation problem. Given only a sequence of observations, how do we determine which sequence of states was followed in producing it?

30 The HMM as a generator. (Figure: state sequence, state distributions, observation sequence.) The process goes through a series of states and produces observations from them.

31 States are Hidden. The observations do not reveal the underlying states.

32 The state segmentation problem. State segmentation: estimate the state sequence given the observations.

33 Estimating the State Sequence. Any number of state sequences could have been traversed in producing the observation; in the worst case, every state sequence may have produced it. Solution: identify the most probable state sequence, i.e. the state sequence for which the probability of progressing through that sequence and generating the observation sequence, P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...), is maximum.

34 Estimating the state sequence. Once again, exhaustive evaluation is impossibly expensive, but once again a simple dynamic-programming solution is available. P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) = P(o_1|s_1) P(o_2|s_2) P(o_3|s_3) ... P(s_1) P(s_2|s_1) P(s_3|s_2) ... Needed: argmax over s_1, s_2, s_3, ... of P(o_1|s_1) P(s_1) P(o_2|s_2) P(s_2|s_1) P(o_3|s_3) P(s_3|s_2) ...

36 The state sequence. The probability of a state sequence ?,?,?,?, s_x, s_y ending at time t is simply the probability of ?,?,?,?, s_x multiplied by P(o_t|s_y) P(s_y|s_x). The best state sequence that ends with s_x, s_y at t will therefore have a probability equal to the probability of the best state sequence ending at s_x at t−1, times P(o_t|s_y) P(s_y|s_x), since this last term is independent of the state sequence leading to s_x at t−1.

37 Trellis. The graph below shows the set of all possible state sequences through this HMM in five time instants.

38 The cost of extending a state sequence. The cost of extending a state sequence ending at s_x is only dependent on the transition from s_x to s_y and the observation probability at s_y.

39 The cost of extending a state sequence. The best path to s_y through s_x is simply an extension of the best path to s_x.

40 The Recursion. The overall best path to s_x is an extension of the best path to one of the states at the previous time.

41 The Recursion. BestPathProb(s_y, t) = max over s_? of [BestPathProb(s_?, t−1) · P(s_y|s_?) · P(o_t|s_y)].

42 Finding the best state sequence. This gives us a simple recursive formulation to find the overall best state sequence: 1. The best state sequence X_{1,i} of length 1 ending at state s_i is simply s_i; its probability is C(X_{1,i}) = P(o_1|s_i) P(s_i). 2. The best state sequence of length t+1 ending at s_j is given by (argmax over X_{t,i} of C(X_{t,i}) P(o_{t+1}|s_j) P(s_j|s_i)), extended by s_j. 3. The best overall state sequence for an utterance of length T is given by argmax over X_{T,i} of C(X_{T,i}): the state sequence of length T with the highest overall probability.

43 Finding the best state sequence. The simple algorithm just presented is called the VITERBI algorithm in the literature, after A. J. Viterbi, who invented this dynamic programming algorithm for a completely different purpose: decoding error correction codes! The Viterbi algorithm can also be viewed as a breadth-first graph search algorithm. The HMM forms the Y axis of a 2-D plane, and a linear graph with one node per time step forms the X axis; a trellis is the graph formed as the cross-product of these two graphs. Edge costs of this graph are transition probabilities P(s|s'); node costs are P(o|s). The Viterbi algorithm finds the best path through this graph.
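A minimal sketch of the Viterbi recursion with back-pointers, again over a toy discrete-output HMM with made-up numbers; log-probabilities are used, anticipating the underflow discussion later in the lecture.

```python
import math

pi = [0.7, 0.3]
T  = [[0.8, 0.2], [0.3, 0.7]]      # T[i][j] = P(j | i)
B  = [[0.9, 0.1], [0.2, 0.8]]      # B[s][x] = P(x | s)

def viterbi(obs):
    n = len(pi)
    score = [math.log(pi[s] * B[s][obs[0]]) for s in range(n)]
    back = []
    for x in obs[1:]:
        new, ptr = [], []
        for j in range(n):
            # Best predecessor for state j at this time step.
            best = max(range(n), key=lambda i: score[i] + math.log(T[i][j]))
            ptr.append(best)
            new.append(score[best] + math.log(T[best][j] * B[j][x]))
        score = new
        back.append(ptr)
    # Trace the back-pointers to recover the best state sequence.
    path = [max(range(n), key=lambda s: score[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

With these numbers, a run of symbol 0 decodes to a run of state 0, and a run of symbol 1 to a run of state 1.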

44 Viterbi Search (contd.). The initial state is initialized with path-score = P(s_1) b_1(1). All other states have score 0, since P(s_i) = 0 for them.

45 Viterbi Search (contd.). P_j(t) = max_i [P_i(t−1) a_ij b_j(t)], where a_ij is the state transition probability from i to j, b_j(t) is the score for state j given the input at time t, and P_j(t) is the total path-score ending up at state j at time t. (Figure legend: state with best path-score; state with path-score < best; state without a valid path-score.)

46-51 Viterbi Search (contd.). The recursion P_j(t) = max_i [P_i(t−1) a_ij b_j(t)] is applied one time step at a time; the accompanying figures show the trellis being extended frame by frame.

52 Viterbi Search (contd.). THE BEST STATE SEQUENCE IS THE ESTIMATE OF THE STATE SEQUENCE FOLLOWED IN GENERATING THE OBSERVATION.

53 Viterbi and DTW. The Viterbi algorithm is identical to the string-matching procedure used for DTW that we saw earlier. It computes an estimate of the state sequence followed in producing the observation, and it also gives the probability of the best state sequence.

54 Problem 3: Training HMM parameters. We can compute the probability of an observation, and the best state sequence given an observation, using the HMM's parameters. But where do the HMM parameters come from? They must be learned from a collection of observation sequences. We have already seen one technique for training HMMs: the segmental K-means procedure.

55 Modified segmental K-means, AKA Viterbi training. The entire segmental K-means algorithm: 1. Initialize all parameters: state means and covariances, transition probabilities, initial state probabilities. 2. Segment all training sequences. 3. Re-estimate parameters from the segmented training sequences. 4. If not converged, return to 2.

56 Segmental K-means: initialize, iterate. (Figure: iterations T1, T2, T3, T4.) The procedure can be continued until convergence. Convergence is achieved when the total best-alignment error for all training sequences does not change significantly with further refinement of the model.

57 A Better Technique. The segmental K-means technique uniquely assigns each observation to one state. However, this is only an estimate and may be wrong. A better approach is to take a soft decision: assign each observation to every state with a probability.

58 The probability of a state. The probability assigned to any state s, for any observation x_t, is the probability that the process was at s when it generated x_t. We want to compute P(state(t) = s | x_1, x_2, ..., x_T). We will compute P(state(t) = s, x_1, x_2, ..., x_T) first: this is the probability that the process visited s at time t while producing the entire observation.

59 Probability of Assigning an Observation to a State. The probability that the HMM was in a particular state s when generating the observation sequence is the probability that it followed a state sequence that passed through s at time t.

60 Probability of Assigning an Observation to a State. This can be decomposed into two multiplicative sections: the section of the lattice leading into state s at time t, and the section leading out of it.

61 Probability of Assigning an Observation to a State. The probability of the red section is the total probability of all state sequences ending at state s at time t. This is simply α_s(t), which can be computed using the forward algorithm.

62 The forward algorithm. α_s(t) = P(x_1, x_2, ..., x_t, state(t) = s | λ), recursively estimated starting from the first time instant (the forward recursion): α_s(t) = Σ_{s'} α_{s'}(t−1) P(s|s') P(x_t|s). Here λ represents the complete current set of HMM parameters.

63 The Future Paths. The blue portion represents the probability of all state sequences that begin at state s at time t. Like the red portion, it can be computed using a backward recursion.

64 The Backward Recursion. β_s(t) = P(x_{t+1}, x_{t+2}, ..., x_T | state(t) = s, λ). This can be recursively estimated starting from the final time instant (the backward recursion): β_s(t) = Σ_{s'} β_{s'}(t+1) P(s'|s) P(x_{t+1}|s'). β_s(t) is the total probability of ALL state sequences that depart from s at time t, over all observations after x_t. β_s(T) = 1 at the final time instant for all valid final states.
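The backward recursion mirrors the forward one. A sketch over a toy discrete-output HMM (illustrative numbers); the helper at the end uses the identity that Σ_s π_s P(x_1|s) β_s(1) equals the total observation probability.

```python
# Toy discrete-output HMM; beta[t][s] = P(x_{t+1}..x_T | state(t) = s).
pi = [1.0, 0.0]
T  = [[0.6, 0.4], [0.0, 1.0]]      # T[i][j] = P(j | i)
B  = [[0.9, 0.1], [0.2, 0.8]]      # B[s][x] = P(x | s)

def backward(obs):
    n = len(T)
    beta = [[1.0] * n]                         # beta(T) = 1 for all states
    for x in reversed(obs[1:]):
        nxt = beta[0]
        beta.insert(0, [sum(T[s][sp] * B[sp][x] * nxt[sp] for sp in range(n))
                        for s in range(n)])
    return beta

def total_probability_via_beta(obs):
    # Consistency check: P(X) recovered from the betas at t = 1.
    b1 = backward(obs)[0]
    return sum(pi[s] * B[s][obs[0]] * b1[s] for s in range(len(pi)))
```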

65 The complete probability. α_s(t) β_s(t) = P(x_1, x_2, ..., x_T, state(t) = s | λ) = P(X, state(t) = s | λ).

66 Posterior probability of a state. The probability that the process was in state s at time t, given that we have observed the data, is obtained by simple normalization: P(state(t) = s | X, λ) = P(X, state(t) = s | λ) / Σ_{s'} P(X, state(t) = s' | λ) = α_s(t) β_s(t) / Σ_{s'} α_{s'}(t) β_{s'}(t). This term is often referred to as the gamma term and denoted by γ_s(t).
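Given alpha and beta tables, the gammas are a per-frame normalization. The tables below are hand-computed for a tiny two-state, two-frame model and are purely illustrative.

```python
# alpha[t][s] and beta[t][s] for a 2-state, 2-frame worked example.
alpha = [[0.9, 0.0], [0.054, 0.288]]
beta  = [[0.38, 0.8], [1.0, 1.0]]

def state_posteriors(alpha, beta):
    gammas = []
    for a_t, b_t in zip(alpha, beta):
        joint = [a * b for a, b in zip(a_t, b_t)]   # P(X, state(t) = s)
        z = sum(joint)                              # normalizer = P(X)
        gammas.append([j / z for j in joint])
    return gammas
```

Note that the normalizer z is the same at every frame: it is the total probability of the observation, P(X | λ).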

67 Update Rules. Once we have the state probabilities (the gammas), the update rules are obtained through a simple modification of the formulae used for segmental K-means. This new learning algorithm is known as the Baum-Welch learning procedure. Case 1: state output densities are Gaussians.

68 Update Rules. Segmental K-means: μ_s = (1/N_s) Σ_{t∈s} x_t, C_s = (1/N_s) Σ_{t∈s} (x_t − μ_s)(x_t − μ_s)^T. Baum-Welch: μ_s = Σ_t γ_s(t) x_t / Σ_t γ_s(t), C_s = Σ_t γ_s(t) (x_t − μ_s)(x_t − μ_s)^T / Σ_t γ_s(t). A similar update formula re-estimates the transition probabilities; the initial state probabilities P(s) also have a similar update rule.
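For a single state with a 1-D Gaussian, the Baum-Welch column above amounts to a gamma-weighted mean and variance (toy numbers, illustrative only):

```python
# Gamma-weighted re-estimation for one state, 1-D case.
xs    = [1.0, 2.0, 3.0, 4.0]       # observations x_t
gamma = [0.1, 0.4, 0.4, 0.1]       # gamma_s(t) for this state

def reestimate(xs, gamma):
    z = sum(gamma)
    mu = sum(g * x for g, x in zip(gamma, xs)) / z
    var = sum(g * (x - mu) ** 2 for g, x in zip(gamma, xs)) / z
    return mu, var
```

With hard 0/1 gammas this reduces exactly to the segmental K-means column.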

69 Case 2: State output densities are Gaussian mixtures. When state output densities are Gaussian mixtures, more parameters must be estimated: P(x|s) = Σ_{i=0}^{K−1} w_i Gaussian(x; μ_i, C_i). The mixture weights w_i, means μ_i, and covariances C_i of every Gaussian in the distribution of each state must be estimated.

70 Splitting the Gamma. We split the gamma for any state among all of the Gaussians at that state. For re-estimation of state parameters we need the a posteriori probability that the t-th vector was generated by the k-th Gaussian of state s: γ_{s,k}(t) = P(state(t) = s | X, λ) P(k-th Gaussian | state(t) = s, x_t, λ).

71 Splitting the Gamma among Gaussians. The a posteriori probability that the t-th vector was generated by the k-th Gaussian of state s: γ_{s,k}(t) = P(state(t) = s | X, λ) · [w_k (2π)^{−d/2} |C_k|^{−1/2} exp(−0.5 (x_t − μ_k)^T C_k^{−1} (x_t − μ_k))] / [Σ_j w_j (2π)^{−d/2} |C_j|^{−1/2} exp(−0.5 (x_t − μ_j)^T C_j^{−1} (x_t − μ_j))].

72 Updating HMM Parameters. w_k = Σ_t γ_{s,k}(t) / Σ_j Σ_t γ_{s,j}(t); μ_k = Σ_t γ_{s,k}(t) x_t / Σ_t γ_{s,k}(t); C_k = Σ_t γ_{s,k}(t) (x_t − μ_k)(x_t − μ_k)^T / Σ_t γ_{s,k}(t). Note: every observation contributes to the update of the parameter values of every Gaussian of every state.

73 Overall Training Procedure: Single Gaussian PDF. Determine a topology for the HMM and initialize all HMM parameters: initialize all allowed transitions to have the same probability, and initialize all state output densities to be Gaussian (we'll revisit initialization). Then: 1. Over all utterances, compute the sufficient statistics Σ_t γ_s(t), Σ_t γ_s(t) x_t, and Σ_t γ_s(t) (x_t − μ_s)(x_t − μ_s)^T. 2. Use the update formulae to compute new HMM parameters. 3. If the overall probability of the training data has not converged, return to step 1.

74 An Implementational Detail. Step 1 computes buffers over all utterances. This can be split and parallelized: the sums Σ γ, Σ γ x, and Σ γ (x − μ)(x − μ)^T decompose over disjoint subsets of utterances U1, U2, etc., which can be processed on separate machines (machine 1 accumulates its buffers over U1, machine 2 over U2, and so on).

75 An Implementational Detail. Step 2 aggregates and adds the buffers before updating the model: the per-utterance-set sums of γ, γ x, and γ (x − μ)(x − μ)^T are added together, and the totals are plugged into the update formulae for μ_k, C_k, and w_k.

76 An Implementational Detail. Step 2, aggregating and adding the buffers: Σ γ = Σ_{U1} γ + Σ_{U2} γ + ...; Σ γ x = Σ_{U1} γ x + Σ_{U2} γ x + ...; Σ γ (x − μ)(x − μ)^T = Σ_{U1} γ (x − μ)(x − μ)^T + Σ_{U2} γ (x − μ)(x − μ)^T + ... The aggregated sums (computed by machine 1, machine 2, ...) are then used in the update rules for μ_k, w_k, and C_k.

77 Training HMMs with Gaussian Mixture State Output Distributions. Gaussian mixtures are obtained by splitting: 1. Train an HMM with (single) Gaussian state output distributions. 2. Split the Gaussian with the largest variance: perturb the mean by adding and subtracting a small number, which gives 2 Gaussians, and partition the mixture weight of the original Gaussian into two halves, one for each new Gaussian. A mixture with N Gaussians now becomes a mixture of N+1 Gaussians. 3. Iterate Baum-Welch to convergence. 4. If the desired number of Gaussians has not been obtained, return to 2.

78 Splitting a Gaussian. (Figure: a Gaussian with mean μ is split into two Gaussians with means μ − ε and μ + ε.) The mixture weight w of the original Gaussian gets shared as 0.5 w by each of the two split Gaussians.
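A sketch of the split step for a 1-D Gaussian. The perturbation size ε is a design choice; tying it to a fraction of the standard deviation is an assumption on my part, not a value from the lecture.

```python
def split_gaussian(w, mu, var, eps=0.2):
    # Perturb the mean up and down, keep the variance, halve the weight.
    d = eps * var ** 0.5
    return [(0.5 * w, mu - d, var), (0.5 * w, mu + d, var)]
```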

79 Implementation of BW: underflow. Arithmetic underflow is a problem. α_s(t) = Σ_{s'} α_{s'}(t−1) P(s|s') P(x_t|s): the alpha terms are a recursive product of probability terms. As t increases, an increasingly greater number of probability terms is factored into the alphas, and all probability terms are less than 1. (State output probabilities are actually probability densities, and density values can be greater than 1; on the other hand, for high-dimensional data, density values are usually much less than 1.) With increasing time the alpha values decrease, and within a few time instants they underflow to 0: every alpha goes to 0 at some time, and all subsequent alphas remain 0. As the dimensionality of the data increases, the alphas go to 0 faster.

80 Underflow: Solutions. One method of avoiding underflow is to scale all alphas at each time instant: scale with respect to the largest alpha, to make sure the largest scaled alpha is 1.0, or scale with respect to the sum of the alphas, to ensure that all alphas sum to 1.0. The scaling constants must be appropriately accounted for when computing the final probabilities of an observation sequence.
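A sketch of the sum-to-one variant: each frame's alphas are normalized and the log of each frame's normalizer is kept, so the log of the total observation probability can still be recovered (toy values).

```python
import math

def scale(alphas_t):
    # Normalize one frame's alphas; return the log of the scale factor.
    z = sum(alphas_t)
    return [a / z for a in alphas_t], math.log(z)

# Values this small would quickly underflow if multiplied across frames,
# but each individual alpha here is still representable.
scaled, log_z = scale([1e-200, 3e-200])
```

Summing the per-frame log normalizers over all frames gives log P(X | λ) without ever forming the underflowing product.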

81 Implementation of BW: underflow. Similarly, arithmetic underflow can occur during beta computation: β_s(t) = Σ_{s'} β_{s'}(t+1) P(s'|s) P(x_{t+1}|s'). The beta terms are also a recursive product of probability terms and can underflow. Underflow can be prevented by scaling (divide all beta terms by a constant that prevents underflow) or by performing the beta computation in the log domain.

82 Building a recognizer for isolated words. We now have all the necessary components to build an HMM-based recognizer for isolated words, where each word is spoken by itself in isolation: e.g. a simple application where one may say either "Yes" or "No" to a recognizer and it must recognize what was said.

83 Isolated Word Recognition with HMMs. Assuming all words are equally likely. Training: collect a set of training recordings for each word, compute feature vector sequences for the words, and train an HMM for each word. Recognition: compute the feature vector sequence for the test utterance; compute the forward probability of the feature vector sequence from the HMM for each word (or alternately compute the best state sequence probability using Viterbi); select the word for which this value is highest.
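The recognition step is then a one-line argmax over per-word scores. The log-scores below are placeholders standing in for the forward (or Viterbi) log-probabilities of one test utterance under each word's HMM; the values are made up.

```python
# Hypothetical per-word log-probabilities for one test utterance.
log_scores = {"yes": -42.1, "no": -57.8}

def recognize(log_scores):
    # With equally likely words, pick the word whose HMM scores highest.
    return max(log_scores, key=log_scores.get)
```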

84 Issues. What topology to use for the HMMs? How many states? What kind of transition structure? If state output densities have Gaussian mixtures: how many Gaussians?

85 HMM Topology. For speech, a left-to-right topology works best: the Bakis topology. Note that the initial state probability P(s) is 1 for the first state and 0 for the others; this need not be learned. States may be skipped.
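A 3-state Bakis transition matrix might look as follows; the probabilities are toy values, and the zeros encode the left-to-right constraint and the one-state skip.

```python
# Left-to-right (Bakis) topology: stay, advance, or skip one state.
T = [
    [0.5, 0.3, 0.2],   # state 0: self-loop, next state, skip
    [0.0, 0.6, 0.4],   # state 1: no backward transitions
    [0.0, 0.0, 1.0],   # state 2: absorbing final state
]
pi = [1.0, 0.0, 0.0]   # P(s) = 1 for the first state, 0 elsewhere
```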

86 Determining the Number of States. How do we know the number of states to use for any word? We do not, really. Ideally there should be at least one state for each basic sound within the word; otherwise widely differing sounds may be collapsed into one state, and the average feature vector for that state would be a poor representation. For computational efficiency, the number of states should be small. These are conflicting requirements, usually resolved by making some educated guesses.

87 Determining the Number of States. For small vocabularies it is possible to examine each word in detail and arrive at a reasonable number: e.g. SOMETHING broken into the sounds S O ME TH I NG. For larger vocabularies we may be forced to rely on some ad hoc principle, e.g. proportional to the number of letters in the word. This works better for some languages than others: Spanish and Indian languages are good examples, where almost every letter in a word produces a sound.

88 How many Gaussians? No clear answer for this either. The number of Gaussians is usually a function of the amount of training data available, and is often set by trial and error. A minimum of 4 Gaussians is usually required for reasonable recognition.

89 Implementation of BW: initialization of alphas and betas. Initialization for alpha: α_s(1) is set to 0 for all states except the first state of the model; for the first state, α_s(1) is set to P(o_1|s). All observations must begin at the first state. Initialization for beta: β_s(T) is set to 0 for all states except the terminating state; β_s(T) is set to 1 for this state. All observations must terminate at the final state.

90 Initializing State Output Density Parameters. 1. Initially only a single Gaussian per state is assumed; mixtures are obtained by splitting Gaussians. 2. For the Bakis topology, a good initialization is the flat initialization: compute the global mean and variance of all feature vectors in all training instances of the word, and initialize all Gaussians (i.e. all state output distributions) with this mean and variance. Their means and variances will converge to appropriate values automatically with iteration; Gaussian splitting to compute Gaussian mixtures takes care of the rest.

91 Isolated word recognition: Final thoughts. All relevant topics covered: how to compute features from recordings of the words (we will not explicitly refer to feature computation in future lectures); how to set HMM topologies for the words; how to train HMMs for the words (the Baum-Welch algorithm); and how to select the most probable HMM for a test instance, computing probabilities using the forward algorithm or using the Viterbi algorithm, which also gives the state segmentation.

92 Questions?


More information

Soviet Rail Network, 1955

Soviet Rail Network, 1955 Sovie Rail Nework, 1 Reference: On he hiory of he ranporaion and maximum flow problem. Alexander Schrijver in Mah Programming, 1: 3,. Maximum Flow and Minimum Cu Max flow and min cu. Two very rich algorihmic

More information

18 Extensions of Maximum Flow

18 Extensions of Maximum Flow Who are you?" aid Lunkwill, riing angrily from hi ea. Wha do you wan?" I am Majikhie!" announced he older one. And I demand ha I am Vroomfondel!" houed he younger one. Majikhie urned on Vroomfondel. I

More information

CHAPTER 7: SECOND-ORDER CIRCUITS

CHAPTER 7: SECOND-ORDER CIRCUITS EEE5: CI RCUI T THEORY CHAPTER 7: SECOND-ORDER CIRCUITS 7. Inroducion Thi chaper conider circui wih wo orage elemen. Known a econd-order circui becaue heir repone are decribed by differenial equaion ha

More information

5.2 GRAPHICAL VELOCITY ANALYSIS Polygon Method

5.2 GRAPHICAL VELOCITY ANALYSIS Polygon Method ME 352 GRHICL VELCITY NLYSIS 52 GRHICL VELCITY NLYSIS olygon Mehod Velociy analyi form he hear of kinemaic and dynamic of mechanical yem Velociy analyi i uually performed following a poiion analyi; ie,

More information

u(t) Figure 1. Open loop control system

u(t) Figure 1. Open loop control system Open loop conrol v cloed loop feedbac conrol The nex wo figure preen he rucure of open loop and feedbac conrol yem Figure how an open loop conrol yem whoe funcion i o caue he oupu y o follow he reference

More information

Flow networks. Flow Networks. A flow on a network. Flow networks. The maximum-flow problem. Introduction to Algorithms, Lecture 22 December 5, 2001

Flow networks. Flow Networks. A flow on a network. Flow networks. The maximum-flow problem. Introduction to Algorithms, Lecture 22 December 5, 2001 CS 545 Flow Nework lon Efra Slide courey of Charle Leieron wih mall change by Carola Wenk Flow nework Definiion. flow nework i a direced graph G = (V, E) wih wo diinguihed verice: a ource and a ink. Each

More information

To become more mathematically correct, Circuit equations are Algebraic Differential equations. from KVL, KCL from the constitutive relationship

To become more mathematically correct, Circuit equations are Algebraic Differential equations. from KVL, KCL from the constitutive relationship Laplace Tranform (Lin & DeCarlo: Ch 3) ENSC30 Elecric Circui II The Laplace ranform i an inegral ranformaion. I ranform: f ( ) F( ) ime variable complex variable From Euler > Lagrange > Laplace. Hence,

More information

Chapter 7: Inverse-Response Systems

Chapter 7: Inverse-Response Systems Chaper 7: Invere-Repone Syem Normal Syem Invere-Repone Syem Baic Sar ou in he wrong direcion End up in he original eady-ae gain value Two or more yem wih differen magniude and cale in parallel Main yem

More information

Introduction to Bayesian Estimation. McGill COMP 765 Sept 12 th, 2017

Introduction to Bayesian Estimation. McGill COMP 765 Sept 12 th, 2017 Inrodcion o Baesian Esimaion McGill COM 765 Sep 2 h 207 Where am I? or firs core problem Las class: We can model a robo s moions and he world as spaial qaniies These are no perfec and herefore i is p o

More information

Soviet Rail Network, 1955

Soviet Rail Network, 1955 7.1 Nework Flow Sovie Rail Nework, 19 Reerence: On he hiory o he ranporaion and maximum low problem. lexander Schrijver in Mah Programming, 91: 3, 00. (See Exernal Link ) Maximum Flow and Minimum Cu Max

More information

CS 473G Lecture 15: Max-Flow Algorithms and Applications Fall 2005

CS 473G Lecture 15: Max-Flow Algorithms and Applications Fall 2005 CS 473G Lecure 1: Max-Flow Algorihm and Applicaion Fall 200 1 Max-Flow Algorihm and Applicaion (November 1) 1.1 Recap Fix a direced graph G = (V, E) ha doe no conain boh an edge u v and i reveral v u,

More information

Hidden Markov Models. Adapted from. Dr Catherine Sweeney-Reed s slides

Hidden Markov Models. Adapted from. Dr Catherine Sweeney-Reed s slides Hidden Markov Models Adaped from Dr Caherine Sweeney-Reed s slides Summary Inroducion Descripion Cenral in HMM modelling Exensions Demonsraion Specificaion of an HMM Descripion N - number of saes Q = {q

More information

Network flows. The problem. c : V V! R + 0 [ f+1g. flow network G = (V, E, c), a source s and a sink t uv not in E implies c(u, v) = 0

Network flows. The problem. c : V V! R + 0 [ f+1g. flow network G = (V, E, c), a source s and a sink t uv not in E implies c(u, v) = 0 Nework flow The problem Seing flow nework G = (V, E, c), a orce and a ink no in E implie c(, ) = 0 Flow from o capaciy conrain kew-ymmery flow-coneraion ale of he flow jfj = P 2V Find a maximm flow from

More information

Network Flows UPCOPENCOURSEWARE number 34414

Network Flows UPCOPENCOURSEWARE number 34414 Nework Flow UPCOPENCOURSEWARE number Topic : F.-Javier Heredia Thi work i licened under he Creaive Common Aribuion- NonCommercial-NoDeriv. Unpored Licene. To view a copy of hi licene, vii hp://creaivecommon.org/licene/by-nc-nd/./

More information

CSC 364S Notes University of Toronto, Spring, The networks we will consider are directed graphs, where each edge has associated with it

CSC 364S Notes University of Toronto, Spring, The networks we will consider are directed graphs, where each edge has associated with it CSC 36S Noe Univeriy of Torono, Spring, 2003 Flow Algorihm The nework we will conider are direced graph, where each edge ha aociaed wih i a nonnegaive capaciy. The inuiion i ha if edge (u; v) ha capaciy

More information

! Abstraction for material flowing through the edges. ! G = (V, E) = directed graph, no parallel edges.

! Abstraction for material flowing through the edges. ! G = (V, E) = directed graph, no parallel edges. Sovie Rail Nework, haper Nework Flow Slide by Kevin Wayne. opyrigh Pearon-ddion Weley. ll righ reerved. Reference: On he hiory of he ranporaion and maximum flow problem. lexander Schrijver in Mah Programming,

More information

1 Adjusted Parameters

1 Adjusted Parameters 1 Adjued Parameer Here, we li he exac calculaion we made o arrive a our adjued parameer Thee adjumen are made for each ieraion of he Gibb ampler, for each chain of he MCMC The regional memberhip of each

More information

Physics 240: Worksheet 16 Name

Physics 240: Worksheet 16 Name Phyic 4: Workhee 16 Nae Non-unifor circular oion Each of hee proble involve non-unifor circular oion wih a conan α. (1) Obain each of he equaion of oion for non-unifor circular oion under a conan acceleraion,

More information

Graphs III - Network Flow

Graphs III - Network Flow Graph III - Nework Flow Flow nework eup graph G=(V,E) edge capaciy w(u,v) 0 - if edge doe no exi, hen w(u,v)=0 pecial verice: ource verex ; ink verex - no edge ino and no edge ou of Aume every verex v

More information

Using Lagrangian relaxation in optimisation of unit commitment and planning

Using Lagrangian relaxation in optimisation of unit commitment and planning Faklea za lekroehniko va horin Heike Brand Chrioph Weber Uing Lagrangian relaxaion in opimiaion of ni commimen and planning SCGN Dicion aper No. 3 Conrac No. NK5-C--94 rojec co-fnded by he ropean Commniy

More information

Vehicle Arrival Models : Headway

Vehicle Arrival Models : Headway Chaper 12 Vehicle Arrival Models : Headway 12.1 Inroducion Modelling arrival of vehicle a secion of road is an imporan sep in raffic flow modelling. I has imporan applicaion in raffic flow simulaion where

More information

Math 10B: Mock Mid II. April 13, 2016

Math 10B: Mock Mid II. April 13, 2016 Name: Soluions Mah 10B: Mock Mid II April 13, 016 1. ( poins) Sae, wih jusificaion, wheher he following saemens are rue or false. (a) If a 3 3 marix A saisfies A 3 A = 0, hen i canno be inverible. True.

More information

Buckling of a structure means failure due to excessive displacements (loss of structural stiffness), and/or

Buckling of a structure means failure due to excessive displacements (loss of structural stiffness), and/or Buckling Buckling of a rucure mean failure due o exceive diplacemen (lo of rucural iffne), and/or lo of abiliy of an equilibrium configuraion of he rucure The rule of humb i ha buckling i conidered a mode

More information

State-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter

State-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter Sae-Space Models Iniializaion, Esimaion and Smoohing of he Kalman Filer Iniializaion of he Kalman Filer The Kalman filer shows how o updae pas predicors and he corresponding predicion error variances when

More information

CSE 521: Design & Analysis of Algorithms I

CSE 521: Design & Analysis of Algorithms I CSE 52: Deign & Analyi of Algorihm I Nework Flow Paul Beame Biparie Maching Given: A biparie graph G=(V,E) M E i a maching in G iff no wo edge in M hare a verex Goal: Find a maching M in G of maximum poible

More information

EECE 301 Signals & Systems Prof. Mark Fowler

EECE 301 Signals & Systems Prof. Mark Fowler EECE 31 Signal & Syem Prof. Mark Fowler Noe Se #27 C-T Syem: Laplace Tranform Power Tool for yem analyi Reading Aignmen: Secion 6.1 6.3 of Kamen and Heck 1/18 Coure Flow Diagram The arrow here how concepual

More information

Notes on cointegration of real interest rates and real exchange rates. ρ (2)

Notes on cointegration of real interest rates and real exchange rates. ρ (2) Noe on coinegraion of real inere rae and real exchange rae Charle ngel, Univeriy of Wiconin Le me ar wih he obervaion ha while he lieraure (mo prominenly Meee and Rogoff (988) and dion and Paul (993))

More information

Probabilistic Robotics Sebastian Thrun-- Stanford

Probabilistic Robotics Sebastian Thrun-- Stanford robabilisic Roboics Sebasian Thrn-- Sanford Inrodcion robabiliies Baes rle Baes filers robabilisic Roboics Ke idea: Eplici represenaion of ncerain sing he calcls of probabili heor ercepion sae esimaion

More information

Today: Max Flow Proofs

Today: Max Flow Proofs Today: Max Flow Proof COSC 58, Algorihm March 4, 04 Many of hee lide are adaped from everal online ource Reading Aignmen Today cla: Chaper 6 Reading aignmen for nex cla: Chaper 7 (Amorized analyi) In-Cla

More information

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.

More information

Average Case Lower Bounds for Monotone Switching Networks

Average Case Lower Bounds for Monotone Switching Networks Average Cae Lower Bound for Monoone Swiching Nework Yuval Filmu, Toniann Piai, Rober Robere, Sephen Cook Deparmen of Compuer Science Univeriy of Torono Monoone Compuaion (Refreher) Monoone circui were

More information

Greedy. I Divide and Conquer. I Dynamic Programming. I Network Flows. Network Flow. I Previous topics: design techniques

Greedy. I Divide and Conquer. I Dynamic Programming. I Network Flows. Network Flow. I Previous topics: design techniques Algorihm Deign Technique CS : Nework Flow Dan Sheldon April, reedy Divide and Conquer Dynamic Programming Nework Flow Comparion Nework Flow Previou opic: deign echnique reedy Divide and Conquer Dynamic

More information

Speaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis

Speaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis Speaker Adapaion Techniques For Coninuous Speech Using Medium and Small Adapaion Daa Ses Consaninos Boulis Ouline of he Presenaion Inroducion o he speaker adapaion problem Maximum Likelihood Sochasic Transformaions

More information

4.2 Continuous-Time Systems and Processes Problem Definition Let the state variable representation of a linear system be

4.2 Continuous-Time Systems and Processes Problem Definition Let the state variable representation of a linear system be 4 COVARIANCE ROAGAION 41 Inrodcion Now ha we have compleed or review of linear sysems and random processes, we wan o eamine he performance of linear sysems ecied by random processes he sandard approach

More information

1 Review of Zero-Sum Games

1 Review of Zero-Sum Games COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any

More information

16 Max-Flow Algorithms and Applications

16 Max-Flow Algorithms and Applications Algorihm A proce canno be underood by opping i. Underanding mu move wih he flow of he proce, mu join i and flow wih i. The Fir Law of Mena, in Frank Herber Dune (196) There a difference beween knowing

More information

Exponential Sawtooth

Exponential Sawtooth ECPE 36 HOMEWORK 3: PROPERTIES OF THE FOURIER TRANSFORM SOLUTION. Exponenial Sawooh: The eaie way o do hi problem i o look a he Fourier ranform of a ingle exponenial funcion, () = exp( )u(). From he able

More information

Lecture 3: Exponential Smoothing

Lecture 3: Exponential Smoothing NATCOR: Forecasing & Predicive Analyics Lecure 3: Exponenial Smoohing John Boylan Lancaser Cenre for Forecasing Deparmen of Managemen Science Mehods and Models Forecasing Mehod A (numerical) procedure

More information

Wrap up: Weighted, directed graph shortest path Minimum Spanning Tree. Feb 25, 2019 CSCI211 - Sprenkle

Wrap up: Weighted, directed graph shortest path Minimum Spanning Tree. Feb 25, 2019 CSCI211 - Sprenkle Objecive Wrap up: Weighed, direced graph hore pah Minimum Spanning Tree eb, 1 SI - Sprenkle 1 Review Wha are greedy algorihm? Wha i our emplae for olving hem? Review he la problem we were working on: Single-ource,

More information

Matching. Slides designed by Kevin Wayne.

Matching. Slides designed by Kevin Wayne. Maching Maching. Inpu: undireced graph G = (V, E). M E i a maching if each node appear in a mo edge in M. Max maching: find a max cardinaliy maching. Slide deigned by Kevin Wayne. Biparie Maching Biparie

More information

Echocardiography Project and Finite Fourier Series

Echocardiography Project and Finite Fourier Series Echocardiography Projec and Finie Fourier Series 1 U M An echocardiagram is a plo of how a porion of he hear moves as he funcion of ime over he one or more hearbea cycles If he hearbea repeas iself every

More information

, the. L and the L. x x. max. i n. It is easy to show that these two norms satisfy the following relation: x x n x = (17.3) max

, the. L and the L. x x. max. i n. It is easy to show that these two norms satisfy the following relation: x x n x = (17.3) max ecure 8 7. Sabiliy Analyi For an n dimenional vecor R n, he and he vecor norm are defined a: = T = i n i (7.) I i eay o how ha hee wo norm aify he following relaion: n (7.) If a vecor i ime-dependen, hen

More information

Introduction to SLE Lecture Notes

Introduction to SLE Lecture Notes Inroducion o SLE Lecure Noe May 13, 16 - The goal of hi ecion i o find a ufficien condiion of λ for he hull K o be generaed by a imple cure. I urn ou if λ 1 < 4 hen K i generaed by a imple curve. We will

More information

RL Lecture 7: Eligibility Traces. R. S. Sutton and A. G. Barto: Reinforcement Learning: An Introduction 1

RL Lecture 7: Eligibility Traces. R. S. Sutton and A. G. Barto: Reinforcement Learning: An Introduction 1 RL Lecure 7: Eligibiliy Traces R. S. Suon and A. G. Baro: Reinforcemen Learning: An Inroducion 1 N-sep TD Predicion Idea: Look farher ino he fuure when you do TD backup (1, 2, 3,, n seps) R. S. Suon and

More information

4/12/12. Applications of the Maxflow Problem 7.5 Bipartite Matching. Bipartite Matching. Bipartite Matching. Bipartite matching: the flow network

4/12/12. Applications of the Maxflow Problem 7.5 Bipartite Matching. Bipartite Matching. Bipartite Matching. Bipartite matching: the flow network // Applicaion of he Maxflow Problem. Biparie Maching Biparie Maching Biparie maching. Inpu: undireced, biparie graph = (, E). M E i a maching if each node appear in a mo one edge in M. Max maching: find

More information

T L. t=1. Proof of Lemma 1. Using the marginal cost accounting in Equation(4) and standard arguments. t )+Π RB. t )+K 1(Q RB

T L. t=1. Proof of Lemma 1. Using the marginal cost accounting in Equation(4) and standard arguments. t )+Π RB. t )+K 1(Q RB Elecronic Companion EC.1. Proofs of Technical Lemmas and Theorems LEMMA 1. Le C(RB) be he oal cos incurred by he RB policy. Then we have, T L E[C(RB)] 3 E[Z RB ]. (EC.1) Proof of Lemma 1. Using he marginal

More information

Presentation Overview

Presentation Overview Acion Refinemen in Reinforcemen Learning by Probabiliy Smoohing By Thomas G. Dieerich & Didac Busques Speaer: Kai Xu Presenaion Overview Bacground The Probabiliy Smoohing Mehod Experimenal Sudy of Acion

More information

Rough Paths and its Applications in Machine Learning

Rough Paths and its Applications in Machine Learning Pah ignaure Machine learning applicaion Rough Pah and i Applicaion in Machine Learning July 20, 2017 Rough Pah and i Applicaion in Machine Learning Pah ignaure Machine learning applicaion Hiory and moivaion

More information

I Let E(v! v 0 ) denote the event that v 0 is selected instead of v I The block error probability is the union of such events

I Let E(v! v 0 ) denote the event that v 0 is selected instead of v I The block error probability is the union of such events ED042 Error Conrol Coding Kodningseknik) Chaper 3: Opimal Decoding Mehods, Par ML Decoding Error Proailiy Sepemer 23, 203 ED042 Error Conrol Coding: Chaper 3 20 / 35 Pairwise Error Proailiy Assme ha v

More information

m = 41 members n = 27 (nonfounders), f = 14 (founders) 8 markers from chromosome 19

m = 41 members n = 27 (nonfounders), f = 14 (founders) 8 markers from chromosome 19 Sequenial Imporance Sampling (SIS) AKA Paricle Filering, Sequenial Impuaion (Kong, Liu, Wong, 994) For many problems, sampling direcly from he arge disribuion is difficul or impossible. One reason possible

More information

Today s topics. CSE 421 Algorithms. Problem Reduction Examples. Problem Reduction. Undirected Network Flow. Bipartite Matching. Problem Reductions

Today s topics. CSE 421 Algorithms. Problem Reduction Examples. Problem Reduction. Undirected Network Flow. Bipartite Matching. Problem Reductions Today opic CSE Algorihm Richard Anderon Lecure Nework Flow Applicaion Prolem Reducion Undireced Flow o Flow Biparie Maching Dijoin Pah Prolem Circulaion Loweround conrain on flow Survey deign Prolem Reducion

More information

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD

PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD HAN XIAO 1. Penalized Leas Squares Lasso solves he following opimizaion problem, ˆβ lasso = arg max β R p+1 1 N y i β 0 N x ij β j β j (1.1) for some 0.

More information

They were originally developed for network problem [Dantzig, Ford, Fulkerson 1956]

They were originally developed for network problem [Dantzig, Ford, Fulkerson 1956] 6. Inroducion... 6. The primal-dual algorihmn... 6 6. Remark on he primal-dual algorihmn... 7 6. A primal-dual algorihmn for he hore pah problem... 8... 9 6.6 A primal-dual algorihmn for he weighed maching

More information

Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765 Sept 14 th, 2017

Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765 Sept 14 th, 2017 Two Popular Bayesian Esimaors: Paricle and Kalman Filers McGill COMP 765 Sep 14 h, 2017 1 1 1, dx x Bel x u x P x z P Recall: Bayes Filers,,,,,,, 1 1 1 1 u z u x P u z u x z P Bayes z = observaion u =

More information

DESIGN OF TENSION MEMBERS

DESIGN OF TENSION MEMBERS CHAPTER Srcral Seel Design LRFD Mehod DESIGN OF TENSION MEMBERS Third Ediion A. J. Clark School of Engineering Deparmen of Civil and Environmenal Engineering Par II Srcral Seel Design and Analysis 4 FALL

More information

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power

Learning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.

More information

t is a basis for the solution space to this system, then the matrix having these solutions as columns, t x 1 t, x 2 t,... x n t x 2 t...

t is a basis for the solution space to this system, then the matrix having these solutions as columns, t x 1 t, x 2 t,... x n t x 2 t... Mah 228- Fri Mar 24 5.6 Marix exponenials and linear sysems: The analogy beween firs order sysems of linear differenial equaions (Chaper 5) and scalar linear differenial equaions (Chaper ) is much sronger

More information

Suggested Solutions to Midterm Exam Econ 511b (Part I), Spring 2004

Suggested Solutions to Midterm Exam Econ 511b (Part I), Spring 2004 Suggeed Soluion o Miderm Exam Econ 511b (Par I), Spring 2004 1. Conider a compeiive equilibrium neoclaical growh model populaed by idenical conumer whoe preference over conumpion ream are given by P β

More information

20. Applications of the Genetic-Drift Model

20. Applications of the Genetic-Drift Model 0. Applicaions of he Geneic-Drif Model 1) Deermining he probabiliy of forming any paricular combinaion of genoypes in he nex generaion: Example: If he parenal allele frequencies are p 0 = 0.35 and q 0

More information

Miscellanea Miscellanea

Miscellanea Miscellanea Miscellanea Miscellanea Miscellanea Miscellanea Miscellanea CENRAL EUROPEAN REVIEW OF ECONOMICS & FINANCE Vol., No. (4) pp. -6 bigniew Śleszński USING BORDERED MARICES FOR DURBIN WASON D SAISIC EVALUAION

More information

HYPOTHESIS TESTING. four steps. 1. State the hypothesis. 2. Set the criterion for rejecting. 3. Compute the test statistics. 4. Interpret the results.

HYPOTHESIS TESTING. four steps. 1. State the hypothesis. 2. Set the criterion for rejecting. 3. Compute the test statistics. 4. Interpret the results. Inrodcion o Saisics in Psychology PSY Professor Greg Francis Lecre 23 Hypohesis esing for correlaions Is here a correlaion beween homework and exam grades? for seps. Sae he hypohesis. 2. Se he crierion

More information

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles Diebold, Chaper 7 Francis X. Diebold, Elemens of Forecasing, 4h Ediion (Mason, Ohio: Cengage Learning, 006). Chaper 7. Characerizing Cycles Afer compleing his reading you should be able o: Define covariance

More information

Modeling the Evolution of Demand Forecasts with Application to Safety Stock Analysis in Production/Distribution Systems

Modeling the Evolution of Demand Forecasts with Application to Safety Stock Analysis in Production/Distribution Systems Modeling he Evoluion of Demand oreca wih Applicaion o Safey Sock Analyi in Producion/Diribuion Syem David Heah and Peer Jackon Preened by Kai Jiang Thi ummary preenaion baed on: Heah, D.C., and P.L. Jackon.

More information

ODEs II, Lecture 1: Homogeneous Linear Systems - I. Mike Raugh 1. March 8, 2004

ODEs II, Lecture 1: Homogeneous Linear Systems - I. Mike Raugh 1. March 8, 2004 ODEs II, Lecure : Homogeneous Linear Sysems - I Mike Raugh March 8, 4 Inroducion. In he firs lecure we discussed a sysem of linear ODEs for modeling he excreion of lead from he human body, saw how o ransform

More information

Geometric Path Problems with Violations

Geometric Path Problems with Violations Click here o view linked Reference 1 1 1 0 1 0 1 0 1 0 1 Geomeric Pah Problem wih Violaion Anil Mahehwari 1, Subha C. Nandy, Drimi Paanayak, Saanka Roy and Michiel Smid 1 1 School of Compuer Science, Carleon

More information

HYPOTHESIS TESTING. four steps. 1. State the hypothesis and the criterion. 2. Compute the test statistic. 3. Compute the p-value. 4.

HYPOTHESIS TESTING. four steps. 1. State the hypothesis and the criterion. 2. Compute the test statistic. 3. Compute the p-value. 4. Inrodcion o Saisics in Psychology PSY Professor Greg Francis Lecre 24 Hypohesis esing for correlaions Is here a correlaion beween homework and exam grades? for seps. Sae he hypohesis and he crierion 2.

More information

Types of Exponential Smoothing Methods. Simple Exponential Smoothing. Simple Exponential Smoothing

Types of Exponential Smoothing Methods. Simple Exponential Smoothing. Simple Exponential Smoothing M Business Forecasing Mehods Exponenial moohing Mehods ecurer : Dr Iris Yeung Room No : P79 Tel No : 788 8 Types of Exponenial moohing Mehods imple Exponenial moohing Double Exponenial moohing Brown s

More information

Viterbi Algorithm: Background

Viterbi Algorithm: Background Vierbi Algorihm: Background Jean Mark Gawron March 24, 2014 1 The Key propery of an HMM Wha is an HMM. Formally, i has he following ingrediens: 1. a se of saes: S 2. a se of final saes: F 3. an iniial

More information

10. State Space Methods

10. State Space Methods . Sae Space Mehods. Inroducion Sae space modelling was briefly inroduced in chaper. Here more coverage is provided of sae space mehods before some of heir uses in conrol sysem design are covered in he

More information

Lecture 2 October ε-approximation of 2-player zero-sum games

Lecture 2 October ε-approximation of 2-player zero-sum games Opimizaion II Winer 009/10 Lecurer: Khaled Elbassioni Lecure Ocober 19 1 ε-approximaion of -player zero-sum games In his lecure we give a randomized ficiious play algorihm for obaining an approximae soluion

More information

Speech and Language Processing

Speech and Language Processing Speech and Language rocessing Lecure 4 Variaional inference and sampling Informaion and Communicaions Engineering Course Takahiro Shinozaki 08//5 Lecure lan (Shinozaki s par) I gives he firs 6 lecures

More information