Hidden Markov Models


1 Machine Learning for Signal Processing: Hidden Markov Models

2 Administrivia. HW2 due Tuesday. Is everyone on the projects page? Where are your project proposals?

3 Recap: What is an HMM. A probabilistic function of a Markov chain. Models a dynamical system: the system goes through a number of states, following a Markov chain model, and on arriving at any state it generates observations according to a state-specific probability distribution.

4 A Thought Experiment. "I just called out the 6 from the blue guy... gotta switch to pattern 2..." Two shooters roll dice, and a caller calls out the number rolled; we only get to hear what he calls out. The caller behaves randomly: if he has just called a number rolled by the blue shooter, his next call is that of the red shooter 70% of the time; but if he has just called the red shooter, he has only a 40% probability of calling the red shooter again in the next call. How do we characterize this?

5 A Thought Experiment. P(X | blue), P(X | red). The dots and arrows represent the states of the caller: when he is on the blue circle he calls out the blue dice, and when he is on the red circle he calls out the red dice. The histograms represent the probability distributions of the numbers for the blue and red dice.

6 A Thought Experiment. When the caller is in any state, he calls a number based on the probability distribution of that state; we call these state output distributions. At each step, he moves from his current state to another state following a probability distribution; we call these transition probabilities. The caller is an HMM!!!

7 What is an HMM. HMMs are statistical models for (causal) processes. The model assumes that the process can be in one of a number of states at any time instant. The state of the process at any time instant depends only on the state at the previous instant (causal, Markovian). At each instant the process generates an observation from a probability distribution that is specific to the current state. The generated observations are all that we get to see; the actual state of the process is not directly observable, hence the qualifier "hidden".

8 Hidden Markov Models. A Hidden Markov Model consists of two components: a state/transition backbone that specifies how many states there are and how they can follow one another (the Markov chain), and a set of probability distributions, one for each state, which specifies the distribution of all vectors in that state (the data distributions). The model can thus be factored into two separate probabilistic entities: a probabilistic Markov chain with states and transitions, and a set of data probability distributions associated with the states.

9 How an HMM models a process. The HMM is assumed to be generating the data: a state sequence, state distributions, and an observation sequence.

10 HMM Parameters. The topology of the HMM: the number of states and the allowed transitions (e.g. here we have 3 states and cannot go from the blue state to the red one). The transition probabilities, often represented as a matrix, where T_ij is the probability that when in state i the process will move to j. The probability of beginning at any state. The state output distributions. Together these form the complete parameter set of the model.

11 HMM state output distributions. The state output distribution is the distribution of data produced from any state. It is typically modelled as Gaussian:
P(x | s) = Gaussian(x; μ_s, Σ_s) = (1 / sqrt((2π)^d |Σ_s|)) exp(-0.5 (x - μ_s)^T Σ_s^{-1} (x - μ_s))
The parameters are μ_s and Σ_s. More typically, it is modelled as a Gaussian mixture:
P(x | s) = Σ_{j=0}^{K-1} w_{s,j} Gaussian(x; μ_{s,j}, Σ_{s,j})
Other distributions may also be used, e.g. histograms in the dice case.

12 The Diagonal Covariance Matrix. Full covariance: all elements are non-zero, and the exponent is -0.5 (x - μ)^T Σ^{-1} (x - μ). Diagonal covariance: off-diagonal elements are zero, and the exponent reduces to -Σ_i (x_i - μ_i)^2 / (2σ_i^2). For GMMs it is frequently assumed that the feature vector dimensions are all independent of each other. Result: the covariance matrix is reduced to a diagonal form, and the determinant of the diagonal matrix is easy to compute.
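
A minimal sketch (not from the slides; function name and NumPy usage are my own) of the diagonal-covariance Gaussian log-density just described:

```python
import numpy as np

def diag_gaussian_logpdf(x, mu, var):
    """Log of N(x; mu, diag(var)) for a d-dimensional vector x."""
    d = x.shape[0]
    # The determinant of a diagonal covariance is the product of the
    # variances, so its log is just a sum of log variances.
    log_det = np.sum(np.log(var))
    quad = np.sum((x - mu) ** 2 / var)
    return -0.5 * (d * np.log(2 * np.pi) + log_det + quad)
```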

13 Three Basic HMM Problems. 1. What is the probability that the model will generate a specific observation sequence? 2. Given an observation sequence, how do we determine which observation was generated from which state (the state segmentation problem)? 3. How do we learn the parameters of the HMM from observation sequences?

14 Computing the Probability of an Observation Sequence. There are two aspects to producing the observation: progressing through a sequence of states, and producing observations from those states.

15 Progressing through states. The process begins at some state (red here). From that state it makes an allowed transition, to arrive at the same or any other state. From that state it makes another allowed transition, and so on.

16 Probability that the HMM will follow a particular state sequence:
P(s_1, s_2, s_3, ...) = P(s_1) P(s_2 | s_1) P(s_3 | s_2) ...
P(s_1) is the probability that the process will initially be in state s_1. P(s_i | s_{i-1}) is the transition probability of moving to state s_i at the next time instant when the system is currently in s_{i-1} (also denoted by T_ij earlier).

17 Generating Observations from States. At each time the process generates an observation from the state it is in at that time.

18 Probability that the HMM will generate a particular observation sequence, given a state sequence (state sequence known):
P(o_1, o_2, o_3, ... | s_1, s_2, s_3, ...) = P(o_1 | s_1) P(o_2 | s_2) P(o_3 | s_3) ...
Each term is computed from the Gaussian or Gaussian mixture for the corresponding state: P(o_i | s_i) is the probability of generating observation o_i when the system is in state s_i.

19 Proceeding through States and Producing Observations. At each time the process produces an observation and makes a transition.

20 Probability that the HMM will generate a particular state sequence and, from it, a particular observation sequence:
P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) = P(o_1, o_2, o_3, ... | s_1, s_2, s_3, ...) P(s_1, s_2, s_3, ...)
= P(o_1 | s_1) P(o_2 | s_2) P(o_3 | s_3) ... P(s_1) P(s_2 | s_1) P(s_3 | s_2) ...

21 Probability of Generating an Observation Sequence. The precise state sequence is not known, so all possible state sequences must be considered:
P(o_1, o_2, o_3, ...) = Σ_{all possible state sequences} P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...)
= Σ_{all possible state sequences} P(o_1 | s_1) P(o_2 | s_2) P(o_3 | s_3) ... P(s_1) P(s_2 | s_1) P(s_3 | s_2) ...

22 Computing it Efficiently. Explicit summing over all state sequences is not tractable: there is a very large number of possible state sequences. Instead we use the forward algorithm, a dynamic programming technique.

23 Illustrative Example. A generic HMM with 5 states and a terminating state, with a left-to-right topology. P(s_1) = 1 for state 1 and 0 for the others. The arrows represent transitions for which the probability is not 0. Notation: P(s_j | s_i) = T_ij, and we write P(o_t | s_i) = b_i(t) for brevity.

24 Diversion: The Trellis. The trellis is a graphical representation of all possible paths through the HMM to produce a given observation. The Y axis represents the HMM states (state index), the X axis the feature vectors (time). Every edge in the graph represents a valid transition in the HMM over a single time step. Every node represents the event of a particular observation being generated from a particular state.

25 The Forward Algorithm.
α(s, t) = P(x_1, x_2, ..., x_t, state(t) = s)
α(s, t) is the total probability of ALL state sequences that end at state s at time t, together with all observations up to x_t.

26 The Forward Algorithm. α(s, t) can be recursively estimated starting from the first time instant (forward recursion):
α(s, t) = Σ_{s'} α(s', t-1) P(s | s') P(x_t | s)
That is, α(s, t) can be computed in terms of α(s', t-1), the forward probabilities at time t-1.

27 The Forward Algorithm. At the final observation, the alpha at each state gives the probability of all state sequences ending at that state. General model: the total probability of the observation is the sum of the alpha values at all states:
TotalProb = Σ_s α(s, T)
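
A minimal sketch of the forward recursion above, assuming a discrete output distribution for simplicity; the array conventions (pi, T, B) are mine, not the lecture's:

```python
import numpy as np

def forward(pi, T, B, obs):
    """pi[i]: initial prob., T[i,j]: P(j|i), B[i,k]: P(symbol k | state i)."""
    N, L = T.shape[0], len(obs)
    alpha = np.zeros((N, L))
    alpha[:, 0] = pi * B[:, obs[0]]                  # initialization
    for t in range(1, L):
        # alpha(s,t) = sum_{s'} alpha(s',t-1) P(s|s') P(x_t|s)
        alpha[:, t] = (alpha[:, t - 1] @ T) * B[:, obs[t]]
    return alpha, alpha[:, -1].sum()                 # total probability
```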

28 The absorbing state. Observation sequences are assumed to end only when the process arrives at an absorbing state. No observations are produced from the absorbing state.

29 The Forward Algorithm. Absorbing state model: the total probability is the alpha computed at the absorbing state after the final observation:
TotalProb = α(s_absorbing, T+1), where α(s_absorbing, T+1) = Σ_{s'} α(s', T) P(s_absorbing | s')

30 Problem 2: State segmentation. Given only a sequence of observations, how do we determine which sequence of states was followed in producing it?

31 The HMM as a generator. The process goes through a series of states and produces observations from them.

32 States are hidden. The observations do not reveal the underlying state.

33 The state segmentation problem. State segmentation: estimate the state sequence given the observations.

34 Estimating the State Sequence. Many different state sequences are capable of producing the observation. Solution: identify the most probable state sequence, i.e. the state sequence for which the probability of progressing through that sequence and generating the observation sequence is maximum: the sequence for which P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) is maximum.

35-36 Estimating the state sequence. Once again, exhaustive evaluation is impossibly expensive, but once again a simple dynamic-programming solution is available:
P(o_1, o_2, o_3, ..., s_1, s_2, s_3, ...) = P(o_1 | s_1) P(s_1) P(o_2 | s_2) P(s_2 | s_1) P(o_3 | s_3) P(s_3 | s_2) ...
Needed: argmax_{s_1, s_2, s_3, ...} P(o_1 | s_1) P(s_1) P(o_2 | s_2) P(s_2 | s_1) P(o_3 | s_3) P(s_3 | s_2) ...

37 The HMM as a generator. Each enclosed term represents one forward transition and a subsequent emission.

38 The state sequence. The probability of a state sequence ?,?,?,?,s_x,s_y ending at time t and producing all observations until o_t:
P(o_{1..t-1}, ?,?,?,?, s_x, o_t, s_y) = P(o_{1..t-1}, ?,?,?,?, s_x) P(o_t | s_y) P(s_y | s_x)
The best state sequence that ends with s_x, s_y at t will have a probability equal to the probability of the best state sequence ending at t-1 at s_x, times P(o_t | s_y) P(s_y | s_x).

39 Extending the state sequence. The probability of a state sequence ?,?,?,?,s_x,s_y ending at time t and producing observations until o_t:
P(o_{1..t-1}, o_t, ?,?,?,?, s_x, s_y) = P(o_{1..t-1}, ?,?,?,?, s_x) P(o_t | s_y) P(s_y | s_x)

40 Trellis. The graph below shows the set of all possible state sequences through this HMM in five time instants.

41 The cost of extending a state sequence. The cost of extending a state sequence ending at s_x depends only on the transition from s_x to s_y and the observation probability at s_y: P(o_t | s_y) P(s_y | s_x).

42 The cost of extending a state sequence. The best path to s_y through s_x is simply an extension of the best path to s_x: BestP(o_{1..t-1}, ?,?,?,?, s_x) P(o_t | s_y) P(s_y | s_x).

43 The Recursion. The overall best path to s_y is an extension of the best path to one of the states at the previous time.

44 The Recursion. Prob. of best path to s_y = max_{s_x} BestP(o_{1..t-1}, ?,?,?,?, s_x) P(o_t | s_y) P(s_y | s_x).

45 Finding the best state sequence. The simple algorithm just presented is called the VITERBI algorithm in the literature, after A. J. Viterbi, who invented this dynamic programming algorithm for a completely different purpose: decoding error correction codes!
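
A minimal sketch of the Viterbi recursion just described, in the same illustrative (pi, T, B) conventions as the forward sketch above:

```python
import numpy as np

def viterbi(pi, T, B, obs):
    N, L = T.shape[0], len(obs)
    score = np.zeros((N, L))            # best path-score ending at each state
    back = np.zeros((N, L), dtype=int)  # backpointers for the best extension
    score[:, 0] = pi * B[:, obs[0]]
    for t in range(1, L):
        # P_t(j) = max_i [P_{t-1}(i) T_ij] b_j(t)
        cand = score[:, t - 1][:, None] * T      # N x N candidate extensions
        back[:, t] = cand.argmax(axis=0)
        score[:, t] = cand.max(axis=0) * B[:, obs[t]]
    path = [int(score[:, -1].argmax())]          # trace back the best path
    for t in range(L - 1, 0, -1):
        path.append(int(back[path[-1], t]))
    return path[::-1], score[:, -1].max()
```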

46 Viterbi Search (contd.). The initial state is initialized with path-score = P(s_1) b_1(1). All other states have score 0, since P(s) = 0 for them.

47 Viterbi Search (contd.).
P_t(j) = max_i [P_{t-1}(i) a_ij b_j(t)]
P_t(j) is the total path-score ending up at state j at time t; a_ij is the state transition probability from i to j; b_j(t) is the score for state j given the input at time t. (Trellis legend: state with best path-score; state with path-score < best; state without a valid path-score.)

48-55 Viterbi Search (contd.). [Animation frames: the recursion P_t(j) = max_i [P_{t-1}(i) a_ij] b_j(t) is applied one time step at a time, extending the best partial paths across the trellis.]

56 Viterbi Search (contd.). THE BEST STATE SEQUENCE IS THE ESTIMATE OF THE STATE SEQUENCE FOLLOWED IN GENERATING THE OBSERVATION.

57 Problem 3: Training HMM parameters. We can compute the probability of an observation, and the best state sequence given an observation, using the HMM's parameters. But where do the HMM parameters come from? They must be learned from a collection of observation sequences.

58 Learning HMM parameters: a simple procedure, counting. Given a set of training instances, iteratively: 1. Initialize the HMM parameters. 2. Segment all training instances. 3. Estimate transition probabilities and state output probability parameters by counting.

59 Learning by counting: example. Explanation by example in the next few slides: a 2-state HMM, Gaussian PDFs at the states, 3 observation sequences. The example shows ONE iteration: how to count after the state sequences are obtained.

60 Example: Learning HMM Parameters. We have an HMM with two states S1 and S2. Observations are vectors X_ij (i-th sequence, j-th vector). We are given the following three observation sequences, and have already estimated state sequences:

Observation 1:
Time:   1    2    3    4    5    6    7    8    9    10
State:  S1   S1   S2   S2   S2   S1   S1   S2   S1   S1
Obs:    Xa1  Xa2  Xa3  Xa4  Xa5  Xa6  Xa7  Xa8  Xa9  Xa10

Observation 2:
Time:   1    2    3    4    5    6    7    8    9
State:  S2   S2   S1   S1   S2   S2   S2   S2   S1
Obs:    Xb1  Xb2  Xb3  Xb4  Xb5  Xb6  Xb7  Xb8  Xb9

Observation 3:
Time:   1    2    3    4    5    6    7    8
State:  S1   S2   S1   S1   S1   S2   S2   S2
Obs:    Xc1  Xc2  Xc3  Xc4  Xc5  Xc6  Xc7  Xc8

61 Example: Learning HMM Parameters. Initial state probabilities (usually denoted as π): we have 3 observation sequences; 2 of these begin with S1 and one with S2, so π(S1) = 2/3 and π(S2) = 1/3. (Observation tables as above.)

62-65 Example: Learning HMM Parameters. Transition probabilities out of S1: state S1 occurs 11 times in non-terminal locations. Of these, it is followed immediately by S1 6 times, and by S2 5 times. So P(S1|S1) = 6/11 and P(S2|S1) = 5/11.

66-69 Example: Learning HMM Parameters. Transition probabilities out of S2: state S2 occurs 13 times in non-terminal locations. Of these, it is followed immediately by S1 5 times, and by S2 8 times. So P(S1|S2) = 5/13 and P(S2|S2) = 8/13.

70 Parameters learnt so far. State initial probabilities, often denoted as π: π(S1) = 2/3 ≈ 0.667, π(S2) = 1/3 ≈ 0.333. State transition probabilities: P(S1|S1) = 6/11 ≈ 0.545, P(S2|S1) = 5/11 ≈ 0.455, P(S1|S2) = 5/13 ≈ 0.385, P(S2|S2) = 8/13 ≈ 0.615. Represented as a transition matrix:
A = [ P(S1|S1)  P(S2|S1) ] = [ 0.545  0.455 ]
    [ P(S1|S2)  P(S2|S2) ]   [ 0.385  0.615 ]
Each row of this matrix must sum to 1.0.
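
A minimal sketch of these counting estimates, using the three segmented state sequences from the example (plain Python; variable names are my own):

```python
from collections import Counter

seqs = [
    ["S1","S1","S2","S2","S2","S1","S1","S2","S1","S1"],
    ["S2","S2","S1","S1","S2","S2","S2","S2","S1"],
    ["S1","S2","S1","S1","S1","S2","S2","S2"],
]
states = ["S1", "S2"]

# Initial probabilities: fraction of sequences starting in each state.
starts = Counter(seq[0] for seq in seqs)
pi = {s: starts[s] / len(seqs) for s in states}   # pi['S1'] = 2/3, pi['S2'] = 1/3

# Transition probabilities: bigram counts over non-terminal occupancies.
trans = Counter((a, b) for seq in seqs for a, b in zip(seq, seq[1:]))
occ = Counter(s for seq in seqs for s in seq[:-1])
A = {i: {j: trans[(i, j)] / occ[i] for j in states} for i in states}
# A['S1']['S1'] = 6/11, A['S1']['S2'] = 5/11, A['S2']['S2'] = 8/13, ...
```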

71 Example: Learning HMM Parameters. State output probability for S1: there are 13 observations in S1.

72 Example: Learning HMM Parameters. State output probability for S1: segregate out the 13 vectors assigned to S1 (Xa1, Xa2, Xa6, Xa7, Xa9, Xa10, Xb3, Xb4, Xb9, Xc1, Xc3, Xc4, Xc5) and count. Compute the parameters (mean and variance) of the Gaussian output density for state S1:
P(X | S1) = (1 / sqrt((2π)^d |Σ_1|)) exp(-0.5 (X - μ_1)^T Σ_1^{-1} (X - μ_1))
μ_1 = (1/13) Σ X and Σ_1 = (1/13) Σ (X - μ_1)(X - μ_1)^T, with the sums taken over the 13 vectors.

73 Example: Learning HMM Parameters. State output probability for S2: there are 14 observations in S2.

74 Example: Learning HMM Parameters. State output probability for S2: segregate out the 14 vectors assigned to S2 (Xa3, Xa4, Xa5, Xa8, Xb1, Xb2, Xb5, Xb6, Xb7, Xb8, Xc2, Xc6, Xc7, Xc8) and count. Compute the parameters (mean and variance) of the Gaussian output density for state S2:
P(X | S2) = (1 / sqrt((2π)^d |Σ_2|)) exp(-0.5 (X - μ_2)^T Σ_2^{-1} (X - μ_2))
μ_2 = (1/14) Σ X and Σ_2 = (1/14) Σ (X - μ_2)(X - μ_2)^T, with the sums taken over the 14 vectors.
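
A minimal sketch of the per-state mean and covariance estimates above, assuming the vectors assigned to a state have been stacked into a (count x d) NumPy array:

```python
import numpy as np

def gaussian_params(X):
    """Maximum-likelihood Gaussian parameters for one state's vectors."""
    mu = X.mean(axis=0)
    centered = X - mu
    sigma = centered.T @ centered / X.shape[0]   # full covariance estimate
    return mu, sigma
```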

75 We have learnt all the HMM parameters. State initial probabilities: π(S1) = 2/3 ≈ 0.667, π(S2) = 1/3 ≈ 0.333. State transition probabilities: the matrix A. State output probabilities: P(X | S1) and P(X | S2), the Gaussians estimated above.

76 Update rules at each iteration.
π(s) = (no. of observation sequences that start at state s) / (total no. of observation sequences)
P(s_j | s_i) = (no. of instants t, over all sequences, with state(t) = s_i and state(t+1) = s_j) / (no. of instants t with state(t) = s_i)
μ_s = (1 / N_s) Σ_{obs, t: state(t)=s} X_{obs,t}
Σ_s = (1 / N_s) Σ_{obs, t: state(t)=s} (X_{obs,t} - μ_s)(X_{obs,t} - μ_s)^T
where N_s is the number of vectors assigned to state s. This assumes the state output PDF is a Gaussian; for GMMs, estimate the GMM parameters from the collection of observations at each state.

77 Training by segmentation: Viterbi training. Initialize all HMM parameters. Segment all training observation sequences into states using the Viterbi algorithm with the current models. Using the estimated state sequences and the training observation sequences, re-estimate the HMM parameters. Repeat until the models converge. This method is also called a segmental k-means learning procedure.
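
A minimal sketch of this loop, assuming the viterbi sketch above and a hypothetical estimate_by_counting helper that applies the counting updates of the previous slides:

```python
def viterbi_train(params, sequences, n_iters=10):
    """params = (pi, T, B); sequences: a list of observation sequences."""
    for _ in range(n_iters):
        # 1. Segment every training sequence with the current models.
        segs = [viterbi(*params, obs)[0] for obs in sequences]
        # 2. Re-estimate all parameters by counting over the segmentations.
        #    (estimate_by_counting is a placeholder for the update rules above.)
        params = estimate_by_counting(segs, sequences)
    return params
```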

78 Alternative to counting: SOFT counting. Expectation maximization: every observation contributes to every state.

79 Update rules at each iteration (soft counts).
π(s) = (Σ_Obs P(state(1) = s | Obs)) / (total no. of observation sequences)
P(s_j | s_i) = (Σ_Obs Σ_t P(state(t) = s_i, state(t+1) = s_j | Obs)) / (Σ_Obs Σ_t P(state(t) = s_i | Obs))
μ_s = (Σ_Obs Σ_t P(state(t) = s | Obs) X_{Obs,t}) / (Σ_Obs Σ_t P(state(t) = s | Obs))
Σ_s = (Σ_Obs Σ_t P(state(t) = s | Obs) (X_{Obs,t} - μ_s)(X_{Obs,t} - μ_s)^T) / (Σ_Obs Σ_t P(state(t) = s | Obs))
Every observation contributes to every state.

80 Update rules at each iteration (as above). Where did these terms come from?

81 P(state(t) = s | Obs). The probability that the process was at s when it generated X_t, given the entire observation. Dropping the Obs subscript for brevity, this is P(state(t) = s | X_1, X_2, ..., X_T). We will compute P(state(t) = s, X_1, X_2, ..., X_T) first: this is the probability that the process visited s at time t while producing the entire observation.

82 P(state(t) = s, x_1, x_2, ..., x_T). The probability that the HMM was in a particular state s when generating the observation sequence is the probability that it followed a state sequence that passed through s at time t.

83 P(state(t) = s, x_1, x_2, ..., x_T). This can be decomposed into two multiplicative sections: the section of the lattice leading into state s at time t, and the section leading out of it.

84 The Forward Paths. The probability of the red section is the total probability of all state sequences ending at state s at time t. This is simply α(s, t), and can be computed using the forward algorithm.

85 The Backward Paths. The blue portion represents the probability of all state sequences that began at state s at time t. Like the red portion, it can be computed using a backward recursion.

86 The Backward Recursion.
β(s, t) = P(x_{t+1}, x_{t+2}, ..., x_T | state(t) = s)
β(s, t) is the total probability of ALL state sequences that depart from s at time t, and all observations after x_t. It can be recursively estimated starting from the final time instant (backward recursion):
β(s, t) = Σ_{s'} β(s', t+1) P(s' | s) P(x_{t+1} | s')
β(s, T) = 1 at the final time instant for all valid final states.
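
A minimal sketch of the backward recursion above, mirroring the earlier forward sketch (same illustrative pi/T/B conventions):

```python
import numpy as np

def backward(T, B, obs):
    N, L = T.shape[0], len(obs)
    beta = np.zeros((N, L))
    beta[:, -1] = 1.0                    # beta(s,T) = 1 at the final instant
    for t in range(L - 2, -1, -1):
        # beta(s,t) = sum_{s'} beta(s',t+1) P(s'|s) P(x_{t+1}|s')
        beta[:, t] = T @ (B[:, obs[t + 1]] * beta[:, t + 1])
    return beta
```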

87 The complete probability.
α(s, t) β(s, t) = P(x_1, x_2, ..., x_T, state(t) = s)

88 Posterior probability of a state. The probability that the process was in state s at time t, given that we have observed the data, is obtained by simple normalization:
P(state(t) = s | Obs) = P(state(t) = s, x_1, ..., x_T) / Σ_{s'} P(state(t) = s', x_1, ..., x_T) = α(s, t) β(s, t) / Σ_{s'} α(s', t) β(s', t)
This term is often referred to as the gamma term and denoted by γ(s, t).
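
A minimal sketch of the gamma term, built from the forward and backward sketches above:

```python
import numpy as np

def state_posteriors(alpha, beta):
    g = alpha * beta                          # P(state(t)=s, x_1..x_T)
    return g / g.sum(axis=0, keepdims=True)   # normalize over states: gamma
```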

89 Update rules at each iteration (as above). The state posterior terms P(state(t) = s | Obs) have now been found.

90 Update rules at each iteration (as above). Where did the transition terms P(state(t) = s_i, state(t+1) = s_j | Obs) come from?

91-94 P(state(t) = s, state(t+1) = s', x_1, x_2, ..., x_T). This is built up over the trellis in pieces: the probability of all paths entering s at time t is α(s, t); the transition from s to s' while emitting the next observation contributes P(s' | s) P(x_{t+1} | s'); and the paths completing the observation from s' contribute β(s', t+1). Altogether:
P(state(t) = s, state(t+1) = s', x_1, x_2, ..., x_T) = α(s, t) P(s' | s) P(x_{t+1} | s') β(s', t+1)

95 The a posteriori probability of transition.
P(state(t) = s, state(t+1) = s' | Obs) = α(s, t) P(s' | s) P(x_{t+1} | s') β(s', t+1) / Σ_{s1} Σ_{s2} α(s1, t) P(s2 | s1) P(x_{t+1} | s2) β(s2, t+1)
This is the a posteriori probability of a transition given an observation.
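
A minimal sketch of this transition posterior (often called the xi term), again in the illustrative conventions of the earlier sketches:

```python
import numpy as np

def transition_posteriors(alpha, beta, T, B, obs):
    L, N = len(obs), T.shape[0]
    xi = np.zeros((L - 1, N, N))
    for t in range(L - 1):
        # alpha(s,t) P(s'|s) P(x_{t+1}|s') beta(s',t+1), for all (s, s')
        x = alpha[:, t][:, None] * T * (B[:, obs[t + 1]] * beta[:, t + 1])[None, :]
        xi[t] = x / x.sum()              # normalize over all (s, s') pairs
    return xi
```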

96 Update rules at each iteration (as above). All the required terms have now been found.
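
A minimal sketch of one full soft-count re-estimation step for π and the transition matrix, combining the forward, backward, gamma and xi sketches above (the Gaussian updates would follow the same weighting):

```python
import numpy as np

def reestimate(pi, T, B, sequences):
    N = T.shape[0]
    pi_num, xi_sum, gamma_sum = np.zeros(N), np.zeros((N, N)), np.zeros(N)
    for obs in sequences:
        alpha, _ = forward(pi, T, B, obs)
        beta = backward(T, B, obs)
        gamma = state_posteriors(alpha, beta)
        xi = transition_posteriors(alpha, beta, T, B, obs)
        pi_num += gamma[:, 0]                    # soft start counts
        xi_sum += xi.sum(axis=0)                 # soft transition counts
        gamma_sum += gamma[:, :-1].sum(axis=1)   # soft non-terminal occupancy
    return pi_num / len(sequences), xi_sum / gamma_sum[:, None]
```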

97 Training without explicit segmentation: Baum-Welch training. Every feature vector is associated with every state of every HMM with a probability. The probabilities are computed using the forward-backward algorithm, and soft decisions are taken at the level of the HMM state. In practice, segmentation-based Viterbi training is much easier to implement and is much faster; the difference in performance between the two is small, especially if we have lots of training data.

98 HMM Issues. How to find the best state sequence: covered. How to learn HMM parameters: covered. How to compute the probability of an observation sequence: covered.

99 Magic numbers. How many states? There is no nice automatic technique to learn this; you choose. For speech, the HMM topology is usually left to right (no backward transitions); for other cyclic processes, the topology must reflect the nature of the process. Number of states: 3 per phoneme in speech; for other processes, it depends on the estimated number of distinct states in the process.

100 Applications of HMMs. Classification: learn HMMs for the various classes of time series from training data; compute the probability of a test time series under the HMM for each class; use these in a Bayesian classifier. Speech recognition, vision, gene sequencing, character recognition, text mining, topic detection.

101 Applications of HMMs. Segmentation: given HMMs for various events, find event boundaries. Simply find the best state sequence and the locations where state identities change. Automatic speech segmentation, text segmentation by topic, genome segmentation, ...

102 Implementation Issues. For long data sequences arithmetic underflow is a problem: scores are products of numbers that are all less than 1. The Viterbi algorithm provides a workaround: work only with log probabilities. Multiplication changes to addition (computationally faster too), and underflow is almost completely eliminated. For the forward algorithm, more complex normalization schemes must be implemented to prevent underflow, at some computational expense. Often it is not worth it: go with Viterbi.
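
A minimal sketch of the log-probability workaround: the Viterbi recursion with products replaced by sums of logs (same illustrative conventions as before):

```python
import numpy as np

def viterbi_log(pi, T, B, obs):
    with np.errstate(divide="ignore"):        # log(0) -> -inf is fine here
        log_pi, log_T, log_B = np.log(pi), np.log(T), np.log(B)
    N, L = T.shape[0], len(obs)
    score = np.empty((N, L))
    score[:, 0] = log_pi + log_B[:, obs[0]]
    for t in range(1, L):
        # max over predecessors of (previous log score + log transition)
        cand = score[:, t - 1][:, None] + log_T
        score[:, t] = cand.max(axis=0) + log_B[:, obs[t]]
    return score[:, -1].max()                 # log prob of the best path
```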

103 Classification with HMMs. HMM for "Yes", HMM for "No": compare P(Yes) P(X|Yes) with P(No) P(X|No). Speech recognition of isolated words. Training: collect training instances for each word and learn an HMM for each word. Recognition of an observation X: for each word, compute P(X|word) using the forward algorithm; alternately, compute P(X, best.state.sequence|word) using the Viterbi algorithm. Then compute P(word) P(X|word), where P(word) is the a priori probability of the word, and select the word for which P(word) P(X|word) is highest.
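
A minimal sketch of this isolated-word classifier; word_models and priors are illustrative structures, and scoring uses the viterbi_log sketch above:

```python
import numpy as np

def classify(obs, word_models, priors):
    """word_models: word -> (pi, T, B); priors: word -> P(word)."""
    best_word, best_score = None, -np.inf
    for word, (pi, T, B) in word_models.items():
        # log P(word) + log P(X, best state sequence | word)
        score = np.log(priors[word]) + viterbi_log(pi, T, B, obs)
        if score > best_score:
            best_word, best_score = word, score
    return best_word
```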

104 Creating composite models. HMMs with absorbing states can be combined into composites. E.g. train models for "open", "close" and "file", then concatenate them to create models for "open file" and "file close". The composites can then recognize "open file" and "file close".

105 Model graphs. Models can also be composed into graphs, not just linearly. Viterbi state alignment will tell us which portions of the graph were visited for an observation X.

106-107 Recognizing from a graph. Trellis for "Open File" vs. "Close File": the VITERBI best path tells you what was spoken.

108 Language probabilities can be incorporated. Transitions between HMMs can be assigned a probability drawn from properties of the language; here we have shown bigram probabilities such as P(Open), P(Close), P(file|open) and P(file|close).

109 This is used in speech recognition. Recognizing one of four lines from "The Charge of the Light Brigade": "Cannon to right of them", "Cannon to left of them", "Cannon in front of them", "Cannon behind them". Each word is an HMM.

110 Graphs can sometimes be reduced. Recognizing one of the four lines from "The Charge of the Light Brigade": graph reduction does not impede recognition of what was spoken. The arcs of the reduced graph carry probabilities such as P(cannon), P(to|cannon), P(right|to), P(left|to), P(of|right), P(of|left), P(them|of), P(in|cannon), P(behind|cannon) and P(them|behind).

111 Speech recognition: an aside. In speech recognition systems, models are trained for phonemes (actually triphones: phonemes in context). Word HMMs are composed from phoneme HMMs; language HMMs are composed from word HMMs. The graph is reduced using automated techniques. John McDonough talks about WFSTs on Thursday.
