Doctoral Course in Speech Recognition
1 Doctoral Course in Speech Recognition
Friday March 30
Mats Blomberg
March-June 2007
March 29-30, 2007 Speech recognition course 2007 Mats Blomberg
2 General course info
Home page: http:// se_pm.html
Exercises: VQ, CART, HMM decoding, training. Return solutions by May 7.
Term paper: 4-6 pages, max 10. Send to reviewers (2 course participants) by May 6. Reviewers return comments by May 25. Final paper to teacher and the reviewers by June.
Closing seminar: presentation of own paper, active discussions.
3 Course overview
Day #1: Probability, Statistics and Information Theory (pp. 73-131: 59 pages); Pattern Recognition (pp. 133-197: 65 pages); Speech Signal Representations (pp. : pages); Hidden Markov Models (pp. : 37 pages)
Day #2: Hidden Markov Models (cont.); Acoustic Modeling (pp. : 6 pages); Environmental Robustness (pp. : 68 pages); computational exercise
Day #3: Language Modeling (pp. : 46 pages); Basic Search Algorithms (pp. : 53 pages); Large-Vocabulary Search Algorithms (pp. : 4 pages); Applications and User Interfaces (pp. : 38 pages); other topics
Day #4: Closing seminar; presentations of term papers
4 8.2.4 How to Estimate HMM Parameters - the Baum-Welch Algorithm
The most difficult of the three HMM problems.
Unsupervised learning; incomplete data; state sequence unknown.
Use the EM algorithm, implemented by the iterative Baum-Welch (Forward-Backward) algorithm.
5 The EM Algorithm for HMM training
Problem approached: estimate model parameters that maximize the probability that the model has generated the data. The maximum is achieved when the model distribution equals that of the training data.
Estimate distributions (ML) of several classes (model states) when the training data (time frames) are not classified.
Is it possible to train the classes anyway? (Yes - local maximum only.)
Iterative procedure, simplified description:
1. Initialise the class distributions.
2. Using the current parameters, compute the class (state) probabilities for each training sample (time frame).
3. Re-compute every class (state) distribution as a probability-weighted contribution of each sample (time frame).
A moving target: the new distribution will affect the class probabilities, which will in turn result in new parameters, therefore:
4. Repeat 2+3 until convergence. (Will converge.)
The same principle applies to the observation and transition probabilities.
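The iterative procedure above can be sketched in its simplest setting: a one-dimensional, two-class mixture with fixed variance. The data, initial means and iteration count below are illustrative, not from the course:

```python
import math

def em_means(data, means, iters=20, sigma=1.0):
    """Simplified EM for unclassified data: re-estimate class means.

    Step 2: with the current parameters, compute the class probability
            of every training sample.
    Step 3: re-estimate every class mean as the probability-weighted
            contribution of each sample (the "moving target").
    Steps 2+3 are repeated; the likelihood converges to a local maximum.
    """
    for _ in range(iters):
        # Step 2: posterior class probabilities per sample
        post = []
        for x in data:
            lik = [math.exp(-0.5 * ((x - m) / sigma) ** 2) for m in means]
            total = sum(lik)
            post.append([l / total for l in lik])
        # Step 3: probability-weighted re-estimation of each mean
        means = [
            sum(p[k] * x for p, x in zip(post, data)) / sum(p[k] for p in post)
            for k in range(len(means))
        ]
    return means

# two clusters near 0 and 4; the initial guesses are deliberately poor
m = em_means([0.1, -0.2, 0.0, 3.9, 4.1, 4.0], means=[1.0, 3.0])
```

The means migrate to the cluster centres even though no sample was ever labelled, illustrating the "local maximum only" caveat: a bad initialisation can end up in a worse fixed point.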
6 Simplified illustration of EM estimation for HMM training
Say three paths have been found. The probabilities of the state sequences under the initial HMM are 0.3, 0.35 and 0.05.
[Figure: trellis with the three paths through states s1-s3 over times 1...T, each labelled with its path probability]
New E(s2) = (0.3·X(1) + 0.35·X(2) + 0.05·X(3)) / 0.70, where X(k) denotes the observations that path k assigns to state s2.
Not as simple as it may look: many paths, partly shared.
7 Towards the Baum-Welch algorithm
Not feasible to compute over all individual possible state sequences - and not necessary.
The probability that the model has taken a certain transition at a certain time is independent of the history and the future (Markov assumption).
We only need to know their summed effect on the probability of every individual transition in the trellis diagram (time, state):
P(the model switches between states i and j at time t)
8 The Baum-Welch algorithm
P(the model switched from state i to state j at time t) = γ_t(i,j)
[Figure: trellis over the utterance t = 1...T, with the transitions contributing γ_t(2,3) and γ_t(2,2) marked]
The transition probability is estimated from the accumulated γ values, e.g. a_23 = Σ_t γ_t(2,3) / Σ_t Σ_j γ_t(2,j).
9 The Baum-Welch Algorithm - 2
γ_t(i,j): the probability of the model having taken the transition from state i to state j at time t and produced the observations
= the sum of the probabilities of all paths passing through (t-1, i) and (t, j), divided by the sum of the probabilities of all paths:
γ_t(i,j) = α_{t-1}(i) a_ij b_j(X_t) β_t(j) / Σ_{k=1}^N α_T(k)
10 Forward and Backward probabilities
Forward probability α_t(j): the probability of generating the partial observation X_1...X_t, ending at time t in state j:
α_t(j) = [Σ_{i=1}^N α_{t-1}(i) a_ij] b_j(X_t)
Backward probability β_t(i): the probability of generating the partial observation X_{t+1}...X_T, starting from time t in state i:
β_t(i) = Σ_{j=1}^N a_ij b_j(X_{t+1}) β_{t+1}(j),  t = T-1, ..., 1;  β_T(i) = 1/N
[Figure: computation of α_t(i) and β_t(i) for a linear, left-right model, t = 0...T]
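A minimal sketch of the two recursions for a discrete-observation HMM. The transition matrix, observation probabilities and symbol sequence are invented for illustration, and β_T(i) = 1 is used instead of the slide's 1/N (the choice only rescales β by a constant):

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """alpha[t, j]: probability of generating X_1..X_t and ending in state j.
    beta[t, i]:  probability of generating X_{t+1}..X_T starting from state i.
    pi: initial state probabilities; A[i, j] = a_ij; B[j, o] = b_j(o);
    obs: observation symbol indices."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))            # beta_T(i) = 1 here
    alpha[0] = pi * B[:, obs[0]]
    # alpha_t(j) = [sum_i alpha_{t-1}(i) a_ij] b_j(X_t)
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    # beta_t(i) = sum_j a_ij b_j(X_{t+1}) beta_{t+1}(j)
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return alpha, beta

pi = np.array([1.0, 0.0])
A = np.array([[0.7, 0.3], [0.0, 1.0]])       # small left-right model
B = np.array([[0.9, 0.1], [0.2, 0.8]])
alpha, beta = forward_backward(pi, A, B, [0, 1, 1])
p_forward = alpha[-1].sum()                  # P(X | model)
```

A useful sanity check: Σ_i α_t(i) β_t(i) gives the same total probability P(X | model) at every time t.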
11 The Baum-Welch Algorithm (cont.)
Def. γ_t(i,j): the probability of the model having taken the transition from state i to state j at time t:
γ_t(i,j) = P(the model has switched from state i to j at time t)
= P(the model generates the observed sequence and switches from state i to j at time t) / P(the model generates the observed sequence)
= α_{t-1}(i) a_ij b_j(X_t) β_t(j) / Σ_{k=1}^N α_T(k)
12 The Baum-Welch Algorithm (cont.)
New model estimates:
â_ij = Σ_{t=1}^T γ_t(i,j) / Σ_{t=1}^T Σ_{j=1}^N γ_t(i,j)   (8.40)
b̂_j(o_k) = Σ_{t: X_t = o_k} Σ_i γ_t(i,j) / Σ_{t=1}^T Σ_i γ_t(i,j)   (8.41)
Quite intuitive equations!
(8.40): the ratio between the expected number of transitions from state i to j and the expected number of all transitions from state i.
(8.41): the ratio between the expected number of times the observation emitted from state j is o_k and the expected number of times any observation is emitted from state j.
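Putting the pieces together, one re-estimation of the transition matrix following (8.40) can be sketched as below (the model parameters and symbol sequence are invented for illustration):

```python
import numpy as np

def reestimate_transitions(pi, A, B, obs):
    """One Baum-Welch update of the transition matrix, following (8.40):
    a_ij_hat = sum_t gamma_t(i,j) / sum_t sum_j gamma_t(i,j)."""
    T, N = len(obs), len(pi)
    # forward and backward passes
    alpha = np.zeros((T, N))
    beta = np.ones((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    p = alpha[-1].sum()               # P(X | model)
    # gamma_t(i,j) = alpha_{t-1}(i) a_ij b_j(X_t) beta_t(j) / P(X | model)
    counts = np.zeros((N, N))
    for t in range(1, T):
        counts += np.outer(alpha[t - 1], B[:, obs[t]] * beta[t]) * A / p
    # expected i->j transitions over expected transitions out of i
    return counts / counts.sum(axis=1, keepdims=True)

A_new = reestimate_transitions(
    np.array([0.6, 0.4]),
    np.array([[0.7, 0.3], [0.4, 0.6]]),
    np.array([[0.9, 0.1], [0.2, 0.8]]),
    [0, 1, 0, 1],
)
```

The normalisation makes the intuition of (8.40) concrete: each row of the updated matrix is a proper distribution over the next state by construction.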
13 8.3 Continuous and Semi-Continuous HMMs
The observation does not come from a finite set, but from a continuous space.
No quantization error.
14 The Baum-Welch algorithm - Continuous Mixture Density HMMs
P(the model switched to state j at time t using mixture component k) = ζ_t(j,k)
[Figure: trellis over the utterance t = 1...T with continuous observations x_t, x_{t+1}, x_{t+2}]
General idea: one arrow for each mixture component; the discrete symbol probability P(symbol) is replaced by N(μ,σ).
15 Continuous Mixture Density HMMs
A weighted sum of multivariate Gaussians:
b_j(x) = Σ_{k=1}^M c_jk N(x; μ_jk, Σ_jk)
M: number of mixture components; c_jk: the weight of the k-th component in state j.
ζ_t(j,k): the probability that there is a transition to state j and mixture component k at time t, given that the model generates the observed sequence:
ζ_t(j,k) = [Σ_{i=1}^N α_{t-1}(i) a_ij] c_jk N(x_t; μ_jk, Σ_jk) β_t(j) / Σ_{i=1}^N α_T(i)
Similar re-estimation equations as in the discrete case, but with an extra dimension k, the probability of each mixture component having produced the observation:
μ̂_jk = Σ_{t=1}^T ζ_t(j,k) x_t / Σ_{t=1}^T ζ_t(j,k)
Σ̂_jk = Σ_{t=1}^T ζ_t(j,k) (x_t − μ̂_jk)(x_t − μ̂_jk)' / Σ_{t=1}^T ζ_t(j,k)
ĉ_jk = Σ_{t=1}^T ζ_t(j,k) / Σ_{t=1}^T Σ_{k=1}^M ζ_t(j,k)
16 Discrete HMM
Observations are discrete symbols, e.g. VQ codeword indices.
[Figure: VQ space, codeword probabilities, HMM states]
17 Continuous HMM
Continuous observations.
The mixture components and the mixture weights are state-dependent.
[Figure: Gauss components, mixture weights, HMM states]
18 Semicontinuous (Tied-Mixture) HMM
Continuous observations.
The mixture component distributions are state-independent; the mixture weights are state-dependent.
[Figure: 1-dim. Gauss components common to all states; state-dependent mixture weights; HMM states]
19 8.3.2 Semicontinuous HMMs (SCHMM) (Tied-mixture HMM)
Bridging the gap between discrete and continuous mixture density HMMs.
As in the discrete HMM, a common codebook for all states - but continuous Gaussian pdfs instead of discrete symbols.
State-dependent mixture weights correspond to the discrete output probabilities b_j(o_k).
The observation probability is a sum of the individual probabilities of all mixture components:
b_j(x) = Σ_{k=1}^M b_j(o_k) N(x; μ_k, Σ_k)
Reduced number of parameters (vs continuous) but still detailed acoustic modeling (vs discrete).
Similar re-estimation as for the continuous HMM, except that the mixtures are state-independent.
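The tied-mixture observation probability can be sketched in one dimension; the codebook and the per-state weights below are invented for illustration:

```python
import math

def schmm_obs_prob(weights, codebook_means, codebook_sigmas, x):
    """Semicontinuous observation probability (1-dim. case):
    b_j(x) = sum_k b_j(o_k) * N(x; mu_k, sigma_k),
    where the Gaussian codebook (means, sigmas) is shared by all states
    and only the weights b_j(o_k) are state-dependent."""
    total = 0.0
    for w, mu, sigma in zip(weights, codebook_means, codebook_sigmas):
        gauss = math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
        total += w * gauss
    return total

# one shared codebook; two states differ only in their mixture weights
means, sigmas = [0.0, 4.0], [1.0, 1.0]
p_state1 = schmm_obs_prob([0.9, 0.1], means, sigmas, x=0.0)
p_state2 = schmm_obs_prob([0.1, 0.9], means, sigmas, x=0.0)
```

Both states reuse the same two Gaussians, which is where the parameter saving over a fully continuous HMM comes from: only the weight vectors are per-state.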
20 8.4 Practical Issues in Using HMMs
Initial Estimates
Model Topology
Training Criteria
Deleted Interpolation
Parameter Smoothing
Probability Representations
21 8.4.1 Initial Estimates
Important in order to reach a high local maximum.
Discrete HMM: an initial zero probability remains zero; a uniform distribution works reasonably well.
Continuous HMM methods: k-means clustering; proceed from discrete HMM to semi-continuous to continuous; start by training single-mixture models.
Use previously segmented data, or a flat start (equal model parameters for all states) on the training data.
22 8.4.2 Model Topology
Left-to-right: the most popular topology.
Number of states per model:
- Phone: three to five states
- Short words: 2-3 states per phoneme
- Long words: 1-2 states per phoneme
Null transitions: change state without consuming observations (e.g. initial and final states).
23 8.4.3 Training Criteria
Maximum Likelihood Estimation (MLE): sensitive to inaccurate Markov assumptions.
MCE and MMIE might work better; MAP is suitable for adaptation and small training data sets.
MLE is the most used: simplicity, superior performance.
MCE and MMIE for small and medium vocabularies.
Combinations possible.
24 8.4.4 Deleted Interpolation
Combine well-trained general models with less well-trained detailed models.
The combined probability is a weighted average (interpolation) of the probabilities of the separate models:
P_DI(x) = λ P_A(x) + (1−λ) P_B(x)
Examples: speaker-dependent and -independent models; context-free and context-dependent phone models; unigrams, bigrams and trigrams:
P(w3|w1,w2) = λ1 C(w1 w2 w3)/C(w1 w2) + λ2 C(w2 w3)/C(w2) + λ3 C(w3)/K
V-fold cross-validation to estimate λ: train on (v−1) parts, estimate λ on the remaining part, rotate.
EM algorithm: new λ = λ P(model A) / P(interpolated model DI)
25 Alg. 8.5 Deleted Interpolation Procedure
Step 1: Initialize λ with a guessed estimate.
Step 2: Update λ by the formula:
λ̂ = (1 / Σ_{j=1}^M n_j) Σ_{j=1}^M Σ_{t=1}^{n_j} λ P_A^{-j}(x_t^j) / (λ P_A^{-j}(x_t^j) + (1−λ) P_B^{-j}(x_t^j))
M: the number of training-data divisions; P_A^{-j} and P_B^{-j} are estimated on all the training data except part j; n_j: the number of data points in part j aligned to the model; x_t^j: the t-th data point in the j-th set.
Step 3: Repeat step 2 until convergence.
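Step 2 is an EM fixed-point iteration. A sketch, simplified to a single held-out set (M = 1) rather than the M rotating divisions of the algorithm; the held-out likelihoods below are made up:

```python
def estimate_lambda(p_a, p_b, lam=0.5, iters=50):
    """EM update for the interpolation weight: each held-out point
    contributes the posterior probability that model A generated it,
        lam * P_A(x) / (lam * P_A(x) + (1 - lam) * P_B(x)),
    and the new lambda is the average of these contributions."""
    for _ in range(iters):
        lam = sum(
            lam * a / (lam * a + (1 - lam) * b) for a, b in zip(p_a, p_b)
        ) / len(p_a)
    return lam

# the detailed model P_A beats the general model P_B on 3 of 4 points,
# so the weight settles well above 0.5 but stays strictly below 1
lam = estimate_lambda(p_a=[0.8, 0.7, 0.6, 0.1], p_b=[0.3, 0.3, 0.3, 0.3])
```

Because one held-out point still favours P_B, the converged weight keeps some mass on the general model - exactly the smoothing effect deleted interpolation is after.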
26 8.4.5 Parameter Smoothing
Compensate for insufficient training data:
- Increase the data ("There is no data like more data")
- Reduce the number of free parameters
- Deleted interpolation
- Parameter flooring to avoid small probability values
- Tying parameters (SCHMM)
- Covariance matrices: interpolate via MAP, tie matrices, use diagonal covariance matrices
27 8.4.6 Probability Representations
The probabilities become very small: underflow problem.
Viterbi decoding (only multiplication): use logarithms.
Forward-backward (multiplication and addition): more difficult.
Solution 1: scaling so that Σ_{i=1}^N α_t(i) = 1 at every time t.
Solution 2: logarithmic probabilities; use a look-up table to speed up log(P1 + P2).
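Solution 2 needs a log-addition: computing log(P1 + P2) from log P1 and log P2 without ever leaving the log domain. A sketch (the look-up table mentioned on the slide would tabulate the log(1 + e^d) term):

```python
import math

def log_add(lp1, lp2):
    """Return log(P1 + P2) given lp1 = log P1 and lp2 = log P2.

    Uses log(P1 + P2) = lp1 + log(1 + exp(lp2 - lp1)) with lp1 >= lp2,
    so only log(1 + e^d) for d <= 0 is ever evaluated; that is the
    function a decoder would replace with a look-up table for speed."""
    if lp1 < lp2:
        lp1, lp2 = lp2, lp1
    d = lp2 - lp1
    if d < -50.0:                 # second term is negligible
        return lp1
    return lp1 + math.log1p(math.exp(d))

# adding two probabilities far below the double-precision underflow
# threshold for products of many such terms
x = log_add(math.log(1e-300), math.log(1e-300))
```

The guard at d < -50 is a common optimisation: when one term dominates by more than ~50 nats, the smaller term is below machine precision anyway.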
28 8.5 HMM Limitations
Duration modeling
First-Order Assumption
Conditional Independence Assumption
29 8.5.1 Duration Modeling
HMM duration distribution: exponential decrease.
The probability of state duration τ is the probability of taking the self-loop τ−1 times multiplied by the probability of leaving the state:
d_i(τ) = a_ii^{τ−1} (1 − a_ii)
Different from real duration distributions (House & Crystal, 1986).
Maximum Likelihood search can model non-exponential distributions by a sequence of states.
Standard Viterbi cannot, but can be modified, at the expense of a large increase in computation.
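The exponentially decreasing duration distribution is easy to check numerically (a_ii = 0.8 is an arbitrary self-loop probability for illustration):

```python
def duration_pmf(a_ii, tau):
    """d_i(tau) = a_ii^(tau - 1) * (1 - a_ii): take the self-loop
    tau - 1 times, then leave the state. This is a geometric
    distribution, i.e. exponentially decreasing in tau, unlike
    measured phone durations."""
    return a_ii ** (tau - 1) * (1.0 - a_ii)

# probabilities for durations 1..199 frames
probs = [duration_pmf(0.8, tau) for tau in range(1, 200)]
```

The mode is always at τ = 1 and the tail decays geometrically, which is why the implicit HMM duration model fits real speech-segment durations poorly.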
30 Modified Viterbi for state duration modelling
Algorithm:
V_t(j) = max_τ max_i [ V_{t−τ}(i) a_ij d_j(τ) Π_{l=1}^τ b_j(X_{t−τ+l}) ]
Conventional Viterbi algorithm:
V_t(j) = max_i [ V_{t−1}(i) a_ij ] b_j(X_t)
Maximize over the previous state i and the starting time t−τ for each end time t.
Large increase in complexity, O(D²) (D = max duration); can be reduced (Seward, 2003).
Modest accuracy improvement.
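The conventional recursion can be sketched in the log domain; the model parameters and observation sequence are invented for illustration:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Conventional Viterbi: V_t(j) = max_i [V_{t-1}(i) a_ij] b_j(X_t),
    computed with log probabilities to avoid underflow.
    Returns the most likely state sequence."""
    T, N = len(obs), len(pi)
    logA, logB = np.log(A), np.log(B)
    V = np.log(pi) + logB[:, obs[0]]
    back = np.zeros((T, N), dtype=int)       # best predecessor per (t, j)
    for t in range(1, T):
        scores = V[:, None] + logA           # scores[i, j] = V(i) + log a_ij
        back[t] = scores.argmax(axis=0)
        V = scores.max(axis=0) + logB[:, obs[t]]
    # trace back from the best final state
    path = [int(V.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

best = viterbi(
    np.array([0.9, 0.1]),
    np.array([[0.8, 0.2], [0.2, 0.8]]),
    np.array([[0.9, 0.1], [0.1, 0.9]]),
    [0, 0, 1, 1],
)
```

The explicit-duration variant replaces the single max over i with the double max over i and τ shown above, which is where the O(D²) cost comes from.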
31 Recursion pattern for explicit duration Viterbi decoding
V_t(j) = max_τ max_i [ V_{t−τ}(i) a_ij d_j(τ) Π_{l=1}^τ b_j(X_{t−τ+l}) ]
[Figure: recursion pattern for an HMM with explicit duration (min = 2, max = 5) compared with the standard recursion, t = 1...T]
32 8.5.2 First-Order Assumption
In a first-order Markov chain, the transition from a state depends only on the current state.
A second-order Markov chain models the transition probability between two states as dependent on both the current and the previous state:
a_ijk = P(s_t = k | s_{t−1} = j, s_{t−2} = i)
Has not offered sufficient accuracy improvement to justify the increase in computational complexity.
33 8.5.3 Conditional Independence Assumption
It is assumed that all observation frames depend only on the state that generated them, not on any other frames in the observation sequence (neighboring or distant).
Difficult to handle non-stationary frames with strong correlation.
To include the dependence on the previous observation frame:
P(X|S,Φ) = Π_{t=1}^T P(X_t | X_{t−1}, s_t, Φ)
or
P(X|S,Φ) = Π_{t=1}^T P(X_t | R(X_{t−1}), s_t, Φ)
34 Distant repetitions of identical phonemes in an utterance are acoustically similar:
"ja de är väl fredag idag... så de blir väl ehh... fredag väll" (roughly: "well, it's Friday today... so it'll be, ehh... Friday then")
If the first [e:] has certain characteristics (due to accent, age, gender, etc.), then it is likely that the second one has them as well. Dependence!
A problem in speaker-independent ASR.
35 9.6 Adaptive Techniques - Minimizing Mismatches
There is always a mismatch between training and recognition conditions.
Adaptation: minimize the mismatch dynamically with little calibration data.
Supervised: knowledge of the correct identity.
Unsupervised: the recognition result is assumed to be correct.
36 9.6.1 Maximum a Posteriori (MAP)
A new model is estimated from the adaptation data interpolated with prior information about the model:
μ̂_i = ( τ_i μ_i + Σ_{t=1}^T ζ_t(i) x_t ) / ( τ_i + Σ_{t=1}^T ζ_t(i) )
τ_i is a balancing factor between the prior mean μ_i and the ML estimate; it can be a constant for all Gaussian components.
Similar for the covariance estimation.
Limitations: the prior model needs to be accurate; observations are needed for all models.
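The balancing behaviour of the MAP mean update is easy to see in one dimension; τ, the occupation counts and the adaptation frames below are all invented for illustration:

```python
def map_mean(prior_mean, tau, occupations, frames):
    """MAP estimate of a Gaussian mean:
    mu_hat = (tau * mu_prior + sum_t z_t * x_t) / (tau + sum_t z_t).
    tau balances the prior mean against the ML estimate from the
    adaptation frames x_t with occupation probabilities z_t."""
    num = tau * prior_mean + sum(z * x for z, x in zip(occupations, frames))
    den = tau + sum(occupations)
    return num / den

# little adaptation data: the estimate stays near the prior mean (0.0);
# much adaptation data: the estimate approaches the data mean (5.0)
mu_few = map_mean(0.0, tau=10.0, occupations=[1.0], frames=[5.0])
mu_many = map_mean(0.0, tau=10.0, occupations=[1.0] * 100, frames=[5.0] * 100)
```

This also shows the stated limitation: a Gaussian that receives no adaptation frames keeps its prior mean unchanged, so MAP needs observations for every model it should move.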
37 9.6.2 Maximum Likelihood Linear Regression (MLLR)
Linear regression functions transform mean and covariance to maximize the likelihood of the adaptation data:
μ̂_i = A_c μ_i + b_c
A_c is a regression matrix and b_c an additive vector for regression class c.
A and b can be estimated in a similar way as when training the continuous observation parameters; iterate for optimization.
Models not observed in the adaptation data are also updated.
If there is little adaptation data, use few regression classes.
Can adapt both means and variances; does not adapt transition probabilities.
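Applying a shared regression-class transform is a single affine map per Gaussian mean; A_c and b_c below are invented, not estimated from data:

```python
import numpy as np

def mllr_adapt_means(means, A_c, b_c):
    """Apply mu_hat = A_c @ mu + b_c to every Gaussian mean in one
    regression class. Because the transform is shared, Gaussians that
    were never observed in the adaptation data are moved as well."""
    return [A_c @ mu + b_c for mu in means]

A_c = np.array([[1.1, 0.0], [0.0, 0.9]])   # illustrative regression matrix
b_c = np.array([0.5, -0.2])                # illustrative bias vector
adapted = mllr_adapt_means(
    [np.array([1.0, 2.0]), np.array([0.0, 0.0])], A_c, b_c
)
```

Sharing one (A_c, b_c) pair across a whole class is what lets MLLR adapt with little data, and is the key contrast with MAP, which touches only the Gaussians it has seen.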
38 MLLR adaptation illustration
The transform for a class is optimized to maximize the likelihood that the adapted models generate the adaptation data.
[Figure: original models and adaptation data (X) on the left; after the transform, the adapted regression class has moved towards the data while the non-adapted regression class is unchanged]
39 Speaker-Adaptive Training (SAT)
Problem: large variation in the speaker-independent models; MLLR adaptation of variances is not very effective.
Solution: Speaker-Adaptive Training:
1. Adapt (MLLR) each training speaker to the speaker-independent model.
2. Use the adapted training data for each speaker to train a new speaker-independent model.
Reduces the variation by moving all speakers towards their common average.
Requires adaptation during recognition.
40 MLLR performance
Relative error reduction vs the CHMM baseline:
- MLLR on mean only: +2%
- MLLR on mean and variance: +2%
- MLLR + SAT: +8%
One context-independent regression class for all context-dependent phones with the same mid unit.
41 9.6.3 MLLR and MAP Comparison
MLLR is better for small adaptation data sets; MAP is better when the adaptation data set is large.
Combined MLLR+MAP is best in both cases.
42 9.6.4 Clustered Models
A single speaker- and environment-independent model often has too much variability for high-performance recognition and adaptation.
Cluster the training data into smaller partitions.
Gender-dependent models can reduce WER by 10%.
Speaker clustering can reduce it further, but not as much.
Environment clustering and adaptation: see Chapter 10.
dvanced cience and Technology Leers Vol.3 (ICC 213), pp.159-163 hp://dx.doi.org/1.14257/asl.213 dapive Noise Esimaion ased on Non-negaive Marix Facorizaion Kwang Myung Jeon and Hong Kook Kim chool of Informaion
More informationTemporal probability models. Chapter 15, Sections 1 5 1
Temporal probabiliy models Chaper 15, Secions 1 5 Chaper 15, Secions 1 5 1 Ouline Time and uncerainy Inerence: ilering, predicion, smoohing Hidden Markov models Kalman ilers (a brie menion) Dynamic Bayesian
More informationStatistical Machine Learning Methods for Bioinformatics I. Hidden Markov Model Theory
Saisical Machine Learning Mehods for Bioinformaics I. Hidden Markov Model Theory Jianlin Cheng, PhD Informaics Insiue, Deparmen of Compuer Science Universiy of Missouri 2009 Free for Academic Use. Copyrigh
More informationHW6: MRI Imaging Pulse Sequences (7 Problems for 100 pts)
HW6: MRI Imaging Pulse Sequences (7 Problems for 100 ps) GOAL The overall goal of HW6 is o beer undersand pulse sequences for MRI image reconsrucion. OBJECTIVES 1) Design a spin echo pulse sequence o image
More informationObject tracking: Using HMMs to estimate the geographical location of fish
Objec racking: Using HMMs o esimae he geographical locaion of fish 02433 - Hidden Markov Models Marin Wæver Pedersen, Henrik Madsen Course week 13 MWP, compiled June 8, 2011 Objecive: Locae fish from agging
More informationModeling and Forecasting Volatility Autoregressive Conditional Heteroskedasticity Models. Economic Forecasting Anthony Tay Slide 1
Modeling and Forecasing Volailiy Auoregressive Condiional Heeroskedasiciy Models Anhony Tay Slide 1 smpl @all line(m) sii dl_sii S TII D L _ S TII 4,000. 3,000.1.0,000 -.1 1,000 -. 0 86 88 90 9 94 96 98
More informationLearning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power
Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.
More informationThe Simple Linear Regression Model: Reporting the Results and Choosing the Functional Form
Chaper 6 The Simple Linear Regression Model: Reporing he Resuls and Choosing he Funcional Form To complee he analysis of he simple linear regression model, in his chaper we will consider how o measure
More informationGMM - Generalized Method of Moments
GMM - Generalized Mehod of Momens Conens GMM esimaion, shor inroducion 2 GMM inuiion: Maching momens 2 3 General overview of GMM esimaion. 3 3. Weighing marix...........................................
More informationVectorautoregressive Model and Cointegration Analysis. Time Series Analysis Dr. Sevtap Kestel 1
Vecorauoregressive Model and Coinegraion Analysis Par V Time Series Analysis Dr. Sevap Kesel 1 Vecorauoregression Vecor auoregression (VAR) is an economeric model used o capure he evoluion and he inerdependencies
More informationFinancial Econometrics Kalman Filter: some applications to Finance University of Evry - Master 2
Financial Economerics Kalman Filer: some applicaions o Finance Universiy of Evry - Maser 2 Eric Bouyé January 27, 2009 Conens 1 Sae-space models 2 2 The Scalar Kalman Filer 2 21 Presenaion 2 22 Summary
More informationLearning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power
Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.
More informationLAPLACE TRANSFORM AND TRANSFER FUNCTION
CHBE320 LECTURE V LAPLACE TRANSFORM AND TRANSFER FUNCTION Professor Dae Ryook Yang Spring 2018 Dep. of Chemical and Biological Engineering 5-1 Road Map of he Lecure V Laplace Transform and Transfer funcions
More informationProbabilistic learning
Probabilisic learning Charles Elkan November 8, 2012 Imporan: These lecure noes are based closely on noes wrien by Lawrence Saul. Tex may be copied direcly from his noes, or paraphrased. Also, hese ypese
More informationNature Neuroscience: doi: /nn Supplementary Figure 1. Spike-count autocorrelations in time.
Supplemenary Figure 1 Spike-coun auocorrelaions in ime. Normalized auocorrelaion marices are shown for each area in a daase. The marix shows he mean correlaion of he spike coun in each ime bin wih he spike
More informationMethodology. -ratios are biased and that the appropriate critical values have to be increased by an amount. that depends on the sample size.
Mehodology. Uni Roo Tess A ime series is inegraed when i has a mean revering propery and a finie variance. I is only emporarily ou of equilibrium and is called saionary in I(0). However a ime series ha
More informationSmoothing. Backward smoother: At any give T, replace the observation yt by a combination of observations at & before T
Smoohing Consan process Separae signal & noise Smooh he daa: Backward smooher: A an give, replace he observaion b a combinaion of observaions a & before Simple smooher : replace he curren observaion wih
More information14 Autoregressive Moving Average Models
14 Auoregressive Moving Average Models In his chaper an imporan parameric family of saionary ime series is inroduced, he family of he auoregressive moving average, or ARMA, processes. For a large class
More informationSpring Ammar Abu-Hudrouss Islamic University Gaza
Chaper 7 Reed-Solomon Code Spring 9 Ammar Abu-Hudrouss Islamic Universiy Gaza ١ Inroducion A Reed Solomon code is a special case of a BCH code in which he lengh of he code is one less han he size of he
More informationMachine Learning Methods for Bioinformatics I. Hidden Markov Model Theory
Machine Learning Mehods for Bioinformaics I. Hidden Markov Model Theory Jianlin Cheng, PhD Deparmen of Compuer Science Universiy of Missouri 202 Free for Academic Use. Copyrigh @ Jianlin Cheng. Wha s is
More informationChapter 5. Heterocedastic Models. Introduction to time series (2008) 1
Chaper 5 Heerocedasic Models Inroducion o ime series (2008) 1 Chaper 5. Conens. 5.1. The ARCH model. 5.2. The GARCH model. 5.3. The exponenial GARCH model. 5.4. The CHARMA model. 5.5. Random coefficien
More informationMachine Learning 4771
ony Jebara, Columbia Universiy achine Learning 4771 Insrucor: ony Jebara ony Jebara, Columbia Universiy opic 20 Hs wih Evidence H Collec H Evaluae H Disribue H Decode H Parameer Learning via JA & E ony
More informationMath 10B: Mock Mid II. April 13, 2016
Name: Soluions Mah 10B: Mock Mid II April 13, 016 1. ( poins) Sae, wih jusificaion, wheher he following saemens are rue or false. (a) If a 3 3 marix A saisfies A 3 A = 0, hen i canno be inverible. True.
More informationIntroduction D P. r = constant discount rate, g = Gordon Model (1962): constant dividend growth rate.
Inroducion Gordon Model (1962): D P = r g r = consan discoun rae, g = consan dividend growh rae. If raional expecaions of fuure discoun raes and dividend growh vary over ime, so should he D/P raio. Since
More informationLet us start with a two dimensional case. We consider a vector ( x,
Roaion marices We consider now roaion marices in wo and hree dimensions. We sar wih wo dimensions since wo dimensions are easier han hree o undersand, and one dimension is a lile oo simple. However, our
More informationSingle-loop System Reliability-based Topology Optimization Accounting for Statistical Dependence between Limit-states
11 h US Naional Congress on Compuaional Mechanics Minneapolis, Minnesoa Single-loop Sysem Reliabiliy-based Topology Opimizaion Accouning for Saisical Dependence beween imi-saes Tam Nguyen, Norheasern Universiy
More information2.160 System Identification, Estimation, and Learning. Lecture Notes No. 8. March 6, 2006
2.160 Sysem Idenificaion, Esimaion, and Learning Lecure Noes No. 8 March 6, 2006 4.9 Eended Kalman Filer In many pracical problems, he process dynamics are nonlinear. w Process Dynamics v y u Model (Linearized)
More informationDeep Learning: Theory, Techniques & Applications - Recurrent Neural Networks -
Deep Learning: Theory, Techniques & Applicaions - Recurren Neural Neworks - Prof. Maeo Maeucci maeo.maeucci@polimi.i Deparmen of Elecronics, Informaion and Bioengineering Arificial Inelligence and Roboics
More informationAn introduction to the theory of SDDP algorithm
An inroducion o he heory of SDDP algorihm V. Leclère (ENPC) Augus 1, 2014 V. Leclère Inroducion o SDDP Augus 1, 2014 1 / 21 Inroducion Large scale sochasic problem are hard o solve. Two ways of aacking
More informationEchocardiography Project and Finite Fourier Series
Echocardiography Projec and Finie Fourier Series 1 U M An echocardiagram is a plo of how a porion of he hear moves as he funcion of ime over he one or more hearbea cycles If he hearbea repeas iself every
More informationUnit Root Time Series. Univariate random walk
Uni Roo ime Series Univariae random walk Consider he regression y y where ~ iid N 0, he leas squares esimae of is: ˆ yy y y yy Now wha if = If y y hen le y 0 =0 so ha y j j If ~ iid N 0, hen y ~ N 0, he
More informationACE 562 Fall Lecture 4: Simple Linear Regression Model: Specification and Estimation. by Professor Scott H. Irwin
ACE 56 Fall 005 Lecure 4: Simple Linear Regression Model: Specificaion and Esimaion by Professor Sco H. Irwin Required Reading: Griffihs, Hill and Judge. "Simple Regression: Economic and Saisical Model
More informationNumerical Dispersion
eview of Linear Numerical Sabiliy Numerical Dispersion n he previous lecure, we considered he linear numerical sabiliy of boh advecion and diffusion erms when approimaed wih several spaial and emporal
More informationSupplement for Stochastic Convex Optimization: Faster Local Growth Implies Faster Global Convergence
Supplemen for Sochasic Convex Opimizaion: Faser Local Growh Implies Faser Global Convergence Yi Xu Qihang Lin ianbao Yang Proof of heorem heorem Suppose Assumpion holds and F (w) obeys he LGC (6) Given
More information3.1 More on model selection
3. More on Model selecion 3. Comparing models AIC, BIC, Adjused R squared. 3. Over Fiing problem. 3.3 Sample spliing. 3. More on model selecion crieria Ofen afer model fiing you are lef wih a handful of
More information2017 3rd International Conference on E-commerce and Contemporary Economic Development (ECED 2017) ISBN:
7 3rd Inernaional Conference on E-commerce and Conemporary Economic Developmen (ECED 7) ISBN: 978--6595-446- Fuures Arbirage of Differen Varieies and based on he Coinegraion Which is under he Framework
More informationTracking. Announcements
Tracking Tuesday, Nov 24 Krisen Grauman UT Ausin Announcemens Pse 5 ou onigh, due 12/4 Shorer assignmen Auo exension il 12/8 I will no hold office hours omorrow 5 6 pm due o Thanksgiving 1 Las ime: Moion
More informationArticle from. Predictive Analytics and Futurism. July 2016 Issue 13
Aricle from Predicive Analyics and Fuurism July 6 Issue An Inroducion o Incremenal Learning By Qiang Wu and Dave Snell Machine learning provides useful ools for predicive analyics The ypical machine learning
More information