Isolated-word speech recognition using hidden Markov models
Håkon Sandsmark

December 18, 2010

1 Introduction

Speech recognition is a challenging problem on which much work has been done over the last decades. Some of the most successful results have been obtained by using hidden Markov models, as explained by Rabiner in 1989 [1]. A well-working generic speech recognizer would enable more efficient communication for everybody, but especially for children, illiterate people and people with disabilities. A speech recognizer could also be a subsystem in a speech-to-speech translator.

The speech recognition system implemented during this project trains one hidden Markov model for each word that it should be able to recognize. The models are trained with labeled training data, and the classification is performed by passing the features to each model and then selecting the best match.

Figure 1: Flow chart of the system (speech, feature extraction, classification over the word models apple, banana, kiwi, lime, orange, peach, pineapple). The features extracted from the speech signal are passed to each word model and the best match is selected.
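The train-one-model-per-word design above makes classification an argmax over per-word model likelihoods. As an illustrative Python sketch (not the project's Matlab code): the forward-likelihood scoring is the algorithm derived in Section 2.2, the emission tables are assumed precomputed from each model's Gaussians, and the helper `B_for` is a hypothetical name of mine, not from the report.

```python
def forward_likelihood(A, pi, B):
    """Forward algorithm: A is the N x N transition matrix, pi the initial
    state probabilities, and B[t][s] = b_s(o_{t+1}) the precomputed emission
    densities for the observed feature sequence."""
    N = len(pi)
    alpha = [B[0][s] * pi[s] for s in range(N)]          # alpha_1
    for t in range(1, len(B)):
        # alpha_t(s) from alpha_{t-1}(s'), summing over predecessor states
        alpha = [B[t][s] * sum(A[sp][s] * alpha[sp] for sp in range(N))
                 for s in range(N)]
    return sum(alpha)                                    # f(o_1..o_T; lambda)

def classify(models, B_for):
    """Pick the word whose model most likely generated the features.

    models -- dict word -> (A, pi) for that word's trained HMM
    B_for  -- callable word -> emission table for the observed features
              evaluated under that word's Gaussians (hypothetical helper)
    """
    return max(models, key=lambda w: forward_likelihood(*models[w], B_for(w)))
```

In practice the products of densities underflow for long utterances, so implementations typically scale the forward variables per time step or work in log space.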
2 Background theory

2.1 Hidden Markov models

Basic knowledge of hidden Markov models is assumed, but the two most important algorithms used in this project will be described. The observable output from a hidden state is assumed to be generated by a multivariate Gaussian distribution, so there is one mean vector and covariance matrix for each state. We will also assume that the state transition probabilities are independent of time, such that the hidden Markov chain is homogeneous.

We will now define the notation for describing a hidden Markov model as used in this project. There is a total number of $N$ states. An element $a_{ss'}$ in the transition probability matrix $A$ denotes the transition probability from state $s$ to state $s'$, and the probability for the chain to start in state $s$ is $\pi_s$. The mean vector and covariance matrix for the multivariate Gaussian distribution modeling the observable output from state $s$ are $\mu_s$ and $\Sigma_s$, respectively. For an observation $o$, $b_s(o)$ denotes the probability density of the multivariate Gaussian distribution of state $s$ at the values of $o$. We will sometimes denote the collection of parameters describing the hidden Markov model as $\lambda = \{A, \pi, \mu, \Sigma\}$.

2.2 The forward algorithm

We want to calculate the probability density of an observation $o_1, \ldots, o_T$ for a specific model. This will be used to select the model (i.e. word) that most likely generated the speech signal.

\begin{align}
f(o_1, \ldots, o_T; \lambda)
&= \sum_{s_T} f(o_1, \ldots, o_T, s_T; \lambda) \tag{1} \\
&= \sum_{s_T} f(o_T \mid o_1, \ldots, o_{T-1}, s_T; \lambda) \, f(o_1, \ldots, o_{T-1}, s_T; \lambda) \tag{2} \\
&= \sum_{s_T} b_{s_T}(o_T) \sum_{s_{T-1}} f(o_1, \ldots, o_{T-1}, s_{T-1}, s_T; \lambda) \tag{3} \\
&= \sum_{s_T} b_{s_T}(o_T) \sum_{s_{T-1}} f(s_T \mid o_1, \ldots, o_{T-1}, s_{T-1}; \lambda) \, f(o_1, \ldots, o_{T-1}, s_{T-1}; \lambda) \tag{4} \\
&= \sum_{s_T} b_{s_T}(o_T) \sum_{s_{T-1}} a_{s_{T-1} s_T} \, f(o_1, \ldots, o_{T-1}, s_{T-1}; \lambda) \tag{5}
\end{align}

The recursive structure is revealed as we reduced the problem from needing $f(o_1, \ldots, o_T, s_T; \lambda)$ for all $s_T$ to needing $f(o_1, \ldots, o_{T-1}, s_{T-1}; \lambda)$ for all $s_{T-1}$. Let us introduce the forward variable to ease the notation.
\begin{align}
\alpha_1(s) &\triangleq f(o_1, S_1 = s; \lambda) \tag{6} \\
&= b_s(o_1) \, \pi_s \tag{7} \\
\alpha_t(s) &\triangleq f(o_1, \ldots, o_t, S_t = s; \lambda) \tag{8} \\
&= b_s(o_t) \sum_{s'} a_{s's} \, \alpha_{t-1}(s') \tag{9}
\end{align}

Then our solution can be expressed nicely as

\[
f(o_1, \ldots, o_T; \lambda) = \sum_s \alpha_T(s). \tag{10}
\]

Implemented naïvely top-down (backwards in time) this would not bring us any luck because of the exponentially recursive structure. The naïve algorithm is however easily convertible to an efficient variant using dynamic programming, where we calculate the forward variables bottom-up (forwards in time). We simply calculate $\alpha_t(s)$ for all states $s$, first for $t = 1$ and then all the way up to $t = T$. This way all the forward variables from the previous time step are readily available when needed.

2.3 The Baum-Welch algorithm

We want to find the parameters $\lambda$ that maximize the likelihood of the observations. This will be used to train the hidden Markov model with speech signals. The Baum-Welch algorithm is an iterative expectation-maximization (EM) algorithm that converges to a locally optimal solution from the initialization values. The M-step consists of updating the parameters in the following intuitive way:

\begin{align}
\pi_s &:= \hat{\pi}_s = \frac{\text{expected number of times in state } s \text{ at } t = 1}{\text{expected number of times at } t = 1} \tag{11} \\
a_{ss'} &:= \hat{a}_{ss'} = \frac{\text{expected number of transitions from } s \text{ to } s'}{\text{expected number of transitions from } s} \tag{12} \\
\mu_s &:= \hat{\mu}_s = \text{expected observation when in state } s \tag{13} \\
\Sigma_s &:= \hat{\Sigma}_s = \text{observation covariance when in state } s \tag{14}
\end{align}

The E-step thus consists of calculating these expectations for a fixed $\lambda$. Let $V_s^{(t)}$ denote the event of a transition from state $s$ at time step $t$, and $V_{s,s'}^{(t)}$ the event of a transition from $s$ to $s'$ at time $t$. Then we calculate these expectations by using indicator functions and linearity of expectation.
\begin{align}
\pi_s &= E\{1[V_s^{(1)}]\} = P(V_s^{(1)}) \tag{15} \\
a_{ss'} &= \frac{E\{\sum_t 1[V_{s,s'}^{(t)}]\}}{E\{\sum_t 1[V_s^{(t)}]\}} = \frac{\sum_t P(V_{s,s'}^{(t)})}{\sum_t P(V_s^{(t)})} \tag{16} \\
\mu_s &= \frac{E\{\sum_t 1[V_s^{(t)}] \, o_t\}}{E\{\sum_t 1[V_s^{(t)}]\}} = \frac{\sum_t P(V_s^{(t)}) \, o_t}{\sum_t P(V_s^{(t)})} \tag{17} \\
\Sigma_s &= \frac{E\{\sum_t 1[V_s^{(t)}] \, (o_t o_t^{\mathrm{T}} - \mu_s \mu_s^{\mathrm{T}})\}}{E\{\sum_t 1[V_s^{(t)}]\}} = \frac{\sum_t P(V_s^{(t)}) \, o_t o_t^{\mathrm{T}}}{\sum_t P(V_s^{(t)})} - \mu_s \mu_s^{\mathrm{T}} \tag{18}
\end{align}

Note that the non-italic $\mathrm{T}$ denotes transpose and has nothing to do with time. To be able to calculate these probabilities we first introduce the backward variable, which is very similar to the forward variable previously defined.

\begin{align}
\beta_T(s) &\triangleq 1 \tag{19} \\
\beta_t(s) &\triangleq f(o_{t+1}, \ldots, o_T \mid S_t = s; \lambda) \tag{20} \\
&= \sum_{s'} a_{ss'} \, b_{s'}(o_{t+1}) \, \beta_{t+1}(s') \tag{21}
\end{align}

The backward variable has its name because it is first calculated for the last time step and then backwards in time when implemented with dynamic programming (essentially the reverse procedure of the one described in detail for the forward variable). Then we rename the probabilities to the same symbols as used by Rabiner and express them by forward and backward variables:

\begin{align}
\gamma_t(s) &\triangleq P(V_s^{(t)}) = P(S_t = s \mid o_1, \ldots, o_T; \lambda) \tag{22} \\
&= \frac{f(o_1, \ldots, o_T \mid S_t = s) \, P(S_t = s)}{f(o_1, \ldots, o_T)} \tag{23} \\
&= \frac{f(o_1, \ldots, o_t, S_t = s) \, f(o_{t+1}, \ldots, o_T \mid S_t = s)}{f(o_1, \ldots, o_T)} \tag{24} \\
&= \frac{\alpha_t(s) \, \beta_t(s)}{f(o_1, \ldots, o_T)} \tag{25}
\end{align}

And similarly (details omitted):

\begin{align}
\xi_t(s, s') &\triangleq P(V_{s,s'}^{(t)}) = P(S_t = s, S_{t+1} = s' \mid o_1, \ldots, o_T; \lambda) \tag{26} \\
&= \frac{\alpha_t(s) \, b_{s'}(o_{t+1}) \, a_{ss'} \, \beta_{t+1}(s')}{f(o_1, \ldots, o_T)} \tag{27}
\end{align}

And finally we get the following:
\begin{align}
\pi_s &= \gamma_1(s) \tag{28} \\
a_{ss'} &= \frac{\sum_t \xi_t(s, s')}{\sum_t \gamma_t(s)} \tag{29} \\
\mu_s &= \frac{\sum_t \gamma_t(s) \, o_t}{\sum_t \gamma_t(s)} \tag{30} \\
\Sigma_s &= \frac{\sum_t \gamma_t(s) \, o_t o_t^{\mathrm{T}}}{\sum_t \gamma_t(s)} - \mu_s \mu_s^{\mathrm{T}} \tag{31}
\end{align}

To summarize, the E-step boils down to computing $\gamma_t(s)$ and $\xi_t(s, s')$ for all $s$, $s'$ and $t$ while the parameters $\lambda$ are fixed, and then the M-step updates $\lambda$ by using the calculations done in the E-step. This is iterated until convergence is deemed satisfactory.

3 System design

3.1 Feature extraction

The source speech is sampled at 8000 Hz and quantized with 16 bits. The signal is split up in short frames of 800 samples corresponding to 100 ms of speech. The frames overlap with 200 samples on each side. The idea is that the speech is close to stationary during this short period of time because of the relatively limited flexibility of the throat. We will pick out our features from the frequency domain, but before we get there by taking the fast Fourier transform, we multiply by a Hamming window to reduce spectral leakage caused by the framing of the signal.

Figure 2: An 800 sample frame of an unvoiced part of a speech signal. (a) Speech signal and Hamming window in the time domain. (b) Single-sided magnitude spectrum of the same speech signal multiplied by the Hamming window. Unvoiced speech, like "sh", is more noisy and contains higher frequencies than voiced speech.

The D largest local maxima from the single-sided magnitude spectrum are picked as features for each frame, and D is indeed an important parameter of the system that will be discussed later.
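The per-frame feature extraction described above can be sketched as follows (illustrative Python, not the report's Matlab; the naive O(n²) DFT stands in for the FFT, and the neighbor-comparison peak picking is my reading of "local maxima"):

```python
import cmath
import math

def spectral_peak_features(frame, fs, D):
    """Return the frequencies (Hz) of the D largest local maxima of the
    single-sided magnitude spectrum of a Hamming-windowed frame."""
    n = len(frame)
    # Hamming window, to reduce spectral leakage caused by the framing
    windowed = [x * (0.54 - 0.46 * math.cos(2 * math.pi * i / (n - 1)))
                for i, x in enumerate(frame)]
    # single-sided magnitude spectrum (naive DFT; use an FFT in practice)
    half = n // 2 + 1
    mag = [abs(sum(x * cmath.exp(-2j * math.pi * k * i / n)
                   for i, x in enumerate(windowed)))
           for k in range(half)]
    # local maxima of the magnitude spectrum, largest magnitude first
    peaks = sorted(((mag[k], k) for k in range(1, half - 1)
                    if mag[k - 1] < mag[k] > mag[k + 1]), reverse=True)
    # convert the D strongest peak bins to frequencies in Hz
    return [k * fs / n for _, k in peaks[:D]]
```

For a synthetic frame containing a strong 1000 Hz and a weaker 2500 Hz component, the two strongest peaks come out at those two frequencies.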
Figure 3: An 800 sample frame of a voiced part of a speech signal. (a) Speech signal and Hamming window in the time domain. (b) Single-sided magnitude spectrum of the same speech signal multiplied by the Hamming window.

3.2 Training

The training is a combination of both supervised and unsupervised techniques. We train one hidden Markov model per word with already classified speech signals. One important choice is the number of different states in each model. The goal is that each state should represent a phoneme in the word. The clustering of the Gaussians is however unsupervised and will depend on the initial values used for the Baum-Welch algorithm. For this project, totally random guesses (that obey the statistical properties) for $A$ and $\pi$ were used as initial values. For $\Sigma_s$, the diagonal covariance matrix of the training data was used for all states. For each state a random training data point was chosen as $\mu_s$. The training examples for each word are concatenated together, and Baum-Welch is run for 15 iterations.

3.3 Classification

Let $\lambda_i$ denote the parameter set for word $i$. When presented with an observation $o_1, \ldots, o_T$, the selection is done as follows.

\[
\text{predicted word} = \arg\max_i f(o_1, \ldots, o_T; \lambda_i) \tag{32}
\]

And we recognize that $f(o_1, \ldots, o_T; \lambda_i)$ is exactly what the forward algorithm computes.

4 Experimental setup and results

For each of the seven words, 15 utterances by this author were recorded. The performance of the system was measured by five-fold cross-validation on the recorded data set of 105 utterances. Experimentation indicated that the two most important parameters
were the number of hidden states, N, and the number of frequencies extracted from each frame, D. The cross-validation was therefore run with different values for these parameters, and the results are shown in Table 1.

N\D    ?      ?      ?      ?      ?      ?      ?
?      8.6%
3     21.0%  15.2%   9.5%  12.4%   1.9%  14.3%   5.7%
?     11.4%   8.6%   5.7%   3.8%   6.7%   4.8%
?      8.6%   9.5%   4.8%   2.9%   5.7%   4.8%
?      1.5%   3.8%   5.7%   7.6%   6.7%   1.5%
?     12.4%   6.7%   1.5%   7.6%   2.9%   8.6%
?      5.7%

Table 1: Misclassification rates for five-fold cross-validation with different values for the number of hidden states, N, and the number of frequencies extracted from each frame, D. Each five-fold cross-validation procedure takes about 7 minutes with the 105 utterances on a 2 GHz Intel Core 2 Duo (serial execution).

Figure 4: Fitted Gaussians after ten iterations of the Baum-Welch algorithm, for (a) apple, (b) lime, (c) orange and (d) peach. We have six states with one Gaussian each. The two most dominant frequencies (features) are shown, with axes F1 [Hz] and F2 [Hz]. Each green plus represents a frame from a training speech signal. The stars are the means of each Gaussian, and the ellipses indicate their 75% confidence interval. Notice the higher frequencies present in the words containing unvoiced phonemes ("peach" and "orange") compared to the words that do not ("apple" and "lime").

5 Discussion

The results are quite good compared to the simple approach taken, especially in the feature extraction phase. More advanced features like Mel-frequency cepstral coefficients were considered, but we decided on simple frequencies due to the low misclassification rates achieved.

It should be noted that this system would not perform well if trained and tested with different speakers. This is because of the different frequency characteristics of different voices, especially for speakers of different gender.

We also experimented with increasing the number of training iterations for the Baum-Welch algorithm, including setting a threshold on the likelihood difference between steps. That, however, proved to have little benefit in practice; neither the execution time nor the misclassification rate showed any noticeable improvements over just fixing the number of iterations to 15. The reason why the execution time did not show any significant improvements is that most of the execution time is spent during feature extraction, and not in training.

It is also interesting to note that when N is too small, there are many "apple"s misclassified as "pineapple"s, and vice versa, due to the loss of temporal information.

Another important parameter is the number of samples in each frame. If the frame is too small, it becomes hard to pick out meaningful features, and if it is too large, temporal information is lost. However, due to time constraints, we did not test anything else than 800 samples for this project.

The concatenation of the training examples trains a probability of transitioning from the last state to the initial state that is not needed for classification. Rabiner gives a modified Baum-Welch algorithm for multiple training examples such that concatenation is not necessary, but that was not implemented during this project as the concatenation seemed to work well.
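The E-step at the heart of the Baum-Welch training discussed above can be sketched as follows (illustrative Python, not the report's Matlab; emission densities are assumed precomputed, and the per-step scaling a real implementation needs to avoid underflow is omitted):

```python
def forward_backward(A, pi, B):
    """Compute the likelihood f(o_1..o_T; lambda) and gamma_t(s) of eq. (25).

    A  -- N x N transition matrix, pi -- initial state probabilities,
    B  -- T x N emission densities, B[t][s] = b_s(o_{t+1}).
    """
    N, T = len(pi), len(B)
    # forward pass: eqs. (7) and (9)
    alpha = [[B[0][s] * pi[s] for s in range(N)]]
    for t in range(1, T):
        prev = alpha[t - 1]
        alpha.append([B[t][s] * sum(A[sp][s] * prev[sp] for sp in range(N))
                      for s in range(N)])
    likelihood = sum(alpha[T - 1])            # eq. (10)
    # backward pass: eqs. (19) and (21)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for s in range(N):
            beta[t][s] = sum(A[s][sp] * B[t + 1][sp] * beta[t + 1][sp]
                             for sp in range(N))
    # eq. (25): gamma_t(s) = alpha_t(s) beta_t(s) / f(o_1..o_T)
    gamma = [[alpha[t][s] * beta[t][s] / likelihood for s in range(N)]
             for t in range(T)]
    return likelihood, gamma
```

The quantity xi_t(s, s') of eq. (27) comes from the same alpha and beta tables; a useful sanity check during training is that each gamma row sums to one.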
6 Conclusion and future work

During this project a system for isolated-word speech recognition was implemented and tested. The cross-validation results are good for a single speaker. Two obvious extensions are better support for several speakers, and support for continuous speech. The first step towards the former would be more, and more robust, features. For the latter the simplest approach is probably to detect word boundaries and then proceed with an isolated-word recognizer.

The Matlab implementation along with the data set is published as open source and can be found at http://code.google.com/p/hmm-speech-recognition/.

7 References

[1] L. R. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition," Proceedings of the IEEE, vol. 77, no. 2, pp. 257-286, Feb. 1989.

[2] C. M. Bishop, Pattern Recognition and Machine Learning (Information Science and Statistics). Springer, 1st ed. 2006, corr. 2nd printing, October 2007.

[3] X. Huang, A. Acero, and H.-W. Hon, Spoken Language Processing: A Guide to Theory, Algorithm and System Development. Prentice Hall PTR, May 2001.
More informationLecture 2 October ε-approximation of 2-player zero-sum games
Opimizaion II Winer 009/10 Lecurer: Khaled Elbassioni Lecure Ocober 19 1 ε-approximaion of -player zero-sum games In his lecure we give a randomized ficiious play algorihm for obaining an approximae soluion
More informationAn EM algorithm for maximum likelihood estimation given corrupted observations. E. E. Holmes, National Marine Fisheries Service
An M algorihm maimum likelihood esimaion given corruped observaions... Holmes Naional Marine Fisheries Service Inroducion M algorihms e likelihood esimaion o cases wih hidden saes such as when observaions
More informationA Bayesian Approach to Spectral Analysis
Chirped Signals A Bayesian Approach o Specral Analysis Chirped signals are oscillaing signals wih ime variable frequencies, usually wih a linear variaion of frequency wih ime. E.g. f() = A cos(ω + α 2
More informationOnline Convex Optimization Example And Follow-The-Leader
CSE599s, Spring 2014, Online Learning Lecure 2-04/03/2014 Online Convex Opimizaion Example And Follow-The-Leader Lecurer: Brendan McMahan Scribe: Sephen Joe Jonany 1 Review of Online Convex Opimizaion
More informationCHAPTER 2 Signals And Spectra
CHAPER Signals And Specra Properies of Signals and Noise In communicaion sysems he received waveform is usually caegorized ino he desired par conaining he informaion, and he undesired par. he desired par
More informationChapter 6. Systems of First Order Linear Differential Equations
Chaper 6 Sysems of Firs Order Linear Differenial Equaions We will only discuss firs order sysems However higher order sysems may be made ino firs order sysems by a rick shown below We will have a sligh
More informationPresentation Overview
Acion Refinemen in Reinforcemen Learning by Probabiliy Smoohing By Thomas G. Dieerich & Didac Busques Speaer: Kai Xu Presenaion Overview Bacground The Probabiliy Smoohing Mehod Experimenal Sudy of Acion
More informationAsymptotic Equipartition Property - Seminar 3, part 1
Asympoic Equipariion Propery - Seminar 3, par 1 Ocober 22, 2013 Problem 1 (Calculaion of ypical se) To clarify he noion of a ypical se A (n) ε and he smalles se of high probabiliy B (n), we will calculae
More informationTime series model fitting via Kalman smoothing and EM estimation in TimeModels.jl
Time series model fiing via Kalman smoohing and EM esimaion in TimeModels.jl Gord Sephen Las updaed: January 206 Conens Inroducion 2. Moivaion and Acknowledgemens....................... 2.2 Noaion......................................
More informationLearning a Class from Examples. Training set X. Class C 1. Class C of a family car. Output: Input representation: x 1 : price, x 2 : engine power
Alpaydin Chaper, Michell Chaper 7 Alpaydin slides are in urquoise. Ehem Alpaydin, copyrigh: The MIT Press, 010. alpaydin@boun.edu.r hp://www.cmpe.boun.edu.r/ ehem/imle All oher slides are based on Michell.
More informationIntroduction to Probability and Statistics Slides 4 Chapter 4
Inroducion o Probabiliy and Saisics Slides 4 Chaper 4 Ammar M. Sarhan, asarhan@mahsa.dal.ca Deparmen of Mahemaics and Saisics, Dalhousie Universiy Fall Semeser 8 Dr. Ammar Sarhan Chaper 4 Coninuous Random
More informationBU Macro BU Macro Fall 2008, Lecture 4
Dynamic Programming BU Macro 2008 Lecure 4 1 Ouline 1. Cerainy opimizaion problem used o illusrae: a. Resricions on exogenous variables b. Value funcion c. Policy funcion d. The Bellman equaion and an
More information5. Stochastic processes (1)
Lec05.pp S-38.45 - Inroducion o Teleraffic Theory Spring 2005 Conens Basic conceps Poisson process 2 Sochasic processes () Consider some quaniy in a eleraffic (or any) sysem I ypically evolves in ime randomly
More informationNavneet Saini, Mayank Goyal, Vishal Bansal (2013); Term Project AML310; Indian Institute of Technology Delhi
Creep in Viscoelasic Subsances Numerical mehods o calculae he coefficiens of he Prony equaion using creep es daa and Herediary Inegrals Mehod Navnee Saini, Mayank Goyal, Vishal Bansal (23); Term Projec
More informationChapter 15: Phenomena. Chapter 15 Chemical Kinetics. Reaction Rates. Reaction Rates R P. Reaction Rates. Rate Laws
Chaper 5: Phenomena Phenomena: The reacion (aq) + B(aq) C(aq) was sudied a wo differen emperaures (98 K and 35 K). For each emperaure he reacion was sared by puing differen concenraions of he 3 species
More informationRL Lecture 7: Eligibility Traces. R. S. Sutton and A. G. Barto: Reinforcement Learning: An Introduction 1
RL Lecure 7: Eligibiliy Traces R. S. Suon and A. G. Baro: Reinforcemen Learning: An Inroducion 1 N-sep TD Predicion Idea: Look farher ino he fuure when you do TD backup (1, 2, 3,, n seps) R. S. Suon and
More informationRobust estimation based on the first- and third-moment restrictions of the power transformation model
h Inernaional Congress on Modelling and Simulaion, Adelaide, Ausralia, 6 December 3 www.mssanz.org.au/modsim3 Robus esimaion based on he firs- and hird-momen resricions of he power ransformaion Nawaa,
More informationReliability of Technical Systems
eliabiliy of Technical Sysems Main Topics Inroducion, Key erms, framing he problem eliabiliy parameers: Failure ae, Failure Probabiliy, Availabiliy, ec. Some imporan reliabiliy disribuions Componen reliabiliy
More informationDeep Learning: Theory, Techniques & Applications - Recurrent Neural Networks -
Deep Learning: Theory, Techniques & Applicaions - Recurren Neural Neworks - Prof. Maeo Maeucci maeo.maeucci@polimi.i Deparmen of Elecronics, Informaion and Bioengineering Arificial Inelligence and Roboics
More informationGINI MEAN DIFFERENCE AND EWMA CHARTS. Muhammad Riaz, Department of Statistics, Quaid-e-Azam University Islamabad,
GINI MEAN DIFFEENCE AND EWMA CHATS Muhammad iaz, Deparmen of Saisics, Quaid-e-Azam Universiy Islamabad, Pakisan. E-Mail: riaz76qau@yahoo.com Saddam Akbar Abbasi, Deparmen of Saisics, Quaid-e-Azam Universiy
More informationSUPPLEMENTARY INFORMATION
SUPPLEMENTARY INFORMATION DOI: 0.038/NCLIMATE893 Temporal resoluion and DICE * Supplemenal Informaion Alex L. Maren and Sephen C. Newbold Naional Cener for Environmenal Economics, US Environmenal Proecion
More informationSome Basic Information about M-S-D Systems
Some Basic Informaion abou M-S-D Sysems 1 Inroducion We wan o give some summary of he facs concerning unforced (homogeneous) and forced (non-homogeneous) models for linear oscillaors governed by second-order,
More informationSimulation-Solving Dynamic Models ABE 5646 Week 2, Spring 2010
Simulaion-Solving Dynamic Models ABE 5646 Week 2, Spring 2010 Week Descripion Reading Maerial 2 Compuer Simulaion of Dynamic Models Finie Difference, coninuous saes, discree ime Simple Mehods Euler Trapezoid
More informationKriging Models Predicting Atrazine Concentrations in Surface Water Draining Agricultural Watersheds
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 Kriging Models Predicing Arazine Concenraions in Surface Waer Draining Agriculural Waersheds Paul L. Mosquin, Jeremy Aldworh, Wenlin Chen Supplemenal Maerial Number
More informationDesigning Information Devices and Systems I Spring 2019 Lecture Notes Note 17
EES 16A Designing Informaion Devices and Sysems I Spring 019 Lecure Noes Noe 17 17.1 apaciive ouchscreen In he las noe, we saw ha a capacior consiss of wo pieces on conducive maerial separaed by a nonconducive
More informationStochastic models and their distributions
Sochasic models and heir disribuions Couning cusomers Suppose ha n cusomers arrive a a grocery a imes, say T 1,, T n, each of which akes any real number in he inerval (, ) equally likely The values T 1,,
More information