MODELING SPEECH PARAMETER SEQUENCES WITH LATENT TRAJECTORY HIDDEN MARKOV MODEL. Hirokazu Kameoka
2015 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, SEPT. 17–20, 2015, BOSTON, USA

Hirokazu Kameoka
Nippon Telegraph and Telephone Corporation / The University of Tokyo

ABSTRACT

This paper proposes a probabilistic generative model of a sequence of vectors called the latent trajectory hidden Markov model (HMM). While a conventional HMM is only capable of describing piecewise stationary sequences of data vectors, the proposed model is capable of describing continuously time-varying sequences of data vectors, governed by discrete hidden states. This feature is noteworthy in that it can be used to model many kinds of time series data that are continuous in nature, such as speech spectra. Given a sequence of observed data, the optimal state sequence can be decoded using the expectation-maximization (EM) algorithm. Given a set of training examples, the underlying model parameters can be trained by either the expectation-maximization algorithm or the variational inference algorithm.

Index Terms: Sequential modeling, hidden Markov model (HMM), trajectory HMM, latent trajectory HMM, expectation-maximization algorithm, variational inference

1. INTRODUCTION

The weakness of hidden Markov models (HMMs) is that they have difficulty in modeling and capturing the local dynamics of feature sequences, due to the piecewise stationarity assumption and the conditional independence assumption on feature sequences. Traditionally, in speech recognition systems, this limitation has been circumvented by appending dynamic (delta and delta-delta) components to the feature vectors. HMM-based speech synthesis systems [1] also use the joint vector of static and dynamic features as an observed vector in the training process. In the synthesis process, on the other hand, a sequence of static features is generated according to the output probabilities of the trained HMM given an input sentence, by taking account of the explicit constraint between the static and dynamic features [2]. Although the HMM-based speech synthesis framework has many attractive features, one drawback is that the criteria used for training and synthesis are inconsistent.
While the joint likelihood of the static and dynamic features is maximized during the training process, the likelihood of only the static features is maximized during the synthesis process. This implies that the model parameters are not trained in such a way that the generated parameter sequences become optimal. To address this problem, Zen [3] introduced a variant of the HMM called the trajectory HMM, which was obtained by incorporating the explicit relationship between static and dynamic features into the traditional HMM. This has made it possible to provide a unified framework for the training and synthesis of speech parameter sequences; however, it causes difficulty as regards parameter inference. Since the conditional independence assumption on the feature vectors is lost, efficient algorithms for training and decoding regular HMMs, such as the Viterbi algorithm and the forward-backward algorithm, are no longer applicable to the trajectory HMM. Thus, some approximations and brute-force methods are usually necessary to obtain training and decoding algorithms [3, 4].

In this paper, we propose formulating a new model called the latent trajectory HMM. In contrast with the conventional trajectory HMM, the present model splits the generative process of an observed feature sequence into two processes: one for a sequence of the joint vectors of static and dynamic features given HMM states, and the other for an observed feature sequence given the sequence of the joint vectors. By treating the joint vector of static and dynamic features as a latent variable to be marginalized out, we obtain a probability density function of an observed feature sequence with a different form from the likelihood function of the trajectory HMM.

(This work was supported by JSPS KAKENHI Grant Numbers 6731 and 686. The author would like to thank Dr. Tomoki Toda (NAIST) for fruitful discussions and Mr. Tomohiko Nakamura (University of Tokyo) for his help with conducting the experiments.)
As described below, this new formulation naturally allows the combined use of powerful inference techniques such as the expectation-maximization (EM) algorithm, the Viterbi algorithm and the forward-backward algorithm for training and decoding, while still retaining the spirit of the original trajectory HMM. This work is not only directed toward speech synthesis applications but also toward several different applications such as voice conversion and acoustic-to-articulatory mapping, in which trajectory modeling has proven to be effective [5–7]. Another interesting application we have in mind is audio source separation. Recently, we proposed methods for single- and multi-channel audio source separation based on factorial HMMs [8–11], in which the spectrogram of a mixture signal is modeled as the sum of the outputs emitted from multiple HMMs, each representing the spectrogram of an underlying source. One promising way to improve this approach would be to incorporate the dynamics of source spectra. This can be accomplished by plugging the present model into the factorial HMM formulation. The present formulation will play a key role in making this possible.
2. TRAJECTORY HIDDEN MARKOV MODEL

We start by briefly reviewing the original formulation of the trajectory HMM [3]. Let us use c_t to denote a D-dimensional static feature vector and define the joint vector of c_t and its velocity and acceleration components, o_t := [c_t^T, Δc_t^T, Δ²c_t^T]^T ∈ R^{3D}, as the observed vector at time t. We write the sequences of the static features and the observed vectors as c = [c_1^T, ..., c_T^T]^T and o = [o_1^T, ..., o_T^T]^T, respectively. Thus, the dimensions of c and o become DT and 3DT. The relationship between c and o can be described explicitly using a constant 3DT-by-DT matrix W as

o = Wc,  (1)

where W is a sparse matrix that appends first- and second-order time derivatives to the static feature vector sequence. Within the traditional HMM framework, a sequence of observed vectors, o, is simply assumed to be generated from an HMM. Here, if we assume the emission probability density to be a single Gaussian distribution, the probability density function of o given a state sequence s = [s_1, ..., s_T] and an HMM parameter set λ = {μ, U, π}, with μ = {μ_i}_{1≤i≤I}, U = {U_i}_{1≤i≤I}, and π = {π_{i,j}}_{1≤i,j≤I}, is given as

p(o|s, λ) = N(o; μ_s, U_s) = ∏_{t=1}^T N(o_t; μ_{s_t}, U_{s_t}),  (2)

where N(x; μ, Σ) denotes a Gaussian distribution with mean μ and covariance Σ:

N(x; μ, Σ) ∝ |Σ|^{-1/2} exp{−(1/2) (x − μ)^T Σ^{-1} (x − μ)},  (3)

and μ_s and U_s denote the mean sequence and a block-diagonal matrix whose diagonal elements are given by the sequence of the covariance matrices of the emission densities over time:

μ_s = [μ_{s_1}^T, ..., μ_{s_T}^T]^T,  (4)
U_s = diag(U_{s_1}, ..., U_{s_T}).  (5)

In HMM-based speech synthesis systems, the parameter set is typically trained by solving the maximum likelihood estimation problem

λ̂ = argmax_λ log Σ_s p(o|s, λ) p(s),  (6)

where p(s) is given by the product of the state transition probabilities. At the synthesis stage, given a state sequence s and with the trained parameters λ, a static feature sequence c is generated according to

ĉ = argmax_c p(c|s, λ),  (7)

where p(c|s, λ) is defined as

p(c|s, λ) ∝ N(Wc; μ_s, U_s)  (8)
∝ exp{−(1/2) (c^T W^T U_s^{-1} W c − 2 c^T W^T U_s^{-1} μ_s)}  (9)
= N(c; c̄_s, V_s).  (10)

Fig. 1. Illustration that shows the difference between the HMM and the trajectory HMM.
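The matrix W in (1) is easy to materialize for small problems. The sketch below builds a dense W in numpy, assuming the common delta windows Δc_t = (c_{t+1} − c_{t−1})/2 and Δ²c_t = c_{t+1} − 2c_t + c_{t−1} with clamped boundaries; the paper does not specify its window coefficients, so these, and the function name, are illustrative choices.

```python
import numpy as np

def make_window_matrix(T, D):
    """Build the 3DT x DT matrix W mapping a static sequence c to
    o = Wc, the stacked static/delta/delta-delta sequence.
    Assumed windows: delta = (c[t+1] - c[t-1]) / 2,
    delta-delta = c[t+1] - 2 c[t] + c[t-1], boundaries clamped."""
    I = np.eye(D)
    W = np.zeros((3 * D * T, D * T))
    for t in range(T):
        tm, tp = max(t - 1, 0), min(t + 1, T - 1)
        row = 3 * D * t
        W[row:row + D, D * t:D * (t + 1)] = I                    # static rows
        W[row + D:row + 2 * D, D * tm:D * (tm + 1)] += -0.5 * I  # delta rows
        W[row + D:row + 2 * D, D * tp:D * (tp + 1)] += 0.5 * I
        W[row + 2 * D:row + 3 * D, D * tm:D * (tm + 1)] += I     # delta-delta rows
        W[row + 2 * D:row + 3 * D, D * t:D * (t + 1)] += -2 * I
        W[row + 2 * D:row + 3 * D, D * tp:D * (tp + 1)] += I
    return W
```

Note that each row of W touches at most three frames, which is the sparsity and band structure the later sections exploit.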
By completing the square in the exponent of (9) with respect to c, we immediately obtain c̄_s and V_s as

c̄_s = (W^T U_s^{-1} W)^{-1} W^T U_s^{-1} μ_s,  (11)
V_s = (W^T U_s^{-1} W)^{-1}.  (12)

Thus, the solution to (7) is c̄_s. Geometrically, (10) can be viewed as a cutting plane of the density p(o|s, λ) at o = Wc.

As shown above, the traditional HMM-based framework uses different criteria for training and synthesis: while p(o|s, λ) is used for training, p(c|s, λ) is used for synthesis. This implies that λ̂ is not necessarily optimal for generating an optimal c. To address this inconsistency between the training and synthesis criteria, Zen [3] proposed introducing a framework called the trajectory HMM, which also uses (10) as the training criterion. Instead of solving (6), the parameter set λ is thus trained by solving

{λ̂, ŝ} = argmax_{λ,s} log p(c|s, λ) p(s),  (13)

where c is treated as the observed data. Unlike the regular HMM, the conditional independence assumption on the observed vectors does not hold in the trajectory HMM: while the regular HMM assumes that each observed vector depends only on the current state, (10) indicates that the observed vector c_t at each frame depends on the entire state sequence. This implies that it is difficult to directly apply the efficient decoding and training algorithms used in the HMM framework (such as the Viterbi algorithm and the forward-backward algorithm). Thus, some approximations and brute-force methods are usually necessary to perform training and decoding [3]. Because of this, the decoding algorithm is not guaranteed to find the optimal state sequence, and the training algorithm is not guaranteed to converge to a locally optimal solution. Note that this also applies to the minimum generation error (MGE) training framework [4], which uses (10), with V_s replaced by an identity matrix, as the training criterion.
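The generation step (11) is a linear least-squares solve; in practice one never forms the inverse explicitly. Below is a minimal numpy sketch for D = 1 with dense matrices and the same illustrative window coefficients as before (real systems exploit the band structure of W^T U^{-1} W; the function names are hypothetical).

```python
import numpy as np

def window_matrix(T):
    """Static/delta/delta-delta window matrix for D = 1 (clamped ends)."""
    W = np.zeros((3 * T, T))
    for t in range(T):
        tm, tp = max(t - 1, 0), min(t + 1, T - 1)
        W[3 * t, t] = 1.0
        W[3 * t + 1, tm] += -0.5
        W[3 * t + 1, tp] += 0.5
        W[3 * t + 2, tm] += 1.0
        W[3 * t + 2, t] += -2.0
        W[3 * t + 2, tp] += 1.0
    return W

def generate_static(mu, U_inv, W):
    """Eq. (11): c = (W^T U^-1 W)^-1 W^T U^-1 mu, via a linear solve."""
    A = W.T @ U_inv @ W
    b = W.T @ U_inv @ mu
    return np.linalg.solve(A, b)
```

If the target mean sequence μ is itself consistent (μ = Wc for some c), the solve recovers that c exactly, which is a convenient sanity check.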
3. LATENT TRAJECTORY HMM

3.1. Model

While the HMM is only capable of describing piecewise stationary sequences of data vectors, the trajectory HMM is capable of describing continuously varying sequences of data vectors, governed by discrete hidden states. This feature is notable in that it can be used to model many kinds of time series data that are continuous in nature; however, it causes a difficulty as regards parameter inference. We propose introducing a conceptually similar framework based on a different formulation, which is advantageous in that it alleviates the difficulty related to parameter inference.

Instead of treating o as a function of c, we treat o as a latent variable that is related to c through a soft constraint o ≈ Wc. The relationship o ≈ Wc can be expressed through the conditional distribution p(c|o). For example, we can define p(c|o) as

p(c|o) ∝ exp{−(1/2) (Wc − o)^T Λ (Wc − o)},  (14)

where Λ is a constant positive definite matrix that can be set arbitrarily. Indeed, this probability density function becomes larger as o approaches Wc. By completing the square in the exponent of (14) with respect to c, we can write p(c|o) explicitly as

p(c|o) = N(c; m_{c|o}, Λ_{c|o}^{-1}),  (15)

where

m_{c|o} = Ho,  (16)
H = (W^T Λ W)^{-1} W^T Λ,  (17)
Λ_{c|o} = W^T Λ W.  (18)

By using this and p(o|s, λ) defined in (2), we can write p(c|s, λ) as

p(c|s, λ) = ∫ p(c|o) p(o|s, λ) do,  (19)

in a different way from (10). Geometrically, this can be viewed as a marginal distribution of the set of the projected values of o onto the subspace o = Wc. From (2) and (15), the joint likelihood p(c, o|s, λ) can be written as

p(c, o|s, λ) = p(c|o) p(o|s, λ) ∝ exp{−(1/2) (x − m_x)^T Λ_x (x − m_x)},  (20)

where x = [c^T, o^T]^T and

m_x = Λ_x^{-1} [0; U_s^{-1} μ_s],  (21)
Λ_x = [Λ_{c|o}, −W^T Λ; −ΛW, H^T Λ_{c|o} H + U_s^{-1}].  (22)

Thus, p(x|s, λ) = N(x; m_x, Λ_x^{-1}). By using the blockwise matrix inversion formula, Λ_x^{-1} is given as

Σ_x = Λ_x^{-1} = [Σ_cc, Σ_co; Σ_oc, Σ_oo],  (23)

where

Σ_cc = Λ_{c|o}^{-1} + H U_s H^T,  (24)
Σ_co = H U_s,  (25)
Σ_oc = U_s H^T,  (26)
Σ_oo = U_s.  (27)

Hence, (19) can be written as

p(c|s, λ) = N(c; Hμ_s, Σ_cc).  (28)

We call this model the latent trajectory HMM.
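The marginal (28) can be computed directly from (16)–(18) and (24). The dense numpy sketch below does this for a toy problem (D = 1, illustrative window coefficients, hypothetical function names); as the text notes, choosing Λ = U_s^{-1} makes Hμ_s coincide with the trajectory-HMM mean c̄_s of (11), which is a useful check.

```python
import numpy as np

def window_matrix(T):
    """D = 1 static/delta/delta-delta window matrix, clamped ends."""
    W = np.zeros((3 * T, T))
    for t in range(T):
        tm, tp = max(t - 1, 0), min(t + 1, T - 1)
        W[3 * t, t] = 1.0
        W[3 * t + 1, tm] += -0.5
        W[3 * t + 1, tp] += 0.5
        W[3 * t + 2, tm] += 1.0
        W[3 * t + 2, t] += -2.0
        W[3 * t + 2, tp] += 1.0
    return W

def latent_traj_marginal(mu, U, Lam, W):
    """p(c|s,lam) = N(c; H mu, Sigma_cc) with H = (W^T Lam W)^-1 W^T Lam
    (eq. 17), Lam_c_o = W^T Lam W (eq. 18), Sigma_cc = Lam_c_o^-1 +
    H U H^T (eq. 24)."""
    WtLam = W.T @ Lam
    Lam_c_o = WtLam @ W
    H = np.linalg.solve(Lam_c_o, WtLam)
    return H @ mu, np.linalg.inv(Lam_c_o) + H @ U @ H.T
```

Unlike the trajectory HMM likelihood, this marginal is an ordinary Gaussian in c whose mean and covariance are explicit functions of the state sequence.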
With this framework, given a state sequence s and a parameter set λ, c is generated according to

ĉ = argmax_c p(c|s, λ).  (29)

Obviously, the solution to this is Hμ_s. It is important to note that with this framework, the parameter inference problem can be dealt with using the expectation-maximization (EM) algorithm by treating the joint vector [c^T, o^T]^T as the complete data.

3.2. Decoding and training algorithms

As with the trajectory HMM framework, the present framework uses p(c|s, λ) for feature sequence generation, state decoding and parameter training in a consistent manner. The problems of state decoding and parameter training can be formulated as the following optimization problems:

ŝ = argmax_s log p(c|s, λ) p(s),  (30)
{λ̂, ŝ} = argmax_{λ,s} log p(c|s, λ) p(s).  (31)

Since the decoding problem (30) is a subproblem of the training problem (31), here we only derive an algorithm for solving the training problem (31). By regarding the set consisting of c and o as the complete data, this problem can be viewed as an incomplete data problem, which can be dealt with using the expectation-maximization (EM) algorithm. The likelihood of s and λ given the complete data is given by (20). By taking the conditional expectation of log p(c, o|s, λ) with respect to o given c, s = s′ and λ = λ′, and then adding log p(s), we obtain an auxiliary function

Q(s, λ) := E_{o|c,s′,λ′}[log p(c, o|s, λ)] + log p(s).  (32)

By leaving only the terms that depend on s and λ, Q(s, λ) can be written as

Q(s, λ) ≐ E_{o|c,s′,λ′}[log p(o|s, λ)] + log p(s)
= −(1/2) { log|U_s| + Tr(U_s^{-1} R̄) − 2 μ_s^T U_s^{-1} ō + μ_s^T U_s^{-1} μ_s } + log p(s),  (33)

where

ō = E_{o|c,s′,λ′}[o] = μ_{s′} + Σ′_oc Σ′_cc^{-1} (c − Hμ_{s′}),  (34)
R̄ = E_{o|c,s′,λ′}[oo^T] = Σ′_oo − Σ′_oc Σ′_cc^{-1} Σ′_co + ō ō^T.  (35)

Here, the prime marks indicate the values obtained using the model parameters updated at the previous iteration. Since U_s is a block-diagonal matrix, as given in (5), (33) can be decomposed into the sum of T individual terms:

Q(s, λ) ≐ −(1/2) Σ_{t=1}^T { log|U_{s_t}| + Tr[U_{s_t}^{-1} R̄_t] − 2 μ_{s_t}^T U_{s_t}^{-1} ō_t + μ_{s_t}^T U_{s_t}^{-1} μ_{s_t} } + log π_{s_1} + Σ_{t=2}^T log π_{s_{t−1}, s_t},  (36)

where ō_t denotes the t-th subvector of ō = [ō_1^T, ..., ō_T^T]^T and R̄_t the t-th diagonal block of R̄.  (37)

With λ fixed, Q(s, λ) can be maximized with respect to s by employing the Viterbi algorithm. With s fixed, Q(s, λ) is maximized with respect to λ when

μ_i = Σ_t 1[s_t = i] ō_t / Σ_t 1[s_t = i],  (38)
U_i = Σ_t 1[s_t = i] (R̄_t − ō_t μ_i^T − μ_i ō_t^T + μ_i μ_i^T) / Σ_t 1[s_t = i],  (39)
π_{i,j} = Σ_t 1[s_{t−1} = i, s_t = j] / Σ_t 1[s_{t−1} = i],  (40)

where i and j denote state indices and 1[·] denotes an indicator function that takes the value 1 if its argument is true and 0 otherwise. Overall, the parameter training algorithm can be summarized as follows:

(E-step) Substitute s and λ into s′ and λ′ and recompute ō and R̄ using (34) and (35).
(M-step) Update λ using (38)–(40) and find

s = argmax_s Q(s, λ)  (41)

using the Viterbi algorithm.

Note that if λ is fixed, the above algorithm reduces to a state decoding algorithm. It may appear that a huge amount of computation for inverting Σ_cc is required to compute ō and R̄. However, this can be carried out very efficiently. First, by using the Woodbury matrix identity, Σ_cc^{-1} can be written as Λ_{c|o} − Λ_{c|o} ((H U_s H^T)^{-1} + Λ_{c|o})^{-1} Λ_{c|o}. Next, since Λ can be set arbitrarily, we set Λ at U_s^{-1} to compute (H U_s H^T)^{-1}. Under this setting, (H U_s H^T)^{-1} is given as W^T U_s^{-1} W. Since both W^T U_s^{-1} W and Λ_{c|o} are sparse symmetric band matrices, (W^T U_s^{-1} W + Λ_{c|o})^{-1} Λ_{c|o} can be computed efficiently using the Cholesky decomposition. To initialize s, one reasonable way would be to search for s = argmax_s p(o|s, λ) using the Viterbi algorithm.

3.3. Variational learning algorithm

We describe a different approach for parameter training based on variational inference.
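The band-matrix remark at the end of Section 3.2 is what makes the E-step cheap: a symmetric banded system can be solved in time linear in T instead of cubic. As a sketch of that idea (not the paper's actual Cholesky routine), here is a pure-numpy Thomas-algorithm solver for the bandwidth-1 case; the matrices in the paper have a somewhat wider band, but the principle is the same.

```python
import numpy as np

def solve_sym_tridiag(diag, off, b):
    """Thomas algorithm: solve A x = b in O(n), where A is symmetric
    tridiagonal with main diagonal `diag` (length n) and off-diagonal
    `off` (length n-1)."""
    n = diag.size
    d = diag.astype(float).copy()
    rhs = b.astype(float).copy()
    for i in range(1, n):            # forward elimination
        w = off[i - 1] / d[i - 1]
        d[i] -= w * off[i - 1]
        rhs[i] -= w * rhs[i - 1]
    x = np.empty(n)
    x[-1] = rhs[-1] / d[-1]
    for i in range(n - 2, -1, -1):   # back substitution
        x[i] = (rhs[i] - off[i] * x[i + 1]) / d[i]
    return x
```

For general bandwidths, a banded Cholesky factorization plays the same role with the same linear-in-T cost.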
The random variables of interest in our model are o, s and λ = {μ, P, π}, where μ = {μ_i}_{1≤i≤I}, P = {P_i := U_i^{-1}}_{1≤i≤I}, and π = {π_{i,j}}_{1≤i,j≤I}. We denote the entire set of the above variables as Θ = {o, s, λ}. Our goal is to compute the posterior

p(Θ|c) = p(c, Θ) / p(c).  (42)

By using the conditional distributions defined in Section 3.1, we can write the joint distribution p(c, Θ) as

p(c, Θ) = p(c|o) p(o|s, λ) p(s).  (43)

To obtain the exact posterior p(Θ|c), we must compute p(c), which involves many intractable integrals. Instead of obtaining the exact posterior, we consider approximating this posterior variationally by solving an optimization problem:

argmin_q KL(q(Θ) || p(Θ|c)),  (44)

where KL(·||·) denotes the Kullback-Leibler (KL) divergence between its two arguments, i.e.,

KL(q(Θ) || p(Θ|c)) = Σ_s ∫∫ q(o, s, λ) log [q(o, s, λ) / p(o, s, λ|c)] do dλ.  (45)

By restricting the class of the approximate distributions to those that factorize into independent factors:

q(o, s, λ) = q(o) q(s) q(μ, P) q(π),  (46)
we can use a simple coordinate ascent algorithm to find a local optimum of (44). It can be shown using the calculus of variations that the optimal distribution for each of the factors can be expressed as

q̂(X) ∝ exp E_{Θ\X}[log p(c, Θ)],  (47)

where X indicates one of the factors and E_{Θ\X}[log p(c, Θ)] is the expectation of the joint probability of the data and latent variables, taken over all variables except X. From (47), the variational distributions are given in the following form:

q̂(o) = N(o; m, Γ),  (48)
q̂(μ, P) = ∏_i N(μ_i; ρ_i, (β_i P_i)^{-1}) W(P_i; B_i, ν_i),  (49)
q̂(π_i) = Dir(π_i; α_i),  (50)

where the parameters are updated via the following equations:

m = [m_1^T, ..., m_T^T]^T = R^{-1} r,  (51)
Γ = R^{-1}, with Γ_t denoting its t-th diagonal block,  (52)
R = diag(Σ_i q(s_1 = i) ν_i B_i, ..., Σ_i q(s_T = i) ν_i B_i) + H^T Λ_{c|o} H,  (53)
r = [Σ_i q(s_1 = i) ν_i B_i ρ_i; ...; Σ_i q(s_T = i) ν_i B_i ρ_i] + H^T Λ_{c|o} c,  (54)
β_i = Σ_t q(s_t = i),  (55)
ρ_i = (1/β_i) Σ_t q(s_t = i) m_t,  (56)
B_i^{-1} = Σ_t q(s_t = i) (Γ_t + m_t m_t^T) − β_i ρ_i ρ_i^T,  (57)
ν_i = β_i + 3D,  (58)
α_{i,j} = 1 + Σ_t q(s_{t−1} = i, s_t = j),  (59)

where W and Dir denote the Wishart distribution and the Dirichlet distribution, respectively, defined as

W(X; V, ν) ∝ |X|^{(ν−d−1)/2} exp{−(1/2) Tr(V^{-1} X)},  (60)
Dir(x; α) ∝ ∏_i x_i^{α_i − 1},  (61)

where X is a d-by-d symmetric positive definite matrix of random variables and V is a d-by-d positive definite matrix. q(s_t = i) and q(s_{t−1} = i, s_t = j) can be computed using the forward-backward algorithm as a subroutine, as in [12].

4. EXPERIMENTS

To confirm the generalization ability of the present model and the convergence speed of the present training algorithm, we conducted parameter training experiments using mel-cepstrum sequences of speech as experimental data. We chose the parameter training algorithm for the original trajectory HMM developed by Zen et al. as the baseline method. Zen's algorithm uses the method of steepest ascent for updating λ and the delayed decision Viterbi algorithm for updating s. Readers are referred to [3] for the details.
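Returning briefly to the updates (53)–(59): the posteriors q(s_t = i) they require come from a scaled forward-backward pass over per-frame state scores. The sketch below is a generic scaled forward-backward returning the unary posteriors; the emission array here is a placeholder for the variational expectations, not the paper's exact quantities.

```python
import numpy as np

def forward_backward(emit, trans, init):
    """Scaled forward-backward. emit: (T, I) per-frame state likelihoods,
    trans: (I, I) transition matrix, init: (I,) initial distribution.
    Returns gamma[t, i] = posterior probability of state i at frame t."""
    T, I = emit.shape
    alpha = np.zeros((T, I))
    beta = np.ones((T, I))
    scale = np.zeros(T)
    alpha[0] = init * emit[0]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):                      # forward pass with scaling
        alpha[t] = (alpha[t - 1] @ trans) * emit[t]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    for t in range(T - 2, -1, -1):             # backward pass, same scaling
        beta[t] = (trans @ (emit[t + 1] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)
```

The pairwise posteriors q(s_{t−1} = i, s_t = j) needed in (59) follow from the same alpha/beta quantities with one extra product over the transition matrix.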
Since the degrees of freedom of the trajectory HMM and the latent trajectory HMM are exactly the same when the numbers of hidden states are the same, the difference in the log-likelihood scores obtained with the present and baseline algorithms would reflect the difference in their generalization abilities. The experimental conditions were as follows. We used 5 speech data excerpted from the ATR speech database, from each of which we obtained the mel-cepstrum sequence of the first 5 frames. For both the proposed model and the trajectory HMM, the number of hidden states was set at 14. Λ was fixed at

Λ = diag(A, ..., A) (T blocks),  (62)

where the entries of the constant matrix A were set at 0.1 and 1.  (63)

The workstation used to perform the experiments had an Intel Core i3 processor with a 3.3 GHz clock speed and 7.7 GB of memory. Fig. 2 shows the evolution of the log-likelihood with respect to the number of iterations and the computation time during the parameter training of the proposed model and the conventional trajectory HMM. As Fig. 2 shows, the present algorithm converged faster than the conventional algorithm. This reveals the effectiveness of the combined use of efficient statistical inference techniques, such as the EM algorithm and the dynamic programming principle, by the proposed algorithm. It is also worth noting that the converged value of the log-likelihood obtained with the proposed algorithm was greater than that obtained with the conventional algorithm. This implies the possibility that, compared with the conventional model, the proposed model has a higher generalization ability, namely an ability to fit an arbitrary set of feature sequences, given that the degrees of freedom of the two models were the same. Fig. 3 shows an example of parameter generation using the proposed model. After training λ using 5 speech data, a feature sequence ĉ was generated according to (29) given a state sequence s. The figure at the top shows the spectrogram of a speech sample and the figure at the bottom shows the spectrogram constructed using ĉ obtained with the trained λ and the state sequence labeled from the speech sample. As Fig. 3 shows, the proposed model was able to represent the continuously time-varying nature of speech spectrograms reasonably well, showing that it has a similar property to the trajectory HMM.
Fig. 2. Evolution of the log-likelihood with respect to the number of iterations (top) and the computation time (bottom).

Fig. 3. Example of parameter generation using the proposed model. The spectrogram of a training sample (top) and the spectrogram constructed using the generated parameters (bottom).

5. CONCLUSIONS

Inspired by the trajectory HMM framework proposed by Zen et al., this paper proposed a probabilistic generative model for describing continuously time-varying sequences of data vectors governed by discrete hidden states. The proposed model is advantageous over the conventional trajectory HMM in that it makes it possible to derive convergence-guaranteed and efficient algorithms for parameter training and state decoding. Interesting future work involves incorporating the proposed model into the factorial HMM formulation to develop a new method for audio source separation that takes account of the dynamics of source spectra.

6. REFERENCES

[1] T. Yoshimura, K. Tokuda, T. Masuko, T. Kobayashi, and T. Kitamura, Simultaneous modeling of spectrum, pitch and duration in HMM-based speech synthesis, in Proc. 6th European Conference on Speech Communication and Technology (EUROSPEECH 1999), 1999.

[2] K. Tokuda, T. Yoshimura, T. Masuko, T. Kobayashi, and T. Kitamura, Speech parameter generation algorithms for HMM-based speech synthesis, in Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2000), 2000.

[3] H. Zen, K. Tokuda, and T. Kitamura, Reformulating the HMM as a trajectory model by imposing explicit relationships between static and dynamic feature vector sequences, Computer Speech and Language, vol. 21, pp. 153–173, 2007.

[4] Y.-J. Wu and R.H. Wang, Minimum generation error training for HMM-based speech synthesis, in Proc. 2006 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2006), 2006.

[5] T. Toda, A.W. Black, and K. Tokuda, Voice conversion based on maximum-likelihood estimation of spectral parameter trajectory, IEEE Transactions on Audio, Speech, and Language Processing, vol. 15, no. 8, pp. 2222–2235, 2007.

[6] T. Toda, A.W. Black, and K. Tokuda, Statistical mapping between articulatory movements and acoustic spectrum using a Gaussian mixture model, Speech Communication, vol. 50, no. 3, pp. 215–227, 2007.

[7] L. Zhang and S. Renals, Acoustic-articulatory modelling with the trajectory HMM, IEEE Signal Processing Letters, vol. 15, 2008.

[8] M. Nakano, J. Le Roux, H. Kameoka, T. Nakamura, N. Ono, and S. Sagayama, Bayesian nonparametric spectrogram modeling based on infinite factorial infinite hidden Markov model, in Proc. 2011 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA 2011), 2011.

[9] T. Higuchi, H. Takeda, T. Nakamura, and H. Kameoka, A unified approach for underdetermined blind signal separation and source activity detection by multichannel factorial hidden Markov models, in Proc. 15th Annual Conference of the International Speech Communication Association (Interspeech 2014), 2014.

[10] T. Higuchi and H. Kameoka, Joint audio source separation and dereverberation based on multichannel factorial hidden Markov model, in Proc. 24th IEEE International Workshop on Machine Learning for Signal Processing (MLSP 2014), 2014.

[11] T. Higuchi and H. Kameoka, Unified approach for underdetermined BSS, VAD, dereverberation and DOA estimation with multichannel factorial HMM, in Proc. 2nd IEEE Global Conference on Signal and Information Processing (GlobalSIP 2014), 2014.

[12] M.J. Beal, Variational Algorithms for Approximate Bayesian Inference, Ph.D. thesis, Gatsby Computational Neuroscience Unit, University College London, 2003.
More information13.1 Accelerating Objects
13.1 Acceleraing Objec A you learned in Chaper 12, when you are ravelling a a conan peed in a raigh line, you have uniform moion. However, mo objec do no ravel a conan peed in a raigh line o hey do no
More informationNECESSARY AND SUFFICIENT CONDITIONS FOR LATENT SEPARABILITY
NECESSARY AND SUFFICIENT CONDITIONS FOR LATENT SEPARABILITY Ian Crawford THE INSTITUTE FOR FISCAL STUDIES DEPARTMENT OF ECONOMICS, UCL cemmap working paper CWP02/04 Neceary and Sufficien Condiion for Laen
More informationCHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK
175 CHAPTER 10 VALIDATION OF TEST WITH ARTIFICAL NEURAL NETWORK 10.1 INTRODUCTION Amongs he research work performed, he bes resuls of experimenal work are validaed wih Arificial Neural Nework. From he
More information4/12/12. Applications of the Maxflow Problem 7.5 Bipartite Matching. Bipartite Matching. Bipartite Matching. Bipartite matching: the flow network
// Applicaion of he Maxflow Problem. Biparie Maching Biparie Maching Biparie maching. Inpu: undireced, biparie graph = (, E). M E i a maching if each node appear in a mo one edge in M. Max maching: find
More informationSTATE-SPACE MODELLING. A mass balance across the tank gives:
B. Lennox and N.F. Thornhill, 9, Sae Space Modelling, IChemE Process Managemen and Conrol Subjec Group Newsleer STE-SPACE MODELLING Inroducion: Over he pas decade or so here has been an ever increasing
More informationFLAT CYCLOTOMIC POLYNOMIALS OF ORDER FOUR AND HIGHER
#A30 INTEGERS 10 (010), 357-363 FLAT CYCLOTOMIC POLYNOMIALS OF ORDER FOUR AND HIGHER Nahan Kaplan Deparmen of Mahemaic, Harvard Univeriy, Cambridge, MA nkaplan@mah.harvard.edu Received: 7/15/09, Revied:
More informationInterpolation and Pulse Shaping
EE345S Real-Time Digial Signal Proceing Lab Spring 2006 Inerpolaion and Pule Shaping Prof. Brian L. Evan Dep. of Elecrical and Compuer Engineering The Univeriy of Texa a Auin Lecure 7 Dicree-o-Coninuou
More informationModal identification of structures from roving input data by means of maximum likelihood estimation of the state space model
Modal idenificaion of srucures from roving inpu daa by means of maximum likelihood esimaion of he sae space model J. Cara, J. Juan, E. Alarcón Absrac The usual way o perform a forced vibraion es is o fix
More informationSuggested Solutions to Midterm Exam Econ 511b (Part I), Spring 2004
Suggeed Soluion o Miderm Exam Econ 511b (Par I), Spring 2004 1. Conider a compeiive equilibrium neoclaical growh model populaed by idenical conumer whoe preference over conumpion ream are given by P β
More informationCSC 364S Notes University of Toronto, Spring, The networks we will consider are directed graphs, where each edge has associated with it
CSC 36S Noe Univeriy of Torono, Spring, 2003 Flow Algorihm The nework we will conider are direced graph, where each edge ha aociaed wih i a nonnegaive capaciy. The inuiion i ha if edge (u; v) ha capaciy
More informationMATH 5720: Gradient Methods Hung Phan, UMass Lowell October 4, 2018
MATH 5720: Gradien Mehods Hung Phan, UMass Lowell Ocober 4, 208 Descen Direcion Mehods Consider he problem min { f(x) x R n}. The general descen direcions mehod is x k+ = x k + k d k where x k is he curren
More informationPENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD
PENALIZED LEAST SQUARES AND PENALIZED LIKELIHOOD HAN XIAO 1. Penalized Leas Squares Lasso solves he following opimizaion problem, ˆβ lasso = arg max β R p+1 1 N y i β 0 N x ij β j β j (1.1) for some 0.
More informationFractional Ornstein-Uhlenbeck Bridge
WDS'1 Proceeding of Conribued Paper, Par I, 21 26, 21. ISBN 978-8-7378-139-2 MATFYZPRESS Fracional Ornein-Uhlenbeck Bridge J. Janák Charle Univeriy, Faculy of Mahemaic and Phyic, Prague, Czech Republic.
More informationReminder: Flow Networks
0/0/204 Ma/CS 6a Cla 4: Variou (Flow) Execie Reminder: Flow Nework A flow nework i a digraph G = V, E, ogeher wih a ource verex V, a ink verex V, and a capaciy funcion c: E N. Capaciy Source 7 a b c d
More informationFlow Networks. Ma/CS 6a. Class 14: Flow Exercises
0/0/206 Ma/CS 6a Cla 4: Flow Exercie Flow Nework A flow nework i a digraph G = V, E, ogeher wih a ource verex V, a ink verex V, and a capaciy funcion c: E N. Capaciy Source 7 a b c d e Sink 0/0/206 Flow
More informationAn recursive analytical technique to estimate time dependent physical parameters in the presence of noise processes
WHAT IS A KALMAN FILTER An recursive analyical echnique o esimae ime dependen physical parameers in he presence of noise processes Example of a ime and frequency applicaion: Offse beween wo clocks PREDICTORS,
More informationNetwork Flow. Data Structures and Algorithms Andrei Bulatov
Nework Flow Daa Srucure and Algorihm Andrei Bulao Algorihm Nework Flow 24-2 Flow Nework Think of a graph a yem of pipe We ue hi yem o pump waer from he ource o ink Eery pipe/edge ha limied capaciy Flow
More informationLecture 20: Riccati Equations and Least Squares Feedback Control
34-5 LINEAR SYSTEMS Lecure : Riccai Equaions and Leas Squares Feedback Conrol 5.6.4 Sae Feedback via Riccai Equaions A recursive approach in generaing he marix-valued funcion W ( ) equaion for i for he
More informationAverage Case Lower Bounds for Monotone Switching Networks
Average Cae Lower Bound for Monoone Swiching Nework Yuval Filmu, Toniann Piai, Rober Robere, Sephen Cook Deparmen of Compuer Science Univeriy of Torono Monoone Compuaion (Refreher) Monoone circui were
More informationChapter 7: Inverse-Response Systems
Chaper 7: Invere-Repone Syem Normal Syem Invere-Repone Syem Baic Sar ou in he wrong direcion End up in he original eady-ae gain value Two or more yem wih differen magniude and cale in parallel Main yem
More informationSOLUTIONS TO ECE 3084
SOLUTIONS TO ECE 384 PROBLEM 2.. For each sysem below, specify wheher or no i is: (i) memoryless; (ii) causal; (iii) inverible; (iv) linear; (v) ime invarian; Explain your reasoning. If he propery is no
More informationEE Control Systems LECTURE 2
Copyrigh F.L. Lewi 999 All righ reerved EE 434 - Conrol Syem LECTURE REVIEW OF LAPLACE TRANSFORM LAPLACE TRANSFORM The Laplace ranform i very ueful in analyi and deign for yem ha are linear and ime-invarian
More informationScheduling of Crude Oil Movements at Refinery Front-end
Scheduling of Crude Oil Movemens a Refinery Fron-end Ramkumar Karuppiah and Ignacio Grossmann Carnegie Mellon Universiy ExxonMobil Case Sudy: Dr. Kevin Furman Enerprise-wide Opimizaion Projec March 15,
More informationRandom Walk with Anti-Correlated Steps
Random Walk wih Ani-Correlaed Seps John Noga Dirk Wagner 2 Absrac We conjecure he expeced value of random walks wih ani-correlaed seps o be exacly. We suppor his conjecure wih 2 plausibiliy argumens and
More informationHidden Markov Models. Adapted from. Dr Catherine Sweeney-Reed s slides
Hidden Markov Models Adaped from Dr Caherine Sweeney-Reed s slides Summary Inroducion Descripion Cenral in HMM modelling Exensions Demonsraion Specificaion of an HMM Descripion N - number of saes Q = {q
More informationState-Space Models. Initialization, Estimation and Smoothing of the Kalman Filter
Sae-Space Models Iniializaion, Esimaion and Smoohing of he Kalman Filer Iniializaion of he Kalman Filer The Kalman filer shows how o updae pas predicors and he corresponding predicion error variances when
More informationMotion Compensated Color Video Classification Using Markov Random Fields
Moion Compenaed Color Video Claificaion Uing Markov Random Field Zolan Kao, Ting-Chuen Pong, John Chung-Mong Lee Hong Kong Univeriy of Science and Technology, Compuer Science Dep., Clear Waer Bay, Kowloon,
More informationNotes on Kalman Filtering
Noes on Kalman Filering Brian Borchers and Rick Aser November 7, Inroducion Daa Assimilaion is he problem of merging model predicions wih acual measuremens of a sysem o produce an opimal esimae of he curren
More informationImplementation of 64-Point FFT Processor Based on Radix-2 Using Verilog
Inernaional Journal of Engineering Reearch & Technology (IJERT) Implemenaion of 64-oin roceor Baed on Radix-2 Uing Verilog T.TIRUALA KOTESWARA RAO 1, S. SARATH CHANDRA 2 Suden of. Tech Deparmen of Elecronic
More informationPhysics 240: Worksheet 16 Name
Phyic 4: Workhee 16 Nae Non-unifor circular oion Each of hee proble involve non-unifor circular oion wih a conan α. (1) Obain each of he equaion of oion for non-unifor circular oion under a conan acceleraion,
More informationStability in Distribution for Backward Uncertain Differential Equation
Sabiliy in Diribuion for Backward Uncerain Differenial Equaion Yuhong Sheng 1, Dan A. Ralecu 2 1. College of Mahemaical and Syem Science, Xinjiang Univeriy, Urumqi 8346, China heng-yh12@mail.inghua.edu.cn
More informationNEUTRON DIFFUSION THEORY
NEUTRON DIFFUSION THEORY M. Ragheb 4//7. INTRODUCTION The diffuion heory model of neuron ranpor play a crucial role in reacor heory ince i i imple enough o allow cienific inigh, and i i ufficienly realiic
More informationSample Final Exam (finals03) Covering Chapters 1-9 of Fundamentals of Signals & Systems
Sample Final Exam Covering Chaper 9 (final04) Sample Final Exam (final03) Covering Chaper 9 of Fundamenal of Signal & Syem Problem (0 mar) Conider he caual opamp circui iniially a re depiced below. I LI
More informationChapter 6. Laplace Transforms
Chaper 6. Laplace Tranform Kreyzig by YHLee;45; 6- An ODE i reduced o an algebraic problem by operaional calculu. The equaion i olved by algebraic manipulaion. The reul i ranformed back for he oluion of
More information6.2 Transforms of Derivatives and Integrals.
SEC. 6.2 Transforms of Derivaives and Inegrals. ODEs 2 3 33 39 23. Change of scale. If l( f ()) F(s) and c is any 33 45 APPLICATION OF s-shifting posiive consan, show ha l( f (c)) F(s>c)>c (Hin: In Probs.
More informationLinear Response Theory: The connection between QFT and experiments
Phys540.nb 39 3 Linear Response Theory: The connecion beween QFT and experimens 3.1. Basic conceps and ideas Q: How do we measure he conduciviy of a meal? A: we firs inroduce a weak elecric field E, and
More informationINTRODUCTION TO INERTIAL CONFINEMENT FUSION
INTODUCTION TO INETIAL CONFINEMENT FUSION. Bei Lecure 7 Soluion of he imple dynamic igniion model ecap from previou lecure: imple dynamic model ecap: 1D model dynamic model ρ P() T enhalpy flux ino ho
More informationSpeaker Adaptation Techniques For Continuous Speech Using Medium and Small Adaptation Data Sets. Constantinos Boulis
Speaker Adapaion Techniques For Coninuous Speech Using Medium and Small Adapaion Daa Ses Consaninos Boulis Ouline of he Presenaion Inroducion o he speaker adapaion problem Maximum Likelihood Sochasic Transformaions
More informations-domain Circuit Analysis
Domain ircui Analyi Operae direcly in he domain wih capacior, inducor and reior Key feaure lineariy i preerved c decribed by ODE and heir I Order equal number of plu number of Elemenbyelemen and ource
More informationChapter 2. First Order Scalar Equations
Chaper. Firs Order Scalar Equaions We sar our sudy of differenial equaions in he same way he pioneers in his field did. We show paricular echniques o solve paricular ypes of firs order differenial equaions.
More informationDeep Learning: Theory, Techniques & Applications - Recurrent Neural Networks -
Deep Learning: Theory, Techniques & Applicaions - Recurren Neural Neworks - Prof. Maeo Maeucci maeo.maeucci@polimi.i Deparmen of Elecronics, Informaion and Bioengineering Arificial Inelligence and Roboics
More information1 Review of Zero-Sum Games
COS 5: heoreical Machine Learning Lecurer: Rob Schapire Lecure #23 Scribe: Eugene Brevdo April 30, 2008 Review of Zero-Sum Games Las ime we inroduced a mahemaical model for wo player zero-sum games. Any
More informationt is a basis for the solution space to this system, then the matrix having these solutions as columns, t x 1 t, x 2 t,... x n t x 2 t...
Mah 228- Fri Mar 24 5.6 Marix exponenials and linear sysems: The analogy beween firs order sysems of linear differenial equaions (Chaper 5) and scalar linear differenial equaions (Chaper ) is much sronger
More informationGMM - Generalized Method of Moments
GMM - Generalized Mehod of Momens Conens GMM esimaion, shor inroducion 2 GMM inuiion: Maching momens 2 3 General overview of GMM esimaion. 3 3. Weighing marix...........................................
More informationFinal Spring 2007
.615 Final Spring 7 Overview The purpose of he final exam is o calculae he MHD β limi in a high-bea oroidal okamak agains he dangerous n = 1 exernal ballooning-kink mode. Effecively, his corresponds o
More informationT L. t=1. Proof of Lemma 1. Using the marginal cost accounting in Equation(4) and standard arguments. t )+Π RB. t )+K 1(Q RB
Elecronic Companion EC.1. Proofs of Technical Lemmas and Theorems LEMMA 1. Le C(RB) be he oal cos incurred by he RB policy. Then we have, T L E[C(RB)] 3 E[Z RB ]. (EC.1) Proof of Lemma 1. Using he marginal
More information20. Applications of the Genetic-Drift Model
0. Applicaions of he Geneic-Drif Model 1) Deermining he probabiliy of forming any paricular combinaion of genoypes in he nex generaion: Example: If he parenal allele frequencies are p 0 = 0.35 and q 0
More informationExpectation- Maximization & Baum-Welch. Slides: Roded Sharan, Jan 15; revised by Ron Shamir, Nov 15
Expecaion- Maximizaion & Baum-Welch Slides: Roded Sharan, Jan 15; revised by Ron Shamir, Nov 15 1 The goal Inpu: incomplee daa originaing from a probabiliy disribuion wih some unknown parameers Wan o find
More informationAn introduction to the theory of SDDP algorithm
An inroducion o he heory of SDDP algorihm V. Leclère (ENPC) Augus 1, 2014 V. Leclère Inroducion o SDDP Augus 1, 2014 1 / 21 Inroducion Large scale sochasic problem are hard o solve. Two ways of aacking
More informationMore Digital Logic. t p output. Low-to-high and high-to-low transitions could have different t p. V in (t)
EECS 4 Spring 23 Lecure 2 EECS 4 Spring 23 Lecure 2 More igial Logic Gae delay and signal propagaion Clocked circui elemens (flip-flop) Wriing a word o memory Simplifying digial circuis: Karnaugh maps
More informationFlow networks. Flow Networks. A flow on a network. Flow networks. The maximum-flow problem. Introduction to Algorithms, Lecture 22 December 5, 2001
CS 545 Flow Nework lon Efra Slide courey of Charle Leieron wih mall change by Carola Wenk Flow nework Definiion. flow nework i a direced graph G = (V, E) wih wo diinguihed verice: a ource and a ink. Each
More informationNetwork Flows UPCOPENCOURSEWARE number 34414
Nework Flow UPCOPENCOURSEWARE number Topic : F.-Javier Heredia Thi work i licened under he Creaive Common Aribuion- NonCommercial-NoDeriv. Unpored Licene. To view a copy of hi licene, vii hp://creaivecommon.org/licene/by-nc-nd/./
More informationDiscussion Session 2 Constant Acceleration/Relative Motion Week 03
PHYS 100 Dicuion Seion Conan Acceleraion/Relaive Moion Week 03 The Plan Today you will work wih your group explore he idea of reference frame (i.e. relaive moion) and moion wih conan acceleraion. You ll
More information10. State Space Methods
. Sae Space Mehods. Inroducion Sae space modelling was briefly inroduced in chaper. Here more coverage is provided of sae space mehods before some of heir uses in conrol sysem design are covered in he
More informationMacroeconomics 1. Ali Shourideh. Final Exam
4780 - Macroeconomic 1 Ali Shourideh Final Exam Problem 1. A Model of On-he-Job Search Conider he following verion of he McCall earch model ha allow for on-he-job-earch. In paricular, uppoe ha ime i coninuou
More informationRobotics I. April 11, The kinematics of a 3R spatial robot is specified by the Denavit-Hartenberg parameters in Tab. 1.
Roboics I April 11, 017 Exercise 1 he kinemaics of a 3R spaial robo is specified by he Denavi-Harenberg parameers in ab 1 i α i d i a i θ i 1 π/ L 1 0 1 0 0 L 3 0 0 L 3 3 able 1: able of DH parameers of
More information