Probabilistic & Unsupervised Learning. Factored Variational Approximations and Variational Bayes. Expectations in Statistical Modelling


Expectations in Statistical Modelling

Probabilistic & Unsupervised Learning
Factored Variational Approximations and Variational Bayes

Maneesh Sahani  maneesh@gatsby.ucl.ac.uk
Gatsby Computational Neuroscience Unit, and MSc ML/CSML, Dept Computer Science, University College London. Autumn 2018.

Parameter estimation:  θ̂ = argmax_θ ∫dZ P(Z|θ) P(X|Z, θ)
(or, using EM)  θ_new = argmax_θ ∫dZ P(Z|X, θ_old) log P(X, Z|θ)
Prediction:  p(x|D, m) = ∫dθ p(θ|D, m) p(x|θ, D, m)
Model selection or weighting (by marginal likelihood):  p(D|m) = ∫dθ p(θ|m) p(D|θ, m)

These integrals are often intractable:
- Analytic intractability: the integrals may have no closed form in non-linear, non-Gaussian models → numerical integration.
- Computational intractability: the numerical integral (or sum, if Z or θ are discrete) may be exponential in the data or model size.

Intractabilities and approximations
- Inference, computational intractability: factored variational approximations; loopy BP/EP/Power EP; LP relaxations / convexified BP; Gibbs sampling and other MCMC.
- Inference, analytic intractability: Laplace approximation (global); parametric variational approximations; message approximations (linearised, sigma-point, Laplace); assumed-density methods and Expectation Propagation; (sequential) Monte Carlo methods.
- Learning, intractable partition function: sampling parameters; contrastive divergence; score matching.
- Model selection: Laplace approximation / BIC; Variational Bayes; (annealed) importance sampling; reversible-jump MCMC.
Not a complete list!

Distributed models
[Figure: factorial HMM, with M state chains s_t^(1) … s_t^(M) and observations x_1 … x_T.]
Consider an FHMM with M state variables taking on K values each. Moralisation puts the simultaneous states (s_t^(1), s_t^(2), …, s_t^(M)) into a single clique; triangulation extends cliques to size M+1. Each state takes K values → sums over K^(M+1) terms.
Factorial prior, but not a factorial posterior (explaining away).
Variational methods approximate the posterior, often in a factored form. To see how they work, we need to review the free-energy interpretation of EM.

The Free Energy for a Latent Variable Model

Observed data X = {x_i}; latent variables Z = {z_i}; parameters θ.
Goal: maximise the log likelihood wrt θ (i.e. ML learning):
    ℓ(θ) = log P(X|θ) = log ∫ P(Z, X|θ) dZ
Any distribution q(Z) over the hidden variables can be used to obtain a lower bound on the log likelihood using Jensen's inequality:
    ℓ(θ) = log ∫ q(Z) [P(Z, X|θ) / q(Z)] dZ  ≥  ∫ q(Z) log [P(Z, X|θ) / q(Z)] dZ  =:  F(q, θ)
Expanding,
    F(q, θ) = ∫ q(Z) log P(Z, X|θ) dZ − ∫ q(Z) log q(Z) dZ = ⟨log P(Z, X|θ)⟩_{q(Z)} + H[q],
where H[q] is the entropy of q(Z). Equivalently,
    F(q, θ) = ℓ(θ) − KL[q(Z) ‖ P(Z|X, θ)].

The E and M steps of EM
The log likelihood is bounded below by F(q, θ). EM alternates between:
E step: optimise F(q, θ) wrt the distribution over hidden variables, holding the parameters fixed:
    q^(k)(Z) := argmax_{q(Z)} F(q(Z), θ^(k−1)) = P(Z|X, θ^(k−1)).
M step: maximise F(q, θ) wrt the parameters, holding the hidden distribution fixed:
    θ^(k) := argmax_θ F(q^(k)(Z), θ) = argmax_θ ⟨log P(Z, X|θ)⟩_{q^(k)(Z)}.

EM as coordinate ascent in F; EM never decreases the likelihood
The E and M steps together never decrease the log likelihood:
    ℓ(θ^(k−1))  =(E step)  F(q^(k), θ^(k−1))  ≤(M step)  F(q^(k), θ^(k))  ≤(Jensen)  ℓ(θ^(k)).
The E step brings the free energy up to the likelihood; the M step maximises the free energy wrt θ. F ≤ ℓ by Jensen or, equivalently, from the non-negativity of KL. If the M step is executed so that θ^(k) ≠ θ^(k−1) iff F increases, then the overall EM iteration will step to a new value of θ iff the likelihood increases.
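As a concrete check of this coordinate-ascent picture, here is a small numerical sketch (not part of the original slides; the model and data are invented for illustration): EM for a two-component, unit-variance 1-D Gaussian mixture, recording F(q, θ) after each E/M pair. The trace never decreases, exactly as the inequality chain above promises.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data from a two-component 1-D mixture with unit-variance components
x = np.concatenate([rng.normal(-2, 1, 100), rng.normal(2, 1, 100)])

def free_energy(x, q, mu, pi):
    # F(q, theta) = <log P(z, x | theta)>_q + H[q]
    logjoint = np.log(pi) - 0.5 * (x[:, None] - mu) ** 2 - 0.5 * np.log(2 * np.pi)
    return np.sum(q * logjoint) - np.sum(q * np.log(q + 1e-12))

mu, pi = np.array([-1.0, 1.0]), np.array([0.5, 0.5])
F_trace = []
for _ in range(20):
    # E step: q(z) = P(z | x, theta) saturates the bound at the current theta
    logp = np.log(pi) - 0.5 * (x[:, None] - mu) ** 2
    q = np.exp(logp - logp.max(axis=1, keepdims=True))
    q /= q.sum(axis=1, keepdims=True)
    # M step: maximise <log P(z, x | theta)>_q wrt theta
    Nk = q.sum(axis=0)
    mu = (q * x[:, None]).sum(axis=0) / Nk
    pi = Nk / len(x)
    F_trace.append(free_energy(x, q, mu, pi))
```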

Free-energy-based variational approximation

What if finding expected sufficient statistics under P(Z|X, θ) is computationally intractable? For the generalised EM algorithm, we argued that intractable maximisations could be replaced by gradient M steps: each step increases the likelihood, and a fixed point of the gradient M step must be at a mode of the expected log-joint. For the E step we could:
- parameterise q = q_ρ(Z) and take a gradient step in ρ; or
- assume some simplified form for q, usually factored: q = ∏_i q_i(Z_i), where the Z_i partition Z, and maximise within this form.
In either case, we choose q from within a limited set Q:
VE step: maximise F(q, θ) wrt the constrained latent distribution, given the parameters:
    q^(k)(Z) := argmax_{q(Z) ∈ Q} F(q(Z), θ^(k−1)).
M step: unchanged:
    θ^(k) := argmax_θ F(q^(k)(Z), θ) = argmax_θ ∫ q^(k)(Z) log P(Z, X|θ) dZ.

What do we lose?
What does restricting q to Q cost us? Recall that the free energy is bounded above, by Jensen: F(q, θ) ≤ ℓ(θ_ML). Thus, as long as every step increases F, convergence is still guaranteed. But, since P(Z|X, θ^(k)) may not lie in Q, we no longer saturate the bound after the E step. Thus, the likelihood may not increase on each full EM step:
    ℓ(θ^(k−1))  ≥(E step)  F(q^(k), θ^(k−1))  ≤(M step)  F(q^(k), θ^(k))  ≤(Jensen)  ℓ(θ^(k)).
This means we may not (and usually won't) converge to a maximum of ℓ. The hope is that by increasing a lower bound on ℓ we will find a decent solution. [Note that if P(Z|X, θ_ML) ∈ Q, then θ_ML is a fixed point of the variational algorithm. Unlike in GEM, though, a fixed point may not be at an unconstrained optimum of F.]

KL divergence
Recall that
    F(q, θ) = ⟨log P(X, Z|θ)⟩_{q(Z)} + H[q]
            = log P(X|θ) + ⟨log P(Z|X, θ) − log q(Z)⟩_{q(Z)}
            = log P(X|θ) − KL[q ‖ P(Z|X, θ)].
Thus the E step, which maximises F(q, θ) wrt the distribution over latents given the parameters,
    q^(k)(Z) := argmax_{q(Z) ∈ Q} F(q(Z), θ^(k−1)),
is equivalent to minimising KL[q ‖ P(Z|X, θ)] wrt the distribution over latents:
    q^(k)(Z) := argmin_{q(Z) ∈ Q} ∫ q(Z) log [q(Z) / P(Z|X, θ^(k−1))] dZ.
So, in each E step, the algorithm is trying to find the best approximation to P(Z|X, θ) in Q in a KL sense. This is related to ideas in information geometry. It also suggests generalisations to other distance measures.
Factored Variational E step
The most common form of variational approximation partitions Z into disjoint sets Z_i, with
    Q = { q : q(Z) = ∏_i q_i(Z_i) }.
In this case the E step is itself iterative:
(Factored VE step)_i: maximise F(q, θ) wrt q_i(Z_i), given the other q_j and the parameters:
    q_i^(k)(Z_i) := argmax_{q_i(Z_i)} F( q_i(Z_i) ∏_{j≠i} q_j(Z_j), θ^(k−1) ).
The q_i updates are iterated to convergence to complete the VE step. In fact, every (VE)_i step separately increases F, so any schedule of (VE)_i and M steps will converge. The choice can be dictated by practical issues (it is rarely efficient to fully converge the E step before updating the parameters).
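The coordinate structure is easy to see in a toy case. The following sketch (invented for illustration, not from the slides) runs the factored VE updates for two binary latents with a fixed, fully known joint P(z1, z2), cycling q1 and q2; each update increases F, and since F = log 1 − KL[q ‖ P] here, F climbs towards 0 from below.

```python
import numpy as np

# A fixed, normalised joint over two binary latents: P(z1, z2)
P = np.array([[0.40, 0.10],
              [0.05, 0.45]])
logP = np.log(P)

def F(q1, q2):
    # F(q) = <log P(z1, z2)>_{q1 q2} + H[q1] + H[q2]  (= -KL[q1 q2 || P] <= 0)
    q = np.outer(q1, q2)
    return np.sum(q * logP) - np.sum(q1 * np.log(q1)) - np.sum(q2 * np.log(q2))

q1 = np.array([0.5, 0.5])
q2 = np.array([0.5, 0.5])
trace = [F(q1, q2)]
for _ in range(10):
    # (VE)_1: q1(z1) ∝ exp <log P(z1, z2)>_{q2}
    q1 = np.exp(logP @ q2); q1 /= q1.sum()
    # (VE)_2: q2(z2) ∝ exp <log P(z1, z2)>_{q1}
    q2 = np.exp(logP.T @ q1); q2 /= q2.sum()
    trace.append(F(q1, q2))
```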

Factored Variational E step: general form

The Factored Variational E step has a general form. The free energy is:
    F( ∏_j q_j(Z_j), θ^(k−1) ) = ⟨log P(X, Z|θ^(k−1))⟩_{∏_j q_j(Z_j)} + Σ_j H[q_j]
                               = ∫ dZ_i q_i(Z_i) ⟨log P(X, Z|θ^(k−1))⟩_{∏_{j≠i} q_j(Z_j)} + H[q_i] + Σ_{j≠i} H[q_j].
Now, taking the variational derivative of the Lagrangian (enforcing normalisation of q_i) and setting it to zero:
    δ/δq_i [ F + λ (1 − ∫ q_i(Z_i) dZ_i) ] = ⟨log P(X, Z|θ^(k−1))⟩_{∏_{j≠i} q_j(Z_j)} − log q_i(Z_i) − 1 + λ = 0
    ⇒ q_i(Z_i) ∝ exp ⟨log P(X, Z|θ^(k−1))⟩_{∏_{j≠i} q_j(Z_j)}.
In general, this depends only on the expected sufficient statistics under the q_j. Thus, again, we don't actually need the entire distributions, just the relevant expectations (now for approximate inference as well as learning).

Mean-field approximations
If Z_i = z_i (i.e. q is factored over all variables) then the variational technique is often called a mean-field approximation. Suppose P(X, Z) has sufficient statistics that are separable in the latent variables, e.g. the Boltzmann machine
    P(X, Z) = (1/𝒵) exp( Σ_{ij} W_ij s_i s_j + Σ_i b_i s_i )
with some s_i ∈ Z and the others observed. Expectations wrt a fully-factored q distribute over all s_j ∈ Z:
    ⟨log P(X, Z)⟩_q = Σ_{ij} W_ij ⟨s_i⟩_{q_i} ⟨s_j⟩_{q_j} + Σ_i b_i ⟨s_i⟩_{q_i} + const
(where q_j for s_j ∈ X is a delta function on the observed value). Thus, we can update each q_i in turn given the means (or, in general, mean sufficient statistics) of the others. Each variable sees the mean field imposed by its neighbours, and we update these fields until they all agree.

Mean-field FHMM
[Figure: factorial HMM, with state chains s_t^(m) and observations x_t.]
Take q(s_{1:T}^{1:M}) = ∏_{m,t} q_{mt}(s_t^m). The update for one factor is
    q_{mt}(s_t^m) ∝ exp ⟨log P(s_{1:T}^{1:M}, x_{1:T})⟩_{q(∖(m,t))}
                  = exp[ ⟨log P(s_t^m | s_{t−1}^m)⟩_{q_{m,t−1}} + ⟨log P(x_t | s_t^{1:M})⟩_{q(∖(m,t))} + ⟨log P(s_{t+1}^m | s_t^m)⟩_{q_{m,t+1}} ],
i.e. a product of three factors: a forward term e^{Σ_j q_{m,t−1}(j) log Φ^m_{j·}} built from the expected log transition probabilities (playing the role of α^m), an expected log output term e^{⟨log A(x_t)⟩_q}, and a backward term e^{Σ_j q_{m,t+1}(j) log Φ^m_{·j}} (playing the role of β^m).
Cf. forward–backward in an ordinary HMM: α_t(j) ∝ Σ_i α_{t−1}(i) Φ_{ij} A_j(x_t); β_t(i) = Σ_j Φ_{ij} A_j(x_{t+1}) β_{t+1}(j).
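The Boltzmann-machine case above gives the classic mean-field fixed-point equations. A minimal sketch (random weights invented for illustration), using 0/1 units so that each update is m_i ← σ(Σ_j W_ij m_j + b_i):

```python
import numpy as np

def mean_field_boltzmann(W, b, sweeps=50):
    """Mean-field q(s) = prod_i q_i(s_i) for P(s) ∝ exp(Σ_{i<j} W_ij s_i s_j + Σ_i b_i s_i),
    s ∈ {0,1}^n, W symmetric with zero diagonal. m_i = q_i(s_i = 1)."""
    m = np.full(len(b), 0.5)
    for _ in range(sweeps):
        for i in range(len(b)):
            # each variable sees the mean field of its neighbours
            field = W[i] @ m + b[i]
            m[i] = 1.0 / (1.0 + np.exp(-field))
    return m

rng = np.random.default_rng(1)
W = rng.normal(0, 0.5, (5, 5))
W = (W + W.T) / 2
np.fill_diagonal(W, 0)
b = rng.normal(0, 1, 5)
m = mean_field_boltzmann(W, b)
```

At convergence the fields "all agree": every m_i simultaneously equals the sigmoid of its own mean field, which is exactly the self-consistency condition the slide describes.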
This yields a message-passing algorithm like forward–backward:
- updates depend only on immediate neighbours in the chain;
- chains couple only through the joint output;
- multiple passes are needed, and the messages depend on (approximate) marginals;
- the evidence does not appear explicitly in the backward message (cf. Kalman smoothing).

Structured variational approximation

q(Z) need not be completely factorised. For example, suppose Z can be partitioned into sets Z_1 and Z_2 such that computing the expected sufficient statistics under P(Z_1|Z_2, X) and P(Z_2|Z_1, X) would be tractable. Then the factored approximation q(Z) = q(Z_1) q(Z_2) is tractable. In particular, any factorisation of q(Z) into a product of distributions on trees yields a tractable approximation.

Structured FHMM
[Figure: factorial HMM, with state chains s_t^(m) and observations x_t.]
For the FHMM we can factor over the chains: q(s_{1:T}^{1:M}) = ∏_m q_m(s_{1:T}^m). Then
    q_m(s_{1:T}^m) ∝ exp ⟨log P(s_{1:T}^{1:M}, x_{1:T})⟩_{q(∖m)}
                   = ∏_t P(s_t^m | s_{t−1}^m) e^{⟨log P(x_t | s_t^{1:M})⟩_{q(∖m)}}.
This looks like a standard HMM joint, with a modified likelihood term → cycle through multiple forward–backward passes, updating the likelihood terms each time.

Messages on an arbitrary graph
Consider a DAG:
    P(X, Z) = ∏_k P(V_k | pa(V_k))
and let q(Z) = ∏_i q_i(Z_i) for disjoint sets {Z_i}. We have that the VE update for q_i is given by
    q_i(Z_i) ∝ exp ⟨log P(Z, X)⟩_{q_{∖i}(Z)}
where q_{∖i}(Z) denotes averaging with respect to q_j(Z_j) for all j ≠ i. Then:
    log q_i(Z_i) = Σ_{j: Z_j ∈ Z_i} ⟨log P(Z_j | pa(Z_j))⟩_{q_{∖i}(Z)} + Σ_{j ∈ ch(Z_i)} ⟨log P(V_j | pa(V_j))⟩_{q_{∖i}(Z)} + const.
This defines messages that are passed between nodes in the graph. Each node receives messages from its Markov boundary: parents, children and parents of children (all neighbours in the corresponding factor graph).

Non-factored variational methods
The term "variational approximation" is used whenever a bound on the likelihood (or on another estimation cost function) is optimised, but the bound does not necessarily become tight. Many further variational approximations have been developed, including:
- parametric forms (e.g. Gaussian) for non-linear models;
- closed-form updates in special cases;
- numerical or sampling-based computation of expectations;
- recognition networks or amortisation to estimate variational parameters;
- non-free-energy-based bounds (both upper and lower) on the likelihood.
We can also see MAP / zero-temperature EM and recognition models as parametric forms of variational inference.
Variational methods can also be used to find an approximate posterior on the parameters.

Variational Bayes

So far, we have applied Jensen's bound and factorisations to help with integrals over latent variables. We can do the same for integrals over parameters, in order to bound the log marginal likelihood or evidence:
    log P(X|M) = log ∫ dZ dθ P(X, Z|θ, M) P(θ|M)
               = max_Q ∫ dZ dθ Q(Z, θ) log [P(X, Z, θ|M) / Q(Z, θ)]
               ≥ max_{Q_Z, Q_θ} ∫ dZ dθ Q_Z(Z) Q_θ(θ) log [P(X, Z, θ|M) / (Q_Z(Z) Q_θ(θ))].
The constraint that the distribution Q must factor into the product Q_Z(Z) Q_θ(θ) leads to the variational Bayesian EM algorithm, or just Variational Bayes. Some call this the Evidence Lower Bound (ELBO); I'm not fond of that term.

Variational Bayesian EM
Coordinate maximisation of the VB free-energy lower bound
    F(Q_Z, Q_θ) = ∫ dZ dθ Q_Z(Z) Q_θ(θ) log [P(X, Z, θ|M) / (Q_Z(Z) Q_θ(θ))]
leads to EM-like updates:
    Q_Z(Z) ∝ exp ⟨log P(Z, X|θ)⟩_{Q_θ(θ)}         (E-like step)
    Q_θ(θ) ∝ P(θ) exp ⟨log P(Z, X|θ)⟩_{Q_Z(Z)}    (M-like step)
Maximising F is equivalent to minimising the KL divergence between the approximate posterior, Q_θ(θ) Q_Z(Z), and the true posterior, P(θ, Z|X):
    log P(X) − F(Q_Z, Q_θ) = log P(X) − ∫ dZ dθ Q_Z(Z) Q_θ(θ) log [P(X, Z, θ) / (Q_Z(Z) Q_θ(θ))]
                           = ∫ dZ dθ Q_Z(Z) Q_θ(θ) log [Q_Z(Z) Q_θ(θ) / P(Z, θ|X)] = KL(Q ‖ P).

Conjugate-Exponential models
Let's focus on conjugate-exponential (CE) latent-variable models.
Condition (1): the joint probability over variables is in the exponential family:
    P(Z, X|θ) = f(Z, X) g(θ) exp{ φ(θ)ᵀ T(Z, X) }
where φ(θ) is the vector of natural parameters and T(Z, X) are the sufficient statistics.
Condition (2): the prior over parameters is conjugate to this joint probability:
    P(θ|ν, τ) = h(ν, τ) g(θ)^ν exp{ φ(θ)ᵀ τ }
where ν and τ are hyperparameters of the prior.

Conjugate-Exponential examples — in the CE family:
- Gaussian mixtures
- factor analysis, probabilistic PCA
- hidden Markov models and factorial HMMs
- linear dynamical systems and switching models
- discrete-variable belief networks
- other, as yet undreamt-of, models: combinations of Gaussian, Gamma, Poisson, Dirichlet, Wishart, Multinomial and others.
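The identity log P(X) − F = KL(Q ‖ P) says the bound is tight exactly when Q matches the true posterior. In a fully conjugate model this can be verified numerically. A minimal sketch (a worked check, not from the slides, assuming scipy is available): Bernoulli data with a Beta prior, where the evidence has a closed form and F(Q) can be evaluated for any Beta-shaped Q.

```python
import numpy as np
from scipy.special import betaln, digamma

# Bernoulli data with a Beta(a0, b0) prior on the coin weight theta
a0, b0 = 1.0, 1.0
heads, tails = 7, 3

# Exact log evidence: log P(X) = log [B(a0+h, b0+t) / B(a0, b0)]
log_evidence = betaln(a0 + heads, b0 + tails) - betaln(a0, b0)

def bound(a, b):
    # F(Q) = <log P(X|theta) + log P(theta) - log Q(theta)> under Q = Beta(a, b)
    Elog_t = digamma(a) - digamma(a + b)      # <log theta>
    Elog_1mt = digamma(b) - digamma(a + b)    # <log (1 - theta)>
    return (heads * Elog_t + tails * Elog_1mt
            + (a0 - 1) * Elog_t + (b0 - 1) * Elog_1mt - betaln(a0, b0)
            - ((a - 1) * Elog_t + (b - 1) * Elog_1mt - betaln(a, b)))

tight = bound(a0 + heads, b0 + tails)  # Q = true posterior: bound saturates
loose = bound(2.0, 2.0)                # any other Q: strictly below by KL(Q||P)
```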
Conjugate priors are computationally convenient and have an intuitive interpretation:
- ν: number of pseudo-observations;
- τ: values of pseudo-observations.

Not in the CE family:
- Boltzmann machines, MRFs (no simple conjugacy)
- logistic regression (no simple conjugacy)
- sigmoid belief networks (not exponential)
- independent components analysis (not exponential)
Note: one can often approximate such models with a suitable choice from the CE family.

Conjugate-exponential VB

Given an iid data set D = (x_1, …, x_n), if the model is CE then:
(1) Q_θ(θ) is also conjugate, i.e.
    Q_θ(θ) ∝ P(θ) exp ⟨Σ_i log P(z_i, x_i|θ)⟩_{Q_Z} = h(ν̃, τ̃) g(θ)^ν̃ e^{φ(θ)ᵀ τ̃}
with ν̃ = ν + n and τ̃ = τ + Σ_i ⟨T(z_i, x_i)⟩_{Q_Z} → we only need to track ν̃ and τ̃.
(2) Q_Z(Z) = ∏_{i=1}^n Q_{z_i}(z_i) takes the same form as in the E step of regular EM:
    Q_{z_i}(z_i) ∝ exp ⟨log P(z_i, x_i|θ)⟩_{Q_θ} = f(z_i, x_i) e^{φ̄ᵀ T(z_i, x_i)} = P(z_i | x_i, φ̄)
with expected natural parameters φ̄ = ⟨φ(θ)⟩_{Q_θ} → inference unchanged from regular EM.

The Variational Bayesian EM algorithm
EM for MAP estimation. Goal: maximise P(θ|X, m) wrt θ.
    E step: compute Q_Z(Z) ← P(Z|X, θ).
    M step: θ ← argmax_θ ∫ dZ Q_Z(Z) log P(Z, X, θ).
Variational Bayesian EM. Goal: maximise the bound on P(X|m) wrt Q_θ, Q_Z.
    VB-E step: compute Q_Z(Z) ← P(Z|X, φ̄).
    VB-M step: Q_θ(θ) ∝ exp ∫ dZ Q_Z(Z) log P(Z, X, θ).
Properties:
- Reduces to the EM algorithm if Q_θ(θ) = δ(θ − θ*).
- F_m increases monotonically, and incorporates the model complexity penalty.
- Analytical parameter distributions (but not constrained to be Gaussian).
- The VB-E step has the same complexity as the corresponding E step: we can use the junction tree, belief propagation, Kalman filter, etc. algorithms in the VB-E step of VB-EM, but using expected natural parameters φ̄.

VB and model selection
Variational Bayesian EM yields an approximate posterior Q_θ over model parameters. It also yields an optimised lower bound on the model evidence:
    max F_M(Q_Z, Q_θ) ≤ log P(D|M).
These lower bounds can be compared amongst models to learn the right (structure, connectivity, … of the) model. If a continuous domain of models is specified by a hyperparameter η, then the VB free energy depends on that parameter:
    F(Q_Z, Q_θ, η) = ∫ dZ dθ Q_Z(Z) Q_θ(θ) log [P(X, Z, θ|η) / (Q_Z(Z) Q_θ(θ))] ≤ log P(X|η).
A hyper-M step maximises the current bound wrt η:
    η ← argmax_η ∫ dZ dθ Q_Z(Z) Q_θ(θ) log P(X, Z, θ|η).

ARD for unsupervised learning
Recall that ARD (automatic relevance determination) was a hyperparameter method to select relevant or useful inputs in regression. A similar idea, used with variational Bayesian methods, can learn a latent dimensionality.
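The "only track ν̃, τ̃" bookkeeping is concrete in even the simplest CE example. A sketch (invented for illustration, not from the slides, assuming numpy and scipy): VB-EM for the mixing weights π of a two-component, unit-variance Gaussian mixture with the component means held fixed and a conjugate Dirichlet(α₀) prior on π. Q_θ stays Dirichlet, so the VB-M step just updates pseudo-counts, and the VB-E step is ordinary responsibility computation with the expected natural parameters ⟨log π_k⟩ in place of log π_k.

```python
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 50)])
mu = np.array([-2.0, 2.0])          # fixed component means (known, for simplicity)
alpha0 = np.array([1.0, 1.0])       # Dirichlet prior pseudo-counts

alpha = alpha0.copy()
for _ in range(20):
    # VB-E step: same form as EM's E step, but with the *expected* natural
    # parameters <log pi_k> = psi(alpha_k) - psi(sum alpha) replacing log pi_k
    Elogpi = digamma(alpha) - digamma(alpha.sum())
    logr = Elogpi - 0.5 * (x[:, None] - mu) ** 2
    r = np.exp(logr - logr.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # VB-M step: conjugate update -- only the pseudo-counts need tracking
    alpha = alpha0 + r.sum(axis=0)
```

Note that the posterior pseudo-counts always sum to the prior counts plus n, the "number of pseudo-observations" interpretation from the conjugacy slide.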
Consider factor analysis:
    x_i ~ N(Λ z_i, Ψ),   z_i ~ N(0, I),
with a column-wise prior Λ_{·j} ~ N(0, α_j⁻¹ I). The VB free energy is
    F(Q_Z(Z), Q_Λ(Λ), Ψ, α) = ⟨log P(X, Z|Λ, Ψ) + log P(Λ|α) + log P(Ψ)⟩_{Q_Z Q_Λ} + …
and so hyperparameter optimisation requires
    α ← argmax_α ⟨log P(Λ|α)⟩_{Q_Λ}.
Now Q_Λ is Gaussian, with the same form as in linear regression, but with expected moments of z appearing in place of the inputs. Optimisation wrt the distributions, Ψ and α in turn causes some α_j to diverge, as in regression ARD. In this case, these parameters select the relevant latent dimensions, effectively learning the dimensionality of z.

Augmented Variational Methods

In our examples so far, the approximate variational distribution has been over the natural latent variables (and parameters) of the generative model. Sometimes it may be useful to introduce additional latent variables, solely to achieve computational tractability. Two examples are GP regression and the GPLVM.

Sparse GP approximations
GP predictions:
    y*|X, Y, x* ~ N( K_{x*X} (K_{XX} + σ²I)⁻¹ Y,  K_{x*x*} − K_{x*X} (K_{XX} + σ²I)⁻¹ K_{Xx*} + σ² ).
Evidence (for learning kernel hyperparameters):
    log P(Y|X) = −½ log |2π(K_{XX} + σ²I)| − ½ Yᵀ (K_{XX} + σ²I)⁻¹ Y.
Computing either form requires inverting the N×N matrix K_{XX} + σ²I, in O(N³) time. One proposal to make this more efficient is to find (or select) a smaller set of possibly fictitious measurements U at inputs Z such that
    P(y*|Z, U, x*) ≈ P(y*|X, Y, x*).
What values should U and Z take?

Variational sparse GP approximations
Write F for the (smooth) GP function values that underlie Y (so Y ~ N(F, σ²I)). Introduce latent measurements U at inputs Z (and integrate over U). The likelihood can be written
    P(Y|X) = ∫ dF dU P(Y, F, U|X, Z) = ∫ dF dU P(Y|F) P(F|U, X, Z) P(U|Z).
Now, both U and F are latent, so we introduce a variational distribution q(F, U) to form a free energy:
    F(q(F, U), θ) = ⟨log [P(Y|F) P(F|U, X, Z) P(U|Z) / q(F, U)]⟩_{q(F, U)}.
Now, choose the variational form q(F, U) = P(F|U, X, Z) q(U); that is, fix F|U without reference to Y, so information about Y will need to be compressed into q(U). Then
    F(q(U), θ, Z) = ⟨ ⟨log P(Y|F)⟩_{P(F|U)} + log P(U|Z) − log q(U) ⟩_{q(U)}.
Now P(F|U) is fixed by the generative model (rather than being subject to free optimisation), so we can evaluate that expectation:
    ⟨log P(Y|F)⟩_{P(F|U)} = −½ log |2πσ²I| − (1/2σ²) tr[(Y − F)(Y − F)ᵀ]⟩_{P(F|U)}
                          = −½ log |2πσ²I| − (1/2σ²) tr[(Y − ⟨F⟩_{P(F|U)})(Y − ⟨F⟩_{P(F|U)})ᵀ] − (1/2σ²) tr[Σ_{F|U}]
                          = log N(Y | K_{XZ} K_{ZZ}⁻¹ U, σ²I) − (1/2σ²) tr[K_{XX} − K_{XZ} K_{ZZ}⁻¹ K_{ZX}].
So
    F(q(U), θ, Z) = ⟨log N(Y | K_{XZ} K_{ZZ}⁻¹ U, σ²I) + log P(U|Z) − log q(U)⟩_{q(U)} − (1/2σ²) tr[K_{XX} − K_{XZ} K_{ZZ}⁻¹ K_{ZX}].
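The collapsed form of this bound (derived on the next slide by setting q(U) to the PPCA-like posterior) is easy to evaluate and compare against the exact evidence. A small numerical sketch (the data, RBF kernel and inducing-input placement are invented for illustration) confirming that the variational expression really is a lower bound on log P(Y|X):

```python
import numpy as np

def rbf(A, B, ell=1.0, sf=1.0):
    # squared-exponential kernel on 1-D inputs
    d2 = (A[:, None] - B[None, :]) ** 2
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

rng = np.random.default_rng(0)
N, M, s2 = 60, 10, 0.1
X = np.sort(rng.uniform(-3, 3, N))
Y = np.sin(X) + rng.normal(0, np.sqrt(s2), N)
Z = np.linspace(-3, 3, M)                    # inducing inputs

Kxx = rbf(X, X)
Kxz = rbf(X, Z)
Kzz = rbf(Z, Z) + 1e-8 * np.eye(M)           # jitter for numerical stability

def log_gauss(y, S):
    # log N(y | 0, S) via Cholesky
    L = np.linalg.cholesky(S)
    a = np.linalg.solve(L, y)
    return -0.5 * (y.size * np.log(2 * np.pi) + 2 * np.log(np.diag(L)).sum() + a @ a)

# Exact log marginal likelihood: log N(Y | 0, Kxx + s2 I) -- O(N^3)
exact = log_gauss(Y, Kxx + s2 * np.eye(N))

# Collapsed variational bound:
# log N(Y | 0, Kxz Kzz^-1 Kzx + s2 I) - (1/2s2) tr[Kxx - Kxz Kzz^-1 Kzx]
Qxx = Kxz @ np.linalg.solve(Kzz, Kxz.T)
bound = log_gauss(Y, Qxx + s2 * np.eye(N)) - 0.5 / s2 * np.trace(Kxx - Qxx)
```

In practice the bound is then optimised numerically over Z and the kernel hyperparameters, which is exactly the role of the hyper-M-style steps described above.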

Variational sparse GP approximations (continued)

    F(q(U), θ, Z) = ⟨log N(Y | K_{XZ} K_{ZZ}⁻¹ U, σ²I) + log P(U|Z) − log q(U)⟩_{q(U)} − (1/2σ²) tr[K_{XX} − K_{XZ} K_{ZZ}⁻¹ K_{ZX}].
The expectation is the free energy of a PPCA-like model with normal prior U ~ N(0, K_{ZZ}) and loading matrix K_{XZ} K_{ZZ}⁻¹. The maximum of this free energy is that model's log likelihood (achieved with q equal to the posterior under the PPCA-like model). This gives
    F(q*(U), θ, Z) = log N(Y | 0, K_{XZ} K_{ZZ}⁻¹ K_{ZX} + σ²I) − (1/2σ²) tr[K_{XX} − K_{XZ} K_{ZZ}⁻¹ K_{ZX}].
Note that we have eliminated all inverses of K_{XX}. We can optimise the free energy numerically with respect to Z and θ, to adjust the GP prior and the quality of the variational approximation.
A similar approach can be used to learn the X if they are unobserved (i.e. in the GPLVM). Assume q(X, F, U) = q(X) P(F|X, U) q(U). Then
    F = ⟨log P(Y, F, U|X) + log P(X) − log q(X, F, U)⟩_{q(U) q(X) P(F|X, U)},
which simplifies into tractable components in much the same way as above.

A few references
- Jordan, Ghahramani, Jaakkola and Saul, 1999. An introduction to variational methods for graphical models. Machine Learning 37:183–233.
- Attias, 2000. A variational Bayesian framework for graphical models. NIPS 12.
- Beal, 2003. Variational algorithms for approximate Bayesian inference. PhD thesis, Gatsby Unit, UCL.
- Winn, 2004. Variational message passing and its applications. PhD thesis, Cambridge. http://johnwinn.org; also the VIBES software for conjugate-exponential graphs.
Some complexities:
- MacKay, 2001. A problem with variational free energy minimization.
- Turner and Sahani, 2011. Two problems with variational expectation maximisation for time-series models. In Barber, Cemgil and Chiappa, eds., Bayesian Time Series Models.
- Berkes, Turner and Sahani, 2008. On sparsity and overcompleteness in image models. NIPS 20.
- Giordano, Broderick and Jordan, 2015. Linear response methods for accurate covariance estimates from mean field variational Bayes. NIPS 28.


More information

Solution in semi infinite diffusion couples (error function analysis)

Solution in semi infinite diffusion couples (error function analysis) Soluon n sem nfne dffuson couples (error funcon analyss) Le us consder now he sem nfne dffuson couple of wo blocks wh concenraon of and I means ha, n a A- bnary sysem, s bondng beween wo blocks made of

More information

Department of Economics University of Toronto

Department of Economics University of Toronto Deparmen of Economcs Unversy of Torono ECO408F M.A. Economercs Lecure Noes on Heeroskedascy Heeroskedascy o Ths lecure nvolves lookng a modfcaons we need o make o deal wh he regresson model when some of

More information

Graduate Macroeconomics 2 Problem set 5. - Solutions

Graduate Macroeconomics 2 Problem set 5. - Solutions Graduae Macroeconomcs 2 Problem se. - Soluons Queson 1 To answer hs queson we need he frms frs order condons and he equaon ha deermnes he number of frms n equlbrum. The frms frs order condons are: F K

More information

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas)

Lecture 18: The Laplace Transform (See Sections and 14.7 in Boas) Lecure 8: The Lalace Transform (See Secons 88- and 47 n Boas) Recall ha our bg-cure goal s he analyss of he dfferenal equaon, ax bx cx F, where we emloy varous exansons for he drvng funcon F deendng on

More information

New M-Estimator Objective Function. in Simultaneous Equations Model. (A Comparative Study)

New M-Estimator Objective Function. in Simultaneous Equations Model. (A Comparative Study) Inernaonal Mahemacal Forum, Vol. 8, 3, no., 7 - HIKARI Ld, www.m-hkar.com hp://dx.do.org/.988/mf.3.3488 New M-Esmaor Objecve Funcon n Smulaneous Equaons Model (A Comparave Sudy) Ahmed H. Youssef Professor

More information

FI 3103 Quantum Physics

FI 3103 Quantum Physics /9/4 FI 33 Quanum Physcs Aleander A. Iskandar Physcs of Magnesm and Phooncs Research Grou Insu Teknolog Bandung Basc Conces n Quanum Physcs Probably and Eecaon Value Hesenberg Uncerany Prncle Wave Funcon

More information

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press,

Lecture Slides for INTRODUCTION TO. Machine Learning. ETHEM ALPAYDIN The MIT Press, Lecure Sldes for INTRDUCTIN T Machne Learnng ETHEM ALAYDIN The MIT ress, 2004 alpaydn@boun.edu.r hp://www.cmpe.boun.edu.r/~ehem/2ml CHATER 3: Hdden Marov Models Inroducon Modelng dependences n npu; no

More information

Part II CONTINUOUS TIME STOCHASTIC PROCESSES

Part II CONTINUOUS TIME STOCHASTIC PROCESSES Par II CONTINUOUS TIME STOCHASTIC PROCESSES 4 Chaper 4 For an advanced analyss of he properes of he Wener process, see: Revus D and Yor M: Connuous marngales and Brownan Moon Karazas I and Shreve S E:

More information

Lecture 11 SVM cont

Lecture 11 SVM cont Lecure SVM con. 0 008 Wha we have done so far We have esalshed ha we wan o fnd a lnear decson oundary whose margn s he larges We know how o measure he margn of a lnear decson oundary Tha s: he mnmum geomerc

More information

グラフィカルモデルによる推論 確率伝搬法 (2) Kenji Fukumizu The Institute of Statistical Mathematics 計算推論科学概論 II (2010 年度, 後期 )

グラフィカルモデルによる推論 確率伝搬法 (2) Kenji Fukumizu The Institute of Statistical Mathematics 計算推論科学概論 II (2010 年度, 後期 ) グラフィカルモデルによる推論 確率伝搬法 Kenj Fukuzu he Insue of Sascal Maheacs 計算推論科学概論 II 年度 後期 Inference on Hdden Markov Model Inference on Hdden Markov Model Revew: HMM odel : hdden sae fne Inference Coue... for any Naïve

More information

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems

The Finite Element Method for the Analysis of Non-Linear and Dynamic Systems Swss Federal Insue of Page 1 The Fne Elemen Mehod for he Analyss of Non-Lnear and Dynamc Sysems Prof. Dr. Mchael Havbro Faber Dr. Nebojsa Mojslovc Swss Federal Insue of ETH Zurch, Swzerland Mehod of Fne

More information

Density Matrix Description of NMR BCMB/CHEM 8190

Density Matrix Description of NMR BCMB/CHEM 8190 Densy Marx Descrpon of NMR BCMBCHEM 89 Operaors n Marx Noaon Alernae approach o second order specra: ask abou x magnezaon nsead of energes and ranson probables. If we say wh one bass se, properes vary

More information

Dual Approximate Dynamic Programming for Large Scale Hydro Valleys

Dual Approximate Dynamic Programming for Large Scale Hydro Valleys Dual Approxmae Dynamc Programmng for Large Scale Hydro Valleys Perre Carpener and Jean-Phlppe Chanceler 1 ENSTA ParsTech and ENPC ParsTech CMM Workshop, January 2016 1 Jon work wh J.-C. Alas, suppored

More information

Volatility Interpolation

Volatility Interpolation Volaly Inerpolaon Prelmnary Verson March 00 Jesper Andreasen and Bran Huge Danse Mares, Copenhagen wan.daddy@danseban.com brno@danseban.com Elecronc copy avalable a: hp://ssrn.com/absrac=69497 Inro Local

More information

Hidden Markov Models

Hidden Markov Models 11-755 Machne Learnng for Sgnal Processng Hdden Markov Models Class 15. 12 Oc 2010 1 Admnsrva HW2 due Tuesday Is everyone on he projecs page? Where are your projec proposals? 2 Recap: Wha s an HMM Probablsc

More information

. The geometric multiplicity is dim[ker( λi. A )], i.e. the number of linearly independent eigenvectors associated with this eigenvalue.

. The geometric multiplicity is dim[ker( λi. A )], i.e. the number of linearly independent eigenvectors associated with this eigenvalue. Mah E-b Lecure #0 Noes We connue wh he dscusson of egenvalues, egenvecors, and dagonalzably of marces We wan o know, n parcular wha condons wll assure ha a marx can be dagonalzed and wha he obsrucons are

More information

Computing Relevance, Similarity: The Vector Space Model

Computing Relevance, Similarity: The Vector Space Model Compung Relevance, Smlary: The Vecor Space Model Based on Larson and Hears s sldes a UC-Bereley hp://.sms.bereley.edu/courses/s0/f00/ aabase Managemen Sysems, R. Ramarshnan ocumen Vecors v ocumens are

More information

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method

Single-loop System Reliability-Based Design & Topology Optimization (SRBDO/SRBTO): A Matrix-based System Reliability (MSR) Method 10 h US Naonal Congress on Compuaonal Mechancs Columbus, Oho 16-19, 2009 Sngle-loop Sysem Relably-Based Desgn & Topology Opmzaon (SRBDO/SRBTO): A Marx-based Sysem Relably (MSR) Mehod Tam Nguyen, Junho

More information

Tools for Analysis of Accelerated Life and Degradation Test Data

Tools for Analysis of Accelerated Life and Degradation Test Data Acceleraed Sress Tesng and Relably Tools for Analyss of Acceleraed Lfe and Degradaon Tes Daa Presened by: Reuel Smh Unversy of Maryland College Park smhrc@umd.edu Sepember-5-6 Sepember 28-30 206, Pensacola

More information

Panel Data Regression Models

Panel Data Regression Models Panel Daa Regresson Models Wha s Panel Daa? () Mulple dmensoned Dmensons, e.g., cross-secon and me node-o-node (c) Pongsa Pornchawseskul, Faculy of Economcs, Chulalongkorn Unversy (c) Pongsa Pornchawseskul,

More information

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD

HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Journal of Appled Mahemacs and Compuaonal Mechancs 3, (), 45-5 HEAT CONDUCTION PROBLEM IN A TWO-LAYERED HOLLOW CYLINDER BY USING THE GREEN S FUNCTION METHOD Sansław Kukla, Urszula Sedlecka Insue of Mahemacs,

More information

Notes on the stability of dynamic systems and the use of Eigen Values.

Notes on the stability of dynamic systems and the use of Eigen Values. Noes on he sabl of dnamc ssems and he use of Egen Values. Source: Macro II course noes, Dr. Davd Bessler s Tme Seres course noes, zarads (999) Ineremporal Macroeconomcs chaper 4 & Techncal ppend, and Hamlon

More information

GMM parameter estimation. Xiaoye Lu CMPS290c Final Project

GMM parameter estimation. Xiaoye Lu CMPS290c Final Project GMM paraeer esaon Xaoye Lu M290c Fnal rojec GMM nroducon Gaussan ure Model obnaon of several gaussan coponens Noaon: For each Gaussan dsrbuon:, s he ean and covarance ar. A GMM h ures(coponens): p ( 2π

More information

January Examinations 2012

January Examinations 2012 Page of 5 EC79 January Examnaons No. of Pages: 5 No. of Quesons: 8 Subjec ECONOMICS (POSTGRADUATE) Tle of Paper EC79 QUANTITATIVE METHODS FOR BUSINESS AND FINANCE Tme Allowed Two Hours ( hours) Insrucons

More information

Normal Random Variable and its discriminant functions

Normal Random Variable and its discriminant functions Noral Rando Varable and s dscrnan funcons Oulne Noral Rando Varable Properes Dscrnan funcons Why Noral Rando Varables? Analycally racable Works well when observaon coes for a corruped snle prooype 3 The

More information

CH.3. COMPATIBILITY EQUATIONS. Continuum Mechanics Course (MMC) - ETSECCPB - UPC

CH.3. COMPATIBILITY EQUATIONS. Continuum Mechanics Course (MMC) - ETSECCPB - UPC CH.3. COMPATIBILITY EQUATIONS Connuum Mechancs Course (MMC) - ETSECCPB - UPC Overvew Compably Condons Compably Equaons of a Poenal Vecor Feld Compably Condons for Infnesmal Srans Inegraon of he Infnesmal

More information

Density Matrix Description of NMR BCMB/CHEM 8190

Density Matrix Description of NMR BCMB/CHEM 8190 Densy Marx Descrpon of NMR BCMBCHEM 89 Operaors n Marx Noaon If we say wh one bass se, properes vary only because of changes n he coeffcens weghng each bass se funcon x = h< Ix > - hs s how we calculae

More information

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5

[ ] 2. [ ]3 + (Δx i + Δx i 1 ) / 2. Δx i-1 Δx i Δx i+1. TPG4160 Reservoir Simulation 2018 Lecture note 3. page 1 of 5 TPG460 Reservor Smulaon 08 page of 5 DISCRETIZATIO OF THE FOW EQUATIOS As we already have seen, fne dfference appromaons of he paral dervaves appearng n he flow equaons may be obaned from Taylor seres

More information

P R = P 0. The system is shown on the next figure:

P R = P 0. The system is shown on the next figure: TPG460 Reservor Smulaon 08 page of INTRODUCTION TO RESERVOIR SIMULATION Analycal and numercal soluons of smple one-dmensonal, one-phase flow equaons As an nroducon o reservor smulaon, we wll revew he smples

More information

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair

Performance Analysis for a Network having Standby Redundant Unit with Waiting in Repair TECHNI Inernaonal Journal of Compung Scence Communcaon Technologes VOL.5 NO. July 22 (ISSN 974-3375 erformance nalyss for a Nework havng Sby edundan Un wh ang n epar Jendra Sngh 2 abns orwal 2 Deparmen

More information

Clustering with Gaussian Mixtures

Clustering with Gaussian Mixtures Noe o oher eachers and users of hese sldes. Andrew would be delghed f you found hs source maeral useful n gvng your own lecures. Feel free o use hese sldes verbam, or o modfy hem o f your own needs. PowerPon

More information

CHAPTER 7: CLUSTERING

CHAPTER 7: CLUSTERING CHAPTER 7: CLUSTERING Semparamerc Densy Esmaon 3 Paramerc: Assume a snge mode for p ( C ) (Chapers 4 and 5) Semparamerc: p ( C ) s a mure of denses Mupe possbe epanaons/prooypes: Dfferen handwrng syes,

More information

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore.

This document is downloaded from DR-NTU, Nanyang Technological University Library, Singapore. Ths documen s downloaded from DR-NTU, Nanyang Technologcal Unversy Lbrary, Sngapore. Tle A smplfed verb machng algorhm for word paron n vsual speech processng( Acceped verson ) Auhor(s) Foo, Say We; Yong,

More information

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering

CS 536: Machine Learning. Nonparametric Density Estimation Unsupervised Learning - Clustering CS 536: Machne Learnng Nonparamerc Densy Esmaon Unsupervsed Learnng - Cluserng Fall 2005 Ahmed Elgammal Dep of Compuer Scence Rugers Unversy CS 536 Densy Esmaon - Cluserng - 1 Oulnes Densy esmaon Nonparamerc

More information

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s

Ordinary Differential Equations in Neuroscience with Matlab examples. Aim 1- Gain understanding of how to set up and solve ODE s Ordnary Dfferenal Equaons n Neuroscence wh Malab eamples. Am - Gan undersandng of how o se up and solve ODE s Am Undersand how o se up an solve a smple eample of he Hebb rule n D Our goal a end of class

More information

Bayesian Inference of the GARCH model with Rational Errors

Bayesian Inference of the GARCH model with Rational Errors 0 Inernaonal Conference on Economcs, Busness and Markeng Managemen IPEDR vol.9 (0) (0) IACSIT Press, Sngapore Bayesan Inference of he GARCH model wh Raonal Errors Tesuya Takash + and Tng Tng Chen Hroshma

More information

Supervised Learning in Multilayer Networks

Supervised Learning in Multilayer Networks Copyrgh Cambrdge Unversy Press 23. On-screen vewng permed. Prnng no permed. hp://www.cambrdge.org/521642981 You can buy hs book for 3 pounds or $5. See hp://www.nference.phy.cam.ac.uk/mackay/la/ for lnks.

More information

Advanced time-series analysis (University of Lund, Economic History Department)

Advanced time-series analysis (University of Lund, Economic History Department) Advanced me-seres analss (Unvers of Lund, Economc Hsor Dearmen) 3 Jan-3 Februar and 6-3 March Lecure 4 Economerc echnues for saonar seres : Unvarae sochasc models wh Box- Jenns mehodolog, smle forecasng

More information

Foundations of State Estimation Part II

Foundations of State Estimation Part II Foundaons of Sae Esmaon Par II Tocs: Hdden Markov Models Parcle Flers Addonal readng: L.R. Rabner, A uoral on hdden Markov models," Proceedngs of he IEEE, vol. 77,. 57-86, 989. Sequenal Mone Carlo Mehods

More information

Existence and Uniqueness Results for Random Impulsive Integro-Differential Equation

Existence and Uniqueness Results for Random Impulsive Integro-Differential Equation Global Journal of Pure and Appled Mahemacs. ISSN 973-768 Volume 4, Number 6 (8), pp. 89-87 Research Inda Publcaons hp://www.rpublcaon.com Exsence and Unqueness Resuls for Random Impulsve Inegro-Dfferenal

More information

First-order piecewise-linear dynamic circuits

First-order piecewise-linear dynamic circuits Frs-order pecewse-lnear dynamc crcus. Fndng he soluon We wll sudy rs-order dynamc crcus composed o a nonlnear resse one-por, ermnaed eher by a lnear capacor or a lnear nducor (see Fg.. Nonlnear resse one-por

More information

Digital Speech Processing Lecture 20. The Hidden Markov Model (HMM)

Digital Speech Processing Lecture 20. The Hidden Markov Model (HMM) Dgal Speech Processng Lecure 20 The Hdden Markov Model (HMM) Lecure Oulne Theory of Markov Models dscree Markov processes hdden Markov processes Soluons o he Three Basc Problems of HMM s compuaon of observaon

More information

A HIERARCHICAL KALMAN FILTER

A HIERARCHICAL KALMAN FILTER A HIERARCHICAL KALMAN FILER Greg aylor aylor Fry Consulng Acuares Level 8, 3 Clarence Sree Sydney NSW Ausrala Professoral Assocae, Cenre for Acuaral Sudes Faculy of Economcs and Commerce Unversy of Melbourne

More information

Comb Filters. Comb Filters

Comb Filters. Comb Filters The smple flers dscussed so far are characered eher by a sngle passband and/or a sngle sopband There are applcaons where flers wh mulple passbands and sopbands are requred Thecomb fler s an example of

More information

Modélisation de la détérioration basée sur les données de surveillance conditionnelle et estimation de la durée de vie résiduelle

Modélisation de la détérioration basée sur les données de surveillance conditionnelle et estimation de la durée de vie résiduelle Modélsaon de la dééroraon basée sur les données de survellance condonnelle e esmaon de la durée de ve résduelle T. T. Le, C. Bérenguer, F. Chaelan Unv. Grenoble Alpes, GIPSA-lab, F-38000 Grenoble, France

More information

HARMONIC MARKOV SWITCHING AUTOREGRESSIVE MODELS FOR AIR POLLUTION ANALYSIS

HARMONIC MARKOV SWITCHING AUTOREGRESSIVE MODELS FOR AIR POLLUTION ANALYSIS HARMONIC MARKOV SWITCHING AUTOREGRESSIVE MODELS FOR AIR POLLUTION ANALYSIS ROBERTA PAROLI Isuo d Sasca, Unversà Caolca S.C. d Mlano LUIGI SPEZIA Deparmen of Sascs, Ahens Unversy of Economcs and Busness

More information

Introduction to Boosting

Introduction to Boosting Inroducon o Boosng Cynha Rudn PACM, Prnceon Unversy Advsors Ingrd Daubeches and Rober Schapre Say you have a daabase of news arcles, +, +, -, -, +, +, -, -, +, +, -, -, +, +, -, + where arcles are labeled

More information

Mean field approximation for PDE-Markov random field models in image analysis

Mean field approximation for PDE-Markov random field models in image analysis Proceedngs of he 6h WSEAS Inernaonal Conference on Applcaons of Elecrcal Engneerng, Isanbul, Turkey, May 7-9, 007 34 Mean feld approxmaon for PDE-Markov random feld models n mage analyss S. ZIMERAS Unversy

More information

CHAPTER FOUR REPEATED MEASURES IN TOXICITY TESTING

CHAPTER FOUR REPEATED MEASURES IN TOXICITY TESTING CHAPTER FOUR REPEATED MEASURES IN TOXICITY TESTING 4. Inroducon The repeaed measures sudy s a very commonly used expermenal desgn n oxcy esng because no only allows one o nvesgae he effecs of he oxcans,

More information

Supplementary Material to: IMU Preintegration on Manifold for E cient Visual-Inertial Maximum-a-Posteriori Estimation

Supplementary Material to: IMU Preintegration on Manifold for E cient Visual-Inertial Maximum-a-Posteriori Estimation Supplemenary Maeral o: IMU Prenegraon on Manfold for E cen Vsual-Ineral Maxmum-a-Poseror Esmaon echncal Repor G-IRIM-CP&R-05-00 Chrsan Forser, Luca Carlone, Fran Dellaer, and Davde Scaramuzza May 0, 05

More information

Time-interval analysis of β decay. V. Horvat and J. C. Hardy

Time-interval analysis of β decay. V. Horvat and J. C. Hardy Tme-nerval analyss of β decay V. Horva and J. C. Hardy Work on he even analyss of β decay [1] connued and resuled n he developmen of a novel mehod of bea-decay me-nerval analyss ha produces hghly accurae

More information

Sampling Procedure of the Sum of two Binary Markov Process Realizations

Sampling Procedure of the Sum of two Binary Markov Process Realizations Samplng Procedure of he Sum of wo Bnary Markov Process Realzaons YURY GORITSKIY Dep. of Mahemacal Modelng of Moscow Power Insue (Techncal Unversy), Moscow, RUSSIA, E-mal: gorsky@yandex.ru VLADIMIR KAZAKOV

More information

doi: info:doi/ /

doi: info:doi/ / do: nfo:do/0.063/.322393 nernaonal Conference on Power Conrol and Opmzaon, Bal, ndonesa, -3, June 2009 A COLOR FEATURES-BASED METHOD FOR OBJECT TRACKNG EMPLOYNG A PARTCLE FLTER ALGORTHM Bud Sugand, Hyoungseop

More information

Computer Robot Vision Conference 2010

Computer Robot Vision Conference 2010 School of Compuer Scence McGll Unversy Compuer Robo Vson Conference 2010 Ioanns Rekles Fundamenal Problems In Robocs How o Go From A o B? (Pah Plannng) Wha does he world looks lke? (mappng) sense from

More information