STOCHASTIC SIMULATION FOR BLOCKED DATA: Monte Carlo simulation, Rejection sampling, Importance sampling, Markov chain Monte Carlo
- Domenic Lawson
Stochastic System Analysis and Bayesian Model Updating

Monte Carlo simulation

Introduction: If we know how to directly sample from $f(x)$ to obtain i.i.d. samples $\{X_i : i = 1, \dots, N\}$, then according to the Law of Large Numbers:

$$E_{f(x)}[g(X)] \approx \hat{g}_{MCS} = \frac{1}{N}\sum_{i=1}^{N} g(X_i)$$

Procedure: skipped since it is trivial.

Analysis:

1. The MCS estimator $\hat{g}_{MCS}$ is unbiased, and its variance decays at the $1/N$ rate:

$$E[\hat{g}_{MCS}] = \frac{1}{N}\sum_{i=1}^{N} E_{f(x)}[g(X_i)] = E_{f(x)}[g(X)]$$

$$Var[\hat{g}_{MCS}] = \frac{1}{N^2}\sum_{i=1}^{N} Var_{f(x)}[g(X_i)] = \frac{Var_{f(x)}[g(X)]}{N}$$

2. Confidence interval. By the Central Limit Theorem,

$$\hat{g}_{MCS} = \frac{1}{N}\sum_{i=1}^{N} g(X_i) \;\longrightarrow\; Normal\!\left(E_{f(x)}[g(X)],\ \frac{Var_{f(x)}[g(X)]}{N}\right)$$

To build a confidence interval, we need $Var_{f(x)}[g(X)]$, which can be estimated simply as the sample variance of $\{g(X_i) : i = 1,\dots,N\}$, i.e.

$$\widehat{Var}_{f(x)}[g(X)] = \frac{1}{N}\sum_{i=1}^{N}\left(g(X_i) - \hat{g}_{MCS}\right)^2$$

Therefore, the following 95.4% (two-sigma) confidence interval for the Gaussian can be built:
$$\hat{g}_{MCS} - 2\sqrt{\frac{\widehat{Var}_{f(x)}[g(X)]}{N}} \;\le\; E_{f(x)}[g(X)] \;\le\; \hat{g}_{MCS} + 2\sqrt{\frac{\widehat{Var}_{f(x)}[g(X)]}{N}}$$

Remarks:

1. "If we know how to directly sample from $f(x)$" is a very strict premise, since it usually requires knowledge of $F^{-1}$, the inverse of the cumulative distribution function, which is available to us only for certain standard probability density functions, e.g. uniform, Gaussian, Gamma, etc. In the case that we know $F^{-1}$, we can sample from $f(x)$ using the following steps: draw $U \sim \text{unif}[0,1]$ and let $X = F^{-1}(U)$. It can be shown that $X \sim f(x)$.

Proof: $P(X \le x) = P(F^{-1}(U) \le x) = P(U \le F(x)) = F(x)$.

2. The accuracy of MCS does not depend on the dimension of $X$: $\hat{g}_{MCS}$ is robust against the dimension of $X$, since $Var[\hat{g}_{MCS}] = Var_{f(x)}[g(X)]/N$ regardless of that dimension.
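The two ideas above (inverse-transform sampling from remark 1, and the MCS estimate with its two-sigma interval) can be sketched in a few lines. This is a hypothetical example of mine, not from the notes: the exponential distribution has the closed-form inverse CDF $F^{-1}(u) = -\ln(1-u)/\lambda$, so we can draw i.i.d. samples and estimate $E[X]$.

```python
import math
import random

def sample_exponential(lam, rng):
    # Inverse-transform sampling (remark 1): F(x) = 1 - exp(-lam*x),
    # so X = F^{-1}(U) = -ln(1 - U)/lam with U ~ unif[0, 1].
    u = rng.random()
    return -math.log(1.0 - u) / lam

def mcs_estimate(g, sampler, n, rng):
    """Plain Monte Carlo: sample mean of g, plus a 95.4% (two-sigma)
    confidence half-width built from the sample variance."""
    vals = [g(sampler(rng)) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / n
    return mean, 2.0 * math.sqrt(var / n)

rng = random.Random(0)
# Estimate E[X] for Exponential(lam=2), whose true mean is 0.5.
est, hw = mcs_estimate(lambda x: x, lambda r: sample_exponential(2.0, r), 20000, rng)
```

With $N = 20000$ the half-width is below 0.01, so the interval $[\hat{g} - hw,\ \hat{g} + hw]$ should cover the true mean 0.5 most of the time.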
Rejection sampling

Introduction:
Suppose we don't know how to sample $f(x)$ directly, but we do know how to evaluate $f(x)$ up to a constant: $f(x) = a \cdot h(x)$, where $a$ is an unknown constant and $h(x)$ is a function we know how to calculate. Note that this is the usual situation for Bayesian analysis, where the posterior PDF has the following form:

$$f(x|Y) = \frac{f(Y|x)\,f(x)}{f(Y)} \propto \underbrace{f(Y|x)\,f(x)}_{h(x)}, \qquad a = \frac{1}{f(Y)}$$

Assume that we know how to sample and evaluate a chosen PDF $q(x)$.

Procedure:
1. Let $K$ be a number such that $K\,q(x) \ge h(x)$ for all $x$.
2. Draw a candidate $X^C \sim q(x)$.
3. Accept $X = X^C$ with probability $\dfrac{h(X^C)}{K\,q(X^C)}$.
4. Cycle 2–3 until we get enough accepted samples.

Analysis: same as Monte Carlo simulation, except that $N$ should be the number of accepted samples.

Remarks:
1. Different from MCMC: when a sample is rejected, we do not repeat the previous sample.
2. The number $K$ can be solved for as follows: let $x^* = \arg\max_x \dfrac{h(x)}{q(x)}$ and $K = \dfrac{h(x^*)}{q(x^*)}$, i.e. solving for $K$ usually requires optimization.
3. How do we do Step 3? Let $U \sim \text{unif}[0,1]$ and accept $X = X^C$ if $U \le \dfrac{h(X^C)}{K\,q(X^C)}$.
4. If the shapes of $q(x)$ and $h(x)$ are very different, the efficiency of rejection
sampling will be poor. However, it is often the case that the shape of $h(x)$ is unknown to us, so it is hard to choose an efficient $q(x)$.
5. When the $X$ dimension is high, rejection sampling can be extremely inefficient. This is because it can be much harder to find a good $q(x)$ in a high-dimensional space.

Adaptive rejection sampling

Suppose $\log h(x)$ is concave.

[Figure: $\log h(x)$ with tangent envelopes $K_1 q_1(x)$, $K_2 q_2(x)$, $K_3 q_3(x)$ in the log space, each satisfying $K_i q_i(x) \ge h(x)$]

Procedure: Let $q_1(x)$ be an exponential PDF used as the initial rejection sampling PDF, chosen so that $K_1 q_1(x)$ is tangent to $h(x)$ in the log space. Draw $X^C \sim q_1(x)$. If $X^C$ is accepted, keep on using $q_1(x)$ to generate the next sample. However, if $X^C$ is rejected, let $q_2(x)$ be a second exponential PDF such that $K_2 q_2(x)$ is tangent to $h(x)$ at $X^C$ in the log space. Generate the next sample using the envelope PDF formed by $K_1 q_1(x)$ and $K_2 q_2(x)$. Continue doing so until we get enough accepted samples.

Remarks:
1. Not applicable to non-log-concave PDFs.
2. Still not good for high-dimensional $X$.
3. There are variants of adaptive rejection sampling that do not require evaluating gradients [1] and do not require log-concavity [2].
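The basic rejection sampling procedure (steps 1–4 above, with the unif[0,1] trick of remark 3) can be sketched as follows. The target here is a hypothetical choice of mine: the unnormalized Beta(2,2) density $h(x) = x(1-x)$ on $[0,1]$ with a uniform proposal, for which $K = \max_x h(x)/q(x) = 1/4$.

```python
import random

def rejection_sample(h, q_sample, q_pdf, K, rng):
    """Draw one sample from the unnormalized density h using proposal q.
    Requires K*q(x) >= h(x) for all x (step 1 of the procedure)."""
    while True:
        xc = q_sample(rng)              # step 2: candidate from q
        u = rng.random()                # step 3: accept w.p. h(xc)/(K q(xc))
        if u <= h(xc) / (K * q_pdf(xc)):
            return xc                   # rejected candidates are simply discarded

# Unnormalized Beta(2,2) density on [0,1]; uniform proposal q(x) = 1.
h = lambda x: x * (1.0 - x)
K = 0.25                                # K = max_x h(x)/q(x), attained at x = 0.5
rng = random.Random(1)
samples = [rejection_sample(h, lambda r: r.random(), lambda x: 1.0, K, rng)
           for _ in range(5000)]
mean = sum(samples) / len(samples)      # Beta(2,2) has mean 0.5
```

Note the normalizing constant of $h$ is never used, only the ratio $h/(Kq)$, which is the whole point of the method.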
Importance sampling

Introduction: Suppose we don't know how to directly sample from $f(x)$, but we know how to evaluate it. We would like to compute the expected value of some function $g(X)$ with respect to $f(x)$, i.e. to compute

$$E_{f(x)}[g(X)] = \int g(x)\,f(x)\,dx$$

Let $q(x)$, called the importance sampling PDF, be a PDF that we know how to sample and evaluate and whose support region contains the support region of $f(x)$. Then

$$E_{f(x)}[g(X)] = \int g(x)\,f(x)\,dx = \int g(x)\,\frac{f(x)}{q(x)}\,q(x)\,dx = E_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right]$$

Procedure: Draw $\{X_i : i = 1,\dots,N\}$ i.i.d. from $q(x)$. According to the Law of Large Numbers, we have

$$E_{f(x)}[g(X)] = E_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right] \approx \hat{g}_{IS} = \frac{1}{N}\sum_{i=1}^{N} g(X_i)\,\frac{f(X_i)}{q(X_i)}$$

Analysis:

1. The IS estimator $\hat{g}_{IS}$ is unbiased and its variance decays at the $1/N$ rate:

$$E[\hat{g}_{IS}] = \frac{1}{N}\sum_{i=1}^{N} E_{q(x)}\!\left[g(X_i)\,\frac{f(X_i)}{q(X_i)}\right] = E_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right] = \int g(x)\,\frac{f(x)}{q(x)}\,q(x)\,dx = E_{f(x)}[g(X)]$$
$$Var[\hat{g}_{IS}] = \frac{1}{N^2}\sum_{i=1}^{N} Var_{q(x)}\!\left[g(X_i)\,\frac{f(X_i)}{q(X_i)}\right] = \frac{1}{N}\,Var_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right]$$

2. Confidence interval. By the Central Limit Theorem,

$$\hat{g}_{IS} = \frac{1}{N}\sum_{i=1}^{N} g(X_i)\,\frac{f(X_i)}{q(X_i)} \;\longrightarrow\; Normal\!\left(E_{f(x)}[g(X)],\ \frac{1}{N}\,Var_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right]\right)$$

To build a confidence interval, we need $Var_{q(x)}\!\left[g(X)\frac{f(X)}{q(X)}\right]$, which can be estimated simply as the sample variance of $\left\{g(X_i)\frac{f(X_i)}{q(X_i)} : i = 1,\dots,N\right\}$, i.e.

$$\widehat{Var}_{q(x)}\!\left[g(X)\,\frac{f(X)}{q(X)}\right] = \frac{1}{N}\sum_{i=1}^{N}\left(g(X_i)\,\frac{f(X_i)}{q(X_i)} - \hat{g}_{IS}\right)^2$$

Therefore, the following 95.4% confidence interval can be built:

$$\hat{g}_{IS} - 2\sqrt{\frac{\widehat{Var}_{q(x)}\!\left[g(X)\frac{f(X)}{q(X)}\right]}{N}} \;\le\; E_{f(x)}[g(X)] \;\le\; \hat{g}_{IS} + 2\sqrt{\frac{\widehat{Var}_{q(x)}\!\left[g(X)\frac{f(X)}{q(X)}\right]}{N}}$$

Remarks:

1. Recall that the variance of the MCS estimator is $\frac{1}{N}Var_{f(x)}[g(X)]$. Here we have seen that the variance of the IS estimator is $\frac{1}{N}Var_{q(x)}\!\left[g(X)\frac{f(X)}{q(X)}\right]$. It can be shown that if the importance sampling PDF $q(x) \propto g(x)\,f(x)$ (for nonnegative $g$), the variance of
the IS estimator is 0. We call this the optimal importance sampling PDF. However, this PDF usually cannot be directly sampled and evaluated. It is also possible to choose an importance sampling PDF so that the variance of the resulting IS estimator is larger than that of the MCS estimator.

2. Importance weights. Let us define $w_i = \dfrac{f(X_i)}{q(X_i)}$ as the importance weight of the $i$-th sample. One can see that the IS estimator is simply

$$\hat{g}_{IS} = \frac{1}{N}\sum_{i=1}^{N} w_i\,g(X_i)$$

Note that $E_{q(x)}[w] = \int \frac{f(x)}{q(x)}\,q(x)\,dx = 1$. Also, by the Central Limit Theorem,

$$\frac{1}{N}\sum_{i=1}^{N} w_i \;\longrightarrow\; Normal\!\left(1,\ \frac{1}{N}\,Var_{q(x)}\!\left[\frac{f(X)}{q(X)}\right]\right)$$

3. Importance sampling is inefficient when the $X$ dimension is high. This is because when the dimension is high, the importance weights may become highly non-uniform. The consequence is that the number of effective samples is small.

Modified importance sampling

Introduction: For Bayesian analysis, we usually only know how to evaluate the posterior PDF $f(x|Y)$ up to a constant:

$$f(x|Y) = \frac{f(Y|x)\,f(x)}{f(Y)}$$

Note that we usually don't know the normalizing constant $f(Y)$. So the importance
sampling technique cannot be directly applied, since it requires full evaluation of $f(x|Y)$. Modified importance sampling (MIS) only requires the evaluation of $f(x|Y)$ up to a constant:

$$f(x|Y) = a \cdot h(x)$$

where $a$ is a constant we don't know how to evaluate and $h(x)$ is a function we know how to evaluate. Now let the normalized importance weight be

$$w_i = \frac{h(X_i)/q(X_i)}{\sum_{j=1}^{N} h(X_j)/q(X_j)}$$

Note that $\sum_{i=1}^{N} w_i = 1$. The MIS estimator for $E[g(X)|Y]$ is simply

$$\hat{g}_{MIS} = \sum_{i=1}^{N} w_i\,g(X_i)$$

Observation: when $N \to \infty$, MIS is the same as IS.

Proof: When $N \to \infty$,

$$\frac{1}{N}\sum_{j=1}^{N} \frac{h(X_j)}{q(X_j)} \;\to\; E_{q(x)}\!\left[\frac{h(X)}{q(X)}\right] = \int h(x)\,dx = \frac{1}{a}$$

so

$$w_i \;\to\; \frac{1}{N}\,a\,\frac{h(X_i)}{q(X_i)} = \frac{1}{N}\,\frac{f(X_i|Y)}{q(X_i)}$$

Therefore the normalized importance weight in MIS and the ($1/N$-scaled) importance weight in IS become identical. The consequence is that MIS is only asymptotically unbiased.

Procedure:
1. Draw $\{X_i : i = 1,\dots,N\} \sim q(x)$ i.i.d.
2. Compute $\hat{g}_{MIS} = \sum_{i=1}^{N} w_i\,g(X_i)$ with $w_i$ as defined above.

Remarks:
1. When $N$ is large, the statistical properties of the MIS estimator are similar to those of the IS estimator. As noted, MIS is only asymptotically unbiased.
2. MIS can be used to estimate the normalizing constant $f(Y)$ in Bayesian analysis. Let $\alpha_i = \dfrac{h(X_i)}{q(X_i)}$ be the non-normalized importance weight. It can be shown that $\frac{1}{N}\sum_i \alpha_i$ is an unbiased estimator of $\int h(x)\,dx = 1/a$, which equals $f(Y)$ when $h(x) = f(Y|x)f(x)$.
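A minimal sketch of the MIS estimator and the normalizing-constant estimate of remark 2 (the target and proposal here are hypothetical choices of mine): the unnormalized target is a Gaussian with mean 1 and unit variance with its constant dropped, so $\int h(x)\,dx = \sqrt{2\pi}$, and the importance sampling PDF is a standard normal.

```python
import math
import random

def mis_estimate(g, h, q_sample, q_pdf, n, rng):
    """Modified importance sampling: weights are built from the unnormalized
    target h and then normalized, so the constant a is never needed.
    Also returns (1/N) sum of the raw weights, an unbiased estimate of
    the normalizing constant integral of h."""
    xs = [q_sample(rng) for _ in range(n)]
    raw = [h(x) / q_pdf(x) for x in xs]            # non-normalized weights alpha_i
    total = sum(raw)
    w = [r / total for r in raw]                   # normalized weights, sum to 1
    estimate = sum(wi * g(xi) for wi, xi in zip(w, xs))
    evidence = total / n                           # estimates the integral of h
    return estimate, evidence

# Unnormalized target: Gaussian with mean 1, unit variance (constant dropped).
h = lambda x: math.exp(-0.5 * (x - 1.0) ** 2)
# Importance sampling PDF: standard normal, which we can sample and evaluate.
q_pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
rng = random.Random(2)
est, ev = mis_estimate(lambda x: x, h, lambda r: r.gauss(0.0, 1.0), q_pdf, 20000, rng)
# est should approach the target mean 1; ev should approach sqrt(2*pi).
```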
Sample-importance resampling

Introduction: Both importance sampling and modified importance sampling are mainly for estimating $E[g(X)|Y]$. If we would like to obtain samples from $f(x|Y)$, we can adopt sample-importance resampling (SIR), which is just one step beyond MIS. The idea is to resample the MIS samples according to their normalized weights.

Procedure:
1. Do MIS to obtain $\{(X_i, w_i) : i = 1,\dots,N\}$. Recall that $\sum_{i=1}^{N} w_i = 1$.
2. Resampling: let $X_j^{new} = X_i$ with probability $w_i$, for $j = 1,\dots,M$. Then $\{X_j^{new} : j = 1,\dots,M\} \sim f(x|Y)$ approximately.

Remarks:
1. There may be repeated samples in $\{X_j^{new} : j = 1,\dots,M\}$.
2. $M$ had better be no larger than $N$.
3. If $N$ is small, $\{X_j^{new} : j = 1,\dots,M\}$ is only approximately distributed as $f(x|Y)$. This is because MIS is only asymptotically unbiased.
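The resampling step can be sketched as follows, reusing the same hypothetical unnormalized target and standard-normal proposal as before; the standard library's weighted `choices` performs step 2 (sampling with replacement by weight).

```python
import math
import random

def sir(h, q_sample, q_pdf, n, m, rng):
    """Sample-importance resampling: draw n samples from q, weight them by
    the unnormalized target h, then resample m of them by their normalized
    weights. The resampled set is approximately distributed as the target."""
    xs = [q_sample(rng) for _ in range(n)]
    w = [h(x) / q_pdf(x) for x in xs]        # weights; choices normalizes them
    return rng.choices(xs, weights=w, k=m)   # resampling step, with replacement

# Unnormalized target: Gaussian with mean 1; proposal: standard normal.
h = lambda x: math.exp(-0.5 * (x - 1.0) ** 2)
q_pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
rng = random.Random(3)
resampled = sir(h, lambda r: r.gauss(0.0, 1.0), q_pdf, 20000, 2000, rng)
mean = sum(resampled) / len(resampled)       # should approach the target mean, 1
```

Note $M = 2000 < N = 20000$, per remark 2, and repeated samples are expected (remark 1).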
Markov chain Monte Carlo

1. Metropolis-Hastings algorithm
2. Gibbs sampler
3. Hybrid Monte Carlo

Metropolis-Hastings algorithm

Introduction: Suppose the target PDF is the posterior PDF in a Bayesian analysis:

$$f(x|Y) = \frac{f(Y|x)\,f(x)}{f(Y)} = a \cdot h(x)$$

The idea of MH is to create a Markov chain whose stationary distribution is the same as the target PDF.

Procedure:
1. Let $H(x^C | x)$ be the chosen proposal PDF.
2. Initialize $X^0$ at any place.
3. Let the candidate $X^C \sim H(x | X^t)$ and compute

$$r = \frac{h(X^C)\,H(X^t | X^C)}{h(X^t)\,H(X^C | X^t)}$$

4. Let

$$X^{t+1} = \begin{cases} X^C & \text{w.p. } \min(1, r) \\ X^t & \text{w.p. } 1 - \min(1, r) \end{cases}$$

5. Cycle 3–4 to obtain Markov chain samples $\{X^t : t = 0,\dots,N\}$. These samples will be asymptotically distributed as $f(x|Y)$ if the Markov chain is ergodic.

First note that the MH algorithm creates a Markov chain that satisfies detailed balance:

$$F(z|x)\,h(x) = B(x|z)\,h(z) \quad \forall x, z$$

where $F(z|x)$ and $B(x|z)$ are the forward and backward probability transition kernels of the Markov chain. The forward and backward kernels in the MH algorithm are

$$F(z|x) = \left[1 - A(x)\right]\delta(z - x) + \min(1, r_F)\,H(z|x)$$
$$B(x|z) = \left[1 - A(z)\right]\delta(x - z) + \min(1, r_B)\,H(x|z)$$

where $A(x) = \int \min(1, r_F)\,H(z'|x)\,dz'$ is the overall acceptance probability at $x$, and

$$r_F = \frac{h(z)\,H(x|z)}{h(x)\,H(z|x)} = \frac{f(z|Y)\,H(x|z)}{f(x|Y)\,H(z|x)}, \qquad r_B = \frac{h(x)\,H(z|x)}{h(z)\,H(x|z)} = \frac{1}{r_F}$$

The $\delta$ terms balance trivially, since they are nonzero only when $z = x$. For the remaining terms,

$$\min(1, r_F)\,H(z|x)\,h(x) = \min\!\big(h(x)\,H(z|x),\ h(z)\,H(x|z)\big) = \min(1, r_B)\,H(x|z)\,h(z)$$

so $F(z|x)\,h(x) = B(x|z)\,h(z)$. Therefore, as long as the Markov chain is ergodic, the stationary distribution will be unique and equal to the target $f(x|Y)$.

Analysis:
1. The MCMC estimator for $E[g(X)|Y]$ is simply $\hat{g}_{MCMC} = \frac{1}{N}\sum_{t=1}^{N} g(X^t)$. The MCMC estimator is asymptotically unbiased because

$$E[\hat{g}_{MCMC}] = \frac{1}{N}\sum_{t=1}^{N} E[g(X^t)|Y] \;\to\; E[g(X)|Y]$$

Note that if we ignore the MC samples in the burn-in period, the estimator is unbiased.

$$Var[\hat{g}_{MCMC}] = \frac{1}{N^2}\sum_{t=1}^{N} Var[g(X^t)|Y] + \frac{2}{N^2}\sum_{s<t} \mathrm{cov}\!\left[g(X^s), g(X^t)\,|\,Y\right]$$

Note that after the MC reaches its stationary state, the time origin is forgotten, so

$$\mathrm{cov}\!\left[g(X^t), g(X^{t+k})\,|\,Y\right] = R(k)$$

depends only on the lag $k$. Note that $R(k) = R(-k)$ and $R(0) = Var[g(X)|Y]$. Then

$$Var[\hat{g}_{MCMC}] = \frac{R(0)}{N} + \frac{2}{N^2}\sum_{s=1}^{N-1}(N-s)\,R(s) \approx \frac{R(0)}{N}\left[1 + \gamma\right]$$

where

$$\gamma = \frac{2}{R(0)}\sum_{s=1}^{\infty} R(s)$$

quantifies the degree of dependence of the MC samples. Note that $R(0)/N$ is the variance of the estimator if the MC samples were all independent. However, the MC samples are dependent, so the actual variance is larger than $R(0)/N$ by a factor of $1 + \gamma$. One can say that the equivalent number of independent samples is $N/(1+\gamma)$. Note that $R(k)$ can be estimated from the MC samples as follows:

$$\hat{R}(k) = \frac{1}{N-k}\sum_{t=1}^{N-k}\left(g(X^t) - \hat{g}_{MCMC}\right)\left(g(X^{t+k}) - \hat{g}_{MCMC}\right)$$

2. Confidence interval. Note that the Central Limit Theorem and the Law of Large Numbers still hold for
weakly dependent samples, i.e.

$$\hat{g}_{MCMC} = \frac{1}{N}\sum_{t=1}^{N} g(X^t) \;\longrightarrow\; Normal\!\left(E[g(X)|Y],\ \frac{R(0)\left[1+\gamma\right]}{N}\right)$$

So the following 95.4% confidence interval can be built:

$$\hat{g}_{MCMC} - 2\sqrt{\frac{R(0)\left[1+\gamma\right]}{N}} \;\le\; E[g(X)|Y] \;\le\; \hat{g}_{MCMC} + 2\sqrt{\frac{R(0)\left[1+\gamma\right]}{N}}$$

Remarks:
1. Recall that $f(x|Y) = a \cdot h(x)$. We don't need to know how to evaluate the normalizing constant $a$, since it is cancelled out as we compute $r$.
2. When the initial state of the Markov chain is chosen arbitrarily, the chain may need some time to reach its stationary state. We call this period the burn-in period. The MC samples in the burn-in period are not distributed as $f(x|Y)$; we can choose to discard these samples. The usual way of identifying the burn-in period is to plot the sample-value time history (or the time history of certain chosen statistics) and identify it visually.

[Figure: time history of $X^t$ with the burn-in period marked; samples in the burn-in period can be discarded]

3. How do we know the resulting MC is ergodic? In many cases, we don't know it for sure. But if we conduct several MCMC runs to obtain independent MCs and find that their behavior is similar, we can argue that the MC may be ergodic. There are cases where we can immediately see whether the MC is ergodic or not.

[Figure: examples of proposal PDFs $H(y|x)$ and target PDFs $f(x)$ for which the resulting chain is or is not ergodic]

4. After reaching the stationary state, the MC samples become identically distributed but dependent. The dependency between two samples decays with the time
difference.
5. When $H(z|x) = H(x|z)$, i.e. the proposal is symmetric, the resulting algorithm is called the Metropolis algorithm. Note that then

$$r = \frac{h(X^C)}{h(X^t)}$$

6. The choice of the proposal PDF $H(z|x)$ can significantly affect the efficiency of MCMC. From our experience, the type of $H(z|x)$ is not critical, but the width of $H(z|x)$ matters. If $H(z|x)$ is very narrow, $r \approx 1$ and there is almost no rejection, but adjacent samples are highly correlated. If $H(z|x)$ is very wide, $r$ is often small, so there are lots of rejections and repeated samples, and again the samples can be highly correlated. Therefore, $H(z|x)$ should be neither too narrow nor too wide. The rule of thumb is to take the width of $H(z|x)$ to be of the same order as the width of $f(x|Y)$. Usually, the performance is best when the rejection rate is around 50%.
7. In principle, the MH algorithm should work even when the $X$ dimension is high. In practice, an efficient proposal PDF can be difficult to build because we usually lack knowledge of the geometry of the target PDF $f(x|Y)$. There are also issues for local-random-walk MH, e.g. MH with a Gaussian proposal PDF. Consider a Gaussian target PDF, and let $\sigma_{max}^2$ and $\sigma_{min}^2$ be the maximum and minimum eigenvalues of its covariance matrix. Suppose we choose a Gaussian proposal PDF $H(z|x)$ whose standard deviation in all directions is identical and equal to $\sigma_{min}$. One can see that in order to have the MC samples travel throughout the significant region of the target PDF, we need at least $\sigma_{max}/\sigma_{min}$ MC
time steps. This difficulty is due to a high aspect ratio of the target PDF and may occur regardless of the dimension of $X$. However, from my experience, it occurs more frequently when the $X$ dimension is high. It is definitely not an issue when $X$ is a scalar.
8. MH may not work for multi-modal PDFs, since the resulting MC may be non-ergodic.
Gibbs sampler

Introduction: Is it possible to conduct MCMC without rejecting samples? It turns out to be possible. This can happen when the uncertain variable $X$ can be divided into $m$ groups $\{X_i : i = 1,\dots,m\}$ and, moreover, when it is feasible to directly sample from the full conditionals $f(x_i \,|\, \{x\}\backslash x_i, Y)$, where $\{x\}\backslash x_i$ denotes $\{x_1,\dots,x_{i-1},x_{i+1},\dots,x_m\}$.

Procedure:
1. Initialize $X^0 = \left(X_1^0, X_2^0, \dots, X_m^0\right)$ at any place.
2. Sample

$$X_1^{t+1} \sim f(x_1 \,|\, x_2^t, x_3^t, \dots, x_m^t, Y)$$
$$X_2^{t+1} \sim f(x_2 \,|\, x_1^{t+1}, x_3^t, \dots, x_m^t, Y)$$
$$X_3^{t+1} \sim f(x_3 \,|\, x_1^{t+1}, x_2^{t+1}, x_4^t, \dots, x_m^t, Y)$$
$$\vdots$$
$$X_m^{t+1} \sim f(x_m \,|\, x_1^{t+1}, \dots, x_{m-1}^{t+1}, Y)$$

3. Repeat step 2 to get $\{X^t : t = 0, 1, \dots\}$. These samples will be asymptotically distributed as $f(x|Y)$ if the Markov chain is ergodic.

Note that the Gibbs sampler algorithm creates a Markov chain that satisfies detailed balance:

$$F(z|x)\,f(x|Y) = B(x|z)\,f(z|Y) \quad \forall x, z$$

where $F(z|x)$ and $B(x|z)$ are the forward and backward probability transition kernels of the Markov chain. Use the following transition, which updates only the third group, as an example:

$$x = \left(x_1^{t+1}, x_2^{t+1}, x_3^t, x_4^t, \dots, x_m^t\right) \;\to\; z = \left(x_1^{t+1}, x_2^{t+1}, x_3^{t+1}, x_4^t, \dots, x_m^t\right)$$
The forward and backward kernels in the Gibbs sampler algorithm are

$$F(z|x) = \delta(z_1 - x_1)\,\delta(z_2 - x_2)\,\delta(z_4 - x_4)\cdots\delta(z_m - x_m)\; f(z_3 \,|\, x_1, x_2, x_4, \dots, x_m, Y)$$

$$B(x|z) = \delta(x_1 - z_1)\,\delta(x_2 - z_2)\,\delta(x_4 - z_4)\cdots\delta(x_m - z_m)\; f(x_3 \,|\, z_1, z_2, z_4, \dots, z_m, Y)$$

To verify detailed balance, note that on the support of the deltas, $x_i = z_i$ for all $i \neq 3$, and factor $f(x|Y) = f(x_3 \,|\, \{z\}\backslash z_3, Y)\, f(\{z\}\backslash z_3 \,|\, Y)$:

$$F(z|x)\,f(x|Y) = \prod_{i\neq 3}\delta(z_i - x_i)\; f(z_3 \,|\, \{z\}\backslash z_3, Y)\; f(x_3 \,|\, \{z\}\backslash z_3, Y)\; f(\{z\}\backslash z_3 \,|\, Y)$$

$$= \prod_{i\neq 3}\delta(x_i - z_i)\; f(x_3 \,|\, \{z\}\backslash z_3, Y)\; f(z|Y) = B(x|z)\,f(z|Y)$$

Remarks:
1. Unlike MH, the Gibbs sampler needs no proposal PDF, so there is no rejection. The tradeoff is that we need to know how to sample from $f(x_i \,|\, \{x\}\backslash x_i, Y)$.
2. In the case that we don't know how to directly sample from $f(x_i \,|\, \{x\}\backslash x_i, Y)$, we can sample it using sample-importance resampling, MCMC, rejection sampling, etc. A popular choice is to use adaptive rejection sampling to sample from $f(x_i \,|\, \{x\}\backslash x_i, Y)$ [3].
3. The advantages and disadvantages of the Gibbs sampler are similar to those of MH. The analysis is also the same.
4. The Gibbs sampler will not work well when some uncertain parameters are highly correlated conditional on the data. The consequence is that the MC samples move very slowly when sampling these highly correlated variables. There are variants of the Gibbs sampler that enjoy larger jumps in the MC samples even when such high correlation exists, e.g. the Gibbs sampler with overrelaxation [4, 5].
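A minimal sketch of the procedure for a case where the full conditionals are available in closed form (a standard textbook example, not from these notes): for a zero-mean bivariate Gaussian with unit marginal variances and correlation $\rho$, the full conditionals are $x_1 | x_2 \sim N(\rho x_2,\, 1-\rho^2)$ and $x_2 | x_1 \sim N(\rho x_1,\, 1-\rho^2)$, so every draw is accepted.

```python
import math
import random

def gibbs_bivariate_normal(rho, n, rng):
    """Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.
    Each full conditional is Gaussian and is sampled directly -- no proposal
    PDF, no rejection."""
    s = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    x1, x2 = 0.0, 0.0
    chain = []
    for _ in range(n):
        x1 = rng.gauss(rho * x2, s)  # X1 ~ f(x1 | x2, Y)
        x2 = rng.gauss(rho * x1, s)  # X2 ~ f(x2 | x1, Y)
        chain.append((x1, x2))
    return chain

rng = random.Random(5)
chain = gibbs_bivariate_normal(0.8, 20000, rng)[1000:]   # drop burn-in
corr = sum(a * b for a, b in chain) / len(chain)  # E[X1*X2] = rho (unit margins)
```

With $\rho = 0.8$ the chain already moves sluggishly along the correlated direction, which illustrates remark 4; pushing $\rho$ toward 1 makes it much worse.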
Hybrid Monte Carlo

Introduction: Both MH and the Gibbs sampler create MCs with local random-walk behavior. However, this behavior is not desirable: with it, it may take a long time for the resulting MC to travel throughout the significant region of $f(x|Y)$, especially when the $X$ dimension is high. Hybrid Monte Carlo (HMC) is an MCMC algorithm that does not use a local random walk. The basic idea is to add an auxiliary uncertain variable $Z$ to the process, where the new target PDF is proportional to

$$f(x, z) \propto h(x)\, e^{-\frac{1}{2}\sum_{i=1}^{n} z_i^2}$$

where $f(x|Y) = a\,h(x)$ and $n$ is the dimension of $X$. If we are able to sample from $f(x,z)$, it is clear that the $X$ part of the samples will be distributed as $f(x|Y)$. How do we sample from $f(x,z)$? We employ MCMC. The main trick in Hybrid Monte Carlo is to give $h(x)$ and $z$ some physical meaning: $-\log[h(x)]$ is considered to be the potential energy of a ball with unit mass, $x$ is the location of the ball (think of $-\log[h(x)]$ as the profile of a valley), and $z$ is the velocity of the ball. So the total energy of the ball is

$$H(x, z) = -\log h(x) + \frac{1}{2}\sum_{i=1}^{n} z_i^2 = -\log f(x, z) + \text{const}$$

If there is no friction as the ball rolls in the valley, the total energy $H(x,z)$ is conserved, i.e. $f(x,z)$ is conserved, and the ball rolls according to the following equations:

$$\frac{dx_i}{dt} = \frac{\partial H}{\partial z_i} = z_i, \qquad \frac{dz_i}{dt} = -\frac{\partial H}{\partial x_i} = \frac{\partial \log h(x)}{\partial x_i}, \qquad i = 1,\dots,n$$

Procedure:
1. Initialize $X^0$, $Z^0$.
2. Solve for $x(t), z(t)$ according to the governing equations above
with initial condition $x(0) = X^0$ and $z(0) = Z^0$. Evolve the solution for a randomized duration $R$. Let $X^1 = x(R)$.
3. Resample $Z^1 \sim (2\pi)^{-n/2}\, e^{-\frac{1}{2}\sum_i z_i^2}$, i.e. a standard Gaussian.
4. Cycle 2–3 to get $\{X^t : t = 0,\dots,N\}$. These samples will be asymptotically distributed as $f(x|Y)$ if the Markov chain is ergodic.

Analysis: same as MH.

Remarks:
1. The HMC algorithm indeed samples from $f(x,z)$. To see this, let us view the end product of Step 2 as a candidate, i.e. $X^C = x(R)$, $Z^C = z(R)$ is the candidate of the MCMC algorithm. One can verify that

$$r = \frac{f(X^C, Z^C)}{f(X^0, Z^0)} = 1$$

This is because the total energy $H(x,z)$, and hence $f(x,z)$, remains constant along the solution of the Hamiltonian equations. Therefore, the candidate $(X^C, Z^C)$ is always accepted as the next MC sample.
2. The duration $R$ in Step 2 had better be randomized to avoid a periodic MC, although the chance of the MC being periodic is very small.
3. Step 3 is necessary, since without it $f(x,z)$ would remain constant forever, and hence the MC would not explore the entire phase space.
4. In practice, Step 2 cannot be solved analytically and must be solved approximately. In doing so, the resulting $r$ ratio will only be approximately equal to 1. To handle this issue, we can simply add an accept/reject step, i.e. change Step 2 to the following: solve for $x(t), z(t)$ approximately according to the governing equations
$$\frac{dx_i}{dt} = z_i, \qquad \frac{dz_i}{dt} = \frac{\partial \log h(x)}{\partial x_i}, \qquad i = 1,\dots,n$$

with initial condition $x(0) = X^0$ and $z(0) = Z^0$. Evolve the solution for a randomized duration $R$. Let $X^C = x(R)$ and $Z^C = z(R)$. Accept the candidate, i.e. take $X^1 = X^C$, with probability $\min(1, r)$, where

$$r = \frac{f(X^C, Z^C)}{f(X^0, Z^0)}$$

If not accepted, repeat the previous sample, i.e. take $X^1 = X^0$.

The approximate solution algorithm for the Hamiltonian equations cannot be arbitrary. In fact, the chosen algorithm must be reversible in time: if, starting from the initial condition, we obtain the candidate solution, a reversible algorithm must have the property that starting from that candidate and solving the equations backward in time, we obtain the same initial condition. This is so because the resulting HMC algorithm must satisfy the so-called detailed balance (reversibility) condition. The following leapfrog finite-difference algorithm is reversible and is accurate to second order in the Taylor series expansion:

$$z\!\left(t + \tfrac{\Delta t}{2}\right) = z(t) + \frac{\Delta t}{2}\, \frac{\partial \log h(x)}{\partial x}\bigg|_{x = x(t)}$$

$$x(t + \Delta t) = x(t) + \Delta t\; z\!\left(t + \tfrac{\Delta t}{2}\right)$$

$$z(t + \Delta t) = z\!\left(t + \tfrac{\Delta t}{2}\right) + \frac{\Delta t}{2}\, \frac{\partial \log h(x)}{\partial x}\bigg|_{x = x(t + \Delta t)}$$

5. One can see that HMC does not do a local random walk, and it is possible for HMC to make large jumps. Of course, the tradeoff is that HMC requires solving the Hamiltonian equations, which may require much computation. According to the experience of other researchers, it seems that the cost is usually worthwhile compared with MH and the Gibbs sampler.
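The modified Step 2 (leapfrog integration plus the accept/reject correction of remark 4, with the velocity resampling of Step 3) can be sketched as follows for a hypothetical one-dimensional target of mine, a standard normal, for which $U(x) = -\log h(x) = x^2/2$ and $\partial U/\partial x = x$.

```python
import math
import random

def hmc(grad_u, u, x0, eps, n_leap, n, rng):
    """Hybrid (Hamiltonian) Monte Carlo: leapfrog integration of the
    Hamiltonian equations, with an accept/reject step that corrects the
    integration error. u is the potential U(x) = -log h(x)."""
    x = x0
    chain = []
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)              # Step 3: resample the velocity
        x_new, z_new = x, z
        z_new -= 0.5 * eps * grad_u(x_new)   # leapfrog: initial half kick
        for _ in range(n_leap):
            x_new += eps * z_new             # full position step
            z_new -= eps * grad_u(x_new)     # full kick (last one fixed below)
        z_new += 0.5 * eps * grad_u(x_new)   # turn the last full kick into a half kick
        # r = exp(H(x,z) - H(x_new,z_new)); accept w.p. min(1, r)
        dh = (u(x) + 0.5 * z * z) - (u(x_new) + 0.5 * z_new * z_new)
        if math.log(rng.random() + 1e-300) < dh:
            x = x_new                        # accept; else repeat previous sample
        chain.append(x)
    return chain

# Target: standard normal, so U(x) = x^2/2 and dU/dx = x.
rng = random.Random(6)
chain = hmc(lambda x: x, lambda x: 0.5 * x * x, 0.0, 0.2, 10, 5000, rng)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

Because the leapfrog error is small here, almost every candidate is accepted, and the chain's mean and variance should be close to 0 and 1 respectively.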
References:

[1] Gilks, W. R. (1992). Derivative-free adaptive rejection sampling for Gibbs sampling. In Bayesian Statistics 4, eds. Bernardo, J. M., Berger, J. O., Dawid, A. P., and Smith, A. F. M. Oxford University Press.
[2] Gilks, W. R., Best, N. G., and Tan, K. K. C. (1995). Adaptive rejection Metropolis sampling. Applied Statistics, 44.

[3] Gilks, W. R., and Wild, P. (1992). Adaptive rejection sampling for Gibbs sampling. Applied Statistics, 41.

[4] Adler, S. L. (1981). Over-relaxation method for the Monte Carlo evaluation of the partition function for multiquadratic actions. Physical Review D: Particles and Fields, 23.

[5] Neal, R. M. (1995). Suppressing random walks in Markov chain Monte Carlo using ordered overrelaxation. Technical Report 9508, Dept. of Statistics, University of Toronto.

In general: MacKay, D. J. C. "Introduction to Monte Carlo methods."
More informationTarget tracking example Filtering: Xt. (main interest) Smoothing: X1: t. (also given with SIS)
Target trackng example Flterng: Xt Y1: t (man nterest) Smoothng: X1: t Y1: t (also gven wth SIS) However as we have seen, the estmate of ths dstrbuton breaks down when t gets large due to the weghts becomng
More informationANSWERS. Problem 1. and the moment generating function (mgf) by. defined for any real t. Use this to show that E( U) var( U)
Econ 413 Exam 13 H ANSWERS Settet er nndelt 9 deloppgaver, A,B,C, som alle anbefales å telle lkt for å gøre det ltt lettere å stå. Svar er gtt . Unfortunately, there s a prntng error n the hnt of
More informationApplied Stochastic Processes
STAT455/855 Fall 23 Appled Stochastc Processes Fnal Exam, Bref Solutons 1. (15 marks) (a) (7 marks) The dstrbuton of Y s gven by ( ) ( ) y 2 1 5 P (Y y) for y 2, 3,... The above follows because each of
More informationCHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE
CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng
More informationTesting for seasonal unit roots in heterogeneous panels
Testng for seasonal unt roots n heterogeneous panels Jesus Otero * Facultad de Economía Unversdad del Rosaro, Colomba Jeremy Smth Department of Economcs Unversty of arwck Monca Gulett Aston Busness School
More informationFeature Selection: Part 1
CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?
More informationNotes on Frequency Estimation in Data Streams
Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to
More information2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification
E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton
More informationGeneralized Linear Methods
Generalzed Lnear Methods 1 Introducton In the Ensemble Methods the general dea s that usng a combnaton of several weak learner one could make a better learner. More formally, assume that we have a set
More informationComputing MLE Bias Empirically
Computng MLE Bas Emprcally Kar Wa Lm Australan atonal Unversty January 3, 27 Abstract Ths note studes the bas arses from the MLE estmate of the rate parameter and the mean parameter of an exponental dstrbuton.
More informationContinuous Time Markov Chains
Contnuous Tme Markov Chans Brth and Death Processes,Transton Probablty Functon, Kolmogorov Equatons, Lmtng Probabltes, Unformzaton Chapter 6 1 Markovan Processes State Space Parameter Space (Tme) Dscrete
More informationMonte Carlo method II
Course MP3 Lecture 5 14/11/2006 Monte Carlo method II How to put some real physcs nto the Monte Carlo method Dr James Ellott 5.1 Monte Carlo method revsted In lecture 4, we ntroduced the Monte Carlo (MC)
More informationModule 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur
Module 3 LOSSY IMAGE COMPRESSION SYSTEMS Verson ECE IIT, Kharagpur Lesson 6 Theory of Quantzaton Verson ECE IIT, Kharagpur Instructonal Objectves At the end of ths lesson, the students should be able to:
More informationTracking with Kalman Filter
Trackng wth Kalman Flter Scott T. Acton Vrgna Image and Vdeo Analyss (VIVA), Charles L. Brown Department of Electrcal and Computer Engneerng Department of Bomedcal Engneerng Unversty of Vrgna, Charlottesvlle,
More informationx = , so that calculated
Stat 4, secton Sngle Factor ANOVA notes by Tm Plachowsk n chapter 8 we conducted hypothess tests n whch we compared a sngle sample s mean or proporton to some hypotheszed value Chapter 9 expanded ths to
More informationGoodness of fit and Wilks theorem
DRAFT 0.0 Glen Cowan 3 June, 2013 Goodness of ft and Wlks theorem Suppose we model data y wth a lkelhood L(µ) that depends on a set of N parameters µ = (µ 1,...,µ N ). Defne the statstc t µ ln L(µ) L(ˆµ),
More informationSnce h( q^; q) = hq ~ and h( p^ ; p) = hp, one can wrte ~ h hq hp = hq ~hp ~ (7) the uncertanty relaton for an arbtrary state. The states that mnmze t
8.5: Many-body phenomena n condensed matter and atomc physcs Last moded: September, 003 Lecture. Squeezed States In ths lecture we shall contnue the dscusson of coherent states, focusng on ther propertes
More informationLecture 4. Instructor: Haipeng Luo
Lecture 4 Instructor: Hapeng Luo In the followng lectures, we focus on the expert problem and study more adaptve algorthms. Although Hedge s proven to be worst-case optmal, one may wonder how well t would
More informationb ), which stands for uniform distribution on the interval a x< b. = 0 elsewhere
Fall Analyss of Epermental Measurements B. Esensten/rev. S. Errede Some mportant probablty dstrbutons: Unform Bnomal Posson Gaussan/ormal The Unform dstrbuton s often called U( a, b ), hch stands for unform
More informationLECTURE 9 CANONICAL CORRELATION ANALYSIS
LECURE 9 CANONICAL CORRELAION ANALYSIS Introducton he concept of canoncal correlaton arses when we want to quantfy the assocatons between two sets of varables. For example, suppose that the frst set of
More informationSingular Value Decomposition: Theory and Applications
Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real
More informationAPPROXIMATE PRICES OF BASKET AND ASIAN OPTIONS DUPONT OLIVIER. Premia 14
APPROXIMAE PRICES OF BASKE AND ASIAN OPIONS DUPON OLIVIER Prema 14 Contents Introducton 1 1. Framewor 1 1.1. Baset optons 1.. Asan optons. Computng the prce 3. Lower bound 3.1. Closed formula for the prce
More informationHopfield networks and Boltzmann machines. Geoffrey Hinton et al. Presented by Tambet Matiisen
Hopfeld networks and Boltzmann machnes Geoffrey Hnton et al. Presented by Tambet Matsen 18.11.2014 Hopfeld network Bnary unts Symmetrcal connectons http://www.nnwj.de/hopfeld-net.html Energy functon The
More information2016 Wiley. Study Session 2: Ethical and Professional Standards Application
6 Wley Study Sesson : Ethcal and Professonal Standards Applcaton LESSON : CORRECTION ANALYSIS Readng 9: Correlaton and Regresson LOS 9a: Calculate and nterpret a sample covarance and a sample correlaton
More informationUncertainty as the Overlap of Alternate Conditional Distributions
Uncertanty as the Overlap of Alternate Condtonal Dstrbutons Olena Babak and Clayton V. Deutsch Centre for Computatonal Geostatstcs Department of Cvl & Envronmental Engneerng Unversty of Alberta An mportant
More informationLecture 10: May 6, 2013
TTIC/CMSC 31150 Mathematcal Toolkt Sprng 013 Madhur Tulsan Lecture 10: May 6, 013 Scrbe: Wenje Luo In today s lecture, we manly talked about random walk on graphs and ntroduce the concept of graph expander,
More informationContinuous Time Markov Chain
Contnuous Tme Markov Chan Hu Jn Department of Electroncs and Communcaton Engneerng Hanyang Unversty ERICA Campus Contents Contnuous tme Markov Chan (CTMC) Propertes of sojourn tme Relatons Transton probablty
More informationHidden Markov Models & The Multivariate Gaussian (10/26/04)
CS281A/Stat241A: Statstcal Learnng Theory Hdden Markov Models & The Multvarate Gaussan (10/26/04) Lecturer: Mchael I. Jordan Scrbes: Jonathan W. Hu 1 Hdden Markov Models As a bref revew, hdden Markov models
More informationAn R implementation of bootstrap procedures for mixed models
The R User Conference 2009 July 8-10, Agrocampus-Ouest, Rennes, France An R mplementaton of bootstrap procedures for mxed models José A. Sánchez-Espgares Unverstat Poltècnca de Catalunya Jord Ocaña Unverstat
More informationChapter 12. Ordinary Differential Equation Boundary Value (BV) Problems
Chapter. Ordnar Dfferental Equaton Boundar Value (BV) Problems In ths chapter we wll learn how to solve ODE boundar value problem. BV ODE s usuall gven wth x beng the ndependent space varable. p( x) q(
More informationProbability Theory. The nth coefficient of the Taylor series of f(k), expanded around k = 0, gives the nth moment of x as ( ik) n n!
8333: Statstcal Mechancs I Problem Set # 3 Solutons Fall 3 Characterstc Functons: Probablty Theory The characterstc functon s defned by fk ep k = ep kpd The nth coeffcent of the Taylor seres of fk epanded
More informationPROPERTIES I. INTRODUCTION. Finite element (FE) models are widely used to predict the dynamic characteristics of aerospace
FINITE ELEMENT MODEL UPDATING USING BAYESIAN FRAMEWORK AND MODAL PROPERTIES Tshldz Marwala 1 and Sbusso Sbs I. INTRODUCTION Fnte element (FE) models are wdely used to predct the dynamc characterstcs of
More informationProbability, Statistics, and Reliability for Engineers and Scientists SIMULATION
CHATER robablty, Statstcs, and Relablty or Engneers and Scentsts Second Edton SIULATIO A. J. Clark School o Engneerng Department o Cvl and Envronmental Engneerng 7b robablty and Statstcs or Cvl Engneers
More informationChapter 2 - The Simple Linear Regression Model S =0. e i is a random error. S β2 β. This is a minimization problem. Solution is a calculus exercise.
Chapter - The Smple Lnear Regresson Model The lnear regresson equaton s: where y + = β + β e for =,..., y and are observable varables e s a random error How can an estmaton rule be constructed for the
More informationMulti-dimensional Central Limit Theorem
Mult-dmensonal Central Lmt heorem Outlne ( ( ( t as ( + ( + + ( ( ( Consder a sequence of ndependent random proceses t, t, dentcal to some ( t. Assume t = 0. Defne the sum process t t t t = ( t = (; t
More informationThe Study of Teaching-learning-based Optimization Algorithm
Advanced Scence and Technology Letters Vol. (AST 06), pp.05- http://dx.do.org/0.57/astl.06. The Study of Teachng-learnng-based Optmzaton Algorthm u Sun, Yan fu, Lele Kong, Haolang Q,, Helongang Insttute
More informatione i is a random error
Chapter - The Smple Lnear Regresson Model The lnear regresson equaton s: where + β + β e for,..., and are observable varables e s a random error How can an estmaton rule be constructed for the unknown
More information1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands
Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of
More informationj) = 1 (note sigma notation) ii. Continuous random variable (e.g. Normal distribution) 1. density function: f ( x) 0 and f ( x) dx = 1
Random varables Measure of central tendences and varablty (means and varances) Jont densty functons and ndependence Measures of assocaton (covarance and correlaton) Interestng result Condtonal dstrbutons
More informationClassification as a Regression Problem
Target varable y C C, C,, ; Classfcaton as a Regresson Problem { }, 3 L C K To treat classfcaton as a regresson problem we should transform the target y nto numercal values; The choce of numercal class
More informationUsing T.O.M to Estimate Parameter of distributions that have not Single Exponential Family
IOSR Journal of Mathematcs IOSR-JM) ISSN: 2278-5728. Volume 3, Issue 3 Sep-Oct. 202), PP 44-48 www.osrjournals.org Usng T.O.M to Estmate Parameter of dstrbutons that have not Sngle Exponental Famly Jubran
More information4DVAR, according to the name, is a four-dimensional variational method.
4D-Varatonal Data Assmlaton (4D-Var) 4DVAR, accordng to the name, s a four-dmensonal varatonal method. 4D-Var s actually a drect generalzaton of 3D-Var to handle observatons that are dstrbuted n tme. The
More informationThe exam is closed book, closed notes except your one-page cheat sheet.
CS 89 Fall 206 Introducton to Machne Learnng Fnal Do not open the exam before you are nstructed to do so The exam s closed book, closed notes except your one-page cheat sheet Usage of electronc devces
More informationBasic Statistical Analysis and Yield Calculations
October 17, 007 Basc Statstcal Analyss and Yeld Calculatons Dr. José Ernesto Rayas Sánchez 1 Outlne Sources of desgn-performance uncertanty Desgn and development processes Desgn for manufacturablty A general
More informationCSci 6974 and ECSE 6966 Math. Tech. for Vision, Graphics and Robotics Lecture 21, April 17, 2006 Estimating A Plane Homography
CSc 6974 and ECSE 6966 Math. Tech. for Vson, Graphcs and Robotcs Lecture 21, Aprl 17, 2006 Estmatng A Plane Homography Overvew We contnue wth a dscusson of the major ssues, usng estmaton of plane projectve
More informationA Comparative Study for Estimation Parameters in Panel Data Model
A Comparatve Study for Estmaton Parameters n Panel Data Model Ahmed H. Youssef and Mohamed R. Abonazel hs paper examnes the panel data models when the regresson coeffcents are fxed random and mxed and
More informationStat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors
Stat60: Bayesan Modelng and Inference Lecture Date: February, 00 Reference Prors Lecturer: Mchael I. Jordan Scrbe: Steven Troxler and Wayne Lee In ths lecture, we assume that θ R; n hgher-dmensons, reference
More informationMulti-dimensional Central Limit Argument
Mult-dmensonal Central Lmt Argument Outlne t as Consder d random proceses t, t,. Defne the sum process t t t t () t (); t () t are d to (), t () t 0 () t tme () t () t t t As, ( t) becomes a Gaussan random
More informationGaussian Mixture Models
Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous
More informationNumerical Heat and Mass Transfer
Master degree n Mechancal Engneerng Numercal Heat and Mass Transfer 06-Fnte-Dfference Method (One-dmensonal, steady state heat conducton) Fausto Arpno f.arpno@uncas.t Introducton Why we use models and
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationP R. Lecture 4. Theory and Applications of Pattern Recognition. Dept. of Electrical and Computer Engineering /
Theory and Applcatons of Pattern Recognton 003, Rob Polkar, Rowan Unversty, Glassboro, NJ Lecture 4 Bayes Classfcaton Rule Dept. of Electrcal and Computer Engneerng 0909.40.0 / 0909.504.04 Theory & Applcatons
More information4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA
4 Analyss of Varance (ANOVA) 5 ANOVA 51 Introducton ANOVA ANOVA s a way to estmate and test the means of multple populatons We wll start wth one-way ANOVA If the populatons ncluded n the study are selected
More informationBasically, if you have a dummy dependent variable you will be estimating a probability.
ECON 497: Lecture Notes 13 Page 1 of 1 Metropoltan State Unversty ECON 497: Research and Forecastng Lecture Notes 13 Dummy Dependent Varable Technques Studenmund Chapter 13 Bascally, f you have a dummy
More informationThe Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD
he Gaussan classfer Nuno Vasconcelos ECE Department, UCSD Bayesan decson theory recall that we have state of the world X observatons g decson functon L[g,y] loss of predctng y wth g Bayes decson rule s
More informationProbability and Random Variable Primer
B. Maddah ENMG 622 Smulaton 2/22/ Probablty and Random Varable Prmer Sample space and Events Suppose that an eperment wth an uncertan outcome s performed (e.g., rollng a de). Whle the outcome of the eperment
More informationVQ widely used in coding speech, image, and video
at Scalar quantzers are specal cases of vector quantzers (VQ): they are constraned to look at one sample at a tme (memoryless) VQ does not have such constrant better RD perfomance expected Source codng
More informationPredictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore
Sesson Outlne Introducton to classfcaton problems and dscrete choce models. Introducton to Logstcs Regresson. Logstc functon and Logt functon. Maxmum Lkelhood Estmator (MLE) for estmaton of LR parameters.
More information