Efficient Sampling for Gaussian Process Inference using Control Variables


Michalis K. Titsias, Neil D. Lawrence and Magnus Rattray
School of Computer Science, University of Manchester, Manchester M13 9PL, UK

Abstract

Sampling functions in Gaussian process (GP) models is challenging because of the highly correlated posterior distribution. We describe an efficient Markov chain Monte Carlo algorithm for sampling from the posterior process of the GP model. This algorithm uses control variables, which are auxiliary function values that provide a low dimensional representation of the function. At each iteration, the algorithm proposes new values for the control variables and generates the function from the conditional GP prior. The control variable input locations are found by minimizing an objective function. We demonstrate the algorithm on regression and classification problems and we use it to estimate the parameters of a differential equation model of gene regulation.

1 Introduction

Gaussian processes (GPs) are used for Bayesian non-parametric estimation of unobserved or latent functions. In regression problems with Gaussian likelihoods, inference in GP models is analytically tractable, while for classification deterministic approximate inference algorithms are widely used [16, 4, 5, 10]. However, in recent applications of GP models in systems biology [1] that require the estimation of ordinary differential equation models [2, 3, 8], the development of deterministic approximations is difficult since the likelihood can be highly complex. Other applications of Gaussian processes where inference is intractable arise in spatio-temporal models and geostatistics, and deterministic approximations have also been developed there [14]. In this paper, we consider Markov chain Monte Carlo (MCMC) algorithms for inference in GP models. An advantage of MCMC over deterministic approximate inference is that it provides an arbitrarily precise approximation to the posterior distribution in the limit of long runs. Another advantage is that the sampling scheme will often not depend on details of the likelihood function, and is therefore very generally applicable.
In order to benefit from the advantages of MCMC it is necessary to develop an efficient sampling strategy. This has proved to be particularly difficult in many GP applications, because the posterior distribution describes a highly correlated high-dimensional variable. Thus simple MCMC sampling schemes such as Gibbs sampling can be very inefficient. In this contribution we describe an efficient MCMC algorithm for sampling from the posterior process of a GP model which constructs the proposal distributions by utilizing the GP prior. This algorithm uses control variables, which are auxiliary function values. At each iteration, the algorithm proposes new values for the control variables and samples the function by drawing from the conditional GP prior. The control variables are highly informative points that provide a low dimensional representation of the function. The control input locations are found by minimizing an objective function. The objective function used is the expected least squares error of reconstructing the function values from the control variables, where the expectation is over the GP prior. We demonstrate the proposed MCMC algorithm on regression and classification problems and compare it with two Gibbs sampling schemes. We also apply the algorithm to inference in a systems

biology model where a set of genes is regulated by a transcription factor protein [8]. This provides an example of a problem with a non-linear and non-factorized likelihood function.

2 Sampling algorithms for Gaussian process models

In a GP model we assume a set of inputs (x_1, ..., x_N) and a set of function values f = (f_1, ..., f_N) evaluated at those inputs. A Gaussian process places a prior on f which is an N-dimensional Gaussian distribution, so that p(f) = N(f|µ, K). The mean µ is typically zero and the covariance matrix K is defined by the kernel function k(x_n, x_m) that depends on parameters θ. GPs are widely used for supervised learning [11], in which case we have a set of observed pairs (y_i, x_i), where i = 1, ..., N, and we assume a likelihood model p(y|f) that depends on parameters α. For regression or classification problems, the latent function values are evaluated at the observed inputs and the likelihood factorizes according to p(y|f) = ∏_{i=1}^N p(y_i|f_i). However, for other types of applications, such as modelling latent functions in ordinary differential equations, the above factorization is not applicable. Assuming that we have obtained suitable values for the model parameters (θ, α), inference over f is done by applying Bayes' rule:

p(f|y) ∝ p(y|f)p(f).    (1)

For regression, where the likelihood is Gaussian, the above posterior is a Gaussian distribution that can be obtained using simple algebra. When the likelihood p(y|f) is non-Gaussian, computations become intractable and we need to carry out approximate inference. The MCMC algorithm we consider is the general Metropolis-Hastings (MH) algorithm [12]. Suppose we wish to sample from the posterior in eq. (1). The MH algorithm forms a Markov chain. We initialize f^{(0)} and we consider a proposal distribution Q(f^{(t+1)}|f^{(t)}) that allows us to draw a new state given the current state. The new state is accepted with probability min(1, A) where

A = [p(y|f^{(t+1)}) p(f^{(t+1)}) Q(f^{(t)}|f^{(t+1)})] / [p(y|f^{(t)}) p(f^{(t)}) Q(f^{(t+1)}|f^{(t)})].    (2)

To apply this generic algorithm, we need to choose the proposal distribution Q.
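The generic MH iteration just described can be sketched in a few lines of Python. This is an illustrative sketch in log space for numerical stability; the function names and arguments are our own, not part of the paper:

```python
import numpy as np

def metropolis_hastings(log_lik, log_prior, propose, log_q, f0, n_iters, rng):
    """Generic MH sampler for p(f|y) ∝ p(y|f) p(f).

    propose(f, rng) draws f' ~ Q(f'|f); log_q(f_to, f_from) = log Q(f_to|f_from).
    """
    f = f0
    log_post = log_lik(f) + log_prior(f)
    samples = []
    for _ in range(n_iters):
        f_new = propose(f, rng)
        log_post_new = log_lik(f_new) + log_prior(f_new)
        # log of A in eq. (2): posterior ratio times reverse/forward proposal ratio
        log_A = (log_post_new - log_post) + (log_q(f, f_new) - log_q(f_new, f))
        if np.log(rng.uniform()) < log_A:
            f, log_post = f_new, log_post_new
        samples.append(f)
    return np.array(samples)
```

Setting `propose` to a draw from the GP prior (with `log_q(a, b) = log p(a)`) recovers the independent MH algorithm discussed below, while a symmetric random-walk proposal makes the `log_q` terms cancel.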
For GP models, finding a good proposal distribution is challenging since f is high dimensional and the posterior distribution can be highly correlated. To motivate the algorithm presented in section 2.1, we discuss two extreme options for specifying the proposal distribution Q. One simple way to choose Q is to set it equal to the GP prior p(f). This gives us an independent MH algorithm [12]. However, sampling from the GP prior is very inefficient, as it is unlikely to obtain a sample that will fit the data. Thus the Markov chain will get stuck in the same state for thousands of iterations. On the other hand, sampling from the prior is appealing because any generated sample satisfies the smoothness requirement imposed by the covariance function. Functions drawn from the posterior GP process should satisfy the same smoothness requirement as well. The other extreme choice for the proposal, which has been considered in [10], is to apply Gibbs sampling where we iteratively draw samples from each posterior conditional density p(f_i|f_{-i}, y) with f_{-i} = f \ f_i. However, Gibbs sampling can be extremely slow for densely discretized functions, as in the regression problem of Figure 1, where the posterior GP process is highly correlated. To clarify this, note that the variance of the posterior conditional p(f_i|f_{-i}, y) is smaller than or equal to the variance of the conditional GP prior p(f_i|f_{-i}). However, p(f_i|f_{-i}) may already have a tiny variance caused by the conditioning on all remaining latent function values. For the one-dimensional example in Figure 1, Gibbs sampling is practically not applicable. We further study this issue in section 4. A similar algorithm to Gibbs sampling can be expressed by using the sequence of the conditional densities p(f_i|f_{-i}) as a proposal distribution for the MH algorithm.¹ We call this algorithm the Gibbs-like algorithm. This algorithm can exhibit a high acceptance rate, but it is inefficient for sampling from highly correlated functions. A simple generalization of the Gibbs-like algorithm that is more appropriate for sampling from smooth functions is to divide the domain of the function into regions and sample the entire function within each region by conditioning on the remaining function regions.
Local region sampling iteratively draws each block of function values f_k from

¹Thus we replace the proposal distribution p(f_i|f_{-i}, y) with the prior conditional p(f_i|f_{-i}).

the conditional GP prior p(f_k^{(t+1)}|f_{-k}^{(t)}), where f_{-k} = f \ f_k. However, this scheme is still inefficient for sampling highly correlated functions, since the variance of the proposal distribution can be very small close to the boundaries between neighbouring function regions. The description of this algorithm is given in the supplementary material. In the next section we discuss an algorithm using control variables that can efficiently sample from highly correlated functions.

2.1 Sampling using control variables

Let f_c be a set of M auxiliary function values that are evaluated at inputs X_c and drawn from the GP prior. We call f_c the control variables and their meaning is analogous to the auxiliary inducing variables used in sparse GP models [15]. To compute the posterior p(f|y) based on control variables we use the expression

p(f|y) = ∫ p(f|f_c, y) p(f_c|y) df_c.    (3)

Assuming that f_c is highly informative about f, so that p(f|f_c, y) ≈ p(f|f_c), we can approximately sample from p(f|y) in a two-stage manner: firstly sample the control variables from p(f_c|y) and then generate f from the conditional prior p(f|f_c). This scheme allows us to introduce a MH algorithm, where we need to specify only a proposal distribution q(f_c^{(t+1)}|f_c^{(t)}) that will mimic sampling from p(f_c|y), and always sample f from the conditional prior p(f|f_c). The whole proposal distribution takes the form

Q(f^{(t+1)}, f_c^{(t+1)} | f^{(t)}, f_c^{(t)}) = p(f^{(t+1)}|f_c^{(t+1)}) q(f_c^{(t+1)}|f_c^{(t)}).    (4)

Each proposed sample is accepted with probability min(1, A) where A is given by

A = [p(y|f^{(t+1)}) p(f_c^{(t+1)}) q(f_c^{(t)}|f_c^{(t+1)})] / [p(y|f^{(t)}) p(f_c^{(t)}) q(f_c^{(t+1)}|f_c^{(t)})].    (5)

The usefulness of the above sampling scheme stems from the fact that the control variables can form a low-dimensional representation of the function. Assuming that these variables are much fewer than the points in f, the sampling is mainly carried out in the low dimensional space. In section 2.2 we describe how to select the number M of control variables and the inputs X_c so that f_c becomes highly informative about f. In the remainder of this section we discuss how we set the proposal distribution q(f_c^{(t+1)}|f_c^{(t)}).
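The second stage of the proposal in eq. (4), drawing f from the conditional GP prior p(f|f_c), only requires the standard Gaussian conditioning formulas. A minimal sketch, where the squared-exponential kernel, the jitter level, and all function names are our own assumptions:

```python
import numpy as np

def rbf(X1, X2, variance=1.0, lengthscale=1.0):
    # squared-exponential kernel k(x_m, x_n)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def conditional_gp_prior(X, Xc, fc, kern=rbf, jitter=1e-8):
    """Mean and covariance of p(f | fc) under a zero-mean GP prior:
    mean = K_{f,c} K_{c,c}^{-1} fc,  cov = K_{f,f} - K_{f,c} K_{c,c}^{-1} K_{c,f}."""
    Kcc = kern(Xc, Xc) + jitter * np.eye(len(Xc))
    Kfc = kern(X, Xc)
    A = np.linalg.solve(Kcc, Kfc.T).T      # K_{f,c} K_{c,c}^{-1}
    return A @ fc, kern(X, X) - A @ Kfc.T

def draw_f_given_controls(X, Xc, fc, rng, jitter=1e-8):
    """Second stage of the proposal in eq. (4): f ~ p(f | fc)."""
    mean, cov = conditional_gp_prior(X, Xc, fc)
    L = np.linalg.cholesky(cov + jitter * np.eye(len(X)))
    return mean + L @ rng.standard_normal(len(X))
```

Note that when X coincides with the control inputs X_c the conditional mean interpolates f_c and the conditional covariance vanishes, which is the delta-function limit mentioned in section 2.2.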
A suitable choice for q is to use a Gaussian distribution with diagonal or full covariance matrix. The covariance matrix can be adapted during the burn-in phase of MCMC in order to increase the acceptance rate. Although this scheme is general, it has practical limitations. Firstly, tuning a full covariance matrix is time consuming and in our case this adaption process must be carried out simultaneously with searching for an appropriate set of control variables. Also, since the terms involving p(f_c) do not cancel out in the acceptance probability in eq. (5), using a diagonal covariance for the q distribution has the risk of proposing control variables that may not satisfy the GP prior smoothness requirement. To avoid these problems, we define q by utilizing the GP prior. According to eq. (3), a suitable choice for q must mimic the sampling from the posterior p(f_c|y). Given that the control points are far apart from each other, Gibbs sampling in the control variables space can be efficient. However, iteratively sampling f_{c_i} from the conditional posterior p(f_{c_i}|f_{c_{-i}}, y) ∝ p(y|f_c) p(f_{c_i}|f_{c_{-i}}), where f_{c_{-i}} = f_c \ f_{c_i}, is intractable for non-Gaussian likelihoods.² An attractive alternative is to use a Gibbs-like algorithm where each f_{c_i} is drawn from the conditional GP prior p(f_{c_i}^{(t+1)}|f_{c_{-i}}^{(t)}) and is accepted using the MH step. More specifically, the proposal distribution draws a new f_{c_i}^{(t+1)} for a certain control variable i from p(f_{c_i}^{(t+1)}|f_{c_{-i}}^{(t)}) and generates the function f^{(t+1)} from p(f^{(t+1)}|f_{c_i}^{(t+1)}, f_{c_{-i}}^{(t)}). The sample (f_{c_i}^{(t+1)}, f^{(t+1)}) is accepted using the MH step. This scheme of sampling the control variables one-at-a-time and resampling f is iterated between different control variables. A complete iteration of the algorithm consists of a full scan over all control variables. The acceptance probability A in eq. (5) becomes the likelihood ratio and the prior smoothness requirement is always satisfied. The iteration between different control variables is illustrated in Figure 1.

²This is because we need to integrate out f in order to compute p(y|f_c).
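One complete scan of this one-at-a-time scheme can be sketched as follows. The kernel choice, jitter level, and all names are our own assumptions rather than the authors' code; the acceptance step uses only the likelihood ratio, as stated above:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def control_sweep(X, Xc, f, fc, log_lik, rng, jitter=1e-6):
    """One complete scan over the control variables (section 2.1)."""
    M = len(fc)
    Kcc = rbf(Xc, Xc) + jitter * np.eye(M)
    Kfc = rbf(X, Xc)
    Kff = rbf(X, X)
    A = np.linalg.solve(Kcc, Kfc.T).T                  # K_{f,c} K_{c,c}^{-1}
    for i in range(M):
        rest = [j for j in range(M) if j != i]
        # 1-D conditional GP prior p(fc_i | fc_{-i})
        w = np.linalg.solve(Kcc[np.ix_(rest, rest)], Kcc[rest, i])
        mu_i = w @ fc[rest]
        var_i = Kcc[i, i] - w @ Kcc[rest, i]
        fc_new = fc.copy()
        fc_new[i] = mu_i + np.sqrt(max(var_i, 0.0)) * rng.standard_normal()
        # second stage of eq. (4): redraw the whole function from p(f | fc_new)
        mean, cov = A @ fc_new, Kff - A @ Kfc.T
        L = np.linalg.cholesky(cov + jitter * np.eye(len(f)))
        f_new = mean + L @ rng.standard_normal(len(f))
        # eq. (5) reduces to the likelihood ratio for this proposal
        if np.log(rng.uniform()) < log_lik(f_new) - log_lik(f):
            f, fc = f_new, fc_new
    return f, fc
```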

Figure 1: Visualization of iterating between control variables. The red solid line is the current f^{(t)}, the blue line is the proposed f^{(t+1)}, the red circles are the current control variables f_c^{(t)} while the diamond (in magenta) is the proposed control variable f_{c_i}^{(t+1)}. The blue solid vertical line represents the distribution p(f_{c_i}^{(t+1)}|f_{c_{-i}}^{(t)}) (with two-standard error bars) and the shaded area shows the effective proposal p(f^{(t+1)}|f_{c_{-i}}^{(t)}).

Although the control variables are sampled one-at-a-time, f can still be drawn with a considerable variance. To clarify this, note that when the control variable f_{c_i} changes, the effective proposal distribution for f is

p(f^{(t+1)}|f_{c_{-i}}^{(t)}) = ∫ p(f^{(t+1)}|f_{c_i}^{(t+1)}, f_{c_{-i}}^{(t)}) p(f_{c_i}^{(t+1)}|f_{c_{-i}}^{(t)}) df_{c_i}^{(t+1)},

which is the conditional GP prior given all the control points apart from the current point f_{c_i}. This conditional prior can have considerable variance close to f_{c_i} and in all regions that are not close to the remaining control variables. As illustrated in Figure 1, the iteration over different control variables allows f to be drawn with a considerable variance everywhere in the input space.

2.2 Selection of the control variables

To apply the previous algorithm we need to select the number, M, of the control points and the associated inputs X_c. X_c must be chosen so that knowledge of f_c can determine f with small error. The prediction of f given f_c is equal to K_{f,c} K_{c,c}^{-1} f_c, which is the mean of the conditional prior p(f|f_c). A suitable way to search over X_c is to minimize the reconstruction error ||f − K_{f,c} K_{c,c}^{-1} f_c||^2 averaged over any possible value of (f, f_c):

G(X_c) = ∫ ||f − K_{f,c} K_{c,c}^{-1} f_c||^2 p(f|f_c) p(f_c) df df_c = Tr(K_{f,f} − K_{f,c} K_{c,c}^{-1} K_{f,c}^T).

The quantity inside the trace is the covariance of p(f|f_c) and thus G(X_c) is the total variance of this distribution. We can minimize G(X_c) w.r.t. X_c using continuous optimization, similarly to the approach in [15]. Note that when G(X_c) becomes zero, p(f|f_c) becomes a delta function. To find the number M of control points we minimize G(X_c) by incrementally adding control variables until the total variance of p(f|f_c) becomes smaller than a certain percentage of the total variance of the prior p(f); 5% was the threshold used in all our experiments.
Then we start the simulation and we observe the acceptance rate of the Markov chain. According to standard heuristics [12], which suggest that desirable acceptance rates of MH algorithms are around 1/4, we require a full iteration of the algorithm (a complete scan over the control variables) to have an acceptance rate larger than 1/4. When, for the current set of control inputs X_c, the chain has a low acceptance rate, it means that the variance of p(f|f_c) is still too high and we need to add more control points in order to further reduce G(X_c). The process of observing the acceptance rate and adding control variables is continued until we reach the desirable acceptance rate. When the training inputs X are placed uniformly in the space, and the kernel function is stationary, the minimization of G places X_c on a regular grid. In general, the minimization of G places the control inputs close to the clusters of the input data in such a way that the kernel function is taken into account. This suggests that G can also be used for learning inducing variables in sparse GP models in an unsupervised fashion, where the observed outputs y are not involved.
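The objective G(X_c) and the incremental addition of control points can be sketched as follows. For simplicity, this sketch picks control inputs greedily from the training inputs rather than by continuous optimization; the kernel and the greedy strategy are our assumptions, while the trace formula and the percentage threshold come from the text:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def G(X, Xc, kern=rbf, jitter=1e-8):
    """G(Xc) = Tr(K_{f,f} - K_{f,c} K_{c,c}^{-1} K_{f,c}^T):
    the total variance of the conditional prior p(f | fc)."""
    Kcc = kern(Xc, Xc) + jitter * np.eye(len(Xc))
    Kfc = kern(X, Xc)
    return np.trace(kern(X, X)) - np.trace(Kfc @ np.linalg.solve(Kcc, Kfc.T))

def select_control_inputs(X, frac=0.05, kern=rbf):
    """Greedily add control inputs until G(Xc) falls below `frac` of the
    total prior variance Tr(K_{f,f})."""
    total = np.trace(kern(X, X))
    chosen = []
    while True:
        if chosen and G(X, X[chosen], kern) < frac * total:
            return X[chosen]
        # add the candidate input that most reduces G
        scores = [(G(X, X[chosen + [i]], kern), i)
                  for i in range(len(X)) if i not in chosen]
        if not scores:
            return X[chosen]
        chosen.append(min(scores)[1])
```

In practice one would continue adding points beyond this threshold whenever the observed acceptance rate stays below 1/4, as described above.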

3 Applications

We consider two applications where exact inference is intractable due to a non-linear likelihood function: classification and parameter estimation in a differential equation model of gene regulation.

Classification: Deterministic inference methods for GP classification are described in [16, 4, 7]. Among these approaches, the Expectation-Propagation (EP) algorithm [9] is found to be the most efficient [6]. Our MCMC implementation confirms these findings, since sampling using control variables gave similar classification accuracy to EP.

Transcriptional regulation: We consider a small biological sub-system where a set of target genes are regulated by one transcription factor (TF) protein. Ordinary differential equations (ODEs) can provide a useful framework for modelling the dynamics in these biological networks [1, 2, 13, 8]. The concentration of the TF and the gene-specific kinetic parameters are typically unknown and need to be estimated by making use of a set of observed gene expression levels. We use a GP prior to model the unobserved TF activity, as proposed in [8], and apply full Bayesian inference based on the MCMC algorithm presented previously. Barenco et al. [2] introduce a linear ODE model for gene activation from TF. This approach was extended in [13, 8] to account for non-linear models. The general form of the ODE model for transcription regulation with a single TF is

dy_j(t)/dt = B_j + S_j g(f(t)) − D_j y_j(t),    (6)

where the changing level of a gene j's expression, y_j(t), is given by a combination of basal transcription rate, B_j, sensitivity, S_j, to its governing TF's activity, f(t), and the decay rate of the mRNA, D_j. The differential equation can be solved for y_j(t), giving

y_j(t) = B_j/D_j + A_j e^{−D_j t} + S_j e^{−D_j t} ∫_0^t g(f(u)) e^{D_j u} du,    (7)

where the A_j term arises from the initial condition. Due to the non-linearity of the g function that transforms the TF, the integral in the above expression cannot be obtained analytically. However, numerical integration can be used to accurately approximate the integral with a dense grid (u_p)_{p=1}^P of points in the time axis, evaluating the function at the grid points f_p = f(u_p).
In this case the integral in the above equation can be written as ∑_{p=1}^{P_t} w_p g(f_p) e^{D_j u_p}, where the weights w_p arise from the numerical integration method used and, for example, can be given by the composite Simpson rule. The TF concentration f(t) in the above system of ODEs is a latent function that needs to be estimated. Additionally, the kinetic parameters of each gene, α_j = (B_j, D_j, S_j, A_j), are unknown and also need to be estimated. To infer these quantities we use mRNA measurements (obtained from microarray experiments) of N target genes at T different time steps. Let y_{jt} denote the observed gene expression level of gene j at time t and let y = {y_{jt}} collect together all these observations. Assuming Gaussian noise for the observed gene expressions, the likelihood of our data has the form

p(y|f, {α_j}_{j=1}^N) = ∏_{j=1}^N ∏_{t=1}^T p(y_{jt}|f_{1:P_t}, α_j),    (8)

where each probability density in the above product is a Gaussian with mean given by eq. (7) and f_{1:P_t} denotes the TF values up to time t. Note that this likelihood is non-Gaussian due to the non-linearity of g. Further, this likelihood does not have a factorized form, as in the regression and classification cases, since an observed gene expression depends on the protein concentration activity at all previous time points. Also note that the discretization of the TF in P time points corresponds to a very dense grid, while the gene expression measurements are sparse, i.e. P ≫ T. To apply full Bayesian inference in the above model, we need to define prior distributions over all unknown quantities. The protein concentration f is a positive quantity, thus a suitable prior is to consider a GP prior for log f. The kinetic parameters of each gene are all positive scalars and are given vague gamma priors. Sampling the GP function is done exactly as described in section 2.1; we only have to plug in the likelihood from eq. (8) in the MH step. Sampling the kinetic parameters is carried out using Gaussian proposal distributions with diagonal covariance matrices that sample the positive kinetic parameters in the log space.
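To make eq. (7) concrete, here is a sketch of evaluating y_j(t) on the dense grid. It uses trapezoidal quadrature weights rather than the composite Simpson rule mentioned above, and the Michaelis-Menten form of g with γ_j = 1, the function name, and the argument names are our assumptions:

```python
import numpy as np

def gene_expression(t_grid, f_grid, B, S, D, A, g=lambda f: f / (1.0 + f)):
    """Evaluate eq. (7) on a dense time grid:
    y_j(t) = B/D + A e^{-D t} + S e^{-D t} * int_0^t g(f(u)) e^{D u} du.
    The integral is approximated with cumulative trapezoidal weights w_p."""
    integrand = g(f_grid) * np.exp(D * t_grid)
    dt = np.diff(t_grid)
    # cumulative trapezoidal approximation of the integral up to each grid point
    cum = np.concatenate(([0.0],
                          np.cumsum(0.5 * dt * (integrand[1:] + integrand[:-1]))))
    return B / D + A * np.exp(-D * t_grid) + S * np.exp(-D * t_grid) * cum
```

Each Gaussian factor in eq. (8) then has its mean given by this function evaluated at the sparse measurement times.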

Figure 2: (a) shows the evolution of the KL divergence (against the number of MCMC iterations) between the true posterior and the empirically estimated posteriors for a 5-dimensional regression dataset. (b) shows the mean values with one-standard error bars of the KL divergence (against the input dimension) between the true posterior and the empirically estimated posteriors. (c) plots the number of control variables together with the average correlation coefficient of the GP prior.

4 Experiments

In the first experiment we compare Gibbs sampling (Gibbs), sampling using local regions (region) (see the supplementary file) and sampling using control variables (control) on standard regression problems of varied input dimensions. The performance of the algorithms can be accurately assessed by computing the KL divergences between the exact Gaussian posterior p(f|y) and the Gaussians obtained by MCMC. We fix the number of training points to N = 200 and we vary the input dimension d from 1 to 10. The training inputs X were chosen randomly inside the unit hypercube [0, 1]^d. Thus, we can study the behaviour of the algorithms w.r.t. the amount of correlation in the posterior GP process, which depends on how densely the function is sampled. The larger the dimension, the more sparsely the function is sampled. The outputs y were chosen by randomly producing a GP function using the squared-exponential kernel σ_f^2 exp(−||x_m − x_n||^2/(2ℓ^2)) and then adding noise with variance σ^2 = 0.09. The burn-in period was 10^4 iterations. For a certain dimension d the algorithms were initialized to the same state obtained by randomly drawing from the GP prior. The parameters (σ_f^2, ℓ^2, σ^2) were fixed to the values that generated the data. The experimental setup was repeated 10 times so as to obtain confidence intervals. We used thinned samples (by keeping one sample every 10 iterations) to calculate the means and covariances of the 200-dimensional posterior Gaussians.
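The KL divergence used to assess the samplers has a closed form between two Gaussians; a minimal sketch (our own helper, with log-determinants computed from Cholesky factors for numerical stability):

```python
import numpy as np

def gauss_kl(mu0, S0, mu1, S1):
    """KL( N(mu0, S0) || N(mu1, S1) ) between multivariate Gaussians."""
    k = len(mu0)
    L0 = np.linalg.cholesky(S0)
    L1 = np.linalg.cholesky(S1)
    # log-determinants from the Cholesky diagonals
    logdet0 = 2.0 * np.sum(np.log(np.diag(L0)))
    logdet1 = 2.0 * np.sum(np.log(np.diag(L1)))
    d = mu1 - mu0
    quad = d @ np.linalg.solve(S1, d)
    return 0.5 * (np.trace(np.linalg.solve(S1, S0)) + quad - k
                  + logdet1 - logdet0)
```

Here mu0, S0 would be the empirical mean and covariance of the thinned MCMC samples, and mu1, S1 the exact Gaussian posterior available in the regression case.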
Figure 2(a) shows the KL divergence against the number of MCMC iterations for the 5-dimensional input dataset. It seems that for 200 training points and 5 dimensions the function values are still highly correlated and thus Gibbs takes much longer for the KL divergence to drop to zero. Figure 2(b) shows the KL divergence against the input dimension after fixing the number of iterations to 10^4. Clearly Gibbs is very inefficient in low dimensions because of the highly correlated posterior. As the dimension increases and the functions become sparsely sampled, Gibbs improves and eventually the KL divergence approaches zero. The region algorithm works better than Gibbs, but in low dimensions it also suffers from the problem of high correlation. For the control algorithm we observe that the KL divergence is very close to zero for all dimensions. Figure 2(c) shows the increase in the number of control variables used as the input dimension increases. The same plot shows the decrease of the average correlation coefficient of the GP prior as the input dimension increases. This is very intuitive, since one should expect the number of control variables to increase as the function values become more independent. Next we consider two GP classification problems for which exact inference is intractable. We used the Wisconsin Breast Cancer (WBC) and the Pima Indians Diabetes (PID) binary classification datasets. The first consists of 683 examples (9 input dimensions) and the second of 768 examples (8 dimensions). 20% of the examples were used for testing in each case. The MCMC samplers were run for 10^4 iterations (thinned to one sample every five iterations) after a burn-in of 10^4 iterations.³ The hyperparameters were fixed to those obtained by EP. Figures 3(a) and 3(b) show

³For Gibbs we used 2 × 10^4 iterations, since the region and control algorithms require additional iterations during the adaption phase.

Figure 3: Results for GP classification. Log-likelihood values are shown for MCMC samples obtained from (a) Gibbs and (b) control applied to the WBC dataset. In (c) we show the test errors (grey bars) and the average negative log likelihoods (black bars) on the WBC (left) and PID (right) datasets and compare with EP.

Figure 4: First row: The left plot shows the inferred TF concentration for p53; the small plot on top-right shows the ground-truth protein concentration obtained by a Western blot experiment [2]. The middle plot shows the predicted expression of a gene obtained by the estimated ODE model; red crosses correspond to the actual gene expression measurements. The right-hand plot shows the estimated decay rates for all target genes used to train the model. Grey bars display the parameters found by MCMC and black bars the parameters found in [2] using a linear ODE model. Second row: The left plot shows the inferred TF for LexA. Predicted expressions of two target genes are shown in the other two plots. Error bars in all plots correspond to 95% credibility intervals.

the log-likelihood for MCMC samples on the WBC dataset, for the Gibbs and control algorithms respectively. It can be observed that mixing is far superior for the control algorithm, and it has also converged to a much higher likelihood. In Figure 3(c) we compare the test error and the average negative log likelihood on the test data obtained by the two MCMC algorithms with the results from EP. The proposed control algorithm shows similar classification performance to EP, while the Gibbs algorithm performs significantly worse on both datasets. In the final two experiments we apply the control algorithm to infer the protein concentration of TFs that activate or repress a set of target genes.
The latent function in these problems is always one-dimensional and densely discretized, and thus the control algorithm is the only one that can converge to the GP posterior process in a reasonable time. We first consider the TF p53, a tumour suppressor activated during DNA damage. Seven samples of the expression levels of five target genes in three replicas are collected as the raw time course data. The non-linear activation of the protein follows the Michaelis-Menten-kinetics-inspired response [1] that allows saturation effects to be taken into account, so that g(f(t)) = f(t)/(γ_j + f(t)) in eq.

(6), where the Michaelis constant for the jth gene is given by γ_j. Note that since f(t) is positive the GP prior is placed on log f(t). To apply MCMC we discretize f using a dense grid of P points. During sampling, 7 control variables were needed to obtain the desirable acceptance rate. Running time was 4 hours for the sampling iterations plus the burn-in iterations. The first row of Figure 4 summarizes the estimated quantities obtained from the MCMC simulation. Next we consider the TF LexA in E. coli, which acts as a repressor. In the repression case there is an analogous Michaelis-Menten model [1] where the non-linear function g takes the form g(f(t)) = 1/(γ_j + f(t)). Again the GP prior is placed on the log of the TF activity. We applied our method to the same microarray data considered in [13], where mRNA measurements of 14 target genes are collected over six time points. For this dataset, the expression of the 14 genes was available for T = 6 times. The GP function f was discretized using a dense grid of points. The result for the inferred TF profile, along with predictions of two target genes, is shown in the second row of Figure 4. Our inferred TF profile and reconstructed target gene profiles are similar to those obtained in [13]. However, for certain genes, our model provides a better fit to the gene profile.

5 Discussion

Gaussian processes allow for inference over latent functions using a Bayesian estimation framework. In this paper, we presented an MCMC algorithm that uses control variables. We showed that this sampling scheme can efficiently deal with highly correlated posterior GP processes. MCMC allows for full Bayesian inference in the transcription factor networks application. An important direction for future research will be scaling the models used to much larger systems of ODEs with multiple interacting transcription factors. In such large systems, where MCMC can become slow, a combination of our method with the fast sampling scheme in [3] could be used to speed up the inference.

Acknowledgments

This work is funded by EPSRC Grant No EP/F005687/1 "Gaussian Processes for Systems Identification with Applications in Systems Biology".

References

[1] U. Alon.
An Introduction to Systems Biology: Design Principles of Biological Circuits. Chapman and Hall/CRC, 2006.
[2] M. Barenco, D. Tomescu, D. Brewer, J. Callard, R. Stark, and M. Hubank. Ranked prediction of p53 targets using hidden variable dynamic modeling. Genome Biology, 7(3), 2006.
[3] B. Calderhead, M. Girolami, and N. D. Lawrence. Accelerating Bayesian inference over nonlinear differential equations with Gaussian processes. In Advances in Neural Information Processing Systems 21, 2008.
[4] L. Csato and M. Opper. Sparse online Gaussian processes. Neural Computation, 14:641-668, 2002.
[5] M. N. Gibbs and D. J. C. MacKay. Variational Gaussian process classifiers. IEEE Transactions on Neural Networks, 11(6):1458-1464, 2000.
[6] M. Kuss and C. E. Rasmussen. Assessing approximate inference for binary Gaussian process classification. Journal of Machine Learning Research, 6:1679-1704, 2005.
[7] N. D. Lawrence, M. Seeger, and R. Herbrich. Fast sparse Gaussian process methods: the informative vector machine. In Advances in Neural Information Processing Systems 15. MIT Press, 2003.
[8] N. D. Lawrence, G. Sanguinetti, and M. Rattray. Modelling transcriptional regulation using Gaussian processes. In Advances in Neural Information Processing Systems 19. MIT Press, 2007.
[9] T. Minka. Expectation propagation for approximate Bayesian inference. In UAI, pages 362-369, 2001.
[10] R. M. Neal. Monte Carlo implementation of Gaussian process models for Bayesian regression and classification. Technical report, Dept. of Statistics, University of Toronto, 1997.
[11] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.
[12] C. P. Robert and G. Casella. Monte Carlo Statistical Methods. Springer-Verlag, 2nd edition, 2004.
[13] S. Rogers, R. Khanin, and M. Girolami. Bayesian model-based inference of transcription factor activity. BMC Bioinformatics, 8(2), 2006.
[14] H. Rue, S. Martino, and N. Chopin. Approximate Bayesian inference for latent Gaussian models using integrated nested Laplace approximations. NTNU Statistics Preprint, 2007.
[15] E. Snelson and Z. Ghahramani. Sparse Gaussian processes using pseudo-inputs. In Advances in Neural Information Processing Systems 18. MIT Press, 2006.
[16] C. K. I. Williams and D. Barber.
Bayesian classification with Gaussian processes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(12):1342-1351, 1998.

Clustering. CS4780/5780 Machine Learning Fall Thorsten Joachims Cornell University

Clustering. CS4780/5780 Machine Learning Fall Thorsten Joachims Cornell University Clusterng CS4780/5780 Mahne Learnng Fall 2012 Thorsten Joahms Cornell Unversty Readng: Mannng/Raghavan/Shuetze, Chapters 16 (not 16.3) and 17 (http://nlp.stanford.edu/ir-book/) Outlne Supervsed vs. Unsupervsed

More information

Outline. Clustering: Similarity-Based Clustering. Supervised Learning vs. Unsupervised Learning. Clustering. Applications of Clustering

Outline. Clustering: Similarity-Based Clustering. Supervised Learning vs. Unsupervised Learning. Clustering. Applications of Clustering Clusterng: Smlarty-Based Clusterng CS4780/5780 Mahne Learnng Fall 2013 Thorsten Joahms Cornell Unversty Supervsed vs. Unsupervsed Learnng Herarhal Clusterng Herarhal Agglomeratve Clusterng (HAC) Non-Herarhal

More information

JSM Survey Research Methods Section. Is it MAR or NMAR? Michail Sverchkov

JSM Survey Research Methods Section. Is it MAR or NMAR? Michail Sverchkov JSM 2013 - Survey Researh Methods Seton Is t MAR or NMAR? Mhal Sverhkov Bureau of Labor Statsts 2 Massahusetts Avenue, NE, Sute 1950, Washngton, DC. 20212, Sverhkov.Mhael@bls.gov Abstrat Most methods that

More information

Controller Design for Networked Control Systems in Multiple-packet Transmission with Random Delays

Controller Design for Networked Control Systems in Multiple-packet Transmission with Random Delays Appled Mehans and Materals Onlne: 03-0- ISSN: 66-748, Vols. 78-80, pp 60-604 do:0.408/www.sentf.net/amm.78-80.60 03 rans eh Publatons, Swtzerland H Controller Desgn for Networed Control Systems n Multple-paet

More information

The corresponding link function is the complementary log-log link The logistic model is comparable with the probit model if

The corresponding link function is the complementary log-log link The logistic model is comparable with the probit model if SK300 and SK400 Lnk funtons for bnomal GLMs Autumn 08 We motvate the dsusson by the beetle eample GLMs for bnomal and multnomal data Covers the followng materal from hapters 5 and 6: Seton 5.6., 5.6.3,

More information

Markov Chain Monte Carlo Algorithms for Gaussian Processes

Markov Chain Monte Carlo Algorithms for Gaussian Processes Markov Chain Monte Carlo Algorithms for Gaussian Processes Michalis K. Titsias, Neil Lawrence and Magnus Rattray School of Computer Science University of Manchester June 8 Outline Gaussian Processes Sampling

More information

Machine Learning: and 15781, 2003 Assignment 4

Machine Learning: and 15781, 2003 Assignment 4 ahne Learnng: 070 and 578, 003 Assgnment 4. VC Dmenson 30 onts Consder the spae of nstane X orrespondng to all ponts n the D x, plane. Gve the VC dmenson of the followng hpothess spaes. No explanaton requred.

More information

Instance-Based Learning and Clustering

Instance-Based Learning and Clustering Instane-Based Learnng and Clusterng R&N 04, a bt of 03 Dfferent knds of Indutve Learnng Supervsed learnng Bas dea: Learn an approxmaton for a funton y=f(x based on labelled examples { (x,y, (x,y,, (x n,y

More information

FAULT DETECTION AND IDENTIFICATION BASED ON FULLY-DECOUPLED PARITY EQUATION

FAULT DETECTION AND IDENTIFICATION BASED ON FULLY-DECOUPLED PARITY EQUATION Control 4, Unversty of Bath, UK, September 4 FAUL DEECION AND IDENIFICAION BASED ON FULLY-DECOUPLED PARIY EQUAION C. W. Chan, Hua Song, and Hong-Yue Zhang he Unversty of Hong Kong, Hong Kong, Chna, Emal:

More information

Using Artificial Neural Networks and Support Vector Regression to Model the Lyapunov Exponent

Using Artificial Neural Networks and Support Vector Regression to Model the Lyapunov Exponent Usng Artfal Neural Networks and Support Vetor Regresson to Model the Lyapunov Exponent Abstrat: Adam Maus* Aprl 3, 009 Fndng the salent patterns n haot data has been the holy gral of Chaos Theory. Examples


Lecture Notes on Linear Regression

Lecture Notes on Linear Regression. Feng Li (fl@sdu.edu.cn), Shandong University, China. Linear Regression Problem: in a regression problem, we aim at predicting a continuous target value given an input feature vector. We assume


Clustering through Mixture Models

Clustering through Mixture Models. General references: Lindsay B.G. 1995, Mixture Models: Theory, Geometry and Applications, NSF-CBMS Regional Conference Series in Probability and Statistics; McLachlan G.J., Basford K.E. 1988, Mixture


EEE 241: Linear Systems

EEE 241: Linear Systems. Summary #: Backpropagation. BACKPROPAGATION: The perceptron rule as well as the Widrow-Hoff learning rule were designed to train single layer networks. They suffer from the same disadvantage: they


Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing Bioinformatics Course Supplement

Markov Chain Monte Carlo (MCMC), Gibbs Sampling, Metropolis Algorithms, and Simulated Annealing. 2001 Bioinformatics Course Supplement, SNU Biointelligence Lab, http://bi.snu.ac.kr/. Outline: Markov Chain Monte Carlo (MCMC); Metropolis-Hastings
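The snippet above outlines MCMC and Metropolis-Hastings. As a minimal illustrative sketch (our own, not code from the excerpted course material), a random-walk Metropolis sampler for a one-dimensional target looks like this:

```python
import math
import random

def metropolis(logp, x0, n_steps, step=1.0, seed=0):
    """Minimal random-walk Metropolis sampler for a 1-D target density.

    logp: unnormalized log-density of the target.
    step: standard deviation of the Gaussian proposal.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        # accept with probability min(1, p(prop)/p(x))
        if math.log(rng.random() + 1e-300) < logp(prop) - logp(x):
            x = prop
        samples.append(x)
    return samples

# target: standard normal, log p(x) = -x^2/2 up to an additive constant
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
```

With a well-chosen step size the empirical mean and variance of the chain approach those of the target (0 and 1 here); this is the scheme the Gibbs/Metropolis outline above refers to.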


Gaussian process classification: a message-passing viewpoint

Gaussian process classification: a message-passing viewpoint. Filipe Rodrigues (fmpr@dei.uc.pt), November 014. Abstract: The goal of this short paper is to provide a message-passing viewpoint of the Expectation Propagation (EP)


Adaptive Multilayer Neural Network Control of Blood Pressure

Adaptive Multilayer Neural Network Control of Blood Pressure. Proceedings of 1st International Symposium on Instrument Science and Technology, ISIST 99, P4-45, 999. (Word format file: ISIST99.doc) Fei Juntao, Zhang Bo, Department


Gaussian Mixture Models

Gaussian Mixture Models Lab. Lab Objective: Understand the formulation of Gaussian Mixture Models (GMMs) and how to estimate GMM parameters. You've already seen GMMs as the observation distribution in certain continuous


Geometric Clustering using the Information Bottleneck method

Geometric Clustering using the Information Bottleneck method. Susanne Still, Department of Physics, Princeton University, Princeton, NJ 08544, susanna@princeton.edu; William Bialek, Department of Physics, Princeton University,


A solution to the Curse of Dimensionality Problem in Pairwise Scoring Techniques

A solution to the Curse of Dimensionality Problem in Pairwise Scoring Techniques. Man Wai MAK, Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University; Sun Yuan KUNG, Dept. of Electrical Engineering, Princeton


Generalized Linear Methods

Generalized Linear Methods. 1 Introduction. In the Ensemble Methods the general idea is that by using a combination of several weak learners one could make a better learner. More formally, assume that we have a set


Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension. The purpose of this document is to review material covered in Machine Learning 1: Supervised Learning regarding support vector machines (SVMs). This document also provides a general


8/25/17. Data Modeling. Patrice Koehl, Department of Biological Sciences, National University of Singapore

8/25/17. Data Modeling. Patrice Koehl, Department of Biological Sciences, National University of Singapore. http://www.cs.ucdavis.edu/~koehl/teaching/bl59, koehl@cs.ucdavis.edu. Data Modeling: least squares;


Classification as a Regression Problem

Classification as a Regression Problem. Target variable y ∈ {C_1, C_2, ..., C_K}. To treat classification as a regression problem we should transform the target y into numerical values; the choice of numerical class


technische universiteit eindhoven Analysis of one product /one location inventory control models prof.dr. A.G. de Kok 1

TU/e technische universiteit eindhoven. Analysis of one product/one location inventory control models, prof.dr. A.G. de Kok. Acknowledgements: I would like to thank Leonard Fortuin for translating this course material into English


MATH 829: Introduction to Data Mining and Analysis The EM algorithm (part 2)

MATH 829: Introduction to Data Mining and Analysis. The EM algorithm (part 2). Dominique Guillot, Department of Mathematical Sciences, University of Delaware, April 20, 2016. Recall: we are given independent observations


Lecture 12: Classification

Lecture 12: Classification. Discriminant functions; the optimal Bayes classifier; quadratic classifiers; Euclidean and Mahalanobis metrics; K Nearest Neighbor classifiers. Intelligent Sensor Systems, Ricardo Gutierrez-Osuna


Gravity Drainage Prior to Cake Filtration

Gravity Drainage Prior to Cake Filtration. Scott A. Wells and Gregory K. Savage, Department of Civil Engineering, Portland State University, Portland, Oregon 97207-0751. Voice (503) 725-4276, Fax (503) 725-4298, http://www.ce.pdx.edu/~wellss


Voltammetry. Bulk electrolysis: relatively large electrodes (on the order of cm 2 ) Voltammetry:

Voltammetry: a variety of electroanalytical methods rely on the application of a potential function to an electrode with the measurement of the resulting current in the cell. In contrast with bulk electrolysis methods, the objective


STK4900/9900 Lecture 4 Program. Counterfactuals and causal effects. Example (cf. practical exercise 10)

STK4900/9900 - Lecture 4. Program: 1. Counterfactuals and causal effects; 2. Confounding; 3. Interaction; 4. More on ANOVA. Sections 4.1, 4.4, 4.6; supplementary material on ANOVA. Example (cf. practical exercise 10): How does


Phase Transition in Collective Motion

Phase Transition in Collective Motion. Hefei Hu, May 4, 2008. Abstract: There has been a high interest in studying the collective behavior of organisms in recent years. When the density of living systems is increased, a phase transition


1 Motivation and Introduction

1 Motivation and Introduction. EXPECTATION PROPAGATION. Instructor: Dr. Volkan Cevher, September 30, 2008, Rice University, STAT 63 / ELEC 633: Graphical Models. Scribes: Ahmad Beirami, Andrew Waters, Matthew Nokleby. Index terms: approximate inference,


Exact Inference: Introduction.

Exact inference: introduction. Using a Bayesian network to compute probabilities is called inference. In general, inference involves queries of the form P(X | E = e), where E denotes the evidence variables and X the query variables


Improving the Performance of Fading Channel Simulators Using New Parameterization Method

Improving the Performance of Fading Channel Simulators Using New Parameterization Method. International Journal of Electronics and Electrical Engineering, Vol. 4, No. 5, October 06. Omar Alzoubi and Moheldin Wanakh, Department


An Evaluation on Feature Selection for Text Clustering

An Evaluation on Feature Selection for Text Clustering. Tao Liu, Department of Information Science, Nankai University, Tianjin 30007, P. R. China; Shengping Liu, Department of Information Science, Peking University, Beijing 0087, P. R.


Chapter 12. Ordinary Differential Equation Boundary Value (BV) Problems

Chapter 12. Ordinary Differential Equation Boundary Value (BV) Problems. In this chapter we will learn how to solve ODE boundary value problems. A BV ODE is usually given with x being the independent space variable. p(x), q(


Predictive Analytics : QM901.1x Prof U Dinesh Kumar, IIMB. All Rights Reserved, Indian Institute of Management Bangalore

Predictive Analytics: QM901.1x, Prof U Dinesh Kumar, IIMB. Session Outline: Introduction to classification problems and discrete choice models; introduction to Logistic Regression; logistic function and logit function; Maximum Likelihood Estimator (MLE) for estimation of LR parameters.


CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang.

CIS526: Machine Learning, Lecture 3 (Sept 16, 2003). Preparation help: Xiaoying Huang. Linear Regression. Linear regression can be represented by the functional form f(x; θ) = θ_0 x_0 + θ_1 x_1 + ... + θ_M x_M = Σ_{i=0}^{M} θ_i x_i. Note: x_0 is a dummy attribute
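The functional form in this snippet, with a dummy attribute absorbing the intercept, can be sketched as follows (an illustrative sketch; the names `predict` and `theta` are ours, not from the excerpted lecture):

```python
import numpy as np

def predict(x, theta):
    """Linear model f(x; theta) = sum_i theta_i * x_i,
    where x_0 = 1 is a dummy attribute absorbing the intercept theta_0."""
    x = np.concatenate(([1.0], x))  # prepend the dummy attribute x_0 = 1
    return float(theta @ x)

theta = np.array([0.5, 2.0, -1.0])  # theta_0 is the intercept
x = np.array([3.0, 1.0])
y = predict(x, theta)               # 0.5 + 2*3 - 1*1 = 5.5
```

The dummy attribute is a standard trick that lets the intercept be treated uniformly with the other weights.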


Markov Chain Monte Carlo Lecture 6

Markov Chain Monte Carlo Lecture 6. where (x_1, ..., x_N) ∈ X^N, N is called the population size, f_i(x) ≠ f(x) for at least one i ∈ {1, 2, ..., N}, and those different from f(x) are called the trial distributions in terms of importance sampling. Different ways


Negative Binomial Regression

Negative Binomial Regression. STATGRAPHICS Rev. 9/16/2013. Summary... 1; Data Input... 3; Statistical Model... 3; Analysis Summary... 4; Analysis Options... 7; Plot of Fitted Model... 8; Observed Versus Predicted... 10; Predictions...


Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive. Paul A. Jensen, Copyright July 20, 2003. A system is made up of several operations with flow passing between them. The structure of the system describes the flow paths from inputs to outputs.


CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE

CHAPTER 5: NUMERICAL EVALUATION OF DYNAMIC RESPONSE. An analytical solution is usually not possible when excitation varies arbitrarily with time or if the system is nonlinear. Such problems can be solved by numerical time-stepping


GEL 446: Applied Environmental Geology

GEL 446: Applied Environmental Geology. Watershed Delineation and Geomorphology. Watersheds are fundamental geospatial units that provide a physical and conceptual framework widely used by scientists,


Statistics Chapter 4

Statistics Chapter 4. "There are three kinds of lies: lies, damned lies, and statistics." Benjamin Disraeli, 1895 (British statesman). Gaussian Distribution, 4-1: If a measurement is repeated many times a statistical treatment


C4B Machine Learning Answers II. dσ(z)/dz = σ(z)(1 − σ(z)), where σ(z) = 1/(1 + e^{−z})

C4B Machine Learning, Answers II. A. Zisserman, Hilary Term 20. (a) Show that for the logistic sigmoid function dσ(z)/dz = σ(z)(1 − σ(z)). Start from the definition σ(z) = 1/(1 + e^{−z}). Then dσ(z)/dz = e^{−z}/(1 + e^{−z})² = σ(z)(1 − σ(z))
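The identity this exercise asks for, dσ/dz = σ(1 − σ), is easy to verify numerically; a quick illustrative check (our sketch, not part of the excerpted answer sheet):

```python
import math

def sigmoid(z):
    """Logistic sigmoid sigma(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    # closed form from the identity: sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)

# compare the closed form against a central finite difference
z, h = 0.7, 1e-6
numeric = (sigmoid(z + h) - sigmoid(z - h)) / (2 * h)
```

At z = 0 the derivative attains its maximum value σ(0)(1 − σ(0)) = 0.25, which is why the sigmoid saturates away from the origin.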


DOA Estimation for Coherent Sources in Beamspace Using Spatial Smoothing

DOA Estimation for Coherent Sources in Beamspace Using Spatial Smoothing. Yin Yang, Chunru Wan, Chao Sun, Qing Wang, School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, 639798; Institute of Acoustic Engineering, Northwestern Polytechnical University, X


Econ107 Applied Econometrics Topic 3: Classical Model (Studenmund, Chapter 4)

Econ107 Applied Econometrics, Topic 3: Classical Model (Studenmund, Chapter 4). I. Classical Assumptions. We have defined OLS and studied some algebraic properties of OLS. In this topic we will study statistical properties


Chapter 7. Ab initio Theory

Chapter 7. Ab initio Theory. Ab initio: from the first principles. I. Roothaan-Hall approach. Assuming the Born-Oppenheimer approximation, the electronic Hamiltonian is Ĥ = T̂_e + V_en + V_ee, with V_en = −Σ_{i,a} Z_a/r_{ia} and V_ee = Σ_{i>j} 1/r_{ij}. Wavefunction: Slater determinant:


CME 302: NUMERICAL LINEAR ALGEBRA FALL 2005/06 LECTURE 13

CME 302: NUMERICAL LINEAR ALGEBRA, FALL 2005/06, LECTURE 13. GENE H. GOLUB. 1 Iterative Methods. Very large problems (naturally sparse, from applications): iterative methods. Structured matrices (even sometimes dense,


Finite Mixture Models and Expectation Maximization. Most slides are from: Dr. Mario Figueiredo, Dr. Anil Jain and Dr. Rong Jin

Finite Mixture Models and Expectation Maximization. Most slides are from: Dr. Mario Figueiredo, Dr. Anil Jain and Dr. Rong Jin. Recall the supervised learning problem: given a set of n samples X = {(x_i, y_i)}, i = 1, ..., n. Chapter 3 of


Fusion of Neural Classifiers for Financial Market Prediction

Fusion of Neural Classifiers for Financial Market Prediction. Trish Keaton, Dept. of Electrical Engineering (136-93), Information Sciences Laboratory (RL 69), California Institute of Technology; HRL Laboratories, LLC, Pasadena, CA


Performance Modeling of Hierarchical Memories

Performance Modeling of Hierarchical Memories. Marwan Sleiman, Lester Lipsky, Kishor Konwar, Department of Computer Science and Engineering, University of Connecticut, Storrs, CT 0669-55. Email: {marwan, lester, kishor}@engr.uconn.edu


Scalable Multi-Class Gaussian Process Classification using Expectation Propagation

Scalable Multi-Class Gaussian Process Classification using Expectation Propagation. Carlos Villacampa-Calvo and Daniel Hernández Lobato, Computer Science Department, Universidad Autónoma de Madrid. http://dhnzl.org, daniel.hernandez@uam.es


For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.

For now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results. Neural Networks: Derivation, compiled by Alvin Wan from Professor Jitendra Malik's lecture. This type of computation is called deep learning and is the most popular method for many problems, such as computer vision


Global Gaussian approximations in latent Gaussian models

Global Gaussian approximations in latent Gaussian models. Botond Cseke, April 9, 2010. Abstract: A review of global approximation methods in latent Gaussian models. 1 Latent Gaussian models. In this section we introduce notation


NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION. 1 Introduction. Differentiation is a method to compute the rate at which a dependent output y changes with respect to the change in the independent input x. This rate of change is called the
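The rate of change described in this snippet is typically approximated by finite differences; a small illustrative sketch (our own, assuming the standard forward and central difference formulas) shows why the central difference is more accurate:

```python
def forward_diff(f, x, h):
    """First-order forward difference: (f(x+h) - f(x)) / h, O(h) error."""
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    """Second-order central difference: (f(x+h) - f(x-h)) / (2h), O(h^2) error."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** 3          # exact derivative f'(x) = 3 x^2, so f'(2) = 12
fwd = forward_diff(f, 2.0, 1e-4)
cen = central_diff(f, 2.0, 1e-4)
```

For the same step size h, the central difference error shrinks like h² while the forward difference error shrinks only like h.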


STOCHASTIC SIMULATION FOR BLOCKED DATA. Monte Carlo simulation; Rejection sampling; Importance sampling; Markov chain Monte Carlo

STOCHASTIC SIMULATION FOR BLOCKED DATA. Stochastic System Analysis and Bayesian Model Updating: Monte Carlo simulation; rejection sampling; importance sampling; Markov chain Monte Carlo. Monte Carlo simulation, Introduction: If


ELEKTRYKA 2016 Zeszyt 3-4 ( )

ELEKTRYKA 2016, Zeszyt 3-4 (239-240), Rok LXII. Waldemar BAUER, Jerzy BARANOWSKI, Tomasz DZIWIŃSKI, Paweł PIĄTEK, Marta ZAGÓROWSKA, AGH University of Science and Technology, Kraków. OUSTALOUP PARALLEL APPROXIMATION


ALGEBRAIC SCHUR COMPLEMENT APPROACH FOR A NON LINEAR 2D ADVECTION DIFFUSION EQUATION

ALGEBRAIC SCHUR COMPLEMENT APPROACH FOR A NON LINEAR 2D ADVECTION DIFFUSION EQUATION. 1st Annual International Interdisciplinary Conference, AIIC 03, 4-6 April, Azores, Portugal - Proceedings. Hassan Belhadj, Professor


Bivariate Analysis of Number of Services to Conception and Days Open in Norwegian Red Using a Censored Threshold-Linear Model

Bivariate Analysis of Number of Services to Conception and Days Open in Norwegian Red Using a Censored Threshold-Linear Model. Y. M. Chang, I. M. Andersen-Ranberg, B. Heringstad, D. Gianola, and G. Klemetsdal


The Similar Structure Method for Solving Boundary Value Problems of a Three Region Composite Bessel Equation

The Similar Structure Method for Solving Boundary Value Problems of a Three Region Composite Bessel Equation. Mingming Kong, Xiaou Dong, Center for Radio Administration & Technology Development, Xihua University, Chengdu 69,


Outline and Reading. Dynamic Programming. Dynamic Programming revealed. Computing Fibonacci. The General Dynamic Programming Technique

Outline and Reading. Dynamic Programming: The General Technique (5.3.2); 0-1 Knapsack Problem (5.3.3); Matrix Chain-Product (5.3.1).


4DVAR, according to the name, is a four-dimensional variational method.

4D-Variational Data Assimilation (4D-Var). 4DVAR, according to the name, is a four-dimensional variational method. 4D-Var is actually a direct generalization of 3D-Var to handle observations that are distributed in time. The


First Year Examination Department of Statistics, University of Florida

First Year Examination, Department of Statistics, University of Florida, May 7, 010, 8:00 am - 12:00 noon. Instructions: 1. You have four hours to answer questions in this examination. 2. You must show your work to receive


See Book Chapter 11, 2nd Edition (Chapter 10, 1st Edition)

Count Data Models. See Book Chapter 11, 2nd Edition (Chapter 10, 1st Edition). Count data consist of non-negative integer values. Examples: the number of driver route changes per week, the number of trip departure changes


Chapter 13: Multiple Regression

Chapter 13: Multiple Regression. 13.1 Developing the multiple-regression model. The general model can be described as: ... It simplifies for two independent variables: ... The sample fit parameters b_0, b_1, and b_2 are used to


Logistic Regression. CAP 5610: Machine Learning Instructor: Guo-Jun QI

Logistic Regression. CAP 5610: Machine Learning, Instructor: Guo-Jun QI. Bayes Classifier: a generative model. Model the posterior distribution P(Y|X); estimate the class-conditional distribution P(X|Y) for each Y; estimate the prior distribution


Homework Assignment 3 Due in class, Thursday October 15

Homework Assignment 3. Due in class, Thursday October 15. SDS 383C Statistical Modeling I. 1 Ridge regression and Lasso. 1. Get the prostate cancer data from http://statweb.stanford.edu/~tibs/elemstatlearn/datasets/prostate.data.


Quantifying Uncertainty

Quantifying Uncertainty: Particle Filters. Sai Ravela, M.I.T. Last Updated: Spring 2013. Particle filters are applied to sequential filtering problems and can also be applied to smoothing problems


Charged Particle in a Magnetic Field

Charged Particle in a Magnetic Field. Michael Fowler, 1/16/08. Introduction: Classically, the force on a charged particle in electric and magnetic fields is given by the Lorentz force law F = q(E + v × B). This velocity-dependent force is quite
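The Lorentz force law quoted in this snippet is straightforward to evaluate componentwise; a minimal sketch (our own illustration, not from the excerpted notes):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as sequences."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def lorentz_force(q, E, v, B):
    """F = q (E + v x B) for a charge q in fields E, B moving with velocity v."""
    vxB = cross(v, B)
    return tuple(q * (Ei + ci) for Ei, ci in zip(E, vxB))

# a unit positive charge moving along +x through a magnetic field along +z
F = lorentz_force(1.0, (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

With v along +x and B along +z, v × B points along −y, so the force deflects the charge sideways without changing its speed, which is the origin of circular motion in a uniform magnetic field.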


Relevance Vector Machines Explained

Relevance Vector Machines Explained. October 19, 2010. Tristan Fletcher, www.cs.ucl.ac.uk/staff/t.fletcher/. Introduction: This document has been written in an attempt to make Tipping's [1] Relevance Vector Machines


STATS 306B: Unsupervised Learning Spring Lecture 10 April 30

STATS 306B: Unsupervised Learning, Spring 2014. Lecture 10, April 30. Lecturer: Lester Mackey; Scribes: Joey Arthur, Rakesh Achanta. 10.1 Factor Analysis. 10.1.1 Recap. Recall the factor analysis (FA) model for linear


since [1 − (β_0 + β_1 x_{1i} + β_2 x_{2i})][β_0 + β_1 x_{1i} + β_2 x_{2i}] is assumed to be a reasonable approximation

Econ 388, R. Butler, 204 revisions. Lecture 4: Dummy Dependent Variables. I. Linear Probability Model: the regression model with a dummy variable as the dependent variable; assumptions, implications; regular multiple regression


Comparison of the Population Variance Estimators. of 2-Parameter Exponential Distribution Based on. Multiple Criteria Decision Making Method

Comparison of the Population Variance Estimators of 2-Parameter Exponential Distribution Based on Multiple Criteria Decision Making Method. Applied Mathematical Sciences, Vol. 7, 0, no. 47, 07-0. HIKARI Ltd, www.m-hikari.com


Differentiating Gaussian Processes

Differentiating Gaussian Processes. Andrew McHutchon, April 17, 013. 1 First Order Derivative of the Posterior Mean. The posterior mean of a GP is given by f̄ = k(x, X) K(X, X)⁻¹ y = k(x, X) α. Only the k(x, X) term depends on the
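The GP posterior mean formula quoted here, f̄ = k(x, X) K(X, X)⁻¹ y, can be sketched directly (an illustrative implementation assuming a squared-exponential kernel; function names are ours):

```python
import numpy as np

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel k(a, b) = exp(-(a-b)^2 / (2 ell^2)) for 1-D inputs."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_posterior_mean(x_star, X, y, ell=1.0, noise=1e-8):
    """Posterior mean f_bar = k(x*, X) K(X, X)^{-1} y = k(x*, X) alpha."""
    K = rbf(X, X, ell) + noise * np.eye(len(X))  # jitter for numerical stability
    alpha = np.linalg.solve(K, y)                # alpha = K(X,X)^{-1} y
    return rbf(x_star, X, ell) @ alpha

X = np.array([0.0, 1.0, 2.0])
y = np.sin(X)
m = gp_posterior_mean(np.array([1.0]), X, y)  # near-interpolates sin(1) at a training input
```

Because only the k(x, X) factor depends on the test input x, differentiating the posterior mean reduces to differentiating the kernel, which is the point the excerpted note goes on to make.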


Introduction to Molecular Spectroscopy

Introduction to Molecular Spectroscopy. Chem 5.6, Fall 004, Lecture #36. QM is essential for understanding molecular spectra and spectroscopy. In this lecture we delineate some features of NMR as an introductory example


PHYSICS 212 MIDTERM II 19 February 2003

PHYSICS 212 MIDTERM II, 19 February 2003. Exam is closed book, closed notes. Use only your formula sheet. Write all work and answers in exam booklets. The backs of pages will not be graded unless you so request on the front


BINARY LAMBDA-SET FUNCTION AND RELIABILITY OF AIRLINE

BINARY LAMBDA-SET FUNCTION AND RELIABILITY OF AIRLINE. Y. Paramonov, S. Tretyakov, M. Hauka, Riga Technical University, Aeronautical Institute, Riga, Latvia. E-mail: yuri.paramonov@mail.com, serejs.tretjakovs@mail.com, mars.hauka@mail.com


SDMML HT MSc Problem Sheet 4

SDMML HT 06 - MSc Problem Sheet 4. The receiver operating characteristic (ROC) curve plots the sensitivity against the specificity of a binary classifier as the threshold for discrimination is varied. Let the data space be


Composite Hypotheses testing

Composite Hypotheses testing. In many hypothesis testing problems there are many possible distributions that can occur under each of the hypotheses. The output of the source is a set of parameters (points in a parameter


Accurate Online Support Vector Regression

Accurate Online Support Vector Regression. Junshui Ma, James Theiler, and Simon Perkins, MS-D436, NIS-2, Los Alamos National Laboratory, Los Alamos, NM 87545, USA. {junshui, jt, s.perkins}@lanl.gov. Abstract: Conventional


Implementation of α-qss Stiff Integration Methods for Solving the Detailed Combustion Chemistry

Implementation of α-QSS Stiff Integration Methods for Solving the Detailed Combustion Chemistry. Proceedings of the World Congress on Engineering 2007, Vol II. Shafiq R. Qureshi and Robert Prosser. Abstract: Implicit methods


Statistical pattern recognition

Statistical pattern recognition: Bayes theorem. Problem: deciding if a patient has a particular condition based on a particular test. However, the test is imperfect: someone with the condition may go undetected (false negative
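The diagnostic-test setting this snippet describes is the classic application of Bayes' theorem; a minimal sketch (our own illustration, with made-up sensitivity/specificity numbers):

```python
def posterior_positive(prior, sensitivity, specificity):
    """P(condition | positive test) via Bayes' theorem.

    sensitivity = P(test+ | condition), so 1 - sensitivity is the false-negative rate;
    specificity = P(test- | healthy),  so 1 - specificity is the false-positive rate.
    """
    p_pos_given_cond = sensitivity
    p_pos_given_healthy = 1.0 - specificity
    # total probability of a positive test
    p_pos = prior * p_pos_given_cond + (1.0 - prior) * p_pos_given_healthy
    return prior * p_pos_given_cond / p_pos

# a rare condition with a fairly good (but imperfect) test
p = posterior_positive(prior=0.01, sensitivity=0.95, specificity=0.95)
```

Even with 95% sensitivity and specificity, a positive result on a rare (1%) condition leaves the posterior probability well below one half, which is exactly the false-positive/false-negative trade-off the snippet alludes to.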


Probability Theory (revisited)

Probability Theory (revisited). Summary: probability vs. plausibility; random variables; simulation of random experiments. Challenge: The alarm of a shop rang. Soon afterwards, a man was seen running in the street, persecuted


U.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017

U.C. Berkeley CS294: Beyond Worst-Case Analysis. Handout 4s, Luca Trevisan, September 5, 2017. Summary of Lecture 4, in which we introduce semidefinite programming and apply it to Max Cut. Semidefinite Programming: Recall that


Analysis of Mixed Correlated Bivariate Negative Binomial and Continuous Responses

Analysis of Mixed Correlated Bivariate Negative Binomial and Continuous Responses. Available at http://pvamu.edu/aam. Appl. Appl. Math., ISSN: 1932-9466, Vol. 8, Issue 2 (December 2013), pp. 404-415. Applications and Applied Mathematics: An International Journal (AAM)


Statistical Parametric Speech Synthesis with Joint Estimation of Acoustic and Excitation Model Parameters

Statistical Parametric Speech Synthesis with Joint Estimation of Acoustic and Excitation Model Parameters. Ranniery Maia, Heiga Zen, M. J. F. Gales, Toshiba Research Europe Ltd, Cambridge Research Laboratory, Cambridge, UK. {ranniery.maia, heiga.zen, mjfg}@crl.toshiba.co.uk


Parametric fractional imputation for missing data analysis. Jae Kwang Kim Survey Working Group Seminar March 29, 2010

Parametric fractional imputation for missing data analysis. Jae Kwang Kim, Survey Working Group Seminar, March 29, 2010. Outline: introduction; proposed method; fractional imputation; approximation; variance estimation; multiple imputation


Linear Feature Engineering 11

Linear Feature Engineering. 2 Least-Squares. 2.1 Simple least-squares. Consider the following dataset. We have a bunch of inputs x and corresponding outputs y. The particular values in this dataset begin x = 0.23, y = 0.19


MLE and Bayesian Estimation. Jie Tang Department of Computer Science & Technology Tsinghua University 2012

MLE and Bayesian Estimation. Jie Tang, Department of Computer Science & Technology, Tsinghua University, 2012. 1 Linear Regression? As the first step, we need to decide how we're going to represent the function f. One example:


Chapter 5. Solution of System of Linear Equations. Module No. 6. Solution of Inconsistent and Ill Conditioned Systems

Chapter 5: Solution of System of Linear Equations. Module No. 6: Solution of Inconsistent and Ill Conditioned Systems. Numerical Analysis by Dr. Anita Pal, Assistant Professor, Department of Mathematics, National Institute of Technology Durgapur, Durgapur-713209. Email: anita.bue@gmail.com


Week 5: Neural Networks

Week 5: Neural Networks. Instructor: Sergey Levine. Neural Networks Summary: In the previous lecture, we saw how we can construct neural networks by extending logistic regression. Neural networks consist of multiple


Numerical Heat and Mass Transfer

Numerical Heat and Mass Transfer. Master degree in Mechanical Engineering. 06-Finite-Difference Method (one-dimensional, steady state heat conduction). Fausto Arpino, f.arpino@unicas.it. Introduction: Why we use models and


Here is the rationale: If x and y have a strong positive relationship to one another, then (x − x̄) will tend to be positive when (y − ȳ)

Section 1.5 Correlation. In the previous sections, we looked at regression, and the value r was a measurement of how much of the variation in y can be attributed to the linear relationship between y and x. In this section,
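The rationale above, that (x − x̄)(y − ȳ) products tend to share a sign when x and y move together, is exactly what the sample correlation r accumulates; a small illustrative computation (our own sketch):

```python
import math

def pearson_r(xs, ys):
    """Sample correlation r = sum (x - xbar)(y - ybar) / sqrt(Sxx * Syy)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

r_up = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])    # perfectly increasing together
r_down = pearson_r([1, 2, 3, 4], [8, 6, 4, 2])  # perfectly moving oppositely
```

When the deviations agree in sign the products are positive and r approaches +1; when they disagree, r approaches −1.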


VOTING ALGORITHMS FOR DISCOVERING LONG MOTIFS

VOTING ALGORITHMS FOR DISCOVERING LONG MOTIFS. FRANCIS Y.L. CHIN AND HENRY C.M. LEUNG, Department of Computer Science, The University of Hong Kong, Pokfulam, Hong Kong. Pevzner and Sze [14] have introduced the Planted


10-701/15-781 Machine Learning, Fall 2005 Homework 3

10-701/15-781 Machine Learning, Fall 2005. Homework 3. Out: 10/20/05. Due: beginning of class 11/01/05. Instructions: contact questions-10701@autonlab.org for questions. Problem 1: Regression and Cross-validation [40


A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function. Advanced Science and Technology Letters, pp. 83-87, http://dx.doi.org/10.14257/astl.2014.53.20. Lu Lu,


Module 3 LOSSY IMAGE COMPRESSION SYSTEMS. Version 2 ECE IIT, Kharagpur

Module 3: LOSSY IMAGE COMPRESSION SYSTEMS. Version 2, ECE IIT, Kharagpur. Lesson 6: Theory of Quantization. Instructional Objectives: At the end of this lesson, the students should be able to:


Feature Selection: Part 1

Feature Selection: Part 1. CSE 546: Machine Learning, Lecture 5. Instructor: Sham Kakade. 1 Regression in the high dimensional setting. How do we learn when the number of features d is greater than the sample size n?


Correlation and Regression without Sums of Squares. (Kendall's Tau) Rudy A. Gideon ABSTRACT

Correlation and Regression without Sums of Squares (Kendall's Tau). Rudy A. Gideon. ABSTRACT: This short piece provides an introduction to the use of Kendall's τ in correlation and simple linear regression. The error estimate also uses


Review: Fit a line to N data points

Review: Fit a line to N data points. Correlated parameters: y = a x + b. Orthogonal parameters: y = a(x − x̄) + b. For intercept b, set a = 0 and find b by optimal average: b̂ = ȳ, Var[b̂] = . For slope a, set
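The orthogonal parameterisation sketched above, where the intercept estimate is simply the optimal average b̂ = ȳ, can be illustrated directly (our own minimal sketch, not from the excerpted review):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a (x - xbar) + b, the orthogonal parameterisation.

    In this parameterisation the intercept estimate is just b_hat = ybar,
    and the slope is estimated independently of b.
    """
    n = len(xs)
    xbar = sum(xs) / n
    b = sum(ys) / n  # b_hat = ybar
    a = (sum((x - xbar) * y for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    return a, b

# noiseless data from y = 2x + 1, i.e. y = 2 (x - 1.5) + 4 in orthogonal form
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Centering x at x̄ makes the slope and intercept estimates uncorrelated, which is why this parameterisation is preferred in the review's setup.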
