Adaptive Importance Sampling in Signal Processing


Digital Signal Processing (2015)

Adaptive Importance Sampling in Signal Processing

Mónica F. Bugallo a,*, Luca Martino b, Jukka Corander b

a) Department of Electrical and Computer Engineering, Stony Brook University (USA)
b) Department of Mathematics and Statistics, University of Helsinki, Helsinki (Finland)

Abstract

In Bayesian signal processing, all the information about the unknowns of interest is contained in their posterior distributions. The unknowns can be parameters of a model, or a model and its parameters. In many important problems, these distributions are impossible to obtain in analytical form. An alternative is to generate their approximations by Monte Carlo-based methods like Markov chain Monte Carlo (MCMC) sampling, adaptive importance sampling (AIS) or particle filtering (PF). While MCMC sampling and PF have received considerable attention in the literature and are reasonably well understood, the AIS methodology remains relatively unexplored. This article reviews the basics of AIS and provides a comprehensive survey of the state of the art of the topic. Some of its most relevant implementations are revisited and compared through computer simulation examples. © 2015 Published by Elsevier Ltd.

Keywords: Adaptive importance sampling; Markov chain Monte Carlo; population Monte Carlo; particle filtering.

1. Introduction

The past two decades have witnessed an astounding growth in the field of statistical signal and information processing. The advances in the field have been particularly compelling in terms of computational methods. This progress has been unleashed by the ever-increasing computing power available nowadays at very low cost. Bayesian signal and information processing has perhaps benefited most significantly from the advances in computing hardware and algorithms. With this technological momentum, Bayesians freed themselves from the computational limitations of the classical approaches and found a range of new opportunities for handling models of very high complexity. As a result, Bayesian theory and practice have seen remarkable advances.
Many disciplines have benefited from these developments, including engineering, bioinformatics, econometrics, astronomy, climatology, and computational physics. A driving force for the increased popularity of Bayesian methods has been the progress in the theory and practice of accept-reject methods [1, 2], Markov chain Monte Carlo (MCMC) algorithms [3, 4] and particle filtering (PF) [5]. All these methodologies are based on approximating distributions of unknown parameters (or states) of interest by samples and associated weights [6, 7, 8, 9, 10]. Recently, adaptive importance sampling (AIS) [11] has resurfaced as an alternative to MCMC sampling, and it has gained attention in various research communities [12]. Unlike MCMC sampling, which exploits Markov chains to reach a desired (target) distribution for generating the desired samples, the AIS methodology draws them from distributions that improve with iterations. A benefit of AIS with respect to

* Corresponding author. Email address: monica.bugallo@stonybrook.edu (Mónica F. Bugallo)

MCMC sampling is that none of the generated samples are rejected. Moreover, AIS schemes do not necessitate burn-in periods and can be implemented easily [3]. Additionally, with AIS methods it is possible to easily estimate the normalizing constant of the posterior target distribution (also called Bayesian evidence, marginal likelihood or partition function) by averaging the unnormalized importance weights. The marginal likelihood is particularly useful in model selection [1, 4]. Furthermore, in general, the consistency of AIS schemes is guaranteed by weak basic conditions, whereas the theoretical study of the convergence of a Markov chain generated by an adaptive MCMC method needs more careful consideration [5, 6, 7].

The key idea behind AIS is importance sampling [8, 9]. It boils down to generating samples from a cleverly selected distribution called the instrumental, importance or proposal function. This distribution is different from the target distribution, because one assumes that direct sampling from the target distribution is infeasible. Good proposal functions closely resemble the target distribution. Once the samples are generated from a proposal function, they are assigned weights. The samples with their weights represent an approximation of the target distribution. The key problem is that good proposal functions are very hard to choose in an automated fashion. On the other hand, Bayesian theory suggests that one can start by generating samples from the prior and weighting them using the posterior (usually the target distribution). The obtained set of samples and weights is an approximation of the posterior that can be improved. To that end, a tempting idea is to use this approximation in constructing a better instrumental function than the originally used prior. If this works, the samples and weights of the improved proposal function will produce a better approximation of the posterior. Consequently, this approximation can again serve to construct a better proposal function. This is naturally the case with all the subsequent approximations too.
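As a concrete illustration of these mechanics (weighted samples from a proposal, and the marginal likelihood obtained by averaging the unnormalized weights), the following is a minimal sketch. The 1-D Gaussian target and proposal, and all names and numbers, are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: unnormalized target p(x) ~ N(3, 1), its constant
# "unknown"; proposal pi(x) = N(0, 3^2), which is broader than the target.
def p_unnorm(x):
    return np.exp(-0.5 * (x - 3.0) ** 2)        # missing 1/sqrt(2*pi) factor

def pi_pdf(x):
    return np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))

M = 50_000
x = rng.normal(0.0, 3.0, size=M)                # x^(m) ~ pi(x)
w = p_unnorm(x) / pi_pdf(x)                     # unnormalized importance weights
w_bar = w / w.sum()                             # normalized weights

mean_est = np.sum(w_bar * x)                    # posterior-mean estimate, ~3
Z_hat = w.mean()                                # marginal likelihood, ~sqrt(2*pi)
```

Averaging the unnormalized weights recovers the normalizing constant precisely because each weight is an unbiased estimate of the integral of the unnormalized target.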
Thus, the importance sampling keeps adapting as one proceeds with iterations, which is why the methodology is referred to as AIS. In this process, learning takes place from samples and weights obtained in previous iterations.

This paper reviews the theory and practice of AIS when applied to problems in signal processing. More specifically, the next section provides the general background and discusses the basics of the AIS method. After that, an extensive survey of the literature on the topic is presented, which includes a summary of the most important convergence results. The section on implementations details some of the most popular AIS algorithms, including population Monte Carlo (PMC) [3], AIS with mixture Gaussian proposals [2], adaptive multiple importance sampling (AMIS) [20, 21], and the adaptive population importance sampler (APIS) [22, 23]. The section on examples analyzes and compares the discussed algorithms, as well as MCMC, through computer simulations. Three relevant problems are examined: the well-known banana-shaped distribution [5] as a proof of concept, a high-dimensional setting with a multimodal target distribution, and an autoregressive (AR) model in a non-Gaussian noise scenario. The final section provides some closing remarks.

2. Background and the basics of AIS

A very important challenge in many signal processing problems is the reliable approximation of target distributions. Unfortunately, in many scenarios such an approximation cannot be obtained analytically and one has to resort to numerical methods. A popular class of computational algorithms is the Monte Carlo family, which is particularly suitable for the broad and challenging problems of non-linear and non-Gaussian nature. The target distribution can be, for example, the posterior distribution of the unknowns or a predictive distribution of observations. Within the Monte Carlo-based methods, MCMC algorithms use correlated samples to approximate a target distribution [24, 25, 26]. MCMC techniques yield a Markov chain with a stationary distribution that coincides with the target.
Thus, after a sufficient number of iterations, the MCMC methods approximate the target distribution with a random measure composed of generated samples with equal weights. In adaptive MCMC sampling, one also uses proposal functions, and they may depend on already generated samples. AIS has the same objective as MCMC, but it operates with samples whose weights are all different. The difference in weights is due to the fact that with AIS the generated samples are always used in the estimation and are drawn from a proposal function different from the desired distribution. Some proposal functions are better than others, and the main challenge lies in the construction of an appropriate proposal. With AIS, the aim is to implement importance sampling in an iterative manner, where one exploits the samples and weights of past iterations in constructing improved importance functions. PF is another Monte Carlo methodology that uses importance sampling [27]. With PF, the main objective is to track in time various dynamic distributions of interest [5, 28, 29]. As pointed out, PF, like AIS, exploits importance sampling, but does not usually take advantage of iterative learning while it tracks its target distributions. Thus, in a

nutshell, AIS shares important features with both MCMC sampling and PF. Like adaptive MCMC sampling, it uses previous samples to construct better proposal functions, and like PF, it employs importance sampling to avoid the generation of samples from infeasible distributions. Another advantage of the AIS approach (w.r.t. the MCMC techniques) is that it is easier to design schemes using a population of different proposal distributions sharing information (by the weighted samples).

The mathematical basics of AIS can be described as follows. Suppose that we want to approximate a target distribution p(x) by a set of samples and weights. Here x ∈ R^{d_x} is the unknown state of the system, of dimension d_x. This distribution is most often a posterior, p(x | y_{1:N_y}), where y_n ∈ R^{d_y} represents observations with information about x, and y_{1:N_y} = {y_1, y_2, ..., y_{N_y}}, with N_y indicating the number of available observations.

First, we review the concept of importance sampling [8]. Let the samples used for approximation be drawn from p(x) itself and let them be denoted by x^{(m)}, m = 1, 2, ..., M, with M representing the number of samples that are generated. Then

    p̂(x) = (1/M) Σ_{m=1}^{M} δ(x − x^{(m)}),    (1)

where the approximating distribution p̂(x) is discrete and δ(x − x^{(m)}) is the unit delta measure concentrated at x^{(m)}. When it is difficult or impossible to draw samples from p(x), the alternative is to use a proposal function π(x), with a shape as close as possible to p(x) and support enclosing that of p(x). If x_0^{(m)} ∼ π^{(m)}(x) (where the subscript 0 denotes the iteration), then the approximation of p(x) is

    p̂(x) = Σ_{m=1}^{M} w̄_0^{(m)} δ(x − x_0^{(m)}),    (2)

where w̄_0^{(m)} is the normalized importance weight obtained from

    w̄_0^{(m)} = w_0^{(m)} / Σ_{j=1}^{M} w_0^{(j)},    (3)

with

    w_0^{(m)} = p(x_0^{(m)}) / π^{(m)}(x_0^{(m)}).    (4)

The generation of samples from a proposal function and the assignment of weights is known as importance sampling. We note that the superscript in the proposal function indicates that for each sample generated, one can use a different proposal. The nature of this difference of proposal pdfs can be in the type of distribution (e.g., some samples could be generated from a Gaussian, some from a mixture of Gaussians, etc.) or in the parameters of the proposal (e.g., assuming a simple case where all samples are generated from Gaussians, those Gaussians could have different parameters depending on the particular sample). In this section, for simplicity of notation, we relate the proposal distribution directly to the unknown variable of the system x. In general, and as seen in the next sections, the proposal may not only relate to the variable x but also to other parameters. The idea behind AIS is to use the random measure

    χ_0 = {x_0^{(m)}, w̄_0^{(m)}}_{m=1}^{M}

for creating a better proposal function than π^{(m)}(x) (see Fig. 1 for a graphical description of the modus operandi of AIS in two iterations).

Figure 1. The idea behind AIS with two iterations: the instrumental function π(x) produces a weighted random measure χ = {x^{(m)}, w^{(m)}}_{m=1}^{M} approximating the posterior p(x | y_{1:N_y}), which is in turn used to build the next instrumental function.

If this new proposal

function is π_1(x), and it is used to obtain a new random measure χ_1 = {x_1^{(m)}, w̄_1^{(m)}}_{m=1}^{M} with a new set of samples, then the approximation of p(x) is

    p̂_1(x) = Σ_{m=1}^{M} w̄_1^{(m)} δ(x − x_1^{(m)}),    (5)

where the subscript of p̂_1(x) indicates the approximation of p(x) at the first iteration. One can also combine the initial random measure χ_0 with χ_1 to improve the approximation (5), e.g., by

    p̂_1(x) = Σ_{m=1}^{M} ( w̃_0^{(m)} δ(x − x_0^{(m)}) + w̃_1^{(m)} δ(x − x_1^{(m)}) ),    (6)

where the new weights w̃_0^{(m)} and w̃_1^{(m)} are appropriately recomputed so that they all sum up to one, namely,

    w̃_0^{(m)} = w_0^{(m)} / ( Σ_{j=1}^{M} w_0^{(j)} + Σ_{j=1}^{M} w_1^{(j)} )  and  w̃_1^{(m)} = w_1^{(m)} / ( Σ_{j=1}^{M} w_0^{(j)} + Σ_{j=1}^{M} w_1^{(j)} ).    (7)

The process can obviously continue iteratively (in the sequel, the subscript i will denote the iteration number and I the maximum number of iterations). In the second iteration, one can use the random measure χ_1 and create yet an even better proposal function. Or, one can use both measures χ_{0:1} (here, χ_{0:1} = {χ_0, χ_1}) to obtain the new proposal function. The iterative process proceeds until a stopping condition is met. It is important to note that the samples from all the iterations can be used once their weights are properly computed. Finally, it is important to remark that

    Ẑ = (1/(2M)) ( Σ_{j=1}^{M} w_0^{(j)} + Σ_{j=1}^{M} w_1^{(j)} )    (8)

is an estimator of the normalizing constant of p(x), Z = ∫_{R^{d_x}} p(x) dx.

3. State-of-the-art

The idea of importance sampling was proposed for the first time more than 50 years ago [30]. It was introduced in Bayesian inference in [31] and later developed in [32, 33]. The concept of learning about the target distribution along with Monte Carlo sampling was proposed in [34, 35], and more recent efforts include [2, 36]. It is important to point out that learning from past samples has been explored in the MCMC sampling literature, where the main motivation has primarily been the development of MCMC methods for the local-trap problem [25]. A resurgence of interest in AIS schemes happened after the publication of the population Monte Carlo (PMC) sampling method in [3, 26]. It is based on representing the proposal functions by mixtures of kernels. The underlying idea is simple and flexible, and several variants have been proposed.
In particular, two general versions of PMC, aiming at complete adaptation of the proposal functions, have received special attention. In one, the proposal functions are fit to the target distribution so that they minimize the variance or the Kullback-Leibler divergence [37]. In the other method, one updates the weights and parameters of the mixture distribution by using an entropy criterion [2]. In [38], a PMC method was applied to missing-data problems, where a sequence of importance functions dependent on both the iteration and sample index was proposed. Additional applications of PMC sampling include the analysis of ion channels using a fixed-dimension model [39], interpretation of static images in a robot vision setup [40], analysis of complex traits for phenotypic data interpretation of pedigreed populations [41], motif discovery for deciphering genetic regulatory codes [42], and investigation of detection systems for biological threats [43]. Several other related works can be found in the literature [44, 45, 46]. For instance, in [46], the loss of diversity in the population due to the resampling step is reduced by artificially forcing a pre-defined number of the highest importance weights to be equal.

Other variants of AIS schemes have been presented in the literature. Some approaches focus on the use of nonparametric mixture distributions as proposal functions [47, 48, 49, 50]. The number of components in the mixture varies with iterations according to certain statistical criteria. In many of these schemes, the newly added mixands (i.e., a new

component) are located and weighted by using the samples generated at the previous iteration [47, 48]. Then, certain components are removed or replaced according to a resampling step. The parameters of the components are adapted using global [47] or local [48] strategies. In other approaches, the optimal parameters of the components are obtained via empirical estimation [49]. In [50], a strategy that initiates with a very large number of mixands is described. As the algorithm proceeds, some of the mixands are merged and some are removed using a clustering procedure. An alternative to this method is the incremental mixture importance sampling (IMIS) method. With IMIS, the proposal mixture distribution is adapted by incrementally adding components [51] based on the highest importance weights from the previous iteration.

More recently, the idea of incremental mixtures has been promoted by another AIS scheme, the adaptive multiple importance sampling (AMIS) method [20, 21, 52]. In AMIS, at a particular iteration, only one proposal function is used to generate all samples. However, at each iteration the past and present weights are recomputed by using the so-called deterministic mixture approach [53, 54]. The mixture is composed of the present and all past proposal distributions. Thus, a new component is incorporated into the mixture at each iteration (in this sense, it is incremental). Also, the parameters of the next iteration's proposal are obtained by using all particles and weights up to that moment. It has been demonstrated that the deterministic mixture approach stabilizes the iterative importance sampling estimator. Thus, AMIS often provides better performance than other techniques. For instance, it has been successfully applied to biological data analysis [55]. In the previous works, some elements of the mixture, e.g., weights or covariances, remain fixed, i.e., are not adapted. A complete adaptation of a mixture of proposal distributions has been designed in [2], from the theoretical and practical points of view.
In this case, the weights and parameters of each mixand are updated in order to minimize the Kullback-Leibler divergence between the target and proposal distributions. Since the adaptation of weights and covariance matrices is in general critical for adequate performance of the algorithms, other methods adapt only the location parameters [22], in order to provide a more robust performance. With the adaptive population importance sampling (APIS) method [22, 23], each component of the mixture can use different parameters as in PMC, but the parameters of a mixand are updated to reflect the local features of the target. The weights of the mixture remain the same, equal for all the mixands in its basic formulation, as suggested in [56]. APIS also takes advantage of the deterministic mixture approach [53, 54], but with the population of proposal distributions at the same iteration, instead of creating a mixture among proposal distributions obtained at different time steps, as in AMIS. Finally, combinations of MCMC and importance sampling techniques have been studied to obtain further improved AIS methods. For instance, MCMC steps are used to accelerate the adaptation of the AIS technique [23, 57] or, more generally, the MCMC outputs are used to build a proposal distribution for AIS estimation [58].

In general, the proof of consistency of the estimators obtained by the AIS schemes is based on the same arguments used for the standard importance sampling method [1, 4]. In fact, the AIS estimator that is calculated at the last iteration of the method (I is the maximum number of iterations of the algorithm) can be interpreted as a standard static importance sampling estimator built using I different proposal distributions. The details of this analysis can be found in [23] and Chapter 4 of [1]. The consistency of the method when the number of samples M tends to infinity for a fixed number of iterations (with fixed I) is, in general, straightforward based on simple Monte Carlo arguments.
However, the proof for an infinite number of iterations and a fixed number of samples (I → ∞ with fixed M) is more involved, since the partial estimators at each iteration of the AIS scheme are usually biased (see the discussion in [23]). The convergence of some specific AIS schemes, like AMIS, has received special attention in [2]. There, the algorithm, which originally considered all samples from the past and current iterations for the update of the proposal, was modified to facilitate the theoretical analysis, and only the current samples at a particular iteration were used in the update of the proposal function. Some additional theoretical results can be found in [28, 37].

4. AIS schemes

In this section, we first review the generic characteristics of the AIS methodology as well as its basic steps. Then, we provide details of four different implementations of AIS. In particular, we focus on the standard PMC algorithm [3], AIS with mixture Gaussian proposals [2], adaptive multiple importance sampling (AMIS) [20, 21], and the adaptive population importance sampler (APIS) [22, 23].
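Before turning to the individual schemes, the generic AIS loop of Section 2 can be sketched end to end: draw from the current proposal, weight, adapt, and merge the samples of all iterations with recomputed weights as in Eqs. (6)–(8). The 1-D target, the Gaussian proposal, and the mean-recentering adaptation rule below are illustrative assumptions, not a prescription from any of the reviewed methods.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: unnormalized target p(x) ~ N(2, 1); Gaussian proposals whose
# mean is re-centered on the current weighted estimate at each iteration.
def p(x):
    return np.exp(-0.5 * (x - 2.0) ** 2)

def q_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

M, I, sig = 2000, 4, 2.0
mu = -1.0                                   # deliberately poor initial mean
all_x, all_w = [], []
for i in range(I + 1):
    x = rng.normal(mu, sig, size=M)         # draw from the current proposal
    w = p(x) / q_pdf(x, mu, sig)            # unnormalized weights
    mu = np.sum(w / w.sum() * x)            # adapt: next proposal mean
    all_x.append(x)
    all_w.append(w)

# Merge all iterations: recomputed weights summing to one (cf. Eqs. (6)-(7))
x_all = np.concatenate(all_x)
w_all = np.concatenate(all_w)
w_tilde = w_all / w_all.sum()
mean_est = np.sum(w_tilde * x_all)          # posterior-mean estimate, ~2
Z_hat = w_all.mean()                        # normalizing constant (cf. Eq. (8))
```

Note that the poorly weighted early iterations receive small recomputed weights automatically, so keeping all samples does not spoil the combined estimate.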

Table 1. Summary of the notation of the AIS methodology.

Indices:
  i — index denoting the iteration of the algorithm
  I — total number of iterations
  m — index denoting the m-th sample
  M — total number of samples at each iteration
  d — index denoting the d-th mixand
  D — total number of mixands in the proposal

Functions and weights:
  p — target function
  π — proposal function
  q — mixand in the proposal π
  w_i^(m) — unnormalized weight of the m-th sample at the i-th iteration
  ẅ_i^(m) — unnormalized weight used in APIS
  w̄_i^(m) — weight of the m-th sample, normalized considering the samples drawn at the i-th iteration
  w̃_τ^(m) — weight of the m-th sample at the τ-th iteration (τ ≤ i), normalized considering all the samples drawn up to the i-th iteration

4.1. Outline of the general scheme

As indicated in Section 2, the objective of the AIS algorithms is the approximation of a target distribution, p(x), by a set of samples and weights. Since the samples cannot generally be drawn directly from p(x), one uses a proposal function, π(x), to generate the samples and assigns them weights, which indicate the relevance of the samples to the approximation. The AIS methods are iterative and improve the proposal function with each iteration. We note that all methods reviewed in this paper assume parametric proposal functions. Alternative schemes could be considered, as is the case in [47, 48, 49, 50], where nonparametric proposal pdfs are used. The main notation is summarized in Table 1, and a graphical representation of the general outline of the AIS methodology, including the notation, is provided in Fig. 2. In general, at the i-th iteration, the AIS techniques move from a random measure, χ_{i−1} = {x_{i−1}^(m), w̄_{i−1}^(m)}_{m=1}^{M}, to a new random measure, χ_i, where the samples are generated from an improved proposal constructed from χ_{i−1} and the weights are properly updated. In particular, the generation of new samples is given by

    x_i^(m) ∼ π_i^(m),  m = 1, ..., M,    (9)

where π_i^(m) is the proposal function for the m-th sample at the i-th iteration.
It is in the construction of the proposal where the AIS algorithms differ:

Standard PMC [3]: Each sample is generated from a different proposal, with parameters updated from the previous iteration and defined by the previous sample, i.e., x_i^(m) ∼ π_i^(m)(x) = q_{i,m}(x | μ_{i−1}^(m), Σ), where μ and Σ denote the mean and covariance matrix defining the parametric distribution. Note that one can interpret that PMC uses a mixture of distributions q_{i,m}, i = 1, ..., I, m = 1, ..., M, where the selection of a particular mixand to draw one sample is deterministic (one sample is generated from each of the proposal pdfs). This observation establishes an important connection with the rest of the techniques analyzed below.

AIS with mixture Gaussian proposals [2]: There is one proposal distribution from which all samples are generated. This function is a mixture of Gaussians with D components, whose parameters are updated at each iteration. The m-th sample at the i-th iteration is generated by randomly selecting one of the mixands, i.e., x_i^(m) ∼ π_i(x) = Σ_{d=1}^{D} α_{i,d} q_{i,d}(x | μ_{i,d}, Σ_{i,d}), where α_{i,d}, μ_{i,d} and Σ_{i,d} denote the weight, mean and covariance matrix of the d-th component of the mixture. In the sequel, we also refer to this method as Mix-AIS.

AMIS [20, 21]: The M samples at the i-th iteration are generated using the same proposal function, with parameters obtained from all samples and weights up to that iteration, i.e., x_i^(m) ∼ π_i(x) = q_i(x | μ_{i−1}, Σ_{i−1}), where μ_{i−1} and Σ_{i−1} denote the mean and covariance matrix calculated at the (i−1)-th iteration using all samples and weights. AMIS also exploits the deterministic mixture approach in the computation of the weights. At the i-th iteration, all samples up to that iteration are considered to have been generated from the temporal mixture (1/(i+1)) Σ_{j=0}^{i} q_j(x | μ_{j−1}, Σ_{j−1}).

APIS [22, 23]: At each iteration, there are D proposal pdfs, and the same number of samples is drawn from each of them, i.e., if one generates M samples, N = M/D ≥ 2 samples come from each of the proposal functions.

Therefore, x_i^(m) ∼ π_i^(m)(x) = q_{i,d}(x | μ_{i,d}, Σ_{i,d}), with d = ⌊(m−1)/N⌋ + 1, where μ_{i,d} and Σ_{i,d} denote the mean and covariance matrix of the d-th proposal and are updated using the samples generated from that proposal. A useful interpretation is that APIS uses a mixture of proposal distributions, q_{i,d}, where the selection of the mixand to draw each sample from is deterministic (N samples are generated from each of the D proposal pdfs, such that ND = M).

For each of the previous variants of AIS, one needs to update the corresponding weights taking into account the considered proposal pdf. In the next subsections we detail the specifics of each of the approaches. The main steps of all the schemes are:

Initialization: Obtain χ_0 = {x_0^(m), w_0^(m)}_{m=1}^{M} from a prior distribution. This step also involves initialization of the parameters needed for the first proposal.

For i = 1 : I

1. Generate new samples: x_i^(m) ∼ π_i^(m)(x), m = 1, ..., M. The way the proposal functions π_i^(m)(x) are built distinguishes the different approaches.

2. Calculate the unnormalized weights:

    w_i^(m) = p(x_i^(m)) / π_i^(m)(x_i^(m)),  m = 1, ..., M.    (10)

As seen later, the unnormalized weights are usually calculated as the ratio between the target and the proposal. The AMIS algorithm evaluates the weights by considering the cumulative proposal resulting from the addition of all proposal pdfs up to the current iteration. In addition, it also recalculates the weights of samples from previous iterations in the same way. The APIS algorithm contemplates two different unnormalized weights: on the one hand, the ratio between the target and the individual proposal, and on the other hand, the ratio between the target and the composite proposal resulting from the mixture of all individual proposals. The former is used for adaptation of the next iteration's parameters of each proposal pdf, while the latter is used for approximation of the target distribution.

3. Normalize the weights: Two different types of normalization are considered:

(a) For adaptation of the necessary parameters of the next iteration's proposal,

    w̄_i^(m) = w_i^(m) / Σ_{j=1}^{M} w_i^(j),  m = 1, ..., M.    (11)
The standard PMC also uses these weights for the resampling operation. In PMC, the discrete random measure degenerates quickly and only a few samples are assigned meaningful weights. This degradation leads to a deteriorated performance of the method. Resampling eliminates samples with small weights and replicates particles with large weights [4, 10, 59, 60]. It is important to note that the AMIS approach does not normalize the weights using the previous expression but the next one. This is due to the cumulative nature of the algorithm, which constructs the proposal pdfs by updating the parameters with the samples from all previous iterations. Therefore, the weights used for adaptation of these parameters are normalized across all the iterations. Using the generated samples and the calculated weights, one obtains the next random measure, χ_i = {x_i^(m), w̄_i^(m)}_{m=1}^{M}.

(b) For approximation of the target distribution, as well as calculation of point estimates of the unknowns,

    w̃_τ^(m) = w_τ^(m) / Σ_{ρ=0}^{i} Σ_{j=1}^{M} w_ρ^(j),  τ = 0, ..., i,  m = 1, ..., M.    (12)

This is a global normalization across all weights from previous and current iterations.

4. Approximate the target distribution: To achieve this, one uses all samples and globally normalized weights up to the present iteration,

    p̂_i(x) = Σ_{m=1}^{M} ( w̃_0^(m) δ(x − x_0^(m)) + w̃_1^(m) δ(x − x_1^(m)) + ··· + w̃_i^(m) δ(x − x_i^(m)) ).    (13)

5. Adapt the parameters for the next proposal: Depending on the version of the AIS method, one has to update the parameters of the next iteration's proposal accordingly. Details of this adaptation are provided in the next subsections.

Figure 2 illustrates the general AIS scheme, and Table 2 compares the expressions of the proposal pdfs, π_i^(m), as well as the number of mixands in the proposal pdfs used by each of the considered AIS implementations.

Figure 2. Outline of the general AIS scheme with its main steps and the used notation: at each iteration i, samples x_i^(m), m = 1, ..., M, are drawn from the proposal pdf (with mixands q_{i,1}(x), ..., q_{i,D}(x)); the unnormalized weights w_i^(m) (and ẅ_i^(m) for APIS) are then normalized, with w̄_i^(m) used for adapting the proposal and w̃_i^(m) for approximating the target p(x).

Table 2. Expressions of the proposal and number of mixands per proposal for the different AIS implementations.

  Method    Proposal pdfs                                                    D
  PMC       π_i^(m)(x) = q_{i,m}(x | μ_{i−1}^(m), Σ)                         D = M
  Mix-AIS   π_i(x) = Σ_{d=1}^{D} α_{i,d} q_{i,d}(x | μ_{i,d}, Σ_{i,d})       D
  AMIS      π_i(x) = q_i(x | μ_{i−1}, Σ_{i−1})                               D = 1
  APIS      π_i^(m)(x) = q_{i,d}(x | μ_{i,d}, Σ_{i,d}), d = ⌊(m−1)/N⌋ + 1    D = M/N

4.2. Standard PMC

Here we detail the original PMC sampling algorithm proposed in [3]. We consider that at the beginning of the i-th iteration, the random measure from the (i−1)-th iteration, χ_{i−1} = {x_{i−1}^(m), w̄_{i−1}^(m)}_{m=1}^{M}, is available. To obtain the new random measure, the m-th sample is propagated from a proposal distribution, π_i^(m), described by a set of parameters obtained from the m-th sample at the previous iteration, i.e., π_i^(m)(x) = q_{i,m}(x | μ_{i−1}^(m), Σ). This implies that at each iteration there are M different proposal pdfs. We note that, as described in [3], all the proposals are of the same distribution type, e.g., all Gaussians or all Student's t-distributions. However, the parameters of those proposal pdfs are different in that they come from different parental samples. We also note that the proposals do not change their

type from iteration to iteration. In a more general setting, one could have the proposal pdfs being of different distribution types and changing their types at each iteration. The proposals considered in [3] are characterized by their means and covariance matrices, i.e., the m-th proposal depends on μ_{i−1,m} and Σ_{i−1,m} from the previous iteration. Table 8 summarizes the algorithm. Note that, at each iteration, the samples obtained by the PMC method can be interpreted to come from a mixture (1/M) Σ_{m=1}^{M} q_{i,m}(x | μ_{i−1}^(m), Σ), where exactly one sample is generated from each of the mixands. However, unlike AMIS and APIS, PMC does not take advantage of this consideration. This is perceived in the calculation of the weights, since the weight of each sample only reflects the proposal that the sample was drawn from, and not the entire mixture.

4.3. AIS with mixture Gaussian proposals

In the PMC sampling algorithm described in [2], the proposal distribution at the i-th iteration is the same for all particles and is given by a mixture of D Gaussians, i.e., π_i(x) = Σ_{d=1}^{D} α_{i,d} q_{i,d}(x | μ_{i,d}, Σ_{i,d}), where q_{i,d}(x | μ_{i,d}, Σ_{i,d}) = N(x | μ_{i,d}, Σ_{i,d}) is the d-th mixand, whose parameters, as well as its weight in the mixture, are updated at each iteration by using the generated samples and calculated weights. The generation of samples at the i-th iteration involves the selection of a component of the mixture according to the weights α_{i,d}, d = 1, ..., D. A sort of resampling is therefore inherent to the generation step. Table 9 summarizes the algorithm. We note that the subscript 0 used for initialization of the algorithm refers to the parameters of the prior distribution used to obtain the first set of samples. It is important to remark that the adaptation of the parameters in Table 9 is only valid if all the mixture components are Gaussians.
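These mixture-proposal mechanics can be sketched in one dimension as follows: a component is selected per sample according to the weights α_{i,d}, the sample is weighted against the whole mixture, and the mixture weights and means are updated by weighted moment matching. The bimodal target, the fixed component spread, and all settings are illustrative assumptions, and the covariances are deliberately not adapted, for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy bimodal unnormalized target; D = 2 Gaussian mixands with fixed spread.
def p(x):
    return np.exp(-0.5 * (x - 5.0) ** 2) + np.exp(-0.5 * (x + 5.0) ** 2)

D, M, I, sig = 2, 5000, 10, 3.0
alpha = np.full(D, 1.0 / D)                     # mixture weights
mu = np.array([-1.0, 1.0])                      # component means

def comp_pdf(x):
    # density of each mixand at each sample: shape (M, D)
    return np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

for i in range(I):
    d = rng.choice(D, size=M, p=alpha)          # pick a mixand per sample
    x = rng.normal(mu[d], sig)
    comp = comp_pdf(x)
    w = p(x) / (comp @ alpha)                   # weight against the whole mixture
    w_bar = w / w.sum()
    rho = alpha * comp                          # responsibility of each mixand
    rho /= rho.sum(axis=1, keepdims=True)
    alpha = (w_bar[:, None] * rho).sum(axis=0)  # update the mixture weights ...
    mu = (w_bar[:, None] * rho * x[:, None]).sum(axis=0) / (alpha + 1e-12)  # ... and means

Z_hat = w.mean()                                # ~ 2*sqrt(2*pi) for this target
```

Because each sample is weighted against the full mixture, the normalizing-constant estimate stays valid throughout the adaptation, while the two components migrate toward the two modes.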
In other cases, an additional theoretical derivation is needed [2].

4.4. AMIS

In the AMIS approach [20, 21], the proposal functions are incremental in the sense that at each iteration the new proposal is characterized by parameters that are obtained using all samples and weights from previous iterations. This proposal is used for the generation of all samples, i.e., π_i(x) = q_i(x | μ_{i−1}, Σ_{i−1}), ∀m. Also, at each iteration the weights of all samples (current and past) are computed (or recomputed, in the case of past samples) taking into account the incremental nature of the proposal, i.e., the weights measure the adequacy of the samples with respect to the composite distribution that accounts for all proposal pdfs up to that iteration, (1/(i+1)) Σ_{j=0}^{i} q_j(x | μ_{j−1}, Σ_{j−1}). This mixture, with an increasing number of components, can be considered as the complete reference proposal pdf for the weight calculation used in AMIS. All the samples from all the iterations, as well as the properly recomputed and normalized weights, are used for the adjustment of the parameters of the next iteration's proposal. Table 10 summarizes the algorithm. We note that the subscript 0 used for initialization of the algorithm refers to the parameters of the prior distribution used to obtain the first set of samples.

4.5. APIS

The algorithm proposed in [22, 23] uses D proposal functions at each iteration to obtain the samples, i.e., it generates N = M/D ≥ 2 samples per proposal function (N ∈ N). Namely, the total number of samples per iteration is M = ND. Since N samples are generated deterministically from each mixand at each iteration, one can also consider that the M = ND samples drawn at the i-th iteration are distributed according to the mixture pdf ψ_i(x) = (1/D) Σ_{d=1}^{D} q_{i,d}(x | μ_{i,d}, Σ_{i,d}). This mixture pdf is considered as the proposal pdf from which the samples are obtained when the weights are calculated. The parameters of each proposal pdf for the next iteration are updated using only the N samples generated from that particular proposal, q_{i,d}. Note that it is assumed that M ≥ D and M is a multiple of D.
The weights used for the adaptation procedure, w_i^(m), are calculated as the ratio between the target and the proposal used for the generation of the samples (i.e., considering only one mixand, q_{i,d}, in the denominator). However, the weights used for approximation of the target distribution, ẅ_i^(m), account for the complete mixture, ψ_i(x), as the proposal. In the particular case D = 1, APIS reduces to a standard AIS scheme which only updates the means of the proposal pdfs. Table 11 summarizes the algorithm. We remark that the operation ⌊y⌋, where y is an arbitrary argument, denotes the largest integer not greater than y.
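The two weight streams just described can be sketched as follows: per-mixand weights drive a local, means-only adaptation, while deterministic-mixture weights against ψ_i(x) build the target approximation. The bimodal target, the fixed spread, and all settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy bimodal unnormalized target; D Gaussian mixands with fixed spread.
def p(x):
    return np.exp(-0.5 * (x - 4.0) ** 2) + np.exp(-0.5 * (x + 4.0) ** 2)

def q_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

D, N, I, sig = 10, 100, 20, 1.5                 # M = N*D samples per iteration
mu = rng.uniform(-8.0, 8.0, size=D)             # one mean per proposal
for i in range(I):
    x = rng.normal(mu[:, None], sig, size=(D, N))          # N draws per mixand
    w_adapt = p(x) / q_pdf(x, mu[:, None], sig)            # vs. own mixand only
    psi = np.mean([q_pdf(x, m, sig) for m in mu], axis=0)  # full mixture psi_i(x)
    w_dm = p(x) / psi                                      # deterministic-mixture weights
    # local adaptation: each mean moves to its own weighted sample average
    mu = (w_adapt * x).sum(axis=1) / (w_adapt.sum(axis=1) + 1e-12)

Z_hat = w_dm.mean()                             # ~ 2*sqrt(2*pi) for this target
```

The per-mixand weights make each proposal track the local features of the target near its own samples, while the deterministic-mixture weights keep the global estimators stable even when individual proposals are poorly placed.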

Table 3. Computational cost: total number of target and proposal evaluations when drawing M samples per iteration and after I iterations (M_I = M(I+1)):

- PMC (D = M): target evaluations M(I+1) = M_I; proposal evaluations M(I+1) = M_I.
- Mix-PMC (AIS with a mixture of D Gaussians): target M(I+1) = M_I; proposal DM(I+1) = M_I D.
- AMIS (D = 1): target M(I+1) = M_I; proposal Σ_{i=0}^{I} M(i+1) = M(I+1)(I+2)/2 = M_I (I+2)/2.
- APIS (D proposals, M = ND): target M(I+1) = ND(I+1) = M_I; proposal DM(I+1) = ND²(I+1) = M_I D.

Computational cost

The computational cost of the AIS techniques is determined by the total number of samples drawn at each iteration, M, the number of proposal functions used for the generation of samples, D, and the total number of iterations, I. We remark that I is also the number of adaptations of each proposal function. We define the total number of samples used for the estimation of the target distribution as M_I = M(I+1), where we have taken into account the M samples obtained at the initialization step of the algorithm. In this paper, we compare the complexity of the different methods discussed in the previous section in terms of the computational cost associated with the calculation of the weights. Clearly, there are other operations which affect the computational cost, such as the generation of samples, the update of the parameters of the proposal for the next iteration, or the resampling step in the standard PMC. For the generation of samples, and given that all algorithms are assumed to use Gaussian distributions, we consider that they have the same computational complexity for this operation, with the disclaimer that, in the case of the AIS with mixture Gaussian proposal pdfs (Mix-PMC), one additional random number per sample is needed to choose the component of the mixture from which the particle is drawn. The update of the parameters can be assumed equivalent in complexity for all methods, and it is more critical for large values of M, D and I. Here, for simplicity, we do not consider it. The same holds for the resampling step of the PMC; since it depends on the specific type of resampling, it will be ignored to make the comparison more straightforward.
Table 3 shows the number of operations needed for the calculation of the weights for each of the considered methods. This calculation corresponds to obtaining the ratio of the target distribution to the proposal distribution. It is performed at each iteration and for each generated sample (we note that AMIS also recomputes, at each iteration, the weights corresponding to the previous samples). Therefore, the table reflects the total number of evaluations of the target and proposal pdfs for the different techniques, and the overall complexity is the result of the addition of both numbers. For all the methods, the number of target evaluations is M_I = M(I+1), i.e., for each sample at each iteration we evaluate the target once. The evaluation of the denominator varies with the considered method. The standard PMC appears as the lightest algorithm, and its number of evaluations of the proposal grows linearly with D (or M, which coincide in this case) and I. In this case, there is one proposal per sample that is evaluated at each iteration. In the Mix-PMC with mixture Gaussian proposals, the evaluation of the denominator for a particular sample at a given iteration takes into account the D components of the mixture, and therefore the number of proposal functions that must be evaluated at each step is proportional to D. The same argument holds for APIS, where the calculation of the weights reflects the appropriateness of the sample with respect to the mixture of D components. Finally, for AMIS, the computational load increases with each iteration, since the evaluation of the denominator takes into account the adequacy of the samples to all the previous proposal pdfs and to the current one. In addition, one also recalculates all the previous weights. This latter operation involves the evaluation of the new proposal at all old samples (one assumes that the evaluations of the previous proposals are stored, so no recalculation is needed). Therefore, in AMIS it is computationally expensive to adapt the proposal function many times, i.e., large values of I increase its complexity.
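The counts of Table 3, as reconstructed above, can be checked with a few lines; the numeric settings below are only an example.

```python
# Evaluation counts after I adaptations with M samples per iteration
# (M_I = M*(I+1), counting the initialization batch): every method pays
# M_I target evaluations, plus a method-dependent number of proposal
# evaluations.  D = M for PMC and D = 1 for AMIS.
def eval_counts(method, M, I, D=None):
    M_I = M * (I + 1)                                   # target evaluations
    if method == "PMC":                                 # one proposal per sample
        proposal = M_I
    elif method in ("Mix-PMC", "APIS"):                 # D-component mixture
        proposal = D * M_I
    elif method == "AMIS":                              # all past weights redone
        proposal = sum(M * (i + 1) for i in range(I + 1))  # = M(I+1)(I+2)/2
    else:
        raise ValueError(method)
    return M_I, proposal

M, I = 1000, 199                       # a budget of M_I = 2*10**5 samples
print(eval_counts("PMC", M, I))        # proposal cost linear in I
print(eval_counts("AMIS", M, I))       # proposal cost quadratic in I
```

The quadratic term of AMIS is exactly the "recompute all past weights" step discussed above.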
5. Examples

In this section we provide computer simulation results corresponding to three different problems. Namely, we first discuss the well-known banana-shaped distribution problem [5] as a proof of concept. Then, we consider a high-dimensional setting with a multimodal target distribution, and finally we discuss an example of an AR model in a non-Gaussian noise scenario.

5.1. Banana-shaped distribution

We consider as target distribution the bi-dimensional banana-shaped distribution [5], which is a benchmark function in the literature due to its nonlinear nature. Mathematically, it is expressed as

p(x_1, x_2) ∝ exp( -(4 - B x_1 - x_2²)² / (2η_1²) - x_1² / (2η_2²) - x_2² / (2η_3²) ),   (4)

where, in this case, we set B = 10, η_1 = 4, η_2 = 5 and η_3 = 5. The objective is to estimate the expected value E[X], where X = [X_1, X_2] ∼ p(x_1, x_2), by applying the different Monte Carlo approximations. We can approximately obtain the true value E[X] ≈ [-0.4845, 0] (using an exhaustive deterministic numerical method with an extremely thin grid), in order to obtain the mean square error (MSE) for the comparison of the standard PMC, the AIS with mixture Gaussian proposal pdfs, the AMIS and the APIS (labeled in the figures, and referred to in this section, as PMC, Mix-PMC, AMIS, and APIS, respectively). For simplicity, we consider Gaussian proposal distributions for all the algorithms. The initialization is performed by randomly drawing the parameters of the Gaussians, with the mean of the j-th prior given by µ_{0,j} ∼ U([-6, 3] × [-4, 4]) and its covariance matrix given by Σ_{0,j} = diag(σ²_{j,1}, σ²_{j,2}). We note that the subscript 0 indicates that the distribution is a prior, and the subscript j denotes the mixand in the initial mixture (i.e., in the standard PMC we have M priors, one per sample; in Mix-PMC we have the D components of the mixture; in AMIS we have one proposal; and in APIS we have D proposal pdfs). We contemplate two cases: an isotropic setting where σ_{j,1} = σ_{j,2} = σ, with σ ∈ {0.5, 1, 2, 3, 5, 10, 70}, and an anisotropic case, with a random selection of the parameters to test the robustness of the methods, where σ_{j,1} ∼ U([1, 20]) and σ_{j,2} ∼ U([1, 20]). Recall that the standard PMC and the APIS do not adapt the covariance matrices along the iterations but consider them fixed. For each of the algorithms, at each iteration, M samples are generated.
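The "exhaustive deterministic numerical method" mentioned above amounts to evaluating Eq. (4) on a fine grid and normalizing numerically; the sketch below does this with the parameter values of the text, while the grid limits and resolution are illustrative assumptions.

```python
import numpy as np

# unnormalized banana-shaped density of Eq. (4), with B = 10,
# eta1 = 4 and eta2 = eta3 = 5 as in the text
B, eta1, eta2, eta3 = 10.0, 4.0, 5.0, 5.0

def banana(x1, x2):
    return np.exp(-(4.0 - B * x1 - x2 ** 2) ** 2 / (2.0 * eta1 ** 2)
                  - x1 ** 2 / (2.0 * eta2 ** 2)
                  - x2 ** 2 / (2.0 * eta3 ** 2))

# deterministic grid approximation of E[X]: normalize the density
# numerically, then average the grid points with those probabilities
g = np.linspace(-15.0, 15.0, 1501)
X1, X2 = np.meshgrid(g, g, indexing="ij")
P = banana(X1, X2)
P /= P.sum()
mean = np.array([(P * X1).sum(), (P * X2).sum()])
```

Since Eq. (4) is even in x_2, the second component of E[X] is exactly zero; only the first component genuinely requires the numerical integration.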
We run three different experiments.

Test 1: fixed total number of samples M_I. We keep fixed the total number of samples, M_I = M(I+1) = 2·10^5 (namely, the total number of generated samples and of evaluations of the target), and vary the value of σ. Different combinations of parameters are examined:

- For PMC: M ∈ {50, 100, 10^3, 5·10^3}, with M_I = 2·10^5. Note that D = M in the basic PMC.
- For Mix-PMC: M ∈ {100, 200, 10^3, 2·10^3, 5·10^3}, M_I = 2·10^5 and D ∈ {10, 50, 100}.
- For AMIS: M ∈ {50, 10^3, 2·10^3, 5·10^3, 10^4}, with M_I = 2·10^5. Recall that D = 1 in AMIS.
- For APIS: M ∈ {100, 200, 10^3, 2·10^3, 5·10^3}, M_I = 2·10^5 and D ∈ {50, 100}. The parameter N for the APIS method is set such that N = M/D ≥ 2.

The ranges of values of the parameters were chosen, after a preliminary study, in order to obtain the best performance from each technique.

Test 2: fixed M and σ, varying I. We set M = 10^3 and σ = 5. The total number of samples M_I = M(I+1) changes depending on the value of I, which represents the total number of adaptation steps of the algorithms.

Test 3: fixed I and σ, varying M. We set I = 40 and σ = 5. The total number of samples M_I = M(I+1) changes depending on the value of M, which represents the number of samples generated at each iteration.

For Test 2 and Test 3, we set D = 100 (the number of proposals or, what is the same, of components in the proposal mixture) for the Mix-PMC and APIS schemes. We recall that, in Mix-PMC and AMIS, the covariance matrices are also updated during the adaptation, and therefore the value σ for these methods only represents the initial parameters. The results are averaged over 500 independent simulations for each combination of parameters. Table 4 shows the smallest and highest MSE values obtained in the estimation of the expected value of the target (averaged between the two components of E[X]) achieved by the different methods in the first experiment (Test 1). The smallest MSE in each column (each σ) is highlighted. The log-MSEs as a function of σ are shown in Figs. 3(a)-(b), while Figs. 3(c)-(d) show the log-MSE as a function of I (Test 2) and of M (Test 3), respectively. Finally, Fig. 4 displays the final configurations of the means for the different algorithms.
Note that for the x-ais and AIS the covarance matrces are also adapted and the crcles show approxmately 85% of probablty mass. In ths example, AIS and APIS

Table 4. Bi-dimensional banana-shaped distribution example: best and worst results in terms of MSE obtained with the different techniques (PMC, Mix-PMC, AMIS and APIS) for different values of σ, including the anisotropic case σ_{j,1}, σ_{j,2} ∼ U([1, 20]) (Test 1). The smallest MSE for each σ is bold-faced.

provide the best results. AMIS works better when started with larger initial parameters (since AMIS adapts the variances as well), whereas APIS prefers smaller initial parameters. In general, APIS needs to tune the parameter N = M/D as a function of the variance σ. For instance, with σ = 70 the results show that a larger N is necessary. However, owing to this additional degree of freedom compared to the standard PMC, APIS is able to reach good performance.

5.2. Multimodal target distribution

In this section, we consider a high-dimensional multimodal distribution, x ∈ R^10, defined as a mixture of three multivariate Gaussians,

p(x) ∝ (1/3) Σ_{n=1}^{3} N(x | ν_n, Λ_n),   (5)

with ν_n = [ν_{n,1}, ..., ν_{n,10}]ᵀ representing the 10-dimensional vector of means corresponding to the n-th component of the multivariate distribution, and Λ_n = σ_n I_10, n = 1, 2, 3, denoting the corresponding 10 × 10 covariance matrix, where I_10 is the identity matrix. We set ν_{1,j} = -6 and ν_{2,j} = 5, with j = 1, ..., 10, and ν_3 = [1, 2, 3, 4, 5, 5, 4, 3, 2, 1]ᵀ. Moreover, we set σ_n = 3 for all n = 1, 2, 3. For this target distribution p(x), one can analytically calculate the expected value, which is given by E[X] = (1/3)(ν_1 + ν_2 + ν_3), where X ∼ p(x). As in the previous example, we consider Gaussian proposal functions for all the compared methods. The initialization is performed by randomly drawing the means of the Gaussians, with the mean of the d-th pdf given by µ_{0,d} ∼ U([-10, 10]^10), d = 1, ..., D. We note that, unlike in the previous example, this is a good initialization, since the hyper-rectangle [-10, 10]^10 contains all the modes of the target distribution. We also use different initial covariance matrices, with the one corresponding to the d-th proposal pdf being Σ_{0,d} = σ I_10 and σ ∼ U([1, 10]).
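Because the mixture weights of Eq. (5) are equal, the true mean is just the average of the three component means; the sketch below checks this by direct sampling (the random seed and the sample size are illustrative choices).

```python
import numpy as np

# equally weighted mixture of Eq. (5): the true mean is the plain average
# of the three component means (values as given in the text)
dim = 10
nu1 = np.full(dim, -6.0)
nu2 = np.full(dim, 5.0)
nu3 = np.array([1., 2., 3., 4., 5., 5., 4., 3., 2., 1.])
true_mean = (nu1 + nu2 + nu3) / 3.0

# quick sanity check by direct sampling from the mixture (Lambda_n = 3 I_10)
rng = np.random.default_rng(0)
n = 200_000
comp = rng.integers(0, 3, size=n)                  # component indicators
x = np.stack([nu1, nu2, nu3])[comp] + np.sqrt(3.0) * rng.standard_normal((n, dim))
mc_mean = x.mean(axis=0)
```

Having the exact E[X] available is what makes this target convenient for measuring the MSE of the different samplers.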
We test different combinations of parameters, keeping the total number of used samples fixed, M_I = M(I+1) = 4·10^5. We evaluate different values of M, namely M ∈ {100, 200, 500, 10^3, 2·10^3, 5·10^3, 10^4, 2·10^4, 4·10^4, 10^5}, and, as a consequence, I = 4·10^5/M - 1. We recall that D = M in PMC and D = 1 in AMIS. For the Mix-PMC and APIS methods we use D ∈ {10, 100, 200}. The parameter N for APIS is set so that N = M/D ≥ 2. We run 500 independent simulations and compute the MSE of the estimate of E[X] (we average the MSEs of the components). The worst, the best and the averaged results in terms of MSE are shown in Table 5. As seen, for a multimodal target, if a good initialization is used, the standard PMC obtains better performance than in the previous example, providing the minimum MSE. APIS also shows good performance, achieving the best averaged results. APIS obtains its worst result in the extreme case N = M/D = 2 (the smallest possible value). AMIS suffers in the multimodal scenario, since it often gets trapped in a specific mode. Finally, Fig. 5 depicts a particular bi-dimensional slice of the target pdf (with x = [x_1, ..., x_10] ∈ R^10), together with the last configurations of the location parameters (means) of the proposal pdfs for PMC and APIS in one specific run. The slice is a function of x_1 and x_10, and it is obtained by keeping the other variables fixed at x_{2:5} = 9.85 and x_{6:9} = 8.5. Moreover, in these plots, we set D = 20 for both methods, µ_{0,d} ∼ U([-6, 6]^10) (initial means, d = 1, ..., D), σ ∼ U([1, 6]), and M_I = 2·10^5 (in APIS, M = ND). It is clear that the PMC fails to detect the two modes, while APIS is able to lock onto both of them (recall that the plot is a slice of a pdf in R^10).

Figure 3. Bi-dimensional banana-shaped distribution example: log-MSE of the different algorithms (triangles: PMC; x-marks: Mix-PMC; squares: AMIS; circles: APIS): (a) best results for Test 1; (b) worst results for Test 1 (as a function of σ); (c) Test 2 (as a function of I); (d) Test 3 (as a function of M).

5.3. AR filters with non-Gaussian noise

In this example, we consider the use of an autoregressive (AR) model contaminated by a non-Gaussian noise. This type of filter is often used for modelling financial time series, where the noise is assumed to follow the so-called generalized hyperbolic distribution [61, 62]. Namely, we consider the AR filter

y_k = Σ_{r=1}^{R} x_r y_{k-r} + u_k,   k = 1, ..., K,   (6)

where u_k represents a heavy-tailed driving noise following a generalized hyperbolic distribution,

p(u) ∝ e^{β(u-µ)} B_{λ-1/2}( α √(δ² + (u-µ)²) ) / ( √(δ² + (u-µ)²) )^{1/2-λ},   (7)

Table 5. Multimodal target distribution example (x ∈ R^10): the worst, best and averaged results in terms of MSE for the different techniques, after testing several combinations of parameters. The smallest MSE in each row is bold-faced.
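Generating synthetic data from the model of Eq. (6) is straightforward once the driving noise can be sampled; since exact sampling of the generalized hyperbolic distribution is not trivial (the article resorts to an MCMC sampler for it, see the footnote below), the sketch uses a Student-t draw as a stand-in heavy-tailed noise, purely for illustration.

```python
import numpy as np

def simulate_ar(coeffs, K, rng, df=3.0):
    """Simulate Eq. (6): y_k = sum_r x_r y_{k-r} + u_k, with zero initial
    conditions and heavy-tailed (here Student-t, as a stand-in) noise u_k."""
    R = len(coeffs)
    y = np.zeros(K + R)                        # y_{1-r} = 0 initial conditions
    for k in range(R, K + R):
        u_k = rng.standard_t(df)               # heavy-tailed driving noise
        y[k] = coeffs @ y[k - R:k][::-1] + u_k # sum_r x_r y_{k-r} + u_k
    return y[R:]

rng = np.random.default_rng(0)
coeffs = 0.7 * np.exp(-np.arange(1, 31))       # x_r = 0.7 e^{-r}
y = simulate_ar(coeffs, K=500, rng=rng)
```

With these coefficients, Σ_r |x_r| = 0.7 Σ_r e^{-r} ≈ 0.41 < 1, so the recursion is stable, which is exactly the property the R = 30 case below relies on.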

Figure 4. Bi-dimensional banana-shaped distribution example: final configurations of the location parameters (means) of the proposal distributions: (a) PMC (σ = 1, M = D = 100, I = 2·10^3); (b) Mix-PMC (σ = 5, M = 5·10^3, I = 40, D = 100); (c) AMIS (σ = 5, M = 5·10^3, I = 40); (d) APIS (σ = 3, M = 100, I = 2·10^3, D = 10). The Mix-PMC and AMIS techniques also adapt the covariance matrices (the circles show approximately 85% of the probability mass).

with B_λ denoting the modified Bessel function of the second kind [63]. We set α = 2, β = 1, λ = 0.5, µ = 0, and δ = 1. In the next two subsections, we study two different cases separately, corresponding to R = 4 and R = 30, respectively.

5.3.1. Case R = 4

We synthetically generate K = 200 observations, {y_k}_{k=1}^{K}, setting x = [x_1, x_2, x_3, x_4]ᵀ = [0.5, 0.1, 0.8, 0.1]ᵀ and y_{1-r} = 0 for r = 1, 2, 3, 4.¹ Assuming improper uniform priors over the coefficients, the objective is to approximate the posterior of the vector x = [x_1, x_2, x_3, x_4]ᵀ. We note that, with K = 200, the target posterior is quite sharp and concentrated around the real values x_i, i = 1, 2, 3, 4; thus we take the vector x = [0.5, 0.1, 0.8, 0.1]ᵀ as the true expected value of the posterior. As in the previous examples, the methods use Gaussian proposal pdfs and the initial samples are selected randomly. In particular, for the j-th pdf, its mean is µ_{0,j} ∼ U(R), where R = [-5, 5] × [-5, 5] × [-5, 5] × [-5, 5], and its covariance matrix is Σ_{0,j} = diag(ξ²_j), with ξ²_j = [σ²_{j,1}, σ²_{j,2}, σ²_{j,3}, σ²_{j,4}] and σ_{j,r} ∼ U([1, 6]) for r = 1, ..., 4. We test different combinations of parameters, keeping the total number of used samples fixed, M_I = 2·10^6. We evaluate different values of M, namely M ∈ {200, 2·10^3, 5·10^3, 10^4, 2·10^4, 4·10^4, 10^5}, and, as a consequence, I = 2·10^6/M - 1. We recall that D = M in PMC and D = 1 in AMIS. For the Mix-PMC and APIS methods we use D ∈ {10, 20, 50}. The parameter N for APIS is N = M/D ≥ 2. We also run the Metropolis-Hastings (MH) algorithm [4, 10] with a random-walk Gaussian proposal distribution, using the same initialization parameters as for the rest of the methods.
We consider a number of iterations for MH such that its execution time is comparable to that of the other techniques (one can find that I_MH ≈ 4·10^4).²

¹ For the simulation of i.i.d. samples of the generalized hyperbolic noise, we have applied a fast and efficient MCMC technique (the FUSS algorithm [64]) drawing samples from univariate distributions. After a few iterations, the resulting samples are roughly independent.

² The MH algorithm is a sequential method, whereas the AIS methods are more easily parallelizable.
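To make the target of this example concrete, the following sketch builds the (unnormalized) posterior under a flat prior and estimates its mean with plain, non-adaptive importance sampling from a single wide Gaussian. The coefficient values, the Student-t stand-in for the generalized hyperbolic likelihood, and all settings are illustrative assumptions; the poor efficiency of a single fixed proposal in this problem is precisely what the adaptive schemes of the article improve upon.

```python
import numpy as np

rng = np.random.default_rng(0)
R, K = 4, 200
true_x = np.array([0.5, 0.1, 0.2, 0.05])   # hypothetical stable AR coefficients
y = np.zeros(K + R)                        # y_{1-r} = 0 initial conditions
for k in range(R, K + R):                  # generate the observations, Eq. (6)
    y[k] = true_x @ y[k - R:k][::-1] + rng.standard_t(3.0)

# lagged regressor matrix: row for time k holds [y_{k-1}, ..., y_{k-R}]
H = np.column_stack([y[R - r:K + R - r] for r in range(1, R + 1)])

n = 50_000
xs = rng.normal(0.0, 1.0, size=(n, R))     # single wide proposal q(x) = N(0, I_4)
resid = y[R:] - xs @ H.T                   # residuals u_k for every candidate x
logw = -2.0 * np.log1p(resid ** 2 / 3.0).sum(axis=1)  # Student-t(3) log-lik
w = np.exp(logw - logw.max())              # flat prior: weight = likelihood
w /= w.sum()
post_mean = w @ xs                         # self-normalized IS estimate of E[x|y]
```

With a sharp posterior and a fixed broad proposal, only a handful of the 50,000 candidates carry non-negligible weight, which is the degeneracy that iterative adaptation of the proposal parameters is designed to avoid.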

Figure 5. Multimodal target distribution example (x ∈ R^10): (a) bi-dimensional slice of the multimodal target pdf as a function of x_1 and x_10, fixing x_{2:5} = 9.85 and x_{6:9} = 8.5; (b) final configuration of the location parameters (means) of the proposal pdfs for APIS (D = 20, N = 10); and (c) final configuration of the location parameters (means) of the proposal pdfs for PMC (D = 20).

Table 6. AR model with non-Gaussian noise example with R = 4 unknown coefficients (i.e., x ∈ R^4): the worst, best and averaged results in terms of MSE for the different techniques (PMC, Mix-PMC, AMIS, APIS and MH), after testing several combinations of parameters. The smallest MSE in each row is bold-faced.

We run 500 independent simulations and compute the MSE in the estimation of x (we average the MSEs of the components). The worst, best and averaged results in terms of MSE obtained by the different methods are shown in Table 6. The best result is provided by the standard PMC, since the initialization of the samples is in a region that contains the mode of the posterior distribution, which is critical for the performance of the method. APIS provides the smallest difference between the worst and best results. Figure 6 illustrates the histograms of the estimates of the expected value of the target pdf (namely, the outputs of the algorithms in this example) obtained with the different techniques and some specific values of the parameters indicated in the captions of the subplots. The true values x = [0.5, 0.1, 0.8, 0.1]ᵀ are depicted with vertical dashed lines. This figure helps to understand that the results for AMIS in Table 6 do not properly reflect the overall performance of this method. Namely, as shown in Fig. 6(b), AMIS often provides values close to the true ones, but in some particular runs, depending on the initial parameters, the method does not converge and the results deteriorate the MSE performance. Nevertheless, as also seen in Fig. 6(b), AMIS may provide excellent performance. Finally, Fig.
7 displays the averaged convergence curves (in terms of the estimation of the values of x) for PMC, AMIS and APIS, respectively.

5.3.2. Case R = 30

We now consider the case x ∈ R^30, i.e., R = 30, with K = 500 observations. The data are generated setting x_r = 0.7 e^{-r} and y_{1-r} = 0, for r = 1, ..., 30. These coefficients ensure that the AR filter is stable. We use Gaussian proposal densities, with the initial samples selected randomly. In particular, for the j-th prior, its mean is µ_{0,j} ∼ U(R), where

R = [-2, 4]^30 = [-2, 4] × ... × [-2, 4]   (30 times),   (8)

and its covariance matrix is Σ_{0,j} = diag(ξ²_j), with ξ²_j = [σ²_{j,1}, ..., σ²_{j,30}] and σ_{j,r} ∼ U([1, 10]) for r = 1, ..., 30. We test different combinations of parameters, keeping the total number of used samples fixed, M_I = 2·10^5:

- For PMC: M ∈ {10^3, 5·10^3, 10^4, 2·10^4, 4·10^4, 5·10^4} (recall that M = D).
- For Mix-PMC: D ∈ {10, 20, 30} and M ∈ {100, 500, 10^3}.


More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

Statistics II Final Exam 26/6/18

Statistics II Final Exam 26/6/18 Statstcs II Fnal Exam 26/6/18 Academc Year 2017/18 Solutons Exam duraton: 2 h 30 mn 1. (3 ponts) A town hall s conductng a study to determne the amount of leftover food produced by the restaurants n the

More information

An adaptive SMC scheme for ABC. Bayesian Computation (ABC)

An adaptive SMC scheme for ABC. Bayesian Computation (ABC) An adaptve SMC scheme for Approxmate Bayesan Computaton (ABC) (ont work wth Prof. Mke West) Department of Statstcal Scence - Duke Unversty Aprl/2011 Approxmate Bayesan Computaton (ABC) Problems n whch

More information

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix

Lectures - Week 4 Matrix norms, Conditioning, Vector Spaces, Linear Independence, Spanning sets and Basis, Null space and Range of a Matrix Lectures - Week 4 Matrx norms, Condtonng, Vector Spaces, Lnear Independence, Spannng sets and Bass, Null space and Range of a Matrx Matrx Norms Now we turn to assocatng a number to each matrx. We could

More information

A New Evolutionary Computation Based Approach for Learning Bayesian Network

A New Evolutionary Computation Based Approach for Learning Bayesian Network Avalable onlne at www.scencedrect.com Proceda Engneerng 15 (2011) 4026 4030 Advanced n Control Engneerng and Informaton Scence A New Evolutonary Computaton Based Approach for Learnng Bayesan Network Yungang

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

DUE: WEDS FEB 21ST 2018

DUE: WEDS FEB 21ST 2018 HOMEWORK # 1: FINITE DIFFERENCES IN ONE DIMENSION DUE: WEDS FEB 21ST 2018 1. Theory Beam bendng s a classcal engneerng analyss. The tradtonal soluton technque makes smplfyng assumptons such as a constant

More information

Chapter 5 Multilevel Models

Chapter 5 Multilevel Models Chapter 5 Multlevel Models 5.1 Cross-sectonal multlevel models 5.1.1 Two-level models 5.1.2 Multple level models 5.1.3 Multple level modelng n other felds 5.2 Longtudnal multlevel models 5.2.1 Two-level

More information

Statistics for Economics & Business

Statistics for Economics & Business Statstcs for Economcs & Busness Smple Lnear Regresson Learnng Objectves In ths chapter, you learn: How to use regresson analyss to predct the value of a dependent varable based on an ndependent varable

More information

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function

A Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function Advanced Scence and Technology Letters, pp.83-87 http://dx.do.org/10.14257/astl.2014.53.20 A Partcle Flter Algorthm based on Mxng of Pror probablty densty and UKF as Generate Importance Functon Lu Lu 1,1,

More information

MMA and GCMMA two methods for nonlinear optimization

MMA and GCMMA two methods for nonlinear optimization MMA and GCMMA two methods for nonlnear optmzaton Krster Svanberg Optmzaton and Systems Theory, KTH, Stockholm, Sweden. krlle@math.kth.se Ths note descrbes the algorthms used n the author s 2007 mplementatons

More information

On the correction of the h-index for career length

On the correction of the h-index for career length 1 On the correcton of the h-ndex for career length by L. Egghe Unverstet Hasselt (UHasselt), Campus Depenbeek, Agoralaan, B-3590 Depenbeek, Belgum 1 and Unverstet Antwerpen (UA), IBW, Stadscampus, Venusstraat

More information

Convergence of random processes

Convergence of random processes DS-GA 12 Lecture notes 6 Fall 216 Convergence of random processes 1 Introducton In these notes we study convergence of dscrete random processes. Ths allows to characterze phenomena such as the law of large

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE

CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE CHAPTER 5 NUMERICAL EVALUATION OF DYNAMIC RESPONSE Analytcal soluton s usually not possble when exctaton vares arbtrarly wth tme or f the system s nonlnear. Such problems can be solved by numercal tmesteppng

More information

Kernel Methods and SVMs Extension

Kernel Methods and SVMs Extension Kernel Methods and SVMs Extenson The purpose of ths document s to revew materal covered n Machne Learnng 1 Supervsed Learnng regardng support vector machnes (SVMs). Ths document also provdes a general

More information

NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the

More information

Discussion of Extensions of the Gauss-Markov Theorem to the Case of Stochastic Regression Coefficients Ed Stanek

Discussion of Extensions of the Gauss-Markov Theorem to the Case of Stochastic Regression Coefficients Ed Stanek Dscusson of Extensons of the Gauss-arkov Theorem to the Case of Stochastc Regresson Coeffcents Ed Stanek Introducton Pfeffermann (984 dscusses extensons to the Gauss-arkov Theorem n settngs where regresson

More information

Solving Nonlinear Differential Equations by a Neural Network Method

Solving Nonlinear Differential Equations by a Neural Network Method Solvng Nonlnear Dfferental Equatons by a Neural Network Method Luce P. Aarts and Peter Van der Veer Delft Unversty of Technology, Faculty of Cvlengneerng and Geoscences, Secton of Cvlengneerng Informatcs,

More information

Global Sensitivity. Tuesday 20 th February, 2018

Global Sensitivity. Tuesday 20 th February, 2018 Global Senstvty Tuesday 2 th February, 28 ) Local Senstvty Most senstvty analyses [] are based on local estmates of senstvty, typcally by expandng the response n a Taylor seres about some specfc values

More information

Estimating the Fundamental Matrix by Transforming Image Points in Projective Space 1

Estimating the Fundamental Matrix by Transforming Image Points in Projective Space 1 Estmatng the Fundamental Matrx by Transformng Image Ponts n Projectve Space 1 Zhengyou Zhang and Charles Loop Mcrosoft Research, One Mcrosoft Way, Redmond, WA 98052, USA E-mal: fzhang,cloopg@mcrosoft.com

More information

Joint Statistical Meetings - Biopharmaceutical Section

Joint Statistical Meetings - Biopharmaceutical Section Iteratve Ch-Square Test for Equvalence of Multple Treatment Groups Te-Hua Ng*, U.S. Food and Drug Admnstraton 1401 Rockvlle Pke, #200S, HFM-217, Rockvlle, MD 20852-1448 Key Words: Equvalence Testng; Actve

More information

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl

Suppose that there s a measured wndow of data fff k () ; :::; ff k g of a sze w, measured dscretely wth varable dscretzaton step. It s convenent to pl RECURSIVE SPLINE INTERPOLATION METHOD FOR REAL TIME ENGINE CONTROL APPLICATIONS A. Stotsky Volvo Car Corporaton Engne Desgn and Development Dept. 97542, HA1N, SE- 405 31 Gothenburg Sweden. Emal: astotsky@volvocars.com

More information

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS Avalable onlne at http://sck.org J. Math. Comput. Sc. 3 (3), No., 6-3 ISSN: 97-537 COMPARISON OF SOME RELIABILITY CHARACTERISTICS BETWEEN REDUNDANT SYSTEMS REQUIRING SUPPORTING UNITS FOR THEIR OPERATIONS

More information

Chapter 6. Supplemental Text Material

Chapter 6. Supplemental Text Material Chapter 6. Supplemental Text Materal S6-. actor Effect Estmates are Least Squares Estmates We have gven heurstc or ntutve explanatons of how the estmates of the factor effects are obtaned n the textboo.

More information

Uncertainty in measurements of power and energy on power networks

Uncertainty in measurements of power and energy on power networks Uncertanty n measurements of power and energy on power networks E. Manov, N. Kolev Department of Measurement and Instrumentaton, Techncal Unversty Sofa, bul. Klment Ohrdsk No8, bl., 000 Sofa, Bulgara Tel./fax:

More information

DETERMINATION OF UNCERTAINTY ASSOCIATED WITH QUANTIZATION ERRORS USING THE BAYESIAN APPROACH

DETERMINATION OF UNCERTAINTY ASSOCIATED WITH QUANTIZATION ERRORS USING THE BAYESIAN APPROACH Proceedngs, XVII IMEKO World Congress, June 7, 3, Dubrovn, Croata Proceedngs, XVII IMEKO World Congress, June 7, 3, Dubrovn, Croata TC XVII IMEKO World Congress Metrology n the 3rd Mllennum June 7, 3,

More information

Natural Images, Gaussian Mixtures and Dead Leaves Supplementary Material

Natural Images, Gaussian Mixtures and Dead Leaves Supplementary Material Natural Images, Gaussan Mxtures and Dead Leaves Supplementary Materal Danel Zoran Interdscplnary Center for Neural Computaton Hebrew Unversty of Jerusalem Israel http://www.cs.huj.ac.l/ danez Yar Wess

More information

arxiv:cs.cv/ Jun 2000

arxiv:cs.cv/ Jun 2000 Correlaton over Decomposed Sgnals: A Non-Lnear Approach to Fast and Effectve Sequences Comparson Lucano da Fontoura Costa arxv:cs.cv/0006040 28 Jun 2000 Cybernetc Vson Research Group IFSC Unversty of São

More information

Lecture 4. Instructor: Haipeng Luo

Lecture 4. Instructor: Haipeng Luo Lecture 4 Instructor: Hapeng Luo In the followng lectures, we focus on the expert problem and study more adaptve algorthms. Although Hedge s proven to be worst-case optmal, one may wonder how well t would

More information

Singular Value Decomposition: Theory and Applications

Singular Value Decomposition: Theory and Applications Sngular Value Decomposton: Theory and Applcatons Danel Khashab Sprng 2015 Last Update: March 2, 2015 1 Introducton A = UDV where columns of U and V are orthonormal and matrx D s dagonal wth postve real

More information

Chapter 12 Analysis of Covariance

Chapter 12 Analysis of Covariance Chapter Analyss of Covarance Any scentfc experment s performed to know somethng that s unknown about a group of treatments and to test certan hypothess about the correspondng treatment effect When varablty

More information

SIMPLE LINEAR REGRESSION

SIMPLE LINEAR REGRESSION Smple Lnear Regresson and Correlaton Introducton Prevousl, our attenton has been focused on one varable whch we desgnated b x. Frequentl, t s desrable to learn somethng about the relatonshp between two

More information

Supplementary Notes for Chapter 9 Mixture Thermodynamics

Supplementary Notes for Chapter 9 Mixture Thermodynamics Supplementary Notes for Chapter 9 Mxture Thermodynamcs Key ponts Nne major topcs of Chapter 9 are revewed below: 1. Notaton and operatonal equatons for mxtures 2. PVTN EOSs for mxtures 3. General effects

More information

PHYS 450 Spring semester Lecture 02: Dealing with Experimental Uncertainties. Ron Reifenberger Birck Nanotechnology Center Purdue University

PHYS 450 Spring semester Lecture 02: Dealing with Experimental Uncertainties. Ron Reifenberger Birck Nanotechnology Center Purdue University PHYS 45 Sprng semester 7 Lecture : Dealng wth Expermental Uncertantes Ron Refenberger Brck anotechnology Center Purdue Unversty Lecture Introductory Comments Expermental errors (really expermental uncertantes)

More information

U-Pb Geochronology Practical: Background

U-Pb Geochronology Practical: Background U-Pb Geochronology Practcal: Background Basc Concepts: accuracy: measure of the dfference between an expermental measurement and the true value precson: measure of the reproducblty of the expermental result

More information

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands

1. Inference on Regression Parameters a. Finding Mean, s.d and covariance amongst estimates. 2. Confidence Intervals and Working Hotelling Bands Content. Inference on Regresson Parameters a. Fndng Mean, s.d and covarance amongst estmates.. Confdence Intervals and Workng Hotellng Bands 3. Cochran s Theorem 4. General Lnear Testng 5. Measures of

More information

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm

Design and Optimization of Fuzzy Controller for Inverse Pendulum System Using Genetic Algorithm Desgn and Optmzaton of Fuzzy Controller for Inverse Pendulum System Usng Genetc Algorthm H. Mehraban A. Ashoor Unversty of Tehran Unversty of Tehran h.mehraban@ece.ut.ac.r a.ashoor@ece.ut.ac.r Abstract:

More information

Negative Binomial Regression

Negative Binomial Regression STATGRAPHICS Rev. 9/16/2013 Negatve Bnomal Regresson Summary... 1 Data Input... 3 Statstcal Model... 3 Analyss Summary... 4 Analyss Optons... 7 Plot of Ftted Model... 8 Observed Versus Predcted... 10 Predctons...

More information

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009 College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:

More information

Note 10. Modeling and Simulation of Dynamic Systems

Note 10. Modeling and Simulation of Dynamic Systems Lecture Notes of ME 475: Introducton to Mechatroncs Note 0 Modelng and Smulaton of Dynamc Systems Department of Mechancal Engneerng, Unversty Of Saskatchewan, 57 Campus Drve, Saskatoon, SK S7N 5A9, Canada

More information

Lecture 3 Stat102, Spring 2007

Lecture 3 Stat102, Spring 2007 Lecture 3 Stat0, Sprng 007 Chapter 3. 3.: Introducton to regresson analyss Lnear regresson as a descrptve technque The least-squares equatons Chapter 3.3 Samplng dstrbuton of b 0, b. Contnued n net lecture

More information

A linear imaging system with white additive Gaussian noise on the observed data is modeled as follows:

A linear imaging system with white additive Gaussian noise on the observed data is modeled as follows: Supplementary Note Mathematcal bacground A lnear magng system wth whte addtve Gaussan nose on the observed data s modeled as follows: X = R ϕ V + G, () where X R are the expermental, two-dmensonal proecton

More information

Notes on Frequency Estimation in Data Streams

Notes on Frequency Estimation in Data Streams Notes on Frequency Estmaton n Data Streams In (one of) the data streamng model(s), the data s a sequence of arrvals a 1, a 2,..., a m of the form a j = (, v) where s the dentty of the tem and belongs to

More information

STAT 3008 Applied Regression Analysis

STAT 3008 Applied Regression Analysis STAT 3008 Appled Regresson Analyss Tutoral : Smple Lnear Regresson LAI Chun He Department of Statstcs, The Chnese Unversty of Hong Kong 1 Model Assumpton To quantfy the relatonshp between two factors,

More information

Homework Assignment 3 Due in class, Thursday October 15

Homework Assignment 3 Due in class, Thursday October 15 Homework Assgnment 3 Due n class, Thursday October 15 SDS 383C Statstcal Modelng I 1 Rdge regresson and Lasso 1. Get the Prostrate cancer data from http://statweb.stanford.edu/~tbs/elemstatlearn/ datasets/prostate.data.

More information

Primer on High-Order Moment Estimators

Primer on High-Order Moment Estimators Prmer on Hgh-Order Moment Estmators Ton M. Whted July 2007 The Errors-n-Varables Model We wll start wth the classcal EIV for one msmeasured regressor. The general case s n Erckson and Whted Econometrc

More information

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors

Stat260: Bayesian Modeling and Inference Lecture Date: February 22, Reference Priors Stat60: Bayesan Modelng and Inference Lecture Date: February, 00 Reference Prors Lecturer: Mchael I. Jordan Scrbe: Steven Troxler and Wayne Lee In ths lecture, we assume that θ R; n hgher-dmensons, reference

More information

This column is a continuation of our previous column

This column is a continuation of our previous column Comparson of Goodness of Ft Statstcs for Lnear Regresson, Part II The authors contnue ther dscusson of the correlaton coeffcent n developng a calbraton for quanttatve analyss. Jerome Workman Jr. and Howard

More information

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M

CIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute

More information

Statistical analysis using matlab. HY 439 Presented by: George Fortetsanakis

Statistical analysis using matlab. HY 439 Presented by: George Fortetsanakis Statstcal analyss usng matlab HY 439 Presented by: George Fortetsanaks Roadmap Probablty dstrbutons Statstcal estmaton Fttng data to probablty dstrbutons Contnuous dstrbutons Contnuous random varable X

More information

Multigradient for Neural Networks for Equalizers 1

Multigradient for Neural Networks for Equalizers 1 Multgradent for Neural Netorks for Equalzers 1 Chulhee ee, Jnook Go and Heeyoung Km Department of Electrcal and Electronc Engneerng Yonse Unversty 134 Shnchon-Dong, Seodaemun-Ku, Seoul 1-749, Korea ABSTRACT

More information

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS

A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS HCMC Unversty of Pedagogy Thong Nguyen Huu et al. A PROBABILITY-DRIVEN SEARCH ALGORITHM FOR SOLVING MULTI-OBJECTIVE OPTIMIZATION PROBLEMS Thong Nguyen Huu and Hao Tran Van Department of mathematcs-nformaton,

More information

Time-Varying Systems and Computations Lecture 6

Time-Varying Systems and Computations Lecture 6 Tme-Varyng Systems and Computatons Lecture 6 Klaus Depold 14. Januar 2014 The Kalman Flter The Kalman estmaton flter attempts to estmate the actual state of an unknown dscrete dynamcal system, gven nosy

More information

STAT 511 FINAL EXAM NAME Spring 2001

STAT 511 FINAL EXAM NAME Spring 2001 STAT 5 FINAL EXAM NAME Sprng Instructons: Ths s a closed book exam. No notes or books are allowed. ou may use a calculator but you are not allowed to store notes or formulas n the calculator. Please wrte

More information