Divide-and-Conquer Sequential Monte Carlo


1 Divide-and-Conquer Sequential Monte Carlo. Joint work with: John Aston, Alexandre Bouchard-Côté, Brent Kirkpatrick, Fredrik Lindsten, Christian Næsseth, Thomas Schön. University of Warwick, Algorithms and Computationally Intensive Statistics, November 25th, 2016.

2 Outline: Importance Sampling to Sequential Monte Carlo (SMC); SMC to Divide-and-Conquer SMC (DC-SMC); Some Theoretical Properties of DC-SMC; Illustrative Applications; Conclusions and Some (Open) Questions.

3 Essential Problem. The Abstract Problem: given a density $\pi(x) = \gamma(x)/Z$ such that $\gamma(x)$ can be evaluated pointwise, how can we approximate $\pi$, and how can we estimate $Z$? Can we do so robustly? In a distributed setting?

4 Importance Sampling. Simple identity: provided $\gamma \ll \mu$,
\[ Z = \int \gamma(x)\,dx = \int \frac{\gamma(x)}{\mu(x)} \mu(x)\,dx. \]
So, if $X_1, X_2, \ldots \overset{\text{iid}}{\sim} \mu$, then:
unbiasedness: $\mathbb{E}\bigl[\tfrac{1}{N}\sum_{i=1}^N \tfrac{\gamma(X_i)}{\mu(X_i)}\bigr] = Z$;
SLLN: $\lim_{N\to\infty} \tfrac{1}{N}\sum_{i=1}^N \tfrac{\gamma(X_i)}{\mu(X_i)} \varphi(X_i) \overset{\text{a.s.}}{=} \gamma(\varphi)$;
CLT: $\lim_{N\to\infty} \sqrt{N}\bigl[\tfrac{1}{N}\sum_{i=1}^N \tfrac{\gamma(X_i)}{\mu(X_i)} \varphi(X_i) - \gamma(\varphi)\bigr] \overset{d}{=} W$, where $W \sim \mathcal{N}\bigl(0, \mathrm{Var}\bigl(\tfrac{\gamma(X_1)}{\mu(X_1)} \varphi(X_1)\bigr)\bigr)$.
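
These three properties are easy to check numerically. Below is a minimal sketch (not from the slides); the unnormalised target $\gamma$ and proposal $\mu$ are illustrative assumptions chosen so the true answers are known:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative unnormalised target: gamma(x) = 3 * N(x; 0, 1), so Z = 3 and pi = N(0, 1).
def gamma(x):
    return 3.0 * np.exp(-0.5 * x ** 2) / np.sqrt(2.0 * np.pi)

def mu_pdf(x):  # proposal mu = N(0, 2^2), whose support dominates that of gamma
    return np.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))

N = 100_000
x = rng.normal(0.0, 2.0, N)            # X_i iid ~ mu
w = gamma(x) / mu_pdf(x)               # importance weights gamma(X_i) / mu(X_i)

print(w.mean())                        # unbiased estimate of Z (true value 3)
print(np.sum(w * x ** 2) / np.sum(w))  # self-normalised estimate of pi(x^2) (true value 1)
```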

5 Sequential Importance Sampling. Write $\gamma(x_{1:n}) = \gamma(x_1) \prod_{p=2}^n \gamma(x_p \mid x_{1:p-1})$ and define, for $p = 1, \ldots, n$,
\[ \gamma_p(x_{1:p}) = \gamma_1(x_1) \prod_{q=2}^p \gamma(x_q \mid x_{1:q-1}); \]
then
\[ \underbrace{\frac{\gamma(x_{1:n})}{\mu(x_{1:n})}}_{W_n(x_{1:n})} = \underbrace{\frac{\gamma_1(x_1)}{\mu_1(x_1)}}_{w_1(x_1)} \prod_{p=2}^n \underbrace{\frac{\gamma_p(x_{1:p})}{\gamma_{p-1}(x_{1:p-1})\, \mu_p(x_p \mid x_{1:p-1})}}_{w_p(x_{1:p})}, \]
and we can sequentially approximate $Z_p = \int \gamma_p(x_{1:p})\,dx_{1:p}$.

6 Sequential Importance Resampling (SIR). Given a sequence $\gamma_1(x_1), \gamma_2(x_{1:2}), \ldots$:
Initialisation, $n = 1$:
Sample $X_1^1, \ldots, X_1^N \overset{\text{iid}}{\sim} \mu_1$.
Compute $W_1^i = \gamma_1(X_1^i)/\mu_1(X_1^i)$.
Obtain $\widehat{Z}_1^N = \frac{1}{N}\sum_{i=1}^N W_1^i$ and $\pi_1^N = \sum_{i=1}^N W_1^i \delta_{X_1^i} \big/ \sum_{j=1}^N W_1^j$.
[This is just (self-normalized) importance sampling.]

7 Iteration, $n-1 \to n$:
Resample: sample $(X_{n,1:n-1}^1, \ldots, X_{n,1:n-1}^N) \overset{\text{iid}}{\sim} \pi_{n-1}^N = \sum_{i=1}^N W_{n-1}^i \delta_{X_{n-1}^i} \big/ \sum_{j=1}^N W_{n-1}^j$.
Sample $X_{n,n}^i \sim q_n(\cdot \mid X_{n,1:n-1}^i)$.
Compute
\[ W_n^i = \frac{\gamma_n(X_{n,1:n}^i)}{\gamma_{n-1}(X_{n,1:n-1}^i)\, q_n(X_{n,n}^i \mid X_{n,1:n-1}^i)}. \]
Obtain $\widehat{Z}_n^N = \widehat{Z}_{n-1}^N \frac{1}{N}\sum_{i=1}^N W_n^i$ and $\pi_n^N = \sum_{i=1}^N W_n^i \delta_{X_n^i} \big/ \sum_{j=1}^N W_n^j$.

8 SIR: Theoretical Justification. Under regularity conditions we still have:
unbiasedness: $\mathbb{E}[\widehat{Z}_n^N] = Z_n$;
SLLN: $\lim_{N\to\infty} \pi_n^N(\varphi) \overset{\text{a.s.}}{=} \pi_n(\varphi)$;
CLT: for a normal random variable $W_n$ of appropriate variance, $\lim_{N\to\infty} \sqrt{N}\bigl[\pi_n^N(\varphi) - \pi_n(\varphi)\bigr] \overset{d}{=} W_n$;
although establishing these becomes a little harder (cf., e.g., Del Moral (2004), Andrieu et al. (2010)).

9 Simple Particle Filters: One Family of SIR Algorithms.
[Graphical model: a hidden chain $x_1, \ldots, x_6$, each $x_i$ emitting an observation $y_i$.]
Unobserved Markov chain $\{X_n\}$ with transition density $f$. Observed process $\{Y_n\}$ with conditional density $g$. The joint density is available:
\[ p(x_{1:n}, y_{1:n} \mid \theta) = f_1^\theta(x_1)\, g^\theta(y_1 \mid x_1) \prod_{i=2}^n f^\theta(x_i \mid x_{i-1})\, g^\theta(y_i \mid x_i). \]
Natural SIR target distributions:
\[ \pi_n^\theta(x_{1:n}) := p(x_{1:n} \mid y_{1:n}, \theta) \propto p(x_{1:n}, y_{1:n} \mid \theta) =: \gamma_n^\theta(x_{1:n}), \qquad Z_n^\theta = \int p(x_{1:n}, y_{1:n} \mid \theta)\,dx_{1:n} = p(y_{1:n} \mid \theta). \]

10 Bootstrap PFs and Similar. With the target distributions above, choosing $q_p(x_p \mid x_{1:p-1}) = f^\theta(x_p \mid x_{p-1})$ yields the bootstrap particle filter of Gordon et al. (1993), whereas $q_p(x_p \mid x_{1:p-1}) = p(x_p \mid x_{p-1}, y_p, \theta)$ yields the locally optimal particle filter. Note: many alternative particle filters are SIR algorithms with other targets; cf. J. and Doucet (2008); Doucet and J. (2011).
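
A minimal bootstrap particle filter along these lines might look as follows; the linear-Gaussian model, parameter values and function name are illustrative assumptions, and multinomial resampling is applied at every step:

```python
import numpy as np

def bootstrap_pf(y, N, rng, sigma_x=1.0, sigma_y=1.0, phi=0.9):
    """Bootstrap PF for the illustrative model X_n = phi X_{n-1} + sigma_x V_n,
    Y_n = X_n + sigma_y W_n; returns the estimate of log p(y_{1:n})."""
    x = rng.normal(0.0, sigma_x, N)        # X_1 ~ f_1
    log_Z = 0.0
    for n, yn in enumerate(y):
        if n > 0:
            idx = rng.choice(N, N, p=w)    # resample from previous weights,
            x = phi * x[idx] + sigma_x * rng.normal(size=N)  # then propagate via f
        # Weight by the observation density g(y_n | x_n).
        log_w = -0.5 * ((yn - x) / sigma_y) ** 2 - np.log(sigma_y * np.sqrt(2 * np.pi))
        m = log_w.max()
        log_Z += m + np.log(np.mean(np.exp(log_w - m)))  # accumulate Z_hat factor
        w = np.exp(log_w - m)
        w /= w.sum()
    return log_Z

rng = np.random.default_rng(0)
y = rng.normal(size=50)                    # placeholder observations
print(bootstrap_pf(y, N=1000, rng=rng))
```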

11 SMC Samplers: Another SIR Class. Given a sequence of targets $\pi_1, \ldots, \pi_n$ on arbitrary spaces, Del Moral et al. (2006) extend the space:
\[ \tilde\pi_n(x_{1:n}) = \pi_n(x_n) \prod_{p=1}^{n-1} L_p(x_{p+1}, x_p), \qquad \tilde\gamma_n(x_{1:n}) = \gamma_n(x_n) \prod_{p=1}^{n-1} L_p(x_{p+1}, x_p), \]
\[ \tilde Z_n = \int \tilde\gamma_n(x_{1:n})\,dx_{1:n} = \int \gamma_n(x_n) \prod_{p=1}^{n-1} L_p(x_{p+1}, x_p)\,dx_{1:n} = \int \gamma_n(x_n)\,dx_n = Z_n, \]
since each backward kernel $L_p$ integrates to one.

12 A Simple SMC Sampler. Given $\gamma_1, \ldots, \gamma_n$ on $(E, \mathcal{E})$:
For $i = 1, \ldots, N$: sample $X_1^i \overset{\text{iid}}{\sim} \mu_1$; compute $W_1^i = \gamma_1(X_1^i)/\mu_1(X_1^i)$ and $\widehat{Z}_1^N = \frac{1}{N}\sum_{i=1}^N W_1^i$.
For $p = 2, \ldots, n$:
Resample: $X_{p,p-1}^{1:N} \overset{\text{iid}}{\sim} \sum_{i=1}^N W_{p-1}^i \delta_{X_{p-1}^i} \big/ \sum_{j=1}^N W_{p-1}^j$.
Sample: $X_p^i \sim K_p(X_{p,p-1}^i, \cdot)$, where $\pi_p K_p = \pi_p$.
Compute: $W_p^i = \gamma_p(X_{p,p-1}^i)/\gamma_{p-1}(X_{p,p-1}^i)$.
Then $\widehat{Z}_p^N = \widehat{Z}_{p-1}^N \frac{1}{N}\sum_{i=1}^N W_p^i$ and $\pi_p^N = \sum_{i=1}^N W_p^i \delta_{X_p^i} \big/ \sum_{j=1}^N W_p^j$.
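
In code, this sampler reduces to a short loop. The schematic sketch below assumes the user supplies the $\log \gamma_p$ functions, $\pi_p$-invariant kernels $K_p$, and routines to sample from and evaluate $\mu_1$ (all names here are illustrative, not an established API):

```python
import numpy as np

def log_mean_exp(a):
    m = a.max()
    return m + np.log(np.mean(np.exp(a - m)))

def smc_sampler(log_gamma, kernels, sample_mu1, logpdf_mu1, N, rng):
    """Schematic SMC sampler. log_gamma[p] evaluates log gamma_{p+1} on an
    array of particles; kernels[p] is a pi_{p+1}-invariant MCMC move."""
    x = sample_mu1(N, rng)
    log_w = log_gamma[0](x) - logpdf_mu1(x)
    log_Z = log_mean_exp(log_w)                        # log of Z_1 estimate
    for p in range(1, len(log_gamma)):
        W = np.exp(log_w - log_w.max())
        W /= W.sum()
        x = x[rng.choice(N, N, p=W)]                   # resample
        log_w = log_gamma[p](x) - log_gamma[p - 1](x)  # weight before the move
        log_Z += log_mean_exp(log_w)
        x = kernels[p](x, rng)                         # move: X_p ~ K_p(x, .)
    return x, log_w, log_Z
```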

13 Bayesian Inference (Chopin, 2002; Del Moral et al., 2006). In a Bayesian context, given a prior $p(\theta)$ and likelihood $l(\theta; y_{1:m})$, one could specify:
Data tempering: $\gamma_p(\theta) = p(\theta)\, l(\theta; y_{1:m_p})$ for $m_1 = 0 < m_2 < \cdots < m_T = m$.
Likelihood tempering: $\gamma_p(\theta) = p(\theta)\, l(\theta; y_{1:m})^{\beta_p}$ for $\beta_1 = 0 < \beta_2 < \cdots < \beta_T = 1$.
Something else?
Here $Z_T = \int p(\theta)\, l(\theta; y_{1:m})\,d\theta$ and $\gamma_T(\theta) \propto p(\theta \mid y_{1:m})$. Specifying $(m_1, \ldots, m_T)$, $(\beta_1, \ldots, \beta_T)$ or $(\gamma_1, \ldots, \gamma_T)$ is hard¹.
¹ Just ask Lewis...

14 Illustrative Sequences of Targets. [Figure: two panels of normal densities plotted against $x$, illustrating alternative sequences of intermediate target distributions.]

15 One Adaptive Scheme (Zhou, J. & Aston, 2016)+refs.
Resample when $\mathrm{ESS}(W_n^{1:N}) = \bigl(\sum_{i=1}^N (\overline{W}_n^i)^2\bigr)^{-1}$ falls below a threshold.
Likelihood tempering: at iteration $n$, set $\beta_n$ such that
\[ \frac{N \bigl(\sum_{j=1}^N \overline{W}_{n-1}^{(j)} w_n^{(j)}\bigr)^2}{\sum_{k=1}^N \overline{W}_{n-1}^{(k)} \bigl(w_n^{(k)}\bigr)^2} = \mathrm{CESS}^\star, \]
which controls the $\chi^2$-discrepancy between successive distributions.
Proposals: follow Jasra et al. (2010); adapt to keep the acceptance rate about right.
Question: are there better, practical approaches to specifying a sequence of distributions?
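
The tempering condition can be solved numerically: the CESS of the incremental weights is decreasing in $\beta$, so bisection works. A sketch (function and argument names are my own, and CESS is expressed as a fraction of $N$ so the factor of $N$ cancels):

```python
import numpy as np

def next_beta(log_l, W, beta_prev, cess_target, tol=1e-6):
    """Find beta_n > beta_prev so that the CESS of the incremental weights
    w_i = l_i^(beta_n - beta_prev) hits cess_target (a fraction in (0, 1]).
    log_l: log-likelihoods at the current particles; W: normalised weights."""
    def cess(beta):
        log_w = (beta - beta_prev) * log_l
        w = np.exp(log_w - log_w.max())     # CESS is invariant to rescaling w
        return (W @ w) ** 2 / (W @ w ** 2)

    if cess(1.0) >= cess_target:            # can jump straight to the final target
        return 1.0
    lo, hi = beta_prev, 1.0                 # cess(lo) = 1 >= target > cess(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cess(mid) >= cess_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```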

16 Divide-and-Conquer SMC (Lindsten, J. et al., 2016). Many models admit natural decompositions:
[Figure: a three-level decomposition; independent leaf sub-models over the $(x_i, y_i)$ pairs at level 2 are merged through intermediate variables ($x_4$) at level 1 and coupled at the root ($x_5$) at level 0.]
To such decompositions we can apply a divide-and-conquer strategy over a tree of targets $\pi_r$, $\pi_t$, $\pi_{c_1}, \ldots, \pi_{c_C}$.

17 A few formalities... Use a tree $\mathbb{T}$ of models (with rootward variable inclusion), with a target $\pi_t$ attached to each node.
Let $t \in \mathbb{T}$ denote a node; $r \in \mathbb{T}$ is the root.
Let $C(t) = \{c_1, \ldots, c_C\}$ denote the children of $t$.
Let $X_t$ denote the space of the variables included in $t$ but not in its children.
dc-smc can be viewed as a recursion over this tree.

18 dc-smc(t): an extension of SIR.
1. For $c \in C(t)$:
1.1 $(\{x_c^i, w_c^i\}_{i=1}^N, \widehat{Z}_c^N) \leftarrow \text{dc-smc}(c)$.
1.2 Resample $\{x_c^i, w_c^i\}_{i=1}^N$ to obtain the equally weighted particle system $\{\hat{x}_c^i, 1\}_{i=1}^N$.
2. For particle $i = 1 : N$:
2.1 If $X_t \neq \emptyset$, simulate $\tilde{x}_t^i \sim q_t(\cdot \mid \hat{x}_{c_1}^i, \ldots, \hat{x}_{c_C}^i)$, where $(c_1, c_2, \ldots, c_C) = C(t)$; else $\tilde{x}_t^i \leftarrow \emptyset$.
2.2 Set $x_t^i = (\hat{x}_{c_1}^i, \ldots, \hat{x}_{c_C}^i, \tilde{x}_t^i)$.
2.3 Compute
\[ w_t^i = \frac{\gamma_t(x_t^i)}{\prod_{c \in C(t)} \gamma_c(\hat{x}_c^i)\; q_t(\tilde{x}_t^i \mid \hat{x}_{c_1}^i, \ldots, \hat{x}_{c_C}^i)}. \]
3. Compute $\widehat{Z}_t^N = \bigl\{\frac{1}{N}\sum_{i=1}^N w_t^i\bigr\} \prod_{c \in C(t)} \widehat{Z}_c^N$.
4. Return $(\{x_t^i, w_t^i\}_{i=1}^N, \widehat{Z}_t^N)$.
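
The recursion translates directly into code. The sketch below is not the paper's implementation: it assumes each node object exposes hypothetical hooks (children, log_gamma, sample_q, log_q, combine, has_new_vars) and that particles are stored as NumPy arrays indexable by particle:

```python
import numpy as np

def dc_smc(t, N, rng):
    """Recursive sketch of dc-smc(t); all node attributes are illustrative."""
    child_particles, log_Z = [], 0.0
    for c in t.children:
        (xc, log_wc), log_Zc = dc_smc(c, N, rng)        # step 1.1: recurse
        log_Z += log_Zc
        w = np.exp(log_wc - log_wc.max())
        w /= w.sum()
        child_particles.append(xc[rng.choice(N, N, p=w)])  # step 1.2: resample

    # Step 2: extend each particle with the new variables (if any) and reweight.
    x_new = t.sample_q(child_particles, rng) if t.has_new_vars else None
    x = t.combine(child_particles, x_new)               # x_t = (x_c1_hat, ..., x_new)
    log_w = t.log_gamma(x)
    for c, xc in zip(t.children, child_particles):
        log_w -= c.log_gamma(xc)
    if x_new is not None:
        log_w -= t.log_q(x_new, child_particles)

    m = log_w.max()
    log_Z += m + np.log(np.mean(np.exp(log_w - m)))     # step 3: update Z_hat
    return (x, log_w), log_Z                            # step 4
```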

19 Theoretical Properties I: Unbiasedness of Normalising Constant Estimates. Provided that $\gamma_t \ll \bigl(\prod_{c \in C(t)} \gamma_c\bigr) q_t$ for every $t \in \mathbb{T}$, and an unbiased, exchangeable resampling scheme is applied to every population at every iteration, we have for any $N \geq 1$:
\[ \mathbb{E}[\widehat{Z}_r^N] = Z_r = \int \gamma_r(x_r)\,dx_r. \]

20 Strong Law of Large Numbers. Under regularity conditions, the weighted particle system $(x_r^{N,i}, w_r^{N,i})_{i=1}^N$ generated by dc-smc(r) is consistent, in that for all functions $\varphi$ satisfying certain assumptions:
\[ \sum_{i=1}^N \frac{w_r^{N,i}}{\sum_{j=1}^N w_r^{N,j}}\, \varphi(x_r^{N,i}) \overset{\text{a.s.}}{\longrightarrow} \pi_r(\varphi) \quad \text{as } N \to \infty. \]

21 Some (Importance) Extensions.
1. Mixture resampling: for $C(t) = \{c_1, c_2\}$ we could make use of
\[ \sum_{i=1}^N \sum_{j=1}^N w_{c_1}^i w_{c_2}^j\, \frac{d\gamma_t}{d(\gamma_{c_1} \otimes \gamma_{c_2})}(x_{c_1}^i, x_{c_2}^j)\, \delta_{(x_{c_1}^i, x_{c_2}^j)}. \]
2. Tempering (Del Moral et al., 2006): between $\gamma_{c_1} \otimes \gamma_{c_2}$ and $\gamma_t$.
3. Adaptation (Zhou, J. and Aston, 2016): of the tempering sequence and of MCMC kernel parameters.
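
For two children, the mixture-resampling weights form an $N \times N$ grid over all pairs of child particles; a direct $O(N^2)$ sketch (assuming $\gamma_t$ can be evaluated on the product of the child spaces, scalar particle states so the densities broadcast, and illustrative function names throughout):

```python
import numpy as np

def mixture_resample(x1, x2, log_w1, log_w2,
                     log_gamma_t, log_gamma_c1, log_gamma_c2, rng):
    """Draw N coupled pairs from the O(N^2) mixture, with pair (i, j) weighted by
    w1_i * w2_j * gamma_t(x1_i, x2_j) / (gamma_c1(x1_i) * gamma_c2(x2_j))."""
    N = len(x1)
    # Pairwise log weights over the N x N grid of child combinations.
    lw = (log_w1[:, None] + log_w2[None, :]
          + log_gamma_t(x1[:, None], x2[None, :])
          - log_gamma_c1(x1)[:, None] - log_gamma_c2(x2)[None, :])
    p = np.exp(lw - lw.max()).ravel()
    p /= p.sum()
    idx = rng.choice(N * N, N, p=p)         # sample N pairs jointly
    return x1[idx // N], x2[idx % N]
```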

22 An Ising Model. $k$ indexes the sites $v_k \in V \subset \mathbb{Z}^2$, with spins $x_k \in \{-1, +1\}$ and
\[ p(x) \propto e^{\beta E(x)}, \quad \beta \geq 0, \qquad E(x) = \sum_{(k,l) \in E} x_k x_l. \]
We consider a square grid with $\beta$ set to the critical temperature.
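
With this convention, $\log \gamma(x) = \beta \sum_{(k,l)} x_k x_l$ can be evaluated with two vectorised sums over the nearest-neighbour edges of the grid; a short sketch (the grid size here is arbitrary, not the one used in the experiments):

```python
import numpy as np

def ising_log_gamma(x, beta):
    """Unnormalised log density log gamma(x) = beta * sum over grid edges
    of x_k * x_l, for a spin configuration x in {-1, +1}^(m x m)."""
    horiz = np.sum(x[:, :-1] * x[:, 1:])   # interactions along rows
    vert = np.sum(x[:-1, :] * x[1:, :])    # interactions along columns
    return beta * (horiz + vert)

rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=(8, 8))       # small illustrative grid
beta_c = 0.5 * np.log(1 + np.sqrt(2))      # critical inverse temperature, 2-D square lattice
print(ising_log_gamma(x, beta_c))
```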

23 A sequence of decompositions. [Figure: a sequence of decompositions of the grid into sub-blocks, defining the tree of intermediate targets.]

24 [Figure: estimates of $\log Z$ and $\mathbb{E}[E(x)]$ against wall-clock time (s) for SMC, D&C SMC (ann), D&C SMC (ann+mix) and single-flip MH.] Summaries over 50 independent runs of each algorithm.

25 New York Schools Maths Test: Data. Data organised into a tree $\mathbb{T}$. A root-to-leaf path is: NYC (the root, denoted by $r \in \mathbb{T}$), borough, school district, school, year. Each leaf $t \in \mathbb{T}$ comes with an observation of $m_t$ exam successes out of $M_t$ trials. The data comprise test instances across five boroughs (Manhattan, The Bronx, Brooklyn, Queens, Staten Island), 32 distinct districts, and 710 distinct schools.

26 New York Schools Maths Test: Bayesian Model. The number of successes $m_t$ at a leaf $t$ is $\mathrm{Bin}(M_t, p_t)$, where $p_t = \mathrm{logistic}(\theta_t)$ and $\theta_t$ is a latent parameter. Internal nodes of the tree also have a latent $\theta_t$; we model the difference in $\theta_t$ along an edge $e = (t \to t')$ as $\theta_{t'} = \theta_t + \Delta_e$, where $\Delta_e \sim \mathcal{N}(0, \sigma_e^2)$. We put an improper prior (uniform on $(-\infty, \infty)$) on $\theta_r$. We also make the variances random, but shared across sibling edges: $\sigma_t^2 \sim \mathrm{Exp}(1)$.

27 New York Schools Maths Test: Implementation. The basic SIR implementation of dc-smc, using the natural hierarchical structure provided by the model. Given $\sigma_t^2$ and the $\theta_t$ at the leaves, the other random variables are multivariate normal, so we instantiate values for $\theta_t$ only at the leaves; at an internal node $t$, we sample only $\sigma_t^2$ and marginalize out $\theta_t$. Each step of dc-smc is therefore either:
i. at a leaf, sample $p_t \sim \mathrm{Beta}(1 + m_t, 1 + M_t - m_t)$ and set $\theta_t = \mathrm{logit}(p_t)$ (see the sketch below); or
ii. at an internal node, sample $\sigma_t^2 \sim \mathrm{Exp}(1)$.
A Java implementation is available.
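
Step (i) at a leaf might look as follows; this is a sketch only — the Beta draw and logit transform match the slide, but the associated importance-weight bookkeeping is omitted, and the counts are hypothetical:

```python
import numpy as np

def leaf_step(m, M, N, rng):
    """Leaf step of dc-smc for the schools model: sample p_t from
    Beta(1 + m_t, 1 + M_t - m_t) and set theta_t = logit(p_t).
    (The weight computation for this proposal is omitted here.)"""
    p = rng.beta(1.0 + m, 1.0 + M - m, size=N)
    return np.log(p) - np.log1p(-p)        # logit(p) = log(p / (1 - p))

rng = np.random.default_rng(0)
theta = leaf_step(m=37, M=50, N=1000, rng=rng)   # hypothetical counts for one leaf
print(theta.mean())
```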

28 New York Schools Maths Test: Results.
[Figure: posterior density approximations of the mean parameter for NY as a whole and for each borough (Bronx, Kings, Manhattan, Queens, Richmond), computed with DC-SMC.]
Bronx County has the highest fraction (41%) of children (under 18) living below the poverty level²; Queens has the second lowest (19.7%), after Richmond (Staten Island, 16.7%). Staten Island contains a single school district, so the posterior distribution is much flatter for this borough.
² Statistics from the New York State Poverty Report 2013.

29 Normalising Constant Estimates. [Figure: estimates of $\log(Z)$ against the number of particles (log scale) for the DC and standard (STD) sampling methods.]

30 Distributed Implementation. [Figure: wall-clock time (s) and speedup against the number of nodes.] Xeon processors connected by a non-blocking Infiniband 4X QDR network.

31 Conclusions.
SMC = SIR; D&C-SMC = SIR + coalescence.
Distributed implementation is straightforward.
The D&C strategy can improve even serial performance.
Some questions remain unanswered:
How can we construct (near) optimal tree decompositions?
How much standard SMC theory can be extended to this setting?
Some application areas are appealing:
Inference for phylogenetic trees in linguistics.
Principled aggregation of mass univariate analyses from neuroimaging.

32 References
[1] C. Andrieu, A. Doucet, and R. Holenstein. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society B, 72(3):269–342, 2010.
[2] N. Chopin. A sequential particle filter method for static models. Biometrika, 89(3), 2002.
[3] P. Del Moral. Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Probability and Its Applications. Springer Verlag, New York, 2004.
[4] P. Del Moral, A. Doucet, and A. Jasra. Sequential Monte Carlo methods for Bayesian computation. In Bayesian Statistics 8. Oxford University Press, 2007.
[5] A. Doucet and A. M. Johansen. A tutorial on particle filtering and smoothing: Fifteen years later. In D. Crisan and B. Rozovsky, editors, The Oxford Handbook of Nonlinear Filtering. Oxford University Press, 2011.
[6] N. J. Gordon, D. J. Salmond, and A. F. M. Smith. Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings-F, 140(2):107–113, April 1993.
[7] A. Jasra, D. A. Stephens, A. Doucet, and T. Tsagaris. Inference for Lévy-driven stochastic volatility models via adaptive sequential Monte Carlo. Scandinavian Journal of Statistics, 38(1):1–22, 2011.
[8] A. M. Johansen and A. Doucet. A note on auxiliary particle filters. Statistics and Probability Letters, 78(12):1498–1504, September 2008.
[9] F. Lindsten, A. M. Johansen, C. A. Naesseth, B. Kirkpatrick, T. Schön, J. A. D. Aston, and A. Bouchard-Côté. Divide-and-conquer with sequential Monte Carlo. Journal of Computational and Graphical Statistics, in press.
[10] Y. Zhou, A. M. Johansen, and J. A. D. Aston. Towards automatic model comparison: An adaptive sequential Monte Carlo approach. Journal of Computational and Graphical Statistics, 25(3):701–726, 2016.
