A new class of interacting Markov Chain Monte Carlo methods
1 A new class of interacting Markov Chain Monte Carlo methods
P. Del Moral, A. Doucet
INRIA Bordeaux & UBC Vancouver
Workshop on Numerics and Stochastics, Helsinki, August 2008
2 Outline
1. Introduction: stochastic sampling problems; some stochastic engineering models; interacting sampling methods
2. Some classical distribution flows: Feynman-Kac models; static Boltzmann-Gibbs models
3. Interacting stochastic sampling technology: mean field particle methods; interacting Markov chain Monte Carlo models (i-MCMC)
4. Functional fluctuations & comparisons
5. Some references
3 Stochastic sampling problems
Nonlinear distribution flow, with its level of complexity:
  η_n(dx_n) = γ_n(dx_n) / γ_n(1),  time index n ∈ N,  state variable x_n ∈ E_n
Two objectives:
1. Sampling independent random variables w.r.t. η_n
2. Computation of the normalizing constants γ_n(1) (= Z_n, partition functions)
4 Some stochastic engineering models
Stochastic engineering: conditional & Boltzmann-Gibbs measures.
Filtering: signal-observation pair (X_n, Y_n) [radar, sonar, GPS, ...]
  η_n = Law(X_n | (Y_0, ..., Y_n))
Rare events: [overflows, ruin processes, epidemic propagations, ...]
  η_n = Law(X_n | n intermediate events)  &  Z_n = P(rare event)
Molecular simulation: [ground state energies, directed polymers]
  η_n := Feynman-Kac/Boltzmann-Gibbs measure of a free Markov motion in an absorbing medium
Combinatorial counting, global optimization, HMMs:
  η_n(dx) = (1/Z_n) e^{-β_n V(x)} λ(dx)   or   η_n(dx) = (1/Z_n) 1_{A_n}(x) λ(dx)
5 Interacting sampling methods
Two simple ingredients:
Find or understand the probability mass transformation η_n = Φ_n(η_{n-1})
  [cooling schemes, temperature variations, constraint sequences, subset restrictions, observation data, conditional events, ...]
Natural interacting sampling idea: use η_{n-1}, or its empirical approximation, to sample w.r.t. η_n.
1. Monte Carlo / mean field models: η_n = Law(X_n) with a Markov chain X_{n-1} ~ η_{n-1} → X_n ~ η_n
2. Interacting MCMC models: use the occupation measures of an MCMC with target η_{n-1} to build an MCMC with target η_n
6 Feynman-Kac models
Feynman-Kac distribution flows. Weak representation [f_n a test function on a state space E_n]:
  η_n(f_n) = γ_n(f_n) / γ_n(1)   with   γ_n(f_n) = E[ f_n(X_n) ∏_{0≤p<n} G_p(X_p) ]
A key formula:
  Z_n = E[ ∏_{0≤p<n} G_p(X_p) ] = ∏_{0≤p<n} η_p(G_p)
Path space models: X_n = (X'_0, ..., X'_n)
Examples:
  G_n ∈ [0,1]: particle absorption models
  G_n = observation likelihood function: filtering models
  G_n = 1_{A_n}: conditional/restriction models
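The key formula can be checked numerically on a toy model. The sketch below is our own example, not from the talk: a standard Gaussian random walk X_p with the [0,1]-valued potential G(x) = exp(-x²/2), and a crude Monte Carlo estimate of Z_n = E[∏_{0≤p<n} G(X_p)] (all function names are hypothetical).

```python
import math
import random

def sample_path(n, step=1.0):
    """Free Markov motion: n states of a standard Gaussian random walk from 0."""
    x, path = 0.0, []
    for _ in range(n):
        path.append(x)
        x += random.gauss(0.0, step)
    return path

def G(x):
    """A [0,1]-valued potential (absorption rate), here G(x) = exp(-x^2 / 2)."""
    return math.exp(-0.5 * x * x)

def crude_Z(n, num_samples=20000):
    """Crude Monte Carlo estimate of Z_n = E[ prod_{0 <= p < n} G(X_p) ]."""
    total = 0.0
    for _ in range(num_samples):
        total += math.prod(G(x) for x in sample_path(n))
    return total / num_samples
```

Since X_0 = 0 is deterministic and G(0) = 1, crude_Z(1) returns exactly 1; for larger n the estimate decays toward 0, and the particle schemes discussed later in the talk estimate the same quantity with much better relative variance.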
7 Feynman-Kac models
Nonlinear distribution flows. Evolution equation:
  η_{n+1} = Φ_{n+1}(η_n) = Ψ_{G_n}(η_n) M_{n+1}
with only 2 transformations:
Free Markov transport equation [M_n(x_{n-1}, dx_n) from E_{n-1} into E_n]:
  (η_{n-1} M_n)(dx_n) := ∫_{E_{n-1}} η_{n-1}(dx_{n-1}) M_n(x_{n-1}, dx_n)
Bayes-Boltzmann-Gibbs transformation:
  Ψ_{G_n}(η_n)(dx_n) := (1 / η_n(G_n)) G_n(x_n) η_n(dx_n)
8 Static Boltzmann-Gibbs models
Boltzmann-Gibbs distribution flows. Target distribution flow:
  η_n(dx) ∝ g_n(x) λ(dx)
Product hypothesis:
  g_n = g_{n-1} G_{n-1}  ⇒  η_n = Ψ_{G_{n-1}}(η_{n-1})
Running examples:
  g_n = 1_{A_n} with A_n ↓  ⇒  G_{n-1} = 1_{A_n}
  g_n = e^{-β_n V} with β_n ↑  ⇒  G_{n-1} = e^{-(β_n - β_{n-1}) V}
Problem: η_n = Ψ_{G_{n-1}}(η_{n-1}) is an unstable equation.
9 Static Boltzmann-Gibbs models
Feynman-Kac modeling. Choose M_n(x, dy) s.t. the local fixed point equation η_n = η_n M_n holds (Metropolis, Gibbs, ...).
Stable equation:
  g_n = g_{n-1} G_{n-1}  ⇒  η_n = Ψ_{G_{n-1}}(η_{n-1})  ⇒  η_n = η_n M_n = Ψ_{G_{n-1}}(η_{n-1}) M_n  =  FK model
Feynman-Kac dynamical formulation (X_n Markov with transitions M_n):
  ∫ f(x) g_n(x) λ(dx)  ∝  E[ f(X_n) ∏_{0≤p<n} G_p(X_p) ]
⇒ Interacting Metropolis/Gibbs/... stochastic algorithms.
10 Mean field particle methods
Mean field interpretation. Nonlinear Markov models: there always exists a Markov kernel K_{n,η}(x, dy) s.t.
  η_n = Φ_n(η_{n-1}) = η_{n-1} K_{n,η_{n-1}} = Law(X̄_n)
i.e. P(X̄_n ∈ dx_n | X̄_{n-1}) = K_{n,η_{n-1}}(X̄_{n-1}, dx_n).
Mean field particle interpretation: a Markov chain ξ_n = (ξ_n^1, ..., ξ_n^N) ∈ E_n^N s.t.
  η_n^N := (1/N) ∑_{1≤i≤N} δ_{ξ_n^i}  ≈  η_n
Particle approximation transitions (1 ≤ i ≤ N):
  ξ_{n-1}^i → ξ_n^i ~ K_{n,η_{n-1}^N}(ξ_{n-1}^i, dx_n)
11 Mean field particle methods
Discrete generation mean field particle model. Schematic picture:
  ξ_n = (ξ_n^1, ..., ξ_n^i, ..., ξ_n^N) ∈ E_n^N  --K_{n+1,η_n^N}-->  ξ_{n+1} = (ξ_{n+1}^1, ..., ξ_{n+1}^i, ..., ξ_{n+1}^N) ∈ E_{n+1}^N
Rationale: η_n^N ≈ η_n  ⇒  K_{n+1,η_n^N} ≈ K_{n+1,η_n}  ⇒  the ξ_n^i are almost iid copies of X̄_n.
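The abstract transition ξ_{n-1}^i → ξ_n^i ~ K_{n,η_{n-1}^N} can be sketched concretely. In the hypothetical example below (our own choice of McKean kernel, not the talk's), the kernel depends on the current cloud only through its empirical mean: each particle drifts toward η_{n-1}^N(id) before a Gaussian move.

```python
import random

def mean_field_step(cloud, pull=0.2, step=0.5):
    """One transition of a mean field particle model: each particle samples
    from K_{n, eta^N_{n-1}}, here a Gaussian move whose drift depends on the
    cloud only through its empirical mean m = eta^N_{n-1}(identity)."""
    m = sum(cloud) / len(cloud)
    return [x + pull * (m - x) + random.gauss(0.0, step) for x in cloud]

N = 1000
cloud = [random.gauss(5.0, 1.0) for _ in range(N)]  # eta^N_0
for n in range(50):
    cloud = mean_field_step(cloud)                  # eta^N_n -> eta^N_{n+1}
```

With N particles the empirical measure η_n^N tracks the law of the corresponding nonlinear Markov chain X̄_n up to O(1/√N) fluctuations, as the fluctuation results later in the talk quantify.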
12 Mean field particle methods
Ex: Feynman-Kac distribution flows. FK nonlinear Markov models: choose ɛ_n = ɛ_n(η_n) ≥ 0 s.t. η_n-a.e. ɛ_n G_n ∈ [0,1] (ɛ_n = 0 not excluded):
  K_{n+1,η_n}(x, dz) = ∫ S_{n,η_n}(x, dy) M_{n+1}(y, dz)
  S_{n,η_n}(x, dy) := ɛ_n G_n(x) δ_x(dy) + (1 - ɛ_n G_n(x)) Ψ_{G_n}(η_n)(dy)
Mean field genetic type particle model:
  ξ_n^i ∈ E_n  --accept/reject/selection-->  ξ̂_n^i ∈ E_n  --proposal/mutation-->  ξ_{n+1}^i ∈ E_{n+1}
Examples:
  G_n = 1_A: killing with uniform replacement
  M_n: Metropolis/Gibbs moves
  G_n: interaction function (subset fitting or change of temperature)
13 Mean field particle methods
Mean field genetic type particle model:
  (ξ_n^1, ..., ξ_n^N)  --S_{n,η_n^N}-->  (ξ̂_n^1, ..., ξ̂_n^N)  --M_{n+1}-->  (ξ_{n+1}^1, ..., ξ_{n+1}^N)
Accept/reject/selection transition:
  S_{n,η_n^N}(ξ_n^i, dx) := ɛ_n G_n(ξ_n^i) δ_{ξ_n^i}(dx) + (1 - ɛ_n G_n(ξ_n^i)) ∑_{j=1}^N [ G_n(ξ_n^j) / ∑_{k=1}^N G_n(ξ_n^k) ] δ_{ξ_n^j}(dx)
Ex: G_n = 1_A, ɛ_n = 1  ⇒  keep ξ_n^i iff G_n(ξ_n^i) = 1_A(ξ_n^i) = 1
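The selection transition above can be sketched directly (a minimal illustration; the function names are ours, and we assume ɛ_n G_n ≤ 1 as on the previous slide). With probability ɛ_n G_n(ξ_n^i) particle i keeps its value; otherwise it jumps to a particle j drawn with probability proportional to G_n(ξ_n^j):

```python
import random

def selection_step(particles, G, eps=1.0):
    """Accept/reject/selection transition S_{n, eta^N_n}: keep particle i with
    probability eps * G(x_i); otherwise replace it by a particle j chosen with
    probability proportional to G(x_j) (Boltzmann-Gibbs resampling)."""
    weights = [G(x) for x in particles]
    if sum(weights) == 0.0:
        raise RuntimeError("all particles killed: eta^N_n(G_n) = 0")
    out = []
    for x, g in zip(particles, weights):
        if random.random() < eps * g:
            out.append(x)  # accepted: stays at xi^i_n
        else:
            out.append(random.choices(particles, weights)[0])  # resampled
    return out
```

In the G_n = 1_A, ɛ_n = 1 case this reduces to killing the particles outside A and redistributing them uniformly among the survivors.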
14 Mean field particle methods
Path space models X_n = (X'_0, ..., X'_n)  ⇒  genealogical tree / ancestral lines:
  η_n^N := (1/N) ∑_{1≤i≤N} δ_{(ξ_{0,n}^i, ξ_{1,n}^i, ..., ξ_{n,n}^i)}  ≈  η_n
Unbiased particle approximations:
  γ_n^N(1) = ∏_{0≤p<n} η_p^N(G_p)  ≈  γ_n(1) = ∏_{0≤p<n} η_p(G_p)
Ex G_n = 1_A:  γ_n^N(1) = ∏_{0≤p<n} (success proportion at level p)
FK mean field particle models = sequential Monte Carlo, population Monte Carlo, particle filters, pruning, spawning, reconfiguration, quantum Monte Carlo, go-with-the-winner, ...
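For G_n = 1_A the estimate γ_n^N(1) = ∏ (success proportion at level p) is easy to implement. Below is a small sketch on a toy example of our own choosing (a Gaussian random walk started at 1 that must stay positive), with killing plus uniform replacement among survivors at every level:

```python
import random

def particle_Z(n, N=5000):
    """Particle estimate of Z_n = P(X_0 > 0, ..., X_{n-1} > 0) for a Gaussian
    random walk started at 1: gamma^N_n(1) is the product over levels p of the
    alive fractions eta^N_p(1_A), dead particles being redistributed uniformly
    among the survivors before each mutation."""
    cloud = [1.0] * N
    Z = 1.0
    for p in range(n):
        alive = [x for x in cloud if x > 0]
        Z *= len(alive) / N  # success proportion at level p
        if not alive:
            return 0.0       # the whole cloud was absorbed
        cloud = [random.choice(alive) + random.gauss(0.0, 1.0) for _ in range(N)]
    return Z
```

The estimate is unbiased for each fixed N, which is what makes these products usable for rare event probabilities and partition functions.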
15 Interacting Markov chain Monte Carlo models (i-MCMC)
Objective: find a series of MCMC models X^(n) := (X_k^(n))_{k≥0} s.t. the occupation measures
  η_k^(n) := (1/(k+1)) ∑_{0≤l≤k} δ_{X_l^(n)}  →  η_n  as k → ∞
and use η^(n) ≈ η_n to define X^(n+1) with target η_{n+1}.
Advantages:
  Using η_n, the sampling of η_{n+1} is often easier.
  Improves the proposition step in any Metropolis type model with target η_{n+1} (the stability properties of the flow η_n enter here).
  Increases the precision at every time step.
  But: the CLT variance is often larger than the CLT variance of mean field models.
  Easy to combine with mean field stochastic algorithms.
16 Interacting Markov chain Monte Carlo models (i-MCMC)
Interacting Markov chain Monte Carlo models. Find M_0 and a collection of transitions M_{n,μ} s.t.
  η_0 = η_0 M_0   and   Φ_n(μ) = Φ_n(μ) M_{n,μ}
Algorithm: (X_k^(0))_{k≥0} is a Markov chain with transition M_0; given X^(n), we let X^(n+1) be a chain with Markov transitions M_{n+1,η^(n)}.
Rationale:
  η^(n) ≈ η_n  ⇒  M_{n+1,η^(n)} ≈ M_{n+1,η_n}, with fixed point Φ_{n+1}(η_n) = η_{n+1}  ⇒  η^(n+1) ≈ η_{n+1}
Example:
  M_{n,μ}(x, dy) = Φ_n(μ)(dy),  i.e. X_{k+1}^(n+1) is a random variable drawn from Φ_{n+1}(η_k^(n))
17 Interacting Markov chain Monte Carlo models (i-MCMC)
((n-1)-th chain)  X_0^(n-1), X_1^(n-1), ..., X_k^(n-1), ...   with occupation measures η^(n-1) ≈ η_{n-1}
(n-th chain, transitions M_{n,η^(n-1)} ≈ M_{n,η_{n-1}})  X_0^(n), ..., X_k^(n) → X_{k+1}^(n), ...
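The example M_{n,μ}(x, dy) = Φ_n(μ)(dy) can be sketched for a two-level Feynman-Kac flow. All concrete choices below (the N(0,1) level-0 target, the potential G, the Metropolis mutation, the chain lengths) are our own illustration, not the talk's: chain 1 samples, at each step k, from Φ_1(η_k^(0)) = Ψ_G(η_k^(0)) M, i.e. it selects a past state of chain 0 proportionally to G and then mutates it.

```python
import math
import random

def metropolis_step(x, logpi, step=1.0):
    """One random-walk Metropolis step leaving exp(logpi) invariant."""
    y = x + random.gauss(0.0, step)
    return y if math.log(random.random()) < logpi(y) - logpi(x) else x

# Level 0: an ordinary MCMC chain targeting eta_0 = N(0, 1);
# its occupation measure eta^(0)_k approximates eta_0.
chain0, x = [], 0.0
for _ in range(4000):
    x = metropolis_step(x, lambda z: -0.5 * z * z)
    chain0.append(x)

# Level 1: potential G tilting eta_0 toward eta_1 = Psi_G(eta_0) = N(0, 1/2).
G = lambda z: math.exp(-0.5 * z * z)
chain1 = []
for k in range(800):
    hist = chain0[: 20 * (k + 1)]                       # occupation measure so far
    y = random.choices(hist, [G(z) for z in hist])[0]   # selection: Psi_G(eta^(0)_k)
    y = metropolis_step(y, lambda z: -z * z, step=0.5)  # mutation M with target eta_1
    chain1.append(y)
```

As k grows, η_k^(0) → η_0, so the draws of chain 1 target Ψ_G(η_0), whose density is proportional to e^{-z²}; the fluctuation comparison on the next slides quantifies the price of using occupation measures instead of an interacting particle cloud.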
18 [Mean field particle model]
Nonlinear semigroup Φ_{p,n}(η_p) := η_n.
Local fluctuation theorem:
  W_n^N := √N [ η_n^N - Φ_n(η_{n-1}^N) ]  ⇒  W_n, a centered Gaussian field
Local transport formulation:
  η_0 → η_1 = Φ_1(η_0) → η_2 = Φ_{0,2}(η_0) → ... → Φ_{0,n}(η_0) = η_n
  η_0^N → Φ_1(η_0^N) → Φ_{0,2}(η_0^N) → ... → Φ_{0,n}(η_0^N);  η_1^N → Φ_2(η_1^N) → ... → Φ_{1,n}(η_1^N);  ...;  η_{n-1}^N → Φ_n(η_{n-1}^N);  η_n^N
Key decomposition formula:
  η_n^N - η_n = ∑_{q=0}^n [ Φ_{q,n}(η_q^N) - Φ_{q,n}(Φ_q(η_{q-1}^N)) ] ≈ (1/√N) ∑_{q=0}^n W_q^N D_{q,n}
using the first order decomposition Φ_{p,n}(η) - Φ_{p,n}(μ) ≈ (η - μ) D_{p,n} + O((η - μ)^2).
Example, functional CLT:
  √N [ η_n^N - η_n ]  ⇒  ∑_{q=0}^n W_q D_{q,n}
19 [i-mcmc] Nonlinear sg Φ p,n(η p) = η n with a first order decomp : Φ p,n(η) Φ p,n(µ) (η µ)d p,n + (η µ) 2 Functional CLT for correlated/interacting MCMC models : hη (n) i nx η n q=0 p (2(n q))! V qd q,n (n q)! with `V q q 0 Centered Gaussian field E V q(f ) 2 = η q h(f η q(f )) 2i + 2 P» m 1 ηq (f η q(f )) Mq,η m (f η q 1 q(f )) Comparisons : [Mean field case] `W q q 0 Centered Gaussian field E W q(f ) 2 = η q 1 nk q,ηq 1 (f K q,ηq 1 (f )) 2o Case : K q,η(x, dy) = M q,η(x, dy) = Φ q(η)(dy) = (V q = W q) = [Mean field] > [i-mcmc]
20 Some references
Interacting stochastic simulation algorithms.
Mean field and Feynman-Kac particle models:
- Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Springer (2004).
- (joint work with L. Miclo) A Moran particle system approximation of Feynman-Kac formulae. Stochastic Processes and their Applications, Vol. 86 (2000).
- (joint work with L. Miclo) Branching and interacting particle systems approximations of Feynman-Kac formulae. Séminaire de Probabilités XXXIV, Lecture Notes in Mathematics, Vol. 1729, Springer-Verlag, Berlin (2000).
Sequential Monte Carlo models:
- (joint work with A. Doucet, A. Jasra) Sequential Monte Carlo samplers. JRSS B (2006).
- (joint work with A. Doucet) On a class of genealogical and interacting Metropolis models. Séminaire de Probabilités 37 (2003).
21 Interacting stochastic simulation algorithms
i-MCMC algorithms:
- (joint work with A. Doucet) Interacting Markov Chain Monte Carlo Methods for Solving Nonlinear Measure-Valued Equations. HAL-INRIA RR-6435 (Feb. 2008).
- (joint work with B. Bercu and A. Doucet) Fluctuations of Interacting Markov Chain Monte Carlo Models. HAL-INRIA RR-6438 (Feb. 2008).
- (joint work with C. Andrieu, A. Jasra, A. Doucet) Non-linear Markov chain Monte Carlo via self-interacting approximations. Tech. report, Dept. of Mathematics, Bristol University (2007).
- (joint work with A. Brockwell and A. Doucet) Sequentially interacting Markov chain Monte Carlo. Tech. report, Dept. of Statistics, University of British Columbia (2007).
More informationMarkov and Gibbs Random Fields
Markov and Gibbs Random Fields Bruno Galerne bruno.galerne@parisdescartes.fr MAP5, Université Paris Descartes Master MVA Cours Méthodes stochastiques pour l analyse d images Lundi 6 mars 2017 Outline The
More informationThe Kalman filter, Nonlinear filtering, and Markov Chain Monte Carlo
NBER Summer Institute Minicourse What s New in Econometrics: Time Series Lecture 5 July 5, 2008 The Kalman filter, Nonlinear filtering, and Markov Chain Monte Carlo Lecture 5, July 2, 2008 Outline. Models
More informationUniversity of Toronto Department of Statistics
Norm Comparisons for Data Augmentation by James P. Hobert Department of Statistics University of Florida and Jeffrey S. Rosenthal Department of Statistics University of Toronto Technical Report No. 0704
More informationIntroduction to Stochastic Gradient Markov Chain Monte Carlo Methods
Introduction to Stochastic Gradient Markov Chain Monte Carlo Methods Changyou Chen Department of Electrical and Computer Engineering, Duke University cc448@duke.edu Duke-Tsinghua Machine Learning Summer
More information17 : Markov Chain Monte Carlo
10-708: Probabilistic Graphical Models, Spring 2015 17 : Markov Chain Monte Carlo Lecturer: Eric P. Xing Scribes: Heran Lin, Bin Deng, Yun Huang 1 Review of Monte Carlo Methods 1.1 Overview Monte Carlo
More informationA6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2011
A6523 Signal Modeling, Statistical Inference and Data Mining in Astrophysics Spring 2011 Reading Chapter 5 (continued) Lecture 8 Key points in probability CLT CLT examples Prior vs Likelihood Box & Tiao
More informationOn Solving Integral Equations using Markov Chain Monte Carlo Methods
On Solving Integral quations using Markov Chain Monte Carlo Methods Arnaud Doucet Departments of Statistics and Computer Science, University of British Columbia, Vancouver, BC, Canada Adam M. Johansen,
More informationA Note on the Particle Filter with Posterior Gaussian Resampling
Tellus (6), 8A, 46 46 Copyright C Blackwell Munksgaard, 6 Printed in Singapore. All rights reserved TELLUS A Note on the Particle Filter with Posterior Gaussian Resampling By X. XIONG 1,I.M.NAVON 1,2 and
More informationSeminar: Data Assimilation
Seminar: Data Assimilation Jonas Latz, Elisabeth Ullmann Chair of Numerical Mathematics (M2) Technical University of Munich Jonas Latz, Elisabeth Ullmann (TUM) Data Assimilation 1 / 28 Prerequisites Bachelor:
More informationSequential Monte Carlo for Graphical Models
Sequential Monte Carlo for Graphical Models Christian A. Naesseth Div. of Automatic Control Linköping University Linköping, Sweden chran60@isy.liu.se Fredrik Lindsten Dept. of Engineering The University
More informationRecitation 2: Probability
Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions
More informationAnnealing Between Distributions by Averaging Moments
Annealing Between Distributions by Averaging Moments Chris J. Maddison Dept. of Comp. Sci. University of Toronto Roger Grosse CSAIL MIT Ruslan Salakhutdinov University of Toronto Partition Functions We
More informationLecture 17: The Exponential and Some Related Distributions
Lecture 7: The Exponential and Some Related Distributions. Definition Definition: A continuous random variable X is said to have the exponential distribution with parameter if the density of X is e x if
More information