Sequential Monte Carlo Methods in High Dimensions


1 Sequential Monte Carlo Methods in High Dimensions
Alexandros Beskos, Statistical Science, UCL
Oxford, 24th September 2012
Joint work with: Dan Crisan, Ajay Jasra, Nik Kantas, Andrew Stuart (Imperial College, Singapore, UCL, Warwick)
Presentation based on submitted papers:
On the stability of Sequential Monte Carlo methods in high dimensions (12)
Error bounds and normalizing constants for Sequential Monte Carlo in high dimensions (12)

2 Background
A perceived idea in the DA/SMC areas is that solving the full Bayesian problem for practical DA applications is infeasible, due to weight degeneracy happening very fast.
So, standard practice is to apply Kalman-filter-type or variational-type methods using Gaussian approximations.
Yet, there have been new attempts to confront weight degeneracy for SMC from the DA community (e.g. van Leeuwen's talk, Chorin et al. (10)).
This talk will show some efforts in this direction from a group from (mainly) the SMC community.

3 Structure of the Talk
Main Part: Cost of an SMC sampler in high dimensions.
Secondary Part: A smoothing problem for the Navier-Stokes equation.

4 Outline
1. MCMC asymptotics as d → ∞
2. SMC asymptotics as d → ∞: No Resampling
3. SMC asymptotics as d → ∞: With Resampling
4. Navier-Stokes Smoothing

5 Context
N is the number of particles; d is the dimension of the state space.
Standard theory for SMC has looked at asymptotics as the number of particles N → ∞.
The regime d → ∞ is not as well studied or understood; there has been some work, mainly in the DA community: Snyder et al. (08), Bickel et al. (08), Bengtsson et al. (08), Quang et al. (11).
Analytical results as d → ∞ have provided powerful insights into the behaviour of MCMC algorithms: Roberts & Rosenthal (01).
A detailed investigation should also benefit SMC methods.

6 Set-Up
Assume an iid target distribution on R^d:
\Pi(x_{1:d}) = \prod_{j=1}^{d} \pi(x_j) = \exp\Big\{ -\sum_{j=1}^{d} g(x_j) \Big\}
What is the cost of algorithms for large d? Is there some limit as d → ∞?
Can such results be used to tune and optimise algorithms?

7 Outline
1. MCMC asymptotics as d → ∞
2. SMC asymptotics as d → ∞: No Resampling
3. SMC asymptotics as d → ∞: With Resampling
4. Navier-Stokes Smoothing

8 MCMC Algorithms
Simulate an ergodic Markov chain, invariant under \Pi(x_{1:d}), up to equilibrium: x^{(1)}, x^{(2)}, ...
E.g., Random-Walk Metropolis (RWM) will propose:
x^{pr} = x + \sqrt{h}\, N(0, I_d)
and x^{pr} will be accepted with probability:
a_d(x, x^{pr}) = 1 \wedge \frac{\Pi(x^{pr})}{\Pi(x)}
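
A minimal Python sketch of this RWM step, under assumptions not on the slide: the target is the iid product with g(x) = x^2/2 (standard Gaussian), and the step sizes are illustrative only.

```python
import numpy as np

def rwm_step(x, h, g, rng):
    """One Random-Walk Metropolis step targeting Pi(x) = exp(-sum_j g(x_j))."""
    x_pr = x + np.sqrt(h) * rng.standard_normal(x.shape)  # propose x + sqrt(h) N(0, I_d)
    log_ratio = np.sum(g(x)) - np.sum(g(x_pr))            # log [Pi(x_pr) / Pi(x)]
    if np.log(rng.uniform()) < log_ratio:                 # accept with probability 1 ^ ratio
        return x_pr, True
    return x, False

# Illustrative run: g(x) = x^2/2 (standard Gaussian target), h = l^2/d with l = 2.38
rng = np.random.default_rng(0)
d, l = 100, 2.38
g = lambda x: 0.5 * x ** 2
x = rng.standard_normal(d)
accepts = 0
for _ in range(5000):
    x, acc = rwm_step(x, l ** 2 / d, g, rng)
    accepts += acc
print("acceptance rate:", accepts / 5000)  # lands near 0.234 for this choice of l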

9 Limit for RWM
Scale the step-size to control the acceptance probability: h = l^2 / d.
Indeed, we have:
a(l) = \lim_{d\to\infty} E\big[ a_d(x, x^{pr}) \big] \in (0, 1)
Squeeze time in the MCMC trajectory by 1/d: x^{(0)}, x^{(1)}, x^{(2)}, ...
[Figure: trajectory of the first co-ordinate against time rescaled by 1/d]

10 Result + Utilisation
Theorem (Roberts et al., 97): The continuous-time process x_1^{([td])} converges weakly to the solution of the SDE:
\frac{dx}{dt} = \tfrac{1}{2}\, s(l)\, (\log\pi)'(x) + \sqrt{s(l)}\, \frac{dW}{dt}
for the speed function s(l) = l^2\, a(l).
We should maximise s(l). Surprisingly, for "all" targets \prod_{j=1}^{d} \pi(x_j):
a(l_{opt}) = 0.234
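
A minimal illustration of utilising the 0.234 rule: adapt l during burn-in so the empirical acceptance rate approaches 0.234. The Robbins-Monro-style adaptation and the Gaussian example target are illustrative choices, not from the slides.

```python
import numpy as np

def tune_rwm_scale(g, d, n_iters=20_000, target_acc=0.234, seed=1):
    """Adapt l so the empirical RWM acceptance rate approaches 0.234
    (a simple Robbins-Monro-type adaptation on log l)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    log_l = 0.0
    for n in range(1, n_iters + 1):
        l = np.exp(log_l)
        x_pr = x + np.sqrt(l ** 2 / d) * rng.standard_normal(d)
        acc_prob = np.exp(min(0.0, np.sum(g(x)) - np.sum(g(x_pr))))
        if rng.uniform() < acc_prob:
            x = x_pr
        log_l += (acc_prob - target_acc) / np.sqrt(n)   # nudge acceptance towards 0.234
    return np.exp(log_l)

# For the standard Gaussian target the tuned l should end up near 2.38
print(tune_rwm_scale(lambda x: 0.5 * x ** 2, d=100))
```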

11 Further Directions
Further investigations involve:
- Independent x_j with varying standard deviations (Bédard).
- Non-independent co-ordinates: change of measure from independent (Beskos et al., 09); short-length dependencies.
- Non-local algorithms: Hybrid Monte Carlo (Beskos et al., 11).

12 Outline
1. MCMC asymptotics as d → ∞
2. SMC asymptotics as d → ∞: No Resampling
3. SMC asymptotics as d → ∞: With Resampling
4. Navier-Stokes Smoothing

13 Context: Static
We have the target distribution:
\Pi(x_{1:d}) = \prod_{j=1}^{d} \pi(x_j) = \exp\Big\{ -\sum_{j=1}^{d} g(x_j) \Big\}
and will use N particles from:
\Pi_1(x_{1:d}) = \Pi(x_{1:d})^{\phi_1}
for some small \phi_1 > 0.
Direct Importance Sampling would require (Bickel, Snyder, etc.): N = O(\kappa^d), \kappa > 1.

14 Annealed Importance Sampling
Neal (01); Chopin (02); Del Moral et al. (06).
We work with the sequence of distributions:
\Pi_n(x) \propto \Pi(x)^{\phi_n}, \quad n = 1, 2, \ldots, d
where we have chosen \phi_n - \phi_{n-1} = \frac{1-\phi_1}{d} (so \phi_d = 1).
N particles start off with x_0^{(i)} \sim \Pi_1 and evolve according to:
K_n(x_{n-1}^{(i)}, dx_n) = \prod_{j=1}^{d} k_n(x_{n-1,j}^{(i)}, dx_{n,j})
such that \pi_n k_n = \pi_n, where \pi_n(x_j) \propto \exp\{-\phi_n\, g(x_j)\}.

15 Weights
The unnormalised particle weights are as follows:
W_n^{(i)} = W_{n-1}^{(i)}\, \frac{\Pi_n(x_{n-1}^{(i)})}{\Pi_{n-1}(x_{n-1}^{(i)})}
That is, after l_d(t) \approx d\,(t-\phi_1)/(1-\phi_1) steps:
\log W_t^{(i)} = -\frac{1-\phi_1}{d} \sum_{j=1}^{d} \sum_{n=1}^{l_d(t)} \big\{ g(x_{n,j}^{(i)}) - \pi_n(g) \big\}
We will look at the stability of \log W_t^{(i)} as d → ∞.
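
A minimal sketch of slides 14-15 for the iid Gaussian case, under assumptions not on the slides: g(x) = x^2/2, so \pi_n(g) = 1/(2\phi_n) per co-ordinate, the kernel k_n is a single RWM step, and all numeric settings (d, N, \phi_1, step sizes) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, N, phi1 = 50, 200, 0.1
g = lambda x: 0.5 * x ** 2                     # target pi(x_j) propto exp{-g(x_j)}, here N(0,1)
phis = np.linspace(phi1, 1.0, d + 1)           # tempering schedule phi_1, ..., phi_d = 1

# start particles from Pi_1 = product of N(0, 1/phi_1)
x = rng.standard_normal((N, d)) / np.sqrt(phi1)
logW = np.zeros(N)

for n in range(1, d + 1):
    phi_prev, phi_n = phis[n - 1], phis[n]
    # incremental log-weight log Pi_n(x)/Pi_{n-1}(x), centred by pi_n(g) = 1/(2 phi_n) per co-ordinate
    logW += -(phi_n - phi_prev) * (g(x).sum(axis=1) - d / (2.0 * phi_n))
    # move: one RWM step per particle, invariant under the product of N(0, 1/phi_n)
    prop = x + (2.4 / np.sqrt(phi_n * d)) * rng.standard_normal((N, d))
    log_acc = -phi_n * (g(prop).sum(axis=1) - g(x).sum(axis=1))
    accept = np.log(rng.uniform(size=N)) < log_acc
    x[accept] = prop[accept]

w = np.exp(logW - logW.max())                  # weights, normalised up to a constant
print("ESS:", w.sum() ** 2 / (w ** 2).sum())   # effective sample size out of N particles
```

With no resampling, the printed ESS indicates how much the weights have degenerated after the d annealing steps.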

16 The Particle Evolution Dynamics
Co-ordinates evolve independently via k_n(x_{n-1,j}, dx_{n,j}).
[Figure: trajectory of the i-th particle, j-th co-ordinate, against the temperatures \phi_1, \phi_2, \ldots, 1, spaced (1-\phi_1)/d apart]
One can think of a continuum of targets and densities:
k_s(x, dx'), \quad \pi_s(x_j) \propto \exp\{-s\, g(x_j)\}.

17 Statement of One of the Results
Theorem: Under conditions, we have, as d → ∞:
\log W_t^{(i)} \Rightarrow B_{\sigma^2_{\phi_1:t}}
where B is a Brownian motion. The asymptotic variance is:
\sigma^2_{\phi_1:t} = (1-\phi_1) \int_{\phi_1}^{t} \pi_s\big\{ \hat{g}_s^2 - (k_s \hat{g}_s)^2 \big\}\, ds.
So \log W_1^{(i)} stabilises as d → ∞ for fixed N.

18 Comments
Consider:
ESS_t = \frac{\big(\sum_{i=1}^{N} W_t^{(i)}\big)^2}{\sum_{i=1}^{N} (W_t^{(i)})^2}.
One can also obtain, as d → ∞:
ESS_1 \Rightarrow \frac{\big[\sum_{i=1}^{N} e^{X_i}\big]^2}{\sum_{i=1}^{N} e^{2 X_i}}, \quad X_i \overset{iid}{\sim} N(0, \sigma^2_{\phi_1:1}).
One can find that:
\lim_{d\to\infty} E\big[ ESS_1 \big] \ge 1 + (N-1)\, e^{-3\sigma^2_{\phi_1:1}},
and that:
ESS_1 / N \to \exp\{-\sigma^2_{\phi_1:1}\} as N, d → ∞.
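
A minimal Monte Carlo check of the limiting ESS expression above: sample the limiting log-weights X_i iid N(0, sigma^2), compute the ESS of the weights e^{X_i}, and compare ESS/N with exp(-sigma^2). All numbers are illustrative.

```python
import numpy as np

def ess(weights):
    """Effective sample size of a set of unnormalised weights."""
    return weights.sum() ** 2 / (weights ** 2).sum()

rng = np.random.default_rng(0)
N, sigma2, reps = 5000, 0.5, 100

# In the d -> infinity limit the log-weights are iid N(0, sigma2),
# so ESS_1 / N should be close to exp(-sigma2) for large N.
ratios = []
for _ in range(reps):
    X = rng.normal(0.0, np.sqrt(sigma2), size=N)   # limiting log-weights
    ratios.append(ess(np.exp(X)) / N)

print("mean ESS_1 / N :", np.mean(ratios))
print("exp(-sigma2)   :", np.exp(-sigma2))
```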

19 Comments
Recall that:
\sigma^2_{\phi_1:t} = (1-\phi_1) \int_{\phi_1}^{t} \pi_s\big\{ \hat{g}_s^2 - (k_s \hat{g}_s)^2 \big\}\, ds.
Here, \hat{g}_s is the solution to the Poisson equation:
g(x) - \pi_s(g) = \hat{g}_s(x) - k_s(\hat{g}_s)(x).
Note also that:
\pi\big\{ \hat{g}^2 - (k\hat{g})^2 \big\}
is the asymptotic variance in the standard CLT for geometrically ergodic MCMC Markov chains.

20 Conditions for Theorem
(A1) i. Minorisation condition, uniformly in s: there exist a set C, a constant \theta \in (0, 1) and a probability law \nu so that C is (1, \theta, \nu)-small w.r.t. k_s.
ii. Geometric ergodicity, uniformly in s: k_s V(x) \le \lambda V(x) + b\, I_C(x), with \lambda < 1, b > 0 and C as above, for all s \in [\phi_1, 1].
(A2) Controlled perturbations of \{k_s\}: \| k_t - k_s \|_V \le M\, |t - s|.

21 (Very Rough) Sketch of Proof
We look at the process:
Z_{t,d} = \frac{1-\phi_1}{d} \sum_{j=1}^{d} \sum_{n=1}^{l_d(t)} \big\{ g(x_{n,j}^{(i)}) - \pi_n(g) \big\}
We have the decomposition (per co-ordinate j):
\sum_{n=1}^{l_d(t)} \big\{ g(x_{n,j}^{(i)}) - \pi_n(g) \big\} = \sum_{n=1}^{l_d(t)} \big\{ \hat{g}_n(x_{n,j}^{(i)}) - k_n(\hat{g}_n)(x_{n-1,j}^{(i)}) \big\} + R_{d,t}^{(i)}
with the first term providing a Martingale Functional CLT and the second vanishing (when divided by d).

22 More Comments
Proposition: We also have that x_{d,j}^{(i)} \Rightarrow \pi.
So, O(d) MCMC steps are exactly enough to attain the correct distribution.
Overall cost: O(N d^2).

23 Outline
1. MCMC asymptotics as d → ∞
2. SMC asymptotics as d → ∞: No Resampling
3. SMC asymptotics as d → ∞: With Resampling
4. Navier-Stokes Smoothing

24 Dynamic Resampling?
Analysis of algorithms under dynamic resampling is tough.
Del Moral et al. (11) notice that, as N → ∞, dynamic (stochastic) resampling times coincide with deterministic ones with high probability.
In our case, we can identify deterministic instances \{t_k(d)\} matching the dynamic resampling times with probability at least 1 - M/N.
We carry out the analysis doing resampling at \{t_k(d)\}.
We can also identify limits as d → ∞: t_k(d) \to t_k.

25 Deterministic Resampling
In particular, we have that (essentially):
t_1 = \inf\{ t \in [\phi_1, 1] : e^{-\sigma^2_{\phi_1:t}} < \alpha \},
t_k = \inf\{ t \in [t_{k-1}, 1] : e^{-\sigma^2_{t_{k-1}:t}} < \alpha \}.
We have defined here:
\sigma^2_{v:t} = (1-\phi_1) \int_{v}^{t} \pi_s\big\{ \hat{g}_s^2 - (k_s \hat{g}_s)^2 \big\}\, ds.
The number of resampling instances converges to some m < ∞.
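
A minimal sketch of this recursion, assuming a user-supplied local variance rate rho(s) = \pi_s\{\hat{g}_s^2 - (k_s \hat{g}_s)^2\} (the concrete rho used here is purely hypothetical): it accumulates \sigma^2_{t_{k-1}:t} on a grid and records a resampling time whenever the limiting ESS/N, exp(-\sigma^2), drops below the threshold \alpha.

```python
import numpy as np

def resampling_times(rho, phi1, alpha, n_grid=10_000):
    """Deterministic resampling times on [phi1, 1]: t_k is the first t with
    exp(-sigma^2_{t_{k-1}:t}) < alpha, where
    sigma^2_{v:t} = (1 - phi1) * integral_v^t rho(s) ds."""
    s = np.linspace(phi1, 1.0, n_grid)
    ds = s[1] - s[0]
    times, sigma2 = [], 0.0
    for i in range(1, n_grid):
        sigma2 += (1.0 - phi1) * rho(s[i]) * ds   # accumulate sigma^2 since the last resampling
        if np.exp(-sigma2) < alpha:               # limiting ESS/N has dropped below alpha
            times.append(s[i])
            sigma2 = 0.0                          # restart the integral at t_k
    return times

# Illustrative run with a purely hypothetical variance rate rho(s) = 1/s
print(resampling_times(lambda s: 1.0 / s, phi1=0.1, alpha=0.5))
```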

26 Statement of Result
Resampling forces dependence between particles and co-ordinates.
Theorem: Under the stated conditions we have that, for instances s_k(d) \to s_k \in (t_{k-1}, t_k):
\log W_{t_{k-1}(d):s_k(d)} \Rightarrow N(0, \sigma^2_{t_{k-1}:s_k})
Overall, understanding the behaviour of the algorithm as d → ∞ boils down to \sigma^2_{v:t}: a manifestation of the effect of resampling.

27 A Comment on Proof
Construct a martingale under the filtration \{G_{j,d}\}_{j=1}^{d} so that:
G_{0,d} = \sigma(all particles, just after resampling)
G_{1,d} = G_{0,d} \vee \sigma(1st co-ordinate)
G_{2,d} = G_{1,d} \vee \sigma(2nd co-ordinate)
...
Exploit conditional independence given G_{0,d}.
Technique used in other applied probability applications (e.g. joint asymptotics in Monte Carlo + data size).

28 Outline
1. MCMC asymptotics as d → ∞
2. SMC asymptotics as d → ∞: No Resampling
3. SMC asymptotics as d → ∞: With Resampling
4. Navier-Stokes Smoothing

29 Navier-Stokes Dynamics
Consider NS dynamics on [0, L] \times [0, L], describing the evolution of the velocity u = u(x, t) of an incompressible fluid:
\partial_t u - \nu \Delta u + (u \cdot \nabla) u + \nabla p = f
\nabla \cdot u = 0
u(x, 0) = u_0(x)
with \nu the viscosity, p the pressure, f the forcing. We assume periodic boundary conditions.

30 Spectral Domain
The natural basis here is \{\psi_k\}_{k \in Z^2 \setminus \{0\}} such that:
\psi_k(x) = \frac{k^{\perp}}{|k|} \exp\big\{ i\, \tfrac{2\pi}{L}\, k \cdot x \big\}
where k^{\perp} = (k_2, -k_1). So we can expand:
u(x) = \sum_{k \in Z^2 \setminus \{0\}} u_k\, \psi_k(x)
for Fourier coefficients u_k = \langle u, \psi_k \rangle.

31 Bayesian Framework
We observe u(x, t) with error (Eulerian case):
Y_s = \big( u(x_m, s\delta) \big)_{m=1}^{M} + N(0, \Sigma), \quad 1 \le s \le T
We set a prior on u_0:
u_0 \sim \Pi_0 = N\big(0, (-\Delta)^{-\alpha}\big)
We need to learn about the posterior:
\Pi(u_0 \mid Y) \propto L(Y \mid u_0)\, \Pi_0(u_0)
for likelihood L(Y \mid u_0) = e^{-\frac{1}{2}\sum_{s=1}^{T} |Y_s - G_s(u_0)|^2_{\Sigma}}, with "observation operator" u_0 \mapsto G_s(u_0) = \big( u(x_m, s\delta) \big)_{m=1}^{M}.

32 Non-Sparsity of Model Graph
\Pi(u_0 \mid Y) is a high-dimensional target.
High-dimensional posteriors arise frequently in Bayesian applications in statistics (e.g. Bayesian hierarchical modelling), and are often dealt with successfully. This is mainly due to intrinsic conditional independencies in the model structure allowing for local computations when applying a Gibbs sampler.
Such a structure is not present in our DA set-up, making it a non-standard, challenging computational problem.

33 Learning from the Posterior
Law & Stuart (12) use an RWM-type MCMC algorithm. It proposes:
u_{0,pr} = \rho\, u_0 + \sqrt{1-\rho^2}\, Z
for noise Z \sim \Pi_0, accepted with probability:
1 \wedge \frac{L(Y \mid u_{0,pr})}{L(Y \mid u_0)}
This is relevant for the off-line setup, and was used to check the robustness of practical approximate algorithms.
The algorithm needed \rho ≈ 1 to give good acceptance probabilities, and could tackle some scenarios (state space made of 64^2 Fourier coefficients).
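
A minimal sketch of this proposal/acceptance step (a preconditioned Crank-Nicolson-type move), under assumptions not on the slide: the unknown is a finite vector of Fourier coefficients with a diagonal prior covariance, and log_lik is a user-supplied placeholder standing in for -\frac{1}{2}\sum_s |Y_s - G_s(u_0)|^2_{\Sigma}, which in practice needs a Navier-Stokes solver. This is not the authors' code.

```python
import numpy as np

def pcn_step(u0, rho, prior_std, log_lik, rng):
    """One pCN-type step: propose rho*u0 + sqrt(1-rho^2)*Z with Z ~ Pi_0,
    accept with probability 1 ^ L(Y|u_pr)/L(Y|u0)."""
    z = prior_std * rng.standard_normal(u0.shape)          # Z ~ N(0, diagonal prior covariance)
    u_pr = rho * u0 + np.sqrt(1.0 - rho ** 2) * z
    if np.log(rng.uniform()) < log_lik(u_pr) - log_lik(u0):
        return u_pr, True
    return u0, False

# Illustrative use with a hypothetical stand-in log-likelihood
rng = np.random.default_rng(0)
prior_std = 1.0 / (1.0 + np.arange(1, 65)) ** 2            # decaying prior std per coefficient
log_lik = lambda u: -0.5 * np.sum((u - 0.3) ** 2) / 0.1    # placeholder, not a real NS likelihood
u = prior_std * rng.standard_normal(64)
for _ in range(1000):
    u, _ = pcn_step(u, rho=0.95, prior_std=prior_std, log_lik=log_lik, rng=rng)
```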

34 A Mixing Issue
The proposal also writes as:
\langle u_{0,pr}, \psi_k \rangle = \rho\, \langle u_0, \psi_k \rangle + \sqrt{1-\rho^2}\, \langle Z, \psi_k \rangle
with
Re\{\langle Z, \psi_k \rangle\} \sim N\Big(0,\ \tfrac{1}{2}\big( \tfrac{4\pi^2}{L^2} |k|^2 \big)^{-\alpha}\Big)
The scale of the noise is ideally tuned to the prior distribution, but badly tuned to the posterior.
A-posteriori, "low" Fourier coefficients may have much smaller variances than a-priori, which explains \rho ≈ 1. But this destroys the mixing of "medium" Fourier coefficients, proposing very small steps relative to their size.

35 An Improved SMC Sampler
Better samplers could be built by sequentially assimilating data, and by sequentially adapting the scaling of Z.
A Weight-Move Algorithm (Chopin, 02), sketched in code below:
1. Assume a collection of particles \{u_0^{(i)}\}_{i=1}^{N} from \Pi(u_0 \mid Y_{1:s}).
2. Weight as W^{(i)} = e^{-\frac{1}{2} |Y_{s+1} - G_{s+1}(u_0^{(i)})|^2_{\Sigma}}.
3. Resample; particles \{u_0^{(i)}\}_{i=1}^{N} now represent \Pi(u_0 \mid Y_{1:(s+1)}).
4. Move particles according to a kernel K_{s+1}(u_0^{(i)}, du) invariant under \Pi(u_0 \mid Y_{1:(s+1)}).
Importantly, the current particle representation of \Pi(u_0 \mid Y_{1:(s+1)}) can be used to tune the kernel K_{s+1}(u_0^{(i)}, du).
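
A minimal sketch of one weight-resample-move update, under assumptions not on the slide: log_inc_lik(u, s+1) is a user-supplied placeholder for -\frac{1}{2}|Y_{s+1} - G_{s+1}(u)|^2_{\Sigma}, move_kernel is a user-supplied kernel invariant under the new posterior, and resampling is multinomial.

```python
import numpy as np

def weight_move_step(particles, s, log_inc_lik, move_kernel, rng):
    """One weight-resample-move update: particles approximating Pi(u0 | Y_{1:s})
    are turned into particles approximating Pi(u0 | Y_{1:s+1}).
    log_inc_lik and move_kernel are hypothetical user-supplied callables."""
    # 1. weight by the incremental likelihood of observation Y_{s+1}
    logw = np.array([log_inc_lik(u, s + 1) for u in particles])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # 2. multinomial resampling
    idx = rng.choice(len(particles), size=len(particles), p=w)
    particles = [particles[i] for i in idx]
    # 3. move each particle with a kernel invariant under Pi(u0 | Y_{1:s+1}),
    #    tuned with the current particle population (here its componentwise spread)
    spread = np.std(np.stack(particles), axis=0)
    return [move_kernel(u, s + 1, spread, rng) for u in particles]
```

The spread passed to the move kernel is one way of exploiting the particle representation to tune K_{s+1}, in the spirit of the tuning described on the next slide.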

36 Tuning of Kernel
For instance, one can build K_{s+1} by now proposing:
\langle u_{0,pr}^{(i)}, \psi_k \rangle = \rho\, \langle u_0^{(i)}, \psi_k \rangle + \sqrt{1-\rho^2}\, \xi
with Re\{\xi\} \sim N(0, \sigma^2), where \sigma^2 is the particle estimate of the marginal variance of Re\{\langle u_0, \psi_k \rangle\} under the target \Pi(u_0 \mid Y_{1:(s+1)}).
Chopin (02) also recommends Independence Samplers, effective in the presence of asymptotic normality.
Still more time needed to do the numerics...

37 Discussion
The described algorithm will need N realizations of the NS dynamics from 0 to s\delta, repeated for s = 1, 2, ..., T. Parallelisation over the N particles will be critical.
SMC methods are slowly making inroads in high-dimensional DA applications.
Interaction between the DA and SMC communities could provide a platform for further advances.

38 References
Bengtsson, Bickel and Li (08). Curse-of-dimensionality revisited: collapse of the particle filter in very large scale systems. In Probability and Statistics: Essays in Honor of David A. Freedman.
Bickel, Li and Bengtsson (08). Sharp failure rates for the bootstrap particle filter in high dimensions. In Pushing the Limits of Contemporary Statistics: Contributions in Honor of Jayanta K. Ghosh.
Chopin (02). A sequential particle filter method for static models. Biometrika.
Chorin, Morzfeld and Tu (10). Implicit particle filters for data assimilation. Submitted.
Del Moral, Doucet and Jasra (11). On adaptive resampling procedures for sequential Monte Carlo methods. To appear in Bernoulli.

39 References
Del Moral, Doucet and Jasra (06). Sequential Monte Carlo samplers. J. R. Stat. Soc. Ser. B Stat. Methodol.
Law and Stuart (12). Evaluating data assimilation algorithms. Submitted.
Neal (01). Annealed importance sampling. Stat. Comput.
Snyder, Bengtsson, Bickel and Anderson (08). Obstacles to high-dimensional particle filtering. Monthly Weather Review.
van Leeuwen (10). Nonlinear data assimilation in geosciences: an extremely efficient particle filter. Quart. J. R. Meteorol. Soc.

40 References
Beskos, Pillai, Roberts, Sanz-Serna and Stuart (11). Optimal tuning of the Hybrid Monte Carlo algorithm. To appear in Bernoulli.
Beskos, Roberts and Stuart (09). Optimal scalings for local Metropolis-Hastings chains on non-product targets in high dimensions. Ann. Appl. Probab.
Quang, Musso and Le Gland (10). An insight into the issue of dimensionality in particle filtering. In Information Fusion (FUSION).
Roberts, Gelman and Gilks (97). Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Probab.
Roberts and Rosenthal (01). Optimal scaling for various Metropolis-Hastings algorithms. Statist. Sci.
Roberts and Rosenthal (98). Optimal scaling of discrete approximations to Langevin diffusions. J. R. Stat. Soc. Ser. B Stat. Methodol.
