Nested Sequential Monte Carlo Methods
1 Nested Sequential Monte Carlo Methods
Christian A. Naesseth (Linköping University, Sweden), Fredrik Lindsten (The University of Cambridge, UK), Thomas B. Schön (Uppsala University, Sweden)
ICML 2015
Presented by: Qinliang Su, Aug. 17, 2016
2 Outline
1 Introduction
2 Review of Sequential Monte Carlo (SMC)
3 Nested SMC
4 Nesting of Nested SMC
5 Experiments
3 Introduction (1)
Particle filters are known to perform poorly in high (say, d > 10) dimensions.
Example: spatio-temporal model with g(y_t | x_t) = ∏_{k=1}^d g(y_{t,k} | x_{t,k}).
(Figure: chain of hidden states X_1, ..., X_6.)
Transition: x_k | x_{k-1} ~ h(x_k | x_{k-1})
Measurement: y_k | x_k ~ g(y_k | x_k)
The transition density h(x_t | x_{t-1}) is typically an extremely bad proposal distribution in high dimensions.
Goal: at each time step k, use samples to approximate the posterior
π_k = p(x_{1:k} | y_{1:k}) ∝ h(x_1) g(y_1 | x_1) ∏_{t=2}^k h(x_t | x_{t-1}) g(y_t | x_t),
and then estimate the expectation E_p[f(x_{1:k})] as Ê_p[f(x_{1:k})] = ∫ f(x_{1:k}) p̂(x_{1:k}) dx_{1:k}.
Does a better proposal distribution improve our result?
4 Introduction (2)
This paper is interested in settings where:
i) x_k is high-dimensional, i.e. x_k ∈ R^d with d ≫ 1;
ii) there is local dependency structure among x_{1:k}, both spatially and temporally.
(Figure 1 of the paper: a spatio-temporal model where π_k(x_{1:k}) is given by a k × 2 × 3 undirected graphical model and x_k ∈ R^{2×3}.)
5 Outline
1 Introduction
2 Review of Sequential Monte Carlo (SMC)
3 Nested SMC
4 Nesting of Nested SMC
5 Experiments
6 High-level Description of SMC
Procedure (at time step k):
i) Select one sequence from the existing ones {X_{1:k-1}^i}_{i=1}^N, denoted X_{1:k-1}^j;
ii) Draw a sample X_k^i from the proposal distribution q(x_k | X_{1:k-1}^j) and set X_{1:k}^i = (X_{1:k-1}^j, X_k^i) as the new sample;
iii) Assign the new sample a weight W_k^i = p(X_{1:k}^i | y_{1:k}) / q(X_{1:k}^i), accounting for the mismatch between the proposal pdf and the true pdf.
With the pairs {(X_{1:k}^i, W_k^i)}_{i=1}^N, the posterior is approximated as
p(x_{1:k} | y_{1:k}) ≈ Σ_{i=1}^N (W_k^i / Σ_l W_k^l) δ_{X_{1:k}^i}(x_{1:k}).
The bootstrap particle filter alternates Resampling: {(X_{t-1}^i, W_{t-1}^i)}_{i=1}^N → {(X̃_{t-1}^i, 1/N)}_{i=1}^N; Propagation: X_t^i ~ h(x_t | X̃_{t-1}^i); Weighting: W_t^i = g(y_t | X_t^i).
The key question is how to choose the proposal distribution.
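As a concrete illustration, the select-propagate-weight loop above can be sketched in Python on a toy 1D linear-Gaussian model (the model, its parameters, and the bootstrap choice q = h are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def smc_step(particles, weights, y, propose, weight_fn):
    """One SMC step: i) select (resample), ii) propagate, iii) reweight."""
    N = len(particles)
    # i) select ancestors with probability proportional to current weights
    ancestors = rng.choice(N, size=N, p=weights / weights.sum())
    # ii) draw new states from the proposal q(x_k | x_{k-1})
    new_states = propose(particles[ancestors])
    # iii) weight by the mismatch between target and proposal
    return new_states, weight_fn(new_states, y)

# Toy model: x_k = 0.9 x_{k-1} + v_k, y_k = x_k + e_k, with v_k, e_k ~ N(0, 1),
# and the bootstrap proposal q = h, so the weight reduces to g(y_k | x_k).
propose = lambda x_prev: 0.9 * x_prev + rng.standard_normal(len(x_prev))
weight_fn = lambda x, y: np.exp(-0.5 * (y - x) ** 2)  # g(y | x) up to a constant

N = 1000
particles = rng.standard_normal(N)
weights = np.ones(N)
for y in [0.3, -0.1, 0.5]:
    particles, weights = smc_step(particles, weights, y, propose, weight_fn)
est = np.sum(weights * particles) / weights.sum()  # self-normalised posterior-mean estimate
```

The self-normalised estimate at the end corresponds to the weighted approximation of p(x_{1:k} | y_{1:k}) above, evaluated at f(x) = x_k.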
7 Bootstrap Particle Filter
The proposal pdf is chosen to be the transition pdf, i.e., q(x_k | X_{1:k-1}^j) = h(x_k | X_{k-1}^j).
Under this proposal, the weight is easily computed as
W_k^i = h(X_k^i | X_{k-1}^j) g(y_k | X_k^i) / h(X_k^i | X_{k-1}^j) = g(y_k | X_k^i).
The bootstrap PF performs poorly in high dimensions (d > 10):
- mismatch between the proposal and target distributions;
- weight collapse, i.e. the normalised weights become dominated by a single particle.
Despite its simplicity, h(x_t | x_{t-1}) is a bad proposal distribution.
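The weight-collapse phenomenon is easy to reproduce numerically. The sketch below (a hypothetical standard-normal prior/proposal with a product-form likelihood, not the paper's model) measures the effective sample size ESS = (Σ_i w_i)² / Σ_i w_i² after a single bootstrap weighting step, in low and high dimension:

```python
import numpy as np

rng = np.random.default_rng(1)

def ess(d, N=1000):
    """ESS after one bootstrap weighting step in dimension d (toy sketch).

    Assumed setup: proposal/prior x ~ N(0, I_d), observation y = 0, and a
    product likelihood g(y | x) = prod_k N(0; x_k, 1), mirroring the
    factorised g(y_t | x_t) on slide 3.
    """
    x = rng.standard_normal((N, d))
    log_w = -0.5 * np.sum(x ** 2, axis=1)   # log g(y | x) up to a constant
    log_w -= log_w.max()                    # stabilise before exponentiating
    w = np.exp(log_w)
    return w.sum() ** 2 / np.sum(w ** 2)

ess_low, ess_high = ess(d=2), ess(d=100)    # ESS collapses as d grows
```

With N = 1000 particles, the ESS stays near N for d = 2 but drops to a handful of effective particles for d = 100, which is the collapse described above.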
8 Fully Adapted SMC (1)
The proposal pdf is chosen to adapt to the target distribution.
Let π_k(x_{1:k}) = (1/Z_{π_k}) π̄_k(x_{1:k}) be the target pdf. The proposal pdf is designed as
q_k(x_k | x_{1:k-1}) = (1/Z_{q_k}(x_{1:k-1})) q̄_k(x_k | x_{1:k-1}),
where q̄_k(x_k | x_{1:k-1}) = π̄_k(x_{1:k}) / π̄_{k-1}(x_{1:k-1}) [= g(y_k | x_k) h(x_k | x_{k-1})].
Under this proposal pdf, the weight becomes W_k^i = Z_{q_k}(x_{1:k-1}).
9 Fully Adapted SMC (2)
Example: 2D MRF.
(Figure: 2D MRF with nodes x_1, ..., x_6; each node x_k has components, e.g. x_{4,1}, ..., x_{4,4}.)
Target pdf: π_k(x_{1:k}) = (1/Z_{π_k}) φ_1(x_1) ∏_{s=2}^k φ_s(x_s) Ψ_s(x_{s-1}, x_s)
Proposal pdf: q̄_k(x_k | x_{k-1}) = φ_k(x_k) Ψ_k(x_{k-1}, x_k)
Weight: Z_{q_k}(x_{k-1}) = ∫ φ_k(x_k) Ψ_k(x_{k-1}, x_k) dx_k
10 Fully Adapted SMC (3)
Algorithm 2:
- Select one sequence from {X_{1:k-1}^i}_{i=1}^N with probability proportional to Z_{q_k}(X_{1:k-1}^i) / Σ_{s=1}^N Z_{q_k}(X_{1:k-1}^s), denoted X_{1:k-1}^j;
- Draw X_k^i from q_k(· | X_{1:k-1}^j) and let X_{1:k}^i = (X_{1:k-1}^j, X_k^i).
Repeating the above N times, we obtain samples {X_{1:k}^i}_{i=1}^N and the approximation
π_k(x_{1:k}) ≈ (1/N) Σ_{i=1}^N δ_{X_{1:k}^i}(x_{1:k}).
However, exact computation of Z_{q_k} and sampling from q_k(· | X_{1:k-1}^j) are often impossible in practice.
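When the state space is finite, Z_{q_k} is an exact finite sum and Algorithm 2 can be run as stated. A minimal sketch for an assumed two-state HMM (all transition, emission, and observation values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed two-state HMM: with finitely many states,
# Z_qk(x_prev) = sum_x g(y | x) h(x | x_prev) is computable exactly and
# q_k(x | x_prev) is proportional to g(y | x) h(x | x_prev).
H = np.array([[0.8, 0.2],     # h(x_k | x_{k-1}), row = previous state
              [0.3, 0.7]])
G = np.array([[0.9, 0.1],     # g(y_k | x_k), row = state, column = observation
              [0.2, 0.8]])

def fully_adapted_step(particles, y, N):
    qbar = G[:, y] * H[particles]              # (N, 2): g(y | x) h(x | x_prev)
    Z_q = qbar.sum(axis=1)                     # exact Z_qk(x_prev) per particle
    # Select ancestors with probability proportional to Z_qk
    ancestors = rng.choice(N, size=N, p=Z_q / Z_q.sum())
    q = qbar[ancestors] / Z_q[ancestors, None]  # exact q_k(. | x_prev)
    return (rng.random(N) > q[:, 0]).astype(int)  # draw x_k = 1 w.p. q_k(1 | x_prev)

N = 2000
particles = rng.integers(0, 2, size=N)
for y in [0, 0, 1, 0]:
    particles = fully_adapted_step(particles, y, N)
p_state1 = particles.mean()   # estimate of P(x_k = 1 | y_{1:k})
```

For this observation sequence the exact forward recursion gives P(x_4 = 1 | y_{1:4}) ≈ 0.22, which the fully adapted filter recovers with small Monte Carlo error.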
11 Outline
1 Introduction
2 Review of Sequential Monte Carlo (SMC)
3 Nested SMC
4 Nesting of Nested SMC
5 Experiments
12 Nested SMC (1)
Relaxing the exact computation and sampling requirements in fully adapted SMC...
Definition 1 (Properly weighted sample): Let q(x) = (1/Z_q) q̄(x). A (random) pair (X, W) ∈ X × R_+ is properly weighted w.r.t. q(·) if E_{(X,W)}[f(X) W] = Z_q E_q[f(X)] for all measurable functions f.
The exact pair (X, W) with X ~ q(x) and W = Z_q is a special case of a properly weighted sample.
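A classical way to produce properly weighted pairs is importance sampling: draw X from an instrumental density r and set W = q̄(X)/r(X). The numerical check below (with an assumed Gaussian q̄ and instrumental r, chosen so that Z_q and E_q[f] are known in closed form) verifies E[f(X) W] = Z_q E_q[f]:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unnormalised target qbar(x) = exp(-x^2 / 2), so Z_q = sqrt(2*pi) and q = N(0, 1).
qbar = lambda x: np.exp(-0.5 * x ** 2)
Z_q = np.sqrt(2 * np.pi)

# Instrumental density r = N(0, 2^2). The pair (X, W) with X ~ r and
# W = qbar(X) / r(X) is properly weighted w.r.t. q.
sigma_r = 2.0
r_pdf = lambda x: np.exp(-0.5 * (x / sigma_r) ** 2) / (sigma_r * np.sqrt(2 * np.pi))

n = 200_000
X = sigma_r * rng.standard_normal(n)
W = qbar(X) / r_pdf(X)

f = lambda x: x ** 2
lhs = np.mean(f(X) * W)   # Monte Carlo estimate of E[f(X) W]
rhs = Z_q * 1.0           # Z_q * E_q[f], since E[x^2] = 1 under N(0, 1)
```

Note that W here is random and not equal to Z_q, yet the defining identity still holds, which is exactly the relaxation Definition 1 allows.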
13 Nested SMC (2)
(A1) Let Q be a class and let q̂ = Q(q̄, M). Assume that:
i) the construction of q̂ returns a member variable Ẑ_q = q̂.GetZ();
ii) Q has a member function Simulate(·) which returns a (possibly random) variable X = q̂.Simulate();
iii) (X, Ẑ_q) is properly weighted w.r.t. q(·).
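One construction satisfying (A1) is self-normalised importance sampling with M inner samples: GetZ() returns the average inner weight (an unbiased estimate of Z_q) and Simulate() draws one inner sample with probability proportional to its weight. A sketch with an empirical check (class name, instrumental density, and the test target qbar are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)

class ISQ:
    """A member of the class Q built by importance sampling (a sketch).

    Given an unnormalised density qbar, draw M samples from an assumed
    instrumental N(0, 2^2); then (Simulate(), GetZ()) is properly
    weighted w.r.t. q = qbar / Z_q.
    """

    def __init__(self, qbar, M):
        sigma = 2.0
        self.x = sigma * rng.standard_normal(M)
        r = np.exp(-0.5 * (self.x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        self.w = qbar(self.x) / r
        self.z_hat = self.w.mean()            # unbiased estimate of Z_q

    def get_z(self):
        return self.z_hat

    def simulate(self):
        # Draw one of the M inner samples, probability proportional to its weight
        return rng.choice(self.x, p=self.w / self.w.sum())

# Empirical check of proper weighting with qbar(x) = exp(-x^2 / 2):
# E[f(X) * Zhat] should equal Z_q * E_q[f] = sqrt(2*pi) for f(x) = x^2.
qbar = lambda x: np.exp(-0.5 * x ** 2)
vals = [ (q := ISQ(qbar, M=10)).simulate() ** 2 * q.get_z() for _ in range(5000) ]
est = np.mean(vals)
```

The proper-weighting property follows because E[f(X) Ẑ] = E[(1/M) Σ_m w_m f(x_m)] = Z_q E_q[f], regardless of M.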
14 Nested SMC (3)
Replace the exact Z_{q_k} and X in fully adapted SMC with q̂.GetZ() and q̂.Simulate().
Algorithm 3:
- Initialize q̂^i = Q(q̄_k(· | X_{1:k-1}^i), M) for i = 1, 2, ..., N
- Set Ẑ_{q_k}^i = q̂^i.GetZ() for i = 1, 2, ..., N
- Repeat N times:
  - Select one element from {1, 2, ..., N} with probabilities Ẑ_{q_k}^s / Σ_{s=1}^N Ẑ_{q_k}^s; denote the selected index j
  - Draw X_k^i = q̂^j.Simulate() and let X_{1:k}^i = (X_{1:k-1}^j, X_k^i)
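Putting the pieces together, one step of Algorithm 3 can be sketched on a toy 1D state space model (the Gaussian transition and likelihood, all parameters, and the choice of plain importance sampling from h as the inner procedure are illustrative assumptions, not the paper's models):

```python
import numpy as np

rng = np.random.default_rng(4)

def nested_smc_step(X_prev, y, N, M):
    """One nested SMC step (Algorithm 3) on an assumed toy model:
    h(x_k | x_{k-1}) = N(0.9 x_{k-1}, 1), g(y_k | x_k) = N(y_k; x_k, 1).

    The inner procedure importance-samples qbar_k(x | x_prev) =
    g(y | x) h(x | x_prev) with M draws from h, so the inner weight is
    g(y | x) and GetZ() is the average inner weight.
    """
    # Inner stage: build q^i = Q(qbar_k(. | X_prev[i]), M) for each i
    inner_x = 0.9 * X_prev[:, None] + rng.standard_normal((N, M))  # draws from h
    inner_w = np.exp(-0.5 * (y - inner_x) ** 2)                    # g(y | x) up to const
    z_hat = inner_w.mean(axis=1)                                   # q^i.GetZ()

    # Outer stage: select ancestors with probability proportional to Zhat
    ancestors = rng.choice(N, size=N, p=z_hat / z_hat.sum())

    # q^j.Simulate(): one inner sample of ancestor j, prob. proportional to inner weight
    X_new = np.empty(N)
    for i, j in enumerate(ancestors):
        X_new[i] = rng.choice(inner_x[j], p=inner_w[j] / inner_w[j].sum())
    return X_new, z_hat.mean()   # new particles and the factor for Zhat_{pi_k}

X = rng.standard_normal(500)
X, z_factor = nested_smc_step(X, y=0.4, N=500, M=8)
```

Note how the exact pair (Z_{q_k}, X) of Algorithm 2 has been replaced by (GetZ(), Simulate()) throughout; for M → ∞ the step recovers the fully adapted filter.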
15 Nested SMC (4)
Theorem 1: Assume Q satisfies condition (A1). Then the samples generated by nested SMC satisfy
N^{1/2} ( (1/N) Σ_{i=1}^N f(X_{1:k}^i) − π_k(f) ) →_D N(0, Σ_k^M(f)),
where →_D denotes convergence in distribution.
As long as (q̂.GetZ(), q̂.Simulate()) is properly weighted, the expectation estimated by nested SMC converges to the exact expectation π_k(f) as N increases.
16 Outline
1 Introduction
2 Review of Sequential Monte Carlo (SMC)
3 Nested SMC
4 Nesting of Nested SMC
5 Experiments
17 Nesting of Nested SMC (1)
Z_{π_k} can be estimated as Ẑ_{π_k} = Ẑ_{π_{k-1}} · (1/N) Σ_{i=1}^N Ẑ_{q_k}^i, where Ẑ_{q_k}^i = q̂^i.GetZ().
Theorem 2: The pair (X_{1:k}^i, Ẑ_{π_k}) is properly weighted w.r.t. π_k(·), where X_{1:k}^i is drawn with Algorithm 3.
Implication: using nested SMC, properly weighted samples w.r.t. a 2D MRF π_k(·) can be obtained from properly weighted samples w.r.t. a 1D MRF q_k(·).
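The product-of-average-weights estimator Ẑ_{π_k} can be checked against an exactly solvable case. The sketch below runs a bootstrap SMC on an assumed two-state HMM (all numbers illustrative) and compares Ẑ with the likelihood p(y_{1:k}) from the forward algorithm:

```python
import numpy as np

rng = np.random.default_rng(6)

# Assumed two-state HMM, a stand-in for the paper's models
H = np.array([[0.8, 0.2], [0.3, 0.7]])   # transition h(x_k | x_{k-1})
G = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission g(y_k | x_k)
ys = [0, 1, 0, 0, 1]

# Exact normalising constant Z = p(y_{1:k}) via the forward algorithm
alpha = np.array([0.5, 0.5])
Z_exact = 1.0
for y in ys:
    alpha = (alpha @ H) * G[:, y]
    Z_exact *= alpha.sum()
    alpha = alpha / alpha.sum()

# Bootstrap SMC: Zhat_k = Zhat_{k-1} * (1/N) sum_i w_k^i
N = 50_000
x = rng.integers(0, 2, size=N)
Z_hat = 1.0
for y in ys:
    x = (rng.random(N) > H[x, 0]).astype(int)    # propagate through h
    w = G[x, y]                                   # weight by g(y | x)
    Z_hat *= w.mean()                             # accumulate the average weight
    x = x[rng.choice(N, size=N, p=w / w.sum())]   # multinomial resampling
```

The same running-product structure is what makes Ẑ_{π_k} usable as a GetZ() return value one level up, which is the nesting idea of Theorem 2.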
18 Nesting of Nested SMC (2)
(q̂^i.Simulate, q̂^i.GetZ) is properly weighted w.r.t. the 1D MRF q(·)
⇒ (X_{1:k}^i, Ẑ_{π_k}) is properly weighted w.r.t. the 2D MRF π(·)
⇒ samples can in turn be drawn from a 3D MRF.
Conclusion: one nested SMC sampler can be used as the proposal distribution for another nested SMC sampler targeting a higher-dimensional distribution.
19 Outline
1 Introduction
2 Review of Sequential Monte Carlo (SMC)
3 Nested SMC
4 Nesting of Nested SMC
5 Experiments
20 Experiments (1)
1) Gaussian State Space Model
(Figure: Gaussian state space model in the form of a 2D MRF of size d × t.)
The transition and measurement pdfs are all Gaussian; a two-level nested SMC is used.
21 Experiments (2)
Data are simulated for k = 1, ..., 100 with d = dim(x_k) ∈ {50, 100, 200}; the exact filtering marginals are computed with the Kalman filter. NSMC is compared with the space-time particle filter (ST-PF) and the standard (bootstrap) PF, evaluated by the effective sample size (ESS; see e.g. Fearnhead et al., 2010b).
(Figure 2: median ESS over dimension and 15%-85% percentiles, based on 100 independent runs, with N = 500 and M = 2d.)
22 Experiments (3)
2) Non-Gaussian State Space Model
- The transition pdf p(x_k | x_{k-1}) is a localised Gaussian mixture
- The measurement pdf p(y_k | x_k) is t-distributed
Beskos et al. (2014a) report improvements for ST-PF over both the bootstrap PF and the block PF of Rebeschini & van Handel (2015). Here N = M = 100 is used for both ST-PF and NSMC (the special structure of this model implies that there is no significant computational overhead).
(Figure 3: median ESS and 15%-85% percentiles; NSMC vs. ST-PF vs. bootstrap PF.)
23 Experiments (4)
3) Spatio-Temporal Model: Drought Detection
- Hidden states: 0 (normal) or 1 (drought) at different locations and years
- Measurements: precipitation
24 Experiments (5)
(Figure 4: number of locations per year with estimated p(x_{k,i} = 1) exceeding the thresholds 0.5, 0.7, and 0.9, for the North America and Sahel regions, together with estimates of p(x_{k,i} = 1) for North America in 1939-1941. All results for N = 100.)
More informationKernel adaptive Sequential Monte Carlo
Kernel adaptive Sequential Monte Carlo Ingmar Schuster (Paris Dauphine) Heiko Strathmann (University College London) Brooks Paige (Oxford) Dino Sejdinovic (Oxford) December 7, 2015 1 / 36 Section 1 Outline
More informationLecture 2: From Linear Regression to Kalman Filter and Beyond
Lecture 2: From Linear Regression to Kalman Filter and Beyond January 18, 2017 Contents 1 Batch and Recursive Estimation 2 Towards Bayesian Filtering 3 Kalman Filter and Bayesian Filtering and Smoothing
More informationThe Unscented Particle Filter
The Unscented Particle Filter Rudolph van der Merwe (OGI) Nando de Freitas (UC Bereley) Arnaud Doucet (Cambridge University) Eric Wan (OGI) Outline Optimal Estimation & Filtering Optimal Recursive Bayesian
More informationThe Kalman Filter ImPr Talk
The Kalman Filter ImPr Talk Ged Ridgway Centre for Medical Image Computing November, 2006 Outline What is the Kalman Filter? State Space Models Kalman Filter Overview Bayesian Updating of Estimates Kalman
More informationLecture 6: Bayesian Inference in SDE Models
Lecture 6: Bayesian Inference in SDE Models Bayesian Filtering and Smoothing Point of View Simo Särkkä Aalto University Simo Särkkä (Aalto) Lecture 6: Bayesian Inference in SDEs 1 / 45 Contents 1 SDEs
More informationLecture Particle Filters
FMS161/MASM18 Financial Statistics November 29, 2010 Monte Carlo filters The filter recursions could only be solved for HMMs and for linear, Gaussian models. Idea: Approximate any model with a HMM. Replace
More informationMarkov Chain Monte Carlo (MCMC)
School of Computer Science 10-708 Probabilistic Graphical Models Markov Chain Monte Carlo (MCMC) Readings: MacKay Ch. 29 Jordan Ch. 21 Matt Gormley Lecture 16 March 14, 2016 1 Homework 2 Housekeeping Due
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods
Prof. Daniel Cremers 11. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods: Markov Chain Monte Carlo
Group Prof. Daniel Cremers 11. Sampling Methods: Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative
More informationIN particle filter (PF) applications, knowledge of the computational
Complexity Analysis of the Marginalized Particle Filter Rickard Karlsson, Thomas Schön and Fredrik Gustafsson, Member IEEE Abstract In this paper the computational complexity of the marginalized particle
More informationLayered Adaptive Importance Sampling
Noname manuscript No (will be inserted by the editor) Layered Adaptive Importance Sampling L Martino V Elvira D Luengo J Corander Received: date / Accepted: date Abstract Monte Carlo methods represent
More informationExpectation propagation for signal detection in flat-fading channels
Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA
More informationSystem identification and sensor fusion in dynamical systems. Thomas Schön Division of Systems and Control, Uppsala University, Sweden.
System identification and sensor fusion in dynamical systems Thomas Schön Division of Systems and Control, Uppsala University, Sweden. The system identification and sensor fusion problem Inertial sensors
More information19 : Slice Sampling and HMC
10-708: Probabilistic Graphical Models 10-708, Spring 2018 19 : Slice Sampling and HMC Lecturer: Kayhan Batmanghelich Scribes: Boxiang Lyu 1 MCMC (Auxiliary Variables Methods) In inference, we are often
More informationLearning of dynamical systems
Learning of dynamical systems Particle filters and Markov chain methods Thomas B. Schön and Fredrik Lindsten c Draft date August 23, 2017 2 Contents 1 Introduction 3 1.1 A few words for readers of the
More informationNonlinear Filtering. With Polynomial Chaos. Raktim Bhattacharya. Aerospace Engineering, Texas A&M University uq.tamu.edu
Nonlinear Filtering With Polynomial Chaos Raktim Bhattacharya Aerospace Engineering, Texas A&M University uq.tamu.edu Nonlinear Filtering with PC Problem Setup. Dynamics: ẋ = f(x, ) Sensor Model: ỹ = h(x)
More informationSAMPLING ALGORITHMS. In general. Inference in Bayesian models
SAMPLING ALGORITHMS SAMPLING ALGORITHMS In general A sampling algorithm is an algorithm that outputs samples x 1, x 2,... from a given distribution P or density p. Sampling algorithms can for example be
More informationNONLINEAR STATISTICAL SIGNAL PROCESSING: A PARTI- CLE FILTERING APPROACH
NONLINEAR STATISTICAL SIGNAL PROCESSING: A PARTI- CLE FILTERING APPROACH J. V. Candy (tsoftware@aol.com) University of California, Lawrence Livermore National Lab. & Santa Barbara Livermore CA 94551 USA
More informationCalibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods
Calibration of Stochastic Volatility Models using Particle Markov Chain Monte Carlo Methods Jonas Hallgren 1 1 Department of Mathematics KTH Royal Institute of Technology Stockholm, Sweden BFS 2012 June
More informationComputer Vision Group Prof. Daniel Cremers. 10a. Markov Chain Monte Carlo
Group Prof. Daniel Cremers 10a. Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative is Markov Chain
More informationState-Space Methods for Inferring Spike Trains from Calcium Imaging
State-Space Methods for Inferring Spike Trains from Calcium Imaging Joshua Vogelstein Johns Hopkins April 23, 2009 Joshua Vogelstein (Johns Hopkins) State-Space Calcium Imaging April 23, 2009 1 / 78 Outline
More informationLecture Particle Filters. Magnus Wiktorsson
Lecture Particle Filters Magnus Wiktorsson Monte Carlo filters The filter recursions could only be solved for HMMs and for linear, Gaussian models. Idea: Approximate any model with a HMM. Replace p(x)
More information17 : Markov Chain Monte Carlo
10-708: Probabilistic Graphical Models, Spring 2015 17 : Markov Chain Monte Carlo Lecturer: Eric P. Xing Scribes: Heran Lin, Bin Deng, Yun Huang 1 Review of Monte Carlo Methods 1.1 Overview Monte Carlo
More informationTerrain Navigation Using the Ambient Magnetic Field as a Map
Terrain Navigation Using the Ambient Magnetic Field as a Map Aalto University IndoorAtlas Ltd. August 30, 017 In collaboration with M. Kok, N. Wahlström, T. B. Schön, J. Kannala, E. Rahtu, and S. Särkkä
More informationMCMC and Gibbs Sampling. Kayhan Batmanghelich
MCMC and Gibbs Sampling Kayhan Batmanghelich 1 Approaches to inference l Exact inference algorithms l l l The elimination algorithm Message-passing algorithm (sum-product, belief propagation) The junction
More informationQuantitative Biology II Lecture 4: Variational Methods
10 th March 2015 Quantitative Biology II Lecture 4: Variational Methods Gurinder Singh Mickey Atwal Center for Quantitative Biology Cold Spring Harbor Laboratory Image credit: Mike West Summary Approximate
More informationAdaptive Monte Carlo methods
Adaptive Monte Carlo methods Jean-Michel Marin Projet Select, INRIA Futurs, Université Paris-Sud joint with Randal Douc (École Polytechnique), Arnaud Guillin (Université de Marseille) and Christian Robert
More informationComputer Intensive Methods in Mathematical Statistics
Computer Intensive Methods in Mathematical Statistics Department of mathematics johawes@kth.se Lecture 5 Sequential Monte Carlo methods I 31 March 2017 Computer Intensive Methods (1) Plan of today s lecture
More information27 : Distributed Monte Carlo Markov Chain. 1 Recap of MCMC and Naive Parallel Gibbs Sampling
10-708: Probabilistic Graphical Models 10-708, Spring 2014 27 : Distributed Monte Carlo Markov Chain Lecturer: Eric P. Xing Scribes: Pengtao Xie, Khoa Luu In this scribe, we are going to review the Parallel
More information