The aim

Part 3 - Nonlinear state inference using sequential Monte Carlo

Division of Automatic Control, Linköping University, Sweden

The aim in Part 3 is to introduce the particle filter and the particle smoother. This will be done by first introducing the importance sampler and then showing how this foundation can be used to derive a first working particle filter.

Outline

1. Basic sampling methods
   a) Rejection sampling
   b) Importance sampling
2. Particle filtering
3. Particle smoothing
   a) Forward filtering backward smoothing (FFBSm)
   b) Forward filtering backward simulation (FFBSi)

Rejection sampling (I/VII)

Rejection sampling is a Monte Carlo method that produces i.i.d. samples from a target distribution π(z) = π̃(z)/C_π, where π̃(z) can be evaluated and C_π is a normalization constant.

Key idea: Generate random numbers uniformly from the area under the graph of the target distribution π(z).

This is just as hard as the original problem, but what if...

Rejection sampling (II/VII)

Generate a sample z̃ from a proposal distribution q(z) and a sample u ~ U[0, 1]. The sample z̃ is then an i.i.d. sample from the target if u ≤ π(z̃)/(B q(z̃)).

(Figure: the graphs of π(z) and B q(z); the point (z̃, u B q(z̃)) is uniform under B q(z) and is accepted when it also falls under π(z).)

Rejection sampling (III/VII)

Assumptions:
1. It is easy to sample from q(z).
2. There exists a constant B such that π(z) ≤ B q(z), for all z in Z.
3. The support of q(z) includes the support of π(z), i.e., q(z) > 0 when π(z) > 0.

Algorithm 1 Rejection sampling (RS)
1. Sample z̃ ~ q(z).
2. Sample u ~ U[0, 1].
3. If u ≤ π(z̃)/(B q(z̃)), accept z̃ as a sample from π(z) and go to 1.
4. Otherwise, reject z̃ and go to 1.

Rejection sampling (IV/VII)

The procedure can be used with multivariate densities in the same way. The rejection rate depends on B, so choose B as small as possible while still satisfying π(z) ≤ B q(z) for all z in Z. Choosing a good proposal distribution q(z) is very important. Rejection sampling is used to construct fast particle smoothers via backward simulation.

Rejection sampling (V/VII)

Task: Generate i.i.d. samples from the following distribution,

π(z) = (1/C_π) e^{-z²} (sin²(6z) + cos²(z) sin²(4z) + 1).

Solution: Use rejection sampling with the proposal q(z) = N(z | 0, 1) and a constant B chosen such that π(z) ≤ B q(z) for all z.

(Figure: π(z) and B q(z), with a proposed sample z̃ and the point u B q(z̃).)
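As a concrete illustration, here is a minimal sketch of Algorithm 1 in Python/NumPy for the task above. The exact constants in the target and the value of B used on the slide are not recoverable from the transcription, so the functional form is an assumption and B is found numerically on a grid.

import numpy as np

def pi_unnorm(z):
    # Unnormalized target pi-tilde(z); functional form reconstructed from the
    # slide, exact constants assumed.
    return np.exp(-z**2) * (np.sin(6 * z)**2 + np.cos(z)**2 * np.sin(4 * z)**2 + 1)

def q_pdf(z):
    # Proposal density q(z) = N(z | 0, 1).
    return np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)

def rejection_sampler(n, B, rng):
    samples = []
    while len(samples) < n:
        z = rng.standard_normal()                 # 1. z ~ q(z)
        u = rng.uniform()                         # 2. u ~ U[0, 1]
        if u <= pi_unnorm(z) / (B * q_pdf(z)):    # 3. accept z as a sample from pi
            samples.append(z)
    return np.array(samples)                      # 4. ... otherwise reject and repeat

rng = np.random.default_rng(0)
grid = np.linspace(-5.0, 5.0, 10001)
B = 1.01 * np.max(pi_unnorm(grid) / q_pdf(grid))  # smallest B with pi <= B q, on a grid
z = rejection_sampler(10_000, B, rng)

A histogram of z then matches π(z) up to normalization; shrinking B towards the grid maximum raises the acceptance rate, exactly as slide (IV/VII) suggests.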

Rejection sampling (VI/VII) and (VII/VII)

(Figures: four panels with an increasing number of samples, showing rejected (red) and accepted (blue) samples.)

Importance sampling (I/VIII)

Algorithm 2 Importance sampler (IS)
1. Generate N i.i.d. samples {z^i}_{i=1}^N from the proposal distribution q(z).
2. Compute the importance weights w̃^i = π(z^i)/q(z^i), i = 1, ..., N.
3. Normalize the importance weights w^i = w̃^i / Σ_{j=1}^N w̃^j, i = 1, ..., N.

Importance sampling (II/VIII)

An alternative interpretation of IS: IS does not provide samples from the target distribution, but the samples {z^i}_{i=1}^N together with the normalized weights {w^i}_{i=1}^N provide an empirical approximation of the target distribution,

π̂(z) = Σ_{i=1}^N w^i δ_{z^i}(z).

When this approximation is inserted into I(g(z)) = ∫ g(z) π(z) dz, the resulting estimate is

Î(g(z)) = Σ_{i=1}^N w^i g(z^i).

Importance sampling (III/VIII)

Let us revisit the same problem (the scalar LGSS model used in illustrating the EM algorithm),

x_{t+1} = θ x_t + v_t,
y_t = x_t + e_t,

with (v_t, e_t)^T ~ N(0, diag(0.1, 0.1)) and p(x_1) = N(x_1 | 0, 0.1). The true parameter value is θ = 0.9. We use p(θ) = N(θ | µ_θ, σ²_θ) as prior distribution for θ.

The identification problem is to determine the parameter θ on the basis of the observations y_{1:T} and the above model, using the IS algorithm. The result will be an estimate of the posterior distribution p(θ | y_{1:T}).

Importance sampling (IV/VIII)

The importance sampler will target

π(θ) = p(θ | y_{1:T}) = p(y_{1:T} | θ) p(θ) / p(y_{1:T}) ∝ p(y_{1:T} | θ) p(θ).

Choose the proposal distribution to be the same as the prior,

q(θ) = N(θ | µ_θ, σ²_θ).

The importance weights are then computed according to

w̃^i = π(θ^i)/q(θ^i) ∝ p(y_{1:T} | θ^i), i = 1, ..., N,

i.e., the likelihood.

Importance sampling (V/VIII)

The Kalman filter straightforwardly allows us to evaluate the importance weights w̃^i = p(y_{1:T} | θ^i),

p(y_{1:T} | θ^i) = Π_{t=1}^T p(y_t | y_{1:t-1}, θ^i) = Π_{t=1}^T N(y_t | ŷ_{t|t-1}(θ^i), S_{t|t-1}(θ^i)),

ŷ_{t|t-1}(θ^i) = x̂_{t|t-1}(θ^i),
S_{t|t-1}(θ^i) = P_{t|t-1}(θ^i) + 0.1,

where x̂_{t|t-1}(θ^i) and P_{t|t-1}(θ^i) are provided by the Kalman filter.

Importance sampling (VI/VIII)

Algorithm 3 Importance sampler targeting p(θ | y_{1:T})
1. Generate N i.i.d. samples from the proposal distribution, θ^i ~ N(θ | µ_θ, σ²_θ), for i = 1, ..., N.
2. Compute the importance weights w̃^i = p(y_{1:T} | θ^i) for i = 1, ..., N.
3. Normalize the importance weights w^i = w̃^i / Σ_{j=1}^N w̃^j for i = 1, ..., N.
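A compact sketch of Algorithm 3, assuming the scalar LGSS model above with Q = R = 0.1; the Kalman filter evaluates log p(y_{1:T} | θ^i), which forms the unnormalized importance weight. The prior hyperparameters µ_θ = 0.5, σ_θ = 0.5 in the usage example are hypothetical, since the slide does not state them.

import numpy as np

def loglik(theta, y, Q=0.1, R=0.1, P1=0.1):
    # Kalman filter evaluation of log p(y_{1:T} | theta) for the scalar LGSS model.
    x, P, ll = 0.0, P1, 0.0
    for yt in y:
        S = P + R                                  # S_{t|t-1} = P_{t|t-1} + R
        ll += -0.5 * (np.log(2 * np.pi * S) + (yt - x)**2 / S)
        K = P / S                                  # Kalman gain
        x, P = x + K * (yt - x), (1 - K) * P       # measurement update
        x, P = theta * x, theta**2 * P + Q         # time update
    return ll

def is_posterior(y, N, mu_theta, sigma_theta, rng):
    theta = rng.normal(mu_theta, sigma_theta, size=N)   # 1. theta^i ~ q(theta) = prior
    logw = np.array([loglik(th, y) for th in theta])    # 2. w^i = p(y_{1:T} | theta^i)
    w = np.exp(logw - logw.max())                       # 3. normalize (shifted in the
    return theta, w / w.sum()                           #    log domain for stability)

# Simulate T = 100 measurements from the true system (theta = 0.9) and run IS.
rng = np.random.default_rng(0)
x, y = np.sqrt(0.1) * rng.standard_normal(), np.empty(100)
for t in range(100):
    y[t] = x + np.sqrt(0.1) * rng.standard_normal()
    x = 0.9 * x + np.sqrt(0.1) * rng.standard_normal()
theta, w = is_posterior(y, 1000, 0.5, 0.5, rng)
print("posterior mean of theta:", np.sum(w * theta))

The weighted pairs (θ^i, w^i) form the empirical posterior approximation p̂(θ | y_{1:T}) = Σ_i w^i δ_{θ^i}(θ) shown in the following figures.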

Importance sampling (VII/VIII) and (VIII/VIII)

(Figures: using a fixed record of measurements y_{1:T}, estimates of p(θ | y_{1:T}) for increasing numbers of samples N and for different proposal distributions q(θ) = N(θ | µ_θ, σ²_θ). Note the different proposal distributions used above.)

The importance of a good proposal density

(Figure: two experiments with the same number of samples N but two different Gaussian proposals q_1(x) and q_2(x) (dashed curves); a poorly matched proposal gives a much worse approximation.)

Lesson learned (again): It is very important to be careful in selecting the importance density.

Using IS for our purposes

A first attempt to solve the nonlinear filtering problem, which amounts to computing p(x_t | y_{1:t}) for

x_{t+1} | x_t ~ f(x_{t+1} | x_t),
y_t | x_t ~ h(y_t | x_t),
x_1 ~ µ(x_1).

The solution is given by

p(x_t | y_{1:t}) = h(y_t | x_t) p(x_t | y_{1:t-1}) / p(y_t | y_{1:t-1}),
p(x_t | y_{1:t-1}) = ∫ f(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1}) dx_{t-1}.

Relevant idea: Try to solve this using importance sampling!

The IS algorithm again

Algorithm 4 Importance sampler (IS)
1. Generate N i.i.d. samples {z^i}_{i=1}^N from the proposal distribution q(z).
2. Compute the importance weights w̃^i = π(z^i)/q(z^i), i = 1, ..., N.
3. Normalize the importance weights w^i = w̃^i / Σ_{j=1}^N w̃^j, i = 1, ..., N.

Sequential sampling from the proposal

We have just motivated the following proposal (which is just one of many possible choices),

q(x_{1:t}) = µ(x_1) Π_{k=2}^t f(x_k | x_{k-1}).

In practice this means:

At time k = 1 we sample x^i_1 ~ µ(x_1).
At each time k = 2, ..., t we sample x^i_k ~ f(x_k | x^i_{k-1}).

This completes step one of the importance sampler. What about sequential computation of the importance weights?

A sequential importance sampler (SIS)

Algorithm 5 SIS targeting p(x_{1:t} | y_{1:t})
1. Generate initial samples x^i_1 ~ µ(x_1), set the importance weights w^i_0 = 1/N, i = 1, ..., N, and set k = 1.
2. for k = 1 to t do
   (a) Compute the importance weights w̃^i_k = p(y_k | x^i_k) w^i_{k-1}.
   (b) Normalize the importance weights w^i_k = w̃^i_k / Σ_{j=1}^N w̃^j_k and store the new weights {w^i_{1:k}}_{i=1}^N = {w^i_{1:k-1}, w^i_k}_{i=1}^N.
   (c) Generate i.i.d. samples from the proposal distribution, x^i_{k+1} ~ f(x_{k+1} | x^i_k), and store the new samples {x^i_{1:k+1}}_{i=1}^N = {x^i_{1:k}, x^i_{k+1}}_{i=1}^N.

SIS example (I/III)

Consider the following LGSS model,

x_{t+1} = 0.7 x_t + v_t, v_t ~ N(0, 0.1),
y_t = 0.5 x_t + e_t, e_t ~ N(0, 0.1),
p(x_1) = N(x_1 | 0, 0.1).

We will now make use of the SIS algorithm to compute an approximation of the filtering distribution,

p̂(x_t | y_{1:t}) = Σ_{i=1}^N w^i_t δ_{x^i_t}(x_t).

Study:
- The point estimate x̂_{t|t} = ∫ x_t p(x_t | y_{1:t}) dx_t ≈ Σ_{i=1}^N w^i_t x^i_t.
- The weights w^i_t.
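A minimal sketch of Algorithm 5 for the LGSS example, assuming the bootstrap proposal f(x_{k+1} | x_k); running it makes the weight degeneracy discussed next directly visible.

import numpy as np

def sis(y, N, rng, a=0.7, c=0.5, Q=0.1, R=0.1, P1=0.1):
    # Sequential importance sampler with the proposal x_{k+1} ~ f(x_{k+1} | x_k).
    x = np.sqrt(P1) * rng.standard_normal(N)        # x_1^i ~ mu(x_1)
    w = np.full(N, 1.0 / N)                         # w_0^i = 1/N
    xf = np.empty(len(y))
    for k, yk in enumerate(y):
        w = w * np.exp(-0.5 * (yk - c * x)**2 / R)  # (a) w_k^i prop. to p(y_k | x_k^i) w_{k-1}^i
        w = w / w.sum()                             # (b) normalize
        xf[k] = np.sum(w * x)                       # point estimate x-hat_{k|k}
        x = a * x + np.sqrt(Q) * rng.standard_normal(N)   # (c) x_{k+1}^i ~ f(. | x_k^i)
    return xf, w

rng = np.random.default_rng(0)
x, y = np.sqrt(0.1) * rng.standard_normal(), np.empty(100)
for t in range(100):
    y[t] = 0.5 * x + np.sqrt(0.1) * rng.standard_normal()
    x = 0.7 * x + np.sqrt(0.1) * rng.standard_normal()
xf, w = sis(y, 500, rng)
print("largest normalized weight at t = 100:", w.max())  # typically close to 1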

SIS example (II/III)

(Figures: several realisations of data; the RMSE of the SIS point estimate relative to the Kalman filter, RMSE(x̂^SIS_{t|t} - x̂^KF_{t|t}), for increasing numbers of particles N, compared with the true filter distribution from the KF.)

SIS example (III/III)

(Figures: illustration of the problem in the weights; the importance weights at t = 1 and histograms of the weights at later times, showing the normalized weight collapsing onto a single particle.)

Very important question: How do we resolve this weight degeneracy problem?

Solving the weight degeneracy problem

The SIS representation of the target density is

π̂(z) = Σ_{i=1}^N w^i δ_{z^i}(z).

Idea: Remove the weights from the representation! This of course leads us to the next question: how?

Resampling (I/II)

An unweighted representation of the target density can be created by resampling with replacement. This is done by generating a new sample z̃^i for each i = 1, ..., N, where

P(z̃^i = z^j) = w^j, j = 1, ..., N.

The resulting unweighted representation is

π̃(z) = (1/N) Σ_{i=1}^N δ_{z̃^i}(z).
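A sketch of resampling with replacement, assuming NumPy; it implements P(z̃^i = z^j) = w^j via the cumulative sum of the weights and uniform random numbers, the mechanics illustrated on the next slide.

import numpy as np

def resample(x, w, rng):
    # Resampling with replacement: P(x_new^i = x^j) = w^j for j = 1, ..., N.
    cdf = np.cumsum(w)                  # 1. cumulative sum of the weights
    cdf[-1] = 1.0                       # guard against round-off
    u = rng.uniform(size=len(w))        # 2. u^i ~ U[0, 1]
    idx = np.searchsorted(cdf, u)       # segment of the cdf each u^i falls in
    return x[idx], np.full(len(w), 1.0 / len(w))

This is the multinomial version; systematic resampling, which uses a single u ~ U[0, 1/N] plus a deterministic grid, is a common lower-variance alternative.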

Resampling (II/II)

Illustrating how resampling with replacement works (using 7 particles):
1. Compute the cumulative sum of the weights.
2. Generate u ~ U[0, 1].

(Figure: the cumulative sum of the weights versus particle index; three new samples are generated, two of them copies of particle 4.)

Sampling importance resampling (SIR)

Algorithm 6 Sampling importance resampler (SIR)
1. Generate N i.i.d. samples {z̃^i}_{i=1}^N from the proposal q(z).
2. Compute the importance weights w̃^i = π(z̃^i)/q(z̃^i), i = 1, ..., N.
3. Normalize the importance weights w^i = w̃^i / Σ_{j=1}^N w̃^j, i = 1, ..., N.
4. For each i = 1, ..., N generate a new sample z^i, where P(z^i = z̃^j) = w^j, j = 1, ..., N.

Note that steps 1-3 correspond to the importance sampler.

Revisit the nonlinear filtering problem

Algorithm 7 A first particle filter targeting p(x_{1:t} | y_{1:t})
1. Generate initial samples x^i_1 ~ µ(x_1), set the importance weights w^i_0 = 1/N, i = 1, ..., N, and set k = 1.
2. for k = 1 to t do
   (a) Compute the importance weights w̃^i_k = p(y_k | x^i_k) w^i_{k-1}.
   (b) Normalize the importance weights w^i_k = w̃^i_k / Σ_{j=1}^N w̃^j_k.
   (c) For each i = 1, ..., N generate a new sample x̃^i_k, where P(x̃^i_k = x^j_k) = w^j_k, j = 1, ..., N.
   (d) Generate i.i.d. samples from the proposal distribution, x^i_{k+1} ~ f(x_{k+1} | x̃^i_k).

PF example (I/V)

Consider the same LGSS model used in illustrating the SIS algorithm,

x_{t+1} = 0.7 x_t + v_t, v_t ~ N(0, 0.1),
y_t = 0.5 x_t + e_t, e_t ~ N(0, 0.1),
p(x_1) = N(x_1 | 0, 0.1).

We will now make use of SIS + resampling (the particle filter) to compute an approximation of the filtering distribution,

p̂(x_t | y_{1:t}) = Σ_{i=1}^N w^i_t δ_{x^i_t}(x_t).

Study:
- The point estimate x̂_{t|t} = ∫ x_t p(x_t | y_{1:t}) dx_t ≈ Σ_{i=1}^N w^i_t x^i_t.
- The weights w^i_t.
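Combining SIS with the resampling step gives a sketch of Algorithm 7, a bootstrap particle filter for the same LGSS model, reusing NumPy and the resample function from the sketch above.

def particle_filter(y, N, rng, a=0.7, c=0.5, Q=0.1, R=0.1, P1=0.1):
    # Bootstrap particle filter: weight, normalize, resample, propagate.
    x = np.sqrt(P1) * rng.standard_normal(N)        # x_1^i ~ mu(x_1)
    xf = np.empty(len(y))
    for k, yk in enumerate(y):
        w = np.exp(-0.5 * (yk - c * x)**2 / R)      # (a) w_k^i = p(y_k | x_k^i) w_{k-1}^i,
        w = w / w.sum()                             # (b) normalize  (w_{k-1}^i = 1/N)
        xf[k] = np.sum(w * x)                       # point estimate before resampling
        x, _ = resample(x, w, rng)                  # (c) resample with replacement
        x = a * x + np.sqrt(Q) * rng.standard_normal(N)  # (d) propagate
    return xf

Because resampling resets all weights to 1/N, the factor w^i_{k-1} drops out of step (a); this is exactly what keeps the weights balanced in the comparison below.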

PF example (II/V) and (III/V)

Same setting as before, exactly the same data.

(Figures: comparison with the true filter distribution from the KF, RMSE(x̂^PF_{t|t} - x̂^KF_{t|t}), and histograms of the importance weights for SIS and for the PF. Note the different scaling!)

PF example (IV/V) and (V/V)

(Figures: the importance weights over time for SIS and for the PF; the SIS weights degenerate while the PF weights remain balanced. Note the different scaling!)

An LGSS example (I/II)

Whenever you are working on a nonlinear estimation algorithm, always make sure that it solves the linear special case first.

Consider the following LGSS model (a simple one-dimensional positioning example),

[p_{t+1}]   [1  T_s  T_s²/2] [p_t]   [T_s³/6]
[v_{t+1}] = [0   1    T_s  ] [v_t] + [T_s²/2] v_t,  v_t ~ N(0, Q),
[a_{t+1}]   [0   0     1   ] [a_t]   [T_s   ]

y_t = (1 0 0) (p_t v_t a_t)^T + e_t,  e_t ~ N(0, R).

The KF provides the true filtering distribution, which implies that we can compare the PF to the truth in this case.

An LGSS example (II/II)

(Figures: the errors p̂^PF - p̂^KF (m) and v̂^PF - v̂^KF (m/s) over time (s), for a small and a large number of particles.)

The PF estimate converges as the number of particles tends to infinity.

Xiao-Li Hu, Thomas B. Schön and Lennart Ljung. A basic convergence result for particle filtering. IEEE Transactions on Signal Processing, 56(4):1337-1348, April 2008. [pdf]

D. Crisan and A. Doucet. A survey of convergence results on particle filtering methods for practitioners. IEEE Transactions on Signal Processing, 50(3):736-746, 2002. [pdf]

A nonlinear example (I/II)

Consider the following SSM (a standard example in the PF literature),

x_{t+1} = x_t/2 + 25 x_t/(1 + x_t²) + 8 cos(1.2 t) + v_t,  v_t ~ N(0, 0.1),
y_t = x_t²/20 + e_t,  e_t ~ N(0, 0.1).

What is tricky with this model?

The best (only?) way of really understanding something is to implement it yourself.

A nonlinear example (II/II)

(Figures: the true state (gray) and the PF estimate (black) over time; the PF estimate of the filtering pdf p(x_t | y_{1:t}) at one time instant.)
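In the spirit of the "implement it yourself" advice above, here is a sketch of the bootstrap PF adapted to this model, reusing NumPy and the resample function from earlier. The noise variances and the N(0, 1) initial distribution are assumptions, since the slide's numerical values did not survive transcription.

def pf_nonlinear(y, N, rng, Q=0.1, R=0.1):
    # Bootstrap PF for the standard nonlinear benchmark model.
    x = rng.standard_normal(N)                        # x_1^i ~ N(0, 1) (assumed)
    xf = np.empty(len(y))
    for t, yt in enumerate(y):
        w = np.exp(-0.5 * (yt - x**2 / 20)**2 / R)    # p(y_t | x_t^i)
        w = w / w.sum()
        xf[t] = np.sum(w * x)
        x, _ = resample(x, w, rng)
        x = (x / 2 + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * (t + 1))
             + np.sqrt(Q) * rng.standard_normal(N))   # propagate through the dynamics
    return xf

Note that the measurement x_t²/20 is uninformative about the sign of x_t, which is one reason the filtering density in the figure is multimodal.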

Nonlinear example - indoor positioning (I/III)

Aim: Compute the position of a person moving around indoors using sensors located in an ID badge and a map.

The sensors (IMU and radio) and the DSP are mounted inside an ID badge. The badge carries an IEEE 802.15.4 radio, and a coordinator equipped with a radio chip and an Ethernet chip serves as a base station for the Beebadges.

(Figures: the Beebadge worn by a man; the inside of the ID badge; the two main components of the radio network, a Beebadge and a coordinator.)

Nonlinear example - indoor positioning (II/III)

(Figures: probability interpretation of the map of Xdin's office; the bright areas are rooms and corridors, i.e., walkable space, and the map is turned into a relative probability density over positions.)

Nonlinear example - indoor positioning (III/III)

(Figures: an estimated trajectory at Xdin's office with the particle cloud visualized at a particular instance, the size of a circle indicating the weight of that particle; a scenario where the filter has not converged yet, the spread in hypotheses caused by the large coverage of a coordinator.)

Show movies.

This is work by Johan Kihlberg and Simon Tegelid in their MSc thesis entitled Map Aided Indoor Positioning, which is available for download. [pdf]

Sequential Monte Carlo (SMC) - the more general case

Assume that at time t we have a particle system {x^i_{1:t}, w^i_t}_{i=1}^N approximating the target distribution p(x_{1:t} | y_{1:t}) according to

p̂(x_{1:t} | y_{1:t}) = Σ_{i=1}^N w^i_t δ_{x^i_{1:t}}(x_{1:t}).

The three steps in the SMC algorithm:
1. Resampling: {x^i_{1:t}, w^i_t}_{i=1}^N → {x̃^i_{1:t}, 1/N}_{i=1}^N.
2. Propose new samples: x^i_{t+1} ~ Q_t(x_{t+1} | x̃^i_{1:t}), i = 1, ..., N.
3. Weighting: w^i_{t+1} = W_t(x^i_{t+1}, x̃^i_{1:t}), i = 1, ..., N.

SMC is a combination of SIS and resampling.

The SMC algorithm

Algorithm 8 Sequential Monte Carlo (SMC)
1. Initialise by sampling x^i_1 ~ Q_1(x_1), setting w^i_1 = W_1(x^i_1), and setting t = 1.
2. for t = 1 to T do
   (a) Resample: {x^i_{1:t}, w^i_t}_{i=1}^N → {x̃^i_{1:t}, 1/N}_{i=1}^N.
   (b) Propose new samples: x^i_{t+1} ~ Q_t(x_{t+1} | x̃^i_{1:t}).
   (c) Weighting: w^i_{t+1} = W_t(x^i_{t+1}, x̃^i_{1:t}).

There is a myriad of possible design choices available, some of them carrying their own name. The important thing is to understand the basic principles in action; the rest is just design choices.

SMC reformulated

Notation: w_t ≜ {w^1_t, ..., w^N_t}.

Algorithm 9 Sequential Monte Carlo (SMC)
1. Initialise: Sample x^i_1 ~ Q_1(x_1) and set w^i_1 = W_1(x^i_1). Set t = 1.
2. For t = 2 : T do:
   (a) Resampling: a^i_t ~ R(a_t | w_{t-1}).
   (b) Sample from the proposal kernel: x^i_t ~ Q_t(x_t | x^{a^i_t}_{1:t-1}) and set x^i_{1:t} = {x^{a^i_t}_{1:t-1}, x^i_t}.
   (c) Weighting: w^i_t = W_t(x^i_t, x^{a^i_t}_{1:t-1}).

Here a^i_t denotes the index of the parent/ancestor at time t-1 of particle x^i_t.

SMC as a random number generator

Just as for any algorithm used to generate random numbers, there is an underlying distribution also for the SMC sampler that encodes the probabilistic properties of the involved stochastic variables. The SMC sampler generates a realisation of the random variables X_{1:T} = {X^1_{1:T}, ..., X^N_{1:T}} and A_{2:T} = {A^1_{2:T}, ..., A^N_{2:T}}, where the pdf is

ψ(x_{1:T}, a_{2:T}) = Π_{i=1}^N Q_1(x^i_1) · Π_{t=2}^T Π_{i=1}^N R(a^i_t | w_{t-1}) Q_t(x^i_t | x^{a^i_t}_{1:t-1}),

with the factors corresponding to initialisation, resampling and proposing new particles, and it is defined on the space X^{NT} × {1, ..., N}^{N(T-1)}.

Sampling a state trajectory

The SMC algorithm produces the following approximation of the target distribution,

π̂(x_{1:T}) = Σ_{i=1}^N W^i_T δ_{X^i_{1:T}}(x_{1:T}).

We can use π̂(x_{1:T}) to produce samples of the state trajectory by sampling from

q(x_{1:T}) = E_ψ[π̂(x_{1:T})] = ∫ Σ_{i=1}^N W^i_T δ_{X^i_{1:T}}(x_{1:T}) ψ(X_{1:T}, A_{2:T}) dX_{1:T} dA_{2:T}.

Note that this integration can never be carried out explicitly. Hence, we cannot evaluate this distribution for a specific sample, but it can be used to generate samples from it.
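A sketch of Algorithm 9 with explicit ancestor indices a^i_t, assuming the bootstrap choices Q_t = f(x_t | x_{t-1}) and W_t = p(y_t | x_t) for the scalar LGSS model used earlier; storing the indices makes the genealogy, and hence the degeneracy discussed next, easy to inspect.

import numpy as np

def smc(y, N, rng, a=0.7, c=0.5, Q=0.1, R=0.1, P1=0.1):
    # Generic SMC with stored ancestor indices (bootstrap choices of Q_t, W_t).
    T = len(y)
    X = np.empty((T, N))                    # particles x_t^i
    A = np.zeros((T, N), dtype=int)         # ancestor indices a_t^i (row 0 unused)
    W = np.empty((T, N))                    # normalized weights w_t^i
    X[0] = np.sqrt(P1) * rng.standard_normal(N)         # x_1^i ~ Q_1(x_1)
    w = np.exp(-0.5 * (y[0] - c * X[0])**2 / R)         # w_1^i = W_1(x_1^i)
    W[0] = w / w.sum()
    for t in range(1, T):
        A[t] = rng.choice(N, size=N, p=W[t - 1])        # (a) a_t^i ~ R(a_t | w_{t-1})
        X[t] = a * X[t - 1, A[t]] + np.sqrt(Q) * rng.standard_normal(N)  # (b) x_t^i ~ Q_t
        w = np.exp(-0.5 * (y[t] - c * X[t])**2 / R)     # (c) w_t^i = W_t(x_t^i, ...)
        W[t] = w / w.sum()
    return X, A, W

Tracing any particle backwards through A reproduces the coalescing trajectories illustrated on the next slide.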

Illustration of the particle degeneracy problem

(Figure: the particle trajectories over time; tracing the ancestral paths backwards, all surviving trajectories coalesce, so the diversity at early times is lost.)

Implication of the particle degeneracy problem

This implies that if we are interested in the smoothing distribution p(x_{1:T} | y_{1:T}), or some of its marginals, we are forced to use different algorithms, which leads us to particle smoothers.

Particle smoother

Algorithm 10 Forward filtering/backward smoothing (FFBSm)
1. Run the PF and store the particles and their weights {(w^i_t, x^i_t)}_{i=1}^N for t = 1, ..., T.
2. Initialise the smoothed weights w^i_{T|T} = w^i_T, for i = 1, ..., N.
3. for t = T-1 to 1 do
   (a) Compute the smoothed weights

   w^i_{t|T} = w^i_t Σ_{k=1}^N w^k_{t+1|T} f(x^k_{t+1} | x^i_t) / v^k_t,
   v^k_t = Σ_{i=1}^N w^i_t f(x^k_{t+1} | x^i_t).

The resulting approximation p̂(x_t | y_{1:T}) = Σ_{i=1}^N w^i_{t|T} δ_{x^i_t}(x_t) does not suffer from degeneracy.

Sampling from the backward kernel

Key idea: Generate a backward trajectory by sampling from the following approximation of the backward kernel,

p̂(x_t | x̃^j_{t+1}, y_{1:t}) = Σ_{i=1}^N [ w^i_t f(x̃^j_{t+1} | x^i_t) / Σ_{k=1}^N w^k_t f(x̃^j_{t+1} | x^k_t) ] δ_{x^i_t}(x_t),

where the bracketed weight is denoted w^{i,j}_{t|T}, and

x̃^j_t ~ p̂(x_t | x̃^j_{t+1}, y_{1:t}),  x̃^j_{t:T} = {x̃^j_t, x̃^j_{t+1:T}}.

The resulting trajectory is a sample from the joint smoothing density p(x_{1:T} | y_{1:T}). Still computationally very costly!
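A sketch of the FFBSm backward pass, assuming the stored forward-filter output (X, W) from the SMC sketch above and the linear-Gaussian transition f(x_{t+1} | x_t) = N(x_{t+1} | 0.7 x_t, 0.1).

import numpy as np

def f_trans(x_next, x, a=0.7, Q=0.1):
    # F[k, i] = f(x_{t+1}^k | x_t^i) for the linear-Gaussian transition.
    return np.exp(-0.5 * (x_next[:, None] - a * x[None, :])**2 / Q) / np.sqrt(2 * np.pi * Q)

def ffbsm(X, W):
    # Backward pass over stored PF output X, W (both T-by-N arrays).
    T, N = X.shape
    Ws = np.empty((T, N))
    Ws[-1] = W[-1]                            # initialise: w_{T|T}^i = w_T^i
    for t in range(T - 2, -1, -1):
        F = f_trans(X[t + 1], X[t])
        v = F @ W[t]                          # v_t^k = sum_i w_t^i f(x_{t+1}^k | x_t^i)
        Ws[t] = W[t] * ((Ws[t + 1] / v) @ F)  # w_{t|T}^i per the FFBSm recursion
        Ws[t] /= Ws[t].sum()                  # guard against round-off
    return Ws

The O(N²) pairwise evaluation in f_trans per time step is precisely the cost the fast smoothers at the end of this part try to avoid.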

Sampling from the backward kernel

Algorithm 11 Backward simulator (BSi)
1. Initialise: Set b_T = m with probability w^m_T / Σ_{l=1}^N w^l_T.
2. For t = T-1 : -1 : 1 do:
   (a) Given x^{b_{t+1}}_{t+1}, compute the smoothing weights

   w^m_{t|T} = w^m_t f(x^{b_{t+1}}_{t+1} | x^m_t) / Σ_{l=1}^N w^l_t f(x^{b_{t+1}}_{t+1} | x^l_t),  m = 1, ..., N.

   (b) Set b_t = m with probability w^m_{t|T}.

Sampling from the backward kernel - in words

The BSi samples a trajectory from the empirical joint smoothing density by first choosing a particle x^m_T with probability proportional to w^m_T and then generating the ancestral path b_{T-1:1} according to the backward kernel. The resulting trajectory is x̃_{1:T} = {x^{b_1}_1, ..., x^{b_T}_T}. The computational cost is O(N²), which is too expensive to be practical.

Sampling from the backward kernel - in pictures

Recall that the main problem with using the PF to sample from p(x_{1:T} | y_{1:T}) is the particle degeneracy problem.

(Figure: two samples from p(x_{1:T} | y_{1:T}) generated using the PF; the diversity is completely gone from some time point and backwards.)

The backward simulator resolves the particle degeneracy problem.

(Figure: samples generated using the backward simulator; note that in these samples there is no loss of diversity.)
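A sketch of Algorithm 11, assuming the same stored (X, W) and the Gaussian transition as above; each call draws one backward trajectory, so M independent calls give M samples from the approximate joint smoothing density.

import numpy as np

def backward_simulator(X, W, rng, a=0.7, Q=0.1):
    # Draw one trajectory from the empirical joint smoothing density.
    T, N = X.shape
    b = np.empty(T, dtype=int)
    b[-1] = rng.choice(N, p=W[-1])           # 1. b_T = m with probability w_T^m
    for t in range(T - 2, -1, -1):
        f = np.exp(-0.5 * (X[t + 1, b[t + 1]] - a * X[t])**2 / Q)
        w = W[t] * f                         # (a) w_{t|T}^m, up to normalization
        b[t] = rng.choice(N, p=w / w.sum())  # (b) b_t = m with probability w_{t|T}^m
    return X[np.arange(T), b]                # {x_1^{b_1}, ..., x_T^{b_T}}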

Forward filtering/backward simulation (FFBSi)

Algorithm 12 Forward filtering/backward simulation (FFBSi)
1. Run the PF and store {(w^m_t, x^m_t)}_{m=1}^N for t = 1, ..., T.
2. Initialise: Set the smoothed weights w^m_{T|T} = w^m_T and set t = T-1.
3. For t = T-1 : -1 : 1 do:
4.   For j = 1 : M do:
   (a) Set v^j_t = Σ_{k=1}^N w^k_t f(x̃^j_{t+1} | x^k_t).
   (b) For m = 1, ..., N, compute w^{m,j}_{t|T} = w^m_t f(x̃^j_{t+1} | x^m_t) / v^j_t.
   (c) Sample from the empirical backward kernel, I(j) ~ Cat({w^{m,j}_{t|T}}_{m=1}^N), set x̃^j_t = x^{I(j)}_t, and append the sample, x̃^j_{t:T} = {x̃^j_t, x̃^j_{t+1:T}}.

Towards a practical particle smoother

To obtain a practical algorithm we have to reduce the computational complexity. How?

Key insight: We do not have to compute all the smoothing weights {w^{m,j}_{t|T}} to be able to sample from the empirical backward kernel.

How: Use rejection sampling to sample the indices! This means that we do not have to compute N smoothing weights just to draw one sample.

For details and MATLAB code, see Algorithm 2.4 (Fast FFBSi) in F. Lindsten. Rao-Blackwellised particle methods for inference and identification. Licentiate thesis, Linköping University, LiU-TEK-LIC-2011:19, 2011. [pdf]

Original reference: R. Douc, A. Garivier, E. Moulines and J. Olsson. Sequential Monte Carlo smoothing for general state space hidden Markov models. Annals of Applied Probability, 21(6):2109-2145, 2011. [pdf]
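A sketch of the rejection-sampling trick for drawing one backward index without computing all N smoothing weights, assuming the Gaussian transition used above, whose density is globally bounded by ρ = 1/√(2πQ). This illustrates the idea behind Fast FFBSi in the references above; it is not a transcription of their code.

import numpy as np

def sample_backward_index(x_next, x_t, w_t, rng, a=0.7, Q=0.1, max_tries=1000):
    # Rejection sampling of one backward index: target P(m) prop. to
    # w_t^m f(x_next | x_t^m), using the bound f <= rho.
    rho = 1.0 / np.sqrt(2 * np.pi * Q)
    for _ in range(max_tries):
        m = rng.choice(len(w_t), p=w_t)                # propose m with prob. w_t^m
        f = np.exp(-0.5 * (x_next - a * x_t[m])**2 / Q) / np.sqrt(2 * np.pi * Q)
        if rng.uniform() <= f / rho:                   # accept with prob. f / rho
            return m
    # Fallback after max_tries: compute all N weights exactly, as in plain FFBSi.
    w = w_t * np.exp(-0.5 * (x_next - a * x_t)**2 / Q)
    return rng.choice(len(w_t), p=w / w.sum())

An accepted proposal m has distribution proportional to w^m_t f(x_next | x^m_t), exactly the backward kernel, and in the typical case only a handful of the N weights are ever evaluated.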
