Sequential Bayesian Inference for Dynamic State Space Model Parameters


Arnab Bhattacharya and Simon Wilson

1. INTRODUCTION

Dynamic state-space models [24], consisting of a latent Markov process X_0, X_1, ... and noisy observations Y_1, Y_2, ... that are conditionally independent, are used in a wide variety of applications, e.g. wireless networks [8], object tracking [21] and econometrics [7]. The model is specified by an initial distribution p(x_0), a transition kernel p(x_t | x_{t-1}, θ) and an observation distribution p(y_t | x_t, θ). These distributions are defined in terms of a set of K static (i.e. non-time-varying) parameters θ = (θ_1, ..., θ_K). The joint model to time T is:

p(y_{1:T}, x_{0:T}, θ) = ( ∏_{t=1}^{T} p(y_t | x_t, θ) p(x_t | x_{t-1}, θ) ) p(x_0 | θ) p(θ),   (1.1)

where y_{1:T} = (y_1, ..., y_T), etc. These models are also known as hidden Markov models [20].
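To make the setup concrete, the following minimal Python sketch (ours, not the paper's) simulates one instance of the joint model (1.1): a scalar AR(1) latent state observed through a vector of noisy measurements, anticipating the example of Section 4. All function names and parameter values are illustrative.

```python
import numpy as np

def simulate_ssm(T, phi, sigma_sys, Sigma_obs, rng):
    """Simulate T steps of a linear Gaussian state space model:
    x_t = phi * x_{t-1} + eta_t,   eta_t ~ N(0, sigma_sys^2)
    y_t = x_t * 1 + eps_t,         eps_t ~ MVN(0, Sigma_obs)."""
    d = Sigma_obs.shape[0]
    x = np.zeros(T + 1)          # scalar latent AR(1) state, with x_0 = 0
    y = np.zeros((T, d))
    for t in range(1, T + 1):
        x[t] = phi * x[t - 1] + sigma_sys * rng.standard_normal()
        y[t - 1] = x[t] + rng.multivariate_normal(np.zeros(d), Sigma_obs)
    return x, y

rng = np.random.default_rng(0)
x, y = simulate_ssm(T=100, phi=0.7, sigma_sys=1.0, Sigma_obs=np.eye(3), rng=rng)
```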

In this paper we focus on sequential Bayesian estimation of the static (i.e. not time-dependent) parameter θ for these models: at time T, we observe y_T and wish to compute the posterior distribution p(θ | y_{1:T}). Further, this is to be done in a setting where on-line estimation is required, so that there is a trade-off between computation speed and accuracy. The constraint on computation time means that for some sufficiently large T it becomes infeasible to simply recompute p(θ | y_{1:T}) from scratch by Monte Carlo (e.g. MCMC), by a functional approximation method (e.g. the integrated nested Laplace approximation), by some off-line application of the particle filter [6] adapted to also infer these static parameters [1], or by maximum likelihood based filtering methods like [10]. We propose a method that accomplishes this for a fairly broad set of dynamic state space models.

As regards the static parameter estimation problem, almost no closed form solutions are available, even in linear Gaussian models [2]. [24] show that conjugate sequential updates of the state and observation variances, as well as of x_0, are available for some specific cases. Noteworthy work specific to on-line inference on static parameters, applicable to general state space models, is found in [14, 23] and [3]. [12] is a good overview of parameter estimation, covering both off-line approaches and the use of sequential Monte Carlo for on-line parameter estimation.

The rest of the chapter is organized as follows. Section 2. outlines the principle of the method. Section 3. describes one of the main issues to be resolved in order to implement the method: approximations to the one-step-ahead filtering and prediction densities. Section 4. illustrates the method and assesses its performance against alternative approaches. Section 5. contains some concluding remarks.

2. PRINCIPLE

The principle of the proposed method is based on two fundamental theoretical ideas. The first idea is that many dynamic state space models have a relatively small number of static parameters, so that in principle p(θ | y_{1:T}) can be computed and stored on a discrete grid of practical size. In a good number of situations, the parameters are time-varying processes themselves, and there are hyper-parameters that are static but unknown. This has been noted as a property of many latent models [22].

It is noted that the transition kernel of some dynamic state space models is itself defined in terms of a set of static parameters (θ_1), e.g. p(x_t | x_{t-1}, ψ_t, θ_1), and of time-varying parameters (ψ_t) that also evolve as a Markov process depending on some hyper-parameters (θ_2), e.g. p(ψ_{0:T} | θ_2) = p(ψ_0 | θ_2) ∏_{t=1}^{T} p(ψ_t | ψ_{t-1}, θ_2); for example, dynamic linear models with a trend [24]. Without loss of generality, such cases are also incorporated in our problem by considering (x_t, ψ_t) to be the latent process and by denoting the complete set of static parameters and hyper-parameters as θ.

The second significant point is that there exist useful identities for parameter estimation in latent models. One such identity, known as the basic marginal likelihood identity (BMI), is reported in [4], where it is used for the calculation of the marginal likelihood. In this paper, the following approach is taken:

p(θ | y_{1:T}) ∝ p(y_{1:T}, θ) = [ p(y_{1:T}, x_{0:T}, θ) / p(x_{0:T} | y_{1:T}, θ) ] evaluated at x_{0:T} = x*(θ),   (2.1)

valid for any x_{0:T} for which p(x_{0:T} | y_{1:T}, θ) > 0. Under the assumption that p(x_{0:T} | θ) is Gaussian, the above identity forms the basis of the integrated nested Laplace approximation (INLA) of [22]. Here a Gaussian approximation is made for the denominator term, and it is evaluated on a discrete grid of values of θ. The method also includes a way to derive such a grid intelligently. The value x_{0:T} = x*(θ) is allowed to be a function of θ, and typically x*(θ) = arg max_{x_{0:T}} p(x_{0:T} | y_{1:T}, θ) is used, which is the mean of the Gaussian approximation to p(x_{0:T} | y_{1:T}, θ). Another useful identity is:

p(θ | y_{1:T}) ∝ p(θ | y_{1:T-1}) p(y_T | y_{1:T-1}, θ).   (2.2)

In our case, we estimate p(y_T | y_{1:T-1}, θ) by the following:

p(y_T | y_{1:T-1}, θ) = [ p(y_T | x_T, θ) p(x_T | y_{1:T-1}, θ) / p(x_T | y_{1:T}, θ) ] evaluated at x_T = x*(θ);   (2.3)

as with Equation 2.1, we choose x*(θ) = arg max_{x_T} p(x_T | y_{1:T}, θ). This identity is clearly useful for sequential estimation and does not suffer from the dimension-increasing problem of Equation 2.1. Taking Equation 2.2, if prediction and filtering approximations p(x_T | y_{1:T-1}, θ) and p(x_T | y_{1:T}, θ) are available, any approximation p(θ | y_{1:T-1}) at time T-1 can be updated:

p(θ | y_{1:T}) ∝ p(θ | y_{1:T-1}) p(y_T | y_{1:T-1}, θ) = p(θ | y_{1:T-1}) [ p(y_T | x_T, θ) p(x_T | y_{1:T-1}, θ) / p(x_T | y_{1:T}, θ) ] evaluated at x_T = x*(θ),   (2.4)

where x*(θ) = arg max_{x_T} p(x_T | y_{1:T}, θ). For θ of low dimension, computing Equation 2.4 on a discrete grid offers the potential for fast sequential estimation. This suggests the following sequential estimation algorithm when approximate prediction and filtering distributions are available; it is named SINLA, or Sequential INLA. Initially, p(θ | y_{1:T}) is approximated by INLA, because it is accurate and produces a discrete grid of θ values over which p(θ | y_{1:T}) is computed. At some time T_INLA this will prove too slow to compute, and from then on the sequential update of Equation 2.4 is used. The main issue that remains to be addressed in order to implement this algorithm is the form of the approximations p(x_T | y_{1:T-1}, θ) and p(x_T | y_{1:T}, θ); it is addressed in the next section.
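A minimal sketch of the update (2.4) on a discrete grid follows. It assumes a user-supplied function kalman_step (a hypothetical name, not from the paper) that advances the filtering statistics for a given θ and returns the one-step predictive log-likelihood log p(y_T | y_{1:T-1}, θ); in the linear Gaussian case this quantity is exact, and more generally it would be computed from the ratio in Equation 2.3.

```python
import numpy as np

def sinla_update(log_post, grid, filters, y_new, kalman_step):
    """One sweep of Equation 2.4 over the parameter grid.
    log_post[i] holds log p(theta_i | y_{1:T-1}) up to a constant;
    filters[i] holds the filtering statistics (e.g. Kalman mean and
    variance) for grid point theta_i."""
    for i, theta in enumerate(grid):
        filters[i], log_pred = kalman_step(filters[i], y_new, theta)
        log_post[i] += log_pred                     # Equation 2.4 in log form
    log_post -= log_post.max()                      # guard against underflow
    log_post -= np.log(np.exp(log_post).sum())      # renormalise over the grid
    return log_post, filters
```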

3. PREDICTING AND FILTERING DENSITY APPROXIMATIONS

For the Kalman filter (where p(y_t | x_t, θ), p(x_0) and p(x_t | x_{t-1}, θ) are linear and Gaussian), the prediction and filtering distributions are Gaussian, Equation 2.2 can be computed exactly, and the INLA approximation is also exact. The means and variances of these Gaussians are sequentially updated [18]. All that we need to store are the means and variances of the prediction and filtering distributions for each θ in the grid; from these, p(θ | y_{1:T}) can be computed.
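As a concrete sketch of these recursions, the following Python function performs one prediction/update step for a scalar AR(1) state observed through a d-dimensional vector, the setting of the example in Section 4. It matches the kalman_step signature assumed above; the function and variable names are ours, not the paper's.

```python
import numpy as np
from scipy.stats import multivariate_normal

def kalman_step(state, y, theta):
    """One Kalman predict/update step for x_t = phi x_{t-1} + eta_t (scalar)
    and y_t = x_t * 1 + eps_t (d-dimensional). `state` is (m, v), the
    filtering mean and variance at t-1; `theta` is (phi, sigma2_sys,
    Sigma_obs). Returns the updated (m, v) and log p(y_t | y_{1:t-1}, theta)."""
    phi, sigma2_sys, Sigma_obs = theta
    m, v = state
    d = len(y)
    H = np.ones(d)                        # observation vector: y_t = x_t * 1 + eps_t
    # Prediction: p(x_t | y_{1:t-1}, theta) = N(m_pred, v_pred)
    m_pred = phi * m
    v_pred = phi**2 * v + sigma2_sys
    # One-step predictive density of y_t (Gaussian, exact here)
    S = v_pred * np.outer(H, H) + Sigma_obs
    log_pred = multivariate_normal.logpdf(y, mean=m_pred * H, cov=S)
    # Update: p(x_t | y_{1:t}, theta) = N(m_new, v_new)
    K = v_pred * np.linalg.solve(S, H)    # Kalman gain (d-vector)
    m_new = m_pred + K @ (y - m_pred * H)
    v_new = v_pred * (1.0 - K @ H)
    return (m_new, v_new), log_pred
```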

An equivalent definition of Equation 1.1, and one that is useful in describing some aspects of the approximations that we propose, is the general state-space representation:

y_t = f(x_t, u_t, v_t, θ);   (3.1)
x_t = g(x_{t-1}, w_t, θ),   (3.2)

where v_t and w_t are observation and system errors, and u_t are (possibly non-existent) exogenous variables. The likelihood p(y_t | x_t, θ) is specified by f and v_t, while the transition density p(x_t | x_{t-1}, θ) is specified by g and w_t. Two of several algorithms found in the literature are listed here.

3.1 Basic Approximations

When either the linearity or the Gaussian property does not hold, two extensions of the Kalman filter can be computed quickly.

Extended KF

The extended Kalman filter was one of the first generalisations of the Kalman filter to non-linear models [17]. It linearizes a non-linear model to create a Kalman filter (i.e. Gaussian) approximation to the filtering and prediction densities [9]. Hence the prediction and filtering distribution approximations are Gaussian and make use of the fast sequential updating of their means and variances.

Unscented KF

The unscented Kalman filter also produces Gaussian approximations to the filtering and prediction densities, but avoids linearising by approximately propagating the means and covariances through the non-linear function [11]. It tends to be more accurate than the extended Kalman filter, more so for strongly non-linear models. The non-linearity in the model is propagated deterministically through a small set of points, known as sigma points. Weights are associated with each point, and the means and variances of the Gaussian approximations to p(x_t | y_{1:t-1}, θ) and p(x_t | y_{1:t}, θ) are estimated as weighted means and variances of these points. The method is computationally fast, as it only requires the updating of these points, and then of the means and variances of the approximation, at each observation.
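The following sketch shows the basic (Julier and Uhlmann) unscented transform underlying the UKF: it propagates a mean and covariance through a nonlinearity g via 2n+1 sigma points. The function name is ours, and κ is a tuning constant (κ = 3 − n is a common choice); this is an illustration of the technique, not code from the paper.

```python
import numpy as np

def unscented_transform(mean, cov, g, kappa=1.0):
    """Propagate N(mean, cov) through a nonlinear map g using 2n+1
    sigma points; returns the weighted mean and covariance of g(x).
    In a UKF step, the process or observation noise covariance is
    added to the returned covariance afterwards."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)          # matrix square root
    sigma = np.vstack([mean, mean + L.T, mean - L.T])  # the 2n+1 sigma points
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)                         # weights sum to one
    gx = np.array([g(s) for s in sigma])               # push each point through g
    m = w @ gx                                         # weighted mean
    P = (gx - m).T @ (w[:, None] * (gx - m))           # weighted covariance
    return m, P
```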

4. EXAMPLES: LINEAR DYNAMIC MODEL

Our method is implemented on an example and compared to INLA (an off-line method) and a particle filter developed for on-line inference of static parameters. Average performance is measured across many replications of simulated data. To keep the computation time comparison fair, all methods were implemented in R [19]. In all these replications T_INLA had been set at 20. The model performances were compared using two different measures, as described below:

Mahalanobis distance: This is used as a measure to judge the accuracy of the estimates of the parameters of the model in a multivariate parameter space setting [15].

Computation time: The time to compute the posterior approximation is also recorded.

The statistical model in this example is assumed to be of the form:

y_t = x_t 1 + ε_t,   (4.1)
x_t = φ x_{t-1} + η_t,   (4.2)

where η_t ∼ N(0, σ²_sys), 1 is a vector of 1s and ε_t ∼ MVN(0, Σ). The covariance matrix is assumed to depend on a single unknown parameter: Σ = σ²_obs R. The entries of R are of the following type:

R_ij = 1 if i = j,
R_ij = exp(−r d(i, j)) if i ≠ j,

where r > 0 and d(i, j) is some measure of distance between nodes i and j. This defines the well-known Gaussian spatial process [16]. Data has been generated by fixing the values at φ = 0.7, σ²_obs = 1 and σ²_sys = 1, and it is assumed that y_t is of dimension 3. Further, the form of R is known and fixed. Instead of variance parameters, precision parameters are used, denoted τ_Obs and τ_Sys respectively.
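A small sketch of the covariance construction Σ = σ²_obs R described above, with Euclidean distance standing in for d(i, j); the node coordinates below are illustrative, not the paper's.

```python
import numpy as np

def spatial_cov(coords, sigma2_obs, r):
    """Build Sigma = sigma2_obs * R with R_ij = exp(-r * d(i, j)) off the
    diagonal and R_ii = 1, where d is Euclidean distance between nodes."""
    D = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    R = np.exp(-r * D)          # exp(0) = 1 on the diagonal, as required
    return sigma2_obs * R

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # illustrative node locations
Sigma = spatial_cov(coords, sigma2_obs=1.0, r=0.5)
```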

The algorithm is run with T_INLA = 20. The Kalman filter has been used for optimal filtering at each step, since the system and observation equations are linear. The above simulation has been replicated 10 times for SINLA and 5 times each for INLA and BSMC. For SINLA (and also INLA), the AR parameter φ has a normal prior with mean 0.1 and s.d. 1, truncated at −0.99 and 0.99. Both τ_Sys and τ_Obs have a gamma prior with parameters 3 and 3. Stronger priors were provided for BSMC: the prior for φ is normal with mean 0.5 and s.d. 1, again truncated at −0.99 and 0.99, and the two parameters of each gamma prior were also changed.

The approximate mode of each of the parameters, along with the approximate 95% probability bounds, is shown in Figure 1. Figure 1 shows that our method works well initially, but the performance degrades over time. The starting grid, as computed using INLA, is not sufficient to cover the support of the marginals as T increases. The mode of the posterior is at the tail of the support of the grid points, hinting at the fact that the grid needs to shift over time. For T > T_INLA, fast alternative methods of updating the grid must be determined.

Table 1 has been constructed to compare the accuracy and computation time of our algorithm with two methods, namely the on-line Bayesian sequential Monte Carlo (BSMC) of [14] and off-line Bayesian inference with INLA. The comparison is done for T = 500 and T = 1000, based on point estimates, Mahalanobis distance (MD) and computation times for each of the methods.

Table 1: Point estimates of the parameters and measures of accuracy and computation time for the three algorithms, namely SINLA, BSMC and INLA.

                       T = 500                            T = 1000
Methods    Estimates          MD    Time(s)    Estimates          MD    Time(s)
SINLA      0.68, 0.98, …      …     …          …, 0.96, …         …     …
BSMC       0.72, 1.66, …      …     …          …, 1.68, …         …     …
INLA       0.7, 0.99, …       …     …          …, 1.00, …         …     …

INLA, as one can expect, is computationally the most expensive algorithm while producing very accurate output. The particle filter has been implemented using the R package pomp [13]. The computation time of the SMC is much higher than that of the new algorithm, and it is partially dependent on the number of particles used. For this example, a large number of particles has been used, with the intention of achieving greater accuracy. But degeneracy sets in in all the examples, causing the output to be extremely inaccurate. No Mahalanobis distance values are reported for BSMC as they are extremely high, caused by a severely underestimated covariance matrix and relatively inaccurate point estimates. The point estimates also show the relative inaccuracy of BSMC compared to the other two methods.
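For completeness, a sketch of the Mahalanobis distance used in the comparison: the distance between a point estimate and the true parameter vector under a covariance matrix (for instance the estimated posterior covariance). The numbers below are purely illustrative, not the paper's results.

```python
import numpy as np

def mahalanobis(est, truth, cov):
    """Mahalanobis distance sqrt((est - truth)' cov^{-1} (est - truth))."""
    diff = np.asarray(est, dtype=float) - np.asarray(truth, dtype=float)
    return float(np.sqrt(diff @ np.linalg.solve(cov, diff)))

# Illustrative only: an estimate of (phi, tau_Sys, tau_Obs) versus the truth.
d = mahalanobis([0.65, 1.10, 0.90], [0.7, 1.0, 1.0], np.eye(3) * 0.01)
```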

5. CONCLUDING REMARKS

A method of fast sequential parameter estimation for dynamic state space models has been proposed and compared to two alternatives: the integrated nested Laplace approximation, and a particle filter. In all the examples that we consider, INLA proved the most accurate but the slowest. Our method achieved much better accuracy than the particle filter and also proved to be much faster than both alternatives. It is also worth noting that our method does not suffer from the issues of degeneracy that affect SMC [5].

The principal disadvantages of the approach, in its current form, are the following. The constant grid, computed using INLA, does not cover the true support of the posterior over time; there is the need to develop a grid-shifting algorithm, i.e. one that dynamically adds or drops grid points over time. Another crucial disadvantage of this method is that it is restricted to models with a relatively small number of fixed parameters. Finally, there is a need to develop asymptotic properties related to the convergence of the filter. While the accuracy of the posterior given y_{1:t} seems to be directly related to the dimension of the latent process, as shown by [22] for INLA, this result needs to be extended to the sequential setting. We feel that the consistency properties of our filter are completely dependent on those of the state filtering mechanism, but this needs to be looked into as future work.

References

[1] Christophe Andrieu, Arnaud Doucet, and Roman Holenstein. Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(3):269–342, 2010.

[2] Christophe Andrieu, Arnaud Doucet, and Vladislav B. Tadic. On-line parameter estimation in general state-space models. In Proceedings of the 44th IEEE Conference on Decision and Control, 2005.

[3] Carlos M. Carvalho, Michael Johannes, Hedibert F. Lopes, and Nicholas Polson. Particle learning and smoothing. Statistical Science, 25(1):88–106, 2010.

[4] Siddhartha Chib. Marginal likelihood from the Gibbs output. Journal of the American Statistical Association, 90(432):1313–1321, 1995.

[5] A. Doucet and A. M. Johansen. A tutorial on particle filtering and smoothing: fifteen years later. In Oxford Handbook of Nonlinear Filtering. Oxford University Press, 2011.

[6] N. J. Gordon, D. J. Salmond, and A. F. M. Smith. Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings F: Radar and Signal Processing, 140(2):107–113, 1993.

[7] James D. Hamilton. State-space models. In R. F. Engle and D. McFadden, editors, Handbook of Econometrics, volume 4, chapter 50. Elsevier, 1994.

[8] S. Haykin, K. Huber, and Zhe Chen. Bayesian sequential state estimation for MIMO wireless communications. Proceedings of the IEEE, 92(3):439–454, 2004.

[9] Simon Haykin. Kalman Filtering and Neural Networks. Wiley-Interscience, 2001.

[10] Edward L. Ionides, Anindya Bhadra, Yves Atchadé, and Aaron King. Iterated filtering. Annals of Statistics, 39(3):1776–1802, 2011.

[11] S. J. Julier and J. K. Uhlmann. A new extension of the Kalman filter to nonlinear systems. In Proceedings of AeroSense: The 11th International Symposium on Aerospace/Defense Sensing, Simulation and Controls, Orlando, Florida, 1997.

[12] N. Kantas, Sumeetpal S. Singh, and J. M. Maciejowski. An overview of sequential Monte Carlo methods for parameter estimation in general state-space models. In Proceedings IFAC System Identification (SySid) Meeting, 2009.

[13] Aaron A. King, Edward L. Ionides, Carles Martinez Bretó, Steve Ellner, Bruce Kendall, Helen Wearing, Matthew J. Ferrari, Michael Lavine, and Daniel C. Reuman. pomp: Statistical inference for partially observed Markov processes (R package).

[14] J. Liu and M. West. Combined parameter and state estimation in simulation-based filtering. In A. Doucet, N. de Freitas, and N. J. Gordon, editors, Sequential Monte Carlo Methods in Practice. Springer-Verlag, New York, 2001.

[15] P. C. Mahalanobis. On the generalised distance in statistics. Proceedings of the National Institute of Sciences of India, 2(1):49–55, 1936.

[16] B. Matérn. Spatial Variation. Springer-Verlag, second edition, 1986.

[17] B. A. McElhoe. An assessment of the navigation and course corrections for a manned flyby of Mars or Venus. IEEE Transactions on Aerospace and Electronic Systems, 2, 1966.

[18] Richard J. Meinhold and Nozer D. Singpurwalla. Understanding the Kalman filter. The American Statistician, 37(2):123–127, 1983.

[19] R Development Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria.

[20] Lawrence R. Rabiner. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2):257–286, February 1989.

[21] Branko Ristic, Sanjeev Arulampalam, and Neil Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House, 2004.

[22] H. Rue, S. Martino, and N. Chopin. Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations. Journal of the Royal Statistical Society, Series B (Statistical Methodology), 71(2):319–392, 2009.

[23] Geir Storvik. Particle filters for state-space models with the presence of unknown static parameters. IEEE Transactions on Signal Processing, 50(2):281–289, February 2002.

[24] M. West and J. Harrison. Bayesian Forecasting and Dynamic Models. Springer Series in Statistics. Springer, second edition, 1997.

[Figure 1 appears here, with panels (a) AR parameter, (b) State precision parameter and (c) Observation precision parameter.]

Figure 1: Plots (a), (b) and (c) are trace plots showing trajectories of the averaged approximate mode and approximate 95% probability bounds of the posteriors of φ, τ_Sys and τ_Obs respectively. The light grey line displays the true parameter value.
