Non-particle filters


Fred Daum & Misha Krichman
Raytheon Company
225 Presidential Way
Woburn, MA

ABSTRACT

We have developed a new nonlinear filter that is superior to particle filters in five ways: (1) it exploits smoothness; (2) it uses an exact solution of the Fokker-Planck equation in continuous time; (3) it uses a convolution to compute the effect of process noise at discrete times; (4) it uses the adjoint method to compute the optimal density of points in state space to represent the smooth conditional probability density; and (5) it uses Bayes rule exactly by exploiting the exponential family of probability densities. In contrast to particle filters, which do not exploit smoothness, the new filter avoids importance sampling and Monte Carlo methods. The new non-particle filter should be superior to particle filters for a broad class of practical problems. In particular, the new filter should dramatically reduce the curse of dimensionality for many (but not all) important real-world nonlinear filter problems.

KEY WORDS

Nonlinear filters, particle filter, adjoint method, meshfree, Fokker-Planck equation, Kalman filter

1.0 INTRODUCTION

We describe a new nonlinear filter that exploits smoothness to reduce the curse of dimensionality for a broad class of important practical problems. We use a hybrid model of nonlinear dynamics that allows us to solve the Fokker-Planck equation exactly; in particular, we use discrete-time diffusion but continuous-time drift. With this hybrid model, the Fokker-Planck equation is equivalent to an ODE for the unnormalized conditional density of the d-dimensional state vector. Therefore, we do not need an extremely fine quantization in time (or an implicit method or ADI method) to compensate for fine quantization in state space; that is, we do not need to worry about stability of the numerical solution of the Fokker-Planck equation, as defined by the Courant-Friedrichs-Lewy stability criterion. Moreover, we implement Bayes rule exactly for updates of the unnormalized density with measurements using the exponential family of probability densities. The effect of diffusion (also called process noise by engineers) in the Fokker-Planck equation is computed using a fast convolution of two probability densities.

The new filter is summarized in Tables 1, 2 & 3. The derivation in Table 3 assumes that the probability density is smooth and nowhere vanishing; this formula was known to Liouville. The new filter fully exploits the smoothness of the Fokker-Planck equation, and therefore it should be superior to particle filters, which do not exploit any smoothness and which do not exploit exact solutions or the exponential family. However, if we used a uniform grid in d-dimensional state space to represent the conditional probability density, we would still suffer from the curse of dimensionality. Therefore, we represent the density using a sparse grid computed adaptively in real time with the adjoint method [11].
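Before turning to the adjoint method in more detail, a minimal sketch of the exact exponential-family measurement update mentioned above may help; it uses the Gaussian member of the exponential family and illustrative names (H, R, z) that are not from the paper. In natural-parameter (information) form, multiplying the prior by the likelihood simply adds natural parameters, so Bayes rule requires no numerical integration:

```python
import numpy as np

# Minimal sketch: exact Bayes update inside a Gaussian exponential family.
# In natural-parameter (information) form, prior x likelihood = posterior
# amounts to adding natural parameters; no quadrature or sampling is needed.
# All names (H, R, z) are illustrative, not from the paper.

def to_natural(mean, cov):
    """Convert (mean, covariance) to natural parameters (eta, Lambda)."""
    Lam = np.linalg.inv(cov)            # precision matrix
    eta = Lam @ mean                    # precision-weighted mean
    return eta, Lam

def from_natural(eta, Lam):
    """Convert natural parameters back to (mean, covariance)."""
    cov = np.linalg.inv(Lam)
    return cov @ eta, cov

def bayes_update(prior_mean, prior_cov, H, R, z):
    """Exact measurement update for z = H x + noise, noise ~ N(0, R)."""
    eta, Lam = to_natural(prior_mean, prior_cov)
    # The Gaussian likelihood contributes these natural-parameter increments:
    Lam_post = Lam + H.T @ np.linalg.inv(R) @ H
    eta_post = eta + H.T @ np.linalg.inv(R) @ z
    return from_natural(eta_post, Lam_post)

# Example: 2-D state, scalar measurement of the first component.
mean, cov = np.zeros(2), np.eye(2)
H, R, z = np.array([[1.0, 0.0]]), np.array([[0.5]]), np.array([1.2])
post_mean, post_cov = bayes_update(mean, cov, H, R, z)
print(post_mean, post_cov)   # matches the Kalman (information-form) update
```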

The adjoint method for solving PDEs numerically is analogous to the adjoint (a.k.a. Lagrange multipliers) used in optimal control (see Table 2). The adjoint method is an industrial-strength numerical algorithm that is widely used for solving PDEs.

Intuitively, the reason that we can use this hybrid discrete-time/continuous-time model for nonlinear filtering is that engineers use the diffusion term (so-called "process noise") as a design parameter, unlike physics and chemistry, where the diffusion tensor is defined by Nature. In particular, engineers typically tune the process noise covariance matrix to get improved results with extended Kalman filters [4], but Nature does not allow such tuning of Avogadro's number or other physical constants. Engineers commonly increase or decrease the process noise variance by a factor of two or three without any significant effect on filter performance, but changing the drift term by one percent can wreak havoc with performance in some applications. We exploit this insensitivity to model variation in the diffusion, but we pay strict attention to the physics, which is encoded in the drift term of the Fokker-Planck equation. This allows us to model process noise in discrete time and use a convolution to compute the effect of diffusion on the conditional density, which greatly reduces computational complexity. It would be a shame to lavish Gflops of computer throughput on carefully solving the Fokker-Planck equation with a non-zero diffusion tensor, considering that an exact model of the diffusion tensor is both unknown and of little importance in practical engineering applications. In most practical applications the process noise covariance matrix is diagonal; if not, it can be diagonalized at a cost on the order of d^3 computations. This means that we can use d one-dimensional convolutions of the two probability densities, as sketched in the example below.

The key issue in nonlinear filters is the curse of dimensionality, a phrase coined by Richard Bellman four decades ago. The curse of dimensionality means that the computational complexity of solving a problem increases extremely fast with the dimension of the problem. For nonlinear filters, the dimension refers to the dimension of the state vector of the dynamical system to be estimated. The term "extremely fast" is usually taken to mean that computer time increases exponentially with dimension. It is easy to see why the curse of dimensionality is relevant for nonlinear filters. As explained below, we need to solve a partial differential equation (PDE) in d-dimensional state space in order to solve the nonlinear filtering problem. Standard textbook methods for solving PDEs numerically use a fixed grid in d-dimensional space, and the computational complexity grows as N^d, where N is the number of grid points in each dimension. We conclude that using a fixed grid results in computational complexity growing exponentially with dimension. Hence, using a fixed grid is an extremely bad idea, and a non-uniform set of nodes computed adaptively is required to have any hope of mitigating the curse of dimensionality. That is the key idea of this paper, of particle filters, and of all modern work on solving PDEs numerically. We emphasize that hard-boiled engineers are only interested in good approximations rather than exact solutions; the question of what is good enough depends on the specific application.
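A minimal sketch of this discrete-time diffusion step, assuming a regular grid and illustrative parameter names for clarity (the filter itself uses an adaptively chosen set of nodes rather than a grid): with a diagonal process noise covariance, the d-dimensional Gaussian transition kernel is separable, so the convolution factors into d one-dimensional convolutions, one per axis.

```python
import numpy as np
from scipy.ndimage import convolve1d

def apply_process_noise(density, grid_steps, noise_vars):
    """Discrete-time diffusion: convolve a gridded density with a Gaussian kernel.

    With a diagonal process noise covariance, the d-dimensional Gaussian kernel
    is separable, so the update reduces to d one-dimensional convolutions.
    `density` is sampled on a regular grid here purely for illustration.
    """
    out = density
    for axis, (dx, var) in enumerate(zip(grid_steps, noise_vars)):
        sigma = np.sqrt(var)
        # 1-D Gaussian kernel sampled on the grid spacing along this axis.
        half_width = int(np.ceil(4 * sigma / dx))
        x = np.arange(-half_width, half_width + 1) * dx
        kernel = np.exp(-0.5 * (x / sigma) ** 2)
        kernel /= kernel.sum()                      # keep total probability fixed
        out = convolve1d(out, kernel, axis=axis, mode="nearest")
    return out

# Example: 2-D density on a 100 x 100 grid, different noise variance per axis.
grid = np.zeros((100, 100))
grid[50, 50] = 1.0                                  # point mass before diffusion
blurred = apply_process_noise(grid, grid_steps=(0.1, 0.1), noise_vars=(0.04, 0.01))
print(blurred.sum())                                # still ~1.0
```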
There are many different algorithms to solve the nonlinear filtering problem, including extended Kalman filters, unscented Kalman filters, particle filters, explicit numerical solution of the Fokker-Planck equation, Daum filters, etc. A tutorial introduction to a wide range of state-of-the-art nonlinear filters is given in [4]. It has been asserted in engineering journals that particle filters beat the curse of dimensionality, but this is generally wrong. It turns out that particle filters depend on a good proposal density, and without such help particle filters also suffer from the curse of dimensionality [4].

Particle filters are extremely popular, owing to the ease of coding and the simple theory required. One can code a pretty good particle filter in one or two pages of MATLAB, and one does not need to understand the finer points of stochastic calculus or any fancy methods for solving partial differential equations. Also, particle filters are popular due to their generality and flexibility, as well as a certain amount of hype associated with the claim that they beat the curse of dimensionality. On the other hand, particle filters do not exploit the smoothness of the nonlinear filtering problem, and hence we expect that the new filter described here should be superior to particle filters for many practical applications.

2.0 THE VALUE OF SMOOTHNESS

Smoothness can dramatically reduce computational complexity for high-dimensional problems. In particular, for approximation of smooth functions, a well-known theoretical bound [12] gives:

T = c(d) / σ^(d/s)

in which
T = computation time to achieve an approximation error of σ
d = dimension of the independent variable
s = smoothness of the functions being approximated
c(d) = time for one function evaluation (e.g., d^3 for typical engineering problems)

We emphasize that the word "smoothness" in this context does not mean, for example, twice continuously differentiable (for s = 2); rather, the word "smoothness" as used here defines a class of functions with mixed partial derivatives of order s that are bounded by unity [12]. To quote Nemirovsky & Yudin [13]: "Smoothness does not, in itself, count for much; what is important is the values of the numerical parameters which characterize this smoothness (the values of the corresponding derivatives, and so on)." This is intuitively obvious. For example, for d = 20, if the conditional density is in the class of functions with s = 2, then we have reduced the computational complexity by an enormous factor, as if the dimension were only d = 10. We might be tempted to say that the effective dimension is d = 10 in this case. If the theoretical bound given above applies to our nonlinear filter problem, then we have not beaten the curse of dimensionality, but we have certainly improved the situation dramatically. Unfortunately, the simple bound given above is isotropic, whereas our problem might be much smoother in certain directions than in others, and therefore it is difficult to quantify the reduction in computational complexity using a simple formula with just a few parameters. Nevertheless, the simple back-of-the-envelope formula above gives considerable insight into the benefit of smoothness. There are other bounds on computational complexity for multivariate integration of smooth functions in d dimensions [12], as well as distinct formulas that apply for estimation of smooth probability densities in d dimensions [10] and [14]-[15].
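A back-of-the-envelope evaluation of the bound above, assuming an arbitrary error tolerance σ = 0.1 and the illustrative cost model c(d) = d^3 mentioned in the text:

```python
# Rough evaluation of the complexity bound T = c(d) / sigma^(d/s).
# The tolerance sigma = 0.1 and the cost model c(d) = d**3 are illustrative choices.
def bound(d, s, sigma=0.1):
    return d**3 / sigma**(d / s)

d = 20
print(bound(d, s=1))   # no smoothness exploited: ~ d^3 * 10^20 evaluations
print(bound(d, s=2))   # s = 2: ~ d^3 * 10^10, as if the dimension were only 10
```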

References

(1) W. Bangerth and R. Rannacher, Adaptive Finite Element Methods for Differential Equations, Birkhauser Inc.
(2) M. B. Giles and E. Suli, "Adjoint methods for PDEs," Acta Numerica, Cambridge University Press.
(3) R. Becker and R. Rannacher, "An optimal control approach to a posteriori error estimation in finite element methods," Acta Numerica, pages 1-102, Cambridge University Press.
(4) F. E. Daum, "Nonlinear filters: beyond the Kalman filter," special tutorial issue of IEEE AES Systems Magazine, August.
(5) F. E. Daum, "Industrial strength nonlinear filters," Proceedings of the Workshop in Honor of Yaakov Bar-Shalom, Monterey, California, May.
(6) F. E. Daum, "New exact nonlinear filters," Chapter 8 in Bayesian Analysis of Time Series and Dynamic Models, edited by J. C. Spall, New York: Marcel Dekker, Inc.
(7) F. E. Daum, "Exact finite dimensional nonlinear filters," IEEE Transactions on Automatic Control, July.
(8) M.-S. Oh, "Monte Carlo integration via importance sampling: dimensionality effect and an adaptive algorithm," Contemporary Mathematics, volume 115.
(9) K. Kastella, "Finite difference methods for nonlinear filtering and automatic target recognition," in Multitarget/Multisensor Tracking, Volume III, edited by Y. Bar-Shalom and W. D. Blair, Artech House, Inc.
(10) L. Devroye and G. Lugosi, Combinatorial Methods in Density Estimation, Springer-Verlag.
(11) F. Daum and M. Krichman, "Meshfree adjoint methods for nonlinear filtering," Proceedings of the IEEE Aerospace Conference, Big Sky, Montana, March.
(12) J. Traub and A. Werschultz, Complexity and Information, Cambridge University Press.
(13) A. S. Nemirovsky and D. B. Yudin, Problem Complexity and Method Efficiency in Optimization, translated by E. R. Dawson, John Wiley & Sons, Inc.
(14) F. Cucker and S. Smale, "On the mathematical foundations of learning," Bulletin of the American Mathematical Society, volume 39, number 1, pages 1-49.
(15) M. Griebel, "Sparse grids and related approximation schemes for higher dimensional problems," Univ. Bonn.

Table 1. New filter vs. particle filter

1. Prediction of density (drift)
   New filter: exact solution of the Fokker-Planck PDE
   Particle filter: Monte Carlo
2. Prediction of density (diffusion)
   New filter: convolution of two probability densities
   Particle filter: Monte Carlo
3. Adaptive method to avoid a uniform grid
   New filter: adjoint method
   Particle filter: importance sampling from a proposal density
4. Representation of density
   New filter: hybrid of continuous & discrete in state space
   Particle filter: particles
5. Exploits smoothness
   New filter: yes
   Particle filter: no

Table 2. Adjoint method for PDEs vs. optimal control

1. Quantity computed
   Adjoint method for PDEs: optimal density of points in state space, q(x,t)
   Optimal control: optimal control, u(x,t)
2. Feedback used
   Adjoint method for PDEs: residuals of both the primal and dual solutions, Lp = f and L*v = g
   Optimal control: Euler-Lagrange equations
3. Functional to be minimized
   Adjoint method for PDEs: error in the numerical approximation of the conditional mean
   Optimal control: J = ∫ L(x, u, t) dt

Table 3. Exact solution of the Fokker-Planck equation for zero diffusion

∂p/∂t = -(∂p/∂x) f - p Tr(∂f/∂x) + ½ Tr(Q ∂²p/∂x²)

∂p/∂t = -(∂p/∂x) f - p Tr(∂f/∂x)   for Q = 0

dp/dt = ∂p/∂t + (∂p/∂x) f   (total derivative along a trajectory with dx/dt = f)

dp/dt = -p Tr(∂f/∂x)

dp/p = -Tr(∂f/∂x) dt   for p > 0

Hence,

p(x, t) = p(x, 0) exp( -∫ Tr(∂f/∂x) dt )
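A minimal numerical check of the Table 3 formula, assuming a scalar linear drift f(x) = a x chosen for illustration (so Tr(∂f/∂x) = a and the trajectories are x(t) = x(0) e^(a t)):

```python
import numpy as np

# Check the zero-diffusion (Liouville) formula of Table 3 on a 1-D example:
# drift f(x) = a*x, so Tr(df/dx) = a, trajectories x(t) = x0*exp(a*t), and
# p(x(t), t) = p(x0, 0) * exp(-a*t).
a, t = 0.7, 1.5
x0 = np.linspace(-6.0, 6.0, 2001)                    # nodes at time 0
p0 = np.exp(-0.5 * x0**2) / np.sqrt(2 * np.pi)       # initial density N(0, 1)

xt = x0 * np.exp(a * t)                              # propagate the nodes
pt = p0 * np.exp(-a * t)                             # propagate the density values

# The propagated values should match the exact pushforward density N(0, e^(2at)).
sigma_t = np.exp(a * t)
exact = np.exp(-0.5 * (xt / sigma_t) ** 2) / (np.sqrt(2 * np.pi) * sigma_t)
print(np.max(np.abs(pt - exact)))                    # ~1e-16: formulas agree

# Total probability is preserved (trapezoid rule on the moved, non-uniform nodes).
print(np.sum(0.5 * (pt[1:] + pt[:-1]) * np.diff(xt)))   # ~1.0
```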
