PATTERN RECOGNITION AND MACHINE LEARNING CHAPTER 13: SEQUENTIAL DATA


Contents in latter part: Linear Dynamical Systems (what is different from the HMM?); the Kalman filter (its strengths and limitations); the Particle Filter (its simple algorithm and broad applications).

Dynamical Systems. State space model of dynamical systems. Now the latent variable is continuous rather than discrete:

z_n = F(z_{n-1}, w_n)   (state equation)
x_n = H(z_n, v_n)   (observation equation)

or equivalently, in terms of conditional distributions, p(z_n | z_{n-1}) and p(x_n | z_n).

Linear Dynamical Systems. Special case of dynamical systems, with Gaussian assumptions on the distributions:

z_n = A z_{n-1} + w_n
x_n = C z_n + v_n
z_1 = mu_0 + u

where z: latent variable; x: observation; A: dynamics matrix; C: emission matrix; w, v, u: zero-mean Gaussian noise terms.

Linear Dynamical Systems. Bayesian form of the LDS. Since the LDS is a linear-Gaussian model, the joint distribution over all latent and observed variables is simply Gaussian.
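To make the linear-Gaussian model concrete, here is a minimal sketch that simulates an LDS trajectory; the matrices A and C and the noise covariances are illustrative choices, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D constant-velocity model: state = (position, velocity)
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # dynamics matrix
C = np.array([[1.0, 0.0]])        # emission matrix (observe position only)
Gamma = 0.01 * np.eye(2)          # state-noise covariance (for w)
Sigma = np.array([[0.1]])         # observation-noise covariance (for v)

def simulate_lds(n_steps, z0):
    """Sample z_n = A z_{n-1} + w_n and x_n = C z_n + v_n."""
    z, states, obs = z0, [], []
    for _ in range(n_steps):
        z = A @ z + rng.multivariate_normal(np.zeros(2), Gamma)
        x = C @ z + rng.multivariate_normal(np.zeros(1), Sigma)
        states.append(z)
        obs.append(x)
    return np.array(states), np.array(obs)

states, obs = simulate_lds(50, np.zeros(2))
```

Because the model is linear with Gaussian noise, every variable in this simulation is jointly Gaussian, which is exactly what makes exact inference (the Kalman filter below) possible.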

Kalman Filter. The Kalman filter performs exact inference in an LDS, in which all latent and observed variables are Gaussian (including multivariate Gaussian). It handles multiple dimensions in a single set of calculations, and it has two distinct phases: Predict and Update.

Application of Kalman Filter: tracking a moving object. Blue: true position; green: measurements; red: posterior estimate.

Two phases in Kalman Filter. Predict: prediction of the state estimate and the estimate covariance. Update: correction of the state estimate and the estimate covariance using the Kalman gain.
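The two phases can be sketched as a single function; the variable names (mu, V, Gamma, Sigma) follow common textbook notation, and the 1-D run at the end uses illustrative numbers.

```python
import numpy as np

def kalman_step(mu, V, x, A, C, Gamma, Sigma):
    """One Predict/Update cycle of the Kalman filter.

    mu, V : posterior mean and covariance for z_{n-1}
    x     : new observation x_n
    Gamma : state-noise covariance; Sigma : observation-noise covariance
    """
    # Predict: propagate the previous posterior through the dynamics
    mu_pred = A @ mu
    P = A @ V @ A.T + Gamma
    # Update: correct the prediction using the Kalman gain
    S = C @ P @ C.T + Sigma               # innovation covariance
    K = P @ C.T @ np.linalg.inv(S)        # Kalman gain matrix
    mu_new = mu_pred + K @ (x - C @ mu_pred)
    V_new = (np.eye(len(mu)) - K @ C) @ P
    return mu_new, V_new

# Illustrative 1-D run: noisy observations of a constant signal at 1.0
A1, C1 = np.array([[1.0]]), np.array([[1.0]])
Gamma1, Sigma1 = np.array([[0.01]]), np.array([[1.0]])
mu, V = np.array([0.0]), np.array([[10.0]])
for _ in range(20):
    mu, V = kalman_step(mu, V, np.array([1.0]), A1, C1, Gamma1, Sigma1)
```

In the toy run, the mean estimate converges toward the true signal and the posterior variance shrinks, which is the Predict/Update cycle doing its job.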

Estimation of Parameters in LDS. The distribution of z_{n-1} is used as a prior for the estimation of z_n, via the Predict and Update steps.

Derivation of Kalman Filter. We use the sum-product algorithm for efficient inference of the latent variables. The LDS is the continuous counterpart of the HMM (sums become integrals).

Sum-product algorithm. The forward message is Gaussian, and solving the recursion gives

P_{n-1} = A V_{n-1} A^T + Γ
K_n = P_{n-1} C^T (C P_{n-1} C^T + Σ)^{-1}   (Kalman gain matrix)
μ_n = A μ_{n-1} + K_n (x_n - C A μ_{n-1})
V_n = (I - K_n C) P_{n-1}

What have we estimated? Predict: the predicted mean of z_n is A μ_{n-1}, and the predicted mean of x_n is C A μ_{n-1}. Update: μ_n = A μ_{n-1} + K_n (x_n - C A μ_{n-1}), where (x_n - C A μ_{n-1}) is the prediction error between the observed x_n and the predicted mean of x_n.

Limitation of Kalman Filter. Due to the Gaussian assumption, the KF cannot estimate well in nonlinear/non-Gaussian problems. One simple extension is a mixture of Gaussians: if the initial density is a mixture of K Gaussians, then the filtered distribution over z_n will comprise a mixture of K^n Gaussians -> computationally intractable.

Limitation of Kalman Filter. To handle nonlinear dynamical systems, other methods have been developed. Extended Kalman filter: Gaussian approximation by linearizing around the mean of the predicted distribution. Particle filter: a resampling method, see later. Switching state-space model: a continuous-latent-variable counterpart of the switching HMM.

Particle Filter. In nonlinear/non-Gaussian dynamical systems, it is hard to estimate the posterior distribution with the KF. Applying sampling-importance-resampling (SIR) yields a sequential Monte Carlo algorithm, the particle filter. Advantages of the particle filter: a simple algorithm -> easy to implement; good estimation in nonlinear/non-Gaussian problems.

How to Represent a Distribution. Original distribution (mixture of Gaussians); Gaussian approximation; approximation by PF (distribution of particles); approximation by PF (histogram).
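As a sketch of the idea, a particle set drawn from a bimodal mixture keeps both modes, while a single-Gaussian fit collapses them into one mean and spread; the mixture parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical bimodal target: 0.3 * N(-2, 0.5^2) + 0.7 * N(3, 1^2)
def sample_mixture(n):
    from_first = rng.random(n) < 0.3
    return np.where(from_first,
                    rng.normal(-2.0, 0.5, n),
                    rng.normal(3.0, 1.0, n))

particles = sample_mixture(10_000)

# A single-Gaussian approximation keeps only the overall mean and spread ...
gauss_mean, gauss_std = particles.mean(), particles.std()

# ... while a histogram of the particles recovers both modes
hist, edges = np.histogram(particles, bins=50, density=True)
```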

Where's the landmine? Use a metal detector to find a landmine (orange star).

Where's the landmine? Random survey of the field (red circles). The filter draws a number of randomly distributed estimates, called particles. All particles are given the same likelihood.

Where's the landmine? Get a response at each point (strength shown by marker size). Assign a likelihood to each particle according to how well that particle explains the measurement.

Where's the landmine? Decide where to survey in the next step. Scale the weights of the particles to select particles for resampling.

Where's the landmine? Intensive survey of the most promising places. Draw random particles based upon their likelihoods (resampling): high likelihood -> more particles; low likelihood -> fewer particles. All particles have equal likelihood again.

Operation of Particle Filter

Algorithm of Particle Filter. Sample representation of the posterior distribution p(z_n | X_n), expressed as samples {z_n^(l)} with corresponding weights {w_n^(l)}, where

w_n^(l) = p(x_n | z_n^(l)) / Σ_m p(x_n | z_n^(m))

Draw samples {z_{n+1}^(l)} from the mixture distribution

p(z_{n+1} | X_n) = Σ_l w_n^(l) p(z_{n+1} | z_n^(l))

then use the new observation x_{n+1} to evaluate the corresponding weights w_{n+1}^(l) ∝ p(x_{n+1} | z_{n+1}^(l)).
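The sampling-importance-resampling step above can be sketched as follows; the transition and likelihood models in the toy run are illustrative assumptions, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(2)

def particle_filter_step(particles, x_new, transition, likelihood):
    """One SIR step: propagate, reweight with the new observation, resample.

    particles  : array of samples z_n^(l), equal weights after resampling
    transition : draws z_{n+1} ~ p(z_{n+1} | z_n) for each particle
    likelihood : evaluates p(x_{n+1} | z_{n+1}) for each particle
    """
    proposed = transition(particles)        # draw from the mixture distribution
    w = likelihood(x_new, proposed)         # evaluate the new weights
    w = w / w.sum()
    # Resample: high likelihood -> more copies, low likelihood -> fewer
    idx = rng.choice(len(proposed), size=len(proposed), p=w)
    return proposed[idx]

# Toy 1-D model: random-walk state, Gaussian observation noise
transition = lambda z: z + rng.normal(0.0, 0.5, size=z.shape)
likelihood = lambda x, z: np.exp(-0.5 * (x - z) ** 2)

particles = rng.normal(0.0, 5.0, size=2000)   # diffuse initial particle cloud
for x in [0.2, 0.1, -0.1, 0.0]:
    particles = particle_filter_step(particles, x, transition, likelihood)
```

After a few observations near zero, the particle cloud concentrates around the state that explains the measurements, mirroring the landmine survey walk-through.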

Limitation of Particle Filter. In high-dimensional models, an enormous number of particles is required to approximate the posterior distribution. Repeated resampling can cause degeneracy of the algorithm: all but one of the importance weights become close to zero. This can be avoided by a proper choice of resampling method.