Foundations of State Estimation Part II


Topics: Hidden Markov Models, Particle Filters

Additional reading:
L.R. Rabiner, "A tutorial on hidden Markov models," Proceedings of the IEEE, vol. 77, pp. 257-286, 1989.
Sequential Monte Carlo Methods in Practice. A. Doucet, N. de Freitas, N. Gordon, eds. Springer-Verlag, 2001.
Radford M. Neal, 1993. Probabilistic Inference Using Markov Chain Monte Carlo Methods. University of Toronto CS Tech Report.
Robust Monte Carlo Localization for Mobile Robots. S. Thrun, D. Fox, W. Burgard and F. Dellaert. Artificial Intelligence, 128:1-2, pp. 99-141, 2001.

Hidden Markov Models

Hidden states x_t, actions a_t, observable observations z_t, and beliefs b_t. With discrete states, actions, and observations, the motion model f(x_t+1 | x_t, a_t) and the sensor model h(z_t | x_t) can now be written as tables.

Somewhat Useful for Localization in Topological Maps

Transitions form a table over discrete places, e.g. a probability of 0.9 for the intended successor state under action a, with small probabilities (e.g. 0.05) assigned to neighboring states. Observations can be features such as corridor features, junction features, etc.

Belief Tracking

Estimating the belief b(x) is now easy. After each action a and observation z, update, for each x in X:

b'(x) = η p(z | x) Σ_{x' in X} p(x | x', a) b(x')

This algorithm is quadratic in |X|. Recall that the Kalman filter is quadratic in the number of state features; a continuous X would mean an infinite number of states.
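The discrete belief update above can be sketched in a few lines. The three-state transition and observation tables below are made-up illustrative values, not from the lecture:

```python
import numpy as np

def belief_update(b, T, Z, z):
    """One discrete Bayes-filter step: prediction with the transition
    table, then correction with the observation likelihood."""
    # Prediction: b_bar(x) = sum_x' p(x | x', a) b(x')
    b_bar = T.T @ b             # T[x_prev, x] = p(x | x_prev, a)
    # Correction: b'(x) proportional to p(z | x) b_bar(x)
    b_new = Z[:, z] * b_bar     # Z[x, z] = p(z | x)
    return b_new / b_new.sum()  # eta: normalize

# Hypothetical 3-state example
T = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.0, 0.0, 1.0]])
Z = np.array([[0.7, 0.3],
              [0.5, 0.5],
              [0.1, 0.9]])
b0 = np.array([1.0, 0.0, 0.0])
b1 = belief_update(b0, T, Z, z=0)
```

Each update touches every (x', x) pair, which is the quadratic-in-|X| cost noted above.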

The Three Basic Problems for HMMs

1. Given the history O = a_1, z_1, a_2, z_2, ..., a_T, z_T and a model λ = (A, B, π), how do we efficiently compute P(O | λ), the probability of the history given the model?
2. Given the history O and a model λ, how do we choose a corresponding state sequence X = x_1, x_2, ..., x_T which is optimal in some meaningful sense (i.e., best explains the observations)?
3. How do we adjust the model parameters λ = (A, B, π) to maximize P(O | λ)?

HMM Basic Problem 1

The probability of history O given λ is a sum over all state sequences Q = q_1, q_2, q_3, ..., q_T:

P(O | λ) = Σ_{all Q} P(O | Q, λ) P(Q | λ) = Σ_{all Q} π_{q_1} b_{q_1}(z_1) a_{q_1 q_2} b_{q_2}(z_2) ...

Summing over all state sequences costs on the order of T |X|^T. Instead, build a lattice of states forward in time, computing the probability of each possible trajectory as the lattice is built. The forward algorithm is O(|X|^2 T):

Initialization:  α_1(i) = π_i b_i(z_1)
Induction:       α_{t+1}(j) = [ Σ_i α_t(i) a_ij ] b_j(z_{t+1})
Termination:     P(O | λ) = Σ_i α_T(i)
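The forward recursion can be sketched directly from those three equations. The two-state, two-symbol model below is a made-up example, not one from the lecture:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Forward algorithm: P(O | lambda) in O(|X|^2 T) via the alpha lattice."""
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i b_i(z_1)
    for z in obs[1:]:
        # alpha_{t+1}(j) = [sum_i alpha_t(i) a_ij] b_j(z_{t+1})
        alpha = (alpha @ A) * B[:, z]
        # For long sequences, alpha should be rescaled each step
        # to avoid numerical underflow.
    return alpha.sum()                 # P(O | lambda) = sum_i alpha_T(i)

# Hypothetical 2-state, 2-symbol model
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
p = forward(pi, A, B, [0, 1, 0])
```

The result agrees with brute-force enumeration over all |X|^T state sequences, which is the point of the lattice.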

HMM Basic Problem 2

Viterbi decoding: the same principle as the forward algorithm, with an extra term ψ that remembers the best predecessor of each state.

1. Initialization:  δ_1(i) = π_i b_i(z_1),  ψ_1(i) = 0
2. Induction (repeat for t = 2..T):
   δ_t(j) = max_i [ δ_{t-1}(i) a_ij ] b_j(z_t)
   ψ_t(j) = argmax_i [ δ_{t-1}(i) a_ij ]
3. Termination:  P* = max_i δ_T(i),  x*_T = argmax_i δ_T(i)
4. Backtracking:  x*_t = ψ_{t+1}(x*_{t+1})

The computation is implemented in terms of a lattice of observations and states: N states by T observations, filled in column by column with the α (or δ) values.
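A minimal Viterbi sketch following the δ/ψ recursion; the model parameters are the same made-up two-state example used for the forward pass:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Viterbi decoding: most likely state sequence via the delta/psi lattice."""
    T_len, N = len(obs), len(pi)
    delta = pi * B[:, obs[0]]               # delta_1(i) = pi_i b_i(z_1)
    psi = np.zeros((T_len, N), dtype=int)   # psi_1(i) = 0
    for t in range(1, T_len):
        trans = delta[:, None] * A          # trans[i, j] = delta_{t-1}(i) a_ij
        psi[t] = trans.argmax(axis=0)       # best predecessor of each j
        delta = trans.max(axis=0) * B[:, obs[t]]
    # Termination and backtracking
    path = [int(delta.argmax())]
    for t in range(T_len - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return delta.max(), path[::-1]

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
p_star, path = viterbi(pi, A, B, [0, 0, 1])
```

In practice the recursion is usually run in log space so the products become sums and do not underflow.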

HMM Basic Problem 3

Given a labelled data sequence D = {x_1, a_1, z_1, x_2, a_2, z_2, ..., x_T, a_T, z_T}, estimating p(z | x) and p(x' | x, a) is just counting.

Given an unlabelled data sequence D = {a_1, z_1, a_2, z_2, ..., a_T, z_T}, estimating p(z | x) and p(x' | x, a) is equivalent to simultaneous localization and mapping (next lecture).

Particle Filters

Monte Carlo Localization: The Particle Filter

Sample particles randomly from the distribution over the state space, and carry around particles rather than the full distribution. Sampling from uniform distributions is easy; sampling from Gaussians and other parametric distributions is a little harder. What about arbitrary distributions? Many algorithms:

Rejection sampling
Importance sampling
Gibbs sampling
Metropolis sampling

How to Sample

We want to sample from an arbitrary p(x). We don't know how to sample from p(x) directly, but we do know p(x) for any specific x, and we do know how to sample from an easy distribution q(x). So: sample from q, compare q to p, and adjust the samples accordingly.
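The simplest instance of this sample-then-adjust idea is rejection sampling. A short sketch, with a made-up target density (triangular on [0, 1]) and a uniform proposal:

```python
import numpy as np

rng = np.random.default_rng(2)

def rejection_sample(p, q_sample, q_pdf, c, n):
    """Rejection sampling: draw x from easy q, keep it with probability
    p(x) / (c * q(x)). Requires c * q(x) >= p(x) everywhere."""
    out = []
    while len(out) < n:
        x = q_sample()
        if rng.uniform() < p(x) / (c * q_pdf(x)):
            out.append(x)           # keep with probability alpha
        # otherwise reject and draw again
    return np.array(out)

# Hypothetical target: triangular density p(x) = 2x on [0, 1]
p = lambda x: 2.0 * x
q_pdf = lambda x: 1.0               # proposal: uniform on [0, 1]
q_sample = lambda: rng.uniform(0.0, 1.0)
samples = rejection_sample(p, q_sample, q_pdf, c=2.0, n=5000)
```

The cost is the acceptance rate: a poorly matched proposal (large c) throws away most draws.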

Rejection Sampling

Sample x from an easy function q(x). Compute the rejection ratio α = p(x) / (c q(x)), where c is chosen so that c q(x) ≥ p(x) everywhere. Keep the particle with probability α; reject it with probability 1 - α.

Sampling Importance Resampling (SIR)

Sample from an easy function q(x). Compute importance weights w = p(x) / q(x). Resample particles from the particle set according to the importance weights.
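The SIR scheme can be sketched as follows. The target and proposal densities are made-up Gaussians for illustration (both may be unnormalized, since the constant cancels when the weights are normalized):

```python
import numpy as np

rng = np.random.default_rng(0)

def sir(p, q_sample, q_pdf, n):
    """Sampling importance resampling: draw from easy proposal q,
    weight by p/q, then resample in proportion to the weights."""
    xs = q_sample(n)                   # 1. sample from the proposal
    w = p(xs) / q_pdf(xs)              # 2. importance weights w = p(x)/q(x)
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)   # 3. resample according to the weights
    return xs[idx]

# Hypothetical target: N(2, 0.5^2); proposal: N(0, 2^2), both unnormalized
p = lambda x: np.exp(-0.5 * ((x - 2.0) / 0.5) ** 2)
q_pdf = lambda x: np.exp(-0.5 * (x / 2.0) ** 2)
q_sample = lambda n: rng.normal(0.0, 2.0, size=n)
samples = sir(p, q_sample, q_pdf, 20000)
```

After resampling, the particle set is approximately distributed according to p even though every draw came from q.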

Robot Localization using SIR

I. Sample initial particles {x_i = (x, y, θ)} from p(x_0).
II. Iterate:
1. Prediction: sample from the motion model according to action a_t, to get the proposal distribution q.
2. Measurement: compute importance weights w_i = p(x_i) / q(x_i).
3. Resample from {x_i} according to {w_i}.

Sampling from the Motion Model

A common motion model decomposes each motion into a rotation, a translation, and a second rotation:

Rotation 1:   μ = θ_1,  σ = α_1 |d| + α_2 |θ_1|
Translation:  μ = d,    σ = α_3 |d| + α_4 |θ_1 + θ_2|
Rotation 2:   μ = θ_2,  σ = α_1 |d| + α_2 |θ_2|

Compute the rotation, translation, rotation from odometry. For each particle, sample a new motion triple from the Gaussians described above, then use geometry to generate the posterior particle position.
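The rotation-translation-rotation sampling step can be sketched as below. The noise coefficients α_1..α_4 are illustrative values, not calibrated ones:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_motion(pose, rot1, trans, rot2, alphas):
    """Sample a posterior pose from the rotation-translation-rotation
    motion model: perturb the odometry triple with Gaussian noise,
    then apply it geometrically."""
    a1, a2, a3, a4 = alphas
    r1 = rng.normal(rot1, a1 * abs(trans) + a2 * abs(rot1))
    tr = rng.normal(trans, a3 * abs(trans) + a4 * abs(rot1 + rot2))
    r2 = rng.normal(rot2, a1 * abs(trans) + a2 * abs(rot2))
    x, y, theta = pose
    theta += r1                   # first rotation
    x += tr * np.cos(theta)       # translation along the new heading
    y += tr * np.sin(theta)
    theta += r2                   # second rotation
    return np.array([x, y, theta])

# Propagate a particle set one step under the same odometry reading
particles = np.zeros((500, 3))    # all particles start at (0, 0, 0)
alphas = (0.01, 0.05, 0.05, 0.01)
moved = np.array([sample_motion(pt, 0.1, 1.0, -0.1, alphas) for pt in particles])
```

Because each particle draws its own noise triple, an initially identical particle set spreads out exactly as the motion uncertainty dictates.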

Sensor Model

[Figure: approximated vs. measured probability p(y | x) of a range reading as a function of measured distance y (cm), peaked at the expected distance.]

Laser model built from collected data; the laser model is fit to the measured data using an approximate geometric distribution.

Problem: how do we compute the expected distance for any given (x, y, θ)? Ray-tracing is expensive, and caching expected distances for all (x, y, θ) takes too much memory.

Approximation: assume a symmetric sensor model depending only on d, the absolute difference between the expected and measured ranges, and compute the expected distance only as a function of (x, y). This sensor model is much faster to compute, but is only useful for highly accurate range sensors (e.g., laser range sensors, not sonar).

Computing Importance Weights: Approximate Method

Off-line, for each empty grid cell (x, y):
Compute d(x, y), the distance to the nearest filled cell from (x, y).
Store this expected-distance map.

At run-time, for a particle (x, y) and observation z = (r, θ):
Compute the endpoint (x + r cos θ, y + r sin θ).
Retrieve d at the endpoint, the error in the measurement.
Compute the probability of the error, p(d), from a Gaussian sensor model with specific σ.
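The run-time weighting step can be sketched as below. The map, its distance transform, and the poses are toy values made up for the example:

```python
import numpy as np

def importance_weight(pose, scan, dist_map, resolution, sigma):
    """Approximate-method weight: project each range reading to its
    endpoint, look up the precomputed distance-to-nearest-obstacle,
    and score that error under a zero-mean Gaussian."""
    x, y, theta = pose
    w = 1.0
    for r, bearing in scan:                     # each reading z = (r, theta)
        ex = x + r * np.cos(theta + bearing)    # endpoint of the beam
        ey = y + r * np.sin(theta + bearing)
        i, j = int(ex / resolution), int(ey / resolution)
        d = dist_map[i, j]                      # distance to nearest filled cell
        w *= np.exp(-0.5 * (d / sigma) ** 2)    # Gaussian error model
    return w

# Toy 10x10 map with an obstacle wall at x-index 5; dist_map is its
# precomputed distance transform (distance depends only on x here).
dist_map = np.abs(np.arange(10)[:, None] - 5).repeat(10, axis=1).astype(float)
w_good = importance_weight((0.0, 5.0, 0.0), [(5.0, 0.0)], dist_map, 1.0, 1.0)
w_bad = importance_weight((0.0, 5.0, 0.0), [(2.0, 0.0)], dist_map, 1.0, 1.0)
```

A particle whose predicted beam endpoints land on obstacles gets weight near 1; endpoints far from any obstacle are penalized exponentially.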

Bayes Filters

Kalman filter: unimodal Gaussian belief; linear-Gaussian motion and sensor models; data association problem; quadratic in the number of state features.
HMM: discrete multimodal distribution; arbitrary motion and sensor models; quadratic in the number of states.
Particle filters: arbitrary distributions; arbitrary motion and sensor models; exponential in the number of state features.

What you should know

                  Kalman      Multihypothesis  Grid HMM   Topology   Particle
Belief            Unimodal    Multimodal       Discrete   Discrete   Non-parametric or discrete
Accuracy          +           +                0          -          +
Robustness        0           +                +          +          +
Sensor variety    -           -                +          0          +
Efficiency        +           0                -          0          0
Implementation    0           -                0          0          +

What you should know

What a Hidden Markov Model is
The Forward algorithm
The Viterbi algorithm
How to implement particle filtering
Pros and cons of particle filters
How to implement robot localization using particle filters