2D Image Processing (Extended) Kalman and particle filter


2D Image Processing (Extended) Kalman and particle filter Prof. Didier Stricker, Dr. Gabriele Bleser, Kaiserslautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz http://av.dfki.de Some slides based on S. Thrun and K. Smith

Outline Recap: Kalman filter algorithm Extended Kalman filter algorithm Particle Filter 2

Recap: linear Gaussian state-space model Process noise Measurement noise 3
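
The model equations on this slide do not survive the transcription; a standard linear Gaussian state-space model consistent with the Kalman filter recap (notation as in Thrun's Probabilistic Robotics, assumed here) is:

```latex
% Linear Gaussian state-space model (standard form, assumed notation)
x_t = A_t x_{t-1} + B_t u_t + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, R_t)  % motion model with process noise
z_t = C_t x_t + \delta_t, \qquad \delta_t \sim \mathcal{N}(0, Q_t)                          % measurement model with measurement noise
```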

Recap: Kalman filter = Bayes update rule + linear Gaussian models. [Diagram: the posterior at t-1 is propagated through the motion model (predict), then corrected with the measurement likelihood (measure/correct) to give the posterior at t.] 4

Recap: Kalman filter algorithm 1. Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t): 2. Prediction: 3. 4. 5. Correction: 6. 7. 8. Requirements: Initial belief is Gaussian; linear Gaussian state-space model. 9. Return μ_t, Σ_t 5
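
The equation bodies of steps 3-4 and 6-8 are missing from the transcription; the standard Kalman filter prediction and correction steps, in the notation of the model above, are:

```latex
% Prediction (steps 3-4)
\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t
\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^{\top} + R_t
% Correction (steps 6-8)
K_t = \bar{\Sigma}_t C_t^{\top} \left( C_t \bar{\Sigma}_t C_t^{\top} + Q_t \right)^{-1}
\mu_t = \bar{\mu}_t + K_t \left( z_t - C_t \bar{\mu}_t \right)
\Sigma_t = \left( I - K_t C_t \right) \bar{\Sigma}_t
```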

Constant velocity model Assuming that the object follows a constant velocity model: Better prediction. Still problems in case of accelerations. Continued estimate, even if we don't have measurements. 6
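
As a concrete illustration (the state layout is an assumption, not taken from the slide), a constant velocity model stacks position and velocity and carries the velocity forward at each step:

```latex
% Constant velocity model: state = position and velocity, time step \Delta t
x_t = \begin{pmatrix} p_t \\ v_t \end{pmatrix}, \qquad
A = \begin{pmatrix} I & \Delta t\, I \\ 0 & I \end{pmatrix}, \qquad
x_t = A\, x_{t-1} + \varepsilon_t
% Because the velocity is part of the state, the filter keeps predicting a
% moving object even when measurements are missing.
```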

Kalman filter limitations Applies to linear Gaussian models Many visual tracking problems are nonlinear, e.g. as soon as we move to tracking in 3D 7 Courtesy of G. Panin

Humanoid robot catches flying balls Several cameras observe flying balls The 3D trajectory is estimated A robot is supposed to catch the balls nonlinear estimation problem 8 Courtesy of U. Frese

Extended Kalman filter Slides based on S. Thrun

Nonlinear state-space model, Gaussian noise Motion model: Nonlinear function (most general model) Measurement model: Nonlinear function Often used: model with additive noise 10
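
Written out in the standard additive-noise form (assumed to match the slide):

```latex
% Nonlinear state-space model with additive Gaussian noise
x_t = g(u_t, x_{t-1}) + \varepsilon_t, \qquad \varepsilon_t \sim \mathcal{N}(0, R_t)  % nonlinear motion model
z_t = h(x_t) + \delta_t, \qquad \delta_t \sim \mathcal{N}(0, Q_t)                      % nonlinear measurement model
```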

Propagate Gaussian through linear function 11

Propagate Gaussian through nonlinear function PDF obtained from 500,000 Monte Carlo samples + histogramming Then sample mean and covariance 12

Bayes update rule Problem: We do not stay in the Gaussian world if the motion and/or measurement models are nonlinear functions of the state (a Gaussian pushed through a nonlinear function is no longer Gaussian). No general closed-form solution for the Bayes filter. Approximate solutions: Keep the functions and approximate the distributions (unscented Kalman filter, particle filter), or linearize the functions and use the Kalman filter again. 13

EKF linearization Monte Carlo transform + sample statistics → blue Gaussian. First order Taylor approximation → red Gaussian. Mismatch = linearization error. 14

First order Taylor approximation (multivariate) Jacobian matrix 15
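
The linearization expands the nonlinear functions around the current mean; in the standard form (assumed here), with the Jacobians evaluated at the mean:

```latex
% First order Taylor expansion of motion and measurement models
g(u_t, x_{t-1}) \approx g(u_t, \mu_{t-1}) + G_t\,(x_{t-1} - \mu_{t-1}), \qquad
G_t = \left.\frac{\partial g(u_t, x_{t-1})}{\partial x_{t-1}}\right|_{x_{t-1} = \mu_{t-1}}  % Jacobian of g
h(x_t) \approx h(\bar{\mu}_t) + H_t\,(x_t - \bar{\mu}_t), \qquad
H_t = \left.\frac{\partial h(x_t)}{\partial x_t}\right|_{x_t = \bar{\mu}_t}  % Jacobian of h, at the predicted mean
```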

EKF linearization of state-space model (additive noise) Jacobian matrix Jacobian matrix 16

Extended Kalman filter: additive noise 1. Extended_Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t): 2. Prediction: 3. 4. Gauss approximation formula 5. Correction: 6. 7. 8. 9. Return μ_t, Σ_t 17
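
Again the equation bodies are not reproduced in the transcription; the standard EKF steps with additive noise, using the Jacobians G_t and H_t from the linearization, are:

```latex
% Prediction
\bar{\mu}_t = g(u_t, \mu_{t-1})
\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^{\top} + R_t
% Correction
K_t = \bar{\Sigma}_t H_t^{\top} \left( H_t \bar{\Sigma}_t H_t^{\top} + Q_t \right)^{-1}
\mu_t = \bar{\mu}_t + K_t \left( z_t - h(\bar{\mu}_t) \right)
\Sigma_t = \left( I - K_t H_t \right) \bar{\Sigma}_t
```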

Linearization of state-space model (non-additive noise) Jacobians Jacobians 18

Gauss approximation formula (1st order) 19

Extended Kalman filter: non-additive noise 1. Extended_Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t): 2. Prediction: 3. 4. 5. Correction: 6. 7. 8. 9. Return μ_t, Σ_t 20
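
For non-additive noise, the standard formulation (assumed here) additionally linearizes with respect to the noise variables, with Jacobians V_t = ∂g/∂ε and W_t = ∂h/∂δ:

```latex
% Prediction (non-additive process noise, Jacobian V_t w.r.t. the noise)
\bar{\mu}_t = g(u_t, \mu_{t-1}, 0)
\bar{\Sigma}_t = G_t \Sigma_{t-1} G_t^{\top} + V_t R_t V_t^{\top}
% Correction (non-additive measurement noise, Jacobian W_t w.r.t. the noise)
K_t = \bar{\Sigma}_t H_t^{\top} \left( H_t \bar{\Sigma}_t H_t^{\top} + W_t Q_t W_t^{\top} \right)^{-1}
\mu_t = \bar{\mu}_t + K_t \left( z_t - h(\bar{\mu}_t, 0) \right)
\Sigma_t = \left( I - K_t H_t \right) \bar{\Sigma}_t
```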

EKF linearization: problems Bigger uncertainty → bigger linearization error 21

EKF linearization: problems 22

EKF linearization: problems Higher local nonlinearity → bigger linearization error 23

EKF linearization: problems 24

(Extended) Kalman filter: limitations No multimodal distributions 25

(Extended) Kalman filter: limitations Unimodal distributions fail for unpredicted motion Prediction too far from actual location to recover 26

Extended Kalman filter: summary 27

Monte Carlo (particle) approximation Non-parametric representation, as a set of weighted particles PDF of faces appearing in the image 28
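
As a formula, the weighted-particle representation approximates the density by a sum of weighted point masses (standard form):

```latex
% Non-parametric approximation of a PDF by N weighted particles
p(x) \approx \sum_{n=1}^{N} w^{(n)}\, \delta\!\left(x - x^{(n)}\right), \qquad
\sum_{n=1}^{N} w^{(n)} = 1
```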

Filter methods (rules-of-thumb) All are implementations or approximations of the Bayes filter: linear Gaussian models → Kalman filter; nonlinear models, Gaussian noises → extended Kalman filter / unscented Kalman filter; (non)linear models, Gaussian noises, multimodal → Kalman filter banks; highly nonlinear models, non-Gaussian noises, multimodal → particle filter. 29

Particle filters Go by many names: sequential Monte Carlo methods, sequential importance resampling (SIR), bootstrap filters, Condensation trackers, survival of the fittest. Originally used for problems in statistics, fluid mechanics, statistical mechanics and signal processing. Introduced to the computer vision community by Michael Isard and Andrew Blake, "CONDENSATION: Conditional Density Propagation for Visual Tracking", International Journal of Computer Vision (IJCV), 29(1), 5-28, 1998. 30

Introductory example Mobile robot localization in a known environment based on motion (odometry) and sensing (door detection) The robot knows a map of the environment (corridor with 3 indistinguishable doors) Wanted: position of the robot in the map 31

Prior particle distribution (uniform): initially, the state is unknown. Measurement z_1: here is a door. Weighting by the measurement likelihood gives the posterior particle distribution: particles close to doors have higher weights. 32

Resampling: Generate a new particle set by drawing particles (with replacement) with probabilities according to their weights. (Particles with high weights will be duplicated.) Odometry input u_1: 1 m forward. Move and diffuse (with noise samples) the particles according to the motion model. 33

Measurement z_2: here is a door. Update weights by multiplying with the measurement likelihood. Resampling: Generate a new particle set by drawing particles (with replacement) with probabilities according to their weights. 34

Particle filter algorithm: overview Initialization: Randomly draw particles from the state-space. Measurement update: Multiply weights with measurement likelihood (plausibility). Resampling: Randomly draw new particles from the old set with probabilities proportional to the weights. Time update: Move particles according to the motion model. This includes diffusing the particles by adding random noise. 35
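
A minimal sketch of one such iteration in Python, following the resample → move/diffuse → measure ordering of the SIR slides below; sample_motion and measurement_likelihood are hypothetical user-supplied functions, not names from the slides:

```python
import numpy as np

def particle_filter_step(particles, weights, u, z,
                         sample_motion, measurement_likelihood, rng):
    """One SIR iteration: resample, propagate with noise, reweight by the likelihood."""
    n = len(particles)
    # Resampling: draw particle indices with probability proportional to the weights
    idx = rng.choice(n, size=n, p=weights)
    particles = particles[idx]
    # Time update: move and diffuse each particle with the (noisy) motion model
    particles = np.array([sample_motion(x, u, rng) for x in particles])
    # Measurement update: weight by the measurement likelihood and normalize
    weights = np.array([measurement_likelihood(z, x) for x in particles])
    weights = weights / weights.sum()
    return particles, weights
```

Initialization would draw the particles uniformly from the state space and set all weights to 1/N; the filter is then just this step applied at every time instant.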

Particle filter algorithm: overview Question: What does the robot know after each step? Initialization: The robot knows nothing. Measurement update: The robot knows that some hypotheses (particles) are more plausible. Resampling: The robot knows the same, but in a different representation. It concentrates its resources on the more likely areas of the state-space. Time update: The robot knows: If I was previously here, I am now here. However, uncertainty grows. 36

Particle filter Advantages: Easy to implement. Not restricted to Gaussian densities. Handles highly nonlinear functions and multimodal distributions. Very successful in practical applications. Disadvantages: Needs many particles in high-dimensional state-spaces. Precision highly depends on the number of particles. [Figure: a target distribution (Gaussian mixture) approximated with too few samples vs. with added samples.] 37

Particle filter in more detail Along a concrete tracking example Slides based on K. Smith

Example: tracking based on color histograms Track girl in pink 39

What is a particle? A sample of the posterior. Particles contain a state estimate/hypothesis and a weight. Summing the particles gives an approximation to the target distribution. 40

What is a particle? Each particle contains a state estimate and a weight. Constant velocity model, fixed bounding box size (could be improved by also estimating the bounding box size). 41

SIR particle filter Begin with weighted samples from t-1. Resample: draw samples according to {w_{t-1}^{(n)}}_{n=1:N}. Move: apply motion model (no noise). Diffuse: apply noise to spread particles. Measure: weights are assigned by likelihood response. Finish: density estimate. 42

SIR particle filter (mapped to the Bayes filter stages): Previous estimate: begin with weighted samples from t-1. Prediction: resample (draw samples according to {w_{t-1}^{(n)}}_{n=1:N}), move (apply motion model, no noise) and diffuse (apply noise to spread particles). Correction: measure (weights are assigned by likelihood response). Posterior: finish with the density estimate. 43

SIR particle filter Begin with weighted samples from t-1 44

Previous estimate Receive posterior estimate from previous time step 45

SIR particle filter Resample: draw samples according to {w_{t-1}^{(n)}}_{n=1:N}. N new samples are drawn from the previous set with replacement. New samples are assigned uniform weights. 46

Resample N new samples are drawn from the previous set with replacement to prevent degeneracy. Repeated samples occur by design. Weighted sampling with replacement New sample set is given uniform weights 47

Resample N new samples are drawn from the previous set with replacement to prevent degeneracy. Repeated samples occur by design. Weighted sampling with replacement New sample set is given uniform weights Samples still in same place, but more likely ones are duplicated 48

Degeneracy Failing to resample results in degeneracy: iteratively propagating the particles and assigning weights tends to make a few samples dominate the rest, so over time the distribution is no longer representative. 49

Resampling 50

Naive solution: independent resampling High sampling variance. [Figure: the unit interval partitioned into segments of length w_1, ..., w_n; each sample is drawn independently.] 51

Better solution: systematic resampling [Figure: the unit interval partitioned into segments of length w_1, ..., w_n, sampled with equally spaced pointers.] 52

Systematic resampling: algorithm 53

Systematic resampling: pseudo code 1. Algorithm systematic_resampling(S, n): 2. Initialize empty particle set 3. For each particle: generate cumulative weights 4. Generate random number 5. For each new sample: draw samples with replacement 6. While (cdf < u_j): search for 1st interval with cdf > u_j 7. 8. Insert into new set (normalized weight) 9. Next index 10. Return S 54
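
The equation bodies of the pseudo code are missing from the transcription; a sketch of the standard systematic (low-variance) resampler in Python, under the usual formulation:

```python
import numpy as np

def systematic_resampling(particles, weights, rng):
    """Draw N particles using one random offset and N equally spaced pointers."""
    n = len(weights)
    cdf = np.cumsum(weights)                  # cumulative weights c_1 ... c_n
    cdf[-1] = 1.0                             # guard against rounding error
    u = (rng.random() + np.arange(n)) / n     # u_j = u_1 + (j-1)/n, u_1 ~ U[0, 1/n)
    i = 0
    idx = np.empty(n, dtype=int)
    for j in range(n):
        while u[j] > cdf[i]:                  # search for 1st interval with cdf > u_j
            i += 1
        idx[j] = i
    new_particles = particles[idx]            # duplicates of likely particles occur by design
    new_weights = np.full(n, 1.0 / n)         # resampled particles get uniform weights
    return new_particles, new_weights
```

Because all pointers share a single random offset, the sampling variance is much lower than with n independent draws.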

SIR particle filter: predict Apply the (non-)linear motion model p(x_t | x_{t-1}, u_t) to every particle! Move: apply motion model (no noise). Diffuse: apply noise to spread particles. 55

Motion model (here linear Gaussian) Apply the motion model p(x_t | x_{t-1}) to every particle! (linear motion term + noise model) 56

SIR particle filter: measure Measure: weights are proportional to the measurement likelihood 57

Measurement model 58

Measurement model 59

Measurement model (based on color histograms) LAB space: the LAB color histogram is designed to approximate human vision (L: lightness; A, B: color). Measurements are obtained by converting the pixel values within a bounding box to LAB color space and concatenating them to form an AB channel histogram. 60
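
A sketch of such a histogram measurement in Python, assuming the bounding-box crop has already been converted to 8-bit LAB (e.g. with an image library); the bin count and normalization are illustrative choices, not taken from the slides:

```python
import numpy as np

def ab_histogram(lab_patch, bins=16):
    """2D histogram over the A and B channels of a LAB image patch.

    lab_patch: H x W x 3 array with channels (L, A, B), 8-bit values in [0, 255].
    The L channel is ignored, which makes the descriptor largely invariant to
    lightness changes.
    """
    a = lab_patch[..., 1].ravel()
    b = lab_patch[..., 2].ravel()
    hist, _, _ = np.histogram2d(a, b, bins=bins, range=[[0, 256], [0, 256]])
    hist = hist / (hist.sum() + 1e-12)   # normalize so it can be compared as a distribution
    return hist
```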

Measurement model known color model 61

Measurement model poor match good match known color model 62

Excursion: Kullback-Leibler divergence Determines how peaked the likelihood function is. Compares the measured color model (histogram) against the known color model (histogram). 63
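
The divergence between the known model histogram p and the measured histogram q is the standard definition below; turning it into a particle weight via an exponential is a common choice and an assumption here, since the slide's exact likelihood is not in the transcription:

```latex
% Kullback-Leibler divergence between model histogram p and measured histogram q
D_{\mathrm{KL}}(p \,\|\, q) = \sum_{i} p_i \log \frac{p_i}{q_i}
% One common choice (an assumption here) for the particle weight:
w \;\propto\; \exp\!\left(-\lambda\, D_{\mathrm{KL}}(p \,\|\, q)\right)
```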

Measurement model poor match good match 64

Measurement model 65

SIR particle filter Begin with weighted samples from t-1. Resample: draw samples according to {w_{t-1}^{(n)}}_{n=1:N}. Move: apply motion model (no noise). Diffuse: apply noise to spread particles. Measure: weights are assigned by likelihood response. Finish: density estimate. 66

Obtaining a solution So far, we do not have an explicit state estimate, we have a cloud of particles! How do we extract an answer? It depends: Compute a weighted mean (confidence: inverse variance). For discrete labels, this does not work! Use the mode? 67
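
For continuous states, the weighted-mean estimate and a simple spread measure are:

```latex
% Weighted mean of the particle set as the state estimate
\hat{x}_t = \sum_{n=1}^{N} w_t^{(n)}\, x_t^{(n)}
% Weighted covariance as the uncertainty (confidence ~ inverse variance)
\hat{\Sigma}_t = \sum_{n=1}^{N} w_t^{(n)} \left(x_t^{(n)} - \hat{x}_t\right)\left(x_t^{(n)} - \hat{x}_t\right)^{\top}
```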

Obtaining a solution 68

Particle filter: relation to Bayes update rule

Importance sampling principle Target distribution vs. proposal distribution. Indicator function: is 1 if its argument is true. 70
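
The core relation: samples drawn from the proposal q are reweighted so that expectations under the target p can still be approximated:

```latex
% Importance sampling: draw x^{(n)} ~ q, then weight by the target/proposal ratio
w^{(n)} \;\propto\; \frac{p\!\left(x^{(n)}\right)}{q\!\left(x^{(n)}\right)}, \qquad
\mathbb{E}_{p}\!\left[f(x)\right] \;\approx\; \sum_{n=1}^{N} w^{(n)} f\!\left(x^{(n)}\right)
\quad \text{with } \sum_{n} w^{(n)} = 1
```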

Particle filters in action Example: head tracking (likelihood based on skin color) Video of K. Smith 71

Particle filter: people tracking example Uses an adaptive classifier based on AdaBoost → past lecture. D. Klein, D. Schulz, S. Frintrop, and A. Cremers, "Adaptive Real-Time Video Tracking for Arbitrary Objects", International Conference on Intelligent Robots and Systems (IROS), 2010. 72

Particle filters in action Michael Isard and Andrew Blake, "CONDENSATION: Conditional Density Propagation for Visual Tracking", International Journal of Computer Vision (IJCV), 29(1), 5-28, 1998. 73

Particle filters in action Tracking a ball. The particle filter recovers thanks to 1. multi-modality and 2. random sampling. 74

Summary: particle filter algorithm Particle filters are an implementation of recursive Bayesian filtering. They represent the posterior by a set of weighted samples. In the context of localization, the particles are propagated according to the motion model. They are then weighted according to the likelihood of the observations. In a resampling step, new particles are drawn with a probability proportional to their weight. 75

Summary: particle filters Represent arbitrary (multimodal) densities. Handle nonlinear, non-Gaussian systems. Concentrate particles on interesting regions (by resampling). The number of samples is important: use as few as necessary (for efficiency), but use enough to do a good job exploring the state space. Complexity grows exponentially with the dimensionality of the state space (e.g. with cost C = 3 per target, three parallel trackers cost 3C = 9, while one joint tracker over the combined state costs C^3 = 27). Further reading: advanced particle filters are described, e.g., in Probabilistic Robotics by S. Thrun. 76

Thank you!

EKF application example Set-up: Two fully calibrated cameras observe a flying ball Measurements: 2D ball positions in each image (feature level) Wanted: (predicted) 3D position of ball 78

EKF application example 79

Image formation Perspective projection Camera intrinsics Camera translation 80
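
The projection equations themselves are not in the transcription; the standard pinhole model with intrinsics K and camera pose (R, t), assumed here, reads:

```latex
% Perspective projection of a 3D point X into camera c (pinhole model)
\lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = K_c \left( R_c X + t_c \right), \qquad
K_c = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
% With two calibrated cameras, stacking the two 2D projections gives the
% 4-dimensional measurement z_t used by the EKF.
```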

State-space model EKF required + linearization of measurement model 81

EKF linearization (1st order Taylor approximation) Jacobian 82

Measurement Jacobian (4x6): rows correspond to the output (the stacked 2D image measurements), columns to the state. 83

Measurement Jacobian (4x6) 84

Extended Kalman filter (adapted to example) 1. Extended_Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t): 2. Prediction: 3. 4. 5. Correction: 6. 7. 8. 9. Return μ_t, Σ_t Motion model is linear; nonlinear measurement model has additive noise. 85