Appendix B: Resampling Algorithms

A common problem of all particle filters is the degeneracy of weights, which consists of the unbounded increase of the variance of the importance weights ω^[i] of the particles with time. The term "variance of the weights" must be understood as the potential variability of the weights among the possible different executions of the particle filter. In order to prevent this growth of variance, which entails a loss of particle diversity, one of a set of resampling methods must be employed, as was explained in chapter 7.

The aim of resampling is to replace an old set of N particles by a new one, typically with the same population size, but where particles have been duplicated or removed according to their weights. More specifically, the expected duplication count of the i-th particle, denoted by N_i, must tend to N ω^[i]. After resampling, all the weights become equal in order to preserve the importance sampling of the target pdf.

Deciding whether to perform resampling or not is most commonly done by monitoring the Effective Sample Size (ESS). As mentioned in chapter 7, the ESS provides a measure of the variance of the particle weights; e.g., the ESS tends to 1 when one single particle carries the largest weight and the rest have negligible weights in comparison. In the following we review the most common resampling algorithms.

1. REVIEW OF RESAMPLING ALGORITHMS

This section describes four different strategies for resampling a set of particles whose normalized weights are given by ω^[i], for i = 1, ..., N. All the methods will be explained using a visual analogy with a wheel whose perimeter is assigned to the different particles in such a way that the length of the perimeter associated with each particle is proportional to its weight. Therefore, picking a random direction on this wheel implies choosing a particle with a probability proportional to its weight. For a more formal description of the methods, please refer to the excellent reviews in (Arulampalam, Maskell, Gordon, & Clapp, 2002; Douc, Cappé, & Moulines, 2005). The four methods described here have O(N) implementations, that is, their execution times can be made linear with the number of particles (Carpenter, Clifford, & Fearnhead, 1999; Arulampalam, Maskell, Gordon, & Clapp, 2002).

Multinomial resampling: This is the most straightforward resampling method, where N independent random numbers are generated to pick particles from the old set. In the wheel analogy, illustrated in Figure 1, this method consists of picking N independent random directions from the center of the wheel and taking the pointed particle. The method is named after the fact that the probability mass function of the duplication counts N_i is a multinomial distribution with the weights as parameters. A naive implementation would have a time complexity of O(N log N), but by applying the method of simulating order statistics (Carpenter, Clifford, & Fearnhead, 1999) it can be implemented in O(N).
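To make the last two ideas concrete, here is a minimal Python sketch of the ESS test and of the naive multinomial scheme, assuming NumPy; the function names and the N/2 trigger threshold shown in the usage comment are illustrative choices added in this transcription, not part of the original text.

import numpy as np

def effective_sample_size(w):
    # ESS = 1 / sum_i (w_i^2): it equals N for uniform weights and tends
    # to 1 when a single particle concentrates almost all the weight.
    return 1.0 / np.sum(np.asarray(w) ** 2)

def multinomial_resample(w, rng=None):
    # Pick N independent random directions on the "wheel": each uniform
    # draw is mapped through the cumulative weights to a particle index.
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w)
    cdf = np.cumsum(w)
    cdf[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cdf, rng.uniform(size=len(w)))

# Typical usage: resample only when the ESS drops below a threshold, e.g. N/2:
#   if effective_sample_size(w) < len(w) / 2:
#       idx = multinomial_resample(w)  # duplicate/remove particles via idx,
#                                      # then reset all weights to 1/N

The searchsorted call makes this version O(N log N); the O(N) variant mentioned above would instead generate the N uniform draws already sorted (by simulating order statistics) and sweep them past the cumulative weights in a single pass.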

Figure 1. The multinomial resampling algorithm.

Figure 2. The residual resampling algorithm. The shaded areas represent the integer parts N_i = ⌊N ω^[i]⌋, scaled by 1/N. The residual parts of the weights, after subtracting these areas, are taken as the modified weights ω̃^[i].

Figure 3. The stratified resampling algorithm. The entire circumference is divided into N equal parts, represented as the N circular sectors of 1/N perimeter length each.

Figure 4. The systematic resampling algorithm.

Residual resampling: This method comprises two stages, as can be seen in Figure 2. Firstly, particles are resampled deterministically by picking N_i = ⌊N ω^[i]⌋ copies of the i-th particle, where ⌊x⌋ stands for the floor of x, the largest integer less than or equal to x. Then, multinomial sampling is performed with the residual weights ω̃^[i] = ω^[i] − N_i / N (see Figure 2).

Stratified resampling: In this method, the wheel representing the old set of particles is divided into N equally-sized segments, as represented in Figure 3. Then, N numbers are independently generated from a uniform distribution, as in multinomial sampling, but instead of mapping each draw to the entire circumference, each one is mapped within its corresponding partition out of the N ones.

Systematic resampling: Also called universal sampling, this popular technique draws only one random number, i.e., one direction on the wheel, with the other N − 1 directions being fixed at 1/N increments from that randomly picked direction.
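The three schemes above differ from the multinomial one only in how the N directions on the wheel are generated. Continuing the hedged Python sketch from the previous section (same assumptions; _wheel_lookup is another illustrative helper, not taken from the source), the contrast can be written as:

import numpy as np

def _wheel_lookup(w, u):
    # Map each direction u in [0, 1) to the particle owning that stretch
    # of the wheel's perimeter.
    cdf = np.cumsum(w)
    cdf[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cdf, u)

def residual_resample(w, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w)
    N = len(w)
    counts = np.floor(N * w).astype(int)      # deterministic copies N_i
    idx = np.repeat(np.arange(N), counts)
    residual = w - counts / N                 # modified weights
    n_rest = N - counts.sum()                 # slots left for stage two
    if n_rest > 0:
        residual = residual / residual.sum()  # renormalize for the multinomial stage
        idx = np.concatenate([idx, _wheel_lookup(residual, rng.uniform(size=n_rest))])
    return idx

def stratified_resample(w, rng=None):
    # One independent uniform draw inside each of the N equal sectors.
    rng = np.random.default_rng() if rng is None else rng
    N = len(w)
    u = (np.arange(N) + rng.uniform(size=N)) / N
    return _wheel_lookup(np.asarray(w), u)

def systematic_resample(w, rng=None):
    # A single random draw; the other N - 1 directions sit at fixed
    # 1/N increments from it.
    rng = np.random.default_rng() if rng is None else rng
    N = len(w)
    u = (np.arange(N) + rng.uniform()) / N
    return _wheel_lookup(np.asarray(w), u)

Note that stratified and systematic resampling differ by a single line: whether each of the N sectors gets its own uniform draw or all share one, which is why the systematic method needs exactly one call to the random number generator per resampling step.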

2. COMPARISON OF THE DIFFERENT METHODS

In the context of Rao-Blackwellized Particle Filters (RBPF), where each particle carries a hypothesis of the complete history of the system state evolution, resampling becomes a crucial operation that reduces the diversity of the PF estimate for past states. We saw the application of those filters to SLAM in chapter 9.

In order to evaluate the impact of the resampling strategy on this loss, the four resampling methods discussed above have been evaluated in a benchmark that measures the diversity of different states remaining after t time steps, assuming all the states were initially different. The results, displayed in Figure 5, agree with the theoretical conclusions in Douc, Cappé, and Moulines (2005): multinomial resampling is the worst of the four methods in terms of variance of the sample weights.

Figure 5. A simple benchmark to measure the loss of hypothesis diversity with time in an RBPF for the four different resampling techniques discussed in this appendix. The multinomial method clearly emerges as the worst choice.

Therefore, due to its simple implementation and good results, the systematic method is recommended when using a static number of particles in all the iterations. If a dynamic number of samples is desired, things get more involved, and it is recommended to switch to a specific particle filter algorithm which simultaneously takes this particularity into account while also aiming at optimal sampling (Blanco, González, & Fernández-Madrigal, 2010).

REFERENCES

Arulampalam, M. S., Maskell, S., Gordon, N., & Clapp, T. (2002). A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing, 50(2), 174-188. doi:10.1109/78.978374

Blanco, J. L., González, J., & Fernández-Madrigal, J. A. (2010). Optimal filtering for non-parametric observation models: Applications to localization and SLAM. The International Journal of Robotics Research, 29(14), 1726-1742. doi:10.1177/0278364910364165

Carpenter, J., Clifford, P., & Fearnhead, P. (1999). Improved particle filter for nonlinear problems. IEE Proceedings - Radar, Sonar and Navigation, 146(1), 2-7. doi:10.1049/ip-rsn:19990255

Douc, R., Cappé, O., & Moulines, E. (2005). Comparison of resampling schemes for particle filtering. In Proceedings of the 4th International Symposium on Image and Signal Processing and Analysis (pp. 64-69). IEEE.