Gaussian Mixture Filter in Hybrid Navigation

Digest of TISE Seminar 2007, editor Pertti Koivisto, pages 1-5.

Simo Ali-Löytty
Institute of Mathematics, Tampere University of Technology
simo.ali-loytty@tut.fi
http://math.tut.fi/posgroup/

1 Introduction

The Gaussian mixture filter (GMF) is an approximation of the Bayesian filter (see Section 2) in which the prior and posterior densities are Gaussian mixtures, i.e. convex combinations of normal density functions. The Kalman filter, the extended Kalman filter (EKF), the second order extended Kalman filter and the bank of Kalman filters are special cases of the GMF [2, 4]. Hybrid navigation means that the measurements used in navigation come from many different sources, e.g. a Global Navigation Satellite System (such as GPS), an inertial measurement unit, or local wireless networks such as a cellular network, WLAN, or Bluetooth. Range, pseudorange, deltarange, altitude, restrictive [3] and compass measurements are typical measurements in hybrid navigation. The EKF is very commonly used in satellite-based positioning and has also been applied to hybrid navigation. Unfortunately, the EKF has a serious consistency problem in highly nonlinear situations, meaning that it does not work correctly [4].

2 Bayesian filter

The Bayesian filtering problem is specified by three things: the initial state $x_0$, the state model and the measurement model. The state model tells how the next state $x_{k+1}$ depends on the current state $x_k$:
$$x_{k+1} = f(x_k) + w_k \quad \text{or} \quad p(x_{k+1} \mid x_k).$$
The measurement model tells how the measurements depend on the current state:
$$y_k = h(x_k) + v_k \quad \text{or} \quad p(y_k \mid x_k).$$
Here the states $x$, the measurements $y$ and the error terms $w$ and $v$ are random variables. The aim of Bayesian filtering is to solve the conditional probability density function (cpdf) of the state, $p(x_k \mid y_{1:k})$, where $y_{1:k} = \{y_1, \ldots, y_k\}$ are the past and current measurements. In the hybrid navigation case the cpdf cannot be determined analytically. Because of this there are many approximate solutions, for example the particle filter, grid-based methods and the GMF, which is the topic of the next section [6, 7].
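Although the text does not write it out, the cpdf satisfies the standard two-step recursion that every approximate filter mimics: prediction with the state model, followed by a Bayes update with the measurement model,
$$p(x_{k+1} \mid y_{1:k}) = \int p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})\, \mathrm{d}x_k,$$
$$p(x_{k+1} \mid y_{1:k+1}) \propto p(y_{k+1} \mid x_{k+1})\, p(x_{k+1} \mid y_{1:k}).$$
These two steps are intractable for general nonlinear $f$ and $h$; the GMF approximates them by keeping every density in Gaussian mixture form.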

3 Gaussian Mixture Filter

3.1 Basics of GMF

The idea of the GMF [5] is that both the prior density $p(x_k \mid y_{1:k-1})$ and the posterior density $p(x_k \mid y_{1:k})$ are Gaussian mixtures

$$p(x) = \sum_{i=1}^{p} \alpha_i \, \mathrm{N}_{\Sigma_i}^{\mu_i}(x), \qquad (1)$$

where $\mathrm{N}_{\Sigma_i}^{\mu_i}(x)$ is the normal density function with mean $\mu_i$ and covariance matrix $\Sigma_i$, and the weights satisfy $\alpha_i \geq 0$ and $\sum_{i=1}^{p} \alpha_i = 1$. The mean and covariance of the Gaussian mixture (1) are

$$\mu = \sum_{i=1}^{p} \alpha_i \mu_i \quad \text{and} \quad \Sigma = \sum_{i=1}^{p} \alpha_i \left( \Sigma_i + (\mu_i - \mu)(\mu_i - \mu)^T \right).$$

We assume that the prior $p(x)$ is of the form (1) and that the likelihood is

$$p(y \mid x) = \sum_{j=1}^{m} \beta_j \, \mathrm{N}_{R_j}^{H_j x}(y).$$

By Bayes' rule, the posterior is then

$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)} = \frac{\sum_{j=1}^{m} \sum_{i=1}^{p} \alpha_i \beta_j \, \mathrm{N}_{P_{i,j}}^{H_j \mu_i}(y)\, \mathrm{N}_{\hat{P}_{i,j}}^{\hat{x}_{i,j}}(x)}{\sum_{j=1}^{m} \sum_{i=1}^{p} \alpha_i \beta_j \, \mathrm{N}_{P_{i,j}}^{H_j \mu_i}(y)}, \qquad (2)$$

where

$$P_{i,j} = H_j \Sigma_i H_j^T + R_j, \quad \hat{x}_{i,j} = \mu_i + \Sigma_i H_j^T P_{i,j}^{-1} (y - H_j \mu_i) \quad \text{and} \quad \hat{P}_{i,j} = (I - \Sigma_i H_j^T P_{i,j}^{-1} H_j) \Sigma_i.$$

We see that the posterior is also a Gaussian mixture.
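The update (2) is a bank of Kalman filter measurement updates, one per prior-likelihood component pair, with the weights rescaled by the predictive density of $y$. The following minimal Python sketch illustrates this; the function name gm_update and the lists-of-arrays interface are illustrative choices, not from the paper.

```python
import numpy as np

def gm_update(alphas, mus, Sigmas, betas, Hs, Rs, y):
    """Measurement update (2): prior sum_i alphas[i] N(mus[i], Sigmas[i]),
    likelihood sum_j betas[j] N(Hs[j] x, Rs[j]); returns the posterior mixture."""
    w, xs, Ps = [], [], []
    for a, mu, Sig in zip(alphas, mus, Sigmas):
        for b, H, R in zip(betas, Hs, Rs):
            S = H @ Sig @ H.T + R                       # P_ij, innovation covariance
            K = Sig @ H.T @ np.linalg.inv(S)            # Kalman gain
            innov = y - H @ mu
            # new weight: old weights times predictive density N(y; H mu, P_ij)
            dens = (np.exp(-0.5 * innov @ np.linalg.solve(S, innov))
                    / np.sqrt(np.linalg.det(2 * np.pi * S)))
            w.append(a * b * dens)
            xs.append(mu + K @ innov)                   # posterior mean x_ij
            Ps.append((np.eye(len(mu)) - K @ H) @ Sig)  # posterior covariance P_ij
    w = np.array(w)
    return w / w.sum(), xs, Ps                          # normalized weights
```

With a one-component prior and a one-component likelihood ($p = m = 1$) this reduces to the ordinary Kalman filter measurement update, which is one way to see that the Kalman filter is a special case of the GMF.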

3.2 Where do mixtures come from?

Here are some situations in which the GMF may be preferred over the conventional nonlinear Kalman filter extensions, which can be considered special (i.e. one-component) cases of the GMF.

Models It is clear that if our initial state or our error models are Gaussian mixtures, then the GMF is an obvious solution. In hybrid navigation, for example, we can create more realistic error models using a Gaussian mixture than using only one Gaussian.

Approximation Even if our models are not Gaussian mixtures, it is possible to approximate our density functions as Gaussian mixtures, because it has been shown that any probability density can be approximated as closely as desired by a Gaussian mixture. In hybrid navigation, for example, if we can compute the likelihood peaks $z_j$, then we can approximate the likelihood as a Gaussian mixture

$$p(y \mid x) \approx \sum_{j=1}^{m} \frac{\exp\left(-\frac{1}{2} \left\| y - h(z_j) - h'(z_j)(x - z_j) \right\|_{R^{-1}}^2 \right)}{\sqrt{\det(2\pi R)}}.$$

Robustifying Sometimes filters do not work correctly, usually because of approximation errors or modeling errors. One way to detect that something is wrong is to check the normalization factor $p(y)$: if $p(y)$ is smaller than a threshold value, then either the prior or the measurement is wrong at some risk level. In that case

$$\gamma p(x) + (1 - \gamma) \sum_{j=1}^{m} \alpha_j \, \mathrm{N}_{H^{-1} R H^{-T}}^{z_j}(x),$$

where $\gamma \in [0, 1]$ and $H = h'(z_j)$, may be a more reasonable posterior than (2).

3.3 Component reduction

One major challenge in using the GMF efficiently is keeping the number of components as small as possible without losing significant information. There are many ways to do so; we use three different types of mixture reduction algorithms: forgetting, merging and resampling. A code sketch of the first two follows this list.

Forgetting We give zero weight to mixture components whose weights are lower than some threshold value, for example

$$\min\left(0.001,\; 0.01 \max_i(\alpha_i)\right).$$

After that, we normalize the weights of the remaining mixture components.

Merging We merge two mixture components into one if the distance between the components is lower than some threshold value. The distance is, for example,

$$d_{ij} = \frac{\alpha_i \alpha_j}{\alpha_i + \alpha_j} (\mu_i - \mu_j)^T \Sigma^{-1} (\mu_i - \mu_j).$$

We merge components so that merging preserves the overall mean and covariance. This method, collapsing by moments, is optimal in the sense of the Kullback-Leibler distance.

Resampling If after forgetting and merging we still have too many mixture components, we can use a resampling algorithm to choose which mixture components we keep. Finally, we normalize the weights of these mixture components. This approach induces less approximation error, in the $L^1$-norm sense, than merging two distant components.
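The forgetting and merging steps are easy to make concrete. Below is a minimal Python sketch under stated assumptions: the helper names are illustrative, the mixture is stored as parallel arrays and lists, and the scaling matrix $\Sigma$ in the distance $d_{ij}$ is passed in as a parameter, since the text does not fix its choice.

```python
import numpy as np

def forget(alphas, mus, Sigmas, rel=0.01, floor=0.001):
    """Forgetting: drop components whose weight falls below
    min(floor, rel * max weight), then renormalize the survivors."""
    thr = min(floor, rel * np.max(alphas))
    keep = alphas >= thr
    a = alphas[keep]
    mus = [m for m, k in zip(mus, keep) if k]
    Sigmas = [S for S, k in zip(Sigmas, keep) if k]
    return a / a.sum(), mus, Sigmas

def merge_distance(ai, aj, mi, mj, Sigma):
    """Distance d_ij used to decide whether components i and j are merged."""
    d = mi - mj
    return ai * aj / (ai + aj) * d @ np.linalg.solve(Sigma, d)

def merge_pair(ai, mi, Si, aj, mj, Sj):
    """Collapse two components by moment matching, preserving the
    pair's overall mean and covariance ('collapsing by moments')."""
    a = ai + aj
    m = (ai * mi + aj * mj) / a
    S = (ai * (Si + np.outer(mi - m, mi - m))
         + aj * (Sj + np.outer(mj - m, mj - m))) / a
    return a, m, S
```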

4 Simulations

Figure 1 gives an example in which the GMF, which approximates the likelihood as a Gaussian mixture, gives better results than the EKF. The measurements are range measurements from two base stations. More results and specific parameters will be published in [1].

[Figure 1: Example of an inconsistency problem of the EKF and how the GMF solves the problem. The plot shows the true route from the start point, the EKF estimate, the GMF mean and the GMF components; the scale bar is 100 m.]

References

[1] S. Ali-Löytty and N. Sirola. Gaussian mixture filter in hybrid navigation. In Proceedings of the European Navigation Conference GNSS 2007 (to be published).

[2] S. Ali-Löytty. Hybrid positioning algorithms. In P. Koivisto, editor, Digest of TISE Seminar 2006, volume 5, pages 43-46. TISE, 2006.

[3] S. Ali-Löytty and N. Sirola. A modified Kalman filter for hybrid positioning. In Proceedings of ION GNSS 2006, September 2006.

[4] S. Ali-Löytty, N. Sirola, and R. Piché. Consistency of three Kalman filter extensions in hybrid navigation. In Proceedings of the European Navigation Conference GNSS 2005, Munich, Germany, July 2005.

[5] D. L. Alspach and H. W. Sorenson. Nonlinear Bayesian estimation using Gaussian sum approximations. IEEE Transactions on Automatic Control, 17(4):439-448, August 1972.

[6] N. Sirola and S. Ali-Löytty. Moving grid filter in hybrid local positioning. In Proceedings of the ENC 2006, Manchester, May 7-10, 2006.

[7] N. Sirola, S. Ali-Löytty, and R. Piché. Benchmarking nonlinear filters. In Nonlinear Statistical Signal Processing Workshop NSSPW06, Cambridge, September 2006.