Digest of TISE Seminar 2007, editor Pertti Koivisto, pages 1-5.

Gaussian Mixture Filter in Hybrid Navigation

Simo Ali-Löytty
Institute of Mathematics, Tampere University of Technology
simo.ali-loytty@tut.fi
http://math.tut.fi/posgroup/

1 Introduction

The Gaussian Mixture Filter (GMF) is an approximation of the Bayesian filter (see Section 2) in which the prior and posterior densities are Gaussian mixtures, i.e. convex combinations of normal density functions. The Kalman Filter, the Extended Kalman Filter (EKF), the Second Order Extended Kalman Filter and the Bank of Kalman Filters are special cases of the GMF [2, 4].

Hybrid navigation means that the measurements used in navigation come from many different sources, e.g. a Global Navigation Satellite System (such as GPS), an Inertial Measurement Unit, or local wireless networks such as a cellular network, WLAN, or Bluetooth. Range, pseudorange, deltarange, altitude, restrictive [3] and compass measurements are examples of typical measurements in hybrid navigation. The EKF is very commonly used in satellite-based positioning and has also been applied to hybrid navigation. Unfortunately, the EKF has serious consistency problems in highly nonlinear situations, which means that the EKF does not work correctly [4].

2 Bayesian filter

The Bayesian filtering problem formulation includes three things: the initial state x_0, the state model and the measurement model. The state model tells how the next state x_{k+1} depends on the current state x_k,

    x_{k+1} = f(x_k) + w_k   or   p(x_{k+1} | x_k).

The measurement model tells how the measurements depend on the current state,

    y_k = h(x_k) + v_k   or   p(y_k | x_k).

Here the states x, the measurements y and the error terms w and v are random variables. The aim of Bayesian filtering is to solve the state conditional probability density function (cpdf) p(x_k | y_{1:k}), where y_{1:k} = {y_1, ..., y_k} are the past and current measurements. In the hybrid navigation case the cpdf cannot be determined analytically.
Because of this, there are many approximative solutions, for example the Particle Filter, the Grid Based Method and the GMF, which is the topic of the next section [6, 7].
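As a concrete illustration, the state and measurement models above can be simulated directly. The following sketch is illustrative only: the constant-velocity transition f and the single-base-station range measurement h are assumed example models, not models taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Hypothetical state model: 1D constant-velocity motion,
    # state x = [position, velocity], time step 1 s.
    F = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    return F @ x

def h(x):
    # Hypothetical measurement model: range to a base station at position 0.
    return np.array([abs(x[0])])

# Simulate x_{k+1} = f(x_k) + w_k and y_k = h(x_k) + v_k for a few steps.
x = np.array([10.0, 1.0])                 # initial state x_0
for k in range(3):
    w = rng.normal(0.0, 0.1, size=2)      # process noise w_k
    x = f(x) + w                          # next state
    v = rng.normal(0.0, 0.5, size=1)      # measurement noise v_k
    y = h(x) + v                          # measurement
    print(f"k={k + 1}: state={x}, range measurement={y}")
```

The filtering problem is the inverse of this simulation: given only the measurements y, recover the cpdf of the state x.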
3 Gaussian Mixture Filter

3.1 Basics of GMF

The idea of the GMF [5] is that both the prior density p(x_k | y_{1:k-1}) and the posterior density p(x_k | y_{1:k}) are Gaussian mixtures

    p(x) = Σ_{i=1}^{p} α_i N(x; μ_i, Σ_i),                                    (1)

where N(x; μ_i, Σ_i) is the normal density function with mean μ_i and covariance matrix Σ_i, and the weights satisfy α_i ≥ 0 and Σ_{i=1}^{p} α_i = 1. The mean and covariance of the Gaussian mixture (1) are

    μ = Σ_{i=1}^{p} α_i μ_i   and   Σ = Σ_{i=1}^{p} α_i (Σ_i + (μ_i − μ)(μ_i − μ)^T).

We assume that the prior p(x) is of the form (1) and that the likelihood is

    p(y | x) = Σ_{j=1}^{m} β_j N(y; H_j x, R_j).

The posterior, based on Bayes' rule, is then

    p(x | y) = p(y | x) p(x) / p(y)
             = [Σ_{j=1}^{m} Σ_{i=1}^{p} α_i β_j N(y; H_j μ_i, P_{i,j}) N(x; x̂_{i,j}, P̂_{i,j})]
               / [Σ_{j=1}^{m} Σ_{i=1}^{p} α_i β_j N(y; H_j μ_i, P_{i,j})],    (2)

where P_{i,j} = H_j Σ_i H_j^T + R_j, x̂_{i,j} = μ_i + Σ_i H_j^T P_{i,j}^{-1} (y − H_j μ_i) and P̂_{i,j} = (I − Σ_i H_j^T P_{i,j}^{-1} H_j) Σ_i. We see that the posterior is also a Gaussian mixture.

3.2 Where do mixtures come from?

Here are some situations where and why the GMF may be preferred over conventional nonlinear Kalman filter extensions, which can be considered special (i.e. one-component) cases of the GMF.

Models  Of course, if our initial state or error models are Gaussian mixtures, then the GMF is an obvious solution. In hybrid navigation, for example, we can create more realistic error models using a Gaussian mixture than using only one Gaussian.
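To make the measurement update (2) concrete, here is a minimal Python sketch of it. The function names and the small numerical example are illustrative assumptions, not code from this paper; each prior component i is paired with each likelihood component j to produce one posterior component.

```python
import numpy as np

def norm_pdf(y, mu, cov):
    """Multivariate normal density N(y; mu, cov)."""
    d = np.atleast_1d(y) - np.atleast_1d(mu)
    cov = np.atleast_2d(cov)
    quad = d @ np.linalg.solve(cov, d)
    return float(np.exp(-0.5 * quad) / np.sqrt(np.linalg.det(2.0 * np.pi * cov)))

def gmf_update(alphas, mus, Sigmas, betas, Hs, Rs, y):
    """GMF measurement update following (2)."""
    weights, means, covs = [], [], []
    for alpha, mu, Sigma in zip(alphas, mus, Sigmas):
        for beta, H, R in zip(betas, Hs, Rs):
            P = H @ Sigma @ H.T + R                 # P_ij = H_j Σ_i H_j^T + R_j
            K = Sigma @ H.T @ np.linalg.inv(P)      # gain Σ_i H_j^T P_ij^{-1}
            weights.append(alpha * beta * norm_pdf(y, H @ mu, P))
            means.append(mu + K @ (y - H @ mu))     # x̂_ij
            covs.append((np.eye(len(mu)) - K @ H) @ Sigma)  # P̂_ij
    weights = np.array(weights)
    return weights / weights.sum(), means, covs     # normalization plays the role of p(y)

# Illustrative 1D example: two-component prior, one-component likelihood.
alphas = [0.5, 0.5]
mus = [np.array([0.0]), np.array([4.0])]
Sigmas = [np.eye(1), np.eye(1)]
w, means, covs = gmf_update(alphas, mus, Sigmas,
                            betas=[1.0], Hs=[np.eye(1)], Rs=[np.eye(1)],
                            y=np.array([4.0]))
print(w)  # the component near the measurement gets most of the weight
```

Note that the number of posterior components is the product p·m, which is why the component reduction of Section 3.3 is needed.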
Approximation  Even if our models are not Gaussian mixtures, it is possible to approximate our density functions as Gaussian mixtures, because it has been shown that any probability density can be approximated as closely as desired by a Gaussian mixture. In hybrid navigation, for example, if we can compute the likelihood peaks z_j, then we can approximate the likelihood as a Gaussian mixture

    p(y | x) ≈ Σ_{j=1}^{m} exp(−(1/2) ‖y − h(z_j) − h′(z_j)(x − z_j)‖²_{R^{−1}}) / √(det(2πR)).

Robustifying  Sometimes filters do not work correctly, usually because of approximation errors or modeling errors. One way to detect that something is wrong is to check the normalization factor p(y). If p(y) is smaller than a threshold value, then either the prior or the measurement is wrong with some risk level. In that case

    γ p(x) + (1 − γ) Σ_{j=1}^{m} α_j N(x; z_j, H^{−1} R H^{−T}),

where γ ∈ [0, 1] and H = h′(z_j), may be a more reasonable posterior than (2).

3.3 Component reduction

One major challenge in using the GMF efficiently is keeping the number of components as small as possible without losing significant information. There are many ways to do so. We use three different types of mixture reduction algorithms: forgetting, merging and resampling.

Forgetting  We give zero weight to mixture components whose weights are lower than some threshold value, for example

    min(0.001, 0.01 max_i(α_i)).

After that, we normalize the weights of the remaining mixture components.

Merging  We merge two mixture components into one if the distance between the components is lower than some threshold value. The distance is for example

    d_{ij} = [α_i α_j / (α_i + α_j)] (μ_i − μ_j)^T Σ^{−1} (μ_i − μ_j).

We merge components so that merging preserves the overall mean and covariance. This method, collapsing by moments, is optimal in the sense of the Kullback-Leibler distance.

Resampling  If, after forgetting and merging, we still have too many mixture components, we can use a resampling algorithm to choose which mixture components to keep. Finally, we normalize the weights of these mixture components.
This approach induces less approximation error, in the L_1-norm sense, than merging two distant components.
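The forgetting and merging steps above can be sketched in a few lines. The helper names are illustrative assumptions; the merge follows the collapsing-by-moments rule described above, preserving the pair's overall mean and covariance.

```python
import numpy as np

def forget(alphas, mus, Sigmas):
    """Forgetting: drop components below min(0.001, 0.01 * max weight),
    then renormalize the remaining weights."""
    alphas = np.asarray(alphas, dtype=float)
    threshold = min(0.001, 0.01 * alphas.max())
    keep = alphas >= threshold
    kept = alphas[keep]
    return (kept / kept.sum(),
            [mu for mu, k in zip(mus, keep) if k],
            [S for S, k in zip(Sigmas, keep) if k])

def merge_pair(a1, mu1, S1, a2, mu2, S2):
    """Merging: collapse two components by moments, so that the pair's
    overall mean and covariance are preserved."""
    a = a1 + a2
    w1, w2 = a1 / a, a2 / a
    mu = w1 * mu1 + w2 * mu2
    d1, d2 = mu1 - mu, mu2 - mu
    S = w1 * (S1 + np.outer(d1, d1)) + w2 * (S2 + np.outer(d2, d2))
    return a, mu, S

# Illustrative use: forget a negligible component, then merge the survivors.
w, mus, Ss = forget([0.6, 0.3999, 0.0001],
                    [np.array([0.0]), np.array([2.0]), np.array([50.0])],
                    [np.eye(1), np.eye(1), np.eye(1)])
a, mu, S = merge_pair(w[0], mus[0], Ss[0], w[1], mus[1], Ss[1])
print(a, mu, S)
```

A full reduction step would compute the distance d_{ij} for every pair and merge only pairs below a chosen threshold; the sketch shows a single merge.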
4 Simulations

Figure 1 gives an example where the GMF, which approximates the likelihood as a Gaussian mixture, gives better results than the EKF. The measurements are range measurements from two base stations. More results and specific parameters will be published in [1].

[Figure 1: Example of an inconsistency problem of the EKF and how the GMF solves the problem. The plot shows the true route from the start point, the EKF estimate, the GMF mean and the GMF components; scale bar 100 m.]

References

[1] S. Ali-Löytty and N. Sirola. Gaussian mixture filter in hybrid navigation. In Proceedings of The European Navigation Conference GNSS 2007 (to be published).

[2] S. Ali-Löytty. Hybrid positioning algorithms. In P. Koivisto, editor, Digest of TISE Seminar 2006, volume 5, pages 43-46. TISE, 2006.

[3] S. Ali-Löytty and N. Sirola. A modified Kalman filter for hybrid positioning. In Proceedings of ION GNSS 2006, September 2006.
[4] S. Ali-Löytty, N. Sirola, and R. Piché. Consistency of three Kalman filter extensions in hybrid navigation. In Proceedings of The European Navigation Conference GNSS 2005, Munich, Germany, July 2005.

[5] D. L. Alspach and H. W. Sorenson. Nonlinear Bayesian estimation using Gaussian sum approximations. IEEE Transactions on Automatic Control, 17(4):439-448, August 1972.

[6] N. Sirola and S. Ali-Löytty. Moving grid filter in hybrid local positioning. In Proceedings of the ENC 2006, Manchester, May 7-10, 2006.

[7] N. Sirola, S. Ali-Löytty, and R. Piché. Benchmarking nonlinear filters. In Nonlinear Statistical Signal Processing Workshop NSSPW06, Cambridge, September 2006.