Particle Filter Data Fusion Enhancements for MEMS-IMU/GPS

Intelligent Information Management, 47-4. doi: .436/iim..75. Published online July (http://www.scirp.org/journal/iim)

Yafei Ren, Xizhen Ke
Faculty of Automation and Information Engineering, Xi'an University of Technology, Xi'an, China
E-mail: doctor_liu@6.com
Received April 4, revised May 8, accepted June.

Abstract

This research aims at enhancing the accuracy of navigation systems by integrating GPS and Micro-Electro-Mechanical-System (MEMS) based inertial measurement units (IMU). Because of the many restrictions that its assumptions place on the empirical data, a conventional Extended Kalman Filter (EKF) is of limited use in MEMS-IMU/GPS integrated navigation. In response to the non-linear, non-Gaussian dynamic models of the inertial sensors, the method used here relies on a particle-cloud representation of the filtering distribution which evolves through time using importance sampling and resampling. Particle Filtering (PF) can then fuse the inertial information with real-time GPS position and velocity updates accurately. The experiments show that the PF, as opposed to the EKF, is more effective in raising the data integration accuracy of a MEMS-IMU/GPS navigation system.

Keywords: Micro-Electro-Mechanical-System, Particle Filter, Data Fusion, Extended Kalman Filtering

1. Introduction

Inertial sensors are widely used in navigation systems [1]. Compared to GPS tracking, inertial tracking offers attractive complementary features. Given perfect sensors, no external information other than an initial pose estimate is required. Lang et al. [1,2] show that inertial sensors can provide a good signal-to-noise ratio, especially in cases of rapid directional change (acceleration/deceleration) and at high rotational speeds. However, since inertial sensors only measure rates of change or accelerations, their output signals have to be integrated to obtain position and orientation data. As a result, a longer integration time produces significant accumulated drift because of noise and bias.

In MEMS-IMU/GPS integration there are nonlinear models that must be handled properly, for example: 1) nonlinear state equations describing the MEMS-IMU error propagation; 2) nonlinear measurement equations relating the state to the pseudo-ranges, carrier phases and Doppler shifts measured in the GPS receiver [3]. In recent years, to overcome these nonlinearity problems, other nonlinear filters have been considered for MEMS-IMU/GPS integration, for example: 1) the Particle Filter (PF); 2) the Unscented Kalman Filter (UKF); 3) the SIR Particle Filter [4,5]. It is reported [5,6] that integrated systems with these nonlinear filters show similar performance, producing almost the same accuracy in horizontal position and velocity, while the accuracy of the heading angle can be improved. This paper combines GPS and inertial sensors with a two-channel complementary PF, which can take advantage of the low-frequency stability of GPS and the high-frequency tracking ability of gyro sensors.

2. System Overview

Hybrid solutions attempt to overcome the drawbacks of any single sensing technology by combining the measurements of at least two tracking methods. The fusion of complementary sensors should be used to build better tracking systems. Synergies can be exploited to gain robustness, tracking speed and accuracy, and to reduce jitter and noise. Nowadays, a hybrid tracking system is the best way to achieve better object pose estimation and is widely applied in recent research work.
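The complementary roles of the two sensors can be illustrated with a small simulation, which is not from the paper; the sampling rate, bias and noise levels below are assumed values. Double-integrating an accelerometer signal with a small constant bias shows how quickly a stand-alone inertial solution drifts, which is exactly the error the GPS channel is meant to bound.

import numpy as np

# Illustrative strap-down drift: double-integrate a biased, noisy accelerometer.
T = 0.01                                   # sampling period [s], assumed 100 Hz IMU
t = np.arange(0.0, 60.0, T)                # one minute of data
true_acc = np.zeros_like(t)                # the sensor is actually at rest
bias = 0.02                                # constant accelerometer bias [m/s^2] (~2 mg)
noise = 0.05 * np.random.randn(t.size)     # white measurement noise [m/s^2]

meas_acc = true_acc + bias + noise
vel = np.cumsum(meas_acc) * T              # first integration: velocity error
pos = np.cumsum(vel) * T                   # second integration: position error

print(f"position error after 60 s: {pos[-1]:.1f} m")   # roughly 0.5*bias*t^2, about 36 m

Even a bias of a few milli-g grows into tens of metres of position error within a minute, whereas GPS errors stay bounded; this is the rationale for the two-channel fusion described below.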
In this work, a Particle Filter (PF) is used to estimate motion by fusing inertial and GPS data. In most hybrid approaches, the Extended Kalman Filter (EKF) is used instead for data fusion and error compensation, estimating the object state from the inertial and GPS measurements. However, the EKF provides only an approximation to optimal nonlinear estimation.

This approximation can introduce large errors in the true posterior mean and covariance of the transformed Gaussian random variables, which may lead to suboptimal performance and sometimes to divergence of the filter. Our system consists of GPS providing a robust pose and an Inertial Measurement Unit (IMU) providing 3-D linear acceleration and 3-D rate of turn (rate gyro) [7]. Moreover, the Particle Filtering algorithm is proposed and assessed to model the variations of the MEMS sensors' performance characteristics. Initial results show the efficiency and precision of the proposed PF modeling algorithm.

3. Three-Dimensional Error Estimation

3.1. Motion Model and System Dynamics

Any navigation tracking approach requires some kind of motion model, even if it is constant motion. Since we have no a priori knowledge about the forces changing the motion of the GPS system or the objects, we assume no forces (accelerations) and hence constant velocities. Augmented reality (AR) systems [] have the goal of enhancing a person's perception of the surrounding world. We use the motion model proposed in Ref. [7], where the object's motion is represented by a 15-element state vector:

    X_gps = (θ, ω, x, ẋ, ẍ)    (1)

where θ is the orientation of the GPS with respect to the world (we use Z-Y-X Euler angles), ω is the angular velocity, and x, ẋ, ẍ are the position, velocity and acceleration of the GPS with respect to the world. With these states, the discretized system dynamics are given by:

    θ_{k+1} = θ_k + T·W(θ_k)·ω_k + w_1
    ω_{k+1} = ω_k + w_2
    x_{k+1} = x_k + T·ẋ_k + (T²/2)·ẍ_k + w_3    (2)
    ẋ_{k+1} = ẋ_k + T·ẍ_k + w_4
    ẍ_{k+1} = ẍ_k + w_5

where T is the sampling period, W(θ) is the Jacobian matrix relating the absolute rotation angles to the angular rates, and w_i is the system random noise. At each time step, the 3-D GPS pose is given by the values of the x and θ parameters.

3.2. Inertial Sensor Measurements

Our inertial sensor consists of three orthogonal rate gyroscopes, which sense angular rates of rotation about three perpendicular axes, and three accelerometers, which produce three acceleration measurements. GPS orientation changes are reported by the inertial sensor, so the transformation between the {G} and {I} frames is needed to relate inertial and GPS motion. The rotational relationship between the two coordinate frames is:

    ω_G = R_I^G·ω_I    (3)

where ω_G and ω_I denote the angular velocity relative to the GPS coordinate frame and the inertial coordinate frame, respectively. In order to determine the transformation matrix R_I^G we have developed an efficient calibration method, which is described in more detail in Ref. [8].

The accelerometers produce three acceleration measurements, one for each axis (units: mm/s²), expressed in their own coordinate frame {I}. However, the acceleration term in our state vector is the linear acceleration from {C} to {W}. The function relating the state vector X_gps to the linear acceleration measurements is:

    a_I = R_W^I·( d²/dt²( x + R_C^W(θ)·P_I^C ) − g )    (4)

where R_W^I is the rotation from {W} to {I}, P_I^C is the position of the origin of the {I} frame with respect to the {C} frame, and g is gravity. R_I^C and P_I^C are determined by the calibration procedure.

3.3. Fusion Filter

Figure 1. System dataflow.

The goal of the fusion filter is to estimate the object pose parameters of (1) from the GPS and inertial sensor measurements [9]. The basis of our fusion algorithm is a SIR particle filter [8,]. In this section we explain how to use such a filter to estimate the object pose. For more details on particle filter theory, see Refs. [10-13].
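As an illustration of the constant-acceleration dynamics in (2), one prediction step can be written as the following routine. This is a sketch under our own naming conventions, not code from the paper; W_of_theta is a placeholder for the Euler-angle Jacobian W(θ), and the state is stored as the 15-element vector of (1).

import numpy as np

def propagate_state(X, T, W_of_theta, w=None):
    """One step of the discretized dynamics (2).

    X          : 15-vector [theta(3), omega(3), x(3), xdot(3), xddot(3)]
    T          : sampling period
    W_of_theta : function returning the 3x3 Jacobian W(theta)
    w          : optional 15-vector of process noise (w_1 ... w_5 stacked)
    """
    theta, omega, x, xdot, xddot = np.split(np.asarray(X, float), 5)
    w1, w2, w3, w4, w5 = np.split(np.zeros(15) if w is None else np.asarray(w, float), 5)

    theta_n = theta + T * W_of_theta(theta) @ omega + w1      # attitude
    omega_n = omega + w2                                      # angular rate (random walk)
    x_n     = x + T * xdot + 0.5 * T**2 * xddot + w3          # position
    xdot_n  = xdot + T * xddot + w4                           # velocity
    xddot_n = xddot + w5                                      # acceleration (random walk)
    return np.concatenate([theta_n, omega_n, x_n, xdot_n, xddot_n])

In the filter described next, each particle is propagated with its own draw of the process noise w, which is exactly the evolution step 3) of the SIR iteration.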
The motion tracking can be considered as a single-target, non-linear, discrete-time system whose process model is:

    X_k = f(X_{k-1}, w_{k-1})    (5)

where X_k denotes the state vector of the system at time k, defined by (1), and f(X_{k-1}, w_{k-1}) represents the deterministic process given by (2).

The measurement model is given by:

    y_k = h(X_k) + n_k    (6)

where n_k represents the measurement noise and the nonlinear function h relates the state vector X_k to the measurements y_k. In our design, the input measurements to the fusion filter come from three different sensors, i.e. GPS, gyroscope and accelerometers, each with its own output equation h and its own uncertainty in the output space.

3.3.1. For the Gyroscope

The gyroscope produces three angular velocity measurements, one for each axis (units: rad/s). This information is associated only with the angular velocity term in the state vector. The gyroscope measurement model is then:

    y_gyro = h_gyro(X_k) + n_gyro = H_gyro·X_k + n_gyro    (7)

where H_gyro relates the state vector to the measurement vector through an identity block:

    H_gyro = [ 0_{3×3}  I_{3×3}  0_{3×3}  0_{3×3}  0_{3×3} ]    (8)

where 0_{3×3} represents a 3 × 3 zero matrix and I_{3×3} a 3 × 3 identity matrix.

3.3.2. For the Accelerometers

The accelerometers produce three acceleration measurements, one for each axis (units: mm/s²). The accelerometer measurement model is:

    y_acc = h_acc(X_k) + n_acc    (9)

where h_acc is given by (4) and n_acc is the accelerometer measurement noise.

3.3.3. Particle Filtering

Having formulated the process and measurement models for both the inertial and the GPS sensors, we can now give one iteration of the SIR particle filter:

1) Hypotheses. Let N be the number of particles and p(X_0) = p(X_0 | y_0) be the prior distribution (at k = 0).

2) Initialization. For i = 1, ..., N: generate the particles X_0^i from the initial distribution p(X_0) and initialize the weights.

3) Evolution. Predict new particles X_k^i using different noise realizations and the process model:

    X_k^i = f(X_{k-1}^i, w_{k-1}),  i = 1, ..., N    (10)

4) Weighting. In our application, the measurement noises are considered Gaussian with zero mean and error covariance matrices R_gyro, R_acc and R_GPS, respectively. In this case, the weights ω_k^i are computed as:

    ω_k^i = exp( −(1/2)·‖y_k − h(X_k^i)‖²_{R⁻¹} ) / Σ_j exp( −(1/2)·‖y_k − h(X_k^j)‖²_{R⁻¹} ),  with ‖e‖²_{R⁻¹} = eᵀ·R⁻¹·e    (11)

where y_k is the observed measurement from the GPS or inertial sensors and h is the measurement model, corresponding to h_GPS, h_gyro or h_acc according to the availability of the measurement data.

5) Estimation. Compute the output of the SIR filter by:

    X̂_k = Σ_i ω_k^i·X_k^i    (12)

Increase k and iterate from step 3).

Since GPS data arrive at a slower rate than the inertial sensor data, the filter performs object pose estimation whenever gyroscope data, accelerometer data or GPS data become available. Thus, we implement a complementary filter as shown in Figure 2. There are two particle-weighting channels sharing a common estimation module: one for GPS measurements and the other for inertial sensor measurements. Independent channel processing handles incomplete measurements. For example, when no GPS measurement is available (e.g., due to occlusions), the overall system maintains object pose tracking using only the inertial weighting channel; vice versa, when a GPS measurement is available, only the GPS weighting channel is used to estimate the object pose, which overcomes the inertial sensor drift caused by long integration times.

Figure 2. Block diagram of the fusion filter.
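To make the iteration above concrete, the sketch below implements steps 3)-5) for one time step, with the Gaussian weighting of (11) and the weighted-mean estimate of (12). It is an illustrative outline, not the authors' code: propagate_state is the process-model routine sketched earlier, and the particle count, noise levels and function names are assumptions. A systematic resampling step is included because SIR filters resample to counter weight degeneracy.

import numpy as np

def pf_step(particles, weights, y, h, R_inv, T, W_of_theta, Q_std, rng):
    """One SIR update: predict, weight (11), estimate (12), resample.

    particles   : (N, 15) array of state hypotheses
    y, h, R_inv : measurement, measurement function, inverse noise covariance
    Q_std       : (15,) per-state process-noise standard deviations (assumed)
    """
    N = particles.shape[0]

    # 3) Evolution: propagate each particle with its own noise realization.
    noise = rng.normal(0.0, Q_std, size=(N, 15))
    particles = np.array([propagate_state(p, T, W_of_theta, w)
                          for p, w in zip(particles, noise)])

    # 4) Weighting: Gaussian likelihood of the innovation, then normalization.
    innov = np.array([y - h(p) for p in particles])
    loglik = -0.5 * np.einsum('ni,ij,nj->n', innov, R_inv, innov)
    w = weights * np.exp(loglik - loglik.max())        # subtract max for stability
    w /= w.sum()

    # 5) Estimation: weighted mean of the particle cloud.
    x_hat = w @ particles

    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(w ** 2) < N / 2:
        cs = np.cumsum(w)
        cs[-1] = 1.0
        idx = np.searchsorted(cs, (rng.random() + np.arange(N)) / N)
        particles, w = particles[idx], np.full(N, 1.0 / N)

    return particles, w, x_hat

In the two-channel arrangement of Figure 2, the same step is simply called with (h_gyro, R_gyro⁻¹), (h_acc, R_acc⁻¹) or (h_GPS, R_GPS⁻¹), depending on which measurement is available at time k.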

4. Experiments

This experiment evaluates the accuracy and the robustness of the proposed fusion method. We use the particle filter framework for the MEMS-IMU/GPS sensor fusion system. Particle filters are sequential Monte Carlo methods based upon a point-mass (or "particle") representation of probability densities; they can be applied to any state-space model and generalize the traditional Kalman filtering methods. We have tested our algorithm to evaluate its performance and have compared the results obtained by the particle filter with those given by a classical extended Kalman filter. The experimental data are presented in Figure 3.

First, the experiments were carried out in Matlab. The raw data of this pilot study are the inertial output of the inertial measurement unit together with the real-time GPS position and velocity, which are fed into the particle filter data fusion algorithm. The first result is the error analysis of the one-dimensional position, shown in Figure 3. Then the traditional Kalman filter, the extended Kalman filter, the unscented Kalman filter and the particle filter were each used in the data fusion experiment. The first three filter algorithms are not described here; the particle filter follows the formulation derived in the previous section. Experimental results are presented in Figure 4. Finally, a new technique augmenting the PF predictor with the traditional KF to improve the performance of the integrated MEMS-IMU/GPS system is presented. Initial test results show the significance of the proposed PF augmentation in reducing position and velocity drift during GPS outages.

Figure 3. Simulated data of the experiment (states x and observations y versus time).

Figure 4. Result of the fusion filter.

Table 1. The RMSE of the fusion filter.

            Origin     EKF        UKF        PF
    RMSE    3.56       .4749      .4443      .

5. Conclusions

In this paper we presented a hybrid AR approach which uses a particle filter to combine GPS and inertial technologies in order to improve the stability and the accuracy of the registration between the virtual and real-world objects when enhancing the user's perception. An overview of the developed navigation system was given, and experiments demonstrated the feasibility and reliability of the system under various situations. Furthermore, we implemented a SIR particle filter to fuse the inertial and GPS data and thus estimate the object poses, and we used RMSE analysis to describe the performance of the filter. The results have been very satisfactory compared to those of classical AR techniques; they show that the fusion method using the particle filter achieves high tracking accuracy, stability and robustness.

It is possible to apply more than three distributions to the PF, and there are more error states to be investigated in real situations. Therefore, the relation between the number of distributions and the filter performance will be investigated in future work.
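For reference, the RMSE values reported in Table 1 follow the usual root-mean-square error definition over the test epochs; the notation below is ours, not the paper's:

    RMSE = sqrt( (1/M)·Σ_{k=1..M} ‖x̂_k − x_k‖² )

where x̂_k is the fused position estimate, x_k the reference position at epoch k, and M the number of epochs.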

6. References

[1] P. Lang, A. Kusej, A. Pinz and G. Brasseur, "Inertial Tracking for Mobile Augmented Reality," IEEE Instrumentation and Measurement Technology Conference, Anchorage, AK, USA, pp. 583-587.

[2] E. Marchand, P. Bouthemy and F. Chaumette, "A 2D-3D Model-Based Approach to Real-Time Visual Tracking," Image and Vision Computing, Vol. 9, No. 3, pp. 94-955.

[3] M. S. Grewal, L. R. Weill and A. P. Andrews, Global Positioning Systems, Inertial Navigation, and Integration, 2nd Edition, John Wiley & Sons, Hoboken, 7.

[4] Y. Yi and D. A. Grejner-Brzezinska, "Nonlinear Bayesian Filter: Alternative to the Extended Kalman Filter in the GPS/INS Fusion Systems," Proceedings of the Institute of Navigation, ION GNSS 5, Long Beach, CA, 5, pp. 39-4.

[5] Y. Kubo, T. Sato and S. Sugimoto, "Modified Gaussian Sum Filtering Methods for INS/GPS Integration," Journal of Global Positioning Systems, Vol. 6, 7, pp. 65-73.

[6] M. Nishiyama, S. Fujioka, Y. Kubo, T. Sato and S. Sugimoto, "Performance Studies of Nonlinear Filtering Methods in INS/GPS In-Motion Alignment," Proceedings of the Institute of Navigation, ION GNSS 6, Fort Worth, TX, 6, pp. 733-74.

[7] L. Chai, W. Hoff and T. Vincent, "Three-Dimensional Motion and Structure Estimation Using Inertial Sensors and Computer Vision for Augmented Reality," Presence: Teleoperators and Virtual Environments, No. 5, pp. 474-49.

[8] M. Maidi, F. Ababsa and M. Mallem, "Vision-Inertial System Calibration for Tracking in Augmented Reality," Proceedings of the 2nd International Conference on Informatics in Control, Automation and Robotics, Barcelona, 5, pp. 56-6.

[9] S. You and U. Neumann, "Fusion of Vision and Gyro Tracking for Robust Augmented Reality Registration," Proceedings of the IEEE Conference on Virtual Reality, Washington, DC, pp. 7-77.

[10] A. Doucet, N. J. Gordon and V. Krishnamurthy, "Particle Filters for State Estimation of Jump Markov Linear Systems," IEEE Transactions on Signal Processing, Vol. 49, No. 3, pp. 63-64.

[11] N. Bergman and A. Doucet, "Markov Chain Monte Carlo Data Association for Target Tracking," Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Istanbul, pp. 75-78.

[12] N. J. Gordon, "A Hybrid Bootstrap Filter for Target Tracking in Clutter," IEEE Transactions on Aerospace and Electronic Systems, Vol. 33, 1997, pp. 353-358.

[13] A. Doucet, N. de Freitas and N. Gordon, "An Introduction to Sequential Monte Carlo Methods," in: A. Doucet, N. de Freitas and N. Gordon, Eds., Sequential Monte Carlo Methods in Practice, Springer, New York, pp. 3-4.