Vision-Aided Navigation Based on Three-View Geometry

Vision-Aided Navigation Based on Three-View Geometry. Vadim Indelman, Pini Gurfil (Distributed Space Systems Lab, Aerospace Engineering, Technion); Ehud Rivlin (Computer Science, Technion); Hector Rotstein (RAFAEL Advanced Defense Systems). February

Contents: Introduction; Three-view geometry constraints; Fusion with the navigation system; Results; Summary

Introduction. A pure inertial navigation solution diverges over time due to error integration. The inertial solution may be compensated, to eliminate or at least reduce errors, using e.g. GPS, a DTM, or vision-based methods (VBM); however, GPS may be or become unavailable or unreliable. Objective: position estimation assuming only an INS and a single onboard camera, recovering the scale ambiguity, real-time navigation aiding, and handling of loop scenarios.

Previous Work. SLAM: "6DoF SLAM aided GNSS/INS Navigation in GNSS Denied and Unknown Environments", Kim J. and Sukkarieh S., 2005. Multiple views + bundle adjustment: "A Dual-Layer Estimator Architecture for Long-term Localization", Mourikis A.I. and Roumeliotis S.I., 2008. Trifocal tensor + motion estimation: "Recursive Camera-Motion Estimation With the Trifocal Tensor", Yu Y.K. et al., 2006. Enhanced two-view + constant-size state vector: "Real-Time Mosaic-Aided Aerial Navigation", Indelman V. et al., 2009.

Introduction (cont.) Approach: decouple navigation aiding from environment-representation construction (e.g. mosaicking), in contrast to SLAM; maintain a constant-size state vector; store and manage the captured images (and some navigation data); rely on three-view geometry and avoid structure reconstruction. Assumptions: an INS and an onboard camera.

Three-View Geometry. A ground landmark p is observed at three time instants t_1, t_2, t_3. Notation: q_i is the line of sight (LOS) from the camera position at t_i to p; λ_i is the range to p along q_i; T_{ij} is the translation from the camera position at t_i to the camera position at t_j.

Three-View Geometry (cont.) Coordinate systems: L, the Local Level Local North (LLLN) system; C, the camera system. The position r of the ground landmark p relative to the camera position at t_1 can be written via each of the three views:

$$r = \lambda_1 q_1, \qquad r = T_{12} + \lambda_2 q_2, \qquad r = T_{12} + T_{23} + \lambda_3 q_3$$

Expressing all vectors in the LLLN system of $t_2$, with $C_{C_i}^{L_2}$ the DCM from the camera system at $t_i$ to the LLLN system at $t_2$:

$$\lambda_1 C_{C_1}^{L_2} q_1^{C_1} = T_{12}^{L_2} + \lambda_2 C_{C_2}^{L_2} q_2^{C_2}$$
$$\lambda_1 C_{C_1}^{L_2} q_1^{C_1} = T_{12}^{L_2} + T_{23}^{L_2} + \lambda_3 C_{C_3}^{L_2} q_3^{C_3}$$

Three-View Geometry (cont.) Matrix formulation. Stacking the two vector equations, with all LOS and translation vectors expressed in the LLLN system of $t_2$, gives $A\,[\lambda_1\ \lambda_2\ \lambda_3\ 1]^T = 0$ with

$$A = \begin{bmatrix} q_1 & -q_2 & 0 & -T_{12} \\ 0 & q_2 & -q_3 & -T_{23} \end{bmatrix} \in \mathbb{R}^{6\times 4}$$

Note: the range parameters $\lambda_i$ are unknown. Since a nonzero null vector exists, $\operatorname{rank}(A) < 4$; equivalently, $\det(\tilde A) = 0$, where $\tilde A$ denotes a matrix comprised of any 4 rows of $A$.
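The rank condition above can be checked numerically. The sketch below builds $A$ for synthetic geometry; the camera positions and landmark coordinates are illustrative assumptions, not data from the talk.

```python
import numpy as np

# Illustrative camera positions and one ground landmark (assumed values).
p1, p2, p3 = np.array([0., 0., 0.]), np.array([5., 1., 0.]), np.array([9., -2., 1.])
landmark = np.array([20., 10., -30.])

# Unit lines of sight from each camera position and the two translations.
q1, q2, q3 = [(landmark - p) / np.linalg.norm(landmark - p) for p in (p1, p2, p3)]
T12, T23 = p2 - p1, p3 - p2

# 6x4 matrix A; the vector [lambda1, lambda2, lambda3, 1]^T lies in its null space.
z = np.zeros(3)
A = np.vstack([np.column_stack([q1, -q2, z, -T12]),
               np.column_stack([z, q2, -q3, -T23])])

print(np.linalg.matrix_rank(A))  # 3, i.e. rank(A) < 4
```

The null vector exists exactly because the same landmark is seen in all three views, which is what the determinant constraints exploit.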

Three-View Geometry (cont.) The following constraints were obtained for a single ground landmark:

$$q_1^T (T_{12} \times q_2) = 0$$
$$q_2^T (T_{23} \times q_3) = 0$$
$$(q_2 \times q_1)^T (q_3 \times T_{23}) = (q_1 \times T_{12})^T (q_3 \times q_2)$$

The first two equations are epipolar constraints; the third relates the magnitudes of $T_{12}$ and $T_{23}$. The constraints can be rewritten in a form linear in the translations,

$$b^T T_{12} = c^T T_{23}, \qquad d^T T_{12} = 0, \qquad m^T T_{23} = 0$$

where $b = b(q_1, q_2, q_3)$, $c = c(q_1, q_2, q_3)$, $d = d(q_1, q_2)$ and $m = m(q_2, q_3)$.
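A quick numerical sanity check: for error-free geometry all three constraints vanish. The positions and landmark below are assumed illustrative values, reused from the same synthetic setup.

```python
import numpy as np

# Illustrative synthetic geometry (assumed values, not data from the talk).
p1, p2, p3 = np.array([0., 0., 0.]), np.array([5., 1., 0.]), np.array([9., -2., 1.])
landmark = np.array([20., 10., -30.])
q1, q2, q3 = [(landmark - p) / np.linalg.norm(landmark - p) for p in (p1, p2, p3)]
T12, T23 = p2 - p1, p3 - p2

f1 = q1 @ np.cross(T12, q2)                        # epipolar constraint, views 1-2
f2 = q2 @ np.cross(T23, q3)                        # epipolar constraint, views 2-3
f3 = (np.cross(q2, q1) @ np.cross(q3, T23)
      - np.cross(q1, T12) @ np.cross(q3, q2))      # scale-linking constraint
print(abs(f1) < 1e-12, abs(f2) < 1e-12, abs(f3) < 1e-12)  # True True True
```

With noisy LOS vectors or erroneous positions the residuals become nonzero, which is what the filter update exploits.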

Three-View Geometry (cont.) Multiple features: $N_{12}$ matching pairs between the 1st and 2nd views, $\{q_1^{C_1}, q_2^{C_2}\}_{i=1}^{N_{12}}$; $N_{23}$ matching pairs between the 2nd and 3rd views, $\{q_2^{C_2}, q_3^{C_3}\}_{i=1}^{N_{23}}$; and $N_{123}$ matching triplets between the three views, $\{q_1^{C_1}, q_2^{C_2}, q_3^{C_3}\}_{i=1}^{N_{123}}$. The constraints

$$b_i^T T_{12} - c_i^T T_{23} = 0, \quad i = 1, \ldots, N_{123}$$
$$d_j^T T_{12} = 0, \quad j = 1, \ldots, N_{12}$$
$$m_r^T T_{23} = 0, \quad r = 1, \ldots, N_{23}$$

are stacked into matrices $B$, $C$, $D$ and $M$, giving $N = N_{12} + N_{23} + N_{123}$ scalar equations.
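The stacking can be sketched as follows. The explicit expressions for $b_i$ and $c_i$ below are one consistent way to factor the third constraint into translation-linear form (an assumption; the talk only names the vectors), and the geometry values are illustrative.

```python
import numpy as np

def three_view_residual(trip, pairs12, pairs23, T12, T23):
    """Stack the constraints for N123 triplets and N12/N23 pairs into one
    residual vector z (zero for error-free navigation and LOS data)."""
    z = []
    for q1, q2, q3 in trip:                      # b_i^T T12 - c_i^T T23 = 0
        b = np.cross(np.cross(q3, q2), q1)
        c = np.cross(np.cross(q2, q1), q3)
        z.append(b @ T12 - c @ T23)
    for q1, q2 in pairs12:                       # d_j^T T12 = 0
        z.append(np.cross(q2, q1) @ T12)
    for q2, q3 in pairs23:                       # m_r^T T23 = 0
        z.append(np.cross(q3, q2) @ T23)
    return np.array(z)

# Demo on synthetic geometry (illustrative values, not from the talk):
p1, p2, p3 = np.array([0., 0., 0.]), np.array([5., 1., 0.]), np.array([9., -2., 1.])
T12, T23 = p2 - p1, p3 - p2

def los(p, X):
    return (X - p) / np.linalg.norm(X - p)

Xa, Xb = np.array([20., 10., -30.]), np.array([-5., 25., -40.])
trip = [(los(p1, Xa), los(p2, Xa), los(p3, Xa))]     # one triplet feature
pairs12 = [(los(p1, Xb), los(p2, Xb))]               # one 1-2 pair feature
pairs23 = [(los(p2, Xb), los(p3, Xb))]               # one 2-3 pair feature
print(np.allclose(three_view_residual(trip, pairs12, pairs23, T12, T23), 0))  # True
```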

Fusion with Navigation. An Implicit Extended Kalman Filter (IEKF) formulation is used. Recall that stacking the constraints yields the residual

$$z = \begin{bmatrix} B T_{12} - C T_{23} \\ D T_{12} \\ M T_{23} \end{bmatrix} \in \mathbb{R}^{N}$$

All original LOS vectors are expressed in the camera system of the appropriate view. The translations $T_{12}, T_{23}$ are functions of $\mathrm{Pos}(t_1), \mathrm{Pos}(t_2), \mathrm{Pos}(t_3)$, where $t_3$ is the current time, so that

$$z = h\left(\mathrm{Pos}(t_1), \Psi(t_1), \mathrm{Pos}(t_2), \Psi(t_2), \mathrm{Pos}(t_3), \Psi(t_3), \{q_1^{C_1}, q_2^{C_2}, q_3^{C_3}\}_i\right)$$

Fusion with Navigation (cont.) State vector definition:

$$X = \begin{bmatrix} \Delta P^T & \Delta V^T & \Delta\Psi^T & d^T & b^T \end{bmatrix}^T \in \mathbb{R}^{15}$$

comprising position, velocity and attitude errors, the gyro drift $d$ and the accelerometer bias $b$. Continuous system matrix:

$$\Phi_c = \begin{bmatrix} 0 & I & 0 & 0 & 0 \\ 0 & 0 & -C_B^L A_s & 0 & C_B^L \\ 0 & 0 & 0 & -C_B^L & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \in \mathbb{R}^{15\times 15}$$

where $A_s$ is a skew matrix constructed from the accelerometer readings and $C_B^L$ is the DCM from the Body to the Local Level Local North system.
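A minimal sketch of this error-model matrix, assuming a common simplified INS convention (Earth rate and gravity-gradient terms neglected; signs may differ from the talk's exact derivation):

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0., -v[2], v[1]],
                     [v[2], 0., -v[0]],
                     [-v[1], v[0], 0.]])

def phi_c(f_b, C_BL):
    """Continuous 15x15 INS error-model matrix for X = [dP dV dPsi d b]."""
    Z, I = np.zeros((3, 3)), np.eye(3)
    A_s = skew(f_b)                      # skew matrix of accelerometer readings
    return np.block([
        [Z, I, Z,           Z, Z],       # position error integrates velocity error
        [Z, Z, -C_BL @ A_s, Z, C_BL],    # velocity error: attitude coupling + bias
        [Z, Z, Z,       -C_BL, Z],       # attitude error driven by gyro drift
        [Z, Z, Z,           Z, Z],       # drift modeled as a random constant
        [Z, Z, Z,           Z, Z],       # bias modeled as a random constant
    ])

F = phi_c(np.array([0.1, 0.0, -9.8]), np.eye(3))  # assumed specific force, level body
print(F.shape)  # (15, 15)
```

Note the chain structure: drift feeds attitude error, which (with the bias) feeds velocity error, which feeds position error; this is why pure inertial errors grow with time.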

Fusion with Navigation (cont.) Standard equations of the IEKF update step:

$$K_{k+1} = P_{k+1|k} L_{k+1}^T \left[ L_{k+1} P_{k+1|k} L_{k+1}^T + D_{k+1} R_{k+1} D_{k+1}^T \right]^{-1}$$
$$\hat X_{k+1|k+1} = \hat X_{k+1|k} + K_{k+1} \left[ z_{k+1} - L_{k+1} \hat X_{k+1|k} \right]$$
$$P_{k+1|k+1} = \left[ I - K_{k+1} L_{k+1} \right] P_{k+1|k} \left[ I - K_{k+1} L_{k+1} \right]^T + K_{k+1} D_{k+1} R_{k+1} D_{k+1}^T K_{k+1}^T$$

This yields a simplified vision-aided navigation scheme.
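The update step can be sketched directly from these equations. The example values at the bottom are assumptions for illustration only; `L` is the residual Jacobian with respect to the state and `D` the Jacobian with respect to the measurement noise.

```python
import numpy as np

def iekf_update(x, P, z, L, D, R):
    """One IEKF update step (Joseph-form covariance update)."""
    S = L @ P @ L.T + D @ R @ D.T          # innovation covariance
    K = P @ L.T @ np.linalg.inv(S)         # gain
    x_new = x + K @ (z - L @ x)            # state correction from the residual
    IKL = np.eye(len(x)) - K @ L
    P_new = IKL @ P @ IKL.T + K @ D @ R @ D.T @ K.T
    return x_new, P_new

# Tiny 2-state illustration (assumed values, not from the experiment):
x, P = np.zeros(2), np.eye(2)
z, L, D, R = np.array([1., 0.]), np.eye(2), np.eye(2), np.eye(2)
x1, P1 = iekf_update(x, P, z, L, D, R)
print(x1, np.diag(P1))  # x1 = [0.5, 0], diag(P1) = [0.5, 0.5]
```

The Joseph form keeps the updated covariance symmetric and positive semi-definite even with an imperfect gain, which matters in long navigation runs.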

Experiment Results. Experiment setup: an IMU and a camera were mounted on top of a ground vehicle (IMU/INS: Xsens MTi-G; camera: Axis 207MW). IMU data and captured images were stored and synchronized, with imagery data at 5 Hz. The method was applied in two scenarios: sequential updates and loop updates.

Experiment Results (cont.) [Figures: the true trajectory as North/East/Height components vs. time, a 3D view (North/East/Alt), and samples of the recorded imagery.]

Experiment Results (cont.) Implementation matching details (simplified): SIFT features are extracted from each image; features are matched based on their descriptor vectors; false matches are rejected using RANSAC over the fundamental-matrix model (the fundamental matrix itself is not required elsewhere). Chaining the matches between images 1-2 and 2-3 yields the pair sets $\{q_1^{C_1}, q_2^{C_2}\}_{i=1}^{N_{12}}$ and $\{q_2^{C_2}, q_3^{C_3}\}_{i=1}^{N_{23}}$ and the triplet set $\{q_1^{C_1}, q_2^{C_2}, q_3^{C_3}\}_{i=1}^{N_{123}}$.
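The pair-to-triplet chaining step can be sketched with plain NumPy. This is a toy stand-in: in the talk's pipeline the descriptors would be SIFT vectors and a RANSAC fundamental-matrix stage would follow; the one-hot "descriptors" in the demo are an assumption for illustration.

```python
import numpy as np

def match(desc_a, desc_b, ratio=0.8):
    """Nearest-neighbour descriptor matching with a ratio test.
    Returns {index_in_a: index_in_b}."""
    out = {}
    for i, d in enumerate(desc_a):
        dist = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dist)
        if len(dist) > 1 and dist[order[0]] < ratio * dist[order[1]]:
            out[i] = int(order[0])
    return out

def chain_triplets(desc1, desc2, desc3):
    """Chain 1->2 and 2->3 matches into three-view triplets (i, j, k)."""
    m12, m23 = match(desc1, desc2), match(desc2, desc3)
    return [(i, j, m23[j]) for i, j in m12.items() if j in m23]

# Toy demo: three features whose descriptors are permuted across the views.
base = np.eye(3)
print(chain_triplets(base, base[[1, 2, 0]], base[[2, 0, 1]]))
# [(0, 2, 1), (1, 0, 2), (2, 1, 0)]
```

Features matched only across 1-2 or only across 2-3 still contribute through the pair constraints, so nothing found by `match` is wasted.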

Experiment Results (cont.) Sequential updates and loop updates. [Figures: position in NED (estimated vs. inertial vs. true) and position error with the square root of the filter covariance (Nav Err, Sqrt Cov), per axis, vs. time.]

Potential Applications. Navigation aiding in loop scenarios; additional scenarios with three overlapping images, such as satellite navigation and formation-flying navigation. The developed constraints may also be used in a SLAM framework.

Summary. A new method for vision-aided navigation based on three-view geometry was presented. The method does not require any a-priori information or external sensors apart from the onboard camera. The navigation data associated with each of the three images allows the scale ambiguity to be resolved. The method reduces position errors in all axes to the error levels present when the first two images were captured. Computational requirements for the vision-aided navigation phase are reduced; environment-representation construction (e.g. mosaicking) may be executed in a background process. Various potential applications exist.