Vision-Aided Navigation Based on Three-View Geometry. Vadim Indelman, Pini Gurfil (Distributed Space Systems Lab, Aerospace Engineering, Technion); Ehud Rivlin (Computer Science, Technion); Hector Rotstein (RAFAEL Advanced Defense Systems). February
Contents: Introduction; Three-view geometry constraints; Fusion with the navigation system; Results; Summary
Introduction. A pure inertial navigation solution diverges over time due to error integration. The inertial solution may be corrected, to eliminate or at least reduce errors, using e.g.: GPS, DTM (digital terrain model), vision-based methods (VBM). GPS may become unavailable or unreliable. Objective: position estimation assuming only an INS and a single onboard camera; resolving the scale ambiguity; real-time navigation aiding; handling of loop scenarios.
Previous Work. SLAM: 6DoF SLAM aided GNSS/INS Navigation in GNSS Denied and Unknown Environments, Kim J. and Sukkarieh S., 2005. Multiple views + bundle adjustment: A Dual-Layer Estimator Architecture for Long-term Localization, Mourikis A.I. and Roumeliotis S.I., 2008. Trifocal tensor + motion estimation: Recursive Camera-Motion Estimation With the Trifocal Tensor, Yu Y.K., et al., 2006. Enhanced two-view + constant-size state vector: Real-Time Mosaic-Aided Aerial Navigation, Indelman V., et al., 2009.
Introduction (Cont.). Approach: decouple navigation aiding from environment-representation construction (e.g. mosaicking), in contrast to SLAM; constant-size state vector; store and manage captured images (and some navigation data); three-view geometry; avoid structure reconstruction. Assumptions: INS, camera.
Three-View Geometry. [Figure: a ground landmark p observed from three camera positions at times t₁, t₂, t₃.] Notation: p, ground landmark; qᵢ, line of sight (LOS) at time tᵢ; λᵢ, range to the landmark at tᵢ; Tᵢⱼ, translation from the camera position at tᵢ to the camera position at tⱼ.
Three-View Geometry (cont.). Coordinate systems: L, Local Level Local North (LLLN); C, camera. Position of the ground landmark p relative to the camera position at t₁: λ₁ q₁ = T₁₂ + λ₂ q₂ and λ₁ q₁ = T₁₂ + T₂₃ + λ₃ q₃. Expressing all vectors in the LLLN system of t₂, with C_{Cᵢ}^{L₂} the DCM from the camera frame at tᵢ to LLLN at t₂: λ₁ C_{C₁}^{L₂} q₁^{C₁} = T₁₂^{L₂} + λ₂ C_{C₂}^{L₂} q₂^{C₂}, and similarly for the three-view relation. (qᵢ: LOS; λᵢ: range.)
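A minimal numerical sketch of these two relations, under assumptions not in the slide: synthetic landmark and camera positions, all expressed in one common LLLN-like frame, with the camera-to-LLLN DCMs taken as identity so rotations drop out:

```python
import numpy as np

# Hypothetical geometry: one ground landmark, three camera positions
p = np.array([50.0, 30.0, 0.0])            # ground landmark
pos1 = np.array([0.0, 0.0, -100.0])        # camera position at t1
pos2 = np.array([20.0, 5.0, -100.0])       # camera position at t2
pos3 = np.array([40.0, 12.0, -98.0])       # camera position at t3

# Ranges lambda_i and unit LOS vectors q_i from each view to the landmark
lam = [np.linalg.norm(p - c) for c in (pos1, pos2, pos3)]
q1, q2, q3 = [(p - c) / np.linalg.norm(p - c) for c in (pos1, pos2, pos3)]

T12 = pos2 - pos1                          # translation from view 1 to view 2
T23 = pos3 - pos2                          # translation from view 2 to view 3

# lambda1*q1 = T12 + lambda2*q2 = T12 + T23 + lambda3*q3
print(np.allclose(lam[0] * q1, T12 + lam[1] * q2))        # True
print(np.allclose(lam[0] * q1, T12 + T23 + lam[2] * q3))  # True
```

The same decomposition holds for any landmark and any three positions; only the frame bookkeeping (the DCMs suppressed here) changes in the full formulation.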
Three-View Geometry (cont.). Matrix formulation: the two relations above can be stacked as A [λ₁ λ₂ λ₃ 1]ᵀ = 0, with A = [q₁ −q₂ 0 −T₁₂ ; q₁ 0 −q₃ −(T₁₂+T₂₃)] ∈ ℝ^{6×4}. Note: the range parameters λᵢ are unknown; the LOS and translation vectors are expressed in the LLLN system of t₂. Since a nontrivial null vector exists, rank(A) < 4. Denote by Ã a matrix comprised of any 4 rows of A; then det(Ã) = 0.
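The rank deficiency can be checked numerically. A sketch with the same kind of hypothetical synthetic geometry as before (one common frame, identity camera DCMs); the sign convention of the last column is one consistent choice, not necessarily the slide's:

```python
import numpy as np
from itertools import combinations

# Synthetic landmark and camera positions (hypothetical values)
p = np.array([50.0, 30.0, 0.0])
pos = [np.array([0.0, 0.0, -100.0]),
       np.array([20.0, 5.0, -100.0]),
       np.array([40.0, 12.0, -98.0])]
q = [(p - c) / np.linalg.norm(p - c) for c in pos]
T12, T23 = pos[1] - pos[0], pos[2] - pos[1]

# Rows 1-3: lambda1*q1 - lambda2*q2 - T12         = 0
# Rows 4-6: lambda1*q1 - lambda3*q3 - (T12 + T23) = 0
A = np.block([[q[0][:, None], -q[1][:, None], np.zeros((3, 1)), -T12[:, None]],
              [q[0][:, None], np.zeros((3, 1)), -q[2][:, None], -(T12 + T23)[:, None]]])

print(A.shape)                   # (6, 4)
print(np.linalg.matrix_rank(A))  # 3, i.e. rank(A) < 4

# Every 4x4 submatrix formed from any 4 rows of A is singular
dets = [np.linalg.det(A[list(rows)]) for rows in combinations(range(6), 4)]
print(np.allclose(dets, 0.0, atol=1e-6))  # True
```

The nontrivial null vector is [λ₁, λ₂, λ₃, 1]ᵀ, which is why all fifteen 4-row determinants vanish.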
Three-View Geometry (cont.). The following constraints are obtained for a single ground landmark: (q₁ × q₂)ᵀ T₁₂ = 0; (q₂ × q₃)ᵀ T₂₃ = 0; (q₁ × q₂)ᵀ (T₂₃ × q₃) = (q₂ × q₃)ᵀ (T₁₂ × q₁). The first two equations are epipolar constraints; the third relates the magnitudes of T₁₂ and T₂₃. In compact form: dᵀ T₁₂ = 0, mᵀ T₂₃ = 0, bᵀ T₂₃ = cᵀ T₁₂, where d = d(q₁, q₂), m = m(q₂, q₃), b = b(q₁, q₂, q₃), c = c(q₁, q₂, q₃).
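A numerical check of the three constraints on synthetic data. The concrete expressions chosen for d, m, b, c below (triple-product rearrangements of the constraints) are one consistent realization of the slide's functional forms, and the geometry is hypothetical:

```python
import numpy as np

# Hypothetical landmark and camera positions, one common frame
p = np.array([50.0, 30.0, 0.0])
pos = [np.array([0.0, 0.0, -100.0]),
       np.array([20.0, 5.0, -100.0]),
       np.array([40.0, 12.0, -98.0])]
q1, q2, q3 = [(p - c) / np.linalg.norm(p - c) for c in pos]
T12, T23 = pos[1] - pos[0], pos[2] - pos[1]

d = np.cross(q1, q2)                 # d = d(q1, q2)
m = np.cross(q2, q3)                 # m = m(q2, q3)
b = np.cross(q3, np.cross(q1, q2))   # b = b(q1, q2, q3)
c = np.cross(q1, np.cross(q2, q3))   # c = c(q1, q2, q3)

print(abs(d @ T12) < 1e-12)            # epipolar constraint, views 1-2: True
print(abs(m @ T23) < 1e-12)            # epipolar constraint, views 2-3: True
print(abs(b @ T23 - c @ T12) < 1e-12)  # scale constraint linking T12, T23: True
```

The third identity is what ties the unknown scale of T₂₃ to that of T₁₂, which is the key addition over plain two-view epipolar geometry.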
Three-View Geometry (cont.). Multiple features (each LOS qᵢ expressed in its own camera frame Cᵢ): matching pairs between the 1st and 2nd views, {q₁, q₂}ᵢ, i = 1, …, N₁₂; matching pairs between the 2nd and 3rd views, {q₂, q₃}ᵢ, i = 1, …, N₂₃; matching triplets between the three views, {q₁, q₂, q₃}ᵢ, i = 1, …, N₁₂₃. Stacked constraints: bᵢᵀ T₂₃ − cᵢᵀ T₁₂ = 0, i = 1, …, N₁₂₃; dⱼᵀ T₁₂ = 0, j = 1, …, N₁₂; mᵣᵀ T₂₃ = 0, r = 1, …, N₂₃. These rows form the matrices B, C, D and a residual vector of length N = N₁₂₃ + N₁₂ + N₂₃.
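The stacking can be sketched as follows, again with a hypothetical synthetic scene (in practice the LOS sets come from image matching, and the pair sets need not coincide with the triplet set as they do here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical camera positions; T12, T23 as before
pos = [np.array([0.0, 0.0, -100.0]),
       np.array([20.0, 5.0, -100.0]),
       np.array([40.0, 12.0, -98.0])]
T12, T23 = pos[1] - pos[0], pos[2] - pos[1]

def los(p, c):
    v = p - c
    return v / np.linalg.norm(v)

# 8 synthetic ground landmarks observed in all three views
landmarks = rng.uniform([-100, -100, 0], [100, 100, 20], size=(8, 3))
trip = [(los(p, pos[0]), los(p, pos[1]), los(p, pos[2])) for p in landmarks]

# Residual: N123 scale constraints, then N12 and N23 epipolar constraints
z = []
for q1, q2, q3 in trip:  # triplets
    z.append(np.cross(q3, np.cross(q1, q2)) @ T23
             - np.cross(q1, np.cross(q2, q3)) @ T12)
for q1, q2, _ in trip:   # pairs between views 1-2 (reused from triplets here)
    z.append(np.cross(q1, q2) @ T12)
for _, q2, q3 in trip:   # pairs between views 2-3
    z.append(np.cross(q2, q3) @ T23)

z = np.array(z)
print(z.shape)             # (24,) = N123 + N12 + N23
print(np.allclose(z, 0.0)) # True for noise-free data
```

With real, noisy LOS measurements z is no longer zero; it becomes the residual that drives the filter update described next.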
Fusion with Navigation. Implicit Extended Kalman Filter (IEKF) formulation. Recall the stacked residual z of length N built from B, C, D. All original LOS vectors are expressed in the camera system of the appropriate view. T₁₂ and T₂₃ are functions of Pos(t₁), Pos(t₂), Pos(t₃), where t₃ is the current time, so z = h(Pos(t₁), Ψ(t₁), Pos(t₂), Ψ(t₂), Pos(t₃), Ψ(t₃), {q₁^{C₁}, q₂^{C₂}, q₃^{C₃}}ᵢ).
Fusion with Navigation (cont.). State vector definition: X = [ΔP ΔV ΔΨ d b]ᵀ ∈ ℝ¹⁵ (position, velocity and attitude errors; gyro drift d; accelerometer bias b). Continuous system matrix: Φ_c = [0 I 0 0 0; 0 0 A_s 0 C_B^L; 0 0 0 −C_B^L 0; 0 0 0 0 0; 0 0 0 0 0] ∈ ℝ^{15×15}, where A_s is a skew-symmetric matrix constructed from the accelerometer readings and C_B^L is the DCM from Body to the Local Level Local North system.
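A sketch of assembling such a 15x15 error-state matrix. Sign conventions for the attitude and drift coupling vary between INS error-model formulations; the layout below is one common choice and is not guaranteed to match the slide's exactly:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def phi_c(f_l, C_bl):
    """Continuous error-state matrix for X = [dP dV dPsi d b].

    f_l  : specific force expressed in LLLN (from accelerometer readings)
    C_bl : DCM from Body to LLLN
    """
    Z, I = np.zeros((3, 3)), np.eye(3)
    A_s = skew(f_l)
    return np.block([[Z, I, Z, Z, Z],        # dP_dot   = dV
                     [Z, Z, A_s, Z, C_bl],   # dV_dot   = A_s dPsi + C_bl b
                     [Z, Z, Z, -C_bl, Z],    # dPsi_dot = -C_bl d
                     [Z, Z, Z, Z, Z],        # d_dot    = 0
                     [Z, Z, Z, Z, Z]])       # b_dot    = 0

# Hypothetical inputs: level attitude, gravity-only specific force
Phi = phi_c(np.array([0.0, 0.0, -9.81]), np.eye(3))
print(Phi.shape)   # (15, 15)
```

The zero bottom rows encode the random-constant models for the gyro drift and accelerometer bias.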
Fusion with Navigation (cont.). Standard IEKF update-step equations (all quantities at step k+1; P⁻, X̂⁻ denote the a-priori values): K = P⁻ Lᵀ (L P⁻ Lᵀ + D R Dᵀ)⁻¹; X̂⁺ = X̂⁻ + K (z − L X̂⁻); P⁺ = (I − K L) P⁻ (I − K L)ᵀ + K D R Dᵀ Kᵀ (Joseph form). [Diagram: simplified vision-aided navigation scheme.]
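A compact sketch of this update step, assuming the slide's notation: L is the Jacobian of the implicit residual with respect to the state and D its Jacobian with respect to the measurement noise v ~ N(0, R). The Joseph form keeps the updated covariance symmetric and positive definite:

```python
import numpy as np

def iekf_update(x, P, z, L, D, R):
    """One IEKF update for an implicit measurement z = h(...) ~ L @ dx + D @ v."""
    S = L @ P @ L.T + D @ R @ D.T          # innovation covariance
    K = P @ L.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - L @ x)            # state correction
    IKL = np.eye(len(x)) - K @ L
    P_new = IKL @ P @ IKL.T + K @ D @ R @ D.T @ K.T  # Joseph form
    return x_new, P_new

# Toy usage with hypothetical low dimensions (2-state, scalar residual)
x = np.zeros(2)
P = np.eye(2)
z = np.array([0.5])
L = np.array([[1.0, 0.0]])
D = np.array([[1.0]])
R = np.array([[0.1]])
x1, P1 = iekf_update(x, P, z, L, D, R)
print(np.allclose(P1, P1.T))   # covariance stays symmetric: True
```

For large residual vectors, `np.linalg.solve` on S is preferable to forming the explicit inverse; the explicit form above mirrors the slide's equations.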
Experiment Results: Experiment Setup. An IMU and a camera were mounted on top of a ground vehicle. IMU/INS: Xsens MTi-G; camera: Axis 207MW. IMU data and captured images were stored and synchronized (IMU data @ Hz; imagery data @ 5 Hz). The method was applied in two scenarios: sequential updates and loop updates.
Experiment Results (Cont.). True trajectory: [figures: North, East and Height components of the true trajectory vs. time; 3D view (North/East/Alt, in meters); samples of the recorded imagery].
Experiment Results (Cont.). Implementation: matching details (simplified). SIFT feature extraction from each image; feature matching based on the descriptor vectors; false-match rejection using RANSAC over the fundamental-matrix model (the fundamental matrix itself is not required elsewhere). This yields the pair sets {q₁^{C₁}, q₂^{C₂}}ᵢ and {q₂^{C₂}, q₃^{C₃}}ᵢ and the triplet set {q₁^{C₁}, q₂^{C₂}, q₃^{C₃}}ᵢ. [Example images.]
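The triplet bookkeeping above can be sketched with synthetic descriptors. This is a plain mutual-nearest-neighbour matcher, not the slide's full SIFT + RANSAC pipeline (a real implementation would use SIFT descriptors and fundamental-matrix RANSAC, e.g. via OpenCV); the descriptor values and dimensions are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def mutual_nn(desc_a, desc_b):
    """Mutual nearest-neighbour matches between two descriptor sets."""
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    ab = d.argmin(axis=1)   # best match in b for each descriptor in a
    ba = d.argmin(axis=0)   # best match in a for each descriptor in b
    return [(i, j) for i, j in enumerate(ab) if ba[j] == i]

# 10 "true" features seen in all three images, plus small per-image noise
base = rng.normal(size=(10, 8))
img1 = base + 0.01 * rng.normal(size=base.shape)
img2 = base + 0.01 * rng.normal(size=base.shape)
img3 = base + 0.01 * rng.normal(size=base.shape)

m12 = dict(mutual_nn(img1, img2))   # pairs between views 1 and 2
m23 = dict(mutual_nn(img2, img3))   # pairs between views 2 and 3

# Triplets: features matched 1->2 and 2->3 through the same index in image 2
triplets = [(i, j, m23[j]) for i, j in m12.items() if j in m23]
print(len(triplets))    # 10
```

Chaining the 1-2 and 2-3 matches through the middle image is what produces the triplet set that feeds the third (scale) constraint.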
Experiment Results (Cont.). Sequential updates vs. loop updates: [figures: position in NED (Estimated, Inertial, True) and position error with square-root covariance (Nav Err, Sqrt Cov), North/East/Height components in meters vs. time in seconds].
Potential Applications. Navigation aiding in loop scenarios; additional scenarios with three overlapping images; satellite navigation; formation-flying navigation. The developed constraints may also be used in a SLAM framework.
Summary. A new method for vision-aided navigation based on three-view geometry was presented. The method requires no a-priori information or external sensors apart from the camera. The navigation data associated with each of the three images makes it possible to resolve the scale ambiguity. The method reduces position errors in all axes to the error levels present when the first two images were captured. Computational requirements for the vision-aiding phase are reduced; environment-representation construction (e.g. mosaicking) may be executed as a background process. Various potential applications.