Vision-based SLAM
Simon Lacroix, Robotics and AI group, LAAS/CNRS, Toulouse
With contributions from: Anthony Mallet, Il-Kyun Jung, Thomas Lemaire and Joan Sola
Perceive data
• In a volume
• Very far
• Very precisely
1024 x 1024 pixels, 60º x 60º FOV, 0.06º pixel resolution, 1.0 cm at 10.0 m
• Stereovision: 2 cameras provide depth
Benefits of vision for SLAM?
• Cameras: low cost, light and power-saving
• Images carry a vast amount of information
• A vast know-how exists in the computer vision community
0. A few words on stereovision
• The way humans perceive depth
• Very popular in the early 20th century
• Anaglyphs (red/blue), polarization
(Figures: stereo camera, stereo image pair, stereo images viewer)
Principle of stereovision
In 2 dimensions (two linear cameras):
(Figure: left and right cameras separated by the baseline b, left and right images, disparity d)
Principle of stereovision
In 3 dimensions (two usual matrix cameras):
• Establish the geometry of the system (offline calibration)
• Establish matches between the two images and compute their disparity
• From the disparities of the matches, compute the 3D coordinates
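As an illustration of the last step, the sketch below triangulates a matched pixel into 3D coordinates for an ideal rectified pair; the focal length, baseline and principal point values are placeholders, not those of the LAAS stereo bench.

```python
# Minimal sketch (not the authors' code): 3D reconstruction from disparity for
# an ideal rectified stereo pair with identical pinhole cameras.
# Assumed parameters: focal length f (pixels), baseline b (metres),
# principal point (cx, cy); the values below are illustrative only.
import numpy as np

def disparity_to_3d(u, v, d, f=1000.0, b=0.1, cx=512.0, cy=512.0):
    """Triangulate a left-image pixel (u, v) with disparity d (pixels).

    Returns the 3D point (x, y, z) in the left camera frame, in metres.
    """
    z = f * b / d          # depth from disparity
    x = (u - cx) * z / f   # lateral coordinate
    y = (v - cy) * z / f   # vertical coordinate
    return np.array([x, y, z])

# Example: a point seen 40 pixels apart in the two images of a 10 cm baseline rig
print(disparity_to_3d(700.0, 400.0, 40.0))   # about 2.5 m away
```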
Geometry of stereovision
(Figure: scene points P, P1, P2 and their projections pl, pr1, pr2 in the left and right camera frames Ol, Or)
Geometry of stereovision
(Figure: points P, Q and their projections pl, pr, Ql, Qr in the two camera frames)
Geometry of stereovision
Epipolar geometry: epipolar lines and epipoles
(Figure: epipolar geometry between the two camera frames Ol, Or)
Stereo image rectification
• Goal: transform the images so that the epipolar lines are parallel
• Interest: reduces the computational cost of the matching process
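For reference, rectification is what modern libraries provide off the shelf; the hedged sketch below uses OpenCV (not the tool of the original work), with placeholder calibration values standing for the offline-estimated geometry of the bench.

```python
# Hedged sketch: stereo rectification with OpenCV. K1, K2, dist1, dist2, R, T
# stand for the result of an offline calibration; the values are placeholders.
import cv2
import numpy as np

size = (1024, 1024)
K1 = K2 = np.array([[1000., 0., 512.], [0., 1000., 512.], [0., 0., 1.]])
dist1 = dist2 = np.zeros(5)
R = np.eye(3)                        # relative rotation between the cameras
T = np.array([[0.1], [0.], [0.]])    # 10 cm baseline along x

R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, dist1, K2, dist2, size, R, T)
map1l, map2l = cv2.initUndistortRectifyMap(K1, dist1, R1, P1, size, cv2.CV_32FC1)
map1r, map2r = cv2.initUndistortRectifyMap(K2, dist2, R2, P2, size, cv2.CV_32FC1)
# left_rect = cv2.remap(left_image, map1l, map2l, cv2.INTER_LINEAR)
# right_rect = cv2.remap(right_image, map1r, map2r, cv2.INTER_LINEAR)
```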
Dense pixel-based stereovision
Problem: « for each pixel in the left image, find its correspondent in the right image »
The matches are computed on windows.
Several ways to compare windows: SAD, SSD, ZNCC, Hamming distance on census-transformed images…
(Figure: a left scanline and the candidate matches along the right scanline)
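To make the window comparison concrete, here is a minimal SAD-based search along a rectified scanline; it is a generic sketch of the idea (window size, disparity range and the SAD score are illustrative choices), not the LAAS dense stereovision code.

```python
# Minimal sketch of dense window matching on rectified images: for a pixel of
# the left line, compare a small window with every candidate along the same
# right line using SAD (a ZNCC score would simply replace it).
import numpy as np

def sad_disparity(left, right, u, v, win=5, max_disp=64):
    """Return the disparity minimising the SAD score for left pixel (u, v)."""
    h = win // 2
    ref = left[v - h:v + h + 1, u - h:u + h + 1].astype(np.float32)
    best_d, best_score = 0, np.inf
    for d in range(0, min(max_disp, u - h) + 1):
        cand = right[v - h:v + h + 1, u - d - h:u - d + h + 1].astype(np.float32)
        score = np.abs(ref - cand).sum()      # SAD over the window
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```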
Dense pixel-based stereovision
(Figures: original image, disparity map, 3D image)
Outline 0. A few words on stereovision 0-bis. Visual odometry
Visual odometry principle
1. Stereovision → 2. Pixel selection → 3. Pixel tracking → 4. Motion estimation (see the sketch below)
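The motion estimation step (4) boils down to fitting a rigid transform between the two sets of matched 3D points; a generic SVD-based least-squares sketch is given below, which is one standard way to do it and not necessarily the estimator used in this work.

```python
# Least-squares rigid transform between two matched 3D point sets (Kabsch / SVD
# method): a generic sketch of step 4, not the slides' own estimator.
import numpy as np

def rigid_transform_3d(P, Q):
    """Return R, t minimising sum ||R @ P_i + t - Q_i||^2 (P, Q: Nx3 arrays)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```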
Visual odometry
• Fairly good precision (errors of about 1% on 100 m trajectories)
• But:
  • depends on odometry (to track pixels)
  • no error model available
Visual odometry
• Applied on the Mars Exploration Rovers
(Figure: rover traverse with 50% slip)
Outline 0. A few words on stereovision 0-bis. Visual odometry 1. Stereovision SLAM
What kind of landmarks?
Interest points = sharp peaks of the autocorrelation function
Harris detector (precise version [Schmid 98])
Auto-correlation matrix: the principal curvatures are defined by its two eigenvalues (s: scale of the detection)
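For concreteness, a minimal Harris response computation is sketched below using the standard formulation (the "precise" variant of [Schmid 98] differs in its derivative and smoothing kernels); sigma plays the role of the detection scale s and k is the usual Harris constant.

```python
# Minimal Harris interest point sketch (standard formulation, illustrative only).
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def harris_response(img, sigma=1.5, k=0.04):
    img = img.astype(np.float32)
    Ix = sobel(img, axis=1)            # image derivatives
    Iy = sobel(img, axis=0)
    # auto-correlation matrix entries, smoothed at scale sigma
    Sxx = gaussian_filter(Ix * Ix, sigma)
    Syy = gaussian_filter(Iy * Iy, sigma)
    Sxy = gaussian_filter(Ix * Iy, sigma)
    det = Sxx * Syy - Sxy ** 2         # product of the two eigenvalues
    trace = Sxx + Syy                  # sum of the two eigenvalues
    return det - k * trace ** 2        # high where both principal curvatures are large
```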
Landmarks: interest points
• Landmark matching: which detected point corresponds to which between the two images?
Interest point stability
• Interest point repeatability: e.g. 70% = 7 repeated points out of 10 detected points
• Interest point similarity: resemblance measure of the two principal curvatures of repeated points (maximum similarity: 1)
Interest point stability
Repeatability and point similarity evaluated under known artificial rotations and scale changes
Interest point matching
Principle: combine signal and geometric information to match groups of points [Jung ICCV 01]
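The "signal" side of this matching is typically a correlation score between the point windows; the sketch below shows a ZNCC measure as a generic example, while the geometric group-matching of [Jung ICCV 01] is not reproduced here.

```python
# Zero-mean normalised cross-correlation (ZNCC) between two equally sized
# windows; 1.0 means identical up to an affine change of intensity.
import numpy as np

def zncc(w1, w2):
    a = w1.astype(np.float32) - w1.mean()
    b = w2.astype(np.float32) - w2.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```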
Landmark matching results
(Figures: matches between consecutive images, under a large viewpoint change, and with a small overlap)
Landmark matching results
(Figures: matches under a 1.5 scale change and a 3.0 scale change)
Landmark matching results (ctd.)
Another example (figures: detected points; matched points)
Stereovision SLAM
Generic SLAM functions and their vision-based implementation:
• Landmark detection → vision: interest points
• Relative observations (measures) of the landmark positions → stereovision
• Relative observations (measures) of the robot motions → visual motion estimation
• Observation associations → interest point matching
• Refinement of the landmark and robot positions → extended Kalman filter
Dense stereovision is actually not required: interest point matching is applied on the stereo frames (which is even easier!)
Visual motion estimation
1. Stereovision → 2. Interest point detection → 3. Interest point matching → 4. Stereovision → 5. Motion estimation
Stereovision SLAM
• Landmark detection → vision: interest points (OK)
• Relative observations of the landmark positions → stereovision (OK)
• Relative observations of the robot motions → visual motion estimation (OK)
• Observation associations → interest point matching (OK)
• Refinement of the landmark and robot positions → extended Kalman filter
Need to estimate the errors
Setting up the Kalman filter
• System state
• System equation
• Observation equation
• Prediction: motion estimates
• Landmark "discovery": stereovision
• Observation: matching + stereovision
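The formulas for the state and equations did not survive in this transcript; as a hedged reminder of what these items usually look like, the standard EKF-SLAM form is sketched below (an assumption, not necessarily the exact equations of the original slide).

```latex
% Standard EKF-SLAM formulation (generic sketch).
\begin{align*}
  X_k &= \big(x^{r}_k,\; x^{l_1},\dots,x^{l_n}\big)
        && \text{system state: robot pose and landmark positions}\\
  X_{k+1} &= f(X_k, u_k) + v_k, \quad v_k \sim \mathcal{N}(0, Q_k)
        && \text{system equation (prediction from the motion estimate } u_k\text{)}\\
  z_k &= h(X_k) + w_k, \quad w_k \sim \mathcal{N}(0, R_k)
        && \text{observation equation (matching + stereovision)}
\end{align*}
```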
Error estimates (1)
Stereovision error:
• Errors on the disparity estimates: empirical study (σ)
• Errors on the 3D coordinates: online estimation of the errors
(Figure: maximal errors for a 0.4 m and a 1.2 m baseline)
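For the rectified model of the earlier slides, the link between the disparity error and the depth error can be written to first order as below; this is a generic derivation, not necessarily the empirical model studied in the talk, but it explains why the 1.2 m baseline yields smaller maximal errors than the 0.4 m one.

```latex
% First-order propagation of the disparity error to the depth error,
% assuming z = b f / d (rectified cameras, baseline b, focal length f):
\[
  z = \frac{b\,f}{d}
  \quad\Longrightarrow\quad
  \sigma_z \;\approx\; \Big|\frac{\partial z}{\partial d}\Big|\,\sigma_d
          \;=\; \frac{b\,f}{d^{2}}\,\sigma_d
          \;=\; \frac{z^{2}}{b\,f}\,\sigma_d
\]
```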
Error estimates (2)
• Interest point matching error (not mis-matching):
  - correlation surface built thanks to a rotation- and scale-adaptive correlation, fitted with a Gaussian distribution (variance of the stereovision error: about 1 pixel)
• Combination of matching and stereo errors:
  - driven by the 8 neighbouring 3D points, projecting the one-sigma covariance ellipse onto the 3D surface
(Figures: correlation surface and fitted Gaussian distribution)
Error estimates (3)
• Visual motion estimation error: propagating the uncertainty of the set of matched 3D points to the optimal motion estimate
  - set of matched 3D points
  - optimal motion estimate
  - cost function
• Covariance of the random perturbation u: propagation using a Taylor series expansion of the Jacobian of the cost function around the optimal estimate
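The propagation itself is not written out in the transcript; the sketch below gives the standard first-order result for an estimate defined as the minimiser of a cost function, with p the matched 3D points and Σ_p their covariance (an assumption: the talk's exact derivation may differ in notation).

```latex
% Generic first-order covariance propagation through a least-squares
% estimator u* = argmin_u C(u, p):
\[
  \Sigma_{u} \;\approx\;
  \Big(\tfrac{\partial^{2} C}{\partial u^{2}}\Big)^{-1}
  \tfrac{\partial^{2} C}{\partial u\,\partial p}\;
  \Sigma_{p}\;
  \Big(\tfrac{\partial^{2} C}{\partial u\,\partial p}\Big)^{\!\top}
  \Big(\tfrac{\partial^{2} C}{\partial u^{2}}\Big)^{-\top}
\]
```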
Results
70 m loop, altitude from 25 to 30 m, 90 stereo pairs processed
(Figures: trajectory and landmarks; landmark error ellipses (×40); position and attitude variances)
Results (ctd.)
270 m loop, altitude from 25 to 30 m, 400 stereo pairs processed, 350 landmarks mapped
(Figures: trajectory and landmarks; landmark error ellipses (×30); position and attitude variances)
Application to ground rovers
• 110 stereo pairs processed, 60 m loop
(Figure: landmark uncertainty ellipses (×5))
Application to indoor robots
• About 30 m long trajectory, 1300 stereo image pairs
(Figures: map at the beginning, middle and end of the loop; covariance ellipses magnified 10 times)
• Consistency check: since the robot moves on a planar floor, the two camera rotation angles (phi, theta) and the elevation must remain zero
(Figure: estimated phi, theta and elevation along the trajectory)
Outline 0. A few words on stereovision 0-bis. Visual odometry 1. Stereovision SLAM 2. Monocular (bearing-only) SLAM
Bearing-only SLAM
Generic SLAM functions and their stereovision SLAM implementation:
• Landmark detection → vision: interest points
• Relative observations of the landmark positions → stereovision
• Relative observations of the robot motions → visual motion estimation
• Observation associations → interest point matching
• Refinement of the landmark and robot positions → extended Kalman filter
Bearing-only SLAM
Generic SLAM functions and their monocular SLAM implementation:
• Landmark detection → vision: interest points
• Relative observations of the landmark positions → « multi-view stereovision »
• Relative observations of the robot motions → INS, motion model, GPS…
• Observation associations → interest point matching
• Refinement of the landmark and robot positions → particle filter + extended Kalman filter
Landmark observations: « observation filter » ≈ Gaussian particles
1. Landmark initialisation
2. Landmark observations
(see the initialisation sketch below)
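As an illustration of step 1, the sketch below initialises a landmark seen under a single bearing as a sum of Gaussians ("Gaussian particles") spread along the viewing ray; the depth range, the geometric spacing and the sigma/depth ratio are illustrative assumptions, not the values used in the talk.

```python
# Hedged sketch of bearing-only landmark initialisation: the depth along the
# ray is unknown, so the landmark is represented by a set of Gaussians spread
# along the ray (depths in geometric progression, uncertainty growing with depth).
import numpy as np

def init_landmark(origin, direction, d_min=0.5, d_max=20.0, ratio=1.3, rel_sigma=0.2):
    """Return a list of (mean, covariance, weight) Gaussians along the ray."""
    direction = direction / np.linalg.norm(direction)
    gaussians, d = [], d_min
    while d <= d_max:
        mean = origin + d * direction
        sigma = rel_sigma * d                  # uncertainty grows with depth
        cov = (sigma ** 2) * np.eye(3)         # isotropic for simplicity
        gaussians.append((mean, cov))
        d *= ratio
    w = 1.0 / len(gaussians)                   # uniform initial weights
    return [(m, c, w) for m, c in gaussians]

# Example: landmark observed from the origin along the x axis
print(len(init_landmark(np.zeros(3), np.array([1.0, 0.0, 0.0]))))
```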