
Vision-based SLAM


Presentation Transcript


  1. Vision-based SLAM Simon Lacroix Robotics and AI group LAAS/CNRS, Toulouse With contributions from: Anthony Mallet, Il-Kyun Jung, Thomas Lemaire and Joan Sola

  2. Perceive data • In a volume • Very far • Very precisely: 1024 x 1024 pixels, 60º x 60º FOV → 0.06º pixel resolution → 1.0 cm at 10.0 m • Stereovision: 2 cameras provide depth. Benefits of vision for SLAM? • Cameras: low cost, light and power-saving • Images carry a vast amount of information • A vast know-how exists in the computer vision community
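The resolution figures on this slide follow from simple arithmetic; here is a minimal check in Python (the per-pixel footprint uses a small-angle approximation, an assumption for illustration, not a statement from the talk):

```python
import math

# Values from the slide: 1024 x 1024 pixels over a 60 deg x 60 deg field of view.
pixels = 1024
fov_deg = 60.0

# Angular resolution per pixel.
pixel_res_deg = fov_deg / pixels               # ~0.059 deg/pixel
pixel_res_rad = math.radians(pixel_res_deg)

# Lateral size covered by one pixel at a 10 m range (small-angle approximation).
range_m = 10.0
footprint_m = range_m * pixel_res_rad          # ~0.010 m, i.e. about 1 cm
print(f"{pixel_res_deg:.3f} deg/pixel -> {footprint_m * 100:.1f} cm per pixel at {range_m} m")
```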

  3. 0. A few words on stereovision • The way humans perceive depth • Very popular in the early 20th century • Anaglyphs, polarization, red/blue [figures: stereo camera, stereo image pair, stereo images viewer]

  4. Principle of stereovision. In 2 dimensions (two linear cameras): a point seen by both cameras projects at different positions in the left and right images; the offset d between the two projections is the disparity, which together with the baseline b encodes the depth of the point. [figure: left and right images, disparity d, baseline b, left and right cameras]

  5. Principle of stereovision. In 3 dimensions (two usual matrix cameras): • Establish the geometry of the system (off line) • Establish matches between the two images, compute the disparity • On the basis of the matches' disparity, compute the 3D coordinates
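For a rectified pair, the last step reduces to the classical triangulation Z = f·b/d. A minimal sketch, with purely hypothetical calibration values (focal length, principal point and baseline are illustrative, not from the talk):

```python
import numpy as np

def disparity_to_xyz(u, v, d, fx, fy, cx, cy, b):
    """Triangulate pixel (u, v) with disparity d from a rectified stereo pair.
    fx, fy: focal lengths in pixels; (cx, cy): principal point; b: baseline in meters."""
    z = fx * b / d                # depth from disparity
    x = (u - cx) * z / fx         # back-project into the left-camera frame
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Hypothetical calibration, for illustration only.
p = disparity_to_xyz(u=620.0, v=400.0, d=12.5, fx=700.0, fy=700.0, cx=512.0, cy=512.0, b=0.35)
print(p)   # 3D coordinates in meters, expressed in the left-camera frame
```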

  6. Geometry of stereovision [figure: 3D points P, P1, P2 and their projections pl, pr, pr1, pr2 in the left and right image frames, with optical centres Ol and Or]

  7. Geometry of stereovision [figure: 3D points P and Q, their projections pl, pr, Ql, Qr, and the optical centres Ol and Or]

  8. Geometry of stereovision: epipolar geometry [figure: epipolar lines and epipoles in the two image planes, optical centres Ol and Or]

  9. Stereo image rectification. Goal: transform the images so that the epipolar lines are parallel. Interest: reduces the computational cost of the matching process.
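Rectification is typically computed once from the off-line calibration. A minimal OpenCV sketch, with hypothetical calibration data (K1, K2, D1, D2, R, T below are illustrative values, not from the talk):

```python
import cv2
import numpy as np

# Hypothetical calibration results: intrinsics K, distortion D, extrinsics R, T.
K1 = K2 = np.array([[700.0, 0.0, 512.0], [0.0, 700.0, 512.0], [0.0, 0.0, 1.0]])
D1 = D2 = np.zeros(5)
R = np.eye(3)
T = np.array([[-0.35], [0.0], [0.0]])    # 0.35 m baseline along x
size = (1024, 1024)

# Rectifying rotations/projections so that epipolar lines become horizontal.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)

# Per-camera remap tables, then warp the raw images.
map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
# left_rect  = cv2.remap(left_raw,  map1x, map1y, cv2.INTER_LINEAR)
# right_rect = cv2.remap(right_raw, map2x, map2y, cv2.INTER_LINEAR)
```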

  10. Dense pixel-based stereovision. Problem: « for each pixel in the left image, find its corresponding pixel in the right image ». The matches are computed on windows; several ways to compare windows: “SAD”, “SSD”, “ZNCC”, Hamming distance on census-transformed images… [figure: matching a window along the left and right scan lines]
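A minimal sketch of the window comparison along a rectified scan line, here with the SAD score (one of the measures listed above; rectified grayscale NumPy arrays are assumed, and this is not the exact correlation used in the talk):

```python
import numpy as np

def sad_disparity(left, right, v, u, win=5, max_disp=64):
    """Disparity of pixel (v, u) in the left image, found by scanning the same
    row of the right image and comparing windows with the SAD score."""
    h = win // 2
    ref = left[v - h:v + h + 1, u - h:u + h + 1].astype(np.float32)
    best_d, best_score = 0, np.inf
    for d in range(0, min(max_disp, u - h) + 1):
        cand = right[v - h:v + h + 1, u - d - h:u - d + h + 1].astype(np.float32)
        score = np.abs(ref - cand).sum()        # SAD: sum of absolute differences
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```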

  11. Dense pixel-based stereovision [figures: original image, disparity map, 3D image]

  12. Outline 0. A few words on stereovision 0-bis. Visual odometry

  13. Visual odometry principle: 1. Stereovision → 2. Pixel selection → 3. Pixel tracking → stereovision on the new frame → 4. Motion estimation
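The motion-estimation step must recover the rigid transform that best aligns the 3D points tracked between the two stereo frames. One common least-squares solution (SVD-based) is sketched below as an illustration; it is not necessarily the estimator used in the talk:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, t such that Q ~ R @ p + t for matched 3D point sets
    P, Q of shape (N, 3), i.e. points tracked between two stereo frames."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```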

  14. Visual odometry • Fairly good precision (up to 1% on 100m trajectories) • But: • Depends on odometry (to track pixels) • No error model available

  15. Visual odometry • Applied on the Mars Exploration Rovers [figure: 50% slip]

  16. Outline 0. A few words on stereovision 0-bis. Visual odometry 1. Stereovision SLAM

  17. What kind of landmarks? Interest points = sharp peaks of the autocorrelation function. Harris detector (precise version [Schmidt 98]). Auto-correlation matrix: principal curvatures defined by the two eigenvalues of the matrix (σ: scale of the detection)
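As a rough illustration of this detector, the sketch below builds the auto-correlation (second-moment) matrix at every pixel and thresholds on its smaller eigenvalue; it implements plain Harris/Shi-Tomasi-style detection, not the scale-adapted "precise" variant cited on the slide:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def corner_response(img, sigma=1.5):
    """Smaller eigenvalue of the auto-correlation matrix at every pixel:
    a large value means both principal curvatures are large, i.e. a sharp
    peak of the autocorrelation function (an interest point candidate)."""
    img = img.astype(np.float32)
    Ix, Iy = sobel(img, axis=1), sobel(img, axis=0)
    # Gaussian-weighted entries of the 2x2 auto-correlation matrix [[A, C], [C, B]].
    A = gaussian_filter(Ix * Ix, sigma)
    B = gaussian_filter(Iy * Iy, sigma)
    C = gaussian_filter(Ix * Iy, sigma)
    # Closed-form smaller eigenvalue of a symmetric 2x2 matrix.
    lam_min = (A + B) / 2.0 - np.sqrt(((A - B) / 2.0) ** 2 + C ** 2)
    return lam_min      # interest points = local maxima of this response
```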

  18. Landmarks: interest points • Landmark matching [figure: matching interest points between two views]

  19. Interest point stability • Interest point repeatability: e.g. 70% = 7 repeated points out of 10 detected points • Interest point similarity: resemblance measure of the two principal curvatures of repeated points (maximum point similarity: 1)

  20. Interest points stability Repeatability and point similarity evaluation: Evaluated with known artificial rotation and scale changes

  21. Interest points matching Principle: combine signal and geometric information to match groups of points [Jung ICCV 01]

  22. Landmark matching results Consecutive images Large viewpoint change Small overlap

  23. Landmark matching results 1.5 scale change 3.0 scale change

  24. Landmark matching results (cont’d): another example [figures: detected points, matched points]

  25. Stereovision SLAM • Landmark detection • Relative observations (measures) • Of the landmark positions • Of the robot motions • Observation associations • Refinement of the landmark and robot positions • Vision : interest points • Stereovision • Visual motion estimation • Interest points matching • Extended Kalman filter

  26. Dense stereovision actually not required IP matching applied on stereo frames (even easier !)

  27. Dense stereovision actually not required IP matching applied on stereo frames (even easier !)

  28. Visual motion estimation: 1. Stereovision → 2. Interest point detection → 3. Interest point matching → 4. Stereovision → 5. Motion estimation

  29. Stereovision SLAM • Landmark detection • Relative observations (measures) • Of the landmark positions • Of the robot motions • Observation associations • Refinement of the landmark and robot positions • Vision: interest points → OK • Stereovision → OK • Visual motion estimation → OK • Interest points matching → OK • Extended Kalman filter

  30. Setting up the Kalman filter • System state: • System equation: • Observation equation: • Prediction: motion estimates • Landmark “discovery”: stereovision • Observation: matching + stereovision • Need to estimate the errors
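A minimal sketch of the filter structure this slide implies, with the state gathering the robot pose and the landmark positions, the prediction fed by the visual motion estimate and the update by a re-observed landmark. The class layout and argument names below are generic assumptions, not taken from the talk:

```python
import numpy as np

class EKFSlam:
    """Toy EKF-SLAM skeleton: x = [robot pose | landmark positions], P its covariance."""

    def __init__(self, x0, P0):
        self.x, self.P = x0.copy(), P0.copy()

    def predict(self, f, F, Q):
        """Prediction with motion model f (e.g. the visual motion estimate),
        its Jacobian F and the motion noise covariance Q."""
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        """Update with an observation z of a mapped landmark, observation model h,
        its Jacobian H and measurement noise R (from the stereo/matching errors)."""
        y = z - h(self.x)                        # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
```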

  31. Error estimates (1): stereovision error • Errors on the disparity estimates: empirical study (σ) • Errors on the 3D coordinates: online estimation of the errors • Maximal errors, 0.4 m baseline vs. 1.2 m baseline
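The influence of the baseline on the 3D error can be illustrated with the usual first-order propagation of the disparity error, σ_Z ≈ Z²·σ_d / (b·f). This closed form is a textbook assumption used here for illustration, not the empirical error model of the talk:

```python
def depth_sigma(z, baseline, focal_px, sigma_d=0.5):
    """First-order standard deviation of the depth estimate for a disparity error
    sigma_d (pixels), at depth z (m), baseline (m) and focal length (pixels)."""
    return (z ** 2) * sigma_d / (baseline * focal_px)

# Same range and disparity noise, the two baselines mentioned on the slide.
for b in (0.4, 1.2):
    print(b, depth_sigma(z=10.0, baseline=b, focal_px=700.0, sigma_d=0.5))
```

With identical range and disparity noise, the 0.4 m baseline yields three times the depth uncertainty of the 1.2 m one, which is why the maximal errors differ between the two setups.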

  32. Error estimates (2) • Interest point matching error (not mis-matching): the correlation surface, built with rotation- and scale-adaptive correlation, is fitted with a Gaussian distribution • Combination of matching and stereo error: driven by the 8 neighbouring 3D points, projecting the one-sigma covariance ellipse onto the 3D surface [figure: correlation surface and fitted Gaussian; annotations: variance of the stereovision error, 1 pixel, X0, Xk, wk]

  33. Error estimates (3) • Visual motion estimation error: propagate the uncertainty of the set of matched 3D points to the optimal motion estimate • Ingredients: the set of matched 3D points, the optimal motion estimate, the cost function • Covariance of the random perturbation u: propagated using a Taylor series expansion of the Jacobian of the cost function around the optimum
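For reference, the usual first-order propagation through a minimisation can be written as follows, with C the cost function, u the (perturbed) set of matched 3D points and x̂ the optimal motion estimate; this is a generic formulation and does not reproduce the slide's own formulas:

```latex
% At the optimum \hat{x}(u) we have \partial C / \partial x = 0, so by the implicit
% function theorem and a first-order Taylor expansion of the Jacobian of C:
\Sigma_{\hat{x}} \;\approx\;
  \left(\frac{\partial^2 C}{\partial x^2}\right)^{-1}
  \frac{\partial^2 C}{\partial x\,\partial u}\,
  \Sigma_u\,
  \left(\frac{\partial^2 C}{\partial x\,\partial u}\right)^{\!\top}
  \left(\frac{\partial^2 C}{\partial x^2}\right)^{-\top}
```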

  34. Results: 70 m loop, altitude from 25 to 30 m, 90 stereo pairs processed [figures: trajectory and landmarks, landmark error ellipses (×40), position and attitude variances]

  35. Results: 70 m loop, altitude from 25 to 30 m, 90 stereo pairs processed

  36. Results (cont’d): 270 m loop, altitude from 25 to 30 m, 400 stereo pairs processed, 350 landmarks mapped [figures: trajectory and landmarks, landmark error ellipses (×30), position and attitude variances]

  37. Results (cont’d): 270 m loop, altitude from 25 to 30 m, 400 stereo pairs processed, 350 landmarks mapped

  38. Application to ground rovers • 110 stereo pairs processed, 60 m loop [figure: landmark uncertainty ellipses (×5)]

  39. Application to ground rovers • 110 stereo pairs processed, 60m loop

  40. Application to indoor robots • About 30 m long trajectory, 1300 stereo image pairs

  41. Application to indoor robots • About 30 m long trajectory, 1300 stereo image pairs [figure: covariance ellipses (×10)]

  42. Application to indoor robots • About 30 m long trajectory, 1300 stereo image pairs [figures: beginning, middle and end of the loop, with covariance ellipses (×10)]

  43. Application to indoor robots • About 30 m long trajectory, 1300 stereo image pairs • The two rotation angles (phi, theta) and the elevation must be zero [figure: plots of phi, theta and elevation along the trajectory]

  44. Outline 0. A few words on stereovision 0-bis. Visual odometry 1. Stereovision SLAM 2. Monocular (bearing-only) SLAM

  45. Bearing-only SLAM Generic SLAM • Landmark detection • Relative observations (measures) • Of the landmark positions • Of the robot motions • Observation associations • Refinement of the landmark and robot positions Stereovision SLAM • Vision : interest points • Stereovision • Visual motion estimation • Interest points matching • Extended Kalman filter

  46. Bearing-only SLAM Generic SLAM • Landmark detection • Relative observations (measures) • Of the landmark positions • Of the robot motions • Observation associations • Refinement of the landmark and robot positions Monocular SLAM • Vision : interest points • « Multi-view stereovision » • INS, Motion model, GPS… • Interest points matching • Particle filter + extended Kalman filter

  47. Bearing-only SLAM • 1. Landmark initialisation: « observation filter » ≈ Gaussian particles • 2. Landmark observations
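One common way to implement such an initialisation is to spread weighted Gaussian hypotheses along the viewing ray of the newly observed landmark, then re-weight and prune them with later observations. The sketch below illustrates that idea only; the depth range, spacing and pruning policy are assumptions, not the exact filter of the talk:

```python
import numpy as np

def init_ray_gaussians(origin, direction, d_min=0.5, d_max=20.0, n=8):
    """Spread Gaussian depth hypotheses along the viewing ray of a newly seen
    landmark. Depths follow a geometric progression; each hypothesis carries a weight."""
    direction = direction / np.linalg.norm(direction)
    depths = d_min * (d_max / d_min) ** (np.arange(n) / (n - 1))
    hypotheses = []
    for d in depths:
        mean = origin + d * direction
        sigma = 0.3 * d                       # standard deviation grows with depth
        hypotheses.append({"mean": mean, "sigma": sigma, "weight": 1.0 / n})
    return hypotheses

# Subsequent bearing observations re-weight the hypotheses (e.g. by their
# measurement likelihood); unlikely ones are pruned until a single Gaussian
# remains and the landmark can enter the EKF map.
```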

  48. Bearing-only SLAM
