Presentation Transcript


  1. A Framework for use in SLAM algorithms. Principal Investigator: Shaun Egan. Supervisor: Dr. Karen Bradshaw

  2. SLAM (Simultaneous Localization and Mapping) is a family of algorithms

  3. Odometry Algorithm: its inputs are the left and right wheel encoder counts, CountLeft and CountRight

  4. Change in wheel position: Δd = (c / i) · π · d, where c: encoder count, d: wheel diameter, i: encoder increments per revolution. Change in heading: Δθ = (Δd_right - Δd_left) / (a · d), where r: right encoder count, l: left encoder count, a: axle-to-wheel ratio, d: wheel diameter. Average heading: the last known heading plus Δθ / 2. New robot position and heading: the robot advances by the mean of the two wheel displacements along the average heading, and its heading increases by Δθ (see the sketch below).
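The slide's odometry equations do not survive the transcript intact, so here is a minimal Java sketch of a standard differential-drive odometry update consistent with the variable definitions above. Names such as ticksPerRevolution and axleLength are illustrative assumptions, not identifiers from the framework.

// Sketch of a differential-drive odometry update (assumed standard formulas,
// not the framework's actual code).
public class Odometry {
    private final double wheelDiameter;      // d: wheel diameter
    private final double ticksPerRevolution; // i: encoder increments per revolution
    private final double axleLength;         // distance between the drive wheels (assumed parameter)

    private double x, y, heading;            // current pose estimate

    public Odometry(double wheelDiameter, double ticksPerRevolution, double axleLength) {
        this.wheelDiameter = wheelDiameter;
        this.ticksPerRevolution = ticksPerRevolution;
        this.axleLength = axleLength;
    }

    /** Update the pose from the change in the left/right encoder counts since the last call. */
    public void update(long countLeft, long countRight) {
        // Change in wheel position: distance = (c / i) * pi * d
        double distLeft  = (countLeft  / ticksPerRevolution) * Math.PI * wheelDiameter;
        double distRight = (countRight / ticksPerRevolution) * Math.PI * wheelDiameter;

        // Change in heading: difference in wheel travel divided by the axle length
        double deltaHeading = (distRight - distLeft) / axleLength;

        // Average heading over the interval: last known heading plus half the change
        double avgHeading = heading + deltaHeading / 2.0;

        // New robot position and heading
        double distCentre = (distLeft + distRight) / 2.0;
        x += distCentre * Math.cos(avgHeading);
        y += distCentre * Math.sin(avgHeading);
        heading += deltaHeading;
    }

    public double[] pose() {
        return new double[] { x, y, heading };
    }
}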

  5. The problem with ultrasonic range detection. The answer?

  6. Range scans • Triangulation

  7. Triangulation: • Add the current sensor information to the sliding window • Compare each of the new sensor range readings with all other previous readings in the window • Triangulate these readings to create valid points for each new sensor reading: calculate the required values (s: sensor position, r: range value), calculate the points of intersection of the two range circles, and finally find which point is the one we want (the one visible to both sensors), testing each candidate point (i) against each sensor (k) (see the sketch below)
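The intersection formulas themselves do not survive the transcript, so below is a hedged Java sketch of this step: each reading is assumed to be a (sensor position, range, bearing) triple, and the "visible to both sensors" test is approximated by a field-of-view cone check. The class, method, and parameter names are illustrative, not the framework's actual API.

// Sketch: intersect the range circles of two ultrasonic readings taken from
// different positions to obtain candidate landmark points.
public final class Triangulation {

    /** Assumed representation of one range reading in the sliding window. */
    public static final class Reading {
        final double sx, sy;   // s: sensor position when the reading was taken
        final double range;    // r: measured range
        final double bearing;  // sensor facing direction (radians), used for the visibility test
        Reading(double sx, double sy, double range, double bearing) {
            this.sx = sx; this.sy = sy; this.range = range; this.bearing = bearing;
        }
    }

    /**
     * Returns the intersection points of the two range circles,
     * or an empty array if the circles do not intersect.
     */
    public static double[][] intersect(Reading a, Reading b) {
        double dx = b.sx - a.sx, dy = b.sy - a.sy;
        double d = Math.hypot(dx, dy);
        if (d == 0 || d > a.range + b.range || d < Math.abs(a.range - b.range)) {
            return new double[0][]; // no intersection: the readings cannot share a common point
        }
        // Distance from sensor a along the centre line to the chord joining the intersections
        double along = (a.range * a.range - b.range * b.range + d * d) / (2 * d);
        double h = Math.sqrt(Math.max(0, a.range * a.range - along * along));
        double mx = a.sx + along * dx / d, my = a.sy + along * dy / d;
        return new double[][] {
            { mx + h * dy / d, my - h * dx / d },
            { mx - h * dy / d, my + h * dx / d }
        };
    }

    /** Keep only a candidate point that lies inside both sensors' (assumed) field of view. */
    public static boolean visibleToBoth(double px, double py, Reading a, Reading b, double halfFov) {
        return withinCone(px, py, a, halfFov) && withinCone(px, py, b, halfFov);
    }

    private static boolean withinCone(double px, double py, Reading r, double halfFov) {
        double angleToPoint = Math.atan2(py - r.sy, px - r.sx);
        // Wrap the angular difference into [-pi, pi] before comparing against the cone half-angle
        double diff = Math.atan2(Math.sin(angleToPoint - r.bearing), Math.cos(angleToPoint - r.bearing));
        return Math.abs(diff) <= halfFov;
    }
}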

  8. Range scans • Triangulation • Point refinement

  9. Point refinement: calculate the average position of the triangulated points by summing over every point (i), then create a new landmark at that average position: new Landmark() (see the sketch below)
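A minimal sketch of this refinement step: average the candidate points and construct a new landmark from the result. The Landmark fields shown here (a position plus a triangulation count, which the merging rule on slide 11 relies on) are assumptions about the framework's class, not its actual definition.

import java.util.List;

// Sketch of point refinement: average the triangulated candidate points into one landmark.
public final class PointRefinement {

    /** Assumed minimal landmark representation; the framework's own Landmark class is not shown. */
    public static final class Landmark {
        double x, y;
        int triangulationCount; // number of triangulated points supporting this landmark

        Landmark(double x, double y, int triangulationCount) {
            this.x = x; this.y = y; this.triangulationCount = triangulationCount;
        }
    }

    /** Average the positions of all candidate points (each point is {x, y}) into a new landmark. */
    public static Landmark refine(List<double[]> points) {
        if (points.isEmpty()) {
            throw new IllegalArgumentException("no candidate points to refine");
        }
        double sumX = 0, sumY = 0;
        for (double[] p : points) { // for every point i, accumulate its position
            sumX += p[0];
            sumY += p[1];
        }
        int n = points.size();
        return new Landmark(sumX / n, sumY / n, n);
    }
}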

  10. Range scans • Triangulation • Point refinement • Data association • Landmark refinement

  11. Data association and landmark refinement: 1) The newly created landmark is added to the map. 2) Landmarks are compared against one another. 3) If two landmarks are within a certain range of each other, they are considered the same landmark and are merged according to their triangulation counts (see the sketch below).
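A sketch of this association-and-merge rule, reusing the assumed Landmark type from the point-refinement sketch above. The merge threshold (mergeRange) and the count-weighted averaging are illustrative readings of "merged according to their triangulation count", not the framework's confirmed behaviour.

import java.util.ArrayList;
import java.util.List;

// Sketch of data association: add a new landmark to the map, or merge it with a nearby one.
public final class LandmarkMap {

    private final List<PointRefinement.Landmark> map = new ArrayList<>();
    private final double mergeRange; // the "certain range" threshold; an assumed tuning parameter

    public LandmarkMap(double mergeRange) {
        this.mergeRange = mergeRange;
    }

    /** Associate a newly created landmark with the map, merging it into an existing one if close enough. */
    public void add(PointRefinement.Landmark candidate) {
        for (PointRefinement.Landmark existing : map) {
            double dx = existing.x - candidate.x, dy = existing.y - candidate.y;
            if (Math.hypot(dx, dy) <= mergeRange) {
                // Considered the same landmark: merge weighted by triangulation counts,
                // so a better-supported landmark moves less.
                int total = existing.triangulationCount + candidate.triangulationCount;
                existing.x = (existing.x * existing.triangulationCount
                        + candidate.x * candidate.triangulationCount) / total;
                existing.y = (existing.y * existing.triangulationCount
                        + candidate.y * candidate.triangulationCount) / total;
                existing.triangulationCount = total;
                return;
            }
        }
        map.add(candidate); // no landmark nearby: keep the new one as a separate entry
    }

    public List<PointRefinement.Landmark> landmarks() {
        return map;
    }
}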

  12. phew…

  13. Additional landmark extraction and data association models • The addition of a package for localization algorithms • The addition of a package for loop closure utilities • Extensions for cooperative SLAM

  14. ?
