
Harsha Kikkeri, Gershon Parent, Mihai Jalobeanu, and Stan Birchfield Microsoft Robotics



  1. An Inexpensive Method for Evaluating the Localization Performance of a Mobile Robot Navigation System
  Harsha Kikkeri, Gershon Parent, Mihai Jalobeanu, and Stan Birchfield – Microsoft Robotics

  2. Motivation
  Goal: Automatically measure the performance of a mobile robot navigation system
  Purpose:
  • Internal comparison – how is my system improving over time?
  • External comparison – how does my system compare to others?
  Requirements:
  • Repeatable – not just playback of a recorded file, but running the system again (with environment dynamics)
  • Reproducible – others should be able to measure the performance of their system in their environment
  • Comparable – need to compare solutions with different hardware and sensors, in different environments
  • Inexpensive – cost should not be a barrier to use
  We focus only on localization performance here.

  3. Scalability
  • System should scale
    • in space (large environments)
    • in time (long runs)
    • in variety (different types of environments)
  • Simplicity is key to scalability:
    • Low setup time
    • Easy calibration
    • Inexpensive components
    • Non-intrusive

  4. Previous work
  • Datasets: Radish, New College, SLAM datasets – do not always have ground truth
  • SLAM with ground truth: Rawseeds, Freiburg, TUM – use prerecorded data, do not scale easily
  • Qualitative evaluation: RoboCupRescue, RoboCup@Home – focus is on achieving a particular task
  • Benchmarking initiatives: EURON, RoSta, PerMIS, RTP – have not yet created a definitive set of metrics / benchmarks for navigation
  • Comparison on a small scale: Teleworkbench – small scale
  • Retroreflective markers and laser: Tong-Barfoot ICRA 2011 – requires laser, subject to occlusion

  5. Our approach
  Landmark (mounted overhead, with x-y axes):
  • Checkerboard pattern
  • Yields 3D pose of camera relative to target
  • Convert to 2D pose of robot on floor
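The reduction from the camera's 3D pose (as returned by checkerboard pose estimation) to a 2D floor pose can be sketched as follows. This is our illustration, not the paper's code: the function name, the solvePnP-style convention (x_cam = R·x_landmark + t), and the choice of taking the heading from the camera's x-axis (the optical axis points up at the ceiling target, so it cannot supply a heading) are all assumptions.

```python
import numpy as np

def floor_pose_from_camera(R, t):
    """Reduce the 3D pose of the camera w.r.t. the landmark to a 2D pose.

    R, t follow the solvePnP convention: x_cam = R @ x_landmark + t.
    The (x, y) position is the camera center in the landmark's frame;
    the heading is the projection of the camera's x-axis onto the
    landmark's x-y plane (the optical axis points at the ceiling, so it
    is nearly vertical and unusable for heading).
    """
    c = -R.T @ t                             # camera center, landmark frame
    x_axis = R.T @ np.array([1.0, 0.0, 0.0])  # camera x-axis, landmark frame
    theta = np.arctan2(x_axis[1], x_axis[0])
    return c[0], c[1], theta
```

The calibration terms on slides 9-10 would then be applied to this (x, y, θ) to obtain the robot's pose.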

  6. A useful instrument
  Laser level:
  • Upward-facing laser provides plumb-up line
  • Downward-facing laser provides plumb-down line
  • Horizontal laser (not used)
  • Self-leveling, so plumb lines are parallel to gravity
  • Used to determine the point on the ground directly below the origin of the target

  7. Procedure
  • Calibration
    • Internal camera parameters
    • External camera parameters w.r.t. robot (position, tilt)
    • Floor parameters under each landmark (tilt)
  • Map-building
    • Build map
    • When under a landmark, user presses a button
    • Pose estimation + calibration → robot pose w.r.t. landmark
    • Store robot pose w.r.t. map*
  • Runtime
    • Generate sequence of waypoints
    • When the robot thinks it is under a landmark:*
    • Pose estimation + calibration → robot pose w.r.t. landmark
    • Error is the difference between the pose at runtime and the pose at map-building
  *Note: Any type of map can be used
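The error computation at the end of the procedure – comparing the runtime pose under a landmark with the pose stored at map-building – amounts to a Euclidean distance plus a wrapped angular difference. A minimal sketch (the helper name and pose tuple layout are our assumptions):

```python
import math

def pose_error(map_pose, run_pose):
    """Compare the pose stored at map-building with the pose measured at
    runtime under the same landmark. Poses are (x, y, theta) tuples,
    theta in radians. Returns (position error, absolute angular error).
    """
    dx = run_pose[0] - map_pose[0]
    dy = run_pose[1] - map_pose[1]
    position_err = math.hypot(dx, dy)
    # Wrap the angular difference into (-pi, pi] so errors near +/-pi
    # are not overstated.
    dth = (run_pose[2] - map_pose[2] + math.pi) % (2 * math.pi) - math.pi
    return position_err, abs(dth)
```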

  8. Coordinate systems
  [Diagram] Pose estimation takes the image, via the internal camera parameters, to the 3D Euclidean pose of the camera w.r.t. the landmark (optionally reduced to 2D). Calibration (the external camera parameters) takes the camera pose to the robot pose in 2D/3D Euclidean coordinates (a relative metric). Localization – what we want – is the 2D Euclidean pose of the robot w.r.t. the world (an absolute metric).

  9. Camera-to-robot calibration
  • Need to determine:
    • rotation between camera and robot (3 parameters)
    • translation between camera and robot (3 parameters)
    → 6 parameters
  • If the floor were completely flat, and the camera were mounted perfectly upright, then
    xr = x – drc cos θrc
    yr = y – drc sin θrc
    θr = θ – θa
  where (x, y, θ) is the camera pose, (xr, yr, θr) is the robot pose, drc and θrc give the camera offset from the robot center (relative to the driving direction and wheel base), and θa is the camera roll.
  But the floor is often not flat, and the camera is never upright.
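The flat-floor case translates directly into code. A sketch with variable names mirroring the slide's symbols (the function name is ours):

```python
import math

def robot_pose_flat(x, y, theta, d_rc, theta_rc, theta_a):
    """Camera pose (x, y, theta) w.r.t. the landmark -> robot pose,
    assuming a flat floor and a perfectly upright camera.

    d_rc, theta_rc: polar offset of the camera from the robot center.
    theta_a: camera roll about the optical axis.
    """
    xr = x - d_rc * math.cos(theta_rc)
    yr = y - d_rc * math.sin(theta_rc)
    theta_r = theta - theta_a
    return xr, yr, theta_r
```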

  10. Camera-to-robot calibration
  • When the floor is not flat, and the camera is not upright, then estimate:
    • tilt of camera w.r.t. floor normal (φc)
    • azimuth of camera tilt plane w.r.t. forward direction of robot (θc)
    • tilt of floor w.r.t. gravity (φf)
    • azimuth of floor tilt plane w.r.t. positive x axis of landmark (θf)
  • Rotate the robot incrementally through 360 degrees
    • Rotation axis is perpendicular to the floor
    • Optical axis traces a cone
  Corrected equations:
    xr = x – drc cos θrc – z sin φc cos (θc + θ) – z sin φf cos θf
    yr = y – drc sin θrc – z sin φc sin (θc + θ) – z sin φf sin θf
    θr = θ – θa
  The bracketed correction terms define the radii rc = z sin φc and rf = z sin φf used in the calibration geometry below.
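The tilt-corrected equations can be sketched the same way. Note two assumptions of ours: the y-equation uses sines of the azimuth terms (by symmetry with the cosines in the x-equation; the transcript printed cosines in both), and z is taken as the height of the landmark above the camera:

```python
import math

def robot_pose_tilted(x, y, theta, z, d_rc, theta_rc, theta_a,
                      phi_c, theta_c, phi_f, theta_f):
    """Tilt-corrected camera pose -> robot pose (sketch of slide 10).

    phi_c, theta_c: camera tilt w.r.t. floor normal and its azimuth
                    relative to the robot's forward direction.
    phi_f, theta_f: floor tilt w.r.t. gravity and its azimuth relative
                    to the landmark's positive x axis.
    z: height of the landmark above the camera.
    """
    xr = (x - d_rc * math.cos(theta_rc)
            - z * math.sin(phi_c) * math.cos(theta_c + theta)
            - z * math.sin(phi_f) * math.cos(theta_f))
    yr = (y - d_rc * math.sin(theta_rc)
            - z * math.sin(phi_c) * math.sin(theta_c + theta)
            - z * math.sin(phi_f) * math.sin(theta_f))
    theta_r = theta - theta_a
    return xr, yr, theta_r
```

With both tilts set to zero this reduces to the flat-floor equations of slide 9.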

  11.–22. Calibration geometry (figure sequence)
  The calibration diagram is built up step by step across these slides: the landmark overhead, the gravity vector, and the floor; the camera center on the robot; the floor tilt φf between the floor normal and gravity; the axis of rotation (= floor normal); the optical axis, tilted by φc from the axis of rotation; the first pose-estimation measurements (x1, z1); then the robot is rotated so that the two optical-axis positions are 180° apart, giving a second optical axis and measurements (x2, z2).

  23. Calibration geometry
  With (x1, z1) and (x2, z2) from pose estimation:
    sin φc = (x2 – x1) / 2z
    sin φf = (x2 + x1) / 2z
  where z = (z1 + z2) / 2.
  Note: x1 + (x2 – x1) / 2 = (x2 + x1) / 2, so the circle center lies midway between x1 and x2.

  24.–25. Calibration geometry
  From the same quantities:
    radius of the circle traced by the optical axis: rc = z sin φc = (x2 – x1) / 2
    distance from the landmark center to the circle center: rf = z sin φf = (x2 + x1) / 2
  where z = (z1 + z2) / 2, and (x1, z1), (x2, z2) are from pose estimation.
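The tilt-recovery relations on slides 23-25 are a few lines of code. A sketch (the function name is ours; inputs are the two pose-estimation measurements taken 180° apart):

```python
import math

def estimate_tilts(x1, z1, x2, z2):
    """Recover camera tilt phi_c and floor tilt phi_f from two pose
    estimates taken 180 degrees apart during the robot's rotation.

    Returns (phi_c, phi_f, r_c, r_f), where r_c is the radius of the
    circle traced by the optical axis and r_f the distance from the
    landmark center to the circle center.
    """
    z = (z1 + z2) / 2.0
    phi_c = math.asin((x2 - x1) / (2.0 * z))  # camera tilt w.r.t. floor normal
    phi_f = math.asin((x2 + x1) / (2.0 * z))  # floor tilt w.r.t. gravity
    r_c = z * math.sin(phi_c)                 # = (x2 - x1) / 2
    r_f = z * math.sin(phi_f)                 # = (x2 + x1) / 2
    return phi_c, phi_f, r_c, r_f
```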

  26. Calibration geometry
  [Figure] Top-down view of the circle, showing the tilt angles (φc, φf) and the azimuth angles (θc, θf), computed from real data.

  27. Evaluating accuracy
  • Mounted the camera to the carriage of a CNC machine
  • Moved to different known (x, y, θ) and measured the pose
  • Covered area: 1.3 × 0.6 m
  • Position error: mean 5 mm, σ = 2 mm, max 11 mm
  • Angular error: mean 0.3°, σ = 0.2°, max 1°

  28. Evaluating accuracy
  • Placed the robot at 20 random positions under one landmark
  → Position error usually < 20 mm; orientation error usually < 1°

  29. Evaluating accuracy
  • 15 landmarks across 2 buildings
  • Placed the robot at 5 canonical positions
  → Position error usually < 20 mm; orientation error usually < 1°

  30. Evaluating accuracy
  • Our accuracy is comparable to other systems
  • Our system is scalable to large environments: it scales to arbitrarily large environments, and to very large single-floor environments with an additional step
  Compared systems: GTvision/GTlaser from Ceriani et al. AR 2009 (Rawseeds); mocap from Kümmerle et al. AR 2009; retroreflective from Tong, Barfoot ICRA 2011

  31. Evaluating accuracy Two different buildings on the Microsoft campus

  32. Evaluating accuracy
  • Automated runs in 2 different environments
  • Accuracy comparable
  • Easy to set up
  • Easy to maintain

  33. Computing global coordinates
  Theodolite:
  • Horizontal laser emanates from a pan-tilt head
  • Reflects off a mirror
  • Measures (w.r.t. gravity):
    • horizontal distance to the mirror
    • pan angle to the mirror
    • tilt angle to the mirror (not used)

  34. Computing global coordinates
  For target positions:
  • Repeatedly measure the distance and angle for each triplet of targets with line-of-sight (in the figure, distances l12, l23, l34, … between reflector positions and pan angles θ1, θ2, … at each one)
  • → 2D Euclidean coordinates of all targets in a common global coordinate system
  • The high accuracy of the theodolite removes nearly all drift
  • Drift can be checked by adding all angles in a loop and comparing with 360 degrees (optional)
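Chaining the measured distances and turn angles into one global frame is essentially dead reckoning from target to target. A sketch under our own bookkeeping (the paper measures triplets with line-of-sight; the exact traversal may differ, and the function name is ours):

```python
import math

def chain_targets(measurements):
    """Chain theodolite measurements into one 2D global frame.

    measurements: list of (l, q) pairs, where l is the horizontal
    distance to the next target and q the pan angle turned at the
    current target (radians). Returns the 2D coordinates of every
    target, starting from the origin with heading 0.
    """
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for l, q in measurements:
        heading += q                 # accumulate pan angles
        x += l * math.cos(heading)   # step toward the next target
        y += l * math.sin(heading)
        points.append((x, y))
    return points
```

The optional drift check on the slide corresponds to summing the q values around a closed loop and comparing the total with 360°.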

  35. Computing global coordinates
  For target orientation:
  • Place the reflector under several positions within the target (only 2 are needed)
  • Given l1, l2, α (from the theodolite) and tlength (known), find θ:
    Naïve solution: sin θ = (l1 – l2 cos α) / tlength
    Better solution: tan θ = (l1 – l2 cos α) / h, where h = l2 sin α (so (l1 – l2 cos α)² + h² = tlength²)
  • The naïve solution is sensitive to noise; the key is to use only measured values
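The two solutions can be compared directly in code. This is our reading of the figure: we take h = l2 sin α, the component of l2 perpendicular to l1, which makes the two solutions agree on noise-free data; the function name is ours:

```python
import math

def target_orientation(l1, l2, alpha, t_length):
    """Estimate the target orientation theta from two reflector readings.

    l1, l2: distances to the two reflector positions; alpha: included
    angle at the theodolite; t_length: known separation of the two
    positions along the target. Returns (naive, better) estimates.
    """
    a = l1 - l2 * math.cos(alpha)
    naive = math.asin(a / t_length)               # uses the assumed t_length
    better = math.atan2(a, l2 * math.sin(alpha))  # uses only measured values
    return naive, better
```

On exact inputs the two estimates coincide; under noise, the naïve form divides by the fixed t_length and so amplifies errors in l1 and l2, while the better form uses only quantities the theodolite actually measures.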

  36. Navigation contest
  • Microsoft and Adept are organizing the Kinect Autonomous Mobile Robot Contest at IROS 2014 in Chicago: http://www.iros2014.org/program/kinect-robot-navigation-contest

  37. Conclusion
  • System for evaluating the localization accuracy of navigation
    • Inexpensive
    • Easy to set up
    • Easy to maintain
    • Highly accurate
    • Scalable to arbitrarily large environments
    • Scalable to arbitrarily long runs (in time or space)
  • With a theodolite, global coordinates are possible
  • We have begun long-term, large-scale comparisons (results forthcoming)
  • Mobile robot navigation contest at IROS 2014

  38. Thanks!
