
Environmental Mapping Using Distance Ranging for Self-Aware Locomotion




Presentation Transcript


  1. Environmental Mapping Using Distance Ranging for Self-Aware Locomotion Dan Lander, Huiyu Luo

  2. Contents • Introduction • Background and Theory of Approach • Methodology • Results/Discussion • Conclusions • Demos • References and Bibliography

  3. Introduction • Mobile sensor nodes need a way of knowing what lies in front of them. This requires knowledge of the surrounding environment. • In the past, image data was used, but it could not tell where an object was located. • Today, distance ranging is used to project and map where possible objects may lie in front of a sensor node. • This information is gathered using a range sensor, and it provides the sensor node with self-aware locomotion. • This presentation outlines how this is done and describes some advantages it will have in the future.

  4. Background and Approach • Our approach to distance ranging involves two major algorithms: object mapping and mobility control. A.) Object Mapping - Uses the distances recorded by the range sensor to compute the projected coordinates where possible objects may be situated. B.) Mobility Control - Given an initial target location, the distance to the target is computed and compared with the actual distance recorded to determine whether the target is viewable or obstructed.

  5. Propagation Matrix • Before any mapping is done, a propagation matrix with entries Dij is constructed to organize all of the distance measurements recorded by the range sensor. • The rows j of the matrix denote range-sensor positions and the columns i denote the angles theta at which each measurement was taken relative to the sensor.

  6. Propagation Matrix • An example of propagation matrix entries • Dij: distance to the obstruction • j: positional index, 1 <= j <= Nx • i: angular index, 1 <= i <= Na • dx = L/(Nx-1), da = pi/Na • Na is constrained by the angular resolution of the ranging sensor
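The indexing above can be sketched in code. A minimal Python sketch, assuming the sensor track starts at x0 = 0 along the x axis; the track length L, the counts Nx and Na, and the stand-in "wall" measurement function are illustrative assumptions, not values from the project:

```python
import math

# Illustrative parameters (assumptions, not from the slides).
L = 10.0                   # length of the sensor track
Nx = 11                    # number of sensor positions
Na = 18                    # number of angles per position
dx = L / (Nx - 1)          # spacing between sensor positions
da = math.pi / Na          # angular step, sweeping 0..pi

def build_propagation_matrix(measure):
    """D[i-1][j-1] = Dij, the distance measured at position j (1..Nx),
    angle theta = i*da (1 <= i <= Na)."""
    D = [[0.0] * Nx for _ in range(Na)]
    for j in range(1, Nx + 1):
        xj = (j - 1) * dx              # sensor position on the x axis (x0 = 0)
        for i in range(1, Na + 1):
            theta = i * da
            D[i - 1][j - 1] = measure(xj, theta)
    return D

# Hypothetical stand-in for the real sensor: a flat wall at y = 3.
def wall_at_y3(xj, theta):
    s = math.sin(theta)
    return 3.0 / s if s > 1e-9 else float('inf')

D = build_propagation_matrix(wall_at_y3)
```

With this obstacle, the column measured straight ahead (i = Na/2, i.e. theta = pi/2) records exactly 3.0 at every position.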

  7. Object Mapping Algorithm • This algorithm works as follows: a.) Given a distance d from sensor position j taken at angle theta, the x and y coordinates of that measurement are determined by: x = d*cos(theta) + xj, xj = x0 + (j-1)*dx, yj = y0, y = d*sin(theta) + yj, theta = i*da b.) If the distance d is less than the sensor's visibility range, it is deemed that a possible object lies in front of the sensor at that point. c.) However, if the distance d is equal to or greater than the sensor's visibility range, then it is deemed that no objects are within view of the sensor along that line of sight.
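The coordinate computation in step a and the classification in steps b and c can be sketched as follows. The origin, position spacing dx, and angular step da are assumptions; the 10.7 m visibility range matches the Smart Sensor 600's maximum range quoted on slide 11:

```python
import math

# Illustrative parameters (assumptions, not from the slides).
x0, y0 = 0.0, 0.0          # position of the first sensor location
dx = 1.0                   # spacing between sensor positions
da = math.pi / 18          # angular step
VISIBILITY_RANGE = 10.7    # sensor max range in metres (slide 11)

def map_measurement(d, j, i):
    """Map one distance reading d, taken at position j and angular index i,
    to projected (x, y) coordinates, and classify the reading:
    'object' if d is inside the visibility range, 'clear' otherwise."""
    theta = i * da
    xj = x0 + (j - 1) * dx
    yj = y0
    x = d * math.cos(theta) + xj
    y = d * math.sin(theta) + yj
    kind = 'object' if d < VISIBILITY_RANGE else 'clear'
    return (x, y), kind

# Example: a 3 m reading taken straight ahead (theta = pi/2) from position 1.
pt, kind = map_measurement(3.0, j=1, i=9)
```

Here the reading projects to roughly (0, 3) and is classified as a possible object, since 3 m is well inside the visibility range.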

  8. Control Algorithm • This algorithm works as follows: a.) Given the target point (x, y) that the sensor is to monitor and the sensor coordinates (xj, yj), the distance to the target from each possible sensor location is determined by: d = sqrt( (x-xj)^2 + (y-yj)^2 ) b.) Next, the angle the target makes relative to the sensor is determined; if the angle lies between two measured points, the two closest angles in the propagation matrix are used: theta = arccos( (x-xj) / d ), i1 = floor(theta/da), i2 = ceil(theta/da)

  9. Control Algorithm c.) By comparing the distance to the target with the corresponding measured distances (Di1,j and Di2,j) found using the angles from step b above, it is determined whether the target is viewable from each sensor location. d.) If viewable locations exist, the location with the minimum distance to the target is chosen. If the target is not viewable from any location, a sensible location to relocate the range sensor to is determined.
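Steps a through d can be combined into a short sketch. The sensor geometry and the all-clear propagation matrix used in the example are illustrative assumptions, and the relocation step d is left out:

```python
import math

# Illustrative setup (assumptions, not from the slides): Nx sensor
# positions on the x axis, Na measured angles, and a precomputed
# propagation matrix D with D[i-1][j-1] = Dij.
x0, y0 = 0.0, 0.0
dx = 1.0
Nx = 11
Na = 18
da = math.pi / Na

def target_view(D, x, y):
    """Return (best_j, best_d): the 1-based index of the sensor position
    closest to the target (x, y) among positions that can view it,
    or (None, None) if the target is viewable from nowhere."""
    best = (None, None)
    for j in range(1, Nx + 1):
        xj, yj = x0 + (j - 1) * dx, y0
        d = math.sqrt((x - xj) ** 2 + (y - yj) ** 2)        # step a
        theta = math.acos((x - xj) / d)                     # step b
        i1 = min(max(math.floor(theta / da), 1), Na)        # bracketing
        i2 = min(max(math.ceil(theta / da), 1), Na)         # angle indices
        # Step c: viewable if the target is nearer than the obstruction
        # measured at both bracketing angles.
        if d <= D[i1 - 1][j - 1] and d <= D[i2 - 1][j - 1]:
            if best[1] is None or d < best[1]:
                best = (j, d)                               # step d
    return best

# Example: free space everywhere (all measured distances "infinite"),
# target at (10, 5) as in the results slide.
D_free = [[float('inf')] * Nx for _ in range(Na)]
j_best, d_best = target_view(D_free, 10.0, 5.0)
```

In free space every position can view the target, so the sketch picks the last position (x = 10), which sits directly below the target at distance 5.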

  10. Methodology • The major steps involved in our project were: 1.) Researching and purchasing a suitable range sensor 2.) Outlining and specifying our mapping and control algorithms 3.) Connecting the range sensor to a mote and hardware platform 4.) Moving the range sensor around to survey and record distance measurements 5.) Coding and implementing our mapping and control algorithms

  11. Range sensor • Smart Sensor 600 from SensComp • Ultrasonic sensor (49.4 kHz) • Very lightweight (0.7 oz, 19 g) • Small dimensions (0.95" by 3.4" by 3.4") • Good distance range (6"~35', 0.15~10.7 m) • Reasonable beamwidth and resolution (15 deg at -6 dB) • TTL compatible • Able to operate in multiple-echo mode • Affordable

  12. Pin layout • Range Sensor Details

  13. Single echo mode 1. Power on for 5 ms 2. Assert INIT -> transmit 16 pulses at 49.4 kHz 3. Internal blanking for 2.38 ms to avoid false echo detection 4. Assert ECHO when reflections are received

  14. Mica2 • We use the Mica2 mote from UC Berkeley

  15. Range sensing • Mica2 pin layout • Wiring: • INIT – LED1 • ECHO – INT1 • Algorithm: • Power on • Assert LED1 -> INIT high -> start the counter • ECHO asserts -> generate an interrupt -> stop the counter • Record the counter value • Transmit recordings to the PC
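The recorded counter value measures the round-trip time of the ultrasonic pulse between asserting INIT and receiving ECHO, so the distance follows from the speed of sound. A sketch of the conversion; the 1 MHz counter tick rate is a hypothetical choice, not a value from the slides:

```python
# Assumptions (not from the slides): speed of sound in air at ~20 C,
# and a hypothetical 1 MHz tick rate for the mote's counter.
SPEED_OF_SOUND = 343.0     # m/s
TICK_HZ = 1_000_000        # counter ticks per second

def counter_to_distance(ticks):
    """Convert an INIT-to-ECHO tick count (round trip) to a one-way
    distance in metres: d = c * t / 2."""
    round_trip_s = ticks / TICK_HZ
    return SPEED_OF_SOUND * round_trip_s / 2.0
```

For example, one million ticks (one full second of round trip) corresponds to a one-way distance of 171.5 m; real readings inside the sensor's 0.15-10.7 m range produce tick counts on the order of thousands.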

  16. Multiple echo mode • Steps 1, 2, 3 are the same as in single-echo operation. 4. Wait for ECHO 5. ECHO goes high -> assert BLNK for 0.44 ms 6. Back to 1

  17. Results and discussion • Range sensor implementation • We did not manage to get the range sensor to work with the mote. • Mapping and control algorithms • Results were obtained by mimicking the behavior of the range sensor to produce propagation matrices. • Modeling range sensor behavior • Angular resolution: in single-echo mode, Dij is the distance to the closest object within the 15-degree beamwidth. • Detection range: beyond the detection range, it cannot be determined whether there are obstacles or not.

  18. Results • Object Mapping Results • Sensor moves along the x axis over [-5, 5] • Sinusoidal obstruction • Mapping reconstructed from the propagation matrix

  19. Results • Control Results • Target position is at [10, 5] • Positions that can view the target are reported and the closest one is identified.

  20. Some other examples

  21. Some other examples

  22. Discussion • Angular resolution vs. detectability • While sensors with a narrow beamwidth have higher angular resolution, they may fail to detect some irregular obstacles. • Multiple echo mode • Multiple-echo mode may be used to detect multiple objects within the beamwidth. By comparing adjacent measurements, finer resolution can be obtained. • Detection range • Range sensors have both minimum and maximum detection ranges. A sensor covered by fallen leaves may declare no obstacles because everything lies inside the minimum range. • Power considerations • Power not only affects the selection of the range sensor (small dimensions, light weight) but also the detection strategy, such as how many discrete positions and angles to take measurements at, and how often to update the propagation matrix.

  23. Conclusions • Despite the limitations of currently available range sensors (detection range, angular resolution, etc.), they can provide more reliable obstacle information than cameras. • With distance-ranging ability, it is possible for sensors to conduct self-aware locomotion and improve sensing accuracy. • Careful design based on multiple-echo mode can yield higher sensing resolution. • Issues such as power and sensor tracking add further constraints on algorithm design.

  24. Demos • A demo will now be presented: a C++ program that calculates projected object coordinates from distance measurements and finds possible sensor locations for viewing a predetermined target position.

  25. References and Bibliography • Robert K. Harle and Andy Hopper, "Building world models by ray-tracing within ceiling-mounted positioning systems," UbiComp 2003, LNCS 2864, pp. 1-17, 2003. • Yang Li, Jason I. Hong and James A. Landay, "ContextMap: modeling scenes of the real world for context-aware computing," UbiComp 2003. • SensComp 600 series Smart Sensor specifications. • Jessica Feng et al., "Obstacle identification and localization," SENS poster. • Campbell Scientific SR50 ranging sensor manual, http://www.campbellsci.ca/CampbellScientific/Catalogue/SR50.pdf. • Linux platform, http://www.linux.org/. • UC Berkeley TinyOS, http://webs.cs.berkeley.edu/tos/. • Atmel AVR, http://www.atmel.com/products/AVR/. • Discussions with Aman Kansal and Yen-Cheng Kuang.
