
Perception for Robot Detection



  1. Perception for Robot Detection 2011/12/08

  2. Robot Detection
  • Better localization and tracking
  • No collisions with other robots

  3. Goal: Robust Robot Detection
  • Long range
  • Short range

  4. Long Range, Current Method: Heuristic Color-Based
  Non-line white segments are clustered, and an extracted cluster is classified as a Nao robot if all three of the following criteria hold:
  • the cluster contains more than 3 segments;
  • the cluster's width-to-height ratio is larger than 0.2;
  • the highest point of the cluster lies within 10 pixels of the field border line, since an observed robot should intersect the field border in the camera view when both the observing and the observed robot are standing on the field.
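
  A minimal sketch of this three-criterion test, assuming a hypothetical Cluster type for a group of non-line white segments (an illustration, not the actual B-Human code):

    #include <cstdlib>

    // Hypothetical cluster of non-line white segments (fields illustrative).
    struct Cluster {
      int numSegments;    // number of white segments grouped into the cluster
      int width, height;  // bounding-box size in pixels
      int highestY;       // y coordinate of the cluster's highest point
    };

    // Apply the three criteria from the slide; borderY is the y coordinate
    // of the field border line in the camera image.
    bool isNaoRobot(const Cluster& c, int borderY) {
      if (c.numSegments <= 3)
        return false;                     // criterion 1: more than 3 segments
      if (c.height == 0 || float(c.width) / float(c.height) <= 0.2f)
        return false;                     // criterion 2: ratio > 0.2
      if (std::abs(c.highestY - borderY) > 10)
        return false;                     // criterion 3: close to the border
      return true;
    }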

  5. Long Range, Improvement: Feature-Based
  Considerations:
  • Scale invariance
  • Affine invariance
  • Complexity
  Possible solutions:
  • SIFT (Scale-Invariant Feature Transform)
  • SURF (Speeded-Up Robust Features)
  • MSER (Maximally Stable Extremal Regions)
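
  To make the idea concrete, here is a sketch of SURF keypoint matching between a stored robot model image and a camera frame, assuming OpenCV 2.4 with its nonfree module (the slides do not name an implementation):

    #include <vector>
    #include <opencv2/core/core.hpp>
    #include <opencv2/features2d/features2d.hpp>
    #include <opencv2/nonfree/features2d.hpp>  // SURF lives here in OpenCV 2.4

    // Match SURF features of a robot model image against a camera frame.
    std::vector<cv::DMatch> matchRobotModel(const cv::Mat& model,
                                            const cv::Mat& frame) {
      cv::SurfFeatureDetector detector(400.0);  // Hessian threshold
      cv::SurfDescriptorExtractor extractor;

      std::vector<cv::KeyPoint> kpModel, kpFrame;
      cv::Mat descModel, descFrame;
      detector.detect(model, kpModel);
      detector.detect(frame, kpFrame);
      extractor.compute(model, kpModel, descModel);
      extractor.compute(frame, kpFrame, descFrame);

      // Brute-force matching on the SURF descriptors; a threshold on the
      // number of good matches can then decide whether a robot is present.
      cv::BFMatcher matcher(cv::NORM_L2);
      std::vector<cv::DMatch> matches;
      matcher.match(descModel, descFrame, matches);
      return matches;
    }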

  6. Long Range, Improvement: Feature-Based
  • Offline: run feature detection on robot images to build predefined models
  • Online: run feature detection on the camera image and recognize robots against the predefined models

  7. Short Range: Sonar and Vision
  Two-stage approach:
  • Sonar
  • Active vision: feet detection (a sufficiently large white spot)
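
  A rough sketch of the "sufficiently large white spot" feet check, assuming a grayscale image of the region in front of the robot; the thresholds are placeholders, since the slides give no numbers:

    #include <opencv2/core/core.hpp>
    #include <opencv2/imgproc/imgproc.hpp>

    // True if the region contains a bright area large enough to be another
    // robot's feet. The values 200 (brightness) and 500 (pixel count) are
    // illustrative only.
    bool feetVisible(const cv::Mat& grayRegion) {
      cv::Mat white;
      cv::threshold(grayRegion, white, 200, 255, cv::THRESH_BINARY);
      return cv::countNonZero(white) > 500;
    }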

  8. New NAO, Possible Improvements
  • Using the two cameras:
    • one for the ball, the other for localization
    • one for feet detection, the other for localization
    • ……
  • No downsampling: 320 x 240 -> 640 x 480

  9. References
  • B-Human 2011 code release (bhuman11_coderelease)
  • SIFT: http://www.cs.ubc.ca/~lowe/keypoints/
  • SURF paper: "Speeded-Up Robust Features (SURF)"
  • MSER tracking paper: "Efficient Maximally Stable Extremal Region (MSER) Tracking"

  10. Perception for Robot Detection 2011/12/22

  11. Color-Based: Successful Cases

  12. Front View
  • SURF: 209.318 ms, 73 -> 201
  • Affine SIFT (ASIFT): 7 s, 11655 -> 14237

  13. Back View
  • SURF: 125.309 ms, 74 -> 186
  • ASIFT: 6 s, 11377 -> 13930

  14. Side View
  • SURF: 18 matches
  • ASIFT: 7 s, 7763 -> 13060

  15. Color-Based: Failed Cases

  16. False Alarm: misclassifications of the field lines (SURF and ASIFT result images).

  17. < 100 cm, front view (SURF and ASIFT): 92 matches

  18. < 100 cm, back view (SURF and ASIFT)

  19. < 100 cm, side view (SURF and ASIFT): 40 matches

  20. 300 cm, front view (SURF and ASIFT): 21 matches

  21. 300 cm, side view (ASIFT)

  22. 350 cm, front view (ASIFT)

  23. Conclusion
  • Detection performance is not significantly better than the color-based method
  • Processing time is an issue

  24. Perception for Robot Detection 2012/1/5

  25. Robot Detection Using AdaBoost with SIFT

  26. Multi-Class Training Stage: Using AdaBoost
  • Classes = (different viewpoints of Nao robots) x (different scales of Nao robots) x (different illuminations)
  • Input: for each class, training images (I1, l1) ... (In, ln), where li = 0, 1 for negative and positive examples, respectively
  • Output: a strong classifier (a set of weighted weak classifiers) for each class
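
  For concreteness, a minimal sketch of evaluating one class's strong classifier, assuming decision-stump weak learners over some feature vector (the feature design itself follows the AdaBoost-with-SIFT reference below and is not detailed on the slides):

    #include <vector>

    // One weak classifier (decision stump) produced by AdaBoost training.
    struct WeakClassifier {
      int featureIndex;  // which feature the stump inspects
      double threshold;  // stump threshold
      int polarity;      // +1 or -1, direction of the inequality
      double alpha;      // weight assigned by AdaBoost
    };

    // Strong classifier: sign of the alpha-weighted vote of the weak learners.
    bool classifyStrong(const std::vector<WeakClassifier>& strong,
                        const std::vector<double>& features) {
      double score = 0.0;
      for (std::size_t i = 0; i < strong.size(); ++i) {
        const WeakClassifier& w = strong[i];
        int h = (w.polarity * features[w.featureIndex] <
                 w.polarity * w.threshold) ? 1 : -1;
        score += w.alpha * h;
      }
      return score > 0.0;  // positive vote: the input matches this class
    }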

  27. Issues in the Training Stage
  • Number of classes: depends on the limits of the SIFT features (the viewing angles over which they stay view-invariant, the range of scales over which they stay scale-invariant, and the degree to which they are illumination-invariant)

  28. Detection Stage
  • Input: an image from the Nao camera
  • Output:
    • the number of robots in the image
    • the class each robot belongs to => a rough distance and facing direction for each detected robot

  29. Issues in the Detection Stage
  • Speed: use sharing and non-sharing features to speed things up. Flow: input image -> SIFT feature extraction -> extracted features are tested against each class's sharing features from the training stage -> classes that pass (yes) are then tested with their non-sharing features; the rest (NO) are rejected. A sketch follows below.
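
  A sketch of that two-step flow, reusing the classifyStrong sketch above; splitting each class's model into a shared part and a class-specific part follows the "Sharing features" reference below (the types are hypothetical):

    #include <vector>

    // Per-class model: features shared with other classes plus
    // class-specific (non-sharing) features.
    struct ClassModel {
      std::vector<WeakClassifier> sharedPart;
      std::vector<WeakClassifier> specificPart;
    };

    // Test the cheap shared features first; only classes that pass run
    // their class-specific classifier.
    std::vector<int> detectClasses(const std::vector<ClassModel>& models,
                                   const std::vector<double>& features) {
      std::vector<int> detected;
      for (std::size_t c = 0; c < models.size(); ++c) {
        if (!classifyStrong(models[c].sharedPart, features))
          continue;  // rejected by the shared features (the "NO" branch)
        if (classifyStrong(models[c].specificPart, features))
          detected.push_back(static_cast<int>(c));  // the "yes" branch
      }
      return detected;
    }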

  30. References
  • "Hand Posture Recognition Using Adaboost with SIFT for Human Robot Interaction"
  • "Sharing features: efficient boosting procedures for multiclass object detection"

  31. Aldebaran SDK 2012/3/16

  32. NAOqi Framework
  NAOqi is the main process running on the robot; it acts like a module look-up server (a broker) through which modules find and call each other.

  33. Aldebaran Modules
  • Local modules:
    • compiled as a library (xxxx.so) and can only be used on the robot
    • more efficient than remote modules
    • launched in the same process; they speak to each other through only ONE broker, and can share variables and call each other's methods without serialization or networking
  • Remote modules:
    • compiled as an executable file (xxxx) and can run outside the robot
    • lower performance in terms of speed and memory usage
    • modules communicate with each other over the network
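
  For reference, a minimal skeleton of a NAOqi C++ module in the 1.12 SDK style; MyModule and printHello are placeholder names, and the library entry point (_createModule) that a local module also needs is omitted:

    #include <iostream>
    #include <string>
    #include <boost/shared_ptr.hpp>
    #include <alcommon/almodule.h>
    #include <alcommon/albroker.h>

    // A module registers itself with a broker. A local module shares its
    // broker's process; a remote module reaches it over the network.
    class MyModule : public AL::ALModule {
    public:
      MyModule(boost::shared_ptr<AL::ALBroker> broker, const std::string& name)
        : AL::ALModule(broker, name) {
        setModuleDescription("Example module skeleton.");
        // Expose printHello so other modules or remote clients can call it.
        functionName("printHello", getName(), "Prints a greeting.");
        BIND_METHOD(MyModule::printHello);
      }

      void printHello() {
        std::cout << "Hello from MyModule" << std::endl;
      }
    };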

  34. BHuman
  libbhuman is an Aldebaran module that manages the Nao's hardware-related memory (joints, sensor data).

  35. C++ SDK 1.12 Installation
  • Installation guide: http://www.aldebaran-robotics.com/documentation/dev/cpp/install_guide.html
  • Related files on the lab server: /usr/home/markcsie/AldebaranSDK
  Requirements:
  1. Linux (Ubuntu 10.04)
  2. gcc > 4.4
  3. CMake 2.8 (used by qibuild)
  4. qibuild-1.12
  5. naoqi-sdk-1.12-linux32.tar.gz
  6. nao-geode-cross-toolchain-1.12.0.tar.gz (for NAO 3.3)
  7. nao-atom-cross-toolchain-1.12.0.tar.gz (for NAO 4.0)
  8. nao-flasher-1.12.1.3-linux32.tar.gz (flasher)
  9. opennao-geode-system-image-1.12.gz (OS for NAO 3.3)
  10. opennao-atom-system-image-1.12.opn (OS for NAO 4.0)
  11. IDE: QtCreator (optional)

  36. Installation
  1. Edit ~/.bashrc:
     export LD_LIBRARY_PATH=[path to sdk]/lib
     export PATH=${PATH}:~/.local/bin:~/bin
  2. $ [path to qibuild]/install-qibuild.sh
  3. $ cd [Programming Workspace]
     $ qibuild init --interactive   (choose UNIX Makefiles)
  4. $ qitoolchain create [toolchain name] [path to sdk]/toolchain.xml --default

  37. Create and Build a Project
  1. $ qibuild create [project name]
  2. $ qibuild configure [project name] -c [toolchain name] (--release)
  3. $ qibuild make [project name] -c [toolchain name] (--release)
  4. $ qibuild open [project name]
  Step 3 is equivalent to running the generated Makefile.

  38. Cross Compile (Local Module)
  $ qitoolchain create opennao-geode [path to cross toolchain]/toolchain.xml --default
  $ qibuild configure [project name] -c opennao-geode
  $ qibuild make [project name] -c opennao-geode

  39. Get Image Example
  There will be a compilation error due to OpenCV; a workaround is discussed at:
  http://users.aldebaran-robotics.com/index.php?option=com_kunena&Itemid=14&func=view&catid=68&id=8133

    #include <alproxies/alvideodeviceproxy.h>
    #include <alvision/alvisiondefinitions.h>
    #include <alvalue/alvalue.h>
    #include <opencv/cv.h>

    using namespace AL;

    /** Create a proxy to ALVideoDevice on the robot. */
    ALVideoDeviceProxy camProxy(robotIp, 9559);

    /** Subscribe a client image requiring 320x240 and the BGR color space. */
    const std::string clientName =
        camProxy.subscribe("test", kQVGA, kBGRColorSpace, 30);

    /** Create an IplImage header to wrap into an OpenCV image. */
    IplImage* imgHeader = cvCreateImageHeader(cvSize(320, 240), 8, 3);

    /** Retrieve an image from the camera. The image is returned as a
     * container object with the following fields:
     * 0 = width
     * 1 = height
     * 2 = number of layers
     * 3 = color space index (see alvisiondefinitions.h)
     * 4 = time stamp (seconds)
     * 5 = time stamp (microseconds)
     * 6 = image buffer (size = width * height * number of layers) */
    ALValue img = camProxy.getImageRemote(clientName);

    /** Access the image buffer (field 6) and assign it to the OpenCV
     * image container. */
    imgHeader->imageData = (char*) img[6].GetBinary();

  40. NAO OS
  • NAO 3.x: http://www.aldebaran-robotics.com/documentation/software/naoflasher/rescue_nao_v3.html?highlight=flasher
  • NAO 4.0: http://www.aldebaran-robotics.com/documentation/software/naoflasher/rescue_nao_v4.html

  41. Connect to NAO
  • Wired connection (Windows only): plug in the Ethernet cable, then press the chest button; NAO will speak its IP address. Connect to NAO using a web browser.
  • Wireless connection: http://www.aldebaran-robotics.com/documentation/nao/nao-connecting.html

  42. Software, Documentation and Forum
  http://users.aldebaran-robotics.com/
  Account: nturobotpal
  Password: xxxxxxxxxx
