
See-and-Avoid System for UAV with Five Miniature Cameras






Presentation Transcript


  1. Pázmány Péter Catholic University, Faculty of Information Technology and Bionics. See-and-Avoid System for UAV with Five Miniature Cameras. Tamás Zsedrovits†, Ákos Zarándy*†, Zoltán Nagy*†, Bálint Vanek*, András Kiss*†, Máté Németh*†. †Pázmány Péter Catholic University, Faculty of Information Technology and Bionics, Budapest, Hungary; *Computer and Automation Research Institute of the Hungarian Academy of Sciences (MTA SZTAKI), Budapest, Hungary. EESTEC HIT, 01/29/2014

  2. Outline • Goals • System requirements • Vision system architecture • Many-core processor array implementation • Algorithms

  3. Goals • Unmanned Aerial Vehicle (UAV) technology is maturing • UAVs are technically ready for autonomous missions • Surveillance tasks • Autonomous missions are not authorized • Missing redundancy • Missing collision avoidance device • An automatic collision avoidance device is needed even for remotely piloted UAVs* *“Making way for unmanned aircraft: The FAA’s effort to bring unmanned planes into the nation’s airspace faces tough obstacles, political as well as technical. Work is under way to resolve questions about privacy and to find affordable detect-and-avoid technologies.”

  4. Collision avoidance devices • Radar based • Applied on large airliners (Airbus) • Radar and vision • Applied on large remotely piloted UAVs (Predator) • Transponder based • TCAS, ADS-B (all manned aircraft and larger UAVs) • Vision only • Currently under development (small UAVs) • Basic requirements • Equivalent safety • Probability of collision < 10⁻¹¹ per flight hour • Layered approach

  5. Requirements • Detect a 10 m aircraft from 2 km • 0.1 degree/pixel resolution • min. 220°x60° view angle • Flyable size/weight/power figures • On-board data storage [Figure: collision geometry showing the collision volume, separation minima, intruder, collision avoidance threshold, traffic avoidance threshold, and our track]
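The 0.1 degree/pixel figure follows directly from the detection requirement; a quick check in plain Python, using only the values from the slide above:

```python
import math

# Angular size of a 10 m aircraft seen from 2 km (small-angle regime).
wingspan_m = 10.0
distance_m = 2000.0
angular_size_deg = math.degrees(math.atan2(wingspan_m, distance_m))  # ~0.29 deg

# At 0.1 deg/pixel the intruder spans roughly 3 pixels, about the
# minimum footprint a detector can still work with.
pixels_on_target = angular_size_deg / 0.1

# A 220 x 60 deg field of view at 0.1 deg/pixel implies the total resolution:
h_pixels = 220 / 0.1   # ~2200 pixels horizontally
v_pixels = 60 / 0.1    # ~600 pixels vertically
```

This matches the ~2 Mpixel, elongated sensor arrangement described on the architecture slide.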

  6. Framework • Planned on-board sensor and control loop

  7. Architecture • High resolution camera system in the visual range • Elongated (220°x60°), large view angle • min. 2 Mpixel • FPGA processing system • High computational power • Low power consumption • Solid state disk • Bandwidth • Capacity • Vibration tolerance [Figure: camera system (220°) → FPGA board → Solid State Drive, connected to/from the on-board control]

  8. Camera selection • Single camera with wide angle optics • Easy from the architectural, algorithmic, and processing side • Low-distortion, ultra wide view angle optics are bulky • 3 pieces of C-mount cameras • Good image quality • Relatively large size, volume, and power (1 kg, 10 W) • High speed serial I/O (USB, GigE, FireWire) is difficult to connect to embedded systems • 5 pieces of miniature cameras (M12 lens) • Max 1.2 megapixel with global shutter • 50 g, 200 mW • Poorer image quality • Micro cameras • Very advantageous size/weight/power figures • Rolling shutter only

  9. Sensor and computational system • 5 pieces of wVGA (752x480) micro cameras • Aptina MT9V034 sensor • 5 g • <150 mW • 3.8 mm megapixel objectives (M12) • 70 degrees between two cameras • Total view angle: 220°x78° • FPGA board with Spartan-6 FPGA • Solid State Drive (128 GB) [Figure: cameras → FPGA board → Solid State Drive, connected to/from the navigation computer]
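The per-camera field of view implied by the 3.8 mm lenses can be estimated from the sensor geometry. A sketch only: the 6.0 µm pixel pitch assumed below is typical for this class of wVGA global-shutter sensor, not a figure from the slides:

```python
import math

# Per-camera field of view from lens focal length and sensor size.
# Assumption: 6.0 um pixel pitch, so the active area is
# 752 * 0.006 = 4.512 mm by 480 * 0.006 = 2.88 mm.
focal_mm = 3.8
sensor_w_mm = 752 * 0.006
sensor_h_mm = 480 * 0.006

fov_w_deg = 2 * math.degrees(math.atan(sensor_w_mm / (2 * focal_mm)))  # ~61 deg
fov_h_deg = 2 * math.degrees(math.atan(sensor_h_mm / (2 * focal_mm)))  # ~42 deg
```

With roughly 60 degrees of horizontal coverage per camera, five cameras fanned out around the nose can tile the required 220° total view angle with some overlap between neighbours.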

  10. Mechanics • Stable camera holder • Alignment • Avoids cross vibration of the cameras • 100g • Aluminum alloy • Electronics in the middle • Covered with aluminum plates

  11. Hardware system • Sensing and processing system • Field of view: 220°x78° • Resolution: ~2250x752 • Frame rate: 56 FPS • Processor: Spartan-6 LX45 FPGA • Storage: 128 GB (23 min) • Size: 125x145x45 mm (5"x6"x1.8") • Weight: ~450 g (1 lb) • Power consumption: <8 W
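The 23-minute storage figure is consistent with the stated resolution and frame rate; a quick check, assuming 8-bit raw pixels and no compression:

```python
# Recording time on the 128 GB SSD at the stated resolution and frame rate.
width, height = 2250, 752
fps = 56
bytes_per_s = width * height * fps        # ~94.8 MB/s sustained write rate
capacity_bytes = 128e9
minutes = capacity_bytes / bytes_per_s / 60  # ~22.5 min, matching the ~23 min figure
```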

  12. Vision system mounted to the airplane

  13. View angle • Panoramic view (stitched images) • Perspective view

  14. Many-core processor arrays implemented in FPGA • FPGA chips currently have the largest computational capability • An affordable medium-sized FPGA offers: • Over 200 DSP cores • 200 memory blocks • 500 I/O pins • Low power consumption • How to utilize this performance? • Special purpose processor arrays • Many-core architectures • Specially optimized data and control paths • Distributed control units

  15. Image processing system • Input: DVI/HDMI 1920x1080@50Hz or native camera interface • Three image processing accelerators • Parallel operation • Optimized for the image processing algorithm

  16. Algorithmic components • Aircraft detection against sky background • Preprocessing on the full frame • Identifying candidate points • Post processing • Discarding non-relevant candidate points • Tracking • Multi-level global and local adaptivity • Aircraft detection against terrain background • Visual-inertial data fusion • Motion based moving object detection

  17. Preprocessing (full frame) • Identifies the candidate aerial objects • Also finds numerous false targets • Local adaptation based on edge density • Global adaptation based on the number of candidate points [Flowchart: contrast calculation → locally adaptive contrast thresholding → candidate points → too many or too few points? yes: threshold adjustment, loop back; no: regions of interest (ROIs)]
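The adaptive thresholding loop on this slide can be sketched in software. A minimal NumPy illustration, not the authors' FPGA implementation: the box-blur contrast measure, the initial threshold, and the 5-to-50 point band are all assumptions made for the sketch.

```python
import numpy as np

def find_candidates(frame, lo=5, hi=50, max_iter=10):
    """Globally adapted, locally contrast-based candidate point detection."""
    f = frame.astype(np.float32)

    # Local contrast: absolute difference from a 3x3 box-blurred copy.
    blurred = np.zeros_like(f)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            blurred += np.roll(np.roll(f, dy, axis=0), dx, axis=1)
    contrast = np.abs(f - blurred / 9.0)

    # Start from a statistics-based threshold, then adapt it globally
    # until the candidate count falls into the [lo, hi] band.
    threshold = contrast.mean() + 2 * contrast.std()
    for _ in range(max_iter):
        candidates = np.argwhere(contrast > threshold)
        if lo <= len(candidates) <= hi:
            break
        # Too many points: raise the threshold; too few: lower it.
        threshold *= 1.5 if len(candidates) > hi else 0.75
    return candidates
```

On a bright-sky frame with a few small dark objects this returns the object pixels and their immediate neighbourhoods as candidate points, which then become the ROIs for post processing.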

  19. Post processing (ROIs) • Discards the edges of clouds • Significantly reduces the number of candidate points • The resulting few targets can be tracked [Flowchart: cutting the perimeter of each object → histogram calculation → variance high? yes: accept candidate point; no: reject candidate point]
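The perimeter-variance test can be sketched as follows. This is illustrative only: the decision polarity follows the flowchart on the slide (high variance accepts the candidate), and the variance threshold is an assumed parameter.

```python
import numpy as np

def perimeter_variance(roi):
    """Intensity variance of the pixels on the outer boundary of a rectangular ROI."""
    top, bottom = roi[0, :], roi[-1, :]
    left, right = roi[1:-1, 0], roi[1:-1, -1]
    perim = np.concatenate([top, bottom, left, right]).astype(np.float64)
    return perim.var()

def accept_candidate(roi, var_threshold=100.0):
    # Per the flowchart: high perimeter variance -> accept, otherwise reject.
    return perimeter_variance(roi) > var_threshold
```

A candidate sitting on a smooth, uniform patch produces near-zero perimeter variance and is rejected, while a candidate whose cut perimeter mixes bright and dark pixels is accepted.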

  23. Example 1: Hand-held ground camera

  24. Pre- and post processing Red points: all candidate objects Green points: allowed by post processing

  25. Pre- and post processing + tracking Red points: all candidate objects Green points: allowed by post processing Blue points: tracked objects
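The tracking stage can be illustrated with a minimal nearest-neighbour tracker. A sketch only, not the authors' algorithm; the matching distance is an assumed parameter:

```python
import numpy as np

def update_tracks(tracks, points, max_dist=10.0):
    """Extend each track with the closest new point; unmatched points start new tracks.

    tracks: list of tracks, each a list of (row, col) points; mutated in place.
    points: accepted candidate points from the current frame.
    """
    new_tracks = []
    unmatched = list(points)
    for track in tracks:
        if not unmatched:
            break
        dists = [np.hypot(p[0] - track[-1][0], p[1] - track[-1][1]) for p in unmatched]
        i = int(np.argmin(dists))
        if dists[i] <= max_dist:
            track.append(unmatched.pop(i))
            new_tracks.append(track)
    # Any point left unmatched seeds a new track.
    for p in unmatched:
        new_tracks.append([p])
    return new_tracks
```

In the overlay images this corresponds to the blue points: candidates that survive post processing and are consistently re-found near their previous position.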

  26. Example 2: Airborne camera

  27. Navigation aid • The UAV was equipped with an INS and cameras • Attitude was calculated from the INS data (blue) • Attitude was calculated with the 5-point (red) and 8-point (magenta) algorithms • Visual-navigation data fusion development is ongoing
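The 8-point algorithm mentioned above estimates the essential matrix, and hence relative attitude, from point correspondences between two frames. A minimal unnormalised sketch in NumPy, not the authors' implementation:

```python
import numpy as np

def eight_point(x1, x2):
    """Estimate the essential matrix E such that x2^T E x1 = 0.

    x1, x2: Nx2 arrays (N >= 8) of corresponding points in calibrated
    (normalised) image coordinates.
    """
    n = len(x1)
    # Each correspondence gives one row of the linear system A e = 0,
    # with e the row-major flattening of E.
    a = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(n),
    ])
    _, _, vt = np.linalg.svd(a)
    e = vt[-1].reshape(3, 3)

    # Enforce the essential-matrix constraint: two equal singular values, one zero.
    u, s, vt = np.linalg.svd(e)
    mean_s = (s[0] + s[1]) / 2
    return u @ np.diag([mean_s, mean_s, 0.0]) @ vt
```

The rotation (attitude change) and translation direction are then recovered by decomposing E; in practice a normalised variant with outlier rejection is used, which is where fusion with the INS data helps.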

  28. Conclusion • The vision system is under development • Currently • Replacing the FPGA board • Finalizing the implementation of the image processing subsystem • Fusing the visual and navigation algorithms • Replacing the wVGA cameras with 1.2 megapixel ones • System completion is expected by the end of the year

  29. Thank you for your attention! • The ONR Grant (Number: N62909-10-1-7081) is gratefully acknowledged. • The support of the grants TÁMOP-4.2.1.B-11/2/KMR-2011-0002 and TÁMOP-4.2.2/B-10/1-2010-0014 is gratefully acknowledged. • The financial support provided jointly by the Hungarian State, the European Union and the European Social Fund through the grant TÁMOP 4.2.4.A/1-11-1-2012-0001 is gratefully acknowledged. • FPGA, many-core processing and UAV group: Ákos Zarándy • Péter Szolgay • András Radványi • Zoltán Nagy • András Kiss • Borbála Pencz • Máté Németh
