Persistent Tactical Seeability Through Integrated Sensor Guidance

Presentation Transcript


  1. Persistent Tactical Seeability Through Integrated Sensor Guidance Tim McLain Randy Beard Bryan Morse

  2. BYU Team Introduction • Randy Beard • Professor, Electrical and Computer Engineering • Autonomous systems, unmanned aircraft systems, multiple-vehicle coordination and control • Bryan Morse • Associate Professor, Computer Science • Computer vision, image reconstruction and registration, applications in wilderness search and rescue (WiSAR) • Tim McLain • Professor and Dept. Chair, Mechanical Engineering • UAS autonomy and control, vision-based guidance, cooperative control of UA teams, WiSAR

  3. UAS Research at BYU • Cooperative timing problems • Cooperative persistent imaging • Cooperative fire monitoring • Consensus seeking • 3D waypoint path planning • Wind compensation • Collision avoidance • Optic flow sensor • Laser ranger • EO cameras • Image stabilization • Geo-location • Vision-aided tracking & engagement • Autopilot design for small UAs • Attitude estimation • Adaptive control • Tailsitter guidance & control. Research areas: Cooperative Control, Path Planning / Path Following, Image Directed Control, Autonomous Vehicles

  4. Related Work • UAS-assisted wilderness search and rescue (WiSAR) • Funded for six years by NSF • Morse, Goodrich, McLain • Key take-away: “Just because you flew over it and imaged it, doesn’t mean you saw it.” • Seeability metric conceived
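The seeability idea can be made concrete with a toy score: a terrain cell counts as well seen only when it has been imaged at useful range and from more than one viewing direction. The sketch below illustrates that intuition only; it is not the published WiSAR seeability formulation, and the function name, weighting, and max_range parameter are assumptions.

```python
import numpy as np

def cell_seeability(cell_pos, camera_positions, max_range=300.0):
    """Toy per-cell seeability score (illustrative, not the published metric).
    Each view adds more when the camera is close and looks at the cell from a
    direction not already covered by an earlier view."""
    directions = []
    score = 0.0
    for cam in camera_positions:
        ray = np.asarray(cam, dtype=float) - np.asarray(cell_pos, dtype=float)
        rng = np.linalg.norm(ray)
        if rng == 0 or rng > max_range:
            continue                      # cell not usefully imaged from here
        unit = ray / rng
        # Angular novelty: 1.0 for a brand-new direction, near 0 for a repeat view.
        novelty = 1.0
        if directions:
            novelty = max(0.0, 1.0 - max(float(np.dot(unit, d)) for d in directions))
        directions.append(unit)
        score += (1.0 - rng / max_range) * (0.5 + 0.5 * novelty)
    return score
```

Painting each terrain cell with a score of this kind is the sort of quantity the simulation (slide 17) visualizes as imaged-terrain seeability.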

  5. Related Work • UAV/UGV cooperative tracking (Army SBIR Phase II with SET Corp.) • Enhancing probability of detection and persistent imaging of dismounts • SUAS wind energy extraction (Navy STTR Phase I with Mosaic ATM) • Exploit energy available from the weather (Wx) • Maintain mission effectiveness

  6. Related Work • Aerial recovery of UAS (Air Force STTR Phase II with Procerus) • Goal: Recover air-launched SUAS • Concept: Mothership/drogue • Challenge: Mothership airspeed significantly higher than SUAS airspeed

  7. (Image-only slide)

  8. Three Product Families • KestrelTM Flight Systems: Kestrel Autopilot v3.0, Adaptive Control, Kestrel CockpitTM, Kestrel SIMTM (for RealFlight) • OnPointTM Vision Systems: OnPoint OnBoardTM, Vision Suite v3.0, OnPoint Targeting v2.0, OnPoint VPUTM, OnPoint GUI • PerceptorTM Imaging Systems: Perceptor 88x, Perceptor DG, Perceptor DG2

  9. Kestrel Autopilot v3.0 • Fixed-wing / heli / tri-rotor / quad-rotor • 500 MHz DSP, 32 Mb flash, 32 Mb RAM • 11 servos supported onboard • Integrated 3-axis magnetometer • High-quality pitot-static systems • Factory-calibrated, temperature-compensated IMU • 17-state EKF navigation solution • High-accuracy gimbal pointing / geo-location • Kestrel Autopilot v3.0: 2.25 in x 1.4 in x 0.5 in, 21 grams
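For orientation, the sketch below lays out one plausible partition of a 17-state GPS/INS EKF: position, velocity, attitude quaternion, gyro biases, accelerometer biases, and a barometer bias. This breakdown is an assumption for illustration; the slide does not give the actual Kestrel state definition.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class NavState17:
    """One plausible 17-state GPS/INS EKF layout (assumed, not Kestrel's actual definition)."""
    p_ned: np.ndarray = field(default_factory=lambda: np.zeros(3))        # position, NED [m]
    v_ned: np.ndarray = field(default_factory=lambda: np.zeros(3))        # velocity, NED [m/s]
    q_b2n: np.ndarray = field(default_factory=lambda: np.array([1.0, 0.0, 0.0, 0.0]))  # body-to-NED quaternion
    gyro_bias: np.ndarray = field(default_factory=lambda: np.zeros(3))    # [rad/s]
    accel_bias: np.ndarray = field(default_factory=lambda: np.zeros(3))   # [m/s^2]
    baro_bias: float = 0.0                                                # altimeter bias [m]

    def as_vector(self) -> np.ndarray:
        """Stack the components into the 3+3+4+3+3+1 = 17-element filter state."""
        return np.concatenate([self.p_ned, self.v_ned, self.q_b2n,
                               self.gyro_bias, self.accel_bias, [self.baro_bias]])
```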

  10. Kestrel Cockpit v3.0 • Improved user interface • 3D streaming maps/terrain • 3D waypoints • Intuitive map elements • VTOL controls/features • Multi-agent support • Fixed wing and VTOL • Kestrel 2x and 3x autopilots • Intuitive multi-function display & health monitor • Agent & flight path rendered • Full-context 3D environment • Tight integration with OnPoint Targeting

  11. OnPoint Targeting v2.0 • Target tracking and geolocation • Video stabilization • Click ‘n Fly operation • New, highly configurable GUI with dockable windows • TiVo-like video pause/playback/record • Snapshots appear in Virtual Cockpit GCS as clickable icons • Improved Kalman filter target position/velocity estimation • Position uncertainty estimates shown on video overlay
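The Kalman-filter target position/velocity estimation can be pictured with a minimal constant-velocity filter fed by geolocated measurements. The class below is a generic sketch under that assumption, not OnPoint's implementation; the state layout and the noise parameters q and r are illustrative.

```python
import numpy as np

class TargetTracker:
    """Minimal constant-velocity Kalman filter: state = [x, y, vx, vy]."""
    def __init__(self, q=1.0, r=25.0):
        self.x = np.zeros(4)                     # position [m] and velocity [m/s]
        self.P = np.eye(4) * 1e3                 # large initial uncertainty
        self.q, self.r = q, r                    # process / measurement noise levels

    def predict(self, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt                   # x += vx*dt, y += vy*dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.q * np.eye(4)

    def update(self, z):
        """z: geolocated target measurement [x, y] from the video pipeline."""
        H = np.zeros((2, 4))
        H[0, 0] = H[1, 1] = 1.0                  # only position is measured
        S = H @ self.P @ H.T + self.r * np.eye(2)
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ (np.asarray(z, dtype=float) - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P
```

The 2x2 position block of P is the sort of quantity that would drive the on-video uncertainty overlay mentioned above.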

  12. OnPoint OnBoard – Vision Suite v3.0 • Target tracking and geolocation • Video stabilization • Click ‘n Fly operation • OnPoint VPUTM (Vision Processing Unit): 1.9 in x 1.35 in, 11 grams (0.4 oz) • Perceptor DG (ePTZ) imager (gyros, 5 MP): 1.3 in x 0.9 in • OMAP DSP processor (2 Gb flash, 1 Gb DDR) • SD card slot, USB 2.0, video in/out, SPI/GPIO/Ethernet, serial • Computer vision and inertial measurement for stable video (hardware as of 7/24/09)

  13. Perceptor 88x • 2-axis gyro-stabilized, slip ring, continuous 360° pan rotation, 90° tilt • Pointing resolution: 0.05° • Stabilization: 0.05° error with a 15°, 2 Hz disturbance • Small & lightweight: 3.5 in diameter turret, <0.85 lbs (400 g) • EO (10x zoom) or IR • Factory-calibrated IMU sensor suite • High-bandwidth camera positioning for small platforms • 17-state GPS/INS solution plus targeting & motion control provides standalone operation • Graceful targeting degradation during GPS loss or in GPS-denied environments • RS232, TTL serial, and CAN interfaces • Compliance with popular protocols for easy integration • Available soon in retractable version • OnPoint GUI: video record, playback, gimbal pointing • Control modes include rate, angle, and latitude/longitude/elevation • Shown: continuous 360° stabilized gimbal, 3.5 in turret, Sony FCB-IX EO camera, 10x optical zoom, 4x digital zoom (shown without IMU (KAP3) or cover)
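The latitude/longitude/elevation control mode implies converting a geodetic target into pan and tilt commands. Below is a rough sketch of that geometry under a flat-earth approximation with wings-level flight assumed; the function and its simplifications are illustrative, not the Procerus pointing solution, which also accounts for full aircraft attitude and gimbal mounting offsets.

```python
import numpy as np

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius [m]

def gimbal_az_el(uav_llh, uav_yaw_rad, tgt_llh):
    """Approximate pan (azimuth) and tilt (elevation) commands, in radians,
    for a target given as (lat deg, lon deg, elevation m). Flat-earth
    approximation near the aircraft; assumes wings-level flight, so only
    heading is removed when rotating into the body frame."""
    dlat = np.radians(tgt_llh[0] - uav_llh[0])
    dlon = np.radians(tgt_llh[1] - uav_llh[1])
    north = dlat * EARTH_R
    east = dlon * EARTH_R * np.cos(np.radians(uav_llh[0]))
    down = uav_llh[2] - tgt_llh[2]               # positive when target is below the aircraft
    # Rotate the line of sight into the (level) body frame using heading only.
    c, s = np.cos(uav_yaw_rad), np.sin(uav_yaw_rad)
    fwd = c * north + s * east
    right = -s * north + c * east
    az = np.arctan2(right, fwd)                  # pan: positive to the right
    el = np.arctan2(-down, np.hypot(fwd, right)) # tilt: negative looks downward
    return az, el
```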

  14. Procerus Perceptor 88x

  15. (Image-only slide)

  16. Simulation Architecture

  17. Current Status of Simulation • TCP/IP connection to Virtual Cockpit ground station • Includes n-step look-ahead algorithm • Faster observation of targets than orbits, lawnmower paths, and other traditional paths • Only needs to know the search area; no other parameters needed • Configurable dynamics block to accommodate various UAV platforms • Plotting capabilities • Import or customize terrain • Paint imaged terrain with measure of seeability • Recently implemented vector-field path following (sketched below)
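Vector-field path following steers the commanded course back toward the path as a function of cross-track error. The sketch below shows the straight-line case in the style of the BYU vector-field approach; the gains chi_inf and k are illustrative values, not the ones used in the simulation.

```python
import numpy as np

def vector_field_course(path_point, path_course, pos_ne,
                        chi_inf=np.radians(60.0), k=0.05):
    """Course command for straight-line vector-field path following.
    path_point: a point on the path (north, east) [m]
    path_course: course of the path [rad]
    pos_ne: current UAV position (north, east) [m]
    chi_inf: approach course far from the path; k: convergence gain."""
    dp = np.asarray(pos_ne, dtype=float) - np.asarray(path_point, dtype=float)
    # Signed cross-track error (positive to the right of the path).
    e_y = -np.sin(path_course) * dp[0] + np.cos(path_course) * dp[1]
    # Far from the path, head toward it at chi_inf; on the path, fly along it.
    return path_course - chi_inf * (2.0 / np.pi) * np.arctan(k * e_y)
```

In use, a path manager would pass the returned course command to the autopilot's course-hold loop each control cycle.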

  18. Future of Simulation • Look-ahead algorithm scales exponentially and needs improvement • Chain-link model • Genetic algorithm • Gimbal model and pointing-strategy enhancements needed to reduce ‘jumpiness’ of the camera • Need to establish a metric to judge the quality of real-time video imagery (a simple example is sketched below) • Compare to simulated Tactical Seeability metric • Allow for user feedback • Consider amount of blur, contrast, wash-out, self-obstruction • Wavelet transforms or color histograms are possible options
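As a starting point for such a video-quality metric, a per-frame score can combine a sharpness term (penalizing blur) with an RMS-contrast term (penalizing wash-out). The sketch below is one simple possibility using a discrete Laplacian; the wavelet- or histogram-based measures mentioned on the slide would slot in the same way, and how the two terms are weighted is left open.

```python
import numpy as np

def frame_quality(gray):
    """Crude per-frame quality indicators for a grayscale image in [0, 1].
    Returns (sharpness, contrast); both drop for blurred or washed-out frames."""
    g = np.asarray(gray, dtype=float)
    # Sharpness: variance of a 4-neighbor discrete Laplacian (low variance => blur).
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    sharpness = lap.var()
    # Contrast: standard deviation of intensity (low => washed out).
    contrast = g.std()
    return sharpness, contrast
```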

  19. (Image-only slide)
