
3D Indoor Positioning System Final Presentation SD May 11-17

3D Indoor Positioning System Final Presentation, SD May 11-17. Faculty Advisor: Dr. Daji Qiao. Members: Nicholas Allendorf (CprE), Christopher Daly (CprE), Daniel Guilliams (CprE), Andrew Joseph (EE), Adam Schuster (CprE). Client: Dr. Stephen Gilbert, Virtual Reality Application Center.


Presentation Transcript


  1. 3D Indoor Positioning System Final Presentation SD May 11-17 Faculty Advisor: Dr. Daji Qiao Members: Nicholas Allendorf – CprE, Christopher Daly – CprE, Daniel Guilliams – CprE, Andrew Joseph – EE, Adam Schuster – CprE Client: Dr. Stephen Gilbert, Virtual Reality Application Center

  2. Problem Statement • Currently, there is no inexpensive system that is able to accurately localize and track fingertips • Small scale (cm level) vs. larger scale (m level) accuracy • Such a system could be used as an input device or controller for human-computer interaction • It could be used for virtual reality systems, touch tables, or a “Minority Report”-style user interface

  3. Project Goal • Create a system capable of accurately tracking fingertips in three dimensions • Incorporate the ability to support many users simultaneously • Design the system so that it is easily reproducible

  4. Functional Requirements • Provide a 3D position of all tracked fingertips within a 2m x 2m x 2m indoor region with 1 centimeter accuracy • Update positions 15 times per second (15 Hz) • The system shall be capable of tracking as many as 60 object positions simultaneously • Positions shall be displayed in a graphical interface so they may be viewed in real time

  5. Non-Functional Requirements • The device used by the potential user shall be small, lightweight and durable • The device shall be able to go three weeks without requiring any sort of recharging • The surrounding tracking infrastructure shall be easy to set up so the tracking system may be moved to different locations when required • The system shall be reproducible with consistent quality

  6. Constraints • Small device size limits choice of technology • Need for battery life forces much of the work to be done by the infrastructure • Want the system to be as non-intrusive as possible • Some part of the device must be uniquely identifiable to software

  7. Market Survey • There are several systems available that perform gesture/pose recognition, but not localization • Several systems provide localization, but are not wireless, do not track finger movement, require holding a device, etc. • PlayStation Move/Nintendo Wii • Our project is unique in that it will be wireless with accurate absolute fingertip localization and no handheld device

  8. Considered Technologies

  9. Range and Accuracy of Current Positioning Technologies Source: IPIN Website http://www.geometh.ethz.ch/ipin/index/IPIN_Opening_Session.pdf

  10. Choice of Technology • Optical/Infrared tracking • Most practical solution • Accuracy is a function of camera resolution • No need to develop custom hardware • Existing IR tracking systems are highly accurate, but very expensive ($5,000+) • Our system will cost less than $1000 • Plan of Attack • Use stereo pairs of cameras to determine the location of IR LEDs on fingertips.

  11. System Components • Glove • Contains IR LEDs and colored cloth on fingertips which are tracked by the cameras and software • Infrastructure • Provides mounting points for the cameras • Cameras • Mounted in stereo pairs around the periphery of the infrastructure • Detect IR LEDs and pass images to server for processing • Server/Computer • Performs image processing, calculates position, and runs the GUI
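
The image processing step mentioned above is not detailed in the slides; as a rough illustration of how the server might find LED blobs in a frame from an IR-filtered camera using OpenCV, a minimal sketch follows. The threshold value and minimum blob area are assumed values for illustration, not the project's actual settings.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Find centroids of bright blobs (candidate IR LEDs) in one grayscale frame
// from the IR-filtered camera. The threshold (200) and the minimum blob area
// (4 px) are assumed values chosen for illustration.
std::vector<cv::Point2f> findLedCentroids(const cv::Mat& grayFrame)
{
    cv::Mat binary;
    cv::threshold(grayFrame, binary, 200, 255, cv::THRESH_BINARY);

    std::vector<std::vector<cv::Point> > contours;
    cv::findContours(binary, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point2f> centroids;
    for (size_t i = 0; i < contours.size(); ++i) {
        cv::Moments m = cv::moments(contours[i]);
        if (m.m00 < 4.0)                    // drop single-pixel noise
            continue;
        centroids.push_back(cv::Point2f(static_cast<float>(m.m10 / m.m00),
                                        static_cast<float>(m.m01 / m.m00)));
    }
    return centroids;
}
```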

  12. System Block Diagram

  13. Hardware Platforms • Cameras: Logitech QuickCam Pro 9000 • Varying resolution, as high as 1280 x 720 @ 15 fps • 75 degree field-of-view • USB 2.0 • Computer: Dell XPS • Dual Core Intel Processor @ 2.93 GHz (4 Virt. Cores) • 4 GB RAM • Gloves/LEDs: 890 nm Low Profile Infrared LEDs • Infrastructure: 80/20 Aluminum Framing

  14. Software Platforms • Computer Operating System: Windows 7 • Image Processing/Stereo Calibration: OpenCV • Graphical User Interface: OpenGL • Development Environment: Visual Studio 2008

  15. Cost Breakdown

  16. System Detail: Gloves • 2 IR LEDs per fingertip – one front and one back • Uniquely colored fabric on each finger for ID • Battery pack and wiring to connect the LEDs to the battery and power switch
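
The slides do not say how the uniquely colored fabric is matched to a particular finger. One plausible sketch, assuming the unfiltered camera provides a BGR image and assuming a hypothetical hue-to-finger mapping, is to sample a small patch around each detected fingertip and bin its mean hue:

```cpp
#include <opencv2/opencv.hpp>
#include <string>

// Sketch of one way the colored fabric could be mapped to a finger ID:
// sample a small patch around a detected fingertip in the unfiltered
// (visible-light) camera and bin its mean hue. The hue ranges and the
// color-to-finger mapping below are hypothetical, not the project's values.
std::string classifyFingerColor(const cv::Mat& bgrFrame, const cv::Point2f& center)
{
    int cx = cvRound(center.x), cy = cvRound(center.y);
    cv::Rect roi(cx - 5, cy - 5, 11, 11);
    roi &= cv::Rect(0, 0, bgrFrame.cols, bgrFrame.rows);   // clip to the image
    if (roi.area() == 0)
        return "unknown";

    cv::Mat hsv;
    cv::cvtColor(bgrFrame(roi), hsv, cv::COLOR_BGR2HSV);
    double hue = cv::mean(hsv)[0];                          // OpenCV hue range: 0..179

    if (hue < 15 || hue > 165) return "thumb (red)";
    if (hue < 35)              return "index (yellow)";
    if (hue < 85)              return "middle (green)";
    if (hue < 130)             return "ring (blue)";
    return "pinky (purple)";
}
```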

  17. System Detail: Infrastructure • Provides a stable platform to mount the cameras • Allows for flexibility in camera mounting position and orientation • Made from 80/20 aluminum framing

  18. System Detail: IR Cameras • Stereo pairs of cameras resolve the 3D position of LEDs • One camera of each pair has an IR filter to block visible light • Cameras are mounted so that each pair's fields of view intersect over as large an area as possible • Attached to the infrastructure with articulating mounts to allow for flexible positioning
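
As a sketch of how a stereo pair could resolve a 3D position once both cameras have detected the same LED, the snippet below uses OpenCV's cv::triangulatePoints with the 3x4 projection matrices produced by the calibration step described on slide 21. The helper name and the assumption that the two detections have already been matched across cameras are illustrative, not taken from the project's code.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Recover the 3D position of one LED seen by both cameras of a calibrated
// stereo pair. P1 and P2 are the 3x4 projection matrices from stereo
// calibration/rectification (slide 21); ptLeft and ptRight are the pixel
// coordinates of the same LED in the two rectified images.
cv::Point3f triangulateLed(const cv::Mat& P1, const cv::Mat& P2,
                           const cv::Point2f& ptLeft, const cv::Point2f& ptRight)
{
    std::vector<cv::Point2f> ptsL(1, ptLeft), ptsR(1, ptRight);
    cv::Mat pts4D;                                   // 4x1 homogeneous result
    cv::triangulatePoints(P1, P2, ptsL, ptsR, pts4D);

    float w = pts4D.at<float>(3, 0);                 // divide out the scale
    return cv::Point3f(pts4D.at<float>(0, 0) / w,
                       pts4D.at<float>(1, 0) / w,
                       pts4D.at<float>(2, 0) / w);
}
```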

  19. IR Filter Response to Light • IR LEDs: 890 nm wavelength

  20. Graphical User Interface • Simple 3D position viewer • Used for testing to show how the system is working • Displays color and position of recognized fingertips • Incorporates camera feeds to show location of recognized points
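
A minimal sketch of a 3D point viewer along these lines, written with OpenGL and GLUT. The TrackedPoint structure, the sample point, and the camera placement are assumptions for illustration, not the project's actual GUI code.

```cpp
// Minimal GLUT-based 3D point viewer. In the real system the g_points vector
// would be filled by the localization pipeline rather than a hard-coded sample.
#include <GL/glut.h>
#include <vector>

struct TrackedPoint { float x, y, z, r, g, b; };
std::vector<TrackedPoint> g_points;        // updated by the tracking code

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(3.0, 3.0, 3.0,  1.0, 1.0, 1.0,  0.0, 1.0, 0.0);  // view the 2 m cube

    glPointSize(8.0f);
    glBegin(GL_POINTS);
    for (size_t i = 0; i < g_points.size(); ++i) {
        glColor3f(g_points[i].r, g_points[i].g, g_points[i].b);
        glVertex3f(g_points[i].x, g_points[i].y, g_points[i].z);
    }
    glEnd();
    glutSwapBuffers();
}

void idle() { glutPostRedisplay(); }       // redraw as new positions arrive

int main(int argc, char** argv)
{
    TrackedPoint sample = { 1.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f };  // red point at region center
    g_points.push_back(sample);

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Fingertip viewer");

    glMatrixMode(GL_PROJECTION);
    gluPerspective(60.0, 640.0 / 480.0, 0.1, 20.0);

    glutDisplayFunc(display);
    glutIdleFunc(idle);
    glutMainLoop();
    return 0;
}
```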

  21. Camera Calibration • Camera lenses introduce distortion in the images, particularly around the edges • In order to get accurate localization, each stereo pair of cameras must have parallel viewing rays • OpenCV has a calibration routine which introduces error correction to compensate for this • Once calibrated, cameras do not need recalibration unless they are moved relative to each other or the infrastructure
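
A hedged sketch of the per-pair calibration step using OpenCV's standard routines (cv::stereoCalibrate followed by cv::stereoRectify). It assumes the per-camera intrinsics come from earlier cv::calibrateCamera runs and that the corner lists come from a chessboard target via cv::findChessboardCorners; the slide only says an OpenCV calibration routine was used, so these specifics are assumptions.

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Per-pair calibration sketch. objectPts holds the chessboard corner pattern
// in board coordinates for each captured frame; imgPtsLeft/imgPtsRight hold
// the corresponding corners found in each camera. K1/D1/K2/D2 are per-camera
// intrinsics from earlier cv::calibrateCamera runs. Returns the RMS
// reprojection error in pixels, the figure quoted on slide 24.
double calibratePair(const std::vector<std::vector<cv::Point3f> >& objectPts,
                     const std::vector<std::vector<cv::Point2f> >& imgPtsLeft,
                     const std::vector<std::vector<cv::Point2f> >& imgPtsRight,
                     cv::Mat K1, cv::Mat D1, cv::Mat K2, cv::Mat D2,
                     cv::Size imageSize, cv::Mat& P1, cv::Mat& P2)
{
    cv::Mat R, T, E, F, R1, R2, Q;

    double rms = cv::stereoCalibrate(objectPts, imgPtsLeft, imgPtsRight,
                                     K1, D1, K2, D2, imageSize, R, T, E, F);

    // Rectification transforms plus the 3x4 projection matrices P1/P2
    // that the triangulation step (slide 18) consumes.
    cv::stereoRectify(K1, D1, K2, D2, imageSize, R, T, R1, R2, P1, P2, Q);
    return rms;
}
```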

  22. Localization Process

  23. Testing Results • The system is functional, but not to the level set forth by the project requirements • Several limitations have been encountered with the design: • USB 2.0 bandwidth is not large enough for 8 cameras, even with reduced resolution • Positioning accuracy is poor; positioning precision is good • Image processing is too intensive to maintain 15 fps, even with only two cameras

  24. Testing Results: Calibration • Accurate calibration is very difficult • An average calibration has an average error of 0.8 px • Our best: 0.415 px • 0.1 px is a good target • (chart: calibration error vs. number of frames used in calibration)

  25. Testing Results: Localization • Localization accuracy varies greatly

  26. Testing Results: USB Cameras • USB 2.0 bandwidth is 480 Mbps • One camera @ 30 fps uses ~48% of that! • Bandwidth limits the system to two cameras • With reduced resolution, accuracy suffers even more • A PCI USB card grants one extra camera
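
As a rough sanity check on the ~48% figure, assuming an uncompressed 16-bit-per-pixel (YUY2) stream at 800 x 600 and 30 fps (an assumed mode; the slide does not state which resolution the number refers to):

```cpp
#include <cstdio>

int main()
{
    // Assumed uncompressed video mode; the slide does not give the exact one.
    const double width = 800, height = 600, bitsPerPixel = 16, fps = 30;
    const double usb2Mbps = 480.0;                      // USB 2.0 signalling rate

    double streamMbps = width * height * bitsPerPixel * fps / 1e6;   // 230.4 Mbps
    std::printf("one camera: %.0f Mbps = %.0f%% of USB 2.0\n",
                streamMbps, 100.0 * streamMbps / usb2Mbps);          // ~48%
    return 0;
}
```

Two such streams already approach the bus limit, which is consistent with the slide's observation that bandwidth restricted the system to two cameras per bus.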

  27. Testing Results: Durability • Gloves: • Battery life tested at 16 hrs of continuous usage before the system has trouble picking up LEDs at distance • Initially, battery packs had problems slipping out and connections coming unsoldered • Problems have been resolved, and the glove functions very well • (photos: beginning of test vs. after 16 hrs of continuous use)

  28. Final Results • The design of our system is feasible but still needs to be improved in several areas • Accuracy/calibration need work • USB bandwidth issues to be resolved • IR LED location and color recognition algorithms must be sped up if 8 cameras are going to be used • Ultimately, we ran out of time • Should have looked into system limitations earlier • Took too much time to learn how to manipulate OpenCV to our needs • Lack of foresight had us scrambling to find questionable solutions to major issues

  29. Team Member Duties • Daniel Guilliams – CprE • Team leader, client communication, 3D localization, GUI • Andrew Joseph – EE • Weekly status reports & other documentation, glove design and implementation • Nicholas Allendorf – CprE • IR marker detection, color recognition, general image processing guru • Chris Daly – CprE • Camera control, IR filters, synchronous image capture • Adam Schuster – CprE • Infrastructure, camera calibration, camera pair design and assembly

  30. Questions?

  31. Thanks for your time!
