Gaze Controlled Robotic Platform
This project explores the development of a gaze-controlled robotic platform that enables users to direct a robot's movements simply by indicating a target with a laser pointer. Key components include a robot that operates within a typical room-sized area, a system for tracking the user's gaze, and a communication interface between the user and the robot. The objectives focus on creating an intuitive control system, ensuring precise movement within a close range, and enabling the robot to operate under a variety of indoor lighting conditions. The platform aims not only for functionality but also for ease of use, with a minimal learning curve.
Presentation Transcript
Gaze Controlled Robotic Platform
Nick Harezga • Jeremy Thornton • Matthew Wozniak
Agenda
• Overview
• Objectives
• Requirements
• Design
• Implementation
  • User Interface
  • Interfaces & Communication
  • Arduino
  • Robot
• Algorithms
  • Vision
• Risks
• Multidisciplinary Aspects
• Testing
• Costs
Overview
The primary goal of the project is to allow a user to control a robot using their "gaze" (which in this case is specified using a laser pointer). To do this, we need a few components:
• A robot which can move freely about a room
• A system to determine the location of a user's "gaze"
• A method of locating the robot relative to the user's gaze
• A way for the user to control when the robot moves
Objectives
• Creation of a system capable of detecting a robot's location relative to a target destination
  • Also able to transform this information into discrete instructions
• Construction of a robot capable of wirelessly receiving and executing movement instructions
• Intuitive user controls with next to no learning curve
• The robot should be able to operate in a variety of indoor environments within a field large enough to be considered useful
Requirements
• Accuracy
  • The robot should be able to move to within a "tennis ball area" of the laser target location
• Longevity
  • An hour of regular use
  • Up to the maximum longevity of the Asus EeePC battery on standby (approx. 9 hours)
• Area of Operation
  • Operate up to 8 feet in front of the camera (a large portion of a standard room)
• Reliability
  • System functions properly under lighting conditions ranging from full daylight to artificial lighting at night
• Simplicity
  • 2-button user remote
  • Simple calibration and setup procedures
  • Portable
Design
The project consists of 5 primary components:
• Computer
• Robot (mounted with a calibration point)
• Scene Camera
• Wireless Module
• User Control Module
Implementation
• Computer: Asus EeePC 1000HE
• Vision algorithms: OpenCV in Python
• Robot control: Arduino Pro Mini 3.3V 8MHz
• Communication: Xbee 1mW Wireless Modules
• User Input: FT245RL USB to FIFO Module
User Interface
• Control
  • Laser targeting device (red laser pointer)
  • Wired 2-button remote
    • Connected via USB to the PC
    • Status LEDs to report current activity
• Diagnostics
  • Robot status LEDs (error, command in progress, etc.)
Interfaces & Communication
• PC ↔ Robot
  • Handled via Xbee transparent mode (wireless serial link)
  • Tested and confirmed functional between PC and Arduino
    • Human-perceivable latency in a loopback test program is negligible
  • PC serial communication via the PySerial library (Python)
• PC ↔ User Remote
  • FT245RL USB to FIFO
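Because Xbee transparent mode simply passes raw serial bytes through, the PC and Arduino must agree on some framing for movement commands. The sketch below is a minimal illustration of one way to do that; the start byte, opcodes, and checksum scheme here are our own assumptions, not the project's actual protocol.

```python
# Hypothetical command framing for the PC -> robot serial link.
# Frame layout: [START, opcode, value, checksum] (all assumptions).
START = 0x7E
OP_FORWARD = 0x01
OP_TURN = 0x02

def encode_command(opcode, value):
    """Pack one command into a 4-byte frame with a simple checksum."""
    if not 0 <= value <= 255:
        raise ValueError("value must fit in one byte")
    checksum = (opcode + value) & 0xFF
    return bytes([START, opcode, value, checksum])

def decode_command(frame):
    """Inverse of encode_command; rejects corrupt frames."""
    start, opcode, value, checksum = frame
    if start != START or (opcode + value) & 0xFF != checksum:
        raise ValueError("corrupt frame")
    return opcode, value
```

With PySerial, such a frame would be sent over the Xbee link with something like `ser.write(encode_command(OP_FORWARD, 120))`, and the Arduino would apply the same checksum test before acting on a command.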
Arduino (Pro Mini 3.3V)
• Serial communication on pins 0 (RX) and 1 (TX)
• External interrupts
  • Wheel encoders on pins 2 and 3
• PWM signals to the Ardumoto motor shield
  • Pins 5 and 6 via the analogWrite() function
    • Sets a duty cycle from 0 to 100% based on a value in [0, 255]
    • Voltage supplied to the motors varies directly with the duty cycle
    • Alternatively, digital writes can simply turn the motors on and off if speed control becomes unnecessary
  • Direction bits on pins 4 and 7 – logic HIGH or LOW
• Status LEDs
  • Pins 8 and 9 – logic HIGH or LOW
• 4 digital pins remain available
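The analogWrite() mapping described above is linear: the 8-bit argument selects a duty cycle, and the average motor voltage follows it directly. A small Python sketch of that mapping (the helper names are ours, for illustration only):

```python
def speed_to_pwm(fraction):
    """Map a desired speed fraction in [0.0, 1.0] to the 8-bit
    value passed to analogWrite() on the PWM pins (5 and 6)."""
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must be in [0, 1]")
    return round(fraction * 255)

def duty_cycle_percent(pwm_value):
    """Duty cycle (0-100%) produced by an 8-bit PWM value."""
    return 100.0 * pwm_value / 255
```

So `analogWrite(5, 255)` drives a motor at full supply voltage, while half speed would use a value of about 128.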
Robot
• Constructed using Lego components
  • Lego motors rated at 9V
• Motors driven by the Arduino through the Ardumoto motor shield
• Reflective wheel encoders used for turn and distance tracking
Algorithms
• Main loop:
  • Read inputs from user control
  • Capture locations of robot and laser target via the scene camera
    • Un-distort image
    • Determine location of laser target
    • Determine location and orientation of robot
  • Calculate path of movement for the robot
• Robot movement:
  • Update velocity vector
  • Recalculate position
  • Optionally recapture positions using the scene camera for adjustments
• Each stage has error correction and diagnostic output for debugging purposes
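Once the robot's position, its orientation, and the laser target are known, the "calculate path of movement" step reduces to a turn angle plus a straight-line drive distance. A minimal geometric sketch of that step (the function name and coordinate conventions are our own assumptions):

```python
import math

def plan_move(robot_xy, robot_heading, target_xy):
    """Given the robot's position, its heading in radians, and the
    laser target position (all in the same floor coordinates),
    return (turn, distance): the turn angle, normalized so the
    robot takes the shorter way around, and the drive distance."""
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    # Wrap the turn angle into [-pi, pi).
    turn = (bearing - robot_heading + math.pi) % (2 * math.pi) - math.pi
    return turn, distance
```

The turn and distance would then be converted into wheel-encoder counts and sent to the Arduino as discrete instructions.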
Algorithms - Vision
• Camera choice is extremely important
  • We are using the Logitech Pro 900 (1600x1200 max resolution)
• Un-distortion
  • OpenCV has built-in functions for detecting and removing camera and perspective distortion
• Robot orientation and location
  • An easily distinguishable, unique shape on top of the robot (the "calibration point")
• Laser target detection
  • Convert the image to HSV and filter
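The HSV filtering idea can be illustrated per pixel with the standard-library `colorsys` module (the real system applies equivalent thresholds to full camera frames with OpenCV, and the threshold values below are illustrative assumptions, not the project's tuned numbers):

```python
import colorsys

def is_laser_pixel(r, g, b):
    """Return True if an RGB pixel (0-255 channels) looks like a
    red laser dot: very bright, reasonably saturated, hue near red.
    Thresholds are illustrative, not the project's tuned values."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    near_red = h < 0.05 or h > 0.95  # hue wraps around at red
    return v >= 0.9 and s >= 0.3 and near_red
```

Filtering in HSV rather than RGB is what makes this robust across lighting conditions: the laser dot stays near-maximal in value and near-red in hue even as overall scene brightness changes.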
Risks
• Robot accuracy
  • Largely reliant on algorithms and wheel encoder resolution
• Detection of robot orientation and position
• Longevity requirements
  • 3.7V Li-Ion 2000mAh
    • Arduino – Idle: 6mA (measured), Active: 12mA
    • Xbee – 55mA (measured)
    • Ardumoto – 36mA (rated Icc max)
  • 7.2V NiMH 3000mAh
    • Motors – 0.12A each under load
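Plugging the measured and rated currents above into a simple capacity-over-draw calculation gives a rough longevity check (this ignores voltage-conversion losses and battery discharge curves, so treat the results as optimistic estimates):

```python
def runtime_hours(capacity_mah, currents_ma):
    """Estimated battery runtime: capacity divided by total draw."""
    return capacity_mah / sum(currents_ma)

# Logic battery (3.7V Li-Ion, 2000mAh), everything active:
# Arduino active 12mA + Xbee 55mA + Ardumoto 36mA -> ~19.4 h
logic_hours = runtime_hours(2000, [12, 55, 36])

# Motor battery (7.2V NiMH, 3000mAh), two motors under load:
# 120mA each -> 12.5 h
motor_hours = runtime_hours(3000, [120, 120])
```

Both estimates sit well above the one-hour regular-use requirement, which suggests the longevity risk lies more in real-world losses and motor stall currents than in nominal capacity.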
Multidisciplinary Aspects
• Computer Science
  • Interpreting results of computer vision algorithms into both diagnostic information and discrete instructions
• Imaging Science
  • Image normalization, interpretation and point detection
• Computer Engineering
  • Command encoding and communication, as well as microcontroller software
• Electrical Engineering
  • Motor control, button debouncing, battery concerns and circuitry
• Mechanical Engineering
  • Robot weight and design
Testing
• Subsystem/Component Testing
  • Scene camera
    • Detection algorithm
  • Communication
    • Wireless
      • Arduino to Xbee
      • PC to Xbee
  • Arduino
    • Command interpretation
    • Pin output
      • Do other pieces of hardware work with the output?
Testing (cont.)
• Subsystem/Component Testing
  • Robot
    • Battery life
    • Movement
      • Straight line and turning
• Gradual Integration
• System Testing
  • Does everything work together?