Analysis of Remote Controlled vs. Autonomous Robots
Computation and Modeling
Daniel Kight and Ian Mercer
CamTech High School, Hickory Ridge High School
2010
The Idea
• To have an autonomous robot as well as a remote controlled robot traverse a maze
• To test which one is more effective
• To explore the functionality of two programming languages: NXT and Physical Etoys
The Obstacles
The idea was simple on paper, but we encountered many obstacles:
• Neither of us had much experience with the two languages, so programming the bots ourselves was a challenge
• The robots did not consistently turn 90°, which was especially frustrating because the maze is nothing but right angles
• We also had to custom-build our robots to complete the maze
The Maze (diagram of the maze layout, with Start and Finish markers)
Autonomous Robot
• This robot is controlled by the NXT programming software
• The program guides the robot by telling it how, when, and where to turn
• NXT "blocks" are used to define how the robot should act and react
• The program is wrapped in "forever loops" that keep it running until the conditions to stop the program are met (see the sketch below)
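The real program is built from graphical NXT-G blocks, not text. As a rough illustration of the "forever loop" structure described above, here is a minimal Python stand-in; the helper names (stop_condition_met, run_behaviour_blocks) are hypothetical placeholders, not anything from the actual project.

```python
def stop_condition_met():
    # In the real program this is the light sensor spotting the finish marker.
    return True

def run_behaviour_blocks():
    # Stands in for the sequence of NXT "blocks" (drive, sense, turn).
    pass

def forever_loop():
    # The whole program sits inside loops like this one, so it keeps
    # reacting to the sensors until the stop condition is met.
    while not stop_condition_met():
        run_behaviour_blocks()
```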
The Algorithms
• A section of the autonomous robot's program operates the third motor, mounted on the side, to rotate the "head"
• The "head" is an ultrasonic sensor which measures the distance between the sensor and the object in front of it
• If the object is too close, the robot stops the A and B motors and looks for an alternative route around the object
• When the robot reaches the other side of the maze, the light sensor detects a white sheet of paper on the ground and stops all motors (a sketch of this logic follows)
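Below is a hedged Python sketch of the decision logic just described. The original is built from NXT-G blocks, so every name here (ULTRASONIC_THRESHOLD_CM, head_scan, set_drive, light_sees_white) is a hypothetical stand-in, and the threshold value is assumed rather than taken from the project.

```python
ULTRASONIC_THRESHOLD_CM = 20  # assumed "too close" distance

def head_scan(direction):
    """Rotate the third (head) motor left/right and return the distance read."""
    # Placeholder reading; the real robot reads the ultrasonic sensor here.
    return 50 if direction == "left" else 10

def set_drive(left_power, right_power):
    print(f"Motors A/B set to {left_power}/{right_power}")

def light_sees_white():
    # Placeholder: the real light sensor reports the white sheet at the finish.
    return False

def step(distance_ahead):
    """One pass of the autonomous logic: stop, scan, and pick a way around."""
    if light_sees_white():
        set_drive(0, 0)          # finish reached: stop all motors
        return "done"
    if distance_ahead < ULTRASONIC_THRESHOLD_CM:
        set_drive(0, 0)          # wall too close: stop motors A and B
        left = head_scan("left")
        right = head_scan("right")
        # Turn toward whichever side has more open space.
        return "turn_left" if left > right else "turn_right"
    return "forward"
```

In practice a routine like step would be called from inside the forever loop shown earlier, once per sensor reading.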
Remote Control Robot
• The remote controlled robot was programmed using Physical Etoys, a Squeak-based environment
• In Physical Etoys, we used a steering wheel drawing to direct the robot's heading
• We also used "scripts", lines of code, to control the robot's speed, turning rate, and all other functions
Remote Controlled Code (annotated screenshot of the Etoys scripts; a rough code sketch follows):
• Fires all scripts
• Displays the distance to an object detected by the ultrasonic sensor
• Controls the speed of the robot from 100 to -100
• Turns either green, yellow, or red depending on the proximity of the sensor to the wall
• Controls the heading of the robot
• Turns the ultrasonic sensor left or right to check for walls
• Makes the robot turn 90° left or right
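The actual remote-control logic lives in Physical Etoys tiles, not text. As a rough Python sketch of what those scripts do, assuming speed runs from -100 to 100 and heading comes from the on-screen steering wheel, the names below (clamp, proximity_colour, drive) and the distance thresholds are hypothetical.

```python
def clamp(value, lo=-100, hi=100):
    return max(lo, min(hi, value))

def proximity_colour(distance_cm):
    """Mimic the indicator that turns green/yellow/red as a wall gets close."""
    if distance_cm > 40:
        return "green"
    if distance_cm > 15:
        return "yellow"
    return "red"

def drive(speed, heading):
    """Convert a speed (-100..100) and a heading offset into left/right powers."""
    speed = clamp(speed)
    left = clamp(speed + heading)
    right = clamp(speed - heading)
    return left, right

# Example: half speed, steering slightly right, wall 12 cm away.
print(drive(50, 20), proximity_colour(12))
```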
Time Trials (results slide comparing the two robots, with the winner marked)
What We Learned
• Two programming languages
• How to engineer and design our robots to be able to complete the maze
• Problem solving skills acquired from troubleshooting the programs
• Determination from repeated testing