
Perception and Perspective in Robotics



Presentation Transcript


  1. Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group

Overview
• Goal: To build robots that can interact with novel objects and participate in novel activities.
• Challenge: Machine perception can be robust for a specific domain such as face detection, but unlike human perception it is not currently adaptable in the face of change (new objects, changed circumstances).
• Approach: Integrate conventional machine perception and machine learning with strategies for opportunistic development –
  • Active perception (sensorimotor ‘toil’)
  • Interpersonal influences (‘theft’)
• This work is implemented on a humanoid robot (Cog, see right). The robot uses the structure of familiar activities to learn about novel elements within those activities, and tracks known elements to learn about the unfamiliar activities in which they are used.

Diagram: Perception – familiar entities (objects, actors, properties, …) linked to Perspective – familiar activities (tasks, games, …). Connecting processes: use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities into and through them. Labels: active probing, segmentation, affordance exploitation (rolling), manipulator detection (robot, human), edge catalog, object detection and recognition.

‘Toil’ Example – Active Segmentation
• Object boundaries are not always easy to detect visually, so robot Cog sweeps its arm through ambiguous areas.
• This can cause object motion, which makes boundaries much easier to find.
• The robot can then learn to recognize and segment the object without further contact.

‘Theft’ Example – Search Activity
The robot observes a human searching for objects, and learns to make a connection between the named target of the search and the object successfully found. The robot has no predefined vocabulary or object set. (A minimal sketch of this binding step follows the slide.)
Human: says “Find”, “Toma”
Human shows cube (robot sees cube); Human: “No”
Human shows car (robot sees car); Human: “No”
Human shows bottle (robot sees bottle); Human: “Yes!”
Human shows cube (robot sees cube); Human: “Say”; Robot: “Cube”
Human shows bottle (robot sees bottle); Human: “Say”; Robot: “Toma”

This is a good basis for adaptable object perception.

This work is funded by DARPA under contract number DABT 63-00-C-10102, and by the Nippon Telegraph and Telephone Corporation under the NTT/MIT collaboration agreement.
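The ‘theft’ exchange above follows a simple protocol: the robot hears an unfamiliar word at the start of a search, watches candidate objects being rejected or accepted, and binds the word to the object that earns a “yes”. The sketch below is only an illustrative reconstruction of that bookkeeping; the class, method names, and string-labelled objects are assumptions, not the robot’s actual code (on Cog the objects would be visual templates produced by active segmentation).

```python
# Minimal sketch of word-object binding during a "find the toma" search.
# Object identities are stood in for by string labels; names are illustrative.

class SearchActivityLearner:
    def __init__(self):
        self.lexicon = {}          # word -> object label

    def run_search(self, target_word, trials):
        """trials: iterable of (object_label, human_feedback) pairs,
        where feedback is "yes" or "no"."""
        for object_label, feedback in trials:
            if feedback == "yes":
                # Bind the unfamiliar word to the object that ended the search.
                self.lexicon[target_word] = object_label
                return object_label
        return None

    def say(self, object_label):
        """Answer a 'Say' query: report the learned name for a seen object."""
        for word, obj in self.lexicon.items():
            if obj == object_label:
                return word
        return None


learner = SearchActivityLearner()
learner.lexicon["cube"] = "cube"     # a name learned in an earlier interaction
learner.run_search("toma", [("cube", "no"),
                            ("car", "no"),
                            ("bottle", "yes")])
print(learner.say("cube"))    # -> "cube"
print(learner.say("bottle"))  # -> "toma"
```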

  2. Goal
• To learn how human-level perception is possible, by trying to build it.
Challenge
• Machine perception can be robust for a specific domain, but is not adaptable like human perception.
Approach
• Integrate conventional machine perception and machine learning with strategies for opportunistic development –
  • Active perception (sensorimotor ‘toil’)
  • Interpersonal influences (‘theft’)
Development
• If a robot is engaged in a known activity, there may be sufficient constraint to identify novel elements within that activity. Similarly, if known elements take part in some unfamiliar activity, tracking them can help characterize that activity.
• Potentially, perceptual development is an open-ended loop of such discoveries (a toy rendering of this loop follows the slide).

Novel Perspective leads to Novel Perception

Kismet – What is done on Kismet
Learning a sorting activity: The human shows the robot where a collection of disparate objects should go, based on some common criterion (color). The robot demonstrates understanding through verbal descriptions and nods towards the target locations.

Cog – What is done on Cog
Learning a search activity: The human shows the robot examples of the search activity by speaking. The robot demonstrates understanding by linking name and object.
Learning through a search activity.
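The Development bullets describe a loop: a familiar activity constrains what its unknown participants must be, and familiar entities carried into an unknown activity reveal that activity’s structure, so each discovery enables the next. The toy below only illustrates that cascade; the string-valued entities, activities, and episode data are assumptions for demonstration, not anything from the poster’s implementation.

```python
# Toy rendering of the open-ended development loop described above.
# Entities and activities are plain strings here; on the robot they would be
# perceptual models learned from experience.

known_entities = {"cube", "car"}
known_activities = {"search"}

# Each episode is (activity_name, participating_entities).
episodes = [
    ("search", {"cube", "bottle"}),   # familiar activity, one novel entity
    ("sorting", {"cube", "car"}),     # novel activity, familiar entities
    ("sorting", {"ball", "car"}),     # now-familiar activity, novel entity
]

for activity, participants in episodes:
    if activity in known_activities:
        # Familiar activity: its structure constrains the unknown
        # participants, so novel entities can be learned within it.
        novel = participants - known_entities
        known_entities |= novel
        print("learned entities", novel, "inside familiar activity", activity)
    elif participants & known_entities:
        # Novel activity containing familiar entities: tracking those
        # entities through it reveals the activity's structure.
        known_activities.add(activity)
        print("learned activity", activity, "by tracking",
              participants & known_entities)
    # Otherwise nothing familiar to build on; wait for a better opportunity.
```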

  3. Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group

Diagram: familiar activities linked to familiar entities (objects, actors, properties, …). Connecting processes: use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities into and through them. Labels: active probing, segmentation, affordance exploitation (rolling), manipulator detection (robot, human), edge catalog, object detection and recognition.

Overview
• Goal: To learn how human-level perception is possible, by trying to build it.
• Challenge: Machine perception can be robust for a specific domain, but is not adaptable like human perception.
• Approach: Integrate conventional machine perception and machine learning with strategies for opportunistic development –
  • Active perception (sensorimotor ‘toil’)
  • Interpersonal influences (‘theft’)
• Experimental Platform: Expressive active vision head ‘Kismet’ and upper-torso humanoid robot ‘Cog’.

An Example – Active Segmentation
• Object boundaries are not always easy to detect visually, so robot Cog sweeps its arm through ambiguous areas.
• This can cause object motion, which makes boundaries much easier to find.
• The robot can then learn to recognize and segment the object without further contact.

Open-ended Development
If the robot is engaged in a known activity there may be sufficient constraint to identify novel elements within that activity. Similarly, if known elements take part in some unfamiliar activity, tracking them can help characterize that activity. Potentially, perceptual development is an open-ended loop of such discoveries.

Kismet – What is done on Kismet
Sorting activity: The human shows the robot where a collection of disparate objects should go, based on some common criterion (color). The robot demonstrates understanding through verbal descriptions and nods towards the target locations. This gives opportunity for much further development… (a sketch of the color criterion follows the slide).

Cog – What is done on Cog
Search activity: The human shows the robot examples of the search activity by speaking. The robot demonstrates understanding through verbal descriptions and nods towards the target locations.
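The sorting activity rests on one perceptual judgment: assign each object to a target location by its dominant color. The sketch below shows one plausible version of that criterion; the hue thresholds, the two bin names, and the use of a mean-color hue are assumptions for illustration, not the poster’s actual method.

```python
import colorsys
import numpy as np

# Minimal sketch of a color criterion for the sorting activity.
# Hue thresholds and the two target locations are illustrative assumptions.

def sort_decision(rgb_patch):
    """rgb_patch: numpy array of shape (H, W, 3), values 0-255."""
    r, g, b = rgb_patch.reshape(-1, 3).mean(axis=0) / 255.0
    hue_deg = colorsys.rgb_to_hsv(r, g, b)[0] * 360.0
    # Warm (red-ish) hues go to one location, cooler hues to the other.
    return "left bin" if hue_deg < 60 or hue_deg >= 300 else "right bin"

# Example: a uniformly red patch should be sent to the left bin.
red_patch = np.zeros((8, 8, 3), dtype=np.uint8)
red_patch[..., 0] = 255
print(sort_decision(red_patch))   # -> "left bin"
```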

  4. Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group

EgoMap: a short-term memory of objects and their locations, so that “out of sight” is not “out of mind” (a minimal sketch follows the slide).

Diagram labels: use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities into and through them; familiar activities; familiar entities (objects, actors, properties, …).

Motivation
Training examples are currently a necessary condition for achieving robust machine perception. Acquiring those examples is properly the role of perception itself, but a human is typically needed to collect them.

Active Perception / Active Segmentation – solving a classic problem
• Object boundaries are not always easy to detect visually (e.g. a yellow car on a yellow table).
• Solution: robot Cog sweeps through the ambiguous area.
• The resulting object motion helps segmentation.
• The robot can learn to recognize and segment the object without further contact.

Opportunities abound and cascade
• The robot can perform “find the toma” style tasks.
• It observes the search activity, then uses the structure of the search activity to learn new properties (object names).
• Searching and sorting.

Sorting task: The human shows the robot where a collection of disparate objects should go, based on some common criterion (color). The robot demonstrates understanding through verbal descriptions and nods towards the target locations.

Search task: The human shows the robot examples of the search activity by speaking. The robot demonstrates understanding through verbal descriptions and nods towards the target locations.
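The EgoMap mentioned above keeps recently seen objects and their locations available after they leave the field of view. The sketch below is a guessed-at minimal version of such a store; the coordinate convention, the decay time, and the method names are assumptions, not the robot’s actual interface.

```python
import time

# Minimal sketch of an EgoMap-style short-term object memory.
# Coordinates, decay time, and method names are illustrative assumptions.

class EgoMap:
    def __init__(self, memory_span=30.0):
        self.memory_span = memory_span      # seconds before an entry fades
        self.entries = {}                   # object id -> (location, timestamp)

    def observe(self, obj_id, location):
        """Record (or refresh) an object's ego-centric location."""
        self.entries[obj_id] = (location, time.time())

    def recall(self, obj_id):
        """Return the remembered location, or None once the memory has faded."""
        if obj_id not in self.entries:
            return None
        location, stamp = self.entries[obj_id]
        if time.time() - stamp > self.memory_span:
            del self.entries[obj_id]        # too old: out of mind after all
            return None
        return location

# The cube leaves the field of view, but its last location is still available,
# so the robot can look back toward it.
egomap = EgoMap()
egomap.observe("cube", (0.4, -0.2, 0.1))    # (x, y, z) in a body-centered frame
print(egomap.recall("cube"))                # -> (0.4, -0.2, 0.1)
```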

  5. Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group

EgoMap: a short-term memory of objects and their locations, so that “out of sight” is not “out of mind”.

Diagram labels: use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities into and through them; familiar activities; familiar entities (objects, actors, properties, …).

Goal: To understand perception by trying to build it.
Approach: Extend machine perception to include opportunistic development.
• The grist: active perception, interpersonal influences.
• The mill: opportunistic development.

Examples
• Object boundaries are not always easy to detect visually (e.g. a yellow car on a yellow table).
• Solution: robot Cog sweeps through the ambiguous area.
• The resulting object motion helps segmentation.
• The robot can learn to recognize and segment the object without further contact.
• Opportunities abound and cascade: the robot can perform “find the toma” style tasks; it observes the search activity, then uses the structure of the search activity to learn new properties (object names); searching and sorting.

  6. (Figure-only slide; no transcript text.)

  7. Opportunism
The standard approach to machine perception is to develop algorithms which, when provided with sufficient training data, can learn to perform some classification or regression task. We can move one step back and develop algorithms which, given physical opportunities, acquire the training data themselves. This requires designing system behavior side-by-side with the perceptual code.

Opportunistic Development
Suppose there is a property P which normally cannot be perceived, but there exists a situation S in which it can be. Then the robot can try to get into situation S, observe P, and relate it to other perceptual variables that are observable (an illustrative sketch follows the slide).
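One concrete reading of the property-P/situation-S idea, consistent with the affordance exploitation (rolling) mentioned elsewhere on the poster: whether an object rolls (P) is only visible while it is being poked (S), so the robot pokes, records the outcome together with appearance features it can measure at any time, and fits a predictor usable later without poking. The sketch below is an assumed illustration under that reading; the elongation feature, the data, and the 1-nearest-neighbour predictor are not the poster’s implementation.

```python
# Illustrative sketch of opportunistic development: a property P (does the
# object roll when poked?) is only observable in situation S (during a poke).
# While in S, the robot pairs P with appearance features that are observable
# at any time, and learns to predict P without needing S again.
# Feature choice (elongation) and data below are assumptions for illustration.

experience = []   # (appearance_feature, observed_property) pairs gathered in S

def poke_and_observe(elongation, rolled):
    """Situation S: a poke reveals the normally hidden property P."""
    experience.append((elongation, rolled))

def predict_rolls(elongation):
    """Outside S: predict P from appearance alone via 1-nearest neighbour."""
    if not experience:
        return None                      # no opportunities exploited yet
    nearest = min(experience, key=lambda e: abs(e[0] - elongation))
    return nearest[1]

# The robot engineers a few opportunities (pokes) for itself...
poke_and_observe(elongation=0.9, rolled=True)    # bottle on its side: rolls
poke_and_observe(elongation=0.2, rolled=False)   # cube: does not roll
# ...and can then judge a new object by sight alone.
print(predict_rolls(elongation=0.8))             # -> True (probably rolls)
```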

  8. Perception and Perspective in Robotics
Paul Fitzpatrick • MIT Computer Science and Artificial Intelligence Laboratory • Humanoid Robotics Group

Degrees of freedom: head (7 DOFs), right arm (6 DOFs), left arm (6 DOFs), torso (3 DOFs), stand (0 DOFs), eyes (3 DOFs), neck (3 DOFs), facial (15 DOFs).

Task-learning diagram: Instructor, Demonstrated Task, Training Data, Task Learning Mechanism, Sequencing Model, Task Modeling, Task Grounding, State Grounding, Perceptual Network, Perceptual System, (Speech); links labeled 1, 2, 3.

Perceptual System: object detection (recognition, localization, contact-free segmentation); object segmentation; poking; affordance exploitation (rolling); manipulator detection (robot, human); edge catalog; EgoMap, a short-term memory of objects and their locations, so that “out of sight” is not “out of mind”.

Linking diagram: familiar activities; familiar entities (objects, actors, properties, …); use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities into and through them.

  9. Understanding perception by trying to build it
Machine perception is very fallible. Robots (and humans) need not just particular perceptual competences, but the tools to forge those competences out of raw physical experiences. Three important tools for extending a robot’s perceptual abilities, whose importance has been recognized individually, are related and brought together. The first is active perception, where the robot employs motor action to reliably perceive properties of the world that it otherwise could not. The second is development, where experience is used to improve perception. The third is interpersonal influences, where the robot’s percepts are guided by those of an external agent. Examples are given for object segmentation, object recognition, and orientation sensitivity; initial work on action understanding is also described.

Diagram labels: Instructor, Demonstrated Task, Training Data, Task Learning Mechanism, Sequencing Model, Task Modeling, Task Grounding, State Grounding, Perceptual Network, Perceptual System.

  10. Active segmentation
• Object boundaries are not always easy to detect visually.
• Solution: Cog sweeps through the ambiguous area.
• The resulting object motion helps segmentation.
• The robot can learn to recognize and segment the object without further contact.
Figure panels: camera image; implicated edges found and grouped; response for each object (a rough sketch of this pipeline follows the slide).
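The panel sequence above (camera image, implicated edges, per-object response) can be approximated with a very simple motion cue: difference the frames just before and just after the arm disturbs the object, take the largest connected region of change as the object mask, and keep an appearance model of that region so the object can be found again without contact. The OpenCV-based sketch below is a rough stand-in for the poster’s method, with thresholds and the hue-histogram recognizer chosen as assumptions.

```python
import cv2
import numpy as np

# Rough sketch of motion-based active segmentation: frame_before and
# frame_during come from just before and just after the arm sweep.
# Thresholds and the histogram recognizer are illustrative assumptions.

def segment_by_motion(frame_before, frame_during, min_area=200):
    """Return a binary mask for the largest region that moved, or None."""
    diff = cv2.absdiff(cv2.cvtColor(frame_before, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(frame_during, cv2.COLOR_BGR2GRAY))
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(motion, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)   # OpenCV 4.x
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    mask = np.zeros(motion.shape, dtype=np.uint8)
    cv2.drawContours(mask, [max(contours, key=cv2.contourArea)], -1, 255, -1)
    return mask

def appearance_model(frame, mask):
    """Hue histogram of the segmented object, kept for later recognition."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], mask, [30], [0, 180])
    return cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

def locate_without_contact(frame, hist):
    """Back-project the stored histogram to get a per-pixel object response."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    return cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
```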
