Perception and Perspective in Robotics
Paul Fitzpatrick
MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Presentation transcript:

1  Perception and Perspective in Robotics
Paul Fitzpatrick, MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Goal: To build robots that can interact with novel objects and participate in novel activities.

Challenge: Machine perception can be robust for a specific domain such as face detection, but unlike human perception it is not currently adaptable in the face of change (new objects, changed circumstances).

Approach: Integrate conventional machine perception and machine learning with strategies for opportunistic development: active perception (sensorimotor 'toil') and interpersonal influences ('theft'). This work is implemented on the humanoid robot Cog. The robot uses the structure of familiar activities to learn about novel elements within those activities, and tracks known elements to learn about the unfamiliar activities in which they are used.

Overview: Perspective covers familiar activities (tasks, games, …); perception covers familiar entities (objects, actors, properties, …). The constraint of a familiar activity is used to discover an unfamiliar entity within it, and the structure of unfamiliar activities is revealed by tracking familiar entities into and through them.

'Toil' example, Active Segmentation: Object boundaries are not always easy to detect visually, so the robot Cog sweeps its arm through ambiguous areas. This can cause object motion, which makes boundaries much easier to find. The robot can then learn to recognize and segment the object without further contact. This is a good basis for adaptable object perception: active probing, segmentation, edge catalog, object detection and recognition, affordance exploitation (rolling), and manipulator detection (robot, human). A sketch of the motion cue appears after this slide.

'Theft' example, Search Activity: The robot observes a human searching for objects, and learns to make a connection between the named target of the search and the object successfully found. The robot has no predefined vocabulary or object set. (Figure: side-by-side Human/Robot transcript of the search game, with the human saying "Find", "Toma", "No", "No", "Yes!", and "Say" while showing a cube, a car, and a bottle, and the robot responding with names such as "Cube" and "Toma".)

This work is funded by DARPA under contract number DABT 63-00-C-10102, and by the Nippon Telegraph and Telephone Corporation under the NTT/MIT collaboration agreement.
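The "motion makes boundaries easier to find" cue can be illustrated with a minimal frame-differencing sketch in Python. This is an illustration only, not the segmentation algorithm used on Cog; the image size, threshold value, and function names are assumptions.

import numpy as np

def motion_mask(frame_before, frame_after, threshold=25):
    """Boolean mask of pixels that changed between two grayscale frames,
    taken just before and just after the arm sweep disturbs the scene.
    Regions that move together are candidate object pixels."""
    diff = np.abs(frame_after.astype(np.int16) - frame_before.astype(np.int16))
    return diff > threshold

def bounding_box(mask):
    """Crude segmentation: the tightest box around all moving pixels.
    A real system would also separate arm motion from object motion."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return None                       # nothing moved
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    return int(top), int(bottom), int(left), int(right)

# Toy example with synthetic 120x160 8-bit grayscale frames (sizes assumed).
before = np.zeros((120, 160), dtype=np.uint8)
after = before.copy()
after[40:70, 60:100] = 200                # the poked "object" shows up displaced
print(bounding_box(motion_mask(before, after)))   # e.g. (40, 69, 60, 99)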

2  Novel Perspective leads to Novel Perception

What is done on Kismet, learning a sorting activity: The human shows the robot where a collection of disparate objects should go, based on some common criterion (color). The robot demonstrates understanding through verbal descriptions and nods towards the target locations.

What is done on Cog, learning a search activity: The human shows the robot examples of the search activity by speaking. The robot demonstrates understanding by linking name and object (a minimal sketch of this association step follows this slide).

Goal: To learn how human-level perception is possible, by trying to build it.

Challenge: Machine perception can be robust for a specific domain, but is not adaptable like human perception.

Approach: Integrate conventional machine perception and machine learning with strategies for opportunistic development: active perception (sensorimotor 'toil') and interpersonal influences ('theft').

Development: If a robot is engaged in a known activity there may be sufficient constraint to identify novel elements within that activity. Similarly, if known elements take part in some unfamiliar activity, tracking those elements can help characterize that activity. Potentially, perceptual development is an open-ended loop of such discoveries.
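The "linking name and object" step can be sketched as a simple association between words heard during a search episode and the object the search ends on. The class name, the fixed set of activity words, and the dictionary representation are illustrative assumptions, not the mechanism used on Cog.

class SearchActivityLearner:
    """Sketch of name-object linking learned by watching search episodes."""

    def __init__(self):
        self.lexicon = {}          # spoken word -> identifier of a visual object model

    def observe_search(self, spoken_words, found_object_id):
        """After one observed search, link any word that is not yet grounded
        (and is not a known activity word like 'find'/'no'/'yes') to the
        object the search successfully ended on."""
        activity_words = {"find", "no", "yes", "say"}
        for word in spoken_words:
            if word not in activity_words and word not in self.lexicon:
                self.lexicon[word] = found_object_id

    def name_of(self, object_id):
        """Inverse lookup: produce a name for an object the robot is seeing."""
        for word, obj in self.lexicon.items():
            if obj == object_id:
                return word
        return None

learner = SearchActivityLearner()
# Human: "find the toma" ... robot shown car (no), bottle (no), cube (yes!)
learner.observe_search(["find", "toma", "no", "no", "yes"], found_object_id="cube-model-3")
print(learner.name_of("cube-model-3"))   # -> toma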

3  Perception and Perspective in Robotics
Paul Fitzpatrick, MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Goal: To learn how human-level perception is possible, by trying to build it.

Challenge: Machine perception can be robust for a specific domain, but is not adaptable like human perception.

Approach: Integrate conventional machine perception and machine learning with strategies for opportunistic development: active perception (sensorimotor 'toil') and interpersonal influences ('theft').

Experimental Platform: The expressive active-vision head 'Kismet' and the upper-torso humanoid robot 'Cog'.

Overview: Use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities (objects, actors, properties, …) into and through them.

An Example, Active Segmentation (done on Cog): Object boundaries are not always easy to detect visually, so the robot Cog sweeps its arm through ambiguous areas. This can cause object motion, which makes boundaries much easier to find. The robot can then learn to recognize and segment the object without further contact: active probing, segmentation, edge catalog, object detection and recognition, affordance exploitation (rolling), manipulator detection (robot, human). This gives the opportunity for much further development.

Open-ended Development: In the sorting activity (done on Kismet), the human shows the robot where a collection of disparate objects should go, based on some common criterion (color), and the robot demonstrates understanding through verbal descriptions and nods towards target locations. In the search activity (done on Cog), the human shows the robot examples of the search activity by speaking, and the robot demonstrates understanding by linking name and object. If the robot is engaged in a known activity there may be sufficient constraint to identify novel elements within that activity. Similarly, if known elements take part in some unfamiliar activity, tracking those elements can help characterize that activity. Potentially, perceptual development is an open-ended loop of such discoveries.

4  Perception and Perspective in Robotics
Paul Fitzpatrick, MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Motivation: Training examples are currently a necessary condition for achieving robust machine perception. Acquiring those examples is properly the role of perception itself, but a human is typically needed to collect them.

Active Perception, Active Segmentation (solving a classic problem): Object boundaries are not always easy to detect visually (e.g. a yellow car on a yellow table). Solution: the robot Cog sweeps its arm through the ambiguous area. The resulting object motion helps segmentation, and the robot can then learn to recognize and segment the object without further contact.

Overview: Use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities (objects, actors, properties, …) into and through them.

Opportunities abound and cascade: The robot can perform "find the toma" style tasks. It observes the search activity, then uses the structure of that activity to learn new properties (object names).

Searching and sorting rely on the EgoMap, a short-term memory of objects and their locations, so that "out of sight" is not "out of mind" (a minimal data-structure sketch follows this slide).

Sorting task: The human shows the robot where a collection of disparate objects should go, based on some common criterion (color). The robot demonstrates understanding through verbal descriptions and nods towards target locations.

Search task: The human shows the robot examples of the search activity by speaking. The robot demonstrates understanding by linking name and object.
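The EgoMap idea can be sketched as a small keyed store of last-seen egocentric positions with a decay time. The class name, field names, and 30-second memory span are assumptions for illustration, not the implementation used on the robots.

import time

class EgoMap:
    """Sketch of a short-term object memory: remembers where objects were
    last seen in an egocentric frame, so 'out of sight' is not 'out of mind'."""

    def __init__(self, memory_seconds=30.0):
        self.memory_seconds = memory_seconds
        self.entries = {}            # object id -> (position, timestamp)

    def update(self, object_id, position):
        """Record the most recent egocentric position of an object."""
        self.entries[object_id] = (position, time.time())

    def recall(self, object_id):
        """Return the remembered position, or None once it has faded."""
        if object_id not in self.entries:
            return None
        position, seen_at = self.entries[object_id]
        if time.time() - seen_at > self.memory_seconds:
            del self.entries[object_id]   # too old: forget it
            return None
        return position

egomap = EgoMap()
egomap.update("cube", (0.4, -0.1, 0.2))     # seen to the robot's right
egomap.update("bottle", (0.3, 0.3, 0.2))    # seen to the robot's left
# Even after the robot looks away from the cube, its location can be recalled:
print(egomap.recall("cube"))                # -> (0.4, -0.1, 0.2)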

5  Perception and Perspective in Robotics
Paul Fitzpatrick, MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Goal: To understand perception by trying to build it.

Approach: Extend machine perception to include opportunistic development. The grist: active perception and interpersonal influences. The mill: opportunistic development.

Examples: Object boundaries are not always easy to detect visually (e.g. a yellow car on a yellow table). Solution: the robot Cog sweeps its arm through the ambiguous area. The resulting object motion helps segmentation, and the robot can then learn to recognize and segment the object without further contact.

Overview: Use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities (objects, actors, properties, …) into and through them.

Opportunities abound and cascade: The robot can perform "find the toma" style tasks. It observes the search activity, then uses the structure of that activity to learn new properties (object names). Searching and sorting rely on the EgoMap, a short-term memory of objects and their locations, so that "out of sight" is not "out of mind".


7  Opportunism
The standard approach to machine perception is to develop algorithms which, when provided with sufficient training data, can learn to perform some classification or regression task. We can move one step back and develop algorithms which, given physical opportunities, acquire the training data themselves. This requires designing system behavior side-by-side with the perceptual code.

Opportunistic Development: Suppose there is a property P which normally cannot be perceived, but that there exists a situation S in which it can be. Then the robot can try to get into situation S, observe P, and relate it to other perceptual variables that are observable more generally (a short sketch of this loop follows this slide).
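The opportunistic-development loop can be sketched abstractly: repeatedly enter the revealing situation S, record the hidden property P alongside features that are always observable, and fit a simple predictor so that P can later be estimated outside S. The function names, episode count, and nearest-neighbour rule below are illustrative assumptions, not the mechanism used on the robots.

import random

def develop(enter_situation_S, observe_P, observe_features, episodes=20):
    """Collect (features, P) training pairs by repeatedly entering situation S."""
    training = []
    for _ in range(episodes):
        enter_situation_S()                  # e.g. poke the object so it moves
        p = observe_P()                      # e.g. the object's true boundary
        f = observe_features()               # e.g. passively observable appearance
        training.append((f, p))
    return training

def predict_P(features, training):
    """Outside S: estimate P with a 1-nearest-neighbour lookup over the pairs."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, best_p = min(training, key=lambda pair: dist(pair[0], features))
    return best_p

# Toy usage with stand-in sensors: features are 2-D, P is a scalar.
training = develop(
    enter_situation_S=lambda: None,
    observe_P=lambda: 1.0,
    observe_features=lambda: (random.random(), random.random()),
    episodes=5,
)
print(predict_P((0.5, 0.5), training))   # -> 1.0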

8  Perception and Perspective in Robotics
Paul Fitzpatrick, MIT Computer Science and Artificial Intelligence Laboratory, Humanoid Robotics Group

Overview: Use the constraint of a familiar activity to discover an unfamiliar entity used within it; reveal the structure of unfamiliar activities by tracking familiar entities (objects, actors, properties, …) into and through them.

Development cascade from poking: object segmentation, edge catalog, object detection (recognition, localization, contact-free segmentation), affordance exploitation (rolling), manipulator detection (robot, human).

EgoMap: a short-term memory of objects and their locations, so that "out of sight" is not "out of mind".

(Diagram: task-learning architecture with the components Instructor, Demonstrated Task, Perceptual System, Perceptual Network, Task Grounding, State Grounding, Task Modeling, Sequencing Model, Task Learning Mechanism, and Training Data.)

(Diagram: degrees of freedom of the robot platforms: head 7, neck 3, eyes 3, facial 15, torso 3, left arm 6, right arm 6, stand 0, plus speech.)

9  Understanding perception by trying to build it
Machine perception is very fallible. Robots (and humans) need not just particular perceptual competences, but the tools to forge those competences out of raw physical experience. Three important tools for extending a robot's perceptual abilities, whose importance has been recognized individually, are related and brought together here. The first is active perception, where the robot employs motor action to reliably perceive properties of the world that it otherwise could not. The second is development, where experience is used to improve perception. The third is interpersonal influences, where the robot's percepts are guided by those of an external agent. Examples are given for object segmentation, object recognition, and orientation sensitivity; initial work on action understanding is also described.

(Diagram: task-learning architecture with the components Instructor, Demonstrated Task, Perceptual System, Perceptual Network, Task Grounding, State Grounding, Task Modeling, Sequencing Model, Task Learning Mechanism, and Training Data.)

10  Object boundaries are not always easy to detect visually. Solution: Cog sweeps its arm through the ambiguous area. The resulting object motion helps segmentation, and the robot can then learn to recognize and segment the object without further contact. (Figure panels: camera image; response for each object; implicated edges found and grouped. A sketch of the matching step follows this slide.)
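The "response for each object" matching step can be illustrated with a toy recognizer that summarizes each segmented patch by a histogram of its edge orientations and compares a new patch against a stored catalog. This is a simplified stand-in with assumed bin counts and names, not the edge catalog or recognizer actually built on Cog.

import numpy as np

def orientation_histogram(gray_patch, bins=8):
    """Summarize a segmented object patch by the distribution of its edge
    orientations (simple finite-difference gradients, 8 bins assumed)."""
    gy, gx = np.gradient(gray_patch.astype(float))
    magnitude = np.hypot(gx, gy)
    angle = np.arctan2(gy, gx) % np.pi           # orientation, ignoring sign
    hist, _ = np.histogram(angle, bins=bins, range=(0.0, np.pi),
                           weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist

def best_match(patch, catalog):
    """Return the catalog entry whose histogram is closest to the patch's."""
    h = orientation_histogram(patch)
    return min(catalog, key=lambda name: np.abs(catalog[name] - h).sum())

# Toy catalog: a 'cube' patch with a vertical edge, a 'bottle' patch with a
# horizontal edge (synthetic 32x32 grayscale patches, shapes assumed).
cube = np.zeros((32, 32), dtype=np.uint8)
cube[:, 16:] = 255                 # vertical edge down the middle
bottle = cube.T.copy()             # horizontal edge across the middle
catalog = {"cube": orientation_histogram(cube),
           "bottle": orientation_histogram(bottle)}
print(best_match(cube, catalog))   # -> cube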


