
1 Laboratory for Perceptual Robotics, Department of Computer Science, University of Massachusetts Amherst
Intent Recognition as a Basis for Imitation Learning in Humanoid Robots
Andrew Fagg, Rod Grupen, Mike Rosenstein, and John Sweeney
UMass Amherst, NEMS 2005

2 Programming a Humanoid is Hard
Complex mechanisms: many DOF, many sensor streams.
Programming by demonstration:
–The demonstrator performs the task; the robot extracts the salient knowledge needed to reproduce it across many instances.
–Notables: [Pook & Ballard 93], [Kuniyoshi et al. 94], [Voyles et al. 99], and [Ijspeert et al. 02].
Imitation:
–Imitation learning augments stochastic exploration for acquiring control knowledge.

3 On Imitation
Assume the demonstrator is performing a goal-directed behavior.
The kinematic properties of the demonstration are not important to us.
–These can be refined later using robot-specific objectives.
We are interested in the work conveyed by the demonstration:
–how the objects are manipulated, in what sequence, etc.

4 Why Intention?
Infer the goal and recognize the scene, and the behavior can be reproduced successfully.
–We have some domain-specific knowledge.
Intention is compatible across morphologies.
Recognize more from less:
–more abstract representations of actions.

5 Demonstration by Teleoperation
Direct access to joint velocities and tactile information.
No difficulty with correspondence.
Difficulties of teleoperation:
–Minimal feedback, communication delays.
–Fatigue.
–Discrepancy between the human's and the robot's observational frames.

6 A Robot Teleoperation Interface
NASA/JSC Telepresence System

7 Video

8 How to infer intent?

9 Mirror Neurons [Rizzolatti et al. 01]
What this suggests: action generation and perception are intimately related. Use the controller as a sensor!

10 Controller as Sensor
A set of controllers is defined by the objects in the scene.
Compare what the demonstrator does to what each controller would do:
–"control projection."
Use domain knowledge to determine when meaningful events occur.
–Pay attention to tactile events!

11 Set of Primitive Controllers
Each controller represents one identified affordance in the scene.
–Domain: objects on a table in front of the robot: cans, beanbags, and targets.
AFFORDANCE: a functional matching between object and actor, described by particular perceptual features. [Gibson 77]

12 The Robot's View
Recognized affordances:
–Type of grasp: top, side.
–DOF constraints: don't care about rotation about Z.
–Every object has a previous place.

13 From Scene to Controllers
Object Models → Affordances → Controllers

14 Extracting a Sequence of Actions
The set of controllers represents hypotheses about intention.
Observable variables: controller errors, force magnitude at the fingertips.
Controller i explains a sequence of observations if:
–each step in the sequence reduces the error,
–the error at the end is small, and
–the sequence finishes with a tactile event.
Use Bayesian inference to infer the most likely controller.
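The three "explains the observations" criteria on slide 14 can be sketched as a simple predicate. This is a minimal illustration, not the authors' implementation; the function name and the error tolerance are hypothetical.

```python
def explains(errors, tactile_event, final_tol=0.05):
    """Does a controller explain an observed segment?  Per the three
    criteria on the slide: the error decreases at each step, the final
    error is small, and the segment ends with a tactile event.
    (The tolerance value is an assumption for illustration.)"""
    decreasing = all(e2 < e1 for e1, e2 in zip(errors, errors[1:]))
    return decreasing and errors[-1] < final_tol and tactile_event

# A segment with steadily shrinking error that ends in contact
# is explained; the same trace without the tactile event is not.
ok = explains([0.9, 0.5, 0.2, 0.01], tactile_event=True)
```

In a full system this predicate would be replaced by the probabilistic version on slide 16, which scores hypotheses continuously rather than accepting or rejecting them outright.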

15 Controller Primitives
Define Cartesian controller i: maps a reference and the resulting error to a joint command. [equation not preserved in transcript]

16 Determining Likelihood
Error is given by the distance between joint commands; compute the likelihood at time t. [equations not preserved in transcript]
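The likelihood equations were lost in transcription. A common choice consistent with the slide's description is a Gaussian likelihood in the distance between the demonstrator's observed joint command and each controller's predicted command, combined with a recursive Bayes update over the controller hypotheses. The sketch below assumes exactly that; `sigma` and all data are hypothetical.

```python
import numpy as np

def likelihood(observed_cmd, predicted_cmd, sigma=1.0):
    """P(observation | controller i): higher when the demonstrator's
    joint command is close to what controller i would command
    (Gaussian in the command distance; sigma is an assumption)."""
    d = np.linalg.norm(observed_cmd - predicted_cmd)
    return np.exp(-d**2 / (2 * sigma**2))

def bayes_update(prior, observed_cmd, predicted_cmds, sigma=1.0):
    """One recursive Bayesian update of the belief over controller
    hypotheses at time t, given a single observed joint command."""
    post = np.array([p * likelihood(observed_cmd, c, sigma)
                     for p, c in zip(prior, predicted_cmds)])
    return post / post.sum()

# Two hypothetical controllers; the observation matches controller 0,
# so belief should shift toward it after one update.
prior = np.array([0.5, 0.5])
obs = np.array([0.1, 0.0])
preds = [np.array([0.1, 0.0]), np.array([0.8, 0.4])]
belief = bayes_update(prior, obs, preds)
```

Iterating this update over the teleoperation trace, and reading out the most likely hypothesis at each tactile event, yields the extracted action sequence of the following slides.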

17 An Extracted Sequence

18 An Extracted Sequence

19 Playing Back the Sequence
The most likely controller at each tactile event is recorded.
The extracted sequence refers to affordances of specific objects.
–The scene can be rearranged: just find the correspondence between objects.
–Simple visual models are used.

20 Further
A more elaborate model of activity:
–look at the controller error and its rate of change.
More elaborate representations of the task:
–hierarchical;
–examples: using a tool, building a structure.
Identify affordances from interaction:
–find visual features that predict affordances;
–categorization.
Relational models describe object interaction:
–how objects can interact depends on their identities.


22 Related Work
–Teleoperation activity recognition [Pook & Ballard 93]
–Block stacking imitation [Kuniyoshi et al. 94]
–Gesture-Based Programming [Voyles et al. 99]
–Movement imitation [Ijspeert et al. 02]

23 Mirror Neurons
–An area of ventral premotor cortex in primates, discovered by Rizzolatti et al. [1996], whose neurons fire both when a monkey performs a grasp and when it observes others perform a grasp.

