Manipulation in Human Environments
Aaron Edsinger & Charlie Kemp, Humanoid Robotics Group, MIT CSAIL
Domo
- 29 DOF
- 6 DOF Series Elastic Actuator (SEA) arms
- 4 DOF SEA hands
- 2 DOF SEA neck
- Active vision head: stereo cameras, gyroscope
- Senses joint angle + torque
- 15-node Linux cluster
Manipulation in Human Environments
- Human environments are designed to match our cognitive and physical abilities
- Work with everyday objects
- Collaborate with people
- Perform useful tasks
Applications
- Aging in place
- Cooperative manufacturing
- Household chores
Three Themes
- Let the body do the thinking
- Collaborative manipulation
- Task-relevant features
Let the Body Do the Thinking
Design:
- Passive compliance
- Force control
- Human morphology
Let the Body Do the Thinking
Compensatory behaviors:
- Reduce uncertainty
- Modulate arm stiffness
- Aid perception (motion, visibility)
- Test assumptions (explore)
Let the Body Do the Thinking
Collaborative Manipulation
Complementary actions:
- A person can simplify perception and action for the robot
- The robot can provide intuitive cues for the human
- Requires matching to our social interface
Collaborative Manipulation
Social amplification
Collaborative Manipulation
- A third arm: hold a flashlight, fixture a part
- Extend our physical abilities: carry groceries, open a jar
- Expand our workspace: place dishes in a cabinet, hand a tool, reach a shelf
Task Relevant Features
- What is important? What is irrelevant?
- Distinct from object detection/recognition.
Structure in Human Environments
Donald Norman, The Design of Everyday Things
Structure in Human Environments
- Human environments are constrained to match our cognitive and physical abilities
- Sense from above
- Flat surfaces
- Objects for human hands
- Objects for use by humans
Why are tool tips common?
- A single, localized interface to the world
- Physical isolation helps avoid irrelevant contact
- Helps perception
- Helps control
Tool Tip Detection
- Visual + motor detection method
- Kinematic estimate (see the sketch below)
- Visual model
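The talk doesn't spell out the estimator, so here is a minimal sketch of one standard way to realize the kinematic estimate: because the tool is rigidly grasped, its tip is a fixed 3D point in the hand frame, and pixel detections gathered while the arm moves can be triangulated against the forward-kinematics pose of the hand. The function name and observation format below are hypothetical, not from the talk.

```python
import numpy as np

def estimate_tip_in_hand_frame(observations, K):
    """Linear (DLT-style) triangulation of a rigidly grasped tool tip.

    observations: list of ((u, v), T_cam_from_hand) pairs, where (u, v) is
    a 2D pixel detection of the tip and T_cam_from_hand is the 4x4 pose of
    the hand frame in the camera frame, taken from forward kinematics.
    K: 3x3 camera intrinsics matrix.
    """
    A = []
    for (u, v), T in observations:
        P = K @ T[:3, :]            # 3x4 projection: hand frame -> pixels
        A.append(u * P[2] - P[0])   # u * (row3 . p) = row1 . p
        A.append(v * P[2] - P[1])   # v * (row3 . p) = row2 . p
    A = np.asarray(A)
    # Homogeneous least squares: the right singular vector of the smallest
    # singular value gives the tip in homogeneous hand-frame coordinates.
    _, _, Vt = np.linalg.svd(A)
    p = Vt[-1]
    return p[:3] / p[3]             # 3D tip position in the hand frame
```

Each arm pose contributes two rows to A, so many weak, noisy detections combine into one well-constrained estimate.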
Mean Pixel Error for Automatic and Hand-Labeled Tip Detection [results plot]
Mean Pixel Error for Hand-Labeled, Multi-Scale Detector, and Point Detector [results plot]
Model-Free Insertion
- Active tip perception
- Arm stiffness modulation (see the sketch below)
- Human interaction
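The slides don't give Domo's controller, but a minimal sketch of what joint-space stiffness modulation could look like on a torque-controlled (SEA) arm follows; the gain values, names, and contact flag are illustrative assumptions, not the talk's implementation.

```python
import numpy as np

# Illustrative per-joint stiffness values (N*m/rad), not Domo's actual gains.
STIFF = np.array([40.0, 40.0, 30.0, 20.0, 10.0, 10.0])
SOFT = 0.25 * STIFF   # compliant mode used near contact

def impedance_torque(q, qd, q_des, k, d):
    """Virtual spring-damper imposed in software: tau = K (q_des - q) - D qd.

    Series elastic actuators track commanded torque, which is what makes a
    purely software-defined, runtime-adjustable stiffness possible.
    """
    return k * (q_des - q) - d * qd

def control_step(q, qd, q_des, in_contact):
    # Soften the arm during insertion so the tip complies with the hole's
    # constraints instead of fighting them.
    k = SOFT if in_contact else STIFF
    d = 2.0 * np.sqrt(k)   # roughly critical damping, assuming unit inertia
    return impedance_torque(q, qd, q_des, k, d)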
Other Examples
- Circular openings
- Handles
- Contact surfaces
- Gravity alignment
Future: Generalize What You've Learned
Across objects:
- Perceptually map tasks across objects
- Key features map to key features
Across manipulators:
- Motor equivalence
- Manipulator details may be irrelevant
RSS 2006 Workshop: Manipulation for Human Environments
Robotics: Science and Systems, University of Pennsylvania, August 19, 2006
manipulation.csail.mit.edu/rss06
Summary
- Importance of task-relevant features
- Example of the tool tip:
  - Large set of hand tools
  - Robust detection (visual + motor)
  - Kinematic estimate
  - Visual model
In Progress
Perform a variety of tasks:
- Insertion
- Pouring
- Brushing
Learning from Demonstration
The Detector Responds To
- Fast motion
- Convexity
Video from eye camera → motion-weighted edge map → multi-scale histogram (medial axis / Hough transform for circles) → local maxima (sketched below)
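As a rough sketch of this pipeline (the parameter values and scoring are my own guesses, and the circle Hough transform stands in for the multi-scale medial-axis histogram):

```python
import cv2
import numpy as np

def tip_candidates(prev_gray, gray, n_best=3):
    """One frame of the pipeline: motion cue -> motion-weighted edges ->
    circle votes -> strongest local maxima. Inputs are consecutive 8-bit
    grayscale frames from the eye camera."""
    # Fast motion: frame differencing as a cheap motion cue.
    motion = cv2.absdiff(gray, prev_gray).astype(np.float32) / 255.0
    # Edge map weighted by motion, so only moving contours contribute.
    edges = cv2.Canny(gray, 50, 150).astype(np.float32) / 255.0
    weighted = cv2.GaussianBlur(edges * motion, (9, 9), 0)
    # Circle Hough transform over a range of radii approximates the
    # multi-scale convexity (medial-axis-like) detector.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=2, minDist=20,
                               param1=100, param2=25, minRadius=4,
                               maxRadius=40)
    if circles is None:
        return []
    # Score each circle center by its motion-weighted edge support and
    # keep the strongest candidates (the "local maxima" stage).
    h, w = weighted.shape
    scored = []
    for x, y, r in circles[0]:
        cy, cx = min(int(y), h - 1), min(int(x), w - 1)
        scored.append((float(weighted[cy, cx]),
                       (float(x), float(y), float(r))))
    scored.sort(reverse=True)
    return [c for _, c in scored[:n_best]]
```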
Defining Characteristics
Geometric:
- Isolated
- Distal
- Localized
- Convex
Cultural/design:
- Far from the natural grasp location
- Long distance relative to hand size
Other Task Relevant Features?
Detecting the Tip
Include Scale and Convexity
Distinct Perceptual Problem
- Not object recognition
- How should the object be used?
- Distinct methods and features
Use the Hand's Frame
- Combine weak evidence
- Rigidly grasped (see the sketch below)
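One plausible way to combine that weak evidence, sketched below under the assumption of a known hand pose per frame: because a rigidly grasped tool's tip is stationary in the hand frame, per-frame detections can be voted into a hand-frame grid, where the true tip accumulates while false positives spread out. The class and its parameters are hypothetical.

```python
import numpy as np

class HandFrameAccumulator:
    """Vote noisy per-frame tip detections into the hand's coordinate frame."""

    def __init__(self, extent=0.4, resolution=0.01):
        n = int(extent / resolution)
        self.votes = np.zeros((n, n, n))    # 3D histogram over the hand frame
        self.extent, self.res = extent, resolution

    def add(self, p_cam, T_hand_from_cam, weight=1.0):
        # Express the camera-frame detection in the (moving) hand frame;
        # the true tip lands in the same cell every frame.
        p = T_hand_from_cam[:3, :3] @ p_cam + T_hand_from_cam[:3, 3]
        idx = np.floor((p + self.extent / 2.0) / self.res).astype(int)
        if np.all(idx >= 0) and np.all(idx < self.votes.shape[0]):
            self.votes[tuple(idx)] += weight

    def best_tip(self):
        # The most-voted cell is the tip estimate, in hand-frame meters.
        idx = np.unravel_index(np.argmax(self.votes), self.votes.shape)
        return np.asarray(idx) * self.res - self.extent / 2.0
```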
Acquire a Visual Model