First-person Teleoperation of Humanoid Robots


1 First-person Teleoperation of Humanoid Robots
Lars Fritsche, Felix Unverzagt, Jan Peters, Roberto Calandra Presented by Antong Liu

2 Motivation for Teleoperation
Research: encoding joint movements for learning
Military and industry: unstructured tasks in inaccessible environments
Medicine: long-distance medicine, minimally invasive surgery
Example environments: radioactive sites, disaster sites, deep oceans

3 Tools for Teleoperation
Direct manipulation: joystick control of individual joints
Joint tracking: motion tracking through the Microsoft Kinect, or full-body tracking suits

4 Limitation of Current Methods
Direct visual contact between operator and robot required

5 Proposed Method Oculus Rift virtual reality goggles
SensorGlove haptic feedback gloves

6 Experiment Setup Blue arrows show flow of information from operator to= robot, and red arrows show feedback from robot to operator

7 Components: Microsoft Kinect
Collects motion data of the body of the operator
Uses a camera with a depth sensor to generate a 3D skeleton
Low price and does not require special cameras or markers
Prone to noise and cannot handle occlusion
Update frequency limited to 30 Hz
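
As a rough illustration of what the tracker supplies to the rest of the pipeline, a Kinect skeleton sample can be thought of as a timestamped set of 3D joint positions arriving at about 30 Hz. The structure, joint names, and coordinates below are illustrative assumptions, not the actual Kinect SDK types:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class SkeletonFrame:
    """One skeleton sample (illustrative structure, not the real Kinect SDK type)."""
    timestamp: float                                # seconds since start of session
    joints: Dict[str, Tuple[float, float, float]]   # joint name -> (x, y, z) in metres

# Example frame with made-up coordinates; frames arrive at roughly 30 Hz.
frame = SkeletonFrame(
    timestamp=0.0,
    joints={
        "shoulder_right": (0.25, 1.40, 2.10),
        "elbow_right":    (0.30, 1.15, 2.05),
        "wrist_right":    (0.32, 0.90, 2.00),
    },
)
```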

8 Components: Oculus Rift
Head-mounted virtual reality display
High-resolution displays, split vertically for each eye
Gyroscopic tracking of head movement at 1 kHz

9 Components: SensorGlove
Haptic feedback sensor gloves
Track finger motion at 350 Hz

10 Components: iCub Robot
104 cm tall, 24 kg humanoid robot
53 total degrees of freedom, 30 used in this experiment:
  3 in the torso
  5 in the head
  4 in each arm
  7 in each hand
Tactile sensors in the fingertips
640x480 camera in each eye
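
As a quick sanity check, the per-part counts above do add up to the 30 teleoperated joints (a small illustrative snippet, not code from the work itself):

```python
# Controlled DoF per body part, as listed above (30 of the iCub's 53 DoF).
dof = {"torso": 3, "head": 5, "left_arm": 4, "right_arm": 4,
       "left_hand": 7, "right_hand": 7}
assert sum(dof.values()) == 30
```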

11 Components: Controller
7 DoF arms: joint angles calculated from the positions of the operator's shoulder, elbow, and wrist (see the sketch below)
3 DoF torso: derived from the spine, hip, and shoulders, under the assumption that the operator is upright (spine aligned with the gravity vector)
5 DoF head: controlled by the Oculus Rift orientation; movement exceeding the iCub's limits is outsourced to the eye DoF
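
As a minimal sketch of how an arm angle can be recovered from tracked positions, the snippet below computes the elbow flexion angle from the shoulder, elbow, and wrist points. The function name and coordinates are assumptions for illustration; the actual controller solves the full 7 DoF arm pose rather than a single angle:

```python
import numpy as np

def elbow_flexion(shoulder, elbow, wrist):
    """Angle (rad) between upper arm and forearm, from 3D joint positions."""
    upper = np.asarray(shoulder, dtype=float) - np.asarray(elbow, dtype=float)
    fore = np.asarray(wrist, dtype=float) - np.asarray(elbow, dtype=float)
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

# Made-up Kinect coordinates (metres); prints roughly 173 degrees for a nearly straight arm.
angle = elbow_flexion((0.25, 1.40, 2.10), (0.30, 1.15, 2.05), (0.32, 0.90, 2.00))
print(np.degrees(angle))
```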

12 Components: Controller
7 DoF hands: controlled by finger bending sensed from the glove
Thumb, index, and middle fingers are independent; ring and pinky are controlled by a single DoF
SensorGlove returns haptic feedback proportional to the largest pressure on the iCub's 12 tactile sensors (sketched below)
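
A hedged sketch of that mapping: three independent finger groups plus one shared ring/pinky command, and a feedback intensity driven by the largest fingertip pressure. The value ranges, names, and the 255 pressure scale are assumptions, not the SensorGlove or iCub APIs:

```python
def glove_to_finger_commands(bend):
    """Map normalised bend values (0..1) from the glove to commands for the
    controlled finger groups: thumb, index and middle are independent,
    ring and pinky share a single command."""
    return {
        "thumb": bend["thumb"],
        "index": bend["index"],
        "middle": bend["middle"],
        "ring_pinky": 0.5 * (bend["ring"] + bend["pinky"]),
    }

def haptic_intensity(fingertip_pressures, max_pressure=255.0):
    """Feedback proportional to the largest of the 12 tactile readings."""
    return min(1.0, max(fingertip_pressures) / max_pressure)

print(glove_to_finger_commands(
    {"thumb": 0.2, "index": 0.8, "middle": 0.7, "ring": 0.5, "pinky": 0.3}))
print(haptic_intensity([0, 10, 40, 120]))
```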

13 Components: Controller
The iCub head alone has a ±35° line of sight
By controlling the eyes as well, the line of sight improves to ±65° (sketched below)
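
One way to picture the idea, as a rough sketch: the neck takes as much of the commanded gaze angle as its range allows and the remainder goes to the eyes. The ±35° neck limit and ±65° combined range come from the slide; the ±30° eye limit and the splitting function itself are assumptions:

```python
def split_gaze(target_deg, neck_limit=35.0, eye_limit=30.0):
    """Give the neck as much of the desired gaze angle as its range allows
    and outsource the remainder to the eyes."""
    neck = max(-neck_limit, min(neck_limit, target_deg))
    eyes = max(-eye_limit, min(eye_limit, target_deg - neck))
    return neck, eyes

print(split_gaze(50.0))   # (35.0, 15.0): neck saturates, eyes cover the rest
print(split_gaze(-20.0))  # (-20.0, 0.0): neck alone is enough
```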

14 Control Signals Safety routines implemented to prevent damage to robot
Jerking at maximal joint values is prevented
Control is suspended when the Kinect detects more than one person in the scene
A maximum joint step size limits abrupt changes in position (see the sketch below)
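
A minimal sketch of the last safeguard, limiting how far a joint command may move per control cycle; the 0.02 rad step limit is an illustrative assumption:

```python
def limit_step(previous, target, max_step=0.02):
    """Clamp the commanded change to at most max_step (rad) per control cycle,
    so an abrupt jump in the tracked pose cannot jerk the robot."""
    delta = max(-max_step, min(max_step, target - previous))
    return previous + delta

# A sudden 0.5 rad jump in the raw command is spread over many cycles:
cmd = 0.0
for _ in range(5):
    cmd = limit_step(cmd, 0.5)
print(cmd)  # roughly 0.1 rad after five cycles
```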

15 Control Signals Signals from all components needed filtering
Butterworth filter used: maximally flat in the passband
Maximum sample rate of 100 Hz (hardware limit)
Cutoff frequencies: 1.5 Hz for the Kinect; 5 Hz for the Oculus Rift and SensorGlove
Tradeoff between delay and signal smoothness (see the sketch below)
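
A sketch of that filtering step using SciPy's Butterworth design at the 100 Hz sample rate, with the cutoff frequencies quoted above; the filter order and the synthetic test signal are assumptions:

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 100.0  # Hz, the 100 Hz hardware limit quoted above

def lowpass(signal, cutoff_hz, order=2):
    """Causal Butterworth low-pass filter; the order is an assumption, not from the slide."""
    b, a = butter(order, cutoff_hz / (0.5 * FS))  # cutoff normalised to Nyquist
    return lfilter(b, a, signal)

t = np.arange(0.0, 2.0, 1.0 / FS)
noisy = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)
smoothed_kinect = lowpass(noisy, cutoff_hz=1.5)  # Kinect channel
smoothed_glove = lowpass(noisy, cutoff_hz=5.0)   # Oculus Rift / SensorGlove channels
```

A causal filter like this is what produces the delays discussed on the next slide: the lower the cutoff (1.5 Hz for the Kinect), the smoother the output but the larger the lag.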

16 Control Latency Latency is necessary to ensure safe operation
High delay can be disorienting to the operator
After filtering: Kinect delay 600 ms, iCub operation delay 200 ms, Oculus Rift and SensorGlove delay 100 ms
Delay is proportional to how noisy the raw data is

17 Experiments: Mimic

18 Experiments: Pick and place

19 Conclusion
Using an Oculus VR device removes the need for the operator to have visual contact with the robot
The robot is safe enough to interact directly with humans
First-person controls are intuitive for human operators
Latency may be a hindering factor
The Kinect limits which poses can be tracked because it cannot handle occlusion

