Human-Robot Interaction


Human-Robot Interaction
15-494 Cognitive Robotics
David S. Touretzky & Ethan Tira-Thompson
Carnegie Mellon, Spring 2010

Human-Robot Interaction Topics
Awareness of humans: person tracking; face detection and gaze tracking; face recognition; the human's "perspective."
Gesture recognition: pointing; hand motions.
Social interaction: gaze as an indicator of attention; facial expressions (e.g., Kismet); sound effects (R2D2, AIBO) vs. speech; use of displays (the Looking Glass project).

Awareness 1: Person Tracking
Be aware of human presence. Follow a human (robot assistant), avoid humans, or interact with them (museum tour guide robots). Typical cues: skin color in the camera image, leg detection with a rangefinder, etc.
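As a rough illustration of the skin-color cue, here is a minimal C++ sketch using OpenCV (an assumption for illustration; this is not the Tekkotsu pipeline, and the HSV thresholds are guesses that would need tuning for the camera and lighting):

// Sketch: find candidate person regions by skin-color thresholding.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Rect> findSkinRegions(const cv::Mat& bgrFrame) {
    cv::Mat hsv, mask;
    cv::cvtColor(bgrFrame, hsv, cv::COLOR_BGR2HSV);

    // Rough skin-tone range in HSV (illustrative values).
    cv::inRange(hsv, cv::Scalar(0, 40, 60), cv::Scalar(25, 180, 255), mask);

    // Remove speckle noise before looking for connected regions.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Rect> regions;
    for (const auto& c : contours)
        if (cv::contourArea(c) > 500)   // ignore tiny blobs
            regions.push_back(cv::boundingRect(c));
    return regions;
}

The resulting bounding boxes can then drive a follow or avoid behavior; tracking regions across frames is left out of the sketch.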

Awareness 2: Face Detection
Rowley, Baluja, and Kanade (1998) used a neural net. (movie)

OpenCV Face Detector
Ilya Matiach ported the OpenCV face detector to Tekkotsu in 2009 to run on the Chiara. For more information: http://opencv.willowgarage.com/wiki/FaceDetection
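For reference, the detector behind that OpenCV demo is a Haar cascade. The sketch below shows the idea using OpenCV's C++ API; it is not the Tekkotsu port itself, and the cascade file path is an assumption (the XML files ship in OpenCV's data directory):

// Sketch: Haar-cascade face detection with OpenCV.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<cv::Rect> detectFaces(const cv::Mat& frame) {
    static cv::CascadeClassifier cascade("haarcascade_frontalface_default.xml");

    cv::Mat gray;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    cv::equalizeHist(gray, gray);   // improve contrast for the detector

    std::vector<cv::Rect> faces;
    // scale factor 1.1, 3 neighbors required, minimum face size 30x30 pixels
    cascade.detectMultiScale(gray, faces, 1.1, 3, 0, cv::Size(30, 30));
    return faces;
}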

Awareness 3: Gaze Tracking
What is the human looking at? Gaze has high social significance among primates. For robots, gaze is hard to measure at a distance.

Awareness 4: Face Recognition
Which human is this? There is now a lot of work in this area for security applications. Sony's AIBO and QRIO robots had face recognition modules. Digital cameras now do face recognition and smile detection.

Awareness 5: The Human's Perspective
What can the human see from his present location? Trafton et al.: "Give me the wrench." The robot sees two wrenches, but knows that the human can only see one.

Gesture Recognition
Pointing: point at objects to designate them to the robot; point in a direction, or toward a goal location.
Hand gestures: "Come here" / "Come closer" / "Back off"; "Stop"; "Put that there."

Social Interaction
Do robots need heads? What are heads used for?
Indicating focus of attention by gaze direction.
Gestures such as nodding agreement.
Anthropomorphism makes robots more acceptable to humans; headless robots are creepy.

Facial Expressions: Kismet
Cynthia Breazeal, ca. 1999-2000
http://www.ai.mit.edu/projects/sociable/facial-expression.html

Kismet Social Interactions
(see movies)

Communicating with Humans
Should robots talk? R2D2 used sound effects to convey emotion; AIBO and Kismet do likewise.
Use of canned messages: "Excuse me, you're blocking my path." Roboceptionist: "How may I help you?"
Will people expect to be able to talk back? Voice recognition gets harder when the robot is noisy.
Use of lights to communicate status and mood.

Speech in Tekkotsu

#include "Sound/SoundManager.h"
sndman->speak("Please charge my battery.");

SpeechNode($,"Take me to your leader!")

Tekkotsu uses the Mary text-to-speech system: http://mary.dfki.de
Project idea: enhance the Mary interface to permit control of volume and tempo, use of audio filters for sound effects, etc. (These functions are already built into Mary; we just need a way to access them.)
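A minimal sketch of how the SpeechNode form might be wired into a state machine, following the same shorthand pattern as the Looking Glass example later in these slides; the behavior name and the button trigger are illustrative assumptions:

// SpeakDemo.fsm -- speak a phrase when the green button is pressed (sketch).
#include "Behaviors/StateMachine.h"

#nodeclass SpeakDemo : StateNode

#nodemethod setup
#statemachine
  StateNode =B(GreenButOffset)=> SpeechNode($,"Please charge my battery.")

#endnodeclass

REGISTER_BEHAVIOR(SpeakDemo);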

Communication via a Detached Display
AIBO's Magic Looking Glass (Kirtane & Libby, 2005)
Question: how can you use a robot-controlled flat-panel display to mediate human-robot interactions?

Looking Glass Applications
Display instructions for a game. Keep score.
Display a landmark the robot can use for navigation.
Display the robot's view of the world.
Serve as a backdrop for a dramatic presentation: display background scenery, display objects the robot is talking about, or display another agent the robot can interact with.

Display as Input Device
The user points at the display to indicate their choice.

La Traviata

Virtual Violetta
(movie)

A Visual Joke
At the end of the performance, the user's picture is inserted into an audience shot.

How Looking Glass Works
The robot uploads image and HTML files to a Java client, which renders them on the display.

Looking Glass Example

#include "Behaviors/StateMachine.h"

#nodeclass LGdemo : StateNode

#shortnodeclass DisplayMessage : LGNode : DoStart
  displayHtmlText("<html><body>Hello world!</body></html>");

#nodemethod setup
#statemachine
  StateNode =B(GreenButOffset)=> DisplayMessage

#endnodeclass

REGISTER_BEHAVIOR(LGdemo);

Looking Glass Functions

uploadFile(string filename)
displayHtmlFile(string remoteFilename)
displayImageFile(string remoteFilename)
displayHtmlText(string text)
uploadCameraImage(string remoteFilename)
uploadSketch(Sketch<uchar> sketch, string remoteFilename)
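As a sketch of how these calls might be combined, the node below (modeled on the DisplayMessage node in the example above) uploads the robot's current camera view and then shows an HTML page that references it; the node name, file name, and HTML are illustrative assumptions:

// Sketch: show the robot's current camera view on the Looking Glass display.
#shortnodeclass ShowCameraView : LGNode : DoStart
  uploadCameraImage("camview.jpg");
  displayHtmlText(
    "<html><body><h1>What I see</h1>"
    "<img src=\"camview.jpg\"></body></html>");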