2008 Conference on Digital Interactive Technology and Industry Applications. Technologies and Trends of Robots with Interactive Facial Expressions (Approaches to Interactive Emotional Robots). 謝銘原, Robotics Research Center, Southern Taiwan University, Taiwan

IEEE IECON 2007 Approaches to Interactive Emotional Robots 2 Outline
 Introduction to Interactive Emotional Robots
 Notable Research on Emotional Robots
   KISMET
   Hanson Robotics
 Discussion of related technologies
 Conclusions

IEEE IECON 2007 Approaches to Interactive Emotional Robots 3 Introduction
 Interactive Emotional Robots
   Anthropomorphic robots display a unique, artificial subconscious, partly due to their
     cognitive understanding of language-based interactive speech,
     conversational capabilities and genuine eye contact,
     coupled with a full range of human facial expressions.
 The key technologies consist of
   anthropomorphic artificial musculature and skin
   A.I. software

IEEE IECON 2007 Approaches to Interactive Emotional Robots 4 Notable Research on Emotional Robots – 1/2
 Kismet (MIT)
   She can engage people in natural and expressive face-to-face interaction.
 Reddy (RoboMotio)
   By combining facial expressions with arm movements, it can easily express a wide range of emotions such as joy, anger, sadness, surprise, or disgust.

IEEE IECON 2007 Approaches to Interactive Emotional Robots 5 Notable Research on Emotional Robots – 2/2
 Hanson Robotics
   The Albert Hubo – a collaboration of East and West
     The Albert Hubo is the first-ever walking robot with realistic, humanlike expressions.
   Eva – the best of all worlds
     Eva is a humanlike robot of universal beauty, achieved by incorporating a mixture of ethnic characteristics.
   Zeno – the smartest and coolest robot
     Zeno is the first of his kind.
     Zeno lives in the “Inventing Academy” in the year 2027 with a whole group of other robot kids, learning and fighting to save humanity.

IEEE IECON 2007 Approaches to Interactive Emotional Robots 6 Kismet – 1/5
 The Hardware Design
   the high-level perception system,
   the motivation system,
   the behavior system,
   the motor skill system, and
   the face motor system
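As a rough illustration of how such a layered design can be wired together, the sketch below models each of the five systems as a small Python class exchanging simple values in a fixed update loop. All class names, fields, and thresholds are hypothetical simplifications invented for this sketch; they are not Kismet's actual implementation.

```python
# Minimal sketch of a Kismet-style control loop (hypothetical names and values,
# not the real implementation).
from dataclasses import dataclass

@dataclass
class Percepts:                      # output of the high-level perception system
    face_present: bool = False
    sound_level: float = 0.0

@dataclass
class Drives:                        # output of the motivation system
    social: float = 0.5              # desire to interact, in [0, 1]

class PerceptionSystem:
    def update(self, sensors: dict) -> Percepts:
        return Percepts(face_present=sensors.get("face", False),
                        sound_level=sensors.get("sound", 0.0))

class MotivationSystem:
    def __init__(self):
        self.drives = Drives()
    def update(self, p: Percepts) -> Drives:
        # Interaction satisfies the social drive; being alone lets it build up.
        delta = -0.1 if p.face_present else 0.05
        self.drives.social = min(max(self.drives.social + delta, 0.0), 1.0)
        return self.drives

class BehaviorSystem:
    def select(self, p: Percepts, d: Drives) -> str:
        if p.face_present:
            return "engage"
        return "seek_person" if d.social > 0.6 else "idle"

class MotorSkillSystem:
    def plan(self, behavior: str) -> dict:
        return {"engage":      {"neck": "track_face"},
                "seek_person": {"neck": "scan"},
                "idle":        {"neck": "rest"}}[behavior]

class FaceMotorSystem:
    def actuate(self, command: dict) -> str:
        # Pick a facial posture that matches what the neck is doing.
        return {"track_face": "interested",
                "scan":       "alert",
                "rest":       "calm"}[command["neck"]]

if __name__ == "__main__":
    perception, motivation = PerceptionSystem(), MotivationSystem()
    behavior, skills, face = BehaviorSystem(), MotorSkillSystem(), FaceMotorSystem()
    for sensors in [{}, {"face": False}, {"face": True, "sound": 0.3}]:
        p = perception.update(sensors)
        d = motivation.update(p)
        b = behavior.select(p, d)
        cmd = skills.plan(b)
        print(b, cmd, face.actuate(cmd))
```

The real systems are far richer (multiple drives, releasers, behavior arbitration), but the data flow has the same shape: perception feeds motivation, both inform behavior selection, and the chosen behavior is turned into motor and facial commands.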

IEEE IECON 2007 Approaches to Interactive Emotional Robots 7 Kismet – 2/5
 Vision System
 Auditory System
 Expressive Motor System
   a 15-DoF face
 Vocalization System

IEEE IECON 2007 Approaches to Interactive Emotional Robots 8 Kismet – 3/5

IEEE IECON 2007 Approaches to Interactive Emotional Robots 9 Kismet – 4/5  Social Amplification

IEEE IECON 2007 Approaches to Interactive Emotional Robots 10 Kismet – 5/5
 Facial expressions
 Looking at a target
 Searching for a target
 Affective responses
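Kismet's affective responses are often described in terms of a low-dimensional affect space whose current point is mapped onto the face. The sketch below illustrates only that general idea: it blends a few basis facial postures by their proximity to the current (arousal, valence) point. The joint names and angles are invented for the sketch and are not Kismet's real parameters.

```python
# Illustrative blending of basis facial postures by proximity in a simple
# (arousal, valence) affect space; all joint names and numbers are invented.
import math

BASIS = {
    "happy":     {"affect": ( 0.6,  0.8), "pose": {"lips":  30, "brows":  10, "ears":  20}},
    "sad":       {"affect": (-0.5, -0.7), "pose": {"lips": -25, "brows": -15, "ears": -20}},
    "surprised": {"affect": ( 0.9,  0.1), "pose": {"lips":  10, "brows":  25, "ears":  15}},
    "calm":      {"affect": ( 0.0,  0.0), "pose": {"lips":   0, "brows":   0, "ears":   0}},
}

def blend_expression(arousal: float, valence: float) -> dict:
    """Weight each basis posture by inverse distance to the current affect point."""
    weights = {}
    for name, basis in BASIS.items():
        a, v = basis["affect"]
        weights[name] = 1.0 / (math.hypot(arousal - a, valence - v) + 1e-3)
    total = sum(weights.values())
    pose = {joint: 0.0 for joint in BASIS["calm"]["pose"]}
    for name, w in weights.items():
        for joint, angle in BASIS[name]["pose"].items():
            pose[joint] += (w / total) * angle
    return pose

print(blend_expression(arousal=0.7, valence=0.6))   # mostly "happy", some "surprised"
```

Blending rather than switching between discrete expressions is what lets a face move smoothly through intermediate moods as the affect point drifts.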

IEEE IECON 2007 Approaches to Interactive Emotional Robots 11 Hanson Robotics – 1/5
 Each HumanKind robot is
   individually hand-crafted
   built to perform in a wide variety of applications, including
     entertainment
     research
     animation
     consumer households

IEEE IECON 2007 Approaches to Interactive Emotional Robots 12 Hanson Robotics – 2/5
 All HumanKind robots have the following capabilities:
   emulation of over 62 facial and neck muscles
     providing anthropomorphic facial expression
   embedded micro-cameras
     providing vision recognition
   A.I. software technology for
     face and speech recognition,
     eye tracking, and
     conversational operations
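The "vision recognition" capability implies at least a face-detection stage running on the embedded camera feed. The snippet below is a generic example of that stage using OpenCV's stock Haar cascade; it is not Hanson Robotics' software, just the kind of building block such a capability rests on.

```python
# Generic face-detection step of the kind an embedded camera feed would need
# (OpenCV's stock Haar cascade; not Hanson Robotics' software).
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                     # embedded / USB camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:                # mark each detected face
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

A complete system would feed the detected face regions onward into identification, eye tracking, and expression analysis rather than just drawing boxes.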

IEEE IECON 2007 Approaches to Interactive Emotional Robots 13 Hanson Robotics – 3/5
 Capable of local/remote puppeteering
 Portable; can run on as little as 1/20 of the power required for comparable products
 Interfaces with standard computers (included with the robot)
 Can function in a variety of research environments, including
   computer vision,
   computational interaction, and
   speech perception

IEEE IECON 2007 Approaches to Interactive Emotional Robots 14 Hanson Robotics – 4/5
 Zeno robot – 17” tall, weighs 6 lbs, battery powered
   learns through artificial intelligence
 A character robot that can see, hear, talk, and remember who you are
 Wirelessly controlled by a PC
   He can build a 3D mental image of his environment to determine and control physical actions and reactions, much like we do as humans.
   He then has the ability to navigate, make facial expressions, and move his body based on what he sees around him.

IEEE IECON 2007 Approaches to Interactive Emotional Robots 15 Hanson Robotics – 5/5
 A character engine with speech recognition and conversational A.I. for language reasoning, so that
   Zeno can recognize and remember both speech and faces and interact accordingly
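To make the "recognize and remember" idea concrete, here is a toy sketch of an identity memory that lets a robot greet returning people by name. The face IDs and the whole flow are invented for illustration; in practice, face recognition and speech recognition would supply the inputs that are hard-coded here.

```python
# Toy sketch of "recognize and remember": a small identity memory that lets a
# robot greet returning people by name. The face_id strings stand in for the
# output of a real face-recognition module.
from typing import Optional

class IdentityMemory:
    def __init__(self):
        self._people = {}                        # face_id -> name

    def recall(self, face_id: str) -> Optional[str]:
        return self._people.get(face_id)

    def remember(self, face_id: str, name: str) -> None:
        self._people[face_id] = name

def interaction_step(memory: IdentityMemory, face_id: str,
                     heard_name: Optional[str]) -> str:
    name = memory.recall(face_id)
    if name:                                     # known person: greet by name
        return f"Hello again, {name}!"
    if heard_name:                               # new person who told us a name
        memory.remember(face_id, heard_name)
        return f"Nice to meet you, {heard_name}."
    return "Hi there! What's your name?"         # new person, ask for a name

memory = IdentityMemory()
print(interaction_step(memory, "face_001", None))      # asks for a name
print(interaction_step(memory, "face_001", "Dana"))    # enrols "Dana"
print(interaction_step(memory, "face_001", None))      # greets by name
```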

IEEE IECON 2007 Approaches to Interactive Emotional Robots 16 Discussion of related technologies
 The key technologies for emotional expression consist of
   anthropomorphic artificial musculature and skin
   A.I. technologies (e.g., the Character Engine software in Hanson Robotics' HumanKind robots)

IEEE IECON 2007 Approaches to Interactive Emotional Robots 17 Human musculature of the face

IEEE IECON 2007 Approaches to Interactive Emotional Robots 18 Artificial musculature
 The head of a typical HumanKind™ robot has 32 DOF to simulate the human musculature of the face and neck:
  (1) 4 DOF in the neck (turn, tilt, nod-upper, nod-lower)
  (2) 3 DOF in the eyes (left eye turn, right eye turn, eyes up and down)
  (3) 1 DOF for the jaw
  (4) 3 DOF for the eyelids (2 upper eyelids, coupled lower eyelids)
  (5) 21 servo DOF in the face: smile left, smile right, frown left, frown right, “ee” left+right, lower lip center up+out, upper lip center, lower lip ¾ left+right, upper lip ¾ left+right, sneers, eye-scrunches left+right, outer brows left+right, inner brow left+right, and brow center
 [US Patent 7,113,848, Sep. 26, 2006]
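The DOF budget above is easy to mirror in software as a named joint map. The sketch below groups the unambiguous joints and keeps the 21 face servos as a count, since the patent lists them by function rather than by channel; all identifiers and the grouping are illustrative only.

```python
# Sketch of the 32-DOF head budget from the slide as a software-side map.
# Joint names follow the slide; the grouping and identifiers are illustrative.
DOF_GROUPS = {
    "neck":    ["turn", "tilt", "nod_upper", "nod_lower"],          # 4 DOF
    "eyes":    ["left_eye_turn", "right_eye_turn", "eyes_up_down"], # 3 DOF
    "jaw":     ["jaw"],                                             # 1 DOF
    "eyelids": ["upper_lid_left", "upper_lid_right", "lower_lids"], # 3 DOF
}
FACE_SERVO_COUNT = 21   # smile, frown, "ee", lip, sneer, scrunch, and brow servos

def total_dof() -> int:
    return sum(len(joints) for joints in DOF_GROUPS.values()) + FACE_SERVO_COUNT

assert total_dof() == 32, "head DOF budget should add up to 32"
print(f"{total_dof()} DOF in the head")
```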

IEEE IECON 2007 Approaches to Interactive Emotional Robots 19 Artificial skin
 How is an anthropomorphic skin obtained? [US Patent 7,113,848, Sep. 26, 2006]

IEEE IECON 2007 Approaches to Interactive Emotional Robots 20 Artificial musculature and skin  To simulate musculature and skin [US Patent 7,113,848, Sep. 26, 2006]

IEEE IECON 2007 Approaches to Interactive Emotional Robots 21 Artificial mouth and eyes  The dynamic actions of the artificial lips, the artificial eyes, and the artificial eyelids [US Patent 7,113,848, Sep. 26, 2006]

IEEE IECON 2007 Approaches to Interactive Emotional Robots 22 Artificial Intelligence of the Emotional System
 Emotional expressions
   happy, sad, afraid, disgusted, angry, surprised, contemplative, confused, …
 The A.I. system integrates these techniques:
   computer vision,
   face detection and identification,
   speech recognition,
   natural language processing,
   speech synthesis, and
   motion control
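One way to picture how those six techniques fit together is a single perception-to-expression cycle. Every function in the sketch below is a stub standing in for a real module (detector, recognizer, language model, synthesizer, motor driver); only the data flow between them is the point.

```python
# Sketch of one perception-to-expression cycle; every function is a stub
# standing in for a real vision, ASR, NLP, TTS, or motion-control module.
def detect_faces(frame):      return ["face_0"] if frame else []
def identify(face):           return "visitor"
def recognize_speech(audio):  return "hello robot" if audio else ""
def understand(utterance):    return {"intent": "greeting"} if "hello" in utterance else {"intent": "unknown"}
def choose_emotion(intent):   return {"greeting": "happy"}.get(intent["intent"], "contemplative")
def synthesize_speech(text):  print(f"[TTS] {text}")
def drive_face(expression):   print(f"[motors] posing face: {expression}")

def cycle(frame, audio):
    faces = detect_faces(frame)                      # computer vision
    person = identify(faces[0]) if faces else None   # face identification
    utterance = recognize_speech(audio)              # speech recognition
    intent = understand(utterance)                   # natural language processing
    emotion = choose_emotion(intent)
    drive_face(emotion)                              # motion control
    if person and intent["intent"] == "greeting":
        synthesize_speech(f"Hello, {person}!")       # speech synthesis

cycle(frame="camera_frame", audio="mic_buffer")
```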

IEEE IECON 2007 Approaches to Interactive Emotional Robots 23 The future of emotional robots
 The challenges
   More anthropopathic facial expression with real emotions
     emotion detection and recognition
     micro servo actuators and their controls
     more DOFs
   More anthropopathic skin
     bionic materials
     sensory and reflective, sensitive to force, pressure, and temperature
   More intelligent
     integration of multiple perception and recognition modalities
     learning and emulating capabilities
     simple but powerful algorithms

IEEE IECON 2007 Approaches to Interactive Emotional Robots 24 Conclusions
 Interactive emotional robots need to be developed with
   learning of social behaviors during human-robot play
     Kismet: imitates an infant and acquires its intelligence through natural behavior
   a flexible facial framework for displaying anthropopathic expressions
     Hanson Robotics (ZENO): emulates human facial musculature and skin and provides a sufficient set of preprogrammed basic facial expressions

IEEE IECON 2007 Approaches to Interactive Emotional Robots 25 Thanks for your attendance