Feedback. Elisabetta Bevacqua, Dirk Heylen, Catherine Pelachaud, Isabella Poggi, Marc Schröder.

Presentation transcript:

Feedback. Elisabetta Bevacqua, Dirk Heylen, Catherine Pelachaud, Isabella Poggi, Marc Schröder

Roddy + Ruth

Modeling

Listener system
– Listener modules: generator of the listener's behaviours
– Input: video and audio data from the real world
– Player: the 3D agent Greta
– Backchannel library: lexicon of backchannel signals
– Whiteboard (Psyclone): communication protocol system
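To make the whiteboard idea concrete, here is a minimal in-process sketch of modules exchanging messages through a shared hub, assuming a Psyclone-like publish/subscribe interface; the channel names and message fields are illustrative, not the project's actual protocol.

```python
from collections import defaultdict

class Whiteboard:
    """Tiny stand-in for a Psyclone-style whiteboard: modules publish and
    subscribe to named message types instead of calling each other directly."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, message_type, callback):
        self._subscribers[message_type].append(callback)

    def post(self, message_type, payload):
        for callback in self._subscribers[message_type]:
            callback(payload)

# Hypothetical wiring: the audio/video analysis posts input events, the
# backchannel generator reacts to them, and the Greta player consumes the
# resulting animation requests.
board = Whiteboard()
board.subscribe("input.head_movement",
                lambda msg: board.post("agent.backchannel", {"signal": "nod"}))
board.subscribe("agent.backchannel",
                lambda msg: print("Greta plays:", msg["signal"]))
board.post("input.head_movement", {"movement": "nod", "confidence": 0.8})
```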

Dialogue Management
Integrating the various components of a dialogue system capable of non-verbal expressivity. Using OpenAIR, we connected:
– a visual renderer (Greta),
– an audio renderer (MARY), and
– a dummy dialogue system capable of generating non-verbal behaviour (Conversational Dialogue Engine / DFKI).
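As a rough illustration of this integration step, the sketch below shows how one dialogue move might fan out to the audio renderer (MARY) and the visual renderer (Greta) over an OpenAIR-style bus; the channel names, payload fields, and the send callback are assumptions, not the components' real APIs.

```python
# Illustrative only: split a dialogue-engine move into renderer-specific
# messages. Channel names and payload fields are made up for this sketch.
def dispatch_dialogue_move(move, send):
    send("mary.speak", {"text": move["text"]})               # audio renderer
    send("greta.behave", {"signal": move.get("nonverbal")})  # visual renderer

dispatch_dialogue_move(
    {"text": "Hello!", "nonverbal": "smile"},
    send=lambda channel, payload: print(channel, payload),   # stand-in for OpenAIR
)
```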

Reactive Listener module
The reactive module generates the listener's responses according to the speaker's head movements. To detect head movements we integrated Watson (Gratch et al.).
– At the moment Greta reacts with a head nod every time the speaker performs a nod or a shake.
– In the future Greta will be able to react with different backchannel signals and/or copy the speaker's head movement.
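A minimal sketch of the reactive rule just described, assuming the head tracker reports movements as simple labels; the function names are hypothetical.

```python
def reactive_backchannel(detected_movement):
    """Current behaviour: answer any detected nod or shake with a nod."""
    if detected_movement in ("nod", "shake"):
        return "nod"
    return None  # no backchannel triggered

def mimicry_backchannel(detected_movement):
    """Planned extension: copy the speaker's head movement instead."""
    return detected_movement if detected_movement in ("nod", "shake") else None

print(reactive_backchannel("shake"))  # -> nod
print(mimicry_backchannel("shake"))   # -> shake
```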

Analysis
Head movements:
– tracking
– classification

Television

Data Annotation (SAL)

Cognitive Listener module
– We use the SAL Wizard of Oz to trigger deliberative listener behaviours for Greta.
– Pre-calculated FAP files are selected according to the wizard's decision.
– The Player displays the selected FAP files.
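A hedged sketch of this Wizard-of-Oz selection step: the wizard's chosen meaning is looked up in a table of pre-computed FAP animations and handed to the player. The file names and the play_fap callback are placeholders, not the real interface.

```python
# Hypothetical mapping from wizard decisions to pre-calculated FAP files.
FAP_FILES = {
    "agree": "fap/nod_smile.fap",
    "disagree": "fap/shake_frown.fap",
    "interested": "fap/raise_eyebrows.fap",
}

def on_wizard_decision(meaning, play_fap=print):
    path = FAP_FILES.get(meaning)
    if path is not None:
        play_fap(path)  # the Greta player displays the selected FAP file

on_wizard_decision("agree")  # -> fap/nod_smile.fap
```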

Backchannel lexicon
– We aim to build a listener ECA able to display backchannel signals according to its style of behaviour: assertive/not assertive, believing/not believing, interested/not interested, and so on.
– We need to define a set of backchannel signals that users are able to interpret and understand.
– To define such a library of recognizable signals we performed a perceptive test.

Perception/Feedback
Samples for subjects to judge:
– questionnaires (semantic scales)
– asked to label things
Facial Expressions
Affect Bursts

Perceptive Test
Perceptive test: find a mapping between signals and meanings. Questions:
– Is it possible to identify a signal (or a combination of signals) for each meaning?
– Can a combination of signals alter the meaning attached to each single backchannel signal?

Subjects and material
– Sixty French subjects (mean age 20.1).
– Task: select meanings for facial expressions and head signals displayed by Greta.
– 21 different signals.
– 12 meanings: agree, disagree, accept, refuse, interested, not interested, believe, disbelieve, understand, don't understand, like, dislike.
– As the list of meanings was too long, subjects were divided into two groups: group 1 and group 2.

Signals
Signals can be simple (containing just a single action) or complex (containing several actions):
– simple: nod; smile; raise eyebrows; shake; frown; tension (1); tilt; gaze right down; eyes roll up; raise left eyebrow; sad eyebrows; eyes wide open
– complex: nod and smile; nod and raise eyebrows; shake and frown; frown and tension (1); shake, frown and tension (1); tilt and frown; tilt and sad eyebrows; tilt and raise eyebrows; tilt and gaze right down
(1) tension of the lips
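One possible way to encode these 21 stimuli is as sets of elementary actions, so that "simple" versus "complex" is just the size of the set; the encoding below is purely illustrative and only restates the list above.

```python
SIGNALS = [
    {"nod"}, {"smile"}, {"raise eyebrows"}, {"shake"}, {"frown"},
    {"tension"}, {"tilt"}, {"gaze right down"}, {"eyes roll up"},
    {"raise left eyebrow"}, {"sad eyebrows"}, {"eyes wide open"},
    {"nod", "smile"}, {"nod", "raise eyebrows"}, {"shake", "frown"},
    {"frown", "tension"}, {"shake", "frown", "tension"},
    {"tilt", "frown"}, {"tilt", "sad eyebrows"}, {"tilt", "raise eyebrows"},
    {"tilt", "gaze right down"},
]

simple = [s for s in SIGNALS if len(s) == 1]     # single-action signals
compound = [s for s in SIGNALS if len(s) > 1]    # multi-action signals
print(len(SIGNALS), len(simple), len(compound))  # 21 signals in total
```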

Result

First question
Q1: Is it possible to identify a signal (or a combination of signals) for each meaning?
– agree and accept: the signal nod proved to be very significant. All signals containing nods were interpreted as signals of agreement and acceptance.
– like: the signal smile conveys this meaning.
– understand: this meaning can be conveyed through the combination of smile and raise eyebrows.

Second question
Hyp2: a combination of signals can alter the meaning of single backchannel signals.
– Tension alone and frown alone do not mean dislike, but their combination does.
– To convey the meaning disbelieve, tilt and frown must be displayed together.
– The signal frown means don't understand, but when a shake is added the combination loses this meaning.
– Tilt alone and gaze right down alone do not mean not interested, but their combination does.
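A small sketch of how this finding could be represented: meanings attach to whole signal combinations, so the lookup is keyed by sets of actions rather than by individual actions. Only the pairings named on this slide are included; the data structure itself is an assumption.

```python
# Meanings keyed by complete signal combinations (frozensets of actions).
MEANING_OF = {
    frozenset({"frown", "tension"}): "dislike",        # neither part alone means dislike
    frozenset({"tilt", "frown"}): "disbelieve",
    frozenset({"frown"}): "don't understand",          # meaning lost when a shake is added
    frozenset({"tilt", "gaze right down"}): "not interested",
}

def interpret(actions):
    return MEANING_OF.get(frozenset(actions), "no agreed meaning")

print(interpret({"frown"}))             # -> don't understand
print(interpret({"frown", "shake"}))    # -> no agreed meaning
print(interpret({"frown", "tension"}))  # -> dislike
```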

Experiment: Affect bursts as listener feedback
Research questions:
1. Affect bursts used as listener feedback => same emotion?
2. How acceptable is such feedback?
(3. Difference between German and Dutch listeners?)

Method
Stimuli:
– select German affect bursts from Schröder (2003)
– embed into a neutral German / Dutch speaker utterance: “Yeah, then I told myself, why don’t you try it and then I did it!”
– 10 emotions, 2 affect bursts each => 20 stimuli per language
e.g. Dutch + admiration-wow, Dutch + anger-growl, …
e.g. German + worry-ohoh, German + startle-ah, …
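The 20-stimuli-per-language figure is just the cross product of emotions, affect bursts, and carrier languages; the sketch below shows that arithmetic with a deliberately truncated emotion list, since only a few of the ten emotions are named on the slide.

```python
from itertools import product

emotions = ["admiration", "anger", "worry", "startle"]  # only 4 of the 10 named here
bursts_per_emotion = ["burst1", "burst2"]               # placeholder labels
languages = ["German", "Dutch"]

stimuli = list(product(languages, emotions, bursts_per_emotion))
# With the full set of 10 emotions this yields 10 * 2 = 20 stimuli per language.
print(len(stimuli) // len(languages), "stimuli per language (with 4 example emotions)")
```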

Results

maybe: social appropriateness
+ admiration, elation, relief, worry
- threat, startle, anger

Discussion
“Acceptability” is very ambiguous:
– general appropriateness in the context (intended)
– strange as a reaction to the speaker utterance
– technical aspects: mismatch in sound quality; timing of feedback
– social appropriateness: display rules, i.e. social norms prescribed by one’s culture as to “who can show what emotion to whom, when” (Ekman, 1977)

Discussion (2)
Tentative set of display rules for affect bursts:
– display gratifying emotions (admiration);
– display empathic emotions (elation, worry, relief);
– do not display negative evaluation (disgust, contempt);
– do not display aggression (anger, threat).
This can explain most observations, but not the high acceptability of boredom.
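A minimal encoding of these tentative display rules, keeping gratifying and empathic affect bursts and suppressing negative evaluation and aggression; the emotion categories mirror the slide, while the function itself is only a sketch.

```python
DISPLAY = {"admiration", "elation", "worry", "relief"}   # gratify / show empathy
SUPPRESS = {"disgust", "contempt", "anger", "threat"}    # negative evaluation / aggression

def allowed_as_feedback(emotion):
    if emotion in DISPLAY:
        return True
    if emotion in SUPPRESS:
        return False
    return None  # not covered by the rules, e.g. boredom (yet rated acceptable)

print(allowed_as_feedback("relief"), allowed_as_feedback("threat"), allowed_as_feedback("boredom"))
```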

Summary and Questions
For some emotions, highly recognisable affect bursts were judged to fit well with the context. Perception of emotional feedback may depend on:
– social acceptability (display rules);
– semantic/pragmatic interaction between the speaker utterance and the affect burst;
– timing of feedback;
– the relation between speaker and listener;
– formality of the situation;
– …

Future work
In the future we aim at:
– integrating perception of audio data;
– defining a set of rules to decide when a reactive backchannel signal must be triggered and which signal Greta should display (see the sketch after this list);
– defining different styles to create a variety of agents (assertive/not assertive, interested/not interested, believing/not believing, and so on) and evaluating the impression they make on users.
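As a sketch of the kind of rule set envisaged in the second item above, the snippet combines audio and visual cues to decide when to backchannel and lets an agent style pick which signal to show; the thresholds, cue names, and style preferences are all assumptions.

```python
def should_backchannel(cues):
    """Hypothetical trigger rule: react to a speaker pause or a head nod."""
    return cues.get("pause_ms", 0) > 400 or cues.get("head_nod", False)

def choose_signal(style):
    """Hypothetical style-to-signal preferences."""
    return {"assertive": "nod and raise eyebrows",
            "not assertive": "tilt",
            "interested": "raise eyebrows"}.get(style, "nod")

cues = {"pause_ms": 520, "head_nod": False}
if should_backchannel(cues):
    print(choose_signal("interested"))  # -> raise eyebrows
```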

CONTEXT
The first phase has ended successfully; move to the next… and put the findings in context.