Multimodal Annotation of Emotions in TV Interviews
Multimodal Annotation of Emotions in TV Interviews
S. Abrilian, L. Devillers, J.-C. Martin, S. Buisine
LIMSI-CNRS, France
HUMAINE WP5 Summer School – Belfast, September 2004

Content
Emotion pervades human communication
– Feelings are conveyed in faces, voices, and gestures; people judge others by the way they respond to those signals.
Ongoing research on
– Modeling the relations between emotion and multimodal behavior
– Blended and subtle real-life emotions
– Detection / synthesis (ECAs)
EmoTV exploratory corpus
– Annotations: segmentation, emotion, multimodal behavior
Difficult issues
Hands-on session

EmoTV-1 Corpus
51 videos recorded from French TV channels; the interviews cover a range of topics: politics, sport, law, religion, etc.
– Interviewees: 48
– Topics: 24
– Total duration: 12 min
– Total words: ~2,500 (~800 distinct)
– Emotional segments: 281 (min: 4 s, max: 43 s)

Video selection criteria
– TV interviews
– Realistic situation
– Presence of emotion (full-blown / subtle / blended / …)
– Speaker's face and upper body visible (close shot)
– Multimodal cues: speech, gesture, gaze, …
– French language
– Focus on one (unknown) person
– Audio quality

Annotation protocol with natural corpora
Annotation scheme design is difficult, even more so for blended / subtle / masked / sequential emotions.
Emotion and multimodal annotation iterations:
– defining categories
– annotating
– validating, with inter-annotator agreement, perceptual tests, and statistical analysis
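A standard measure for the validating phase is Cohen's kappa between two annotators. The slides do not name a specific agreement measure, so this is only one plausible choice, sketched here for categorical emotion labels:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators' categorical labels.

    Assumes the annotators are not already in perfect chance
    agreement (expected agreement < 1).
    """
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of segments labelled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's label frequencies.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical labels for six segments from two annotators.
a = ["anger", "anger", "joy", "sadness", "joy", "anger"]
b = ["anger", "joy",   "joy", "sadness", "joy", "anger"]
print(round(cohens_kappa(a, b), 3))  # 0.739
```

Kappa corrects raw agreement for the agreement expected by chance, which matters when one label (e.g. "neutral") dominates the corpus.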

Emotion labelling phase
Emotion segmentation: different strategies.
Emotion labelling with:
– abstract dimensions: valence / activation (Cowie 2001), on a 7-level scale
– category labels
Pragmatic decision:
– converging on a smaller set of basic categories
– combining those categories to define complex emotions: "palette theory" (Cowie 2001), Plutchik's wheel (Plutchik 1980)
Examples: disappointment = sadness + surprise; contempt = anger + disgust
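The "combine basic categories" idea can be sketched as a lookup table. This is a minimal illustration containing only the two blends given on this slide; `components` is a hypothetical helper, not part of any annotation tool:

```python
# Sketch of the "palette" combination idea: complex emotion labels
# defined as blends of basic categories. Only the two mappings shown
# on the slide are included; any further entries would be assumptions.
COMPLEX_EMOTIONS = {
    "disappointment": ("sadness", "surprise"),
    "contempt": ("anger", "disgust"),
}

def components(label):
    """Return the basic-category components of an emotion label.

    Basic categories map to themselves; complex ones to their blend.
    """
    key = label.lower()
    return COMPLEX_EMOTIONS.get(key, (key,))

print(components("Disappointment"))  # ('sadness', 'surprise')
print(components("fear"))            # ('fear',)
```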

Emotion labelling: choice of labels
Annotation protocol: 2 annotators, free choice of labels, then elaboration of emotion categories.
176 labels -> preliminary list of 18 emotion categories: anger, despair, disappointment, disgust, doubt, embarrassment, exaltation, fear, irritation, joy, neutral, pleased, pride, sadness, serenity, shame, surprise, worry.
Example of emotion annotation scheme (ANVIL) – each segment is labelled with:
– Primary label: sadness
– Secondary label (optional): disgust
– Valence: 2 (negative)
– Intensity: 6 (very high)
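The per-segment scheme maps naturally onto a small record type. This is a sketch with illustrative field names, not the actual ANVIL track specification:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EmotionSegment:
    """One emotional segment, following the scheme on this slide.

    Field names are illustrative; the real coding scheme's track
    names are not given in the slides.
    """
    primary: str              # e.g. "sadness"
    secondary: Optional[str]  # e.g. "disgust", or None if absent
    valence: int              # 7-level scale; 2 = negative here
    intensity: int            # 7-level scale; 6 = very high here

# The example annotation from the slide.
seg = EmotionSegment(primary="sadness", secondary="disgust",
                     valence=2, intensity=6)
print(seg)
```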

Annotation of multimodal behavior
Multimodal corpora and tools
– LREC 2002 & 2004 workshops
– I. Poggi's coding scheme
Annotation of multimodal behavior
– McNeill 1992, Kipp 2004
Emotion and multimodal behavior
– Emotional expression: Collier 1985
– Facial expression: Ekman 2003, Pandzic 2002
– Expressivity of gesture: Pelachaud 2004

Coding scheme design
Requirements
– "Fast" annotation by a single annotator for all modalities
– Specific requirements of the multimodal coding scheme for TV interviews
Coding scheme design based on
– Behaviors observed in the videos
– Behaviors suggested by the literature (prototypical emotions)

Audio tracks
Required
– Prosodic cues: rhythm (speech rate), melody (F0), energy, voice quality
– Non-verbal events: laughter, crying, throat clearing, …
Tracks
– Energy
– Transcription: French / English

Posture group
Pose track
– Body orientation: up, down, left, right, front, back, packed, seated
Shift track
– Activity: whole body, upper body, legs
– Speed: fast, moderate, slow
– Action: walk, jump, duck, run, stand, sit, turn over, back, move back, come closer

Communicative gesture classes
Several typologies (Efron 1941; Ekman & Friesen 1969; McNeill 1992; Kipp 2004)
hand/arm movement
– non-communicative: adaptor
– communicative: emblem, deictic, illustrative (iconic, metaphoric), beat
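The typology can be written down as a nested mapping. The nesting below follows the slide's diagram as reconstructed, which is an assumption where the original layout was ambiguous:

```python
# Sketch of the gesture typology from this slide as a nested mapping.
GESTURE_TAXONOMY = {
    "hand/arm movement": {
        "non-communicative": ["adaptor"],
        "communicative": {
            "emblem": [],
            "deictic": [],
            "illustrative": ["iconic", "metaphoric"],
            "beat": [],
        },
    },
}

def leaf_classes(node):
    """Collect the leaf gesture classes of the taxonomy.

    A node is either a dict of sub-classes or a list of leaves;
    a class with no sub-classes is itself a leaf.
    """
    if isinstance(node, list):
        return node
    leaves = []
    for key, child in node.items():
        sub = leaf_classes(child)
        leaves.extend(sub if sub else [key])
    return leaves

print(leaf_classes(GESTURE_TAXONOMY))
# ['adaptor', 'emblem', 'deictic', 'iconic', 'metaphoric', 'beat']
```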

Which gesture classes for emotional ecological corpora?
Criteria: corpus coverage + ease of annotation
– Adaptors
– Beats
– Gesticulation: free-form, spontaneous
– Deictics, emblems, iconics, metaphorics

Gesture phase group (Kipp 2004)
Phase
– Type (Kendon, McNeill 1992): preparation, stroke, beat, hold, retract
– Speed: fast, moderate, slow
– Energy: high, normal, low
– Handedness
– Spatial region: up, head, chest, down, periphery
– Hand shape: open, closed
– Direction: horizontal, vertical
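The attribute set above maps onto a typed record. This sketch uses illustrative field names and the value sets taken from the slide; the handedness values are an assumption, since the slide does not enumerate them:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class GesturePhase:
    """One annotated gesture phase, following this slide's attributes."""
    phase: Literal["preparation", "stroke", "beat", "hold", "retract"]
    speed: Literal["fast", "moderate", "slow"]
    energy: Literal["high", "normal", "low"]
    handedness: Literal["left", "right", "both"]  # value set assumed
    spatial_region: Literal["up", "head", "chest", "down", "periphery"]
    hand_shape: Literal["open", "closed"]
    direction: Literal["horizontal", "vertical"]

# A hypothetical annotation of a single stroke.
g = GesturePhase(phase="stroke", speed="fast", energy="high",
                 handedness="right", spatial_region="chest",
                 hand_shape="open", direction="vertical")
print(g.phase)
```

Encoding the value sets as `Literal` types lets a type checker flag annotation values outside the coding scheme.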

Facial expressions
– MPEG-4 Facial Animation Parameters
– FACS Action Units: chin, lids, brows, cheeks, head, lips, nose, mouth, eyes

Problematic issues
– Time-consuming
– Subjectivity in the segmentation and annotation of emotion labels and of some modalities
– Separating emotionally significant events from non-emotional ones
– Requires expertise in annotating all modalities (gesture type, FAPs)
– Limitations of TV samples: image resolution, mostly upper body, external events/objects out of camera scope (which elicit gaze)
– Annotation is corpus-dependent (e.g. gesture speed)
… but it enables the exploration of complex natural emotions

Next steps
Corpus annotations / analysis of results
– Inter-annotator agreement
– Signal/transcription alignment
– Statistics: relation between multimodal behavior and emotion
Improve the multimodal coding scheme
Update the coders' documentation
Results will be presented at the WP5 workshop

Future directions
– Typology of natural multimodal complex emotions
– Unsupervised classification of multimodal annotations
– Correlation of multimodal annotations and emotions

Summer school protocol
Annotate emotion and multimodal behavior on videos (individual and collective steps).
Provided:
– Anvil short documentation (Kipp 2001)
– Coding scheme
– Emotional segments
– Speech transcription: French, English