APML, a Markup Language for Believable Behavior Generation
Soft Computing Laboratory, Yonsei University
October 25, 2004

1 Contents
Introduction
Expressing believable behaviors
MagiCster architecture
A markup language for behavior specification: APML
– Overview of existing markup languages for expressing human-like behavior
– Defining APML tags
Facial description language
An example
Conclusion

2 Introduction
Humans communicate using verbal and non-verbal signals
– Body posture, gestures, facial expressions, gaze, intonation and prosody, and words and sentences
Embodied conversational agents
– A virtual body that interacts with another agent
– Behaves in a human-like manner
– Acts in a believable way: expresses emotion and exhibits a given personality
Two approaches to the relation between 'Body' and 'Mind'
– Strictly and necessarily interdependent
– Mainly independent of each other

3 EU project MagiCster
'Mind' and 'Body' are interfaced by an XML-based language
During the conversation
– The Mind decides what to communicate, considering the different factors that trigger the goal of communicating and influence the content to communicate
– The Body reads what the Mind decides to communicate, then interprets and renders it at the surface level according to the available communicative channels
Defines a set of languages for specifying the format of dialogue moves at different abstraction levels
– APML (Affective Presentation Markup Language) expresses the content of the dialogue move at the meaning level

4 Chapter Overview
Describe the main features of the underlying architecture
Present the APML language and how it has been used in the context of the MagiCster project
Show how it has been interfaced with a realistic 3D face called Greta and a synthetic voice
Illustrate with an example from the medical domain

5 Expressing Believable Behaviors
Communication: a means to influence others
Beliefs forming the content of a communicative act
– Information about the world: deictics, adjectivals
– Information about the speaker's identity
– Information about the speaker's mind
  Speaker's beliefs: degree of certainty, metacognitive information
  Goals: performativity of the sentence, topic-comment or theme-rheme distinction, rhetorical relations, turn-taking and backchannel
  Emotions: affective words, gestures, intonation, facial expression, gaze, and posture
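
As a sketch of how these meaning categories surface in markup: the fragment below uses tag names reported in the APML literature (apml, turnallocation, performative, theme, rheme), but the attribute values and the nesting are illustrative assumptions rather than the normative DTD.

<apml>
  <turnallocation type="take"/>
  <performative type="inform" certainty="certain">
    <theme>your problem</theme>
    <rheme>may be a symptom of angina</rheme>
  </performative>
</apml>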

6 MagiCster Architecture (1)
An example of an advice-giving dialogue in the medical domain
– The agent (Gi): a doctor
– The interlocutor (Uj): a patient
Coordination of the speech with various expressions
– In move G1, she manifests her empathy with the User
– In move G2, Greta indicates her chest while saying "a spasm of chest"
– In move G3, she looks at the User while saying "your problem"

7 MagiCster Architecture (2)
The MagiCster system
– 'Mind' component: a content planner, a dialogue manager, and an affective agent modeling module
– 'Body' component: a 3D face/avatar with a speech synthesizer
– A plan enricher: an interface between the 'Mind' and 'Body' components

8 MagiCster Architecture (3)
The affective agent modeling module decides
– whether a particular affective state should be activated
– whether the felt emotion should be displayed in a given context
The content planner
– generates the discourse plan appropriate to the context
– encodes the plan following the DPML DTD, which represents how to achieve the communicative goal in that piece of conversation

9 MagiCster Architecture (4)
The dialogue manager
– sits on top of the TRINDI architecture
– computes dialogue moves and maintains the information state in which information relevant to move selection is stored
The plan enricher
– translates the symbolic representation of a dialogue move into an agent behavior specification at the meaning level
– translates the DPML-based tree structure into APML
The face and body animation component
– interprets the APML-tagged dialogue move
– decides how to convey every meaning
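
To make the plan enricher's input concrete, here is a hypothetical DPML-like fragment; the element and attribute names (d-plan, node, goal, rr) are invented for illustration and may differ from the published DPML DTD. Each node carries a communicative goal and a rhetorical relation, and the enricher walks this tree to emit APML for the leaves.

<d-plan>
  <node name="root" goal="Convince(U, follow-treatment)" rr="Motivation">
    <node name="n1" goal="Inform(U, diagnosis)"/>
    <node name="n2" goal="Suggest(U, treatment)"/>
  </node>
</d-plan>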

10 A Markup Language for Behavior Specification: APML
High-level primitives for specifying behavior acts
Express agent behavior at different levels of abstraction
Control the behavior of ECAs easily, independently of the body
APML (the Affective Presentation Markup Language)
– Specification of the agent behavior at the meaning level
– Covers the affective aspect of the communication between the agent and the user

11 Overview of Existing Markup Languages for Expressing Human-Like Behavior
Human Markup Language (HML)
– Provides a very abstract-level language: difficult for controlling specific agent bodies, and requires developing complex interpreters
– Enhances the fidelity of human communications
– Allows the representation of physical, cultural, social, kinetic, psychological, and intentional features
VHML: provides several languages for acting on different modalities
MPML (Multimodal Presentation Markup Language): enables authors of web pages to add agents (MS Agent) for improving human-computer interaction
BEAT (Behavior Expression Animation Toolkit): generates an embodied agent's animation from textual input
AML (Avatar Markup Language): a new high-level language to describe avatar animation

12 Defining APML Tags
APML DTD: each tag encodes a communicative function, i.e. a (meaning, signal) pair
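
A minimal DTD-style sketch of what such meaning-level declarations could look like; the element and attribute names approximate those described for APML and should be read as an assumption, not as the released DTD.

<!ELEMENT apml (turnallocation | performative)*>
<!ELEMENT turnallocation EMPTY>
<!ATTLIST turnallocation type (take | give) #REQUIRED>
<!ELEMENT performative (#PCDATA | theme | rheme | emphasis)*>
<!ATTLIST performative type CDATA #REQUIRED
                       affect CDATA #IMPLIED
                       certainty CDATA #IMPLIED>
<!ELEMENT theme (#PCDATA | emphasis)*>
<!ELEMENT rheme (#PCDATA | emphasis)*>
<!ELEMENT emphasis (#PCDATA)>
<!ATTLIST emphasis level CDATA #IMPLIED>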

13 Facial Description Language (1)
Describe facial expressions as (meaning, signal) pairs
Define expressions to capture slight variations
– At a high level: a facial expression is a combination of other facial expressions already defined
– At a low level: a facial expression is a combination of facial parameters
Combine facial expressions due to distinct co-occurring communicative acts using a Bayesian network
– Facial basis (FB): a basic facial movement
– Facial display (FD): a set of FBs, e.g. surprise = raised_eyebrow + raised_lid + open_mouth
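
The surprise example could be expressed in an XML facial-description library along the following lines; the element names (fd, fb, fd-ref) and the intensity attribute are hypothetical, and at the lowest level the FBs would bottom out in concrete facial parameters.

<!-- a facial display (FD) defined as a set of facial bases (FBs) -->
<fd name="surprise">
  <fb name="raised_eyebrow"/>
  <fb name="raised_lid"/>
  <fb name="open_mouth"/>
</fd>

<!-- a higher-level FD defined as a weighted combination of existing FDs -->
<fd name="pleasant_surprise">
  <fd-ref name="surprise" intensity="0.6"/>
  <fd-ref name="smile" intensity="0.8"/>
</fd>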

14 Facial Description Language (2)

15 Facial Description Language (3)

16 An Example: Medical domain application (1)
DPML discourse plan and its APML translation (markup listings shown on slide)
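
A plausible reconstruction of the kind of APML move the slide shows for move G1 (the wording and the placement of the affect attribute are illustrative, not copied from the slide):

<apml>
  <turnallocation type="take"/>
  <performative type="inform" affect="sorry-for">
    I'm sorry to tell you that you suffer from what we call
    <emphasis level="strong">angina pectoris</emphasis>.
  </performative>
</apml>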

17 An Example: Medical domain application (2)

18 Conclusion
Described the architecture of the behavior generator of a believable conversational agent
Focused on the importance of the Mind-Body separation
Defined two XML-like markup languages to represent the Mind's output and the Body's input
APML is not publicly available