
Emotional Machines Presented by Chittha Ranjani Kalluri.


1 Emotional Machines Presented by Chittha Ranjani Kalluri

2 Why Can’t…
- We have a thinking computer?
- A machine that performs about a million floating-point operations per second understand the meaning of shapes?
- We build a machine that learns from experience rather than simply repeating everything that has been programmed into it?
- A computer be similar to a person?
These are some of the questions facing computer designers and others who are constantly striving to build more and more ‘intelligent’ machines.

3 So, what’s intelligence? According to en.wikipedia.org: “Intelligence is a general mental capability that involves the ability to reason, plan, solve problems, think abstractly, comprehend ideas and language, and learn.”

4 What does this mean for current machines?
Definitely not that they’re not intelligent! Some amount of intelligence has to be built in. How can that be done? Designers looked closely at how humans:
- Behave
- Express themselves
- Process information
- Solve problems

5 Expressing ourselves
- Body language
- Facial expressions
- Tone of voice
- Words we choose
All of these vary with the situation. What we implicitly convey: emotion.

6 What is emotion? In psychology and common use, emotion is the language of a person's internal state of being, normally based in or tied to their internal (physical) and external (social) sensory feeling. Love, hate, courage, fear, joy, and sadness can all be described in both psychological and physiological terms.

7 Do machines need emotion?
Machines of today don’t need emotion. Machines of the future would need it to:
- Survive
- Interact with other machines and humans
- Learn
- Adapt to circumstances
Emotions are the basis for humans doing all of the above.

8 What is an emotional machine?
An intelligent machine that can recognize emotions and respond using emotions. The concept was proposed by Marvin Minsky in his 2006 book ‘The Emotion Machine’. Example: the WE-4RII (Waseda Eye No. 4 Refined II), being developed at Waseda University, Japan.

9 The WE-4RII
Simulates six basic emotions:
- Happiness
- Fear
- Surprise
- Sadness
- Anger
- Disgust
It also recognizes certain smells, detects certain types of touch, and uses 3 personal computers for communication. Still not as close to an emotional machine as we would want.

10-12 The WE-4RII
[Photographs of the robot expressing Happiness, Fear, Surprise, Sadness, Anger, and Disgust]

13 Do we want…

14 Maybe… We’re not there…yet! So how do we get from one to the other?

15 Characteristics of multi-modal ELIZA
- Based on message passing on a blackboard
- Input: user’s text string
- Output: sentences and facial displays
- Processing module consists of:
  - NLP layer
  - Emotion recognition layer
  - A layer that constructs facial displays
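The blackboard idea above can be sketched as modules that each read from and post to a shared store in turn. This is a minimal illustration, not the system's actual code; the module names and keys are assumptions.

```python
# Minimal blackboard sketch: a shared store that processing modules
# read from and post results to, one after another.
class Blackboard:
    def __init__(self):
        self.data = {}

    def post(self, key, value):
        self.data[key] = value

    def read(self, key):
        return self.data.get(key)

def nlp_layer(bb):
    # Hypothetical stand-in for the NLP layer: echo the input in a reply.
    bb.post("reply_text", "I see. " + bb.read("input"))

def emotion_layer(bb):
    # Hypothetical stand-in for emotion recognition: a trivial keyword check.
    text = bb.read("input")
    bb.post("emotion", "happiness" if "happy" in text else "neutral")

def facial_display(bb):
    # The display layer reads the recognized emotion off the blackboard.
    bb.post("display", bb.read("emotion"))

def run(user_input):
    bb = Blackboard()
    bb.post("input", user_input)
    for module in (nlp_layer, emotion_layer, facial_display):
        module(bb)
    return bb.read("reply_text"), bb.read("display")
```

The point of the design is that modules never call each other directly; they only communicate through the blackboard, so layers can be added or replaced independently.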

16 NLP Layer
- String converted to a list of words by the parser
- Spelling checked
- Abbreviations replaced
- Slang words and codes replaced with correct ones
- Some words replaced with synonyms from a thesaurus
- Input matched with predefined patterns by the syntactic-semantic analyzer
- Longest matching string used to generate the reply
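The normalization and longest-match steps above can be sketched as follows. The lookup tables and reply templates are hypothetical; the real system's dictionaries and AIML patterns are not given in the source.

```python
import re

# Hypothetical normalization tables (abbreviation and synonym replacement).
ABBREVIATIONS = {"u": "you", "r": "are", "pls": "please"}
SYNONYMS = {"glad": "happy", "unhappy": "sad"}

# Predefined patterns, each paired with a reply template.
PATTERNS = [
    (re.compile(r"\bi am (happy|sad)\b"), "Why are you {0}?"),
    (re.compile(r"\bi am (happy|sad) because (.+)"), "Does {1} often make you feel {0}?"),
]

def normalize(text):
    """Lowercase, tokenize, expand abbreviations, substitute synonyms."""
    words = text.lower().split()
    words = [ABBREVIATIONS.get(w, w) for w in words]
    words = [SYNONYMS.get(w, w) for w in words]
    return " ".join(words)

def reply(text):
    """Try every pattern and keep the one with the longest matching string."""
    text = normalize(text)
    best = None
    for pattern, template in PATTERNS:
        m = pattern.search(text)
        if m and (best is None or len(m.group(0)) > len(best[0].group(0))):
            best = (m, template)
    if best is None:
        return "Tell me more."
    m, template = best
    return template.format(*m.groups())
```

Note how the second, more specific pattern wins over the first whenever both match, because its matched string is longer; that is the "longest matching string" rule from the slide.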

17 NLP Layer
- Repetition recognition ensures the dialog does not enter a loop
- Rules written in AIML (Artificial Intelligence Markup Language)
- Pragmatic analysis module checks the reply against user preferences collected during the conversation, and against the goals and states of the system

18 Emotion recognition layer
Emotive Lexicon Look-up:
- Parser used to extract emotion-eliciting factors
- Based on a lexicon of words with emotional content: 247 words, each with a natural-number intensity
- Overall emotional content of a string obtained from seven ‘thermometers’, which are updated whenever an emotionally rich word is found
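The thermometer mechanism can be sketched like this: each lexicon word carries an intensity, and finding it bumps the matching thermometer. The lexicon entries and emotion labels below are illustrative assumptions; the real lexicon holds 247 words.

```python
# Seven thermometers, one per tracked emotional category (labels assumed).
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust", "neutral"]

# Tiny hypothetical slice of the emotive lexicon: word -> (emotion, intensity).
LEXICON = {
    "love":  ("happiness", 3),
    "great": ("happiness", 2),
    "hate":  ("anger", 3),
    "awful": ("disgust", 2),
    "scary": ("fear", 2),
}

def score_string(text):
    """Update the seven thermometers from the emotional words in a string."""
    thermometers = {e: 0 for e in EMOTIONS}
    for word in text.lower().split():
        if word in LEXICON:
            emotion, intensity = LEXICON[word]
            thermometers[emotion] += intensity
    return thermometers
```

For example, a sentence containing both "love" and "great" pushes the happiness thermometer to 5 while the others stay where they are.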

19 Emotion recognition layer
Emotive Labeled Memory Structure Extraction:
- Labels each pattern and the corresponding rules
- Two additional AIML tags used, ‘affect’ and ‘concern’, with values positive, negative, joking, normal
Goal-Based Emotion Reasoning:
- Stores the user’s personal data
- Two knowledge bases determine the affective state:
  - Stimulus response to the user’s input
  - Result of the cognitive process of the conversation, used to convey the reply

20 Preference rules - examples
IF (user is happy) AND (user asks question) AND (system’s reply is sad) AND (situation type of user is not negative) AND (highest thermometer is happy) THEN reaction is joy.
IF (user is sad) AND (system’s reply is sad) AND (situation type of user is joking) AND (situation type of the system is negative) AND (maximum affective thermometer is sad) THEN reply is resentment.
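Rules of this IF/AND/THEN shape can be evaluated as simple condition tables over a conversation state. This is a sketch under assumed field names, not the system's rule engine; the negated condition ("situation type of user is not negative") is omitted here for brevity, since it would need a predicate rather than an equality test.

```python
# Each rule: a dict of required state values, and the resulting reaction.
RULES = [
    (
        {"user_mood": "happy", "user_act": "question",
         "system_reply_mood": "sad", "highest_thermo": "happy"},
        "joy",
    ),
    (
        {"user_mood": "sad", "system_reply_mood": "sad",
         "user_situation": "joking", "system_situation": "negative",
         "highest_thermo": "sad"},
        "resentment",
    ),
]

def react(state):
    """Return the reaction of the first rule whose conditions all hold."""
    for conditions, reaction in RULES:
        if all(state.get(k) == v for k, v in conditions.items()):
            return reaction
    return "neutral"
```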

21 Facial display selection
- Intensity of an emotion must exceed a threshold level before it can be expressed externally
- If an emotion is active, the system calculates the values of all thermometers
- The thermometer with the highest value is chosen as the emotion
- The intensity of the emotion determines the facial display
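The selection step above amounts to an argmax over the thermometers gated by a threshold. The threshold value and the intensity bands below are assumptions for illustration; the source does not give concrete numbers.

```python
# Assumed threshold: emotions at or below this intensity are not expressed.
THRESHOLD = 3

def select_display(thermometers):
    """Pick the highest thermometer; express it only past the threshold.

    Returns (emotion, display) or None if no emotion crosses the threshold.
    """
    emotion = max(thermometers, key=thermometers.get)
    intensity = thermometers[emotion]
    if intensity <= THRESHOLD:
        return None
    # Hypothetical mapping from intensity to facial display strength.
    display = "mild" if intensity < 6 else "strong"
    return emotion, display
```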

22 Other work in this area
Emotionally Oriented Programming (EOP):
- Allows programmers to explicitly represent and reason about emotions
- Can build Emotional Machines (EMs): intelligent software agents with explicit programming constructs for concepts like mood, feelings, and temperament
- Inspiration: thoughts and feelings are intertwined; analysis of thought inspires feelings, and feelings inspire the creation of thoughts


24 Other work in this area
Emotional Model for Intelligent Response (EMIR):
- Developed by Mindsystems, an Australian company
- Includes simulations of feelings such as boredom!
- Methodology: looks at the factors influencing a character (success at achieving goals, the character’s level of control over the situation) and compares this “state of mind” to a database of human responses mapped over time
- Was in the demo stage in 2002

25 Other work in this area
Emotionally Rich Man-machine Intelligent System (ERMIS):
- Aims to develop a prototype system for human-computer interaction that can interpret its user’s attitude or emotional state (e.g., activation/interest, boredom, anger) from their speech and/or their facial gestures and expressions
- Adopted techniques include linguistic speech analysis, robust speech recognition, and facial expression analysis

26 Other work in this area
Net Environment for Embodied, Emotional Conversational Agents (NECA):
- Promotes the concept of multi-modal communication with animated synthetic personalities
- Key challenge: the fruitful combination of different research strands, including situation-based generation of natural language and speech and the modeling of emotions and personality

27 Conclusion
“The question is not whether intelligent machines can have emotions, but whether machines can be intelligent without any emotions.”
(Marvin Minsky, The Society of Mind)

28 Bibliography
Emotional machines - http://www.emotionalmachines.com
Emotional machines - Do we want them? - http://www.zdnet.com.au/news/communications/0,2000061791,20266134,00.htm
Marvin Minsky Home Page - http://web.media.mit.edu/~minsky/
Multi-Modal ELIZA - http://mmi.tudelft.nl/pub/siska/_TSD%20my_eliza.pdf
The WE-4RII - http://www.takanishi.mech.waseda.ac.jp/research/eyes/
Small Wonder - http://www.smallwonder.tv/
The HUMAINE Portal - http://emotion-research.net
ERMIS - http://manolito.image.ece.ntua.gr/ermis
NECA - http://www.oefai.at/NECA

