
1 PROGRESS ON EMOTION RECOGNITION J G Taylor & N Fragopanagos King’s College London

2 KCL WORK IN ERMIS
- Analysis of emotion vs cognition in the human brain (→ simulations of emotion/attention paradigms) → emotion recognition architecture ANNA
- ANNA hidden layer = emotion state, plus feedback control for attention (= IMC)
- Learning laws for ANNA developed
- ANNA fuses all modalities, or only one
- HUMAINE: WP3 + WP4

3 BASIC BRAIN EMOTION CIRCUIT
- Valence in amygdala & OBFC
- Attention in parietal & PFC
- Interaction in ACG
(Diagram labels: SC, Parietal, A, Thal, ACG, SFG, NBM)

4 SIMPLIFIED ARCHITECTURE OF EMOTIONAL/COGNITIVE PROCESSING IN THE BRAIN:

5 DETAILED ARCHITECTURE FOR FACES
(Diagram labels: CLASSIFICATION, gender)

6 BASIC ERMIS EMOTION RECOGNITION ARCHITECTURE
- Feature vector inputs
- Emotion state as hidden layer
- Attention control system
- Output as recognised emotional state

7 ANNA
- Assume linear output (equation on slide)
- Hidden layer response (equation on slide)
- IMC node response (equation on slide)
- Then solve the self-consistent equations for (y, z) for each training input by relaxation
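The slide's actual equations do not survive in this transcript, so the following is only a hedged sketch of the relaxation idea: the hidden-layer (emotion state) vector y and the attention-controller (IMC) node z are iterated together until they stop changing, then the linear output is read off. The tanh nonlinearities, the single scalar IMC node, and the weights W, v, A and attention-coupling vector c are all illustrative assumptions, not ANNA's published forms.

```python
import numpy as np

def relax_anna(x, W, v, A, c, tol=1e-6, max_iter=1000):
    """Illustrative relaxation for a coupled (y, z) state.

    Assumed (hypothetical) forms, not the slide's actual equations:
        y = tanh(W @ x + c * z)   # hidden layer (emotion state), attention-modulated
        z = tanh(v @ y)           # single IMC (attention controller) node
        out = A @ y               # linear output, as the slide assumes
    Iterates until (y, z) reach a self-consistent fixed point.
    """
    y = np.zeros(W.shape[0])
    z = 0.0
    for _ in range(max_iter):
        y_new = np.tanh(W @ x + c * z)
        z_new = float(np.tanh(v @ y_new))
        if np.max(np.abs(y_new - y)) < tol and abs(z_new - z) < tol:
            y, z = y_new, z_new
            break
        y, z = y_new, z_new
    return A @ y, y, z

# Tiny demo with small random weights (contraction, so the iteration converges)
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 3))
v = 0.1 * rng.standard_normal(4)
A = rng.standard_normal((2, 4))
c = 0.1 * rng.standard_normal(4)
x = rng.standard_normal(3)
out, y, z = relax_anna(x, W, v, A, c)
```

With small weights the map is a contraction, so the fixed point is reached in a few iterations; training would then adjust the weights against the target output.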

8 NATURE OF ANNA
- Handles both unimodal and multimodal data (input vector x of arbitrary dimension, not too large)
- Needs consistent input and output data {x(t), OUT(t)}, with t specified for both x & OUT = (activation, evaluation)
- Uses SALAS database (450 tunes) from QUB (Roddie/Ellie/Cate)

9 UNIMODAL RESULTS
- Can use numerous representations of emotion: extreme, continuous in n dimensions, …
- ANNA → FEELTRACE output (continuous 2-D)
- Trained on unimodal data for prosody
- First look at word content

10 Text Post-Processing Module
- Prof. Whissell compiled the 'Dictionary of Affect in Language' (DAL)
- Mapping of ~9000 words → (activation, evaluation), based on students' assessments
- Take words from meaningful segments obtained by pause detection → (activation, evaluation) space
- But humans use context to assign emotional content to words
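The per-segment mapping above can be sketched as a dictionary lookup plus averaging. The words and ratings in the toy dictionary below are invented for illustration; the real DAL covers ~9000 words with student-assessed scores.

```python
# Toy stand-in for Whissell's DAL: word -> (evaluation, activation).
# These four entries and their numbers are invented for illustration only.
TOY_DAL = {
    "happy": (2.8, 2.5),
    "sad":   (1.2, 1.4),
    "angry": (1.1, 2.7),
    "calm":  (2.4, 1.2),
}

def segment_emotion(words, dal=TOY_DAL):
    """Map a pause-delimited word segment to (evaluation, activation)
    by averaging the ratings of the words the dictionary covers."""
    hits = [dal[w] for w in words if w in dal]
    if not hits:
        return None  # no rated words in this segment
    ev = sum(h[0] for h in hits) / len(hits)
    ac = sum(h[1] for h in hits) / len(hits)
    return ev, ac

result = segment_emotion(["i", "am", "happy", "and", "calm"])
```

This context-free averaging is exactly what the last bullet warns about: two segments with the same words get the same point regardless of context.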

11 Text Post-Processing Module (SALAS data)

Table 1. Quadrant match for normal text (full DAL).
Participant         P1    P2    P3    P4    P9    P12   All
Quadrant match (%)  21.4  12.5  21.4  30.4  25.0  19.6  16.1

Table 2. Quadrant match for scrambled text (full DAL).
Participant         P5    P6    P7    P8    P10   P11   All
Quadrant match (%)  7.1   23.2  25.0  32.1  23.2  21.4  21.4

Table 3. Standard deviation of participants' assessments for normal and scrambled text (averaged over all passages assessed).
            Normal  Scrambled
Evaluation  1.24    1.45
Activation  1.55    1.73

Table 4. Quadrant match averaged over participant groups for normal and scrambled text as the threshold for DAL range* is varied.
Threshold       0.0   0.25  0.5   0.75
Normal text     16.1  16.0  12.5  16.4
Scrambled text  21.4  21.4  19.6  21.8

*The higher the threshold, the more only strongly emotionally rated words are spotted.

Conclude: further context/semantics needed.
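The quadrant-match percentages in the tables above can be computed with a few lines, assuming FeelTrace-style (evaluation, activation) coordinates centred at the origin (the centring convention is an assumption here):

```python
def quadrant(ev, ac):
    """Quadrant of the (evaluation, activation) plane, origin-centred."""
    return (ev >= 0.0, ac >= 0.0)

def quadrant_match(pred, target):
    """Percentage of segments whose predicted (evaluation, activation)
    point lands in the same quadrant as the human FeelTrace point."""
    hits = sum(quadrant(*p) == quadrant(*t) for p, t in zip(pred, target))
    return 100.0 * hits / len(pred)

# Four illustrative segments: three land in the matching quadrant.
pred = [(0.5, 0.5), (-0.2, 0.3), (0.1, -0.4), (-0.5, -0.5)]
target = [(0.9, 0.1), (0.2, 0.3), (0.1, -0.9), (-0.1, -0.1)]
score = quadrant_match(pred, target)
```

Chance level for four quadrants is 25%, which is why the ~16-32% figures in the tables motivate the "need further context/semantics" conclusion.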

12 Correlational analysis of ASSESS features
Correlational analysis between ~450 ASSESS features and FeelTrace shows:
- ASSESS features correlate more highly with activation
- Similar top-ranking features for 3 out of 4 FeelTracers (but still differences)
- Different top-ranking features for different SALAS subjects
→ Is there a male/female trend? Difficult to say: insufficient data
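A ranking of this kind reduces to computing Pearson correlations between each feature column and a FeelTrace trace, then sorting by absolute value. This is a generic sketch, not the project's actual analysis code; the array shapes are assumptions.

```python
import numpy as np

def top_correlated(features, target, k=10):
    """Rank feature columns by |Pearson r| against a target trace.

    features: (n_samples, n_features) array of ASSESS-style features
    target:   (n_samples,) activation or evaluation trace
    Returns (indices of the k best features, all correlations r).
    """
    f = features - features.mean(axis=0)
    t = target - target.mean()
    r = (f.T @ t) / (np.linalg.norm(f, axis=0) * np.linalg.norm(t))
    return np.argsort(-np.abs(r))[:k], r

# Demo: column 0 is a scaled copy of the target (r = 1), column 1 is noise.
target = np.array([1.0, 2.0, 3.0, 4.0])
features = np.column_stack([2 * target, [1.0, -1.0, 1.0, -1.0]])
best, r = top_correlated(features, target, k=1)
```

Repeating this per FeelTracer and comparing the resulting index lists is one way to see the "similar top-ranking features for 3 out of 4 FeelTracers" observation.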

13 ANNA on top correlated ASSESS features
Quadrant match using top 10 activation features + top 10 evaluation features, with an activation-evaluation output space:
FeelTracer      jd    cc    dr    em
Avg Quad Match  0.42  0.39  0.37  0.45
Std Dev         0.03  0.02  0.02  0.04

14 ANNA on top correlated ASSESS features
Half-plane match using top 10 activation features, with an activation-only output space:
FeelTracer      jd    cc    dr    em
Avg Quad Match  0.75  0.66  0.64  0.74
Std Dev         0.02  0.02  0.02  0.03
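The half-plane metric is the one-dimensional analogue of the quadrant match: only the sign of activation has to agree. A minimal sketch, assuming an origin-centred activation axis:

```python
def half_plane_match(pred_act, target_act, threshold=0.0):
    """Percentage of segments where predicted activation falls on the
    same side of the threshold as the FeelTrace activation."""
    hits = sum((p >= threshold) == (t >= threshold)
               for p, t in zip(pred_act, target_act))
    return 100.0 * hits / len(pred_act)

# Illustrative activations only; three of four agree in sign.
score = half_plane_match([0.3, -0.2, 0.5, -0.6], [0.1, 0.4, 0.9, -0.1])
```

Chance level here is 50%, so the 0.64-0.75 scores in the table sit clearly above chance, consistent with the conclusion that activation is the better-recognised dimension.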

15 PRESENT SITUATION OF ANNA: MULTIMODAL
- Time-stamped data now becoming available for lexical (ILSP) & face streams (NTUA)
- Expect results in about 1 month for recognition on fused modalities (faces/prosody/words)

16 CONCLUSIONS
- UNIMODAL: ANNA on prosody OK (especially activation)
- MULTIMODAL: soon to be done, on semi-realistic data (SALAS, QUB)
- Future work: 1) analysis of detailed results; 2) insert temporality into ANNA

17 QUESTIONS
- How to handle variations across experiencers and across FeelTracers?
- How to incorporate expert knowledge?
- How to combine recognition across models?
- Coding of emotions: as dimensional representations or as dissociated states (sad: amygdala vs angry: OBFC)?
- Nature of emotions as goal/reward assessment (frustration → anger; impossible goal → sadness, etc.: brain-based)?

