Human Social Interaction Research proposal. Dr. Roger Newport, Room B47. Drop-in times: Tuesdays 12-2. Understanding Emotion: visual recognition.


1 Understanding Emotion: visual recognition. Human Social Interaction Research proposal. Dr. Roger Newport, Room B47. Drop-in times: Tuesdays 12-2.

2 Introduction to facial emotions; the neuroscience of fear and disgust (the simple story); other emotions (the complicated story); current research.

3 Understanding Emotions: Lecture Overview. What are facial expressions of emotion and what are they for? Are there specific centres in the brain dedicated to emotion perception? Are different emotions processed in different ways?

4 Why are we interested in emotion perception? Evolutionary survival; social survival.

5 Facial expressions of emotion: what are the emotions? Motivational: thirst, hunger, pain, mood (do not feature prominently in social communication). Self-conscious/social: shame, embarrassment, pride, guilt (regulate social behaviour). Basic: happiness, fear, anger, surprise, disgust, sadness (feature prominently in social communication).

6 Faces are special. Face perception may be the most developed visual perceptual skill in humans. Infants prefer to look at faces from shortly after birth (Morton and Johnson 1991). Most people spend more time looking at faces than at any other type of object. We seem to have the capacity to perceive the unique identity of a virtually unlimited number of different faces.

7 Understanding Emotion: from facial expressions. Facial expressions as a communicative tool. We laugh more if in a group, and show distress more if in a group. Babies (10 months) almost only smile in the presence of a caregiver. Babies look to the caregiver and behave according to the caregiver's response when encountering a novel object, e.g. a barking dog or a snake; this is known as social referencing and is also seen in chimpanzee societies. A similar process, observational fear, is seen in other monkeys: infant monkeys showed a fearful unconditioned response to their mother's expression of fear when the mother could see a snake but the infants could not. That is, infants showed a fear response to the mother's fear response.

8 Figure: percentage of facial responses to an unpleasant odour classified as unpleasant, neutral, or pleasant in a spontaneous condition, a posed-to-a-real-person condition, and a posed-to-an-imaginary-audience condition (in Erickson and Schulkin, facial expressions as communication).

9 Facial expressions as communication. Facial expressions allow for rapid communication. They are produced when there is an emotional stimulus and an audience present. Our interpretation of another's emotion modulates our behaviour, and vice versa. The ability to recognise emotion expressions appears very early: in the first few days, neonates can distinguish between expressions of happiness, sadness, and surprise; four- to six-month-olds show preferences for facial expressions of happiness over neutral and angry expressions; seven-month-olds can distinguish among expressions of fear, anger, surprise, happiness, and sadness.

10 Recognition as an automatic process: fear and threat. Angry faces are detected much more rapidly than faces depicting non-threatening expressions (Ohman et al., 2001). Attention is driven by fear.

11 Fear and the amygdala. Automatic processes = dedicated network. Evidence from animal, neuropsychological and imaging studies suggests that the amygdala is of primary importance in the recognition of fear.

12 Fear and the amygdala: evidence from animal studies. Bilateral amygdala removal reduces levels of aggression and fear in rats and monkeys; facial expressions and vocalisations become less expressive; and fear conditioning is impaired.

13 Fear and the amygdala: evidence from human neuropsychology. Bilateral amygdala damage reduces recognition of fear-inducing stimuli, reduces recognition of fear in others, and reduces the ability to express fear. It does NOT affect the ability to recognise faces or to know what fear is. See patients SM, DR and SE (Adolphs et al. and Calder et al.). Alzheimer's disease impairs fear conditioning.

14 Fear and the amygdala: evidence from human imaging. Results from several studies show increased amygdala activity for facial expressions of fear vs. happiness, disgust, anger and neutral, in fear face recognition and in fear conditioning. Non-conscious processing of fear expressions: subliminal activation of the amygdala to fear. Neuromodulatory role of the left amygdala: less fear = less activity.

15 A typical study: Amygdala Response to Facial Expressions in Children and Adults (Thomas et al., 2001). Blocks of fixation and fear/neutral faces; no task, just watch. Left amygdala activation for fear vs. fixation in male children and adults. Overall, adults showed greater amygdala activation for fear vs. neutral whereas children did not (neutral faces may be ambiguous).

16 Methods and Materials. Subjects. Six male adults (mean 24 years, SD 6.6 years) and 12 children (mean 11 years, SD 2.4 years) recruited in the Pittsburgh area were scanned in a 1.5-T scanner during passive viewing of fearful and neutral faces. The children, six female and six male, ranged in pubertal development from Tanner stages I/I to V/IV. Male and female subjects did not differ in mean age or Tanner stage. Data from an additional three adults (three female) and four children (two female) were not included due to excessive motion artifact (>0.5 voxels; n = 5), claustrophobia (n = 1), or because the subject fell asleep during the task (n = 1). Subjects were screened for any personal or family history of psychiatric or medical illness, and for any contraindications for an MRI. Written child assent and parental consent were acquired before the study.

17 Behavioral Paradigm. The task consisted of the rapid and successive presentation of faces in blocks of neutral and emotional expressions. The face stimuli consisted of digitized fearful and neutral faces taken from the Ekman and Friesen (1976) study (Figure 1). A total of eight different actors (four male and four female) demonstrating both fearful and neutral expressions were used. The hair was stripped from the images to remove any nonfacial features, and both fear and exaggerated fear poses were used for each actor (Calder et al 1997), resulting in a total of 16 fear stimuli and eight neutral stimuli. Stimuli were presented for 200 msec with an interstimulus interval of 800 msec (flashing fixation point). Each block of trials consisted of the presentation of a flashing fixation point for 45 sec followed by alternating 42-sec blocks of either neutral or fearful expressions and a final 45-sec epoch of fixation (Figure 1). This procedure was repeated in three runs of trials with the presentation order counterbalanced across runs and across subjects (i.e., F-N-F-N-F or N-F-N-F-N). Following Breiter and colleagues' design (Breiter et al 1996), no overt response was required. Instead, subjects were instructed to fixate centrally and to try to get an overall sense of the faces.
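The run structure above can be sketched as a small schedule generator. This is an illustrative reconstruction, not the authors' code; the function and event labels are assumptions, and times are kept in integer milliseconds to avoid floating-point drift.

```python
# Sketch (assumed names) of the block-design timing described above:
# 45 s initial fixation, alternating 42 s blocks of neutral/fearful
# faces (200 ms face + 800 ms flashing-fixation ISI per trial),
# and a final 45 s fixation epoch.

def build_run(order):
    """Return (events, total_ms) for one run.

    order  -- block sequence, e.g. ['F', 'N', 'F', 'N', 'F']
    events -- list of (onset_ms, label) tuples
    """
    events, t = [], 0
    events.append((t, 'fixation'))
    t += 45_000                              # initial 45 s fixation
    for block in order:
        for _ in range(42):                  # 42 s block = 42 x (200 + 800) ms trials
            events.append((t, block + '_face'))
            t += 200                         # face shown for 200 ms
            events.append((t, 'isi'))
            t += 800                         # flashing fixation point
    events.append((t, 'fixation'))
    t += 45_000                              # final 45 s fixation epoch
    return events, t

events, total_ms = build_run(['F', 'N', 'F', 'N', 'F'])
print(total_ms // 1000)   # 300 s per run, matching 100 volumes at TR = 3 s
```

Note the timing is internally consistent with the acquisition parameters in the next slide: 45 + 5 × 42 + 45 = 300 s per run, i.e. 100 images per slice at TR 3000 over each of three runs.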

18 Image Acquisition, Processing, and Analysis. Scans were acquired on a 1.5-T GE Signa scanner (General Electric Systems, Milwaukee) modified for echo planar imaging (Advanced NMR, Wilmington, MA) using a quadrature head coil. A T1-weighted sagittal localizer image was used to prescribe the functional slice locations. T1-weighted structural images were acquired in 4-mm contiguous coronal slices through the whole brain (echo time [TE] min, repetition time [TR] = 500, matrix , field of view [FOV] = 20) for purposes of localizing the functional activity and aligning images in Talairach space (Talairach and Tournoux 1988). Functional images (T2*) were acquired at 12 of these slice locations spanning the entire amygdala (A20 to P24 in Talairach coordinates) using an EPI BOLD sequence (TE = 40, TR = 3000, flip angle 90°, matrix , FOV = 20, 4-mm skip 0, voxel size mm). There were three runs of 100 images, totaling 300 images per slice. Images were motion corrected and normalized. All 18 subjects had less than 0.5 voxels of in-plane motion. All images were registered to a representative reference brain using Automated Image Registration software (Woods et al 1992), and voxelwise analyses of variance (ANOVAs) were conducted on these pooled data using normalized signal intensity as the dependent variable (Braver et al 1997; Casey et al 2000). Separate analyses were conducted comparing male adults and male children, and comparing male and female children, to examine interactions of stimulus type (fearful faces, neutral faces, fixation) with age or gender, respectively. Significant activations were defined by at least three contiguous voxels and alpha = .05 (Forman et al 1995). Amygdala activation was defined on the reference brain using Talairach coordinates and consensus among three raters (BJC, KMT, PJW). Significant regions that extended outside of the brain or had large SDs were excluded.
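The extent threshold above ("at least three contiguous voxels" at alpha = .05) can be illustrated with a toy connected-component pass. This is a sketch of the general technique, not the Forman et al. implementation; the 2D grid and 4-connectivity neighbourhood are simplifying assumptions.

```python
# Toy cluster-extent threshold: a voxel survives only if it is
# significant (p < alpha) AND belongs to a cluster of at least
# `min_size` contiguous significant voxels.

def cluster_threshold(sig, min_size=3):
    """sig: 2D list of booleans (True = voxel significant at p < .05).
    Returns a mask of voxels surviving the extent threshold.
    Contiguity is 4-connectivity (an assumption)."""
    rows, cols = len(sig), len(sig[0])
    seen = [[False] * cols for _ in range(rows)]
    out = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if sig[r][c] and not seen[r][c]:
                # flood-fill one cluster of contiguous significant voxels
                stack, cluster = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    cluster.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and sig[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(cluster) >= min_size:     # keep only large clusters
                    for y, x in cluster:
                        out[y][x] = True
    return out

sig = [[True, True, False],
       [True, False, False],
       [False, False, True]]     # isolated voxel at (2,2) should not survive
mask = cluster_threshold(sig)
```

Here the three-voxel cluster in the top-left corner survives, while the lone significant voxel at (2, 2) is discarded, which is exactly what the extent criterion buys: protection against isolated false positives.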

19 Results: Adults and Children. A 2 × 2 (Group × Condition) ANOVA comparing male adults (n = 6) and male children (n = 6) revealed significant activity in the left amygdala and substantia innominata for fearful faces relative to fixation (Figure 2), and a decrease in signal with repeated presentations of the fearful faces (Table 1). Neutral faces showed a similar pattern of activation relative to fixation trials (F = 23.71, p < .001). A significant interaction was observed in the left amygdala between stimulus type and age for the comparison of fearful and neutral expressions (Table 1) (Group × Condition, Fear vs. Neutral). Post hoc t tests indicate that adults demonstrated significantly greater activity for fearful faces relative to neutral faces (p < .001). However, the children demonstrated greater amygdala activity for neutral faces than for fearful expressions (p < .0001) (Figure 3). Neither age nor Tanner stage predicted the magnitude of the percent change in signal in this sample.


21 Warning about other brain regions. A variety of brain regions are involved in the processing of facial expressions of emotion. They are active at different times, and some structures are active at more than one time. The amygdala is particularly implicated in the processing of fear stimuli, receiving early (<120 ms) subcortical as well as late (~170 ms) cortical input from the temporal lobes.

22 Amygdala response to fear: special for faces? Getting rid of unwanted activations. The Amygdala Response to Emotional Stimuli: A Comparison of Faces and Scenes (Hariri et al., 2002). Blocked design, matching task. Preferential right amygdala response to faces (faces > IAPS).

23 Emotions: not just an ugly face (Adolphs and Tranel, 2003; Hadjikhani and de Gelder). Controls are better when faces are present; bilateral amygdala patients are worse when faces are present, and are often better at negative stimuli without faces.

24 Emotions: the importance of the eyes.

25 Break.


27 The contribution of the eyes to facial expressions of fear. What emotion do these eyes depict?

28 Emotions: the importance of the eyes. The amygdala is responsive to large eye whites in fear (and surprise) expressions (Whalen et al., 2004): amygdala activation above fixation baseline (% signal change from fixation) for non-inverted (eye-white) fearful eyes.

29 Emotions: the importance of the eyes. The amygdala, fear and the eyes: SM bubble analysis (Adolphs et al.).

30 Emotions: the importance of the eyes. SM's eye fixation (or lack of it).

31 Emotions: the importance of the eyes. When told to look at the eyes specifically, SM improves, but only while given this instruction.

32 Emotions: amygdalae are not simply eye detectors. Hybrid faces from Vuilleumier. We need more than just the eyes to determine emotional and social relevance: some expressions are easy to tell from the eyes, others from the mouths. The amygdalae are not just eye detectors; they may direct attention to relevant stimuli, acting as a biological relevance detector.

33 Yuck! Disgust and the insula (and basal ganglia). Animal studies: insula = gustatory cortex; impaired taste aversion in rats. Human neuropsychology: patient NK; Huntington's disease; Tourette's and OCD; electrical stimulation = nausea; repeated exposure leads to habituation. Human imaging: Phillips et al.; Wicker et al.

34 Disgust: evidence from imaging. Both of Us Disgusted in My Insula: The Common Neural Basis of Seeing and Feeling Disgust (Wicker et al.). Subjects (1) observed actors smelling and reacting to bad, nice and neutral odours, and (2) smelt bad and nice odours (+ rest), in separate visual and olfactory runs; overlay analysis.

35 Disgust vs. fear summary. Fear (amygdala): activated by fear-inducing stimuli; habituates to fear; removal or damage disproportionately impairs fear recognition and feelings of fear. Disgust (insula and basal ganglia): activated by facial expressions of disgust; the insula habituates to disgust; removal, damage or degeneration of either structure disproportionately impairs disgust recognition and feelings of disgust. A double dissociation: conclude that the neural mechanisms for fear and disgust are anatomically and functionally distinct.

36 Other basic emotions: implicated brain regions (Duvernoy 1991, but see Kessler/West et al., 2001). Figure legend: green = neutral, red = anger, purple = fear, yellow = happy, blue = sad.

37 Other basic emotions: implicated brain regions. Tissue loss associated with specific emotion recognition impairment (Rosen et al.). In 50 patients with neurodegenerative dementia, impaired recognition of negative emotions (red, rlITG/rMTG) and in particular sadness (green, rSTG) correlated with tissue loss in right lateral inferior temporal and right middle temporal regions. This reflects these areas' role in the visual processing of negative emotions.

38 How does knowledge about brain activation help social psychologists?

39 Recent Research, June 2006. An event-related brain potential study. Rationale: age is related to decreasing cognitive function, especially frontal functions. Is emotion processing affected by advancing age? Emotional intensity is a frontal function; are old folk impaired at emotion intensity recognition? Methods: investigated using ERP (EEG) and analysed by ANOVA.

40 Results: a delay in early discrimination processing in the old group relative to the young, but no difference in emotion discrimination.

41 Recent Research, January 2007. Perceiving fear in dynamic body expressions. Rationale: emotional body language is important when the face cannot be seen, and is important for survival, so it should be fast, automatic, and employ a dedicated brain network. We know which parts of the brain are active for static emotion, and we know that other parts of the brain are active for body motion. How do these interact for emotive whole-body dynamic stimuli?

42 Methods: used event-related fMRI and video clips of fear and neutral body movements, with scrambled static and dynamic video images as controls. The task was to press a button when an inverted image was seen (therefore incidental processing was being measured).

43 Analyses: 1. Main effect of bodies vs. scrambled stimuli [(Fs+Fd+Ns+Nd) - 2(Sd+Ss)]. 2. Main effect of fear vs. neutral bodies [(Fs+Fd) - (Ns+Nd)]. 3. Main effect of dynamic vs. static bodies [(Fd+Nd) - (Fs+Ns)]. Results: the amygdala is active for social stimuli; it is not bothered whether the stimulus is static or dynamic, but is more bothered when it is fearful.
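These three main effects can be written as zero-sum weight vectors over the six conditions (Fs/Fd = fear static/dynamic, Ns/Nd = neutral, Ss/Sd = scrambled). The condition ordering here is an assumption for illustration, not taken from the paper.

```python
# Sketch of the three fMRI contrasts as weight vectors over the
# (assumed) condition order [Fs, Fd, Ns, Nd, Ss, Sd].
# Each contrast sums to zero, as a balanced main-effect contrast must.

conditions = ['Fs', 'Fd', 'Ns', 'Nd', 'Ss', 'Sd']

contrasts = {
    # bodies vs. scrambled: (Fs+Fd+Ns+Nd) - 2(Sd+Ss)
    'bodies_vs_scrambled': [1, 1, 1, 1, -2, -2],
    # fear vs. neutral bodies: (Fs+Fd) - (Ns+Nd)
    'fear_vs_neutral':     [1, 1, -1, -1, 0, 0],
    # dynamic vs. static bodies: (Fd+Nd) - (Fs+Ns)
    'dynamic_vs_static':   [-1, 1, -1, 1, 0, 0],
}

for name, w in contrasts.items():
    assert sum(w) == 0, name   # balanced: positive and negative weights cancel
```

Note how the scrambled conditions carry weight -2 in the first contrast: four body conditions are weighted against two control conditions, so the weights must be doubled on the control side to keep the contrast balanced.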

44 Results: other brain regions are bothered that it is dynamic (and fearful). These regions will be covered in later lectures.

45 Recent Research, June 2007. Attention to the person or the emotion: underlying activations in MEG. Rationale: facial emotion processing is fast (100 ms) and automatic, and occurs regardless of whether you attend to the face or not. Facial identity processing is also fast (but slower) and occurs in parallel according to most models. But there is some evidence from schizophrenia suggesting that the parallel (and therefore separate) brain regions interact. What happens to this interaction when you attend to either emotion or identity?

46 Methods and Results: used MEG and happy/fear/neutral faces. Identity task: press a button when two identities are the same. Emotion task: press a button when two emotions are the same. There was a 90 ms orbito-frontal response to emotion regardless of attention, a 170 ms right insula response when attending to emotion, and also a 220 ms activation increase for areas associated with identity processing. Conclusions: so there you go.

47 Recent Research, Oct 2007. Impaired facial emotion recognition and reduced amygdalar volume in schizophrenia. Rationale: amygdala volume is known to be reduced in schizophrenia, and emotion recognition is known to be impaired in schizophrenia, but a direct link between the two has not been studied (properly) before. Methods: 20 schizophrenia patients and 20 controls, 3-T MRI, and a facial emotion intensity recognition task.

48 Results: (1) the schizophrenia patients had smaller amygdalar volumes than the healthy controls; (2) the patients showed impairment in recognizing facial emotions, specifically anger, surprise, disgust, and sadness; (3) the left amygdala volume reduction in these patients was associated with impaired recognition of sadness in facial expressions.

49 Summary. Distinct neural pathways underlie the processing of signals of fear (amygdala) and disgust (insula/basal ganglia) in humans. This dissociation can be related to the adaptive significance of these emotions as responses to critical forms of threat that are associated with external (fear) and internal (disgust) defence systems. According to LeDoux, social neuroscience has been able to make progress in the field of emotion by: focusing on a psychologically well-defined aspect of emotion; using an experimental approach to emotion that simplifies the problem in such a way as to make it tractable; circumventing vague and poorly defined aspects of emotion; and removing subjective experience as a roadblock to experimentation.

50 Next week Emotion recognition from auditory cues and theories of emotion
