1 Human Social Interaction
Understanding Emotion: visual recognition
Research proposal
Dr. Roger Newport, Room B47
Drop-in times: Tuesdays 12-2
2
Introduction to facial emotions
The neuroscience of Fear and Disgust (the simple story)
Other emotions (the complicated story)
Current research
3 Lecture Overview
Understanding Emotions
What are facial expressions of emotion and what are they for?
Are there specific centres in the brain dedicated to emotion perception?
Are different emotions processed in different ways?
4 Why are we interested in emotion perception?
Evolutionary survival
Social survival
5 Facial expressions of emotion - what are the emotions?
Motivational: thirst, hunger, pain, mood - do not feature prominently in social communication.
Basic: happiness, fear, anger, surprise, disgust, sadness - feature prominently in social communication.
Self-conscious/social: shame, embarrassment, pride, guilt - regulate social behaviour.
6 Faces are special
Face perception may be the most developed visual perceptual skill in humans.
Infants prefer to look at faces from shortly after birth (Morton and Johnson, 1991).
Most people spend more time looking at faces than at any other type of object.
We seem to have the capacity to perceive the unique identity of a virtually unlimited number of different faces.
7 Understanding Emotion: from facial expressions
Facial expressions as a communicative tool
We laugh more if in a group, and show distress more if in a group.
Babies (10 months) almost only smile in the presence of a caregiver.
Babies look to the caregiver and behave according to the caregiver's response when encountering a novel object, e.g. a barking dog or a snake. This is known as social referencing and is also seen in chimpanzee societies.
A similar process, observational fear, is seen in other monkeys. Infant monkeys showed a fearful unconditioned response to their mother's expression of fear when the mother could see a snake but the infants could not. That is, the infants showed a fear response to the mother's fear response.
8 Facial expressions as communication
Figure (Erickson and Schulkin, 2003): percentage of facial responses to an unpleasant odour classified as unpleasant, neutral, or pleasant in a spontaneous condition, a posed-to-real-person condition, and a posed-to-imaginary-audience condition.
9 Facial expressions as communication
Facial expressions allow for rapid communication.
They are produced when there is an emotional stimulus and an audience present.
Our interpretation of another's emotion modulates our behaviour, and vice versa.
The ability to recognise emotion expressions appears very early:
First few days (neonates): can distinguish between expressions of happiness, sadness, and surprise.
Four to six months: show preferences for facial expressions of happiness over neutral and angry expressions.
Seven months: can distinguish among expressions of fear, anger, surprise, happiness, and sadness.
10 Recognition as an automatic process - fear and threat
Angry faces are detected much more rapidly than faces depicting non-threatening expressions.
Attention is driven by fear (Ohman et al., 2001).
11 Automatic processes = dedicated network
Fear and the amygdala
Evidence from animal, neuropsychological and imaging studies suggests that the amygdala is of primary importance in the recognition of fear.
12 Fear and the amygdala - evidence from animal studies
Bilateral amygdala removal:
reduces levels of aggression and fear in rats and monkeys
facial expressions and vocalisations become less expressive
impairs fear conditioning
13 Fear and the amygdala - evidence from human neuropsychology
Bilateral amygdala damage:
reduces recognition of fear-inducing stimuli
reduces recognition of fear in others
reduces the ability to express fear
does NOT affect the ability to recognise faces or to know what fear is
See patients SM, DR and SE (Adolphs et al. and Calder et al.).
Alzheimer's disease impairs fear conditioning.
14 Fear and the amygdala - evidence from human imaging
Results from several studies (fear face recognition and fear conditioning):
Increased amygdala activity for facial expressions of fear vs. happiness, disgust, anger, and neutral.
Non-conscious processing of fear expressions: subliminal activation of the amygdala to fear.
Neuromodulatory role of the left amygdala: less fear = less activity.
15 Amygdala Response to Facial Expressions in Children and Adults
A typical study: Thomas et al., 2001.
Blocks of fixation and fear/neutral faces; no task, just watch.
Left amygdala activation for fear vs. fixation in male children and adults.
Overall, adults showed greater amygdala activation for fear vs. neutral, whereas children did not (neutral faces may be ambiguous).
16 Methods and Materials
Subjects
Six male adults (mean 24 years, SD 6.6 years) and 12 children (mean 11 years, SD 2.4 years) recruited in the Pittsburgh area were scanned in a 1.5-T scanner during passive viewing of fearful and neutral faces. The children, six female and six male, ranged in pubertal development from Tanner stages I/I to V/IV. Male and female subjects did not differ in mean age or Tanner stage. Data from an additional three adults (three female) and four children (two female) were not included due to excessive motion artifact (>0.5 voxels; n = 5), claustrophobia (n = 1), or because the subject fell asleep during the task (n = 1). Subjects were screened for any personal or family history of psychiatric or medical illness, and for any contraindications for an MRI. Written child assent and parental consent were acquired before the study.
17 Behavioral Paradigm
The task consisted of the rapid and successive presentation of faces in blocks of neutral and emotional expressions. The face stimuli consisted of digitized fearful and neutral faces taken from the Ekman and Friesen (1976) study (Figure 1). A total of eight different actors (four male and four female) demonstrating both fearful and neutral expressions were used. The hair was stripped from the images to remove any nonfacial features, and both fear and exaggerated fear poses were used for each actor (Calder et al 1997), resulting in a total of 16 fear stimuli and eight neutral stimuli. Stimuli were presented for 200 msec with an interstimulus interval of 800 msec (flashing fixation point). Each block of trials consisted of the presentation of a flashing fixation point for 45 sec, followed by alternating 42-sec blocks of either neutral or fearful expressions, and a final 45-sec epoch of fixation (Figure 1). This procedure was repeated in three runs of trials with the presentation order counterbalanced across runs and across subjects (i.e., F-N-F-N-F or N-F-N-F-N). Following Breiter and colleagues' (Breiter et al 1996) design, no overt response was required. Instead, subjects were instructed to fixate centrally and to try to get an overall sense of the faces.
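The block timing described above can be laid out in code. A minimal sketch, assuming only the timings stated on the slide (45 s fixation, five alternating 42 s face blocks, 200 ms stimulus + 800 ms ISI); `build_run` and the condition labels are hypothetical names, not from the original study:

```python
def build_run(order):
    """Build (onset_s, label) events for one run of the block design:
    45 s fixation, alternating 42 s face blocks, 45 s final fixation.
    Each face is shown for 200 ms with an 800 ms ISI, i.e. one stimulus
    per second, giving 42 stimuli per 42 s block."""
    events = [(0.0, "fixation")]
    t = 45.0
    for label in order:
        for i in range(42):          # one stimulus onset per second
            events.append((t + i, label))
        t += 42.0
    events.append((t, "fixation"))   # final fixation epoch
    return events, t + 45.0          # events and total run length in seconds

# F-N-F-N-F ordering, as on the slide
events, run_length = build_run(["fear", "neutral", "fear", "neutral", "fear"])
```

A useful sanity check: the run length comes out at 300 s, consistent with the stated 100 EPI images per slice at TR = 3000 ms.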
18 Image Acquisition, Processing, and Analysis
Scans were acquired on a 1.5-T GE Signa scanner (General Electric Systems, Milwaukee) modified for echo planar imaging (Advanced NMR, Wilmington, MA) using a quadrature head coil. A T1-weighted sagittal localizer image was used to prescribe the functional slice locations. T1-weighted structural images were acquired in 4-mm contiguous coronal slices through the whole brain (echo time [TE] = minimum, repetition time [TR] = 500, matrix 256 × 256, field of view [FOV] = 20) for purposes of localizing the functional activity and aligning images in Talairach space (Talairach and Tournoux 1988). Functional images (T2*) were acquired at 12 of these slice locations spanning the entire amygdala (A20 to P24 in Talairach coordinates) using an EPI BOLD sequence (TE = 40, TR = 3000, flip angle 90°, matrix 128 × 64, FOV = 20, 4-mm slices with skip 0, voxel size 4.0 mm). There were three runs of 100 images, totaling 300 images per slice. Images were motion corrected and normalized. All 18 subjects had less than 0.5 voxels of in-plane motion. All images were registered to a representative reference brain using Automated Image Registration software (Woods et al 1992), and voxelwise analyses of variance (ANOVAs) were conducted on these pooled data using normalized signal intensity as the dependent variable (Braver et al 1997; Casey et al 2000). Separate analyses were conducted comparing male adults and male children and comparing male and female children to examine interactions of stimulus type (fearful faces, neutral faces, fixation) with age or gender, respectively. Significant activations were defined by at least three contiguous voxels and alpha = .05 (Forman et al 1995). Amygdala activation was defined on the reference brain using Talairach coordinates and consensus among three raters (BJC, KMT, PJW). Significant regions that extended outside of the brain or had large SDs were excluded.
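The "at least three contiguous voxels at alpha = .05" criterion is a simple cluster-extent threshold. A minimal sketch of that rule, assuming face-connectivity between voxels; `cluster_threshold` is a hypothetical helper for illustration, not the authors' analysis code:

```python
from collections import deque

def cluster_threshold(sig_voxels, min_voxels=3):
    """Given a set of suprathreshold voxel coordinates (x, y, z), keep only
    voxels belonging to face-connected clusters of at least min_voxels,
    mirroring the 'three contiguous voxels at alpha = .05' criterion."""
    remaining = set(sig_voxels)
    kept = set()
    while remaining:
        # flood-fill one connected component starting from an arbitrary seed
        seed = remaining.pop()
        component, queue = {seed}, deque([seed])
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (x + dx, y + dy, z + dz)
                if n in remaining:
                    remaining.remove(n)
                    component.add(n)
                    queue.append(n)
        if len(component) >= min_voxels:   # extent threshold
            kept |= component
    return kept
```

Isolated one- or two-voxel "activations" are discarded as likely noise; only extended clusters survive.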
19 Results
Adults and Children
A 2 × 2 (Group × Condition) ANOVA comparing male adults (n = 6) and male children (n = 6) revealed significant activity in the left amygdala and substantia innominata for fearful faces relative to fixation (Figure 2) and a decrease in signal with repeated presentations of the fearful faces (Table 1). Neutral faces showed a similar pattern of activation relative to fixation trials (F = 23.71, p < .001). A significant interaction was observed in the left amygdala between stimulus type and age for the comparison of fearful and neutral expressions (Table 1) (Group × Condition, Fear vs. Neutral). Post hoc t tests indicate that adults demonstrated significantly greater activity for fearful faces relative to neutral faces (p < .001). However, the children demonstrated greater amygdala activity for neutral faces than for fearful expressions (p < .0001) (Figure 3). Neither age nor Tanner stage predicted the magnitude of the percent change in signal in this sample.
21 Warning about other brain regions
A variety of brain regions are involved in the processing of facial expressions of emotion.
They are active at different times, and some structures are active at more than one time.
The amygdala is particularly implicated in the processing of fear stimuli, receiving early (<120 ms) subcortical input as well as late (~170 ms) cortical input from the temporal lobes.
22 Amygdala response to fear - special for faces?
The Amygdala Response to Emotional Stimuli: A Comparison of Faces and Scenes (Hariri et al., 2002).
Blocked design, matching task; getting rid of unwanted activations.
Preferential right amygdala response to faces (faces > IAPS scenes).
23 Emotions - not just an ugly face
Hadjikhani and de Gelder, 2003; Adolphs and Tranel, 2003.
Controls were better when faces were present; bilateral amygdala patients were worse when faces were present, and often better at negative stimuli without faces.
27 The contribution of the eyes to facial expressions of fear
What emotion do these eyes depict?
28 Emotions - the importance of the eyes
Whalen et al., 2004.
The amygdala is responsive to large eye whites in fear (and surprise) expressions: amygdala activation above fixation baseline (% signal change from fixation) for non-inverted (eye-white) fearful eyes.
But the amygdala is not just an eye detector: perception of emotion relies on more than just the eyes, and the amygdala responds most to whole faces.
The amygdala may direct our attention to biologically (and, through evolution, socially) relevant stimuli that require further analysis - a relevance detector rather than a fear detector per se.
29 Emotions - the importance of the eyes
The amygdala, fear and the eyes: SM bubble analysis (Adolphs et al., 2005).
30 Emotions - the importance of the eyes
SM's eye fixation (or lack of it).
31 Emotions - the importance of the eyes
When told to look at the eyes specifically, SM improves, but only while given this instruction.
32 Emotions - amygdalae are not simply eye detectors
The amygdalae are not just eye detectors - they may direct attention to relevant stimuli, acting as a biological relevance detector.
Some emotions are easy to tell from the eyes; others from the mouth.
We need more than just the eyes to determine emotional and social relevance.
Hybrid faces from Vuilleumier, 2005.
33 Yuck! Disgust and the insula (and basal ganglia)
Animal studies:
insula = gustatory cortex
impaired taste aversion in rats
Human neuropsychology:
patient NK
Huntington's disease
Tourette's and OCD
electrical stimulation = nausea
repeated exposure leads to habituation
Human imaging:
Phillips et al.
Wicker et al.
34 Disgust - evidence from imaging
Both of Us Disgusted in My Insula: The Common Neural Basis of Seeing and Feeling Disgust (Wicker et al., 2003).
1. Observed actors smelling and reacting to bad, nice and neutral odours (separate visual and olfactory runs).
2. Smelt bad and nice odours (plus rest).
Overlay analysis.
35 Disgust vs. Fear summary
Fear - amygdala:
activated by fear-inducing stimuli
habituates to fear
removal or damage disproportionately impairs fear recognition and feelings of fear
Disgust - insula and basal ganglia:
activated by facial expressions of disgust
insula habituates to disgust
removal, damage or degeneration of either structure disproportionately impairs disgust recognition and feelings of disgust
A double dissociation: conclude that the neural mechanisms for fear and disgust are anatomically and functionally distinct.
36 Other basic emotions - implicated brain regions
Green = neutral, red = anger, purple = fear, yellow = happy, blue = sad.
Duvernoy 1991, but see Kessler/West et al., 2001.
37 Other basic emotions - implicated brain regions
Tissue loss associated with specific emotion recognition impairment (Rosen et al., 2006).
50 patients with neurodegenerative dementia: impaired recognition of negative emotions (red, rlITG/rMTG), and in particular sadness (green, rSTG), correlated with tissue loss in right lateral inferior temporal and right middle temporal regions.
This reflects these areas' role in the visual processing of negative emotions.
38 How does knowledge about brain activation help social psychologists?
39 Recent Research (June 2006)
Is emotion processing affected by advancing age? An event-related brain potential study.
Rationale
Age is related to decreasing cognitive function, especially frontal functions.
Emotional intensity recognition is a frontal function.
Are old folk impaired at emotion intensity recognition?
Methods
Investigated using ERP (EEG) and analysed by ANOVA.
40 Results
Delay in early discrimination processing in the old group, but no difference in emotion discrimination between young and old.
41 Recent Research (January 2007)
Perceiving fear in dynamic body expressions.
Rationale
Emotional body language is important when the face cannot be seen.
It is important for survival, so it should be fast, automatic, and employ a dedicated brain network.
We know which parts of the brain are active for static emotion.
We know that other parts of the brain are active for body motion.
How do these interact for emotive whole-body dynamic stimuli?
42 Methods
Used event-related fMRI and video clips.
Fear and neutral body movements, with scrambled static and dynamic video images as controls.
Task: press a button when an inverted image is seen (therefore incidental emotion processing is being measured).
43 Analyses
1. Main effect of bodies vs. scrambled stimuli: (Fs+Fd+Ns+Nd) − 2(Sd+Ss).
2. Main effect of fear vs. neutral bodies: (Fs+Fd) − (Ns+Nd).
3. Main effect of dynamic vs. static bodies: (Fd+Nd) − (Fs+Ns).
Results
The amygdala is active for social stimuli.
It is not bothered whether the stimulus is static or dynamic.
It is more bothered when the stimulus is fearful.
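The three contrasts on the slide can be written as weight vectors over the six conditions (fear/neutral/scrambled × static/dynamic). A minimal sketch; the condition ordering, names, and example beta values are assumptions for illustration, not from the study:

```python
# Condition order: fear-static, fear-dynamic, neutral-static, neutral-dynamic,
# scrambled-static, scrambled-dynamic
conditions = ["Fs", "Fd", "Ns", "Nd", "Ss", "Sd"]

contrasts = {
    "bodies_vs_scrambled": [1, 1, 1, 1, -2, -2],   # (Fs+Fd+Ns+Nd) - 2(Ss+Sd)
    "fear_vs_neutral":     [1, 1, -1, -1, 0, 0],   # (Fs+Fd) - (Ns+Nd)
    "dynamic_vs_static":   [-1, 1, -1, 1, 0, 0],   # (Fd+Nd) - (Fs+Ns)
}

def contrast_value(betas, weights):
    """Weighted sum of per-condition parameter estimates (betas)."""
    return sum(b * w for b, w in zip(betas, weights))

# every contrast is balanced: its weights sum to zero, so a uniform
# response across all conditions yields a contrast value of zero
assert all(sum(w) == 0 for w in contrasts.values())
```

Note the −2 weighting in the first contrast: four body conditions are set against two scrambled conditions, so the scrambled weights are doubled to keep the contrast balanced.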
44 Results
Other brain regions are bothered that the stimulus is dynamic (and fearful).
These regions will be covered in later lectures.
45 Recent Research (June 2007)
Attention to the person or the emotion: underlying activations in MEG.
Rationale
Facial emotion processing is fast (~100 ms) and automatic, and occurs regardless of whether you attend to the face or not.
Facial identity processing is also fast (but slower) and occurs in parallel according to most models.
But there is some evidence from schizophrenia suggesting that the parallel (and therefore separate) brain regions interact.
What happens to this interaction when you attend to either emotion or identity?
46 Methods and Results
Used MEG and happy/fear/neutral faces.
Identity task: press a button when two identities are the same.
Emotion task: press a button when two emotions are the same.
90 ms orbito-frontal response to emotion regardless of attention.
170 ms right insula response when attending to emotion.
Also a 220 ms activation increase for areas associated with identity processing.
Conclusions
So there you go.
47 Recent Research (October 2007)
Impaired facial emotion recognition and reduced amygdalar volume in schizophrenia.
Rationale
Amygdala volume is known to be reduced in schizophrenia.
Emotion recognition is known to be impaired in schizophrenia.
A direct link between the two had not been studied (properly) before.
Methods
Used 20 schizophrenia patients and 20 controls, 3-T MRI, and a facial emotion intensity recognition task.
48 Results
(1) The schizophrenia patients had smaller amygdalar volumes than the healthy controls.
(2) The patients showed impairment in recognizing facial emotions, specifically anger, surprise, disgust, and sadness.
(3) The left amygdala volume reduction in these patients was associated with impaired recognition of sadness in facial expressions.
49 Summary
Distinct neural pathways underlie the processing of signals of fear (amygdala) and disgust (insula/basal ganglia) in humans.
This dissociation can be related to the adaptive significance of these emotions as responses to critical forms of threat that are associated with external (fear) and internal (disgust) defence systems.
According to LeDoux, social neuroscience has been able to make progress in the field of emotion by:
focusing on a psychologically well-defined aspect of emotion
using an experimental approach to emotion that simplifies the problem in such a way as to make it tractable
circumventing vague and poorly defined aspects of emotion
removing subjective experience as a roadblock to experimentation
50 Next week
Emotion recognition from auditory cues and theories of emotion.