

The Time Course of Processing Emotional Prosody: Behavioral and Electrophysiological Investigations
Lauren Cornew,1 Leslie J. Carver,1 and Tracy Love1,2
1 University of California, San Diego; 2 San Diego State University

Acknowledgements
This research was supported in part by an NSF Graduate Research Fellowship to the first author and by NIH grants (DC00494 and DC03885) to the last author. Special thanks to Sarah Callahan, Hong Duong, Maxwell Moholy, Negin Khalifian, Hoa Nguyen, Uyen Pham, Sara Rust, Danny Sanchez, and Matt Walenski!

Background
Emotion interacts with cognition at many levels of processing, from basic perceptual1 and attentional2 stages to higher cognitive functions such as decision-making3 and categorization.4

Questions
Is there enhanced processing of negative prosody? Or is there an advantage for emotional (positive or negative) compared to non-emotional prosody? Two experiments, one behavioral (Exp. 1a and 1b) and one electrophysiological (Exp. 2), were designed to explore these questions.

Experiment 1a Methods
Participants: 43 UCSD undergraduates (monolingual native English speakers, 27 female, mean age = 21)
Stimuli: 48 Jabberwocky sentences (mean length = 2.7 s), recorded by a female native speaker of English at a regular rate of speech (approximately 5 syllables/sec) with happy, angry, and neutral prosody
Gating paradigm:8 sentences were edited into successive clips, with duration increasing in 250 ms increments and 5 s of silence between clips (Figure 1)
Task: every clip was judged to be happy, angry, or neutral
Variables of interest: percent correct, and isolation point (the length of the clip at which participants chose the correct emotion and did not subsequently change their decision)
Design: mixed factorial; all participants received all emotions, but no one participant heard any sentence more than once. Participants were randomly assigned to 1 of 3 groups.

Experiment 2 Methods
Participants: 13 UCSD undergraduates (right-handed monolingual native English speakers, mean age = 21); note that this experiment is part of a larger study that is still in progress
Stimuli: the Jabberwocky sentences from Experiment 1, presented in their entirety (i.e., not spliced), alternating with tone sequences of ascending, descending, and constant pitch
Target detection task: 6 blocks of trials; each emotion (happy, angry, neutral) served as the target in 3 blocks, with order counterbalanced. Participants were instructed to press one button (right/left counterbalanced across subjects) following a Jabberwocky sentence conveying the target emotion, and the other button following any other type of audio clip.

Experiment 1 Discussion
Participants in Experiments 1a and 1b demonstrated a processing advantage for neutral prosody, which was identified more accurately and more rapidly than happy or angry prosody. This "neutral bias" was not simply due to a greater ease in discriminating emotional from non-emotional prosody. Does the neutral bias observed in Experiments 1a and 1b reflect perception, attention, decision/response processes, or a language-processing or acoustic parameter?
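For concreteness, the gating manipulation of Experiment 1a (successive clips of a sentence growing in 250 ms increments) can be sketched in a few lines. This is a minimal sketch under assumed conventions; the function name `make_gates` and the mono NumPy-array layout are ours, not the authors' stimulus-preparation code.

```python
import numpy as np

def make_gates(samples: np.ndarray, sr: int, step_ms: int = 250) -> list:
    """Return successively longer 'gates' of a mono waveform: the first
    gate is the initial step_ms of the sentence, the second the first
    2 * step_ms, and so on until the final gate contains the whole
    sentence (whose last increment may be shorter than step_ms)."""
    step = int(sr * step_ms / 1000)
    gates = []
    for end in range(step, len(samples) + step, step):
        gates.append(samples[:min(end, len(samples))])
    return gates

# e.g., a 1.1 s recording at sr = 1000 Hz yields gates of
# 250, 500, 750, 1000, and 1100 samples
```

In the experiment itself, each gate was followed by 5 s of silence during which the listener gave a judgment.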
Background (cont'd)
Findings from studies of auditory emotion recognition are mixed and include reports of enhanced processing of negative,5 positive,6 and neutral7 prosody.

Exp. 1a Results
Greater accuracy for neutral prosody, F(2, 34) = 7.87, p = .001, and faster correct identification of neutral prosody, F(2, 34) = 24.67, p < .001 (Figures 2 and 3).
Figure 1. Schematic of a spliced sentence: "The hessups ate pea chup after the sholt." presented as clips of 250 ms, 500 ms, 750 ms, ..., up to the entire sentence.

Experiment 1b Methods
Participants: 24 UCSD undergraduates (monolingual native English speakers, mean age = 22)
Stimuli: same as in Experiment 1a
Design: similar to Experiment 1a; participants were randomly assigned to 1 of 4 groups

Exp. 1b Results
Greater accuracy for neutral prosody than happy prosody, t(11) = 3.52, p = .003, or angry prosody, t(10) = 4.48, p = .001. Faster correct identification of neutral prosody than happy prosody, t(11) = 2.23, p = .024, or angry prosody, t(10) = 3.06.

Exp. 2 Preliminary Results
−400 – 0 ms window: significant effect of emotion, maximal at Pz, P3/4, Oz, and O1/2. Amplitudes in the neutral condition were significantly more negative than in the happy or angry conditions (all ps ≤ .001), which did not differ from one another.
600 – 2000 ms window: significant effect of emotion, maximal at FT7/8. Sustained positivity in the neutral condition was significantly greater than in the happy or angry conditions (all ps ≤ .001), which did not differ from one another (Figures 5 and 6).

General Discussion
Behavioral and ERP data support a bias for processing neutral prosody over happy and angry prosody. The ERP data indicate a negative deflection in the 400 ms preceding the isolation point, which likely reflects emotion recognition processes; neutral prosody also elicited a greater-amplitude positivity between 600 and 2000 ms than emotional prosody. Future directions include comparing task-relevant vs. task-irrelevant processing of emotional prosody.

Experiment 2 Methods (cont'd)
EEG recording and ERP reduction: Quik-Cap (CompuMedics, Inc.) with 32 sintered Ag/AgCl electrodes arranged according to modified International 10–20 System placement (Figure 5). EEG was amplified 100× with a SynAmps amplifier, and electrodes were referenced to linked mastoids. Incorrect trials were excluded, as were time windows containing artifacts; participants whose data yielded >70% trial inclusion were retained for analysis (n = 11). ERPs to the Jabberwocky emotional prosody stimuli were time-locked to the average isolation point for each emotion, as determined in Experiment 1a, in order to examine the recognition process (see Figure 3).
Data analysis: repeated-measures ANOVAs, with the Greenhouse-Geisser adjustment applied to correct for sphericity violations. Time windows of interest: −400 – 0 ms and 600 – 2000 ms.

References
1. Phelps et al. (2006). Psych. Science, 17.
2. Carretié et al. (2003). Psychophys., 40.
3. Bechara, Damasio, & Damasio (2003). Ann. NY Academy Sciences, 985.
4. Ito et al. (1998). J. Personality and Social Psych., 75.
5. Wambacq, Shea-Miller, & Abubakr (2004). NeuroReport, 15.
6. Alter et al. (2003). Speech Communication, 40.
7. Schirmer & Kotz (2003). JCN, 15.
8. Grosjean, F. (1980). Perception & Psychophys., 28.
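The isolation point, the measure to which the ERPs were time-locked, can be computed from a listener's gate-by-gate responses roughly as follows (a sketch; the function name `isolation_point` and the list-of-strings data layout are our assumptions, not the authors' analysis code):

```python
def isolation_point(responses, target, step_ms=250):
    """Given one listener's emotion judgments for the successive gates
    of a sentence (e.g., ['angry', 'neutral', 'neutral', ...]) and the
    sentence's true emotion, return the duration in ms of the first
    gate from which every subsequent judgment matches the target.
    Returns None if the emotion was never stably identified."""
    for i in range(len(responses)):
        if all(r == target for r in responses[i:]):
            return (i + 1) * step_ms
    return None
```

Averaging these values per emotion would yield the per-emotion time-locking points used for the Experiment 2 ERPs.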