Attentive Tracking of Sound Sources


Attentive Tracking of Sound Sources
Kevin J.P. Woods, Josh H. McDermott
Current Biology, Volume 25, Issue 17, Pages 2238-2246 (August 2015). DOI: 10.1016/j.cub.2015.07.043. Copyright © 2015 Elsevier Ltd.

Figure 1. Features in Natural Speech Vary over Time
(A) Spectrogram of concurrent utterances by two female speakers. (B) Example spectral structure of a single speaker. Top: power spectrum of a 100 ms segment of voiced speech excerpted from one of the utterances in (A). Resonances in the vocal tract produce formants—broad spectral peaks that determine vowel quality. Bottom: spectrogram of one of the utterances from (A). Dashed lines depict the segment from which the power spectrum in the top panel was measured. (C) Pitch and formant contours from the two utterances from (A), measured with PRAAT. The yellow line plots the trajectory for the utterance in (B). Open and closed circles denote the beginning and end of the trajectories, respectively. (D–F) Marginal distributions of F0, F1, and F2 for all TIMIT utterances for these particular speakers. Red bars mark μ ± 2σ of the means of such distributions for all 53 female speakers in TIMIT. Differences between the average features of speakers are small relative to the variability produced by a single speaker.
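The red bars in (D)–(F) summarize how much speaker means vary relative to within-speaker variability. Below is a minimal sketch of that summary statistic, assuming per-frame F0 (or F1/F2) tracks have already been extracted for each speaker; the data layout and numbers are hypothetical stand-ins, not the paper's analysis code.

```python
import numpy as np

def speaker_mean_range(tracks_by_speaker):
    """Mean and SD of per-speaker feature means (e.g., F0 in Hz).

    tracks_by_speaker: dict mapping speaker ID to a 1-D array of per-frame
    feature values pooled over that speaker's utterances (voiced frames only).
    Returns (mu, sigma); the red bars in Figure 1D-1F span mu +/- 2*sigma.
    """
    means = np.array([np.mean(track) for track in tracks_by_speaker.values()])
    return means.mean(), means.std()

# Synthetic stand-in for the 53 female TIMIT speakers.
rng = np.random.default_rng(0)
fake_tracks = {f"spk{i}": rng.normal(rng.normal(210, 12), 25, size=500)
               for i in range(53)}
mu, sigma = speaker_mean_range(fake_tracks)
print(f"speaker F0 means: {mu:.0f} Hz, +/- 2 SD = {2 * sigma:.0f} Hz")
```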

Figure 2. Streaming Stimuli and Task
(A) Representative stimulus trajectories from experiment 1 (stream-segregation task). Stimulus trajectories in all experiments crossed at least once in each feature dimension, such that the cued voice could not be selected on the basis of its average pitch or formant values. Here and elsewhere, open and closed circles denote the beginning and end of the trajectories, respectively. (B) Listeners first heard a cue taken from the beginning portion of one voice, then a mixture of two voices, and finally a probe that could be taken from the end portion of either voice. Listeners had to decide whether the probe came from the cued voice. The graph depicts the stimulus variation along a single dimension for ease of visualization. (C) Results of experiment 1 (stream-segregation task). Each marker plots the performance of an individual subject. See also Figure S1 for block-by-block performance.
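The crossing constraint described in (A) can be checked directly on a pair of candidate trajectories. The sketch below illustrates one such check; the trajectory representation, sampling rate, and toy sinusoidal contours are assumptions for illustration, not the authors' stimulus-generation code.

```python
import numpy as np

def crosses_in_every_dimension(traj_a, traj_b):
    """True if the trajectories cross at least once in each feature dimension,
    so neither voice can be picked out by its average value on any feature.

    traj_a, traj_b: arrays of shape (n_frames, n_features), e.g. columns for
    F0, F1, and F2 over time.
    """
    signs = np.sign(traj_a - traj_b)
    # A crossing in a dimension means the sign of the difference flips.
    return bool(np.all(np.any(signs[:-1] != signs[1:], axis=0)))

# Toy 2 s trajectories sampled at 100 Hz in (F0, F1, F2).
t = np.linspace(0, 2, 200)
voice1 = np.column_stack([200 + 30 * np.sin(2 * np.pi * 0.7 * t),
                          500 + 80 * np.cos(2 * np.pi * 0.5 * t),
                          1500 + 150 * np.sin(2 * np.pi * 0.3 * t)])
voice2 = np.column_stack([200 + 30 * np.cos(2 * np.pi * 0.7 * t),
                          500 + 80 * np.sin(2 * np.pi * 0.5 * t),
                          1500 + 150 * np.cos(2 * np.pi * 0.3 * t)])
print(crosses_in_every_dimension(voice1, voice2))  # True for this example
```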

Figure 3. Experiment 2: Vibrato Detection as a Measure of Attention during Streaming
(A) Example stimulus trajectories. Either voice could contain vibrato (a brief pitch modulation, added in this example to the green trajectory). Listeners performed the stream-segregation task from experiment 1 but were additionally asked to detect vibrato in either stream. The trajectory shown is 2 s in duration (from experiment 2A); trajectories in experiment 2B were 3 s. (B) Stream-segregation performance for the 12 participants in experiment 2A. (C) Sensitivity to vibrato in the cued and uncued voices for subjects grouped by streaming performance (into two equal-sized groups; left) and pooled across groups (right). Includes only trials in which the stream-segregation task was performed correctly. Error bars here and elsewhere denote within-subject SEMs and thus do not reflect the variability in overall vibrato detection across subjects. (D) Stream-segregation performance for the six best streamers in experiment 2B (3 s mixtures, 250 ms cue and probe, different group of listeners). (E) Sensitivity to vibrato versus temporal position of vibrato onset (equal-sized bins of uniformly distributed onset times) in the cued and uncued voices for the six best streamers in experiment 2B. Only trials in which the stream-segregation task was performed correctly are included. The gray bar below depicts the time course of the mixture; regions matching the cue and probe are in dark gray.
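One standard way to quantify the "sensitivity to vibrato" plotted in (C) and (E) is a signal-detection d′ computed from hits and false alarms. The sketch below assumes that analysis; the paper's exact choices (e.g., the correction applied to extreme rates) are not stated in the legend, so the correction and trial counts here are placeholders.

```python
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') from trial counts.

    Adds 0.5 to each cell (log-linear correction) to avoid infinite z-scores
    when a hit or false-alarm rate is exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts: vibrato detection in the cued vs. uncued voice.
print(dprime(hits=40, misses=10, false_alarms=8, correct_rejections=42))  # cued
print(dprime(hits=22, misses=28, false_alarms=8, correct_rejections=42))  # uncued
```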

Figure 4. Experiment 3: Speech-like Discontinuities
(A) Histograms of the durations of discontinuities (red) and voiced segments (blue) in the stimuli. (B) Example stimulus trajectories from experiment 3, containing speech-like discontinuities. (C) Stream-segregation performance for discontinuous and continuous sources.
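The discontinuous stimuli alternate voiced segments with silent gaps whose durations follow speech-like distributions (panel A). The sketch below shows one way such an on/off gating mask could be generated; the exponential distributions and mean durations are placeholders for illustration, not the distributions shown in Figure 4A.

```python
import numpy as np

def voicing_mask(duration_s, rate_hz=100, rng=None):
    """Boolean per-frame mask alternating voiced segments and silent gaps,
    with durations drawn from placeholder distributions (illustrative only)."""
    rng = rng or np.random.default_rng()
    n = int(duration_s * rate_hz)
    mask = np.zeros(n, dtype=bool)
    i, voiced = 0, True
    while i < n:
        # Placeholder means: ~200 ms voiced segments, ~60 ms gaps.
        dur = rng.exponential(0.20 if voiced else 0.06)
        j = min(n, i + max(1, int(dur * rate_hz)))
        mask[i:j] = voiced
        i, voiced = j, not voiced
    return mask

mask = voicing_mask(2.0, rng=np.random.default_rng(1))
print(f"{mask.mean():.0%} of frames voiced")
```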

Figure 5. Experiment 4: Source Proximity
(A) Example stimulus trajectories; dashed line indicates the sources’ closest pass in feature space. (B) Stream-segregation performance as a function of this minimum distance between sources.
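The independent variable here is the sources' closest approach in feature space. A minimal sketch of that computation follows; the legend does not specify the scale used (Hz, semitones, or otherwise), so the units below are simply those of the input trajectories, and the toy contours exist only to exercise the function.

```python
import numpy as np

def min_feature_distance(traj_a, traj_b):
    """Smallest frame-by-frame Euclidean distance between two trajectories
    of shape (n_frames, n_features); units follow the inputs."""
    return float(np.min(np.linalg.norm(traj_a - traj_b, axis=1)))

# Toy 1-D (F0-only) example just to exercise the function.
t = np.linspace(0, 2, 200)
a = np.column_stack([240 + 20 * np.sin(2 * np.pi * t)])
b = np.column_stack([180 + 20 * np.sin(2 * np.pi * t + np.pi)])
print(min_feature_distance(a, b))  # closest pass, about 20 Hz here
```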

Figure 6. Experiment 5: Sources Varying in Just One Feature
(A) Example feature trajectories in the two conditions of experiment 5, in which sources could vary over time in either three dimensions (F0, F1, and F2) or one (F0). (B) Stream-segregation performance for sources changing in F0, F1, and F2 and sources changing only in F0.