The barn owl (Tyto alba)


The barn owl (Tyto alba): amazing performance on sound localization tasks

The 3D space of sound localization: azimuth (horizontal plane), elevation (vertical plane), and distance.

Cues about azimuth: Interaural Time Difference (ITD). Binaural cues are cues for which both ears are needed to process the information. Example from the figure: an eagle calling from 10 degrees azimuth, ears about 20 cm apart. The path to the far ear (x) is 0.02 m longer than the path to the near ear (y); at a speed of sound of 331 m/s, that extra 0.02 m takes about 60 microseconds, i.e., ITD ≈ 60 microseconds.
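To make the slide's arithmetic concrete, here is a minimal Python sketch. The 0.02 m path difference and 331 m/s speed of sound come from the slide; the Woodworth spherical-head formula and the 0.10 m head radius are standard textbook additions, not from the slide, and the function names are ours.

```python
import math

SPEED_OF_SOUND = 331.0  # m/s, the figure used on the slide

def itd_from_path_difference(path_diff_m: float) -> float:
    """ITD in microseconds given the extra path length to the far ear."""
    return path_diff_m / SPEED_OF_SOUND * 1e6

def itd_woodworth(azimuth_deg: float, head_radius_m: float = 0.10) -> float:
    """Woodworth spherical-head approximation: ITD = (r/c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return head_radius_m / SPEED_OF_SOUND * (theta + math.sin(theta)) * 1e6

print(itd_from_path_difference(0.02))  # ~60.4 microseconds, the slide's example
print(itd_woodworth(90.0))             # ~777 microseconds: this model's maximum, at 90 degrees
```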

Physiology of interaural time difference. The superior olivary complex (medial superior olive) is the first place where input from the two ears converges. ITD detectors form connections from the two ears' inputs during the first few months of life: newborns can localize (poorly), and 2-month-olds do progressively better. This early competence is indicative of lower-brain (non-cortical) structures.
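The "ITD detectors" described here are classically modeled as an array of delay lines and coincidence detectors (the Jeffress model, which the slide does not name). A toy sketch, assuming clean digitized ear signals and using cross-correlation to stand in for the detector array; real MSO processing follows cochlear filtering and spiking:

```python
import numpy as np

def estimate_itd(left: np.ndarray, right: np.ndarray, fs: float) -> float:
    """Estimated ITD in seconds; positive means the sound reached the left ear first."""
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)  # best-matching internal delay, in samples
    return lag / fs

fs = 100_000.0                              # high rate so 60 us is a whole number of samples
t = np.arange(0, 0.01, 1 / fs)
burst = np.sin(2 * np.pi * 500 * t)         # 500 Hz tone burst
d = round(60e-6 * fs)                       # 6 samples = the slide's 60 us ITD
left = np.pad(burst, (0, d))
right = np.pad(burst, (d, 0))               # right ear receives the burst later
print(estimate_itd(left, right, fs) * 1e6)  # prints 60.0 (microseconds)
```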

Processing information about azimuth: Interaural Level Difference (ILD). This is another binaural cue: the calculation compares the pressure changes arriving at the two ears (i.e., the differential activity at each ear).

The interaural level difference (ILD) is also shaped by the size of the acoustic "shadow" cast by the head. The shadow affects high-frequency waves far more (it knocks them out) than long-wavelength, low-frequency waves.

How does the size of your head influence interaural level differences? Low-frequency waves are longer and wider ("less frequent"), whereas high-frequency waves occupy a smaller space, and it takes only a small "obstacle" to block small waves.
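A quick numerical check of this idea, assuming the ~20 cm head width from the ITD example and the slide's 331 m/s speed of sound (the "shadowed" rule of thumb is ours, not an exact cutoff):

```python
# Wavelength vs. head size: the head shadows a sound effectively only when
# the wavelength is comparable to, or smaller than, the head itself.
SPEED_OF_SOUND = 331.0  # m/s
HEAD_WIDTH = 0.20       # m, the ~20 cm figure from the ITD example

for freq_hz in (200, 1_000, 2_000, 8_000):
    wavelength_m = SPEED_OF_SOUND / freq_hz
    shadowed = wavelength_m < HEAD_WIDTH   # crude rule of thumb
    print(f"{freq_hz:>5} Hz: wavelength {wavelength_m:.3f} m -> shadowed: {shadowed}")
```

On this rough criterion the shadow, and thus the ILD cue, only becomes useful above roughly 1.5-2 kHz, which is why ILD complements the low-frequency ITD cue.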

The 3D space of sound localization, revisited: azimuth (horizontal plane), elevation (vertical plane), and distance.

Information about the vertical plane (elevation): spectral cues. Sound reflects off the head and off different parts of the pinna, and these reflections differ as a function of whether the sound comes from higher or lower locations.

These differing "reflections" cause variations in amplitude (loudness) at different frequencies (changing cues in the spectrum) that reveal the elevation of sound sources: the directional transfer function (DTF).

Information about distance. Short distances (i.e., within arm's length) behave differently from longer distances: at short range, localization is dramatically influenced by interaural level differences (ILD).

Information for distance as we get farther away (a level-drop sketch follows this list):
- Sound level: changes in distance change amplitude (loudness/SPL/dB); mostly useful for familiar stimuli
- Frequency changes: high frequencies are lost to the atmosphere over longer distances
- Movement parallax: exactly as in vision, nearer objects seem to move faster than farther objects in "sound space"
- Reflection: a source of multiple sound inputs; the greater the distance, the greater the opportunity for reflected information
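The first cue, sound level, can be made concrete with the inverse-square law. The slide itself does not commit to a specific law, so free-field spreading is an assumption here:

```python
# Free-field (inverse-square) level change with distance: sound level drops
# about 6 dB for every doubling of distance. Real rooms add reflections on top.
import math

def level_change_db(near_m: float, far_m: float) -> float:
    """Change in dB SPL when the same source moves from near_m to far_m away."""
    return 20 * math.log10(near_m / far_m)

print(level_change_db(1.0, 2.0))   # -6.02 dB: twice as far
print(level_change_db(1.0, 10.0))  # -20.0 dB: ten times as far
```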

Sound localization in a complex environment: the influence of reflected sound (echoes)
- Echoes (reflected sound, or "reverberation") occur to some extent in all natural situations
- Echoes depend on the characteristics of the room or other space
- Echoes contribute to sound quality
- Echoes provide cues about the space in which they occur
- Echoes could potentially complicate sound localization

Precedence effect: direct vs. indirect sound. Simultaneous sounds located symmetrically to either side are perceived as a single "centered" source (called fusion). Sounds arriving less than 1 ms apart (left vs. right) don't quite sound centered; the source localizes more toward the first sound, so we do perceive this very short interaural time difference.

Precedence effect: direct vs. indirect sound. Two sounds (left vs. right ear) arriving 1 to 5 ms apart: the location is perceived as coming directly from the first sound; this is the precedence effect. Two sounds arriving more than 5 ms apart: the listener hears two different sounds; this boundary is called the echo threshold.

Precedence effect. The auditory system "deadens" sounds arriving less than 5 ms apart, which prevents us from hearing echoes in most day-to-day settings. Deadening echoes is what lets us localize sound: with too many audible echoes, you wouldn't be able to localize at all.
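The three delay regimes from the last few slides, collected into a toy classifier. The 1 ms and 5 ms boundaries are the slides' round numbers; real fusion and echo thresholds vary with the type of signal (clicks vs. speech vs. music):

```python
# Delay regimes for two copies of a sound reaching the listener, per the slides.
def precedence_percept(delay_ms: float) -> str:
    if delay_ms < 1.0:
        return "fusion: one sound, pulled toward the leading side"
    if delay_ms <= 5.0:
        return "precedence: one sound, localized at the leading source"
    return "echo: two separate sounds are heard"

for delay in (0.3, 2.0, 12.0):
    print(f"{delay:>4} ms -> {precedence_percept(delay)}")
```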

Physiology of sound localization: one synapse from the cochlea to the cochlear nucleus (CN), and one synapse from the CN to the superior olivary complex (lateral superior olive). Location information is processed very fast!

Physiological basis for localization. Jenkins & Merzenich (1984): lesions of very specific frequency channels in the auditory cortex of cats produced an inability to localize sound; stroke patients with damage to frequency channels in the auditory cortex show similar deficits. Why does frequency matter for sound location?

Sound localization is influenced by multiple factors: location cues for each of the three dimensions (horizontal, vertical, distance), and interaural level differences (ILD), which are particularly sensitive to frequency.

Physiological basis for localization in the monkey auditory cortex:
- Cells respond differentially to specific interaural time delays: interaural time difference "detectors" are cells that respond "best" to specific time delays between the two ears
- Cells have been identified in the right (not left) hemisphere that respond "best" when either the source or the perceiver is moving
- "Panoramic" neurons fire to stimuli in all locations surrounding the perceiver, but their firing rate varies (increases or decreases) as a function of location in space

Neurons in the inferior colliculus are tuned to multiple sound parameters (not all neurons are tuned to every parameter):
- Frequency
- Intensity
- Duration
- Direction and rate of change of frequency modulation (FM)
- Rate of change of amplitude modulation (AM)
- The interval between two sounds
- Other, more complex sound patterns

Integration in the inferior colliculus. The inferior colliculus receives convergent information from multiple lower brainstem pathways. This convergence serves a number of functions:
- Integration of information about binaural intensity and time differences
- Integration of information contained in different spectral (frequency) ranges
- Integration of information occurring at different times

Auditory scene analysis. What happens in natural situations? The acoustic environment can be a busy place, with multiple sound sources. How does the auditory system sort out these sources? Through source segregation and segmentation, or auditory scene analysis.

Considering sound quality: timbre. What have we considered in terms of sound so far?
- Fundamental frequency: pitch (high/low)
- Amplitude/intensity: loudness (high/low)
- Duration (long/short)
- Location (horizontal, vertical, distance)
What else? Sound quality (complexity) → timbre

Timbre: the psychological sensation by which a listener can judge two sounds with the same loudness and pitch to be dissimilar. It is conveyed by harmonics and other high-frequency components. The perception of timbre depends on the context in which a sound is heard, and it provides information about the auditory scene.

Auditory scene characteristics: attack and decay of sound. These are the parts of a sound during which amplitude (i) increases (the onset, or "attack") or (ii) decreases (the offset, or "decay").

Timbre: differences in the number and relative strength of harmonics. [Figure: spectra from 400 to 2400 Hz; the guitar shows many harmonics, the flute few (only one, actually).] Attack (onset) and decay (offset) also affect timbre: low harmonics build up faster during the attack, and high harmonics fade sooner during the decay.
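A small synthesis sketch of the slide's guitar/flute contrast: the same 400 Hz fundamental with different harmonic amplitudes, plus a pluck-like envelope for the attack/decay point. The amplitude values and envelope constants are illustrative placeholders, not measurements of real instruments:

```python
import numpy as np

fs = 44_100
t = np.arange(0, 1.0, 1 / fs)

def complex_tone(f0: float, harmonic_amps) -> np.ndarray:
    """Sum harmonics of f0 with the given relative amplitudes."""
    tone = sum(a * np.sin(2 * np.pi * f0 * (k + 1) * t)
               for k, a in enumerate(harmonic_amps))
    return tone / np.max(np.abs(tone))   # equal peak level, so only timbre differs

guitar_like = complex_tone(400.0, [1.0, 0.7, 0.5, 0.4, 0.3, 0.2])  # many harmonics
flute_like  = complex_tone(400.0, [1.0])                           # one harmonic

# Attack and decay also shape timbre: a fast 20 ms attack, then exponential decay.
envelope = np.minimum(t / 0.02, 1.0) * np.exp(-t / 0.4)
plucked = guitar_like * envelope
```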

Gestalt psychology. Reacting against Wilhelm Wundt (1879), who proposed that "perception" was a function of "sensation," Gestalt psychologists were struck by the many ways our perceptions transcend the simple sensations from which they are built, and by the importance of the "organization of perception": "the whole is different from (greater than) the sum of its parts."

How do we perceive objects in our world? Summary of Gestalt rules
- Classical Gestalt "laws": good fit (closure, simplicity); similarity; good continuation; proximity; common fate; familiarity (meaningfulness)
- Modern Gestalt "laws": common region; connectedness; synchrony

Read the relevant pages on the "Gestalt rules" of visual perception in Chapter 4 (see the following slide).

Auditory scene analysis: Gestalt psychology and auditory grouping
- Similarity: location, timbre, pitch
- Temporal proximity: onset, offset, synchrony
- Good continuation (melody)
- Familiarity (experience)
[Figure: a tone sequence plotted as pitch over time]

Factors that contribute to auditory stream segregation: binaural cues. Spatial separation: frequencies coming from different points in space produce different interaural level and timing differences, so frequency components with the same ILD/ITD values are grouped together. This is an example of the Gestalt law of (spatial) proximity.

Stream segregation: timing cues for Gestalt "laws" (a small grouping sketch follows this list)
- Temporal separation: frequency components that occur close together in time are grouped (law of proximity)
- Temporal onsets and offsets: frequencies that start and stop at the same time belong together (law of synchrony)
- Temporal modulations: frequency components that change together belong together (laws of good continuation and common fate)
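One of these cues, onset synchrony, is easy to sketch: assign frequency components that start within a short window of each other to the same source. The component list and the 30 ms window below are made-up illustration values; a real system would first detect onsets from audio:

```python
# Grouping by onset synchrony: components whose onsets fall within a small
# time window are assigned to the same source. All values are hypothetical.
components = [
    (400, 0.00), (800, 0.00), (1200, 0.01),   # (frequency in Hz, onset in s)
    (550, 0.50), (1100, 0.51),
]

def group_by_onset(components, window_s: float = 0.03):
    groups = []
    for freq, onset in sorted(components, key=lambda c: c[1]):
        if groups and onset - groups[-1][-1][1] <= window_s:
            groups[-1].append((freq, onset))   # starts with the previous group
        else:
            groups.append([(freq, onset)])     # a new source begins
    return groups

for i, group in enumerate(group_by_onset(components), start=1):
    print(f"source {i}: {group}")
# source 1: the 400/800/1200 Hz components; source 2: the 550/1100 Hz pair
```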

Frequency components that change together are grouped together; those that do not change are grouped separately (laws of synchrony, proximity, and/or common fate).

Stream segregation: spectral cues and Gestalt "laws"
- Spectral separation: frequencies that are "similar" (e.g., octaves, chords) are grouped together (laws of similarity and familiarity)
- Harmonicity: frequencies that are harmonically related may be grouped together (laws of synchrony and familiarity)
- Spectral profile: frequency components whose relative amplitudes (e.g., soft or loud) remain constant across the sound may be grouped together (law of good continuation)

The tendency to perceive "good continuation" results in perceptual "filling in," or closure.