
The Motor Theory of Speech Perception April 1, 2013.


2 The Motor Theory of Speech Perception April 1, 2013

3 Palatography Preparations! First off: turn in your course project reports! Secondly: we will do the palatography demo on Wednesday. We’ve already gotten volunteers for speakers. We also need volunteers for photography and transcription. I’ll bring the goodies! Now: let’s watch some dogs playing the piano.

4 The Next Level Interestingly, categorical perception is not found for non-speech stimuli. Miyawaki et al. (1975) tested perception of an F3 continuum between /r/ and /l/.

5 The Next Level They also tested perception of the F3 transitions in isolation. Listeners did not perceive these transitions categorically.
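What “categorical” amounts to quantitatively can be seen by fitting an identification function to labeling responses. Below is a minimal sketch with made-up numbers (my illustration, not Miyawaki et al.’s data): a steep logistic slope at the category boundary is the signature of categorical perception, while the isolated F3 chirps give a shallow, gradient function.

```python
# Illustrative sketch of an identification function along a
# hypothetical 10-step /r/-/l/ F3 continuum (synthetic numbers).
import numpy as np
from scipy.optimize import curve_fit

steps = np.arange(1, 11)              # step 1 = /r/-like ... step 10 = /l/-like
pct_l = np.array([2, 3, 5, 10, 45, 90, 95, 97, 98, 99]) / 100.0  # made-up data

def logistic(x, x0, k):
    # x0 = category boundary, k = slope (steep k => categorical labeling)
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

(x0, k), _ = curve_fit(logistic, steps, pct_l, p0=[5.0, 1.0])
print(f"boundary near step {x0:.2f}; slope k = {k:.2f}")
# Full /r/-/l/ syllables yield a steep slope (categorical); the isolated
# F3 transitions yield a shallow, gradient one.
```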

6 The Implications Interpretation: we do not perceive speech in the same way we perceive other sounds. “Speech is special”… and the perception of speech is modular. A module is a special processor in our minds/brains devoted to interpreting a particular kind of environmental stimulus.

7 Module Characteristics You can think of a module as a “mental reflex”. A module of the mind is defined as having the following characteristics:
1. Domain-specific
2. Automatic
3. Fast
4. Hard-wired in brain
5. Limited top-down access (you can’t “unperceive”)
Example: the sense of vision operates modularly.

8 A Modular Mind Model The model, from bottom to top:
external, physical reality
→ transducers (eyes, ears, skin, etc.)
→ modules (vision, hearing, touch, speech)
→ central processes (judgment, imagination, memory, attention)
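Purely as a programming analogy (my illustration, not part of the original slides), the diagram behaves like a pipeline in which central processes only ever see module outputs, never module internals — which is exactly the “limited top-down access” property:

```python
# Toy pipeline analogy for the modular mind model (illustrative only).

def speech_module(signal: str) -> str:
    # domain-specific, automatic, fast; internals are encapsulated
    return f"gestures recovered from {signal!r}"

def central_processes(percept: str) -> str:
    # judgment/imagination/memory/attention operate on percepts only;
    # there is no way to "reach into" the module and hear raw formants
    return f"judgment about: {percept}"

signal = "acoustic input from the ear (transducer)"
print(central_processes(speech_module(signal)))
```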

9 Remember this stuff? Speech is a “special” kind of sound because it exhibits spectral change over time. → It’s processed by the speech module, not by the auditory module.

10 SWS Findings The uninitiated hear sinewave speech either as speech or as “whistles”, “chirps”, etc. Claim: once you hear it as speech, you can’t go back. The speech module takes precedence (limited top-down access). Analogy: it’s impossible to not perceive real speech as speech. We can’t hear the individual formants as whistles, chirps, etc. Motor theory says: we don’t perceive the “sounds”, we perceive the gestures which shape the spectrum.
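The recipe behind sinewave speech is simple enough to sketch: replace each formant with a single time-varying sinusoid. Below is a minimal version with hypothetical, linearly interpolated formant tracks for a /da/-like syllable; real sinewave speech uses tracks measured from a natural utterance (e.g., in Praat).

```python
# Minimal sinewave-speech sketch: one sinusoid per formant track.
# The formant trajectories here are made-up placeholders.
import numpy as np
from scipy.io import wavfile

fs = 16000                              # sample rate in Hz
t = np.arange(0, 0.5, 1 / fs)           # 500 ms stimulus

f1 = np.linspace(300, 700, t.size)      # hypothetical F1 track
f2 = np.linspace(1800, 1200, t.size)    # hypothetical F2 track
f3 = np.linspace(2600, 2500, t.size)    # hypothetical F3 track

def tone(track):
    # accumulate phase so the frequency can vary smoothly over time
    return np.sin(2 * np.pi * np.cumsum(track) / fs)

sws = (tone(f1) + tone(f2) + tone(f3)) / 3.0
wavfile.write("sws.wav", fs, (sws * 32767).astype(np.int16))
```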

11 More Evidence for Modularity It has also been observed that speech is perceived multi-modally. i.e.: we can perceive it through vision, as well as hearing (or some combination of the two). → We’re perceiving “gestures”… and the gestures are abstract. Interesting evidence: the McGurk Effect.

12 McGurk Effect, revealed
Audio   Visual    Perceived
ba    + ga     →  da
ga    + ba     →  bga
Some interesting facts: The McGurk Effect is exceedingly robust. Adults show the McGurk Effect more than children. Americans show the McGurk Effect more than Japanese.

13 Original McGurk Data
Stimulus: auditory ba-ba, visual ga-ga
Response types: Auditory = ba-ba; Visual = ga-ga; Fused = da-da; Combo = gabga, bagba

Age     Auditory   Visual   Fused   Combo
3-5     19%        3        68      10
7-8     36         0        64      0
18-40   2          0        98      0

14 Original McGurk Data
Stimulus: auditory ga-ga, visual ba-ba
Response types: Auditory = ga-ga; Visual = ba-ba; Fused = da-da; Combo = gabga, bagba

Age     Auditory   Visual   Fused   Combo
3-5     57%        10       0       19
7-8     36         21       11      32
18-40   11         31       0       54

15 Audio-Visual Sidebar Visual cues affect the perception of speech in non-mismatched conditions, as well. Scientific studies of lipreading date back to the early twentieth century. The original goal: improve the speech perception skills of the hearing-impaired. Note: visual speech cues often complement audio speech cues. In particular: place of articulation. However, training people to become better lipreaders has proven difficult… Some people get it; some people don’t.

16 Sumby & Pollack (1954) First investigated the influence of visual information on the perception of speech by normal-hearing listeners. Method: presented individual word tokens to listeners in noise, with simultaneous visual cues. Task: identify the spoken word. Conditions: clear speech, +10 dB SNR, +5 dB SNR, and 0 dB SNR.
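Hitting a target signal-to-noise ratio is a small calculation worth making explicit. Here is a sketch of the standard technique (my illustration of the general method, not Sumby & Pollack’s actual procedure): scale the noise so that 10·log10(P_speech / P_noise) equals the desired SNR.

```python
# Mix speech with noise at a target SNR (in dB).
import numpy as np

def mix_at_snr(speech, noise, snr_db):
    """Return speech + noise, with the noise scaled to the requested SNR."""
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise ** 2)
    # solve 10*log10(p_speech / (scale**2 * p_noise)) = snr_db for scale
    scale = np.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
    return speech + scale * noise

rng = np.random.default_rng(0)
speech = rng.standard_normal(16000)   # stand-in for one spoken word token
noise = rng.standard_normal(16000)
mixes = {snr: mix_at_snr(speech, noise, snr) for snr in (10, 5, 0)}
```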

17 Sumby & Pollack data [Figure: word identification accuracy in noise, auditory-only vs. audio-visual.] Visual cues provide an intelligibility boost equivalent to a 12 dB increase in signal-to-noise ratio.

18 Tadoma Method Some deaf-blind people learn to perceive speech through the tactile modality, by using the Tadoma method.

19 Audio-Tactile Perception Fowler & Dekle tested the ability of (naive) college students to perceive speech through the Tadoma method. Presented synthetic stops auditorily. Combined with mismatched tactile information, e.g.: audio /ga/ + tactile /ba/. Also combined with mismatched orthographic information, e.g.: audio /ga/ + orthographic /ba/. Task: listeners reported what they “heard”. Result: the tactile condition biased listeners more towards “ba” responses.

20 Fowler & Dekle data [Figure: proportion of “ba” responses in the orthographic mismatch condition (read “ba”) vs. the tactile mismatch condition (felt “ba”).]

21 Another Piece of the Puzzle Another interesting finding which has been used to argue for the “speech is special” theory is duplex perception. Take an isolated F3 transition and present it to one ear…

22 Do the Edges First! While presenting this spectral frame to the other ear:

23 Two Birds with One Spectrogram The resulting combo is perceived in duplex fashion: one ear hears the F3 “chirp”; the other ear hears the combined stimulus as “da”.
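Constructing such a dichotic stimulus is straightforward once the two pieces exist as audio files. A minimal sketch, assuming hypothetical 16-bit mono files base.wav (the spectral frame) and chirp.wav (the isolated F3 transition):

```python
# Build a two-channel duplex stimulus: spectral frame in one ear,
# isolated F3 transition in the other. File names are hypothetical.
import numpy as np
from scipy.io import wavfile

fs1, base = wavfile.read("base.wav")     # assumed 16-bit mono
fs2, chirp = wavfile.read("chirp.wav")   # assumed 16-bit mono
assert fs1 == fs2, "both pieces must share one sample rate"

n = max(base.size, chirp.size)
stereo = np.zeros((n, 2), dtype=np.int16)
stereo[:base.size, 0] = base             # left ear: frame -> heard as /da/
stereo[:chirp.size, 1] = chirp           # right ear: isolated F3 "chirp"
wavfile.write("duplex.wav", fs1, stereo)
```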

24 Duplex Interpretation Check out the spectrograms in Praat. Mann and Liberman (1983) found: discrimination of the F3 chirps is gradient when they’re in isolation… but categorical when combined with the spectral frame. (Compare with the F3 discrimination experiment with Japanese and American listeners.) Interpretation: the “special” speech processor puts the two pieces of the spectrogram together.
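If you’d rather script the Praat step, the parselmouth library exposes Praat’s analysis engine from Python. This sketch (with a hypothetical file name) draws a spectrogram the way the parselmouth documentation does:

```python
# Spectrogram via Praat's engine from Python (parselmouth library);
# "duplex.wav" is a hypothetical file name.
import numpy as np
import matplotlib.pyplot as plt
import parselmouth

snd = parselmouth.Sound("duplex.wav")
spec = snd.to_spectrogram(window_length=0.005, maximum_frequency=5000)
sg_db = 10 * np.log10(spec.values)       # power -> dB
plt.pcolormesh(spec.x_grid(), spec.y_grid(), sg_db,
               vmin=sg_db.max() - 70, cmap="Greys")
plt.xlabel("time [s]")
plt.ylabel("frequency [Hz]")
plt.show()
```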

25 fMRI data Benson et al. (2001) Non-Speech stimuli = notes, chords, and chord progressions on a piano

26 fMRI data Benson et al. (2001) Difference in activation for natural speech stimuli versus activation for sinewave speech stimuli

27 Mirror Neurons In the 1990s, researchers in Italy discovered what they called mirror neurons in the brains of macaques. Macaques had been trained to make grasping motions with their hands. Researchers recorded the activity of single neurons while the monkeys were making these motions. Serendipity: the same neurons fired when the monkeys saw the researchers making grasping motions. → A neurological link between perception and action. Motor theory claim: the same links exist in the human brain, for the perception of speech gestures.

28 Motor Theory, in a nutshell The big idea: we perceive speech as abstract “gestures”, not sounds. Evidence:
1. The perceptual interpretation of speech differs radically from the acoustic organization of speech sounds.
2. Speech perception is multi-modal.
3. Direct (visual, tactile) information about gestures can influence/override indirect (acoustic) speech cues.
4. Limited top-down access to the primary, acoustic elements of speech.

