Language Comprehension Speech Perception Semantic Processing & Naming Deficits.

Triangle Model: a connectionist framework for lexical processing, adapted from Seidenberg and McClelland (1989) and Plaut et al. (1996). [Diagram: Thought / high-level understanding at the top; Semantics (meaning) connects to Orthography (text), supporting reading, and to Phonology (speech), supporting speech perception.]

Speech Perception The first step in comprehending spoken language is to identify the words being spoken. This is accomplished in multiple stages: 1. Phonemes are detected (/b/, /e/, /t/, /e/, /r/) 2. Phonemes are combined into syllables (/be/ /ter/) 3. Syllables are combined into words (“better”) 4. The word’s meaning is retrieved from memory
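The four stages above can be sketched as a toy pipeline. This is only an illustration: the phoneme segmentation, the one-entry lexicon, and the meaning gloss are all invented, and real spoken-word recognition is far messier.

```python
# A minimal sketch of the four stages of spoken-word recognition listed above.
phonemes = ["b", "e", "t", "e", "r"]                        # 1. phonemes detected
syllables = ["".join(phonemes[:2]), "".join(phonemes[2:])]  # 2. /be/ + /ter/
lexicon = {"beter": "better"}                               # 3. syllable string -> word
meanings = {"better": "comparative of 'good'"}              # 4. word -> meaning (invented gloss)

word = lexicon["".join(syllables)]
meaning = meanings[word]
print(word, "->", meaning)
```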

Spectrogram: I owe you a yo-yo

How many words were spoken (in Finnish)? [Audio demo; answer choices: 1, 2, 3]

Speech perception: two problems Words are not neatly segmented (e.g., by pauses) Phonemes are difficult to identify –Coarticulation: consecutive speech sounds blend into one another due to mechanical constraints on the articulators –Speaker differences: pitch varies with age and sex; dialects, speaking rates, etc. also differ
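The segmentation problem can be made concrete with a toy demo: without pauses, the same continuous stream can often be carved into words in more than one way. The "phoneme string" below is just letters and the mini-lexicon is invented, echoing the "I owe you a yo-yo" spectrogram.

```python
# Illustrates the segmentation problem: an unsegmented stream admits
# multiple word-level parses against the same lexicon.
LEXICON = {"i", "owe", "you", "a", "yo", "yoyo"}

def segmentations(s, lexicon):
    """Return every way to split s into a sequence of lexicon words."""
    if not s:
        return [[]]
    parses = []
    for i in range(1, len(s) + 1):
        if s[:i] in lexicon:
            for rest in segmentations(s[i:], lexicon):
                parses.append([s[:i]] + rest)
    return parses

parses = segmentations("ioweyouayoyo", LEXICON)
print(parses)  # two parses: "... a yo yo" vs. "... a yoyo"
```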

How Do Listeners Resolve Ambiguity in the Acoustic Input? Use of context –Cross-modal context, e.g., visual cues: the McGurk effect –Semantic context, e.g., the phonemic restoration effect

Effect of Semantic Context Pollack & Pickett (1964) –Recorded several conversations. –Subjects had to identify the words in the conversation. –When words were spliced out of the conversation and presented auditorily in isolation, subjects identified the correct word only 47% of the time. –When context was provided, words were identified with much higher accuracy. –The clarity of speech is an illusion: we hear what we expect to hear.

Phonemic restoration (Warren, R. M. (1970). Perceptual restorations of missing speech sounds. Science, 167.)
Auditory presentation → Perception
“legislature” → legislature
“legi*lature” (* = sound replaced by a noise) → legislature
“legi_lature” (_ = sound replaced by silence) → legi lature
Listeners also restore the missing sound to fit later context:
“It was found that the *eel was on the axle.” → wheel
“It was found that the *eel was on the shoe.” → heel
“It was found that the *eel was on the orange.” → peel
“It was found that the *eel was on the table.” → meal

McGurk Effect Perception of an auditory event is affected by visual processing. Harry McGurk and John MacDonald, “Hearing lips and seeing voices,” Nature 264 (1976).

McGurk Effect In the McGurk video: –lip movements = “ga” –speech sound = “ba” –perceived speech = “da” (for 98% of adults) This demonstrates parallel & interactive processing: speech perception is based on multiple sources of information, e.g., lip movements and auditory input. The brain makes the reasonable assumption that both sources are informative and “fuses” the information.
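The "fusing" idea can be sketched as toy cue combination: treat each modality as a likelihood over candidate syllables and multiply them, as if both cues were independent and informative. The numbers below are invented purely so that the intermediate category wins; they are not fit to any data.

```python
# Toy cue fusion in the spirit of the McGurk effect.
auditory = {"ba": 0.6, "da": 0.3, "ga": 0.1}  # the sound favors "ba"
visual   = {"ba": 0.1, "da": 0.3, "ga": 0.6}  # the lips favor "ga"

fused = {syl: auditory[syl] * visual[syl] for syl in auditory}
percept = max(fused, key=fused.get)
print(percept)  # "da": neither cue's favorite, but the best joint compromise
```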

Models of Spoken Word Identification The Cohort Model –Marslen-Wilson & Welsh, 1978 –Revised, Marslen-Wilson, 1989 The TRACE Model –Similar to the Interactive Activation model –McClelland & Elman, 1986

Online word recognition: the cohort model

Recognizing Spoken Words: The Cohort Model All candidates considered in parallel Candidates eliminated as more evidence becomes available in the speech input Uniqueness point occurs when only one candidate remains
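The three bullets above can be sketched as a small simulation, assuming an invented mini-lexicon: as each phoneme arrives, only words consistent with the input so far survive, and the uniqueness point is the first position with a single survivor.

```python
# Minimal sketch of cohort elimination during spoken-word recognition.
LEXICON = ["beaker", "beetle", "speaker", "beak", "bean"]

def cohort_trace(phonemes, lexicon):
    """Return (position, surviving cohort) pairs as the input unfolds."""
    trace, prefix = [], ""
    for pos, p in enumerate(phonemes, start=1):
        prefix += p
        trace.append((pos, [w for w in lexicon if w.startswith(prefix)]))
    return trace

trace = cohort_trace("beaker", LEXICON)
uniqueness_point = next(pos for pos, cohort in trace if len(cohort) == 1)
for pos, cohort in trace:
    print(pos, cohort)
print("uniqueness point:", uniqueness_point)  # position 5 ("beake")
```

Note that "speaker" never enters the cohort: the classic cohort model only considers words whose onset matches the input.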

Analyzing speech perception with eye tracking Allopenna, Magnuson & Tanenhaus (1998) used an eye-tracking device to measure where subjects look while following spoken instructions such as ‘Point to the beaker.’

Human Eye Tracking Data Plot shows the probability of fixating on an object as a function of time Allopenna, Magnuson & Tanenhaus (1998)

TRACE: a neural network model Similar to the interactive activation model, but applied to speech recognition. Connections between levels are bidirectional and excitatory → top-down effects. Connections within a level are inhibitory, producing competition between alternatives. (McClelland & Elman, 1986)
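A drastically simplified interactive-activation loop in the spirit of TRACE can illustrate these two kinds of connections. The units, weights, and inputs below are all invented: between-level connections are excitatory in both directions (so word activity feeds back to phonemes), while within-level connections are inhibitory.

```python
# Toy interactive activation: two levels, bidirectional excitation,
# within-level inhibition, ambiguous acoustic input on the final sound.
WORDS = {"bat": ["b", "a", "t"], "bad": ["b", "a", "d"]}
PHONEMES = ["b", "a", "t", "d"]
bottom_up_input = {"b": 1.0, "a": 1.0, "t": 0.6, "d": 0.4}

EXCITE, INHIBIT, DECAY = 0.1, 0.05, 0.9
phon_act = {p: 0.0 for p in PHONEMES}
word_act = {w: 0.0 for w in WORDS}

for _ in range(20):
    # Phoneme level: acoustic input + top-down word support, minus rivals.
    new_phon = {
        p: DECAY * phon_act[p]
           + EXCITE * (bottom_up_input[p] + sum(word_act[w] for w in WORDS if p in WORDS[w]))
           - INHIBIT * sum(phon_act[q] for q in PHONEMES if q != p)
        for p in PHONEMES
    }
    # Word level: bottom-up support from constituent phonemes, minus rival words.
    new_word = {
        w: DECAY * word_act[w]
           + EXCITE * sum(phon_act[p] for p in WORDS[w])
           - INHIBIT * sum(word_act[v] for v in WORDS if v != w)
        for w in WORDS
    }
    phon_act, word_act = new_phon, new_word

winner = max(word_act, key=word_act.get)
print(winner)  # "bat": the stronger /t/ evidence wins the competition
```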

TRACE Model Predictions

Semantic Representations and Naming Deficits

Triangle Model (repeated): connectionist framework for lexical processing, adapted from Seidenberg and McClelland (1989) and Plaut et al. (1996). [Diagram: Thought / high-level understanding; Semantics (meaning); Orthography (text); Phonology (speech).]

Representing Meaning There is no mental dictionary. Instead, we appear to represent meaning with an interconnected (distributed) representation involving many different kinds of “features”
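A sketch of what a feature-based representation buys us: each word is a bundle of features rather than a dictionary entry, and semantic similarity falls out of feature overlap. The feature sets below are invented for illustration.

```python
# Distributed meaning as feature bundles; similarity = feature overlap.
FEATURES = {
    "canary": {"living", "animal", "has_wings", "can_sing", "is_yellow"},
    "robin":  {"living", "animal", "has_wings", "can_fly"},
    "hammer": {"man_made", "tool", "used_for_pounding"},
}

def similarity(w1, w2):
    """Jaccard overlap of two words' feature sets (1.0 = identical)."""
    a, b = FEATURES[w1], FEATURES[w2]
    return len(a & b) / len(a | b)

print(similarity("canary", "robin"))   # shares living / animal / has_wings
print(similarity("canary", "hammer"))  # no shared features
```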

Representing Meaning

Category-Specific Semantic Deficits Some patients have trouble naming living things but not non-living things, or vice versa. Evidence comes from patients with such category-specific impairments. Warrington and Shallice (1984)

Definitions given by patients JBR and SBY Farah and McClelland (1991)

Explanation One possibility is that there are two separate systems for living and non-living things. A more likely explanation: –Living things are primarily distinguished by their perceptual attributes; non-living things are distinguished by their functional attributes. → the sensory-functional approach

Sensory-Functional Approach Support for sensory-functional approach comes from a number of studies Example: –Dictionary studies measured the ratio of visual to functional features: 7.7:1 for living things 1.4:1 for nonliving things Farah & McClelland (1991)

A neural network model of category-specific impairments Farah and McClelland (1991) A single semantic system with both functional and visual features. The model was trained to associate a visual picture with the name of an object via a distributed internal semantic representation.

A neural network model of category-specific impairments Farah and McClelland (1991) Simulate the effect of brain lesions.

Simulating the Effects of Brain Damage by “lesioning” the model Farah and McClelland (1991) Visual lesions: selective impairment of living things. Functional lesions: selective impairment of non-living things.
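The lesioning logic can be sketched in a toy simulation, assuming invented feature counts: living things carry mostly visual semantic features, artifacts relatively more functional ones (echoing the 7.7:1 vs 1.4:1 ratios above). The real Farah & McClelland model is a trained connectionist network; here, naming simply succeeds if enough of an item's features survive the lesion.

```python
# Toy lesion simulation: damaging one feature pool selectively
# impairs the category that depends on that pool.
import random

random.seed(0)

ITEMS = {
    # name: (visual features, functional features) — invented counts
    "dog":    (8, 1),   # living: mostly visual
    "tiger":  (8, 1),
    "hammer": (3, 2),   # non-living: more balanced
    "chair":  (3, 2),
}

def naming_accuracy(lesion, damage=0.8, threshold=0.7, trials=2000):
    """Fraction of trials each item remains nameable after the lesion.

    Each feature in the lesioned pool ("visual" or "functional") is destroyed
    with probability `damage`; naming succeeds if at least `threshold` of the
    item's total features survive.
    """
    scores = {}
    for name, (n_vis, n_fun) in ITEMS.items():
        correct = 0
        for _ in range(trials):
            vis = sum(random.random() > damage for _ in range(n_vis)) if lesion == "visual" else n_vis
            fun = sum(random.random() > damage for _ in range(n_fun)) if lesion == "functional" else n_fun
            if (vis + fun) / (n_vis + n_fun) >= threshold:
                correct += 1
        scores[name] = correct / trials
    return scores

vis_lesion = naming_accuracy("visual")
fun_lesion = naming_accuracy("functional")
print("visual lesion:    ", vis_lesion)  # living things hit much harder
print("functional lesion:", fun_lesion)  # non-living things hit much harder
```

The double dissociation falls out of the feature ratios alone: no separate "living things" module is needed, which is the point of the single-system account above.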