Development and Disintegration of Conceptual Knowledge: A Parallel-Distributed Processing Approach James L. McClelland Department of Psychology and Center for Mind, Brain, and Computation Stanford University

Parallel Distributed Processing Approach to Semantic Cognition A representation is a pattern of activation distributed over neurons within and across brain areas. Bidirectional propagation of activation, mediated by a learned internal representation, underlies the ability to bring these representations to mind from given inputs. The knowledge underlying propagation of activation is in the connections, and is acquired through a gradual learning process.

A Principle of Learning and Representation Learning and representation are sensitive to coherent covariation of properties across experiences.

What is Coherent Covariation? The tendency of properties of objects to co-occur in clusters. e.g. Has wings Can fly Is light Or Has roots Has rigid cell walls Can grow tall

Development and Degeneration Sensitivity to coherent covariation in an appropriately structured Parallel Distributed Processing system creates the taxonomy of categories that populate our minds and underlies the development of conceptual knowledge. Gradual degradation of the representations constructed through this developmental process underlies the pattern of semantic disintegration seen in semantic dementia.

Some Phenomena in Development Progressive differentiation of concepts Overextension of frequent names Overgeneralization of typical properties

The Rumelhart Model

The Training Data: All propositions true of items at the bottom level of the tree, e.g.: Robin can {grow, move, fly}
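As an illustration, propositions like these can be encoded as one-hot item and relation inputs paired with multi-hot attribute targets. This is a minimal sketch with a hypothetical vocabulary; the item, relation, and attribute lists below are illustrative, not the model's actual training corpus:

```python
# Sketch (not the original implementation): encoding training propositions
# as (item, relation) -> attributes, then converting them to one-hot input
# and multi-hot target vectors. Vocabulary is a hypothetical subset.

ITEMS = ["robin", "canary", "salmon", "oak"]
RELATIONS = ["isa", "is", "can", "has"]
ATTRIBUTES = ["grow", "move", "fly", "swim", "living", "bird"]

# Each pattern: (item, relation) -> set of attributes true of that item.
TRAINING_DATA = {
    ("robin", "can"): {"grow", "move", "fly"},    # from the slide
    ("salmon", "can"): {"grow", "move", "swim"},  # illustrative
}

def one_hot(name, vocab):
    """Return a one-hot list with a 1.0 at the index of `name` in `vocab`."""
    return [1.0 if v == name else 0.0 for v in vocab]

def make_pattern(item, relation):
    """Build (input, target): input concatenates the one-hot item and
    relation codes; target is a multi-hot vector over all attributes."""
    x = one_hot(item, ITEMS) + one_hot(relation, RELATIONS)
    attrs = TRAINING_DATA[(item, relation)]
    t = [1.0 if a in attrs else 0.0 for a in ATTRIBUTES]
    return x, t

x, t = make_pattern("robin", "can")
print(t)  # [1.0, 1.0, 1.0, 0.0, 0.0, 0.0] -- grow, move, fly
```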

Target output for ‘robin can’ input

Forward Propagation of Activation For each receiving unit i, the net input is net_i = Σ_j a_j w_ij, and the unit's activation a_i is a function of net_i.

Back Propagation of Error (δ) At the output layer: δ_k ≈ (t_k - a_k). At the prior layer: δ_i ≈ Σ_k δ_k w_ki. Error-correcting learning: at the output layer, Δw_ki = ε δ_k a_i; at the prior layer, Δw_ij = ε δ_i a_j.
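Together with the forward pass, these update rules amount to a complete learning procedure. A minimal pure-Python sketch, assuming sigmoid units (so f'(net) = a(1 - a)) and arbitrary layer sizes and learning rate:

```python
# Minimal sketch of the learning rules on the slides for one hidden layer.
# Sigmoid units; sizes and the learning rate eps are arbitrary choices.
import math, random

random.seed(0)
n_in, n_hid, n_out, eps = 3, 4, 2, 0.5

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_out)]

def forward(a_in):
    # net_i = sum_j a_j * w_ij ; a_i = f(net_i)
    a_hid = [sigmoid(sum(w * a for w, a in zip(row, a_in))) for row in W1]
    a_out = [sigmoid(sum(w * a for w, a in zip(row, a_hid))) for row in W2]
    return a_hid, a_out

def train_step(a_in, target):
    a_hid, a_out = forward(a_in)
    # Output layer: delta_k = (t_k - a_k) * f'(net_k)
    d_out = [(t - a) * a * (1 - a) for t, a in zip(target, a_out)]
    # Prior layer: delta_i = (sum_k delta_k * w_ki) * f'(net_i)
    d_hid = [sum(d_out[k] * W2[k][j] for k in range(n_out)) * a_hid[j] * (1 - a_hid[j])
             for j in range(n_hid)]
    # Weight changes: Dw_ki = eps * delta_k * a_i, and likewise one layer down.
    for k in range(n_out):
        for j in range(n_hid):
            W2[k][j] += eps * d_out[k] * a_hid[j]
    for j in range(n_hid):
        for i in range(n_in):
            W1[j][i] += eps * d_hid[j] * a_in[i]

x, t = [1.0, 0.0, 1.0], [1.0, 0.0]
err0 = sum((ti - ai) ** 2 for ti, ai in zip(t, forward(x)[1]))
for _ in range(200):
    train_step(x, t)
err1 = sum((ti - ai) ** 2 for ti, ai in zip(t, forward(x)[1]))
print(err1 < err0)  # True: error decreases with training
```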

[Figure: hidden-layer representations of the items, shown early, later, and later still in experience.]

What Drives Progressive Differentiation? Waves of differentiation reflect coherent covariation of properties across items. Patterns of coherent covariation are reflected in the principal components of the property covariance matrix. The figure shows attribute loadings on the first three principal components: (1) plants vs. animals; (2) birds vs. fish; (3) trees vs. flowers. (Same color = features that covary within a component; different colors = anti-covarying features.)
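The link between coherent covariation and principal components can be illustrated with toy numbers (my own, not the model's training set). Two "plant" features and two "animal" features covary coherently, and the leading eigenvector of the property covariance matrix, found here by power iteration, loads the two groups with opposite signs:

```python
# Toy illustration: coherent covariation shows up in the property
# covariance matrix. Items and feature values are invented for clarity.
# Items:         pine  rose  robin  salmon
data = {
    "has roots":   [1, 1, 0, 0],
    "rigid walls": [1, 1, 0, 0],
    "can move":    [0, 0, 1, 1],
    "has skin":    [0, 0, 1, 1],
}
feats = list(data)
n = len(data["has roots"])

def cov(u, v):
    """Covariance of two feature vectors across items."""
    mu, mv = sum(u) / n, sum(v) / n
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n

C = [[cov(data[f], data[g]) for g in feats] for f in feats]

# Power iteration for the leading eigenvector of C.
vec = [1.0, 0.5, -0.5, -1.0]  # arbitrary start, not orthogonal to the answer
for _ in range(50):
    vec = [sum(C[i][j] * vec[j] for j in range(len(feats))) for i in range(len(feats))]
    norm = sum(x * x for x in vec) ** 0.5
    vec = [x / norm for x in vec]

loadings = dict(zip(feats, vec))
# Plant features share a sign; animal features take the opposite sign.
print(loadings)
```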

Coherence Training Patterns [Figure: a matrix of 16 items by their 'is', 'can', and 'has' properties, divided into coherent and incoherent property sets.] No labels are provided. Each item and each property occurs with equal frequency. Coherently co-varying inputs are not presented at the same time!

Effect of Coherence on Representation

Overextension of a Frequent Name to Similar Objects [Figure: naming responses for an oak and a goat, involving the names 'tree', 'goat', and 'dog'.]

Overgeneralization of typical properties Rochel Gelman found that children think that all animals have feet. Even animals that look like small furry balls and don’t seem to have any feet at all.

A typical property that a particular object lacks (e.g., 'pine has leaves') vs. an infrequent, atypical property.

Development and Degeneration Sensitivity to coherent covariation in an appropriately structured Parallel Distributed Processing system underlies the development of conceptual knowledge. Gradual degradation of the representations constructed through this developmental process underlies the pattern of disintegration seen in semantic dementia.

Disintegration of Conceptual Knowledge in Semantic Dementia Progressive loss of specific knowledge of concepts, including their names, with preservation of general information Overextension of frequent names Overgeneralization of typical properties

Picture naming and drawing in Semantic Dementia

Grounding the Model in What We Know About the Organization of Semantic Knowledge in the Brain Specialized areas exist for each of many different kinds of semantic information. Semantic dementia results from progressive bilateral disintegration of the anterior temporal cortex. Destruction of the medial temporal lobes results in loss of memory for recent events and loss of the ability to form new memories quickly, but leaves existing semantic knowledge unaffected.

Proposed Architecture for the Organization of Semantic Memory [Figure: modality-specific regions for action, name, motion, color, valence, and form, interconnected through the temporal pole, with the medial temporal lobe as a separate system.]

Rogers et al (2005) model of semantic dementia Trained with 48 items from six categories (from a clinical test). Names are individual units; other patterns are feature vectors. Features come from a norming study. From any input, the model produces all other patterns as output. Representations undergo progressive differentiation as learning progresses. Test of 'picture naming': present the vision input; the most active name unit above a threshold is chosen as the response. [Figure: network with name, associate, and function layers linked to a vision layer via the temporal pole.]
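The naming read-out described above can be sketched as a simple decision rule. The threshold value and name-unit activations below are illustrative assumptions, not values from the paper:

```python
# Sketch of the 'picture naming' read-out rule: respond with the most
# active name unit if it exceeds a threshold, otherwise score an omission.
THRESHOLD = 0.5  # assumed value, not from the paper

def naming_response(name_activations):
    """name_activations: dict mapping name unit -> activation in [0, 1].
    Returns the chosen name, or None for an omission."""
    best = max(name_activations, key=name_activations.get)
    return best if name_activations[best] > THRESHOLD else None

# Intact network: 'robin' wins. Damaged network: nothing clears threshold.
print(naming_response({"robin": 0.9, "bird": 0.4, "animal": 0.2}))  # robin
print(naming_response({"robin": 0.3, "bird": 0.4, "animal": 0.2}))  # None (omission)
```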

Errors in Naming as a Function of Severity [Figure: simulation results alongside patient data. Error types: omissions, within-category errors, and superordinate errors. Patient x-axis: severity of dementia; simulation x-axis: fraction of connections destroyed.]
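The severity manipulation on the simulation's x-axis can be sketched as random removal of connections. The matrix size and damage fraction below are arbitrary:

```python
# Sketch of 'fraction of connections destroyed': simulate damage by
# zeroing a randomly chosen fraction of a weight matrix.
import random

def lesion(weights, fraction, rng):
    """Return a copy of a weight matrix with `fraction` of entries zeroed."""
    flat = [(i, j) for i in range(len(weights)) for j in range(len(weights[0]))]
    destroyed = set(rng.sample(flat, int(fraction * len(flat))))
    return [[0.0 if (i, j) in destroyed else w
             for j, w in enumerate(row)] for i, row in enumerate(weights)]

rng = random.Random(1)
W = [[1.0] * 10 for _ in range(10)]       # toy 10x10 weight matrix
damaged = lesion(W, 0.3, rng)
remaining = sum(w for row in damaged for w in row)
print(remaining)  # 70.0: 30 of 100 connections destroyed
```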

Simulation of Delayed Copying Visual input is presented, then removed. After three time steps, the vision-layer pattern is compared to the pattern that was presented. Omissions and intrusions are scored for typicality. [Figure: network with name, associate, and function layers linked to a vision layer via the temporal pole.]

Omission Errors IF’s ‘camel’

Intrusion Errors DC’s ‘swan’

Development and Degeneration Sensitivity to coherent covariation in an appropriately structured Parallel Distributed Processing system underlies the development of conceptual knowledge. Gradual degradation of the representations constructed through this developmental process underlies the pattern of semantic disintegration seen in semantic dementia.

A Hierarchical Bayesian Characterization Initially, assume there is only one 'kind of thing' in the world; assign probabilities to properties according to their overall occurrence rates. Coherent covariation licenses the successive splitting of categories: probabilities of individual features become much more predictable, and conditional relations between features become available for inference. Overgeneralization and overextension are consequences of implicitly applying the 'kind' feature probabilities to the item, and depend on the current level of splitting into kinds. This process occurs by an on-line, incremental learning process, in a gradual and graded way, without explicit enumeration of possibilities.
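A toy version of the 'kind' probabilities (with invented items and numbers): with a single kind, a feature gets its overall occurrence rate; after splitting, the per-kind rate is sharper, and applying it to every member of the kind yields overgeneralizations such as 'worms have feet':

```python
# Toy sketch of 'kind' feature probabilities at two levels of splitting.
# Items and feature values are invented for illustration.
items = {
    "robin": {"kind": "animal", "feet": 1},
    "dog":   {"kind": "animal", "feet": 1},
    "worm":  {"kind": "animal", "feet": 0},
    "pine":  {"kind": "plant",  "feet": 0},
    "rose":  {"kind": "plant",  "feet": 0},
}

def p_feature(feature, kind=None):
    """Occurrence rate of `feature`, over all items or within one kind."""
    pool = [v for v in items.values() if kind is None or v["kind"] == kind]
    return sum(v[feature] for v in pool) / len(pool)

print(p_feature("feet"))            # 0.4: one undifferentiated kind
print(p_feature("feet", "animal"))  # ~0.67: applied to 'worm', this
                                    # overgeneralizes 'has feet'
```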

A Hierarchical Bayesian Characterization (Cont'd) Damage causes the network to revert to a simpler model. Perhaps the network can be seen as maximizing its accuracy in 'explaining' the properties of objects given limited training data during acquisition and limited resources during degradation.

Thanks for your attention!

Sensitivity to Coherence Requires Convergence [Figure: alternative network architectures contrasting convergent and non-convergent connectivity.]