The Emergent Structure of Semantic Knowledge

1 The Emergent Structure of Semantic Knowledge
Jay McClelland Department of Psychology and Center for Mind, Brain, and Computation Stanford University

2 The Parallel Distributed Processing Approach to Semantic Cognition
Representation is a pattern of activation distributed over neurons within and across brain areas. Bidirectional propagation of activation underlies the ability to bring these representations to mind from given inputs. The knowledge underlying the propagation of activation is in the connections. Experience affects our knowledge representations through a gradual connection-adjustment process.

3 Distributed Representations: Overlapping Patterns for Related Concepts
[Figure: distributed activation patterns for dog, goat, and hammer]

4 Kiani et al., J Neurophysiol 97:4296–4309, 2007.

5 Emergence of Meaning and Metaphor
Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks. Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.

6 Emergence of Meaning: Differentiation, Reorganization, and Context-Sensitivity

7

8 The Rumelhart Model

9 The Training Data: All propositions true of items at the bottom level of the tree, e.g.: Robin can {grow, move, fly}
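The slide's example can be unpacked into individual (item, relation, attribute) training triples. A minimal sketch, with a second item (salmon) and its attributes added purely for illustration:

```python
# Propositions true of the bottom-level items, keyed by (item, relation);
# 'salmon' and its attribute list are invented for illustration
propositions = {
    ("robin", "can"): ["grow", "move", "fly"],
    ("salmon", "can"): ["grow", "move", "swim"],
}

def training_triples(props):
    """Unpack each proposition into (item, relation, attribute) training triples."""
    return [(item, rel, attr)
            for (item, rel), attrs in props.items()
            for attr in attrs]

triples = training_triples(propositions)
```

Each triple becomes one input/target pattern for the network: item and relation are presented as input, and the attribute is the target output.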

10 Target output for ‘robin can’ input

11 Forward Propagation of Activation
Activations a_j feed each unit i through weights w_ij, giving net input net_i = Σ_j a_j w_ij; weights w_ki then carry activation forward to the next layer.
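The net-input computation can be sketched in a few lines of Python. The logistic squashing function is assumed, as in the standard Rumelhart model; the weights below are illustrative:

```python
import math

def forward(a_inputs, weights):
    """Compute net_i = sum_j a_j * w_ij for each unit i,
    then squash with the logistic function."""
    nets = [sum(a_j * w for a_j, w in zip(a_inputs, row)) for row in weights]
    return [1.0 / (1.0 + math.exp(-net)) for net in nets]

# Two input units feeding one unit (illustrative weights)
a = [1.0, 0.0]
w = [[2.0, -1.0]]           # w[i][j]: weight from input unit j to unit i
activation = forward(a, w)  # net = 2.0, sigmoid(2.0) ≈ 0.881
```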

12 Back Propagation of Error (δ)
Error-correcting learning: at the output layer, δ_k ≈ (t_k − a_k) and Δw_ki = ε δ_k a_i; at the prior layer, δ_i ≈ Σ_k δ_k w_ki and Δw_ij = ε δ_i a_j.
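The update rules on this slide can be sketched as a single training step for a three-layer net (input j → hidden i → output k). The logistic activation and its derivative factor a(1 − a) inside the deltas are assumed, as is standard for networks of this kind; the demo weights and learning rate are illustrative:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def backprop_step(a_j, w_ij, w_ki, t, eps=0.1):
    """One error-correcting step for a three-layer net (input j -> hidden i -> output k)."""
    # Forward pass
    a_i = [sigmoid(sum(a * w for a, w in zip(a_j, row))) for row in w_ij]
    a_k = [sigmoid(sum(a * w for a, w in zip(a_i, row))) for row in w_ki]
    # Output deltas: (t_k - a_k) times the logistic derivative a_k(1 - a_k)
    d_k = [(tk - ak) * ak * (1.0 - ak) for tk, ak in zip(t, a_k)]
    # Hidden deltas: back-propagated error, delta_i ~ sum_k delta_k * w_ki
    d_i = [ai * (1.0 - ai) * sum(dk * row[i] for dk, row in zip(d_k, w_ki))
           for i, ai in enumerate(a_i)]
    # Weight updates: Dw_ki = eps * delta_k * a_i ; Dw_ij = eps * delta_i * a_j
    new_w_ki = [[w + eps * dk * ai for w, ai in zip(row, a_i)]
                for dk, row in zip(d_k, w_ki)]
    new_w_ij = [[w + eps * di * aj for w, aj in zip(row, a_j)]
                for di, row in zip(d_i, w_ij)]
    return new_w_ij, new_w_ki

# Tiny demo: one input, two hidden units, one output, target 1.0
w_ij, w_ki = [[0.1], [0.2]], [[0.1, 0.1]]
for _ in range(1000):
    w_ij, w_ki = backprop_step([1.0], w_ij, w_ki, [1.0], eps=0.5)
```

After repeated steps, the output activation approaches the target, which is the gradual connection-adjustment process the earlier slides describe.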

13

14

15 [Figure: learned representations at three points in training (Early, Later, Later Still), arranged along an axis of Experience]

16

17 What Drives Progressive Differentiation?
Waves of differentiation reflect coherent covariation of properties across items. Patterns of coherent covariation are reflected in the principal components of the property covariance matrix. The figure shows attribute loadings on the first three principal components:
1. Plants vs. animals
2. Birds vs. fish
3. Trees vs. flowers
Same color: features covary within the component. Different color: features anti-covary.
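The link between coherent covariation and principal components can be illustrated on a toy item × property matrix. The items, properties, and data below are invented for illustration, and `first_component` uses simple power iteration rather than a library PCA:

```python
def first_component(matrix, iters=100):
    """First principal component of the property covariance matrix
    (rows = items, columns = properties), via power iteration."""
    n, p = len(matrix), len(matrix[0])
    means = [sum(row[j] for row in matrix) / n for j in range(p)]
    centered = [[row[j] - means[j] for j in range(p)] for row in matrix]
    cov = [[sum(centered[r][i] * centered[r][j] for r in range(n)) / n
            for j in range(p)] for i in range(p)]
    v = [float(j + 1) for j in range(p)]  # arbitrary, non-symmetric start
    for _ in range(iters):
        v = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    return v

# Toy item x property matrix: two plants, two animals
# (properties: has-roots, has-leaves, can-move, has-skin; invented data)
items = [
    [1, 1, 0, 0],  # tree
    [1, 1, 0, 0],  # flower
    [0, 0, 1, 1],  # bird
    [0, 0, 1, 1],  # fish
]
pc1 = first_component(items)
# Plant properties load with one sign, animal properties with the other
```

The plant/animal split dominates the first component here because those properties covary most coherently, mirroring the first wave of differentiation on the slide.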

18 Sensitivity to Coherence Requires Convergence

19 Conceptual Reorganization (Carey, 1985)
Carey demonstrated that young children 'discover' the unity of plants and animals as living things with many shared properties only around the age of 10. She suggested that the coalescence of the concept of living thing depends on learning about diverse aspects of plants and animals, including:
The nature of life-sustaining processes
What it means to be dead vs. alive
Reproductive properties
Can reorganization occur in a connectionist net?

20 Conceptual Reorganization in the Model
Suppose superficial appearance information, which is not coherent with much else, is always available… And there is a pattern of coherent covariation across information that is contingently available in different contexts. The model forms initial representations based on superficial appearances. Later, it discovers the shared structure that cuts across the different contexts, reorganizing its representations.

21

22 Organization of Conceptual Knowledge Early and Late in Development

23

24 Overall Structure Extracted by a Structured Statistical Model

25

26 Sensitivity to Context
Context-general representation Context-sensitive representation

27 Relation-specific representations
The 'is' representations (top) reflect idiosyncratic appearance properties. The 'has' representations are similar to the context-general representations (middle). The 'can' representations collapse differences among the plants, since there is little that plants can do; the fish are all the same because there is no difference in what they can do.

28 Ongoing Work Can the representations learned in the distributed connectionist model capture different patterns of generalization for different kinds of properties? Simulations already show context-specific patterns of property generalization. We are currently collecting a detailed new data set to explore the sufficiency of the model to explain experimental data on context-specific patterns of generalization.

29 Generalization of different property types
At different points in training, the network is taught one of:
Maple can queem
Maple is queem
Maple has queem
Only the weights from the hidden layer to the output are allowed to change. The network is then tested to see how strongly 'queem' is activated when the same relation is paired with other items.
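The test procedure can be sketched as follows, assuming fixed (frozen) hidden representations for each item. The representations, item names, and learning rate below are invented for illustration; only the hidden-to-output weights change, as on the slide:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical frozen hidden representations for three items
# (invented for illustration; in the model these are learned)
reps = {
    "maple": [0.9, 0.8, 0.1],
    "oak":   [0.85, 0.75, 0.15],  # similar to maple
    "robin": [0.1, 0.2, 0.9],     # dissimilar to maple
}

def teach_queem(item, steps=200, eps=0.5):
    """Teach 'queem' of one item, changing only hidden-to-output weights."""
    w, bias = [0.0, 0.0, 0.0], 0.0
    for _ in range(steps):
        a = sigmoid(sum(r * wi for r, wi in zip(reps[item], w)) + bias)
        d = (1.0 - a) * a * (1.0 - a)  # delta rule, target = 1
        w = [wi + eps * d * r for wi, r in zip(w, reps[item])]
        bias += eps * d
    return w, bias

def queem_activation(item, w, bias):
    return sigmoid(sum(r * wi for r, wi in zip(reps[item], w)) + bias)

w, b = teach_queem("maple")
# 'queem' generalizes more strongly to oak (similar representation) than to robin
```

Because the learned output weights align with maple's representation, items with similar representations inherit the new property most strongly.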

30 Generalization to other concepts after training with can, has, or is queem

31 Ongoing Work Can the representations learned in the distributed connectionist model capture different patterns of generalization for different kinds of properties? Our simulations already show context-specific patterns of property generalization. We are currently conducting new experiments to gather experimental data on context-specific patterns of generalization that we will use to test an extended version of the model trained with a much larger training set.

32 Metaphor in Connectionist Models of Semantics
By metaphor I mean: the application of a relation learned in one domain to a novel situation in another

33 Hinton’s Family Tree Network
Input units: Person 1 and Relation; output units: Person 2. Training data: Colin's Father is James … Alfonso's Father is Marco
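The network's localist input/output encoding can be sketched as follows. The vocabulary below is a tiny illustrative subset built from the names on the slide; Hinton's actual network used 24 people and 12 relations, with one input unit per person or relation:

```python
# Tiny illustrative subset of the vocabulary (the full model has
# 24 people and 12 relations, each with its own localist unit)
people = ["Colin", "James", "Alfonso", "Marco"]
relations = ["Father", "Mother"]

def one_hot(item, vocab):
    return [1.0 if v == item else 0.0 for v in vocab]

def encode(person1, relation, person2):
    """Localist encoding of one triple: input = Person 1 + Relation units,
    target = Person 2 units."""
    x = one_hot(person1, people) + one_hot(relation, relations)
    t = one_hot(person2, people)
    return x, t

x, t = encode("Colin", "Father", "James")
```

Each training proposition thus activates exactly one unit in the person block and one in the relation block; the hidden layers must learn distributed representations to answer correctly.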

34 English Tree Recovered
Italian Tree Recovered

35 Understanding Via Metaphor in the Family Trees Network
Marco’s father is Pierro. Who is James’s father?

36 Future Work: Metaphors We Live By
In Hinton’s model, neither domain is the base: each influences the other equally. But research suggests that some domains serve as a base that influences other domains:
Lakoff – physical structure as a base for the structure of an intellectual argument
Boroditsky – space as a base for time
In connectionist networks, primacy and frequency both influence performance. This allows the models to simulate how early and pervasive experience may allow one domain to serve as the base for others experienced later or less frequently. Influences can still run in both directions, but to different extents.

37 Emergence of Meaning and Metaphor
Learned distributed representations that capture important aspects of meaning emerge through a gradual learning process in simple connectionist networks. Metaphor arises naturally as a byproduct of learning information in homologous domains in models of this type.
