NTL – Converging Constraints



2 NTL – Converging Constraints
Basic concepts and words derive their meaning from embodied experience. Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts. Structured connectionist models can capture both of these processes nicely. Grammar extends this with constructions: pairings of form with embodied meaning.

3 Simulation-based language understanding
“Harry walked to the cafe.”
Schema: walk; Trajector: Harry; Goal: cafe
[Diagram: Utterance → Analysis Process (drawing on Constructions, General Knowledge, Belief State) → Simulation Specification → Simulation]
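A rough sketch of this pipeline on the example sentence (the class and function names are illustrative, not the actual NTL implementation): the analysis step binds the roles of an action schema from the utterance, producing a simulation specification that a simulation step could then execute.

# Illustrative only: a toy "analysis" fills a simulation specification
# (schema plus role bindings) from the utterance; a simulator would then run it.
from dataclasses import dataclass

@dataclass
class SimulationSpec:
    schema: str      # which action schema to simulate, e.g. "walk"
    trajector: str   # the entity carrying out the action
    goal: str        # where the action is headed

def analyze(utterance: str) -> SimulationSpec:
    # Stand-in for matching constructions (form-meaning pairs) against the input.
    if " walked to " in utterance:
        subject, _, destination = utterance.rstrip(".").partition(" walked to ")
        return SimulationSpec(schema="walk",
                              trajector=subject,
                              goal=destination.replace("the ", ""))
    raise ValueError("no matching construction in this toy analyzer")

print(analyze("Harry walked to the cafe."))
# SimulationSpec(schema='walk', trajector='Harry', goal='cafe')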

4

5 Background: Primate Motor Control
Relevant requirements (Stromberg, Latash, Kandel, Arbib, Jeannerod, Rizzolatti):
– Should model the coordinated, distributed, parameterized control programs required for motor action and perception.
– Should be an active structure.
– Should be able to model concurrent actions and interrupts.
Model:
– The NTL project has developed a computational model (x-schemas) that satisfies these requirements.
– Details, papers, etc. can be obtained on the web at http://www.icsi.berkeley.edu/NTL

6 Active representations
Many inferences about actions derive from what we know about executing them. A representation based on stochastic Petri nets captures the dynamic, parameterized nature of actions.
Walking:
– bound to a specific walker, with a direction or goal
– consumes resources (e.g., energy)
– may have a termination condition (e.g., walker at goal)
– an ongoing, iterative action
[Diagram: walker = Harry, goal = home, energy resource, “walker at goal” condition]
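A minimal sketch of what such an active, parameterized representation looks like in code (the class, the toy world, and the numbers are my own simplifications, not the NTL x-schema machinery): the action is bound to a walker and a goal, consumes energy on each iteration, and terminates when the walker reaches the goal or runs out of resources.

# Toy "active" representation of walking in the spirit of an x-schema:
# parameterized, iterative, resource-consuming, with a termination condition.
class WalkSchema:
    def __init__(self, walker, goal, energy=10, step_cost=1):
        self.walker = walker          # bound agent, e.g. "Harry"
        self.goal = goal              # e.g. "home"
        self.energy = energy          # consumable resource
        self.step_cost = step_cost
        self.position = 0
        self.goal_distance = 5        # toy world: goal is 5 steps away

    def done(self):
        return self.position >= self.goal_distance   # walker at goal

    def step(self):
        # one iteration of the ongoing action: move and consume resources
        self.position += 1
        self.energy -= self.step_cost

    def run(self):
        while not self.done() and self.energy > 0:
            self.step()
        return "at goal" if self.done() else "out of energy"

print(WalkSchema(walker="Harry", goal="home").run())   # -> "at goal"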

7 Somatotopy of Action Observation
[fMRI activation panels: Foot Action, Hand Action, Mouth Action. Buccino et al., Eur J Neurosci, 2001]

8 Active Motion Model
[Figure: evolving responses of competing models over time. Nigel Goddard, 1989]

9 Language Development in Children
0-3 mo: prefers sounds in native language
3-6 mo: imitation of vowel sounds only
6-8 mo: babbling in consonant-vowel segments
8-10 mo: word comprehension, starts to lose sensitivity to consonants outside native language
12-13 mo: word production (naming)
16-20 mo: word combinations, relational words (verbs, adjectives)
24-36 mo: grammaticization, inflectional morphology
3 years – adulthood: vocabulary growth, sentence-level grammar for discourse purposes

10 Words learned by most 2-year-olds in a play school (Bloom 1993)
[Chart categories: food, toys, misc., people, sound, emotion, action, prepositions, demonstratives, social]

11 Learning Spatial Relation Words (Terry Regier)
A model of children learning spatial relation words. Assumes the child hears one word label per scene. The program learns well enough to label novel scenes correctly. Extended to simple motion scenarios, like INTO. The system works across languages. Mechanisms are neurally plausible.

12 Learning System
We’ll look at the details next lecture.
[Diagram: dynamic relations (e.g., “into”) → structured connectionist network (based on the visual system)]

13 Limitations
– Scale
– Uniqueness/Plausibility
– Grammar
– Abstract Concepts
– Inference
– Representation
– Biological Realism


15 Constrained Best Fit in Nature
inanimate:
– physics: lowest energy state
– chemistry: molecular minima
animate:
– biology: fitness, MEU (neuroeconomics)
– vision: threats, friends
– language: errors, NTL

16 Learning Verb Meanings (David Bailey)
A model of children learning their first verbs. Assumes the parent labels the child’s actions. The child knows the parameters of the action and associates them with the word. The program learns well enough to:
1) label novel actions correctly
2) obey commands using new words (in simulation)
The system works across languages. Mechanisms are neurally plausible.
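A heavily simplified sketch of this learning setup (the feature names, nearest-example matching, and data are illustrative assumptions; Bailey's actual model uses probabilistic feature structures and model merging, as on the later slides): store the motor parameters of labeled actions, then use them both to label a novel action and to produce parameters for a commanded verb.

# Toy verb learning from labeled actions: each labeled execution contributes a
# motor-parameter vector; labeling picks the nearest stored example, and obeying
# a command returns typical parameters for that verb.
from collections import defaultdict

# (force, speed, duration) observed while the parent labeled the action
examples = defaultdict(list)
examples["push"].append((0.8, 0.3, 1.0))
examples["push"].append((0.7, 0.4, 1.2))
examples["shove"].append((0.9, 0.9, 0.3))

def label_action(params):
    """Label a novel action with the verb whose stored examples are closest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(
        ((verb, min(dist(params, e) for e in exs)) for verb, exs in examples.items()),
        key=lambda pair: pair[1],
    )[0]

def obey(verb):
    """Produce motor parameters for a commanded verb (mean of stored examples)."""
    exs = examples[verb]
    return tuple(sum(vals) / len(exs) for vals in zip(*exs))

print(label_action((0.85, 0.8, 0.4)))   # -> "shove"
print(obey("push"))                     # -> roughly (0.75, 0.35, 1.1)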


19 Motor Control (X-schema) for SLIDE


21 Parameters for the SLIDE X-schema


23 Feature Structures for PUSH

24 System Overview

25 Learning Two Senses of PUSH
Model merging based on Bayesian MDL.
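A minimal sketch of the model-merging idea (representing each sense as a 1-D Gaussian over a single motor parameter, with a fixed per-sense penalty, is my simplification; Bailey's system merges probabilistic feature structures over many motor parameters): start with one tiny model per training example and greedily merge the pair of sense models that most lowers an MDL-style cost of data fit plus model complexity.

# Toy model merging with an MDL-style score: each sense is a Gaussian over one
# parameter (e.g. force). Merge greedily while total cost keeps decreasing.
import math
from itertools import combinations

PENALTY = 3.0    # cost of keeping an extra sense in the lexicon (illustrative)
MIN_VAR = 1e-2   # variance floor so single-example senses are scoreable

def nll(values):
    """Negative log-likelihood of values under their own Gaussian fit."""
    mean = sum(values) / len(values)
    var = max(sum((v - mean) ** 2 for v in values) / len(values), MIN_VAR)
    return sum(0.5 * (math.log(2 * math.pi * var) + (v - mean) ** 2 / var) for v in values)

def cost(senses):
    return sum(nll(s) for s in senses) + PENALTY * len(senses)

def merge_senses(examples):
    senses = [[v] for v in examples]          # start: one sense per example
    while len(senses) > 1:
        current = cost(senses)
        candidates = []
        for i, j in combinations(range(len(senses)), 2):
            merged = [s for k, s in enumerate(senses) if k not in (i, j)]
            merged.append(senses[i] + senses[j])
            candidates.append((cost(merged), merged))
        best_cost, best = min(candidates, key=lambda c: c[0])
        if best_cost >= current:              # no merge lowers the cost: stop
            break
        senses = best
    return senses

forces = [0.2, 0.25, 0.3, 0.8, 0.85, 0.9]
print(merge_senses(forces))   # two senses survive: a gentle cluster and a hard cluster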

26 Training Results (David Bailey)
English:
– 165 training examples, 18 verbs
– learns the optimal number of word senses (21)
– 32 test examples: 78% recognition, 81% action
– all mistakes were close (lift ~ yank, etc.)
– learned some particle constructions (CXNs), e.g., “pull up”
Farsi:
– with identical settings, learned senses not present in English


29 Learning Two Senses of PUSH
Model merging based on Bayesian MDL.

30 Constrained Best Fit in Nature
inanimate:
– physics: lowest energy state
– chemistry: molecular minima
animate:
– biology: fitness, MEU (neuroeconomics)
– vision: threats, friends
– language: errors, NTL

31 Model Merging and Recruitment
Word learning requires “fast mapping”. Recruitment learning is a connectionist-level model of this. Model merging is a practical computational-level method for fast mapping. Bailey’s thesis outlines the reduction, and some versions have been built. The full story requires Bayesian MDL, covered later.

32 The Idea of Recruitment Learning
Suppose we want to link up node X to node Y. The idea is to pick two nodes in the middle to link them up. Can we be sure that we can find a path from X to Y? The point is that with a fan-out of 1000, if we allow 2 intermediate layers, we can almost always find a path.
[Diagram: nodes X and Y in a random network; parameters B (fan-out), N (number of nodes), K, and F = B/N]
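A back-of-the-envelope sketch of why this works (the specific N values are illustrative, not from the slide): treating connections as random, X reaches about B^2 nodes through two intermediate layers, and each of those hits Y with probability F = B/N, so the chance that no path exists is roughly (1 - B/N)^(B^2).

# Rough estimate of the chance that a 3-step path X -> Y exists in a random
# network. N (total nodes) and B (fan-out) are illustrative values.
import math

def path_probability(N, B):
    # X reaches ~B nodes in one step and ~B*B in two steps (assuming B*B << N).
    second_layer = B * B
    # Each second-layer node hits Y with probability ~B/N, independently.
    p_miss_all = (1.0 - B / N) ** second_layer
    return 1.0 - p_miss_all

for N in (10**6, 10**8, 10**10):
    print(f"N={N:.0e}, B=1000: P(path X->Y) ~ {path_probability(N, 1000):.6f}")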

33 Recruiting triangle nodes
Let’s say we are trying to remember a green circle. Currently there are only weak connections between the concepts (dotted lines).
[Diagram: has-color → blue, green; has-shape → round, oval]

34 Strengthen these connections and you end up with this picture:
[Diagram: a recruited “Green circle” triangle node with strong connections to has-color = green and has-shape = round]
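A toy sketch of that recruitment step (pool size, weights, and concept names are illustrative assumptions, not from the slides): a candidate node that happens to have weak connections to both active concepts is found, and its connections are strengthened so that it comes to stand for the “green circle” binding.

# Toy recruitment of a "triangle" binding node: weak random connections are
# strengthened whenever a candidate is connected to both active concepts.
import random

random.seed(0)

CONCEPTS = ["has-color:green", "has-color:blue", "has-shape:round", "has-shape:oval"]
POOL_SIZE = 20          # pool of uncommitted candidate nodes
WEAK, STRONG = 0.1, 1.0 # initial weak weight, weight after recruitment

# Each candidate node starts with weak connections to a random subset of concepts.
weights = {
    node: {c: WEAK for c in random.sample(CONCEPTS, k=2)}
    for node in range(POOL_SIZE)
}

def recruit(active_concepts):
    """Pick a candidate weakly connected to all active concepts and strengthen it."""
    for node, conns in weights.items():
        if all(c in conns for c in active_concepts):
            for c in active_concepts:
                conns[c] = STRONG   # the dotted lines become solid
            return node
    return None  # no suitable candidate; with a large enough pool this is rare

binder = recruit(["has-color:green", "has-shape:round"])
print("recruited node:", binder, "connections:", weights.get(binder))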


