NTL – Converging Constraints


NTL – Converging Constraints

- Basic concepts and words derive their meaning from embodied experience.
- Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts.
- Structured connectionist models can capture both of these processes nicely.
- Grammar extends this with constructions: pairings of form with embodied meaning.

Simulation-based language understanding

"Harry walked to the cafe."

  Schema: walk   Trajector: Harry   Goal: cafe

(Diagram: the utterance feeds an analysis process that draws on constructions, general knowledge, and the current belief state to produce a simulation specification, which then drives the simulation.)
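The analysis step above can be sketched as a toy program. Everything here is illustrative: `SimSpec`, `CONSTRUCTIONS`, and `analyze` are hypothetical names, not the actual NTL implementation, and the role-binding rules are deliberately naive.

```python
from dataclasses import dataclass

@dataclass
class SimSpec:
    """A simulation specification: an x-schema plus its role bindings."""
    schema: str      # which x-schema to simulate
    trajector: str   # entity performing the action
    goal: str        # destination of the action

# A toy constructicon: form patterns paired with the schema they evoke.
CONSTRUCTIONS = {
    "walked to": "walk",
}

def analyze(utterance: str) -> SimSpec:
    """Toy analysis: find a known construction and bind its roles.
    Assumes the trajector immediately precedes the verb and the goal
    follows a determiner (as in 'walked to the cafe')."""
    words = utterance.rstrip(".").split()
    for pattern, schema in CONSTRUCTIONS.items():
        p = pattern.split()
        for i in range(1, len(words) - len(p) + 1):
            if words[i:i + len(p)] == p:
                return SimSpec(schema=schema,
                               trajector=words[i - 1],        # word before the verb
                               goal=words[i + len(p) + 1])    # word after "to the"
    raise ValueError("no construction matched")

spec = analyze("Harry walked to the cafe.")
# spec == SimSpec(schema="walk", trajector="Harry", goal="cafe")
```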

Background: Primate Motor Control

Relevant requirements (Stromberg, Latash, Kandel, Arbib, Jeannerod, Rizzolatti):
- Should model the coordinated, distributed, parameterized control programs required for motor action and perception.
- Should be an active structure.
- Should be able to model concurrent actions and interrupts.

Model:
- The NTL project has developed a computational model (x-schemas) that satisfies these requirements.
- Details, papers, etc. can be obtained on the web.

Active representations

- Many inferences about actions derive from what we know about executing them.
- A representation based on stochastic Petri nets captures the dynamic, parameterized nature of actions.

Walking:
- bound to a specific walker, with a direction or goal (walker = Harry, goal = home)
- consumes resources (e.g., energy)
- may have a termination condition (e.g., walker at goal)
- ongoing, iterative action
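The properties listed above can be sketched as a minimal x-schema-like class, assuming a simplified deterministic execution (the actual model uses stochastic Petri nets with concurrent transitions). The parameter names (walker, goal, energy) follow the slide; the class itself is illustrative.

```python
class WalkSchema:
    """Toy active representation of WALK: an ongoing, iterative action
    bound to a walker, consuming energy, terminating at the goal."""

    def __init__(self, walker, goal, energy=5):
        self.walker = walker
        self.goal = goal          # distance to cover, as a toy stand-in
        self.energy = energy      # consumable resource
        self.position = 0
        self.state = "ready"

    def step(self):
        """One iteration of the ongoing action: move and consume energy."""
        if self.state == "done":
            return
        self.state = "ongoing"
        self.position += 1
        self.energy -= 1
        # termination condition: walker at goal (or resource exhausted)
        if self.position >= self.goal or self.energy <= 0:
            self.state = "done"

walk = WalkSchema(walker="Harry", goal=3)
while walk.state != "done":
    walk.step()
# walk.position == 3, walk.state == "done", energy partially consumed
```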

Somatotopy of Action Observation

(Figure panels: Foot Action, Hand Action, Mouth Action.) Buccino et al., Eur J Neurosci, 2001.

Active Motion Model

Evolving responses of competing models over time. Nigel Goddard, 1989.

Language Development in Children

- 0–3 mo: prefers sounds in native language
- 3–6 mo: imitation of vowel sounds only
- 6–8 mo: babbling in consonant–vowel segments
- 8–10 mo: word comprehension; starts to lose sensitivity to consonants outside native language
- 12–18 mo: word production (naming)
- 18–24 mo: word combinations, relational words (verbs, adjectives)
- 24–36 mo: grammaticization, inflectional morphology
- 3 years–adulthood: vocabulary growth, sentence-level grammar for discourse purposes

Words learned by most 2-year-olds in a play school (Bloom 1993). (Chart categories: food, toys, misc., people, sound, emotion, action, prepositions, demonstratives, social.)

Learning Spatial Relation Words (Terry Regier)

- A model of children learning spatial relations.
- Assumes the child hears one word label per scene.
- The program learns well enough to label novel scenes correctly.
- Extended to simple motion scenarios, like INTO.
- The system works across languages.
- The mechanisms are neurally plausible.

Learning System

A structured connectionist network (based on the visual system) that learns dynamic relations (e.g., into). We'll look at the details next lecture.

Limitations

- Scale
- Uniqueness/Plausibility
- Grammar
- Abstract Concepts
- Inference
- Representation
- Biological Realism

Constrained Best Fit in Nature

inanimate:
- physics: lowest energy state
- chemistry: molecular minima

animate:
- biology: fitness
- neuroeconomics: MEU
- vision: threats, friends
- language: errors, NTL

Learning Verb Meanings (David Bailey)

- A model of children learning their first verbs.
- Assumes the parent labels the child's actions.
- The child knows the parameters of the action and associates them with the word.
- The program learns well enough to: 1) label novel actions correctly; 2) obey commands using new words (in simulation).
- The system works across languages.
- The mechanisms are neurally plausible.

Motor Control (X-schema) for SLIDE

Parameters for the SLIDE X-schema

Feature Structures for PUSH

System Overview

Learning Two Senses of PUSH Model merging based on Bayesian MDL
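The merging criterion can be sketched as follows. This is a hedged toy stand-in for Bailey's actual algorithm: two candidate word senses are merged only if the merged model's description length (a fixed model cost per sense plus the cost of encoding the data) does not increase. The cost constants and feature representation are invented for illustration.

```python
import math
from collections import Counter

def data_cost(examples, counts):
    """Bits to encode the examples under a sense's feature distribution."""
    total = sum(counts.values())
    return -sum(math.log2(counts[e] / total) for e in examples)

def mdl(senses):
    """Total description length: fixed cost per sense + data cost."""
    MODEL_COST_PER_SENSE = 2.0   # toy prior penalizing extra senses
    cost = 0.0
    for examples in senses:
        counts = Counter(examples)
        cost += MODEL_COST_PER_SENSE + data_cost(examples, counts)
    return cost

def try_merge(senses, i, j):
    """Merge senses i and j if that does not worsen the MDL score."""
    merged = senses[i] + senses[j]
    candidate = [s for k, s in enumerate(senses) if k not in (i, j)]
    candidate.append(merged)
    return candidate if mdl(candidate) <= mdl(senses) else senses

# Two candidate senses with identical features merge into one...
senses = [["shove", "shove"], ["shove", "shove"]]
senses = try_merge(senses, 0, 1)

# ...while clearly different senses stay apart.
senses2 = [["tap", "tap"], ["shove", "shove"]]
senses2 = try_merge(senses2, 0, 1)
```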

Training Results (David Bailey)

English:
- 165 training examples, 18 verbs
- Learns the optimal number of word senses (21)
- 32 test examples: 78% recognition, 81% action
- All mistakes were close (lift ~ yank, etc.)
- Learned some particle constructions (CXNs), e.g., pull up

Farsi:
- With identical settings, learned senses not present in English

Model Merging and Recruitment

- Word learning requires "fast mapping".
- Recruitment learning is a connectionist-level model of this.
- Model merging is a practical computational-level method for fast mapping.
- Bailey's thesis outlines the reduction, and some versions have been built.
- The full story requires Bayesian MDL, covered later.

The Idea of Recruitment Learning

- Suppose we want to link up node X to node Y.
- The idea is to pick nodes in the middle to link them up.
- Can we be sure that we can find a path from X to Y?
- The point is, with a fan-out of 1000, if we allow 2 intermediate layers, we can almost always find a path.

(Diagram: X, Y; B, N, K; F = B/N.)
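The fan-out claim can be checked with a back-of-the-envelope calculation. With fan-out B, two intermediate layers reach roughly B² nodes, so the chance that a specific target Y is missed is about (1 − 1/N)^(B²). The network size N used here is illustrative, and the estimate ignores collisions among reached nodes.

```python
B = 1000            # fan-out per node
N = 1_000_000       # total nodes in the network (illustrative)
reached = B ** 2    # nodes reachable through 2 intermediate layers

p_miss = (1 - 1 / N) ** reached   # probability a specific Y is never hit
p_path = 1 - p_miss
# p_path is roughly 0.63 even at N = 10**6; for smaller N (or more layers)
# a path from X to Y is almost certain, which is the slide's point.
```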

Recruiting triangle nodes

Let's say we are trying to remember a green circle; currently there are only weak connections between the concepts (dotted lines).

(Diagram: has-color → blue, green; has-shape → round, oval.)

Strengthen these connections and you end up with this picture: a node now binds "Green circle" to has-color green and has-shape round.

(Diagram: has-color → blue, green; has-shape → round, oval; Green circle.)
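The strengthening step can be sketched as a simple Hebbian-style update, a toy reading of recruitment: weak links from the active feature nodes ("green", "round") to a free candidate node are strengthened on coincident activation until the node is recruited for the bound concept. The weight values and learning rate are invented for illustration.

```python
# Weak initial connections from feature nodes to a free candidate node.
weights = {("green", "circle-node"): 0.1,
           ("round", "circle-node"): 0.1}

def coactivate(active, rate=0.3):
    """Strengthen links whose source node fired (simple Hebbian update)."""
    for (src, dst) in weights:
        if src in active:
            weights[(src, dst)] = min(1.0, weights[(src, dst)] + rate)

# Repeated co-occurrence of green + round drives the weights up.
for _ in range(3):
    coactivate({"green", "round"})

recruited = all(w >= 0.9 for w in weights.values())
# recruited is True: the node now stands for "green circle"
```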