James L. McClelland Stanford University

Presentation transcript:

The Emergentist Approach To Language As Embodied in Connectionist Networks
James L. McClelland, Stanford University

Some Simple Sentences
The man liked the book. The boy loved the sun. The woman hated the rain.

The Standard Approach: Units and Rules
Units: Sentences; Clauses and phrases; Words; Morphemes; Phonemes
Rules:
S -> NP VP
VP -> Tense V (NP)
Tense V -> V+{past}
V -> like, love, hate…
N -> man, boy, sun…
man -> /m/ /ae/ /n/
{past} -> ed
ed -> /t/ or /d/ or /^d/ (depending on context)
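To make the flavor of this rule system concrete, here is a minimal sketch in Python of rewrite rules like those above applied top-down. The NP -> Det N expansion, the vocabulary lists, and the direct rewriting of V+{past} to inflected word forms are assumptions added for illustration, not part of the slide.

```python
import random

# Toy rewrite rules in the spirit of the slide. The NP rule and the
# vocabulary below are illustrative assumptions, not claims about English.
rules = {
    "S":      [["NP", "VP"]],
    "NP":     [["Det", "N"]],            # assumed; the slide does not show an NP rule
    "VP":     [["TenseV", "NP"]],
    "TenseV": [["V+past"]],
    "Det":    [["the"]],
    "N":      [["man"], ["boy"], ["sun"], ["book"], ["rain"]],
    "V+past": [["liked"], ["loved"], ["hated"]],   # V -> like, love, hate plus {past} -> ed
}

def expand(symbol):
    """Rewrite a symbol until only terminal words remain."""
    if symbol not in rules:
        return [symbol]                   # terminal: an actual word
    rhs = random.choice(rules[symbol])    # pick one right-hand side
    return [word for part in rhs for word in expand(part)]

print(" ".join(expand("S")))   # e.g. "the man liked the rain"
```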

What happens with exceptions?

Standard Approach to the Past Tense
We form the past tense by using a (simple) rule.
If an item is an exception, the rule is blocked, so we say ‘took’ instead of ‘taked’.
If you’ve never seen an item before, you use the rule.
If an item is an exception, but you forget the exceptional past tense, you apply the rule.
Predictions:
Regular inflection of ‘nonce forms’: This man is blinging. Yesterday he … This girl is tupping. Yesterday she …
Over-regularization errors: goed, taked, bringed
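A minimal sketch of this rule-plus-blocking account, assuming a small hand-written exception list; the lookup table and the simplified ‘-ed’ spelling are illustrative, not a claim about how the lexicon is actually stored.

```python
# Hypothetical exception list; in the symbolic account these are stored lexical entries.
EXCEPTIONS = {"take": "took", "go": "went", "bring": "brought", "hit": "hit"}

def past_tense(verb, exception_memory=EXCEPTIONS):
    """Apply the regular rule unless a stored exception blocks it."""
    if verb in exception_memory:
        return exception_memory[verb]          # the exception blocks the rule
    return verb + "ed"                         # otherwise the rule applies (spelling simplified)

print(past_tense("bling"))                     # nonce form -> "blinged" (regular inflection)
print(past_tense("take"))                      # exception found -> "took"
print(past_tense("take", exception_memory={})) # exception forgotten -> "taked" (over-regularization)
```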

The Emergentist Approach
Language (like perception, etc.) arises from the interactions of neurons, each of which operates according to a common set of simple principles of processing, representation, and learning. Units and rules are useful to approximately describe what emerges from these interactions but have no mechanistic or explanatory role in language processing, language change, or language learning.

An Emergentist Theory: Natural Selection
No grand design.
Organisms produce offspring with random differences (mating helps with this).
Forces of nature favor those best suited to survive.
Survivors leave more offspring, so their traits are passed on.
The full range of the animal kingdom, including all the capabilities of the human mind, emerges from these very basic principles.

An Emergentist/Connectionist Approach to the Past Tense
Knowledge is in connections.
Experience causes connections to change.
Sensitivity to regularities emerges.

The RM Model
Learns from verb [root, past tense] pairs: [like, liked]; [love, loved]; [carry, carried]; [take, took].
Present and past forms are represented as patterns of activation over units that stand for phonological features.

Examples of ‘wickelfeatures’ in the verb ‘baked’
Starts with a nasal followed by a vowel
Has a long vowel preceded by a nasal and followed by a stop
Ends with a dental stop preceded by a velar stop
Ends with an unvoiced sound preceded by another unvoiced sound
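As a rough sketch of the idea behind this encoding: each phoneme is coded together with its immediate neighbours (a context-sensitive triple, or ‘Wickelphone’); the RM model then recodes these triples as triples of phonological features, the wickelfeatures above. The simplified transcription of ‘baked’ below is an assumption for illustration.

```python
def wickelphones(phonemes):
    """Context-sensitive triples ('Wickelphones'): each phoneme together with
    its left and right neighbours, with '#' marking the word boundary."""
    padded = ["#"] + list(phonemes) + ["#"]
    return {tuple(padded[i - 1:i + 2]) for i in range(1, len(padded) - 1)}

# 'baked' in a simplified transcription /b e k t/ (an assumption for illustration)
print(wickelphones(["b", "e", "k", "t"]))
# e.g. {('#','b','e'), ('b','e','k'), ('e','k','t'), ('k','t','#')} (set order varies)
```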

A Pattern Associator Network (diagram)
A pattern representing the sound of the verb root feeds through a matrix of connections to a pattern representing the sound of the verb’s past tense; each output unit’s summed input determines its probability of being active, p(a=1).

Learning rule for the Pattern Associator network
For each output unit:
Determine the activity of the unit based on its input.
If the unit is active when the target is not: reduce each weight coming into the unit from each active input unit.
If the unit is inactive when the target is active: increase each weight coming into the unit from each active input unit.
Each connection weight adjustment is very small.
Learning is gradual and cumulative.
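A minimal sketch of a pattern associator trained with an error-correcting rule of this kind, assuming random binary feature vectors in place of the model’s wickelfeature patterns; the network size, learning rate, and hard threshold (standing in for the probabilistic p(a=1) activation) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 20, 20                     # arbitrary sizes for illustration
W = np.zeros((n_out, n_in))              # the matrix of connections
lr = 0.05                                # each adjustment is very small

def output(root):
    """Each output unit sums its input from the root pattern; a hard
    threshold stands in for the probabilistic p(a=1) activation."""
    return (W @ root > 0).astype(float)

def train_pair(root, past):
    """Error-correcting update: for each output unit, lower weights from
    active inputs if it is wrongly on, raise them if it is wrongly off."""
    out = output(root)
    for j in range(n_out):
        if out[j] == 1 and past[j] == 0:
            W[j] -= lr * root            # active when the target is not
        elif out[j] == 0 and past[j] == 1:
            W[j] += lr * root            # inactive when the target is active

# Toy training set: random binary stand-ins for [root, past tense] feature patterns.
pairs = [(rng.integers(0, 2, n_in).astype(float),
          rng.integers(0, 2, n_out).astype(float)) for _ in range(5)]

for _ in range(200):                     # learning is gradual and cumulative
    for root, past in pairs:
        train_pair(root, past)

correct = sum(np.array_equal(output(r), p) for r, p in pairs)
print(f"{correct} of {len(pairs)} training pairs reproduced correctly")
```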

Some Learning Experiments
Learn a single item.
Test for generalization.
Learn from a set of regular items.

Over-regularization errors in the RM network (figure)
The network was first trained with the top ten most frequent past tenses in English only (felt, had, made, got, gave, took, came, went, looked, needed); the figure marks the point in training at which 400 more words were introduced.

Over-regularization simulation in the 78 net
First learn one exception.
Then continue training the exception with all other forms.
What happens to the exception?

Questions?

Some features of the model
Regulars co-exist with exceptions.
The model produces the regular past for most unfamiliar test items.
The model captures the different subtypes among the regulars: like-liked, love-loved, hate-hated.
The model is sensitive to the no-change pattern among irregulars: hit-hit, cut-cut, hide-hid.

Additional characteristics
The model exploits gangs of related exceptions: dig-dug, cling-clung, swing-swung.
The ‘regular pattern’ infuses exceptions as well as regulars: say-said, do-did, have-had, keep-kept, sleep-slept, burn-burnt, teach-taught.

Key Features of the Past Tense Model
No lexical entries and no rules.
No problem of rule induction or grammar selection.

Strengths and Weaknesses of the Models

Elman’s Simple Recurrent Network
The task is to predict the next element of a sequence on the output units, given the current element on the input units. Each element is represented by a pattern of activation. In the network diagram, each box represents a set of units, each dotted arrow represents all-to-all connections, and the solid arrow indicates that the previous pattern on the hidden units is copied back to provide context for the next prediction. Learning occurs through connection weight adjustment using an extended version of the error-correcting learning rule.
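A minimal sketch of a simple recurrent network of this kind in NumPy, trained to predict the next letter of a repeating toy sequence (consonants that predict how many vowels follow, in the spirit of Elman’s letter-prediction task). The sequence, layer sizes, and learning rate are assumptions, and for simplicity the error is only propagated back one step rather than with the full extended learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)

symbols = ["b", "a", "d", "i", "g", "u"]       # toy alphabet (assumed)
seq = list("badiiguuu") * 50                   # repeating letter sequence to predict
idx = {s: i for i, s in enumerate(symbols)}

n_sym, n_hid = len(symbols), 10
W_in = rng.normal(0, 0.5, (n_hid, n_sym))      # input -> hidden
W_ctx = rng.normal(0, 0.5, (n_hid, n_hid))     # context (copy of previous hidden) -> hidden
W_out = rng.normal(0, 0.5, (n_sym, n_hid))     # hidden -> output
lr = 0.1

def one_hot(s):
    v = np.zeros(n_sym)
    v[idx[s]] = 1.0
    return v

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

context = np.zeros(n_hid)                      # previous hidden pattern, copied back
for t in range(len(seq) - 1):
    x, target = one_hot(seq[t]), one_hot(seq[t + 1])
    h = np.tanh(W_in @ x + W_ctx @ context)    # hidden state from current input plus context
    y = softmax(W_out @ h)                     # predicted next element
    # Error-correcting weight updates (cross-entropy gradient, truncated to one step).
    dy = y - target
    dh = (W_out.T @ dy) * (1 - h ** 2)
    W_out -= lr * np.outer(dy, h)
    W_in  -= lr * np.outer(dh, x)
    W_ctx -= lr * np.outer(dh, context)
    context = h                                # copy the hidden pattern back as next context

# After training, the letter following 'd' (an 'i' in this toy sequence)
# should receive most of the predicted probability.
h = np.tanh(W_in @ one_hot("d") + W_ctx @ context)
print(dict(zip(symbols, softmax(W_out @ h).round(2))))
```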

Results for Elman net trained with letter sequences

Hidden Unit Patterns for Elman Net Trained on Word Sequences

Key Features of Both Models
No lexical entries and no rules.
No problem of rule induction or grammar selection.