The Syntagmatic Paradigmatic Model of Sentence Processing Simon Dennis Key Centre for Human Factors and Applied Cognitive Psychology University of Queensland.

Presentation transcript:

The Syntagmatic Paradigmatic Model of Sentence Processing Simon Dennis Key Centre for Human Factors and Applied Cognitive Psychology University of Queensland Provisional patent: 9222AU1

Motivation
Problems with memory models: 1. Content 2. Control
Problems with connectionist language models: 1. Systematicity 2. Scaling

Inspiration
LEX (Kwantes & Mewhort), a model of single word reading
Based on Minerva II
Uses vectors with content from ~100,000 words
Lesson: complicated control can come from simple processes working on a lot of data

Minerva II
[Diagram: a probe is matched in parallel against Trace 1 through Trace 4 in memory.]
Hintzman (1984, 1986)
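Since Minerva II drives everything that follows, here is a minimal NumPy sketch of its retrieval cycle as Hintzman describes it: each trace is activated in proportion to the cube of its similarity to the probe, and the echo returned is the activation-weighted sum of all traces. The vectors and dimensions below are toy values, not anything from the model.

```python
# A minimal sketch of Minerva II retrieval (Hintzman, 1984, 1986).
import numpy as np

def minerva_echo(probe, traces):
    """Return echo intensity and echo content for a probe against memory.

    similarity S_i : dot product of probe and trace i, normalised by length
    activation A_i : S_i cubed, so near matches dominate retrieval
    echo content   : activation-weighted sum of all traces
    """
    sims = traces @ probe / probe.size   # S_i for each trace (all features nonzero here)
    acts = sims ** 3                     # nonlinear activation
    intensity = acts.sum()              # echo intensity
    content = acts @ traces             # echo content vector
    return intensity, content

# Four stored traces and a probe resembling trace 1 (toy +1/-1 features).
memory = np.array([[ 1, -1,  1,  1, -1],
                   [-1,  1, -1,  1,  1],
                   [ 1,  1, -1, -1,  1],
                   [-1, -1,  1, -1, -1]], dtype=float)
probe = np.array([1, -1, 1, 1, 1], dtype=float)
intensity, content = minerva_echo(probe, memory)
print(intensity, np.sign(content))  # content is dominated by trace 1
```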

The Syntagmatic Paradigmatic Shift
Syntagmatic associates hold between slots, e.g. THANK YOU
Paradigmatic associates hold within slots, e.g. The water was DEEP. / The water was SHALLOW.
There is a shift from syntagmatic to paradigmatic associates with development (Ervin, 1961) and training (McNeill, 1963, 1966)
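To make the two association types concrete, here is a toy illustration (not the SP model itself): syntagmatic pairs come from adjacent slots within a sentence, paradigmatic pairs from the same slot across two aligned sentences. The helper names are mine.

```python
# Toy extraction of the two association types from raw sentences.

def syntagmatic_pairs(sentence):
    """Between-slot associations: adjacent words within one sentence."""
    words = sentence.split()
    return list(zip(words, words[1:]))

def paradigmatic_pairs(sent_a, sent_b):
    """Within-slot associations: words filling the same slot across
    two sentences, assuming they are already aligned slot for slot."""
    a, b = sent_a.split(), sent_b.split()
    return [(x, y) for x, y in zip(a, b) if x != y]

print(syntagmatic_pairs("thank you"))              # [('thank', 'you')]
print(paradigmatic_pairs("the water was deep",
                         "the water was shallow")) # [('deep', 'shallow')]
```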

What is language acquisition?
Conjecture: the learning of syntagmatic and paradigmatic associations.
Syntactic traces: the syntagmatic associations in a sentence
Relational traces: the paradigmatic associations in a sentence
Lexical traces: the paradigmatic associations across sentences

Syntactic Traces
Absolute position is not important
Embedded structure is the norm
Mary is loved by John
This trace should be well matched by the probes:
The girl is loved by John
Mary who was sick is loved by John

Mary is loved by John
[Alignment matrix; rows and columns span the words: by, girl, is, John, loved, Mary, sick, the, was, who.]

The girl is loved by John
[Alignment matrix; rows and columns span the words: by, girl, is, John, loved, Mary, sick, the, was, who.]

Mary who was sick is loved by John
[Alignment matrix; rows and columns span the words: by, girl, is, John, loved, Mary, sick, the, was, who.]
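The matrices above depict trace-probe alignments. A simplified, position-independent stand-in for this kind of matching is longest-common-subsequence scoring, sketched below; the model's actual alignment machinery is richer, so treat LCS purely as an illustration of why absolute position and embedded material do not block a match.

```python
# LCS as a bare-bones stand-in for the alignment behind the matrices:
# it ignores absolute position and skips over embedded material.

def lcs_match(trace, probe):
    t, p = trace.split(), probe.split()
    m = [[0] * (len(p) + 1) for _ in range(len(t) + 1)]
    for i, tw in enumerate(t):
        for j, pw in enumerate(p):
            m[i + 1][j + 1] = (m[i][j] + 1 if tw == pw
                               else max(m[i][j + 1], m[i + 1][j]))
    return m[len(t)][len(p)] / len(t)   # proportion of the trace matched

trace = "mary is loved by john"
print(lcs_match(trace, "the girl is loved by john"))           # 0.8
print(lcs_match(trace, "mary who was sick is loved by john"))  # 1.0
```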

Syntactic Retrieval Example
Memory: Ellen is loved by Bert (0.25)
Jody is loved by George (0.25)
Alison is loved by Steve (0.25)
Sonia is loved by Brad (0.25)
Probe: Mary is loved by John
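Under the assumption that each trace aligns with the probe slot for slot (as in this example), syntactic retrieval amounts to collecting, for every probe word, the words that filled its slot in memory, weighted by trace strength. A hedged sketch:

```python
# What syntactic retrieval yields here: the probe words "mary" and
# "john" accumulate the words that filled those slots in memory.
from collections import defaultdict

memory = [("ellen is loved by bert", 0.25),
          ("jody is loved by george", 0.25),
          ("alison is loved by steve", 0.25),
          ("sonia is loved by brad", 0.25)]
probe = "mary is loved by john".split()

fillers = defaultdict(lambda: defaultdict(float))
for sentence, weight in memory:
    for probe_word, trace_word in zip(probe, sentence.split()):
        if probe_word != trace_word:          # a paradigmatic slot
            fillers[probe_word][trace_word] += weight

print(dict(fillers["mary"]))  # {'ellen': 0.25, 'jody': 0.25, ...}
print(dict(fillers["john"]))  # {'bert': 0.25, 'george': 0.25, ...}
```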

Syntactic Buffer

Resolving Constraints in Working Memory
The retrieved syntagmatic matrix forms the constraint for working memory resolution
Error function: [equation image not preserved in the transcript]
Buffer update equation: [equation image not preserved in the transcript]
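The error function and buffer update were shown as images and did not survive the transcript. As a stand-in only, here is a generic relaxation scheme of the kind the slide describes: the working memory buffer b is moved downhill on a squared-error function E = ||b - Wb||^2, where W is the retrieved constraint matrix. Both the form of E and the step size are my assumptions, not the model's published equations.

```python
# Generic constraint relaxation (an assumption, not the SP model's rule):
# nudge the buffer b downhill on E = ||b - Wb||^2 until it satisfies
# the retrieved syntagmatic constraint matrix W.
import numpy as np

def resolve(W, b, rate=0.1, steps=200):
    for _ in range(steps):
        residual = b - W @ b                 # how far b violates W
        grad = residual - W.T @ residual     # proportional to dE/db
        b = b - rate * grad                  # gradient-descent update
    return b

W = np.array([[0.0, 1.0], [1.0, 0.0]])      # toy constraint: slots must agree
b = np.array([0.9, 0.1])
print(resolve(W, b))                        # settles near [0.5, 0.5]
```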

Working Memory Buffer

Long Term Dependencies
Memory: The man was furious. The men were drunk. The man was grateful. The men were organized. The man who stole the briefcase was worried. The men who shot the sheriff were scared. The man who ran the race was tired. The men who swam the river were fast.
Probes: The man who knelt before the altar and gave thanks ___
The men who knelt before the altar and gave thanks ___
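Using the same LCS stand-in as before, the point of this example can be reproduced directly: the singular probe overlaps the singular relative-clause traces more than the plural ones, so the best-matching trace supplies "was" despite the long gap between "man" and the verb slot. The memory list below is a subset of the slide's sentences.

```python
# Trace matching bridges the long dependency: the probe shares more
# words with the singular relative-clause traces than the plural ones.

def lcs(t, p):
    m = [[0] * (len(p) + 1) for _ in range(len(t) + 1)]
    for i, tw in enumerate(t):
        for j, pw in enumerate(p):
            m[i + 1][j + 1] = (m[i][j] + 1 if tw == pw
                               else max(m[i][j + 1], m[i + 1][j]))
    return m[len(t)][len(p)]

memory = ["the man who stole the briefcase was worried",
          "the men who shot the sheriff were scared",
          "the man who ran the race was tired",
          "the men who swam the river were fast"]
probe = "the man who knelt before the altar and gave thanks".split()

best = max(memory, key=lambda s: lcs(s.split(), probe))
print(best)   # a singular "was ..." trace wins for this probe
```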

Sensitivity to Clause Structure
Memory: The man was furious. The men were drunk. The man was grateful. The men were organized. The man who knew the men was worried. The men who heard the man were scared. The man who saw the men was tired. The men who chastised the man were fast.
Probes: The man who berated the men ____
The men who berated the man ____

Garden Path Sentences
The horse raced past the barn fell
The horse that was raced past the barn fell
The waiter served calzone complained
The waiter is initially assumed to be doing the serving
The assumption is revised when “complained” is read

Garden Path Example
The actress served duck complained. The customer served meat complained.
The waiter served pizza. The waiter served wine.
The waitress served oysters. The waitress served dessert.
The actress saw a bud. The customer saw a product.
The actress saw a dove. The customer saw a hill.
The waiter knew Lara. The waiter knew Bill.
The waitress knew Joseph. The waitress knew Alison.
The actress felt a breeze. The customer felt a hit.
The actress felt sorry. The customer felt responsible.
Garden Path: waiter (20), served (14), calzone (162), complained (181)
Control: actress (29), served (62), calzone (179), complained (51)

Role Revision
Before the word “complained” appears, the waiter slot contains “waiter” and “waitress”.
After the word “complained” appears, “actress” and “customer” become more active as the model realizes that in this context the waiter is playing the role of a customer (i.e. someone who does the complaining).
Unlike models such as the Simple Recurrent Network (SRN) and the Visitation Set Grammar network (VSG), the model can modify the roles of items that appeared earlier in the sentence.

Relational Traces
Syntactically similar:
Mary is loved by John
Ellen is loved by Bert
Relationally similar:
Mary is loved by John
John loves Mary
Who does John love? Mary

Relational Bindings
Ellen is loved by Bert / Bert loves Ellen / Who does Bert love? Ellen
Jody is loved by George / George loves Jody / Who does George love? Jody
Alison is loved by Steve / Steve loves Alison / Who does Steve love? Alison
Sonia is loved by Brad / Brad loves Sonia / Who does Brad love? Sonia
Mary is loved by John => {{Ellen, Jody, Alison, Sonia} -> Mary, {Bert, George, Steve, Brad} -> John}
John loves Mary => {{Ellen, Jody, Alison, Sonia} -> Mary, {Bert, George, Steve, Brad} -> John}
Who does John love? Mary => {{Ellen, Jody, Alison, Sonia} -> Mary, {Bert, George, Steve, Brad} -> John}
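A toy version of how such bindings could be formed from aligned sentences: each word of the probe is bound to the set of words that filled its slot across the retrieved traces. The function name and the slot-for-slot alignment assumption are mine; the sentences follow the slide.

```python
# Toy relational-trace formation: bind each novel probe word to its
# paradigmatic filler set. The same filler sets recur across surface
# forms, which is why the passive, active, and question versions all
# yield the same bindings.

def relational_trace(probe, aligned_traces):
    """Map each probe word to the set of fillers for its slot."""
    bindings = {}
    for i, word in enumerate(probe.split()):
        fillers = {t.split()[i] for t in aligned_traces} - {word}
        if fillers:
            bindings[word] = fillers
    return bindings

passives = ["ellen is loved by bert", "jody is loved by george",
            "alison is loved by steve", "sonia is loved by brad"]
print(relational_trace("mary is loved by john", passives))
# {'mary': {'ellen', 'jody', 'alison', 'sonia'},
#  'john': {'bert', 'george', 'steve', 'brad'}}
```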

Forming Relational Traces
Relational trace: [equation image not preserved in the transcript]
Update function: [equation image not preserved in the transcript]
Error function: [equation image not preserved in the transcript]

Systematicity
Store relational trace for Mary is loved by John:
{{Ann, Josie, Ellen} -> Mary, {Bert, Steve, Dave} -> John}
Syntactic retrieval on Who does John love?:
{Bert, Steve, Dave} -> John
Relational retrieval:
{Ann, Josie, Ellen} => Mary
Working memory resolution:
Who does John love? Mary
Able to bind an arbitrary token to the distributed pattern for the lovee role => strong systematicity
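A minimal walk-through of this argument in code, with the sets taken from the slide: the relational trace binds each distributed filler pattern to a token, so when the question's answer slot retrieves the lovee pattern, relational retrieval recovers "Mary" even though "Mary" never occurred in a question before. The dictionary representation is an illustrative simplification.

```python
# The stored relational trace binds filler patterns to tokens
# (sets follow the slide; the dict form is a simplification).
stored_relational = {frozenset({"ann", "josie", "ellen"}): "mary",
                     frozenset({"bert", "steve", "dave"}): "john"}

# Syntactic retrieval on "Who does John love?" fills the answer slot
# with the lovee pattern (the words that answered such questions before).
answer_slot_pattern = frozenset({"ann", "josie", "ellen"})

# Relational retrieval resolves that pattern to the bound token.
print(stored_relational[answer_slot_pattern])   # -> mary
```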

Scaling
Tennis News Article (from ABC site)
Unseeded Ukrainian Andrei Medvedev has won through to the semi-finals of the French Open with a three set victory over Gustavo Kuerten. Earlier Andre Agassi produced some of the best tennis of his career to dispatch Uruguayan Marcelo Filippini. Also last night, Dominic Hrbaty ended the tournament hopes of Marcello Rios.
Questions
Who did Andrei Medvedev beat?
Who dispatched Marcelo Filippini?
What was the score in the match between Dominic Hrbaty and Marcello Rios?
Responses
Gustavo Kuerten, Andre Agassi and , respectively.
No gradient descent training.

Scaling to larger corpora
Sydney Morning Herald Corpus (1994)
Took all sentences of 15 tokens or less
~ sentences
1.2 million words
Vocabulary ~55,000 words
Syntactic retrieval currently takes ~8-10 seconds on an Alpha.

Conclusions and Further Work
Language learning may be the acquisition of syntagmatic and paradigmatic associations.
Memory-based models over large corpora have the potential to show how complex control can arise from simple processes.
The model predicts syntactic and relational priming (we are looking at filler gap, reduced relative and attachment structures).
Lexical traces may be an alternative to LSA and HAL that is sensitive to sentence structure.
Relational traces may provide a basis for keyword information retrieval, question answering systems and essay assessment.