Presentation on theme: "Nancy Chang UC Berkeley / International Computer Science Institute"— Presentation transcript:
1 Embodied Models of Language Learning and Use
Nancy Chang
UC Berkeley / International Computer Science Institute
2 Turing's take on the problem
"Of all the above fields the learning of languages would be the most impressive, since it is the most human of these activities. This field seems however to depend rather too much on sense organs and locomotion to be feasible."
Alan M. Turing, Intelligent Machinery (1948)
3 …it may be more feasible than Turing thought!
Five decades later…
Sense organs and locomotion:
Perceptual systems (especially vision)
Motor and premotor cortex
Mirror neurons: a possible representational substrate
Methodologies: fMRI, EEG, MEG
Language:
The Chomskyan revolution… and counter-revolution(s)
Progress on cognitively and developmentally plausible theories of language
Suggestive evidence for an embodied basis of language
(Maybe language depends enough on sense organs and locomotion to be feasible!)
4 From single words to complex utterances
1;11.3
FATHER: Nomi are you climbing up the books?
NAOMI: up.
NAOMI: climbing.
NAOMI: books.
2;0.18
MOTHER: what are you doing?
NAOMI: I climbing up.
MOTHER: you're climbing up?
4;9.3
FATHER: what's the boy doing to the dog?
NAOMI: squeezing his neck.
NAOMI: and the dog climbed up the tree.
NAOMI: now they're both safe.
NAOMI: but he can climb trees.
Sachs corpus (CHILDES)
5 How do they make the leap?
0-9 months: smiles; responds differently to intonation; responds to name and "no"
9-18 months: first words; recognizes intentions; responds, requests, calls, greets, protests
18-24 months: early word combinations
agent-object: Daddy cookie, Girl ball
agent-action: Daddy eat, Mommy throw
action-object: Eat cookie, Throw hat
entity-attribute; entity-locative: Doggie bed
6 Theory of Language Structure
Theory of Language Acquisition
Theory of Language Use
7 The logical problem of language acquisition
Gold's Theorem (identification in the limit): no superfinite class of languages is identifiable from positive data only.
Natural languages are not finite sets.
Children receive (mostly) positive data.
But children acquire language abilities quickly and reliably.
One (not so) logical conclusion: THEREFORE there must be strong innate biases restricting the search space (Universal Grammar + parameter setting).
But kids aren't born as blank slates, and they do not learn language in a vacuum!
8 Model is generalizing! Just like real babies!
Note: the class of probabilistic context-free languages is learnable in the limit!
That is, from hearing a finite number of sentences, Baby can correctly converge on a grammar that predicts an infinite number of sentences.
9 Theory of Language Structure = autonomous syntax
Theory of Language Acquisition
Theory of Language Use
10 What is knowledge of language?
Basic sound patterns (Phonology)
How to make words (Morphology)
How to put words together (Syntax)
What words (etc.) mean (Semantics)
How to do things with words (Pragmatics)
Rules of conversation (Pragmatics)
11 Many mysteries
What is the nature of linguistic representation?
What are the learning biases? The input data? Prior knowledge?
How does language acquisition interact with other linguistic and cognitive processes?
12 Grammar learning is driven by meaningful language use in context.
All aspects of the problem should reflect this assumption:
Target of learning: a construction (form-meaning pair)
Prior knowledge: rich conceptual structure, pragmatic inference
Training data: pairs of utterances and situational context
Performance measure: success in communication (comprehension)
13 Theory of Language Structure = constructions (form-meaning pairs), not autonomous syntax
Theory of Language Acquisition
Theory of Language Use
14 Theory of Language Structure = constructions (form-meaning pairs)
Theory of Language Acquisition
Theory of Language Use
15 Theory of Language Structure
Theory of Language Acquisition
Theory of Language Use
17 Incremental development
throw:
throw 1;8.0 / throw off 1;8.0 / I throwded 1;10.28 / I throw it. 1;11.3 / throwing in. 1;11.3 / throw it. 1;11.3 / throw frisbee. 1;11.3 / can I throw it? 2;0.2 / I throwed Georgie. 2;0.2 / you throw that? 2;0.5 / gonna throw that? 2;0.18 / throw it in the garbage. 2;1.17 / throw in there. 2;1.17 / throw it in that. 2;5.0 / throwed it in the diaper pail. 2;11.12
fall:
fell down. 1;6.16 / fall down. 1;8.0 / I fall down. 1;10.17 / fell out. 1;10.18 / I fell it. 1;10.28 / fell in basket. 1;10.28 / fall down boom. 1;11.11 / almost fall down. 1;11.11 / toast fall down. 1;11.20 / did Daddy fall down? 1;11.20 / Kangaroo fall down 1;11.21 / Georgie fell off 2;0.4 / you fall down. 2;0.5 / Georgie fall under there? 2;0.5 / He fall down 2;0.18 / Nomi fell down? 2;0.18 / I falled down. 2;3.0
18 Children in the one-word stage know a lot!
Images, people, actions, objects, locations, embodied knowledge, statistical correlations… i.e., experience.
Or: the opulence of the substrate.
19 Correlating forms and meanings
Lexical constructions pair FORM (sound) with MEANING (stuff):
"you" ↔ you (Human)
"throw" ↔ Throw (thrower, throwee)
"ball" ↔ ball (Object)
"block" ↔ block (Object)
20 Phonology: non-native contrasts
Werker and Tees (1984)
Thompson: velar vs. uvular, /`ki/-/`qi/
Hindi: retroflex vs. dental, /t.a/-/ta/
The Thompson language is an Interior Salish (Native Indian) language spoken in south central British Columbia.
21 Finding words: statistical learning
Saffran, Aslin and Newport (1996)
Words: /bidaku/, /padoti/, /golabu/
Stream: /bidakupadotigolabubidaku/ … 2 minutes of this continuous speech stream
By 8 months infants detect the words (vs. non-words and part-words), e.g. "pretty baby".
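The statistical cue at work here can be sketched concretely. Below is a minimal illustration (not code from the study) of syllable-to-syllable transitional probabilities over a Saffran-style stream: within a word the next syllable is fully predictable, while across word boundaries it is not.

```python
# Sketch: transitional probability TP(y|x) = count(xy) / count(x)
# over a continuous syllable stream built from three nonce words.
import random
from collections import Counter

def transitional_probs(syllables):
    """Map each adjacent syllable pair (x, y) to P(next=y | current=x)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(x, y): c / first_counts[x] for (x, y), c in pair_counts.items()}

words = [["bi", "da", "ku"], ["pa", "do", "ti"], ["go", "la", "bu"]]
random.seed(0)
# Concatenate random orderings of the three words, with no pauses between them.
stream = [syl for _ in range(100) for w in random.sample(words, 3) for syl in w]

tp = transitional_probs(stream)
print(tp[("bi", "da")])        # 1.0: within-word transition, fully predictable
print(tp.get(("ku", "pa"), 0.0))  # below 1.0: transition across a word boundary
```

The word boundaries are exactly where transitional probability drops, which is the regularity the 8-month-olds appear to exploit.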
22 The opulence of the substrate
Prelinguistic children already have rich sensorimotor representations and sophisticated social knowledge:
intention inference, reference resolution
language-specific event conceptualizations
(Bloom 2000, Tomasello 1995, Bowerman & Choi, Slobin, et al.)
Children are sensitive to statistical information:
phonological transitional probabilities
the most frequent items in adult input are learned earliest
(Saffran et al. 1998, Tomasello 2000)
23 Words learned by most 2-year-olds in a play school (Bloom 1993)
Categories: food, toys, misc, people, sound, emotion, action, prepositions, demonstratives, social
25 Word order: agent and patient
Hirsh-Pasek and Golinkoff (1996)
Children aged 1;4-1;7, mostly still in the one-word stage
"Where is CM tickling BB?"
26 Basic scenes and the Verb Island Hypothesis
Basic scenes: simple clause constructions are associated directly with scenes basic to human experience (Goldberg 1995, Slobin 1985).
Verb Island Hypothesis: children learn their earliest constructions (arguments, syntactic marking) on a verb-specific basis (Tomasello 1992).
Young children's early verbs and relational terms are individual islands of organization in an otherwise unorganized grammatical system.
Later: a SELF-MOTION construction, a more general kind of progressive construction, etc.
throw frisbee, throw ball → throw OBJECT; get ball, get bottle → get OBJECT
27 Children generalize from experience
push (force=high), push (force=low), … push (force=?)
Specific cases are learned before general ones:
throw frisbee, throw ball, … → throw OBJECT
drop ball, drop bottle, … → drop OBJECT
Earliest constructions are lexically specific (item-based). (Verb Island Hypothesis, Tomasello 1992)
throw OBJECT, push OBJECT, … → ACTION OBJECT
28 Development of throw
Naomi (contextually grounded):
1;10.28 I throwded it. (= I fell) / I throwded. (= I fell)
1;11.3 I throw it. / I throw it ice. (= I throw the ice) / throwing in. / throwing.
Parental utterances (more complex):
1;2.9 don't throw the bear.
1;10.11 don't throw them on the ground.
1;11.3 Nomi don't throw the books down. / what do you throw it into? / what did you throw it into?
1;11.9 they're throwing this in here. / throwing the thing.
These are (nearly) all of Naomi's uses of throw during the period shown. Her phrases grow in complexity and she infers a lot from context, but the parents' utterances stay about the same over time (and there is much more to the input than just this). Different verb usages develop independently. Looking ahead: the hypothesis is that children are learning CONSTRUCTIONS that help them understand and produce language.
29 Development of throw (cont'd)
2;0.3 don't throw it Nomi. / Nomi stop throwing. / well you really shouldn't throw things Nomi you know. / remember how we told you you shouldn't throw things. / can I throw it? / I throwed Georgie. / could I throw that?
2;0.5 throw it? / you throw that?
2;0.18 gonna throw that?
2;1.17 throw it in the garbage. / throw in there.
2;5.0 throw it in that.
2;11.12 I throwed it in the diaper pail.
30 Session 4 outline
Language acquisition: the problem
Child language acquisition
Usage-based construction learning model
Recapitulation: embodied cognitive models
31 How do children make the transition from single words to complex combinations?
Multi-unit expressions with relational structure:
Concrete word combinations: fall down, eat cookie, Mommy sock
Item-specific constructions (limited-scope formulae): X throw Y, the X, X's Y
Argument structure constructions (syntax)
Grammatical markers: tense-aspect, agreement, case
Early word combinations and other "relational" constructions are structurally complex constructions.
32 Language learning is structure learning
"You're throwing the ball!"
Intonation, stress; phonemes, syllables; morphological structure; word segmentation and order; syntactic structure
Sensorimotor structure; event structure; pragmatic structure (attention, intention, perspective); statistical regularities
Language is rife with structure: even simple utterances involve structure at a variety of interdependent levels, from intonational to phonological to syllabic structure, etc.
Of course, there is a lot more structure in the environment than that; much of the first year of life is devoted to mastering patterns of experience, linguistic and otherwise.
Event structure: causal, temporal, force-dynamic. (Some crosslinguistic variation in timing, and perhaps packaging.)
33 Making sense: structure begets structure!
Structure is cumulative:
object recognition → scene understanding
word segmentation → word learning
Learners exploit existing structure to make sense of their environment: achieve goals, infer intentions.
Language learners exploit existing structure to make sense of their environment: achieve communicative goals, infer communicative intentions.
This is true of all structure (sound, meaning, social, etc.): a march toward complexity. Children are active learners, employing function and understanding.
34 Exploiting existing structure
"You're throwing the ball!"
This claim is increasingly uncontroversial for word learning. Word learning is also, famously, a mapping problem (word-to-world): a mapping across the domains of form and meaning. There is widespread consensus that children are sensitive to a lot of rich structure they can use to identify potential mappings (Bloom, etc.).
36 What we say to kids… and what they hear
What we say:
what do you throw it into?
they're throwing this in here.
do you throw the frisbee?
they're throwing a ball.
don't throw it Nomi.
well you really shouldn't throw things Nomi you know.
remember how we told you you shouldn't throw things.
What they hear:
blah blah YOU THROW blah?
blah THROW blah blah HERE.
blah YOU THROW blah blah?
blah THROW blah blah BALL.
DON'T THROW blah NOMI.
blah YOU blah blah THROW blah NOMI blah blah.
blah blah blah blah YOU shouldn't THROW blah.
But children also have rich situational context and cues they can use to fill in the gaps.
37 Understanding drives learning
Utterance + Situation → Understanding (via conceptual and linguistic knowledge) → (Partial) Interpretation → Learning
Basic idea: exploit all known structure, linguistic and otherwise, to communicate, make sense of new data, and build complex structures from known ones.
38 Potential inputs to learning
Genetic language-specific biases
Domain-general structures and processes
Embodied representations, grounded in action, perception, conceptualization, and other aspects of physical, mental and social experience (Talmy 1988, 2000; Glenberg and Robertson 1999; MacWhinney 2005; Barsalou 1999; Choi and Bowerman 1991; Slobin 1985, 1997)
Social routines: intention inference, reference resolution
Statistical information: transition probabilities, frequency effects
Usage-based approaches to language learning (Tomasello 2003, Clark 2003, Bybee 1985, Slobin 1985, Goldberg 2005)
…the opulence of the substrate!
39 Methodology: computational modeling
Grammar learning is driven by meaningful language use in context.
Meaningful, structured representations:
Target representation: construction-based grammar
Input data: utterance + context pairs, conceptual/linguistic knowledge
Construction analyzer (comprehension)
Usage-based learning framework:
Optimization toward the "simplest" grammar given the data
Goal: improved comprehension
Take meaning seriously: structural constraints on word combinations are meaningful; all meaningful input is available; functional constraints of communication apply.
Learning model: usage-driven optimization (MDL), with learning operations (structural mapping) and evaluation criteria (simplicity).
40 Models of language learning
Several previous models of word learning are grounded (form + meaning):
Regier 1996: <bitmaps, word> → spatial relations
Roy and Pentland 1998: <image, sound> → object shapes/attributes
Bailey 1997: <feature structure, word> → actions
Siskind 2000: <video, sound> → actions
Oates et al. 1999: <sensors, word class> → actions
Not so for grammar learning:
Stolcke 1994: probabilistic attribute grammars from sentences
Siskind 1996: verb argument structure from predicates
Thompson 1998: syntax-semantics mapping from database queries
Word learning is like category learning, but with a cross-domain character, addressed by many methods (model merging, clustering, any generalization). Notice that it is UNARY: roles are handled implicitly, if at all.
(Oates et al. 1999, "Using Syntax to Learn Semantics", learns senses using bigram clustering.)
41 Representation: constructions
The basic linguistic unit is a <form, meaning> pair.
(Kay and Fillmore 1999, Lakoff 1987, Langacker 1987, Goldberg 1995, Croft 2001, Goldberg and Jackendoff 2004)
Examples: ball, toward, Big Bird, throw-it
42 Relational constructions: throw ball
construction THROW-BALL
  constituents
    t : THROW
    o : BALL
  form
    tf before of
  meaning
    tm.throwee ↔ om
Embodied Construction Grammar (Bergen & Chang, 2005)
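To make the formalism concrete, here is a hypothetical Python rendering of the THROW-BALL construction above; the class name and field layout are illustrative assumptions, not the actual ECG implementation.

```python
# Sketch: a relational construction as constituents + form (ordering)
# constraints + meaning (role-binding) constraints, mirroring the slide.
from dataclasses import dataclass, field

@dataclass
class Construction:
    name: str
    constituents: dict                           # label -> constituent type
    form: list = field(default_factory=list)     # e.g. ("t", "before", "o")
    meaning: list = field(default_factory=list)  # e.g. ("t.throwee", "o")

throw_ball = Construction(
    name="THROW-BALL",
    constituents={"t": "THROW", "o": "BALL"},
    form=[("t", "before", "o")],       # tf before of
    meaning=[("t.throwee", "o")],      # tm.throwee <-> om
)
print(throw_ball.name, throw_ball.meaning)
```

The point of the representation is that both the word-order constraint and the role binding are first-class data, which is what the learning operations later manipulate.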
48 Construction learning model: search
The model allows incorporation of all kinds of available information. It defines a CLASS of learning problems: it allows all or none of the sources of constraint above, though we will focus on the latter set.
49 Proposing new constructions
Relational mapping (context-dependent)
Reorganization (context-independent):
Merging (generalization)
Splitting (decomposition)
Joining (composition)
50 Initial single-word stage
Lexical constructions pair FORM (sound) with MEANING (stuff):
"you" ↔ schema Addressee (subcase of Human)
"throw" ↔ schema Throw (roles: thrower, throwee)
"ball" ↔ schema Ball (subcase of Object)
"block" ↔ schema Block (subcase of Object)
51 New data: "You throw the ball"
FORM: you … throw … the … ball (with "before" ordering relations)
MEANING: schema Addressee (subcase of Human); schema Throw (roles: thrower, throwee); schema Ball (subcase of Object); schema Block (subcase of Object)
SITUATION: Self and Addressee; a Throw event with role-filler bindings: thrower = Addressee, throwee = Ball
52 New construction hypothesized
construction THROW-BALL
  constructional constituents
    t : THROW
    b : BALL
  form
    tf before bf
  meaning
    tm.throwee ↔ bm
54Context-driven relational mapping: form and meaning correlation
55 Meaning relations: pseudo-isomorphism
strictly isomorphic: Bm fills a role of Am
shared role-filler: Am and Bm have a role filled by the same entity X
sibling role-fillers: Am and Bm fill roles of the same schema Y
56 Relational mapping strategies: strictly isomorphic
Bm is a role-filler of Am (or vice versa): Am.r1 ↔ Bm
Example: throw ball → throw.throwee = ball
57 Relational mapping strategies: shared role-filler
Am and Bm each have a role filled by the same entity: Am.r1 ↔ Bm.r2
Example: put ball down → put.mover = ball, down.tr = ball
58 Relational mapping strategies: sibling role-fillers
Am and Bm fill roles of the same schema: Y.r1 ↔ Am, Y.r2 ↔ Bm
Example: Nomi ball → possession.possessor = Nomi, possession.possessed = ball
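The three pseudo-isomorphism tests can be sketched in code. This is a schematic illustration over an assumed role-to-filler dictionary representation, not the model's actual implementation.

```python
# Sketch: the three meaning-relation tests used by relational mapping.
# A meaning is a dict with an "id" and a "roles" mapping (role -> filler id).
def strictly_isomorphic(am, bm_id):
    """Bm fills a role of Am."""
    return any(filler == bm_id for filler in am["roles"].values())

def shared_role_filler(am, bm):
    """Am and Bm each have a role filled by the same entity X."""
    return bool(set(am["roles"].values()) & set(bm["roles"].values()))

def sibling_role_fillers(schema, am_id, bm_id):
    """Am and Bm fill roles of the same schema Y."""
    fillers = set(schema["roles"].values())
    return am_id in fillers and bm_id in fillers

# "throw ball": ball fills throw.throwee directly.
throw = {"id": "throw1", "roles": {"thrower": "nomi", "throwee": "ball1"}}
print(strictly_isomorphic(throw, "ball1"))  # True

# "put ball down": put.mover and down.tr are both the ball.
put = {"id": "put1", "roles": {"mover": "ball1"}}
down = {"id": "down1", "roles": {"tr": "ball1"}}
print(shared_role_filler(put, down))  # True

# "Nomi ball": both fill roles of a possession schema.
poss = {"id": "poss1", "roles": {"possessor": "nomi", "possessed": "ball1"}}
print(sibling_role_fillers(poss, "nomi", "ball1"))  # True
```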
59 Overview of learning processes
Relational mapping: throw the ball → THROW < BALL
Merging: throw the block, throwing the ball → THROW < OBJECT
Joining: ball off, you throw the ball off → THROW < BALL < OFF
60 Merging similar constructions
Merge constructions involving correlated relational mappings over one or more pairs of similar constituents.
throw the ball:
construction THROW-BALL
  constituents: t : THROW, o : BALL
  form: tf before of
  meaning: tm.throwee ↔ om
throw the block:
construction THROW-BLOCK
  constituents: t : THROW, o : BLOCK
  form: tf before of
  meaning: tm.throwee ↔ om
Result:
construction THROW-OBJECT
  constituents: t : THROW, o : OBJECT
  form: tf before of
  meaning: tm.throwee ↔ om
with THROW-BALL (o : BALL) and THROW-BLOCK (o : BLOCK) as subcases of THROW-OBJECT.
("Merge" because the default algorithm is model merging, which has been used in several frameworks, most relevantly for verbs, and also for the one-word, featurized domain.)
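A toy sketch of the merge step: two item-based constructions whose constituents share a common supertype collapse into one generalization. The ontology fragment and dict representation are assumptions for illustration, not the model's algorithm.

```python
# Sketch: merge two constructions that differ only in constituent types
# sharing a common supertype; form and meaning constraints must match.
SUPERTYPE = {"BALL": "OBJECT", "BLOCK": "OBJECT"}  # hypothetical ontology fragment

def merge(cxn_a, cxn_b):
    assert cxn_a["form"] == cxn_b["form"] and cxn_a["meaning"] == cxn_b["meaning"]
    merged = {}
    for label, type_a in cxn_a["constituents"].items():
        type_b = cxn_b["constituents"][label]
        if type_a == type_b:
            merged[label] = type_a
        elif type_a in SUPERTYPE and SUPERTYPE[type_a] == SUPERTYPE.get(type_b):
            merged[label] = SUPERTYPE[type_a]  # generalize to common supertype
        else:
            raise ValueError("no common supertype for %s / %s" % (type_a, type_b))
    return {"name": "MERGED", "constituents": merged,
            "form": cxn_a["form"], "meaning": cxn_a["meaning"]}

throw_ball = {"name": "THROW-BALL", "constituents": {"t": "THROW", "o": "BALL"},
              "form": [("t", "before", "o")], "meaning": [("t.throwee", "o")]}
throw_block = {"name": "THROW-BLOCK", "constituents": {"t": "THROW", "o": "BLOCK"},
               "form": [("t", "before", "o")], "meaning": [("t.throwee", "o")]}
print(merge(throw_ball, throw_block)["constituents"])  # {'t': 'THROW', 'o': 'OBJECT'}
```

The originals are retained as subcases in the model; only the shared generalization is computed here.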
61 More complex generalization operations could drive the formation of new constructional categories.
Top: a common parent Toy is found for Ball and Block.
Bottom: a new "ToyX" category is formed with Ball-Cn and Block-Cn as its only subcases.
62 Overview of learning processes
Relational mapping: throw the ball → THROW < BALL
Merging: throw the block, throwing the ball → THROW < OBJECT
Joining: ball off, you throw the ball off → THROW < BALL < OFF
63 Joining co-occurring constructions
Compose frequently co-occurring constructions with compatible constraints (e.g., common arguments).
throw the ball:
construction THROW-BALL
  constituents: t : THROW, o : BALL
  form: tf before of
  meaning: tm.throwee ↔ om
ball off:
construction BALL-OFF
  constituents: b : BALL, o : OFF
  form: bf before of
  meaning: evokes Motion as m; m.mover ↔ bm; m.path ↔ om
Together: THROW.throwee = Ball; Motion m with m.mover = Ball, m.path = Off; throw before ball, ball before off → THROW-BALL-OFF construction
64 Joined construction
construction THROW-BALL-OFF
  constructional constituents
    t : THROW
    b : BALL
    o : OFF
  form
    tf before bf
    bf before of
  meaning
    evokes MOTION as m
    tm.throwee ↔ bm
    m.mover ↔ bm
    m.path ↔ om
65 Construction learning model: evaluation
Learning operations are guided by a minimum description length (MDL) heuristic (Rissanen 1978).
66 Learning: usage-based optimization
Grammar learning = search for (sets of) constructions; incremental improvement toward the best grammar given the data.
Search strategy: usage-driven learning operations
Evaluation criteria: simplicity-based, information-theoretic
Minimum description length: the most compact encoding of the grammar and data; a trade-off between storage and processing
Domain-general learning principles applied to linguistic structures and processes
67 Minimum description length
(Rissanen 1978, Goldsmith 2001, Stolcke 1994, Wolff 1982)
Seek the most compact encoding of the data in terms of:
a compact representation of the model (i.e., the grammar)
a compact representation of the data (i.e., the utterances)
Approximates Bayesian learning (Bailey 1997, Stolcke 1994).
Exploit the tradeoff between preferences for:
smaller grammars: fewer constructions, fewer constituents/constraints, shorter slot chains (more local concepts); pressure to compress/generalize
simpler analyses of data: more likely constructions, shallower analyses; pressure to retain specific constructions
68 MDL: details
Choose the grammar G minimizing length(G|D):
length(G|D) = m · length(G) + n · length(D|G)
Bayesian approximation: description length ≈ negative log probability, so minimizing length(G|D) corresponds to maximizing the posterior P(G|D).
Length of grammar, length(G), corresponds to the prior P(G):
favor fewer/smaller constructions and roles
favor shorter slot chains (more familiar concepts)
Length of data given grammar, length(D|G), corresponds to the likelihood P(D|G):
favor simpler analyses using more frequent constructions
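The MDL criterion can be illustrated with a toy scorer. The size measures and weights below are assumptions for illustration, not the model's actual encoding; the point is only the trade-off between grammar size and data code length.

```python
# Sketch: score(G, D) = m * length(G) + n * length(D|G); lower is better.
import math

def grammar_length(grammar):
    """Crude size: one unit per construction plus one per constituent/constraint."""
    return sum(1 + len(c["constituents"]) + len(c["constraints"]) for c in grammar)

def data_length(analyses, counts, total):
    """-log2 probability of each utterance's analysis under construction frequencies."""
    return sum(-math.log2(counts[cxn] / total) for cxn in analyses)

def mdl_score(grammar, analyses, counts, m=1.0, n=1.0):
    total = sum(counts.values())
    return m * grammar_length(grammar) + n * data_length(analyses, counts, total)

# Two item-based constructions vs. one general one covering the same six uses:
specific = [{"constituents": ["t", "o"], "constraints": ["order", "throwee"]},
            {"constituents": ["t", "o"], "constraints": ["order", "throwee"]}]
general = [{"constituents": ["t", "o"], "constraints": ["order", "throwee"]}]
counts_s = {"THROW-BALL": 3, "THROW-BLOCK": 3}
counts_g = {"THROW-OBJ": 6}
data_s = ["THROW-BALL"] * 3 + ["THROW-BLOCK"] * 3
data_g = ["THROW-OBJ"] * 6
print(mdl_score(specific, data_s, counts_s))  # 16.0
print(mdl_score(general, data_g, counts_g))   # 5.0: the smaller grammar wins here
```

With enough shared usage the merged grammar comes out shorter, which is exactly the pressure to generalize described on the previous slide; rare, idiosyncratic items keep their specific constructions.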
69 Flashback to verb learning: learning two senses of PUSH
Model merging based on Bayesian MDL
70 Experiment: learning verb islands
Question: can the proposed construction learning model acquire English item-based motion constructions? (Tomasello 1992)
Given: an initial lexicon and ontology
Data: child-directed language annotated with contextual information
Example input:
Form: text: throw the ball; intonation: falling
Scene: Throw; thrower: Naomi; throwee: Ball; participants: Mother, Naomi, Ball
Discourse: speaker: Mother; addressee: Naomi; speech act: imperative; activity: play; joint attention: Ball
71 Experiment: learning verb islands
A subset of the CHILDES database of parent-child interactions (MacWhinney 1991; Slobin), coded by developmental psychologists for:
form: particles, deictics, pronouns, locative phrases, etc.
meaning: temporality, person, pragmatic function, type of motion (self-movement vs. caused movement; animate being vs. inanimate object; etc.)
Crosslinguistic (English, French, Italian, Spanish)
English motion utterances: 829 parent, 690 child
English all utterances: 3160 adult, 5408 child
Age span: 1;2 to 2;6
The psychologists have chosen meaningful variables; this may be sufficient, but in addition we may translate into simulation primitives, e.g. self-movement of an animate being (walk): force.energy-source = spg.trajector
72 Annotated CHILDES data
765 annotated parent utterances (originally annotated by psychologists), covering the following scenes:
CausedMotion: "Put Goldie through the chimney"
SelfMotion: "did you go to the doctor today?"
JointMotion: "bring the other pieces Nomi"
Transfer: "give me the toy"
SerialAction: "come see the doggie"
73 An annotation (bindings)
Utterance: Put Goldie through the chimney
SceneType: CausedMotion
Causer: addressee
Action: put
Direction: through
Mover: Goldie (toy)
Landmark: chimney
74 Learning throw-constructions
Input utterance sequence → learned constructions:
1. Don't throw the bear. → throw-bear
2. you throw it → you-throw
3. throwing the thing. → throw-thing
4. Don't throw them on the ground. → throw-them
5. throwing the frisbee. → throw-frisbee; MERGE → throw-OBJ
6. Do you throw the frisbee? → COMPOSE → you-throw-frisbee
7. She's throwing the frisbee. → COMPOSE → she-throw-frisbee
76 Early talk about throwing
Transcript data, Naomi 1;11.9 (Sachs corpus, CHILDES):
Par: they're throwing this in here.
Par: throwing the thing.
Child: throwing in.
Child: throwing.
Par: throwing the frisbee. …
Par: do you throw the frisbee? do you throw it?
Child: throw it.
Child: I throw it. …
Child: throw frisbee.
Par: she's throwing the frisbee.
Child: throwing ball.
Sample input prior to 1;11.9: don't throw the bear. / don't throw them on the ground. / Nomi don't throw the books down. / what do you throw it into?
Sample tokens prior to 1;11.9: throw / throw off / I throw it. / I throw it ice. (= I throw the ice)
Over the whole corpus, Naomi's phrases grow in complexity, but the parents' stay about the same over time (and there is much more to the input than just this). She still infers a lot from context, and different verb usages develop independently.
77 A quantitative measure: coverage
Goal: incrementally improving comprehension.
At each stage in testing, use the current grammar to analyze the test set.
Coverage = % of role bindings analyzed.
Example:
Grammar: throw-ball, throw-block, you-throw
Test sentence: throw the ball.
Bindings: scene=Throw, thrower=Nomi, throwee=ball
Parsed bindings: scene=Throw, throwee=ball
Score for this sentence: 2/3 = 66.7%
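The coverage measure is simple enough to state in a few lines. This sketch assumes a role-to-filler dictionary representation of the annotated and parsed bindings.

```python
# Sketch: coverage = fraction of gold role bindings recovered by the analysis.
def coverage(gold_bindings, parsed_bindings):
    """Fraction of gold role -> filler bindings also produced by the parse."""
    hits = sum(1 for role, filler in gold_bindings.items()
               if parsed_bindings.get(role) == filler)
    return hits / len(gold_bindings)

# The slide's example: "throw the ball." with the thrower left unanalyzed.
gold = {"scene": "Throw", "thrower": "Nomi", "throwee": "ball"}
parsed = {"scene": "Throw", "throwee": "ball"}
print(round(100 * coverage(gold, parsed), 1))  # 66.7
```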
79 Principles of interaction
Early in learning: no conflict.
Conceptual knowledge dominates; more lexically specific constructions carry no cost:
throw: throw off, throwing in, you throw it
want: want cookie, want cereal, I want it
Later in learning: pressure to categorize.
More constructions mean more potential for confusion during analysis; a mixture of lexically specific and more general constructions emerges:
throw: throw OBJ, throw DIR, throw it DIR, ACTOR throw OBJ
want: want OBJ, I want OBJ, ACTOR want OBJ
80 Verb island constructions learned
Basic processes produce constructions similar to those in child production data.
The system can generalize beyond encountered data, given pressure to merge constructions.
Differences in verb learning lend support to the verb island hypothesis.
Future directions:
full English corpus: non-motion scenes, argument structure constructions
crosslinguistic data: Russian (case marking), Mandarin Chinese (omission, directional particles, aspect markers)
morphological constructions
contextual constructions; multi-utterance discourse (Mok)
81 Summary
The model satisfies convergent constraints from diverse disciplines:
crosslinguistic developmental evidence
cognitive and constructional approaches to grammar
precise grammatical representations and a data-driven learning framework for understanding and acquisition
The model addresses special challenges of language learning:
it exploits structural parallels in form/meaning to learn relational mappings
learning is usage-based and error-driven (based on partial comprehension)
minimal specifically linguistic biases are assumed
learning exploits the child's rich experiential advantage
The earliest, item-based constructions are learnable from utterance-context pairs.
82 Key model components
Embodied representations: experientially motivated representations incorporating meaning and context
Construction formalism: multiword constructions = relational form-meaning correspondences
Usage 1: learning tightly integrated with comprehension; new constructions bridge the gap between linguistically analyzed meaning and contextually available meaning
Usage 2: statistical learning framework; incremental, specific-to-general learning; minimum description length heuristic for choosing the best grammar
83 Embodied Construction Grammar
Theory of Language Structure
Theory of Language Acquisition: usage-based optimization
Theory of Language Use: Simulation Semantics
84 Usage-based learning: comprehension and production
Components: constructicon, world knowledge, discourse & situational context, simulation, analysis.
Comprehension: analyze & resolve the utterance; hypothesize constructions & reorganize; reinforcement (usage, correction).
Production: generate an utterance/response from a communicative intent; reinforcement (usage, correction).
86 A Best-Fit Approach for Productive Analysis of Omitted Arguments
Eva Mok & John Bryant
University of California, Berkeley
International Computer Science Institute
87 Simplifying grammar by exploiting the language understanding process
Omission of arguments in Mandarin Chinese
Construction grammar framework
Model of language understanding
Our best-fit approach
88 Productive argument omission (in Mandarin)
1. ma1+ma gei3 ni3 zhei4+ge (mother give 2PS this+CLS) "Mother (I) give you this (a toy)."
2. ni3 gei3 yi2 (2PS give auntie) "You give auntie [the peach]."
3. ao ni3 gei3 ya (EMP 2PS give) "Oh (go on)! You give [auntie] [that]."
4. gei3 (give) "[I] give [you] [some peach]."
CHILDES Beijing Corpus (Tardif, 1993; Tardif, 1996)
89 Arguments are omitted with different probabilities
All arguments omitted: 30.6%
No arguments omitted: 6.1%
90 Construction grammar approach
(Kay & Fillmore 1999; Goldberg 1995)
Grammaticality: form and function
Basic unit of analysis: the construction, i.e. a pairing of form and meaning constraints
Not purely lexically compositional
Implies early use of semantics in processing
Embodied Construction Grammar (ECG) (Bergen & Chang, 2005)
91 Proliferation of constructions
Subj Verb Obj1 Obj2 ↓ Giver Transfer Recipient Theme
Verb Obj1 Obj2 ↓ Transfer Recipient Theme
Subj Verb Obj2 ↓ Giver Transfer Theme
Subj Verb Obj1 ↓ Giver Transfer Recipient
…
92 If the analysis process is smart, then...
Subj Verb Obj1 Obj2 ↓ Giver Transfer Recipient Theme
The grammar needs to state only one construction.
Omission of constituents is flexibly allowed.
The analysis process figures out what was omitted.
94 Competition-based analyzer finds the best analysis
An analysis is made up of:
a constructional tree
a set of resolutions
a semantic specification
The best fit has the highest combined score.
95 Combined score that determines best fit
Syntactic fit: constituency relations, combined with preferences on non-local elements, conditioned on syntactic context
Antecedent fit: ability to find referents in the context, conditioned on syntactic information and feature agreement
Semantic fit: semantic bindings for frame roles; the frame roles' fillers are scored
96 Analyzing ni3 gei3 yi2 (You give auntie)
Two of the competing analyses:
(a) ni3 gei3 yi2 [omitted] ↓ Giver Transfer Recipient Theme
(b) ni3 gei3 [omitted] yi2 ↓ Giver Transfer Recipient Theme
Syntactic fit:
P(Theme omitted | ditransitive cxn) = 0.65
P(Recipient omitted | ditransitive cxn) = 0.42
(a): (1-0.78) * (1-0.42) * 0.65 ≈ 0.08
(b): (1-0.78) * (1-0.65) * 0.42 ≈ 0.03
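The syntactic-fit arithmetic generalizes to any subset of expressed constituents. A small sketch (the representation is assumed; the omission probabilities are the ones given on the slides, with 0.78 read as the subject's omission probability, consistent with the 1-0.78 factor above):

```python
# Sketch: per-constituent syntactic fit for the ditransitive construction.
# For each slot, multiply P(omitted) if absent, else 1 - P(omitted).
P_OMITTED = {"Subj": 0.78, "Obj1": 0.42, "Obj2": 0.65}  # Obj1=Recipient, Obj2=Theme

def syntactic_fit(present):
    """present: set of constituents overtly expressed in the utterance."""
    score = 1.0
    for slot, p_omit in P_OMITTED.items():
        score *= (1 - p_omit) if slot in present else p_omit
    return score

# "ni3 gei3 yi2": analysis (a) treats yi2 as the Recipient (Theme omitted),
# analysis (b) treats yi2 as the Theme (Recipient omitted).
a = syntactic_fit({"Subj", "Obj1"})
b = syntactic_fit({"Subj", "Obj2"})
print(round(a, 2), round(b, 2))  # 0.08 0.03
```

Analysis (a) wins on syntactic fit alone; the antecedent and semantic fit scores on the following slides then sharpen the choice further.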
97 Frame and lexical information restrict the type of reference
Transfer Frame roles: Giver, Recipient, Theme, Manner, Means, Place, Purpose, Reason, Time
Lexical unit gei3: Giver (DNI), Recipient (DNI), Theme (DNI)
98 Can the omitted argument be recovered from context?
Antecedent fit for the two analyses of ni3 gei3 yi2:
(a) ni3 gei3 yi2 [omitted] ↓ Giver Transfer Recipient Theme
(b) ni3 gei3 [omitted] yi2 ↓ Giver Transfer Recipient Theme
Discourse & situational context: child, mother, peach, auntie, table?
99 How good a theme is a peach? How about an aunt?
Semantic fit for the competing analyses of ni3 gei3 yi2:
The Transfer frame:
Giver (usually animate)
Recipient (usually animate)
Theme (usually inanimate)
100 Each construction is annotated with probabilities of omission
The argument omission patterns shown earlier can be covered with just ONE construction:
Subj Verb Obj1 Obj2 ↓ Giver Transfer Recipient Theme
P(omitted | cxn): 0.78 (Subj), 0.42 (Recipient), 0.65 (Theme)
A language-specific default probability can be set.
101 Leverage process to simplify representation
The processing model is complementary to the theory of grammar.
By using a competition-based analysis process, we can:
find the best-fit analysis with respect to constituency structure, context, and semantics
eliminate the need to enumerate allowable patterns of argument omission in the grammar
This is currently being applied in models of language understanding and grammar learning.
102 Simulation hypothesis
We understand by mentally simulating.
Simulation exploits some of the same neural structures activated during performance, perception, imagining, memory…
Linguistic structure parameterizes the simulation: language gives us enough information to simulate.
103 Language understanding as simulative inference
"Harry walked to the cafe."
Utterance → Analysis Process (using linguistic knowledge) → Simulation Specification → Simulation (drawing on general knowledge and belief state)
Schema: walk; Trajector: Harry; Goal: cafe
104 Usage-based learning: comprehension and production
Components: constructicon, world knowledge, discourse & situational context, simulation, analysis.
Comprehension: analyze & resolve the utterance; hypothesize constructions & reorganize; reinforcement (usage, correction).
Production: generate an utterance/response from a communicative intent; reinforcement (usage, correction).
106 Theory of Language Structure
Theory of Language Acquisition
Theory of Language Use
107 Motivating assumptions
Structure and process are linked: embodied language use constrains structure!
Language and the rest of cognition are linked: all evidence is fair game.
We need computational formalisms that capture embodiment:
embodied meaning representations
embodied grammatical theory
108 Embodiment and simulation: basic NTL hypotheses
Embodiment Hypothesis:
Basic concepts and words derive their meaning from embodied experience.
Abstract and theoretical concepts derive their meaning from metaphorical maps to more basic embodied concepts.
Structured connectionist models provide a suitable formalism for capturing these processes.
Simulation Hypothesis:
Language exploits many of the same structures used for action, perception, imagination, memory and other neurally grounded processes.
Linguistic structures set parameters for simulations that draw on these embodied structures.
109 The ICSI/Berkeley Neural Theory of Language Project
The general tactic is to view complex cognitive phenomena as having more than one level of analysis (here, five levels): the cognitive/linguistic level that cognitive scientists study, a computational level with relatively standard CS representations, structured neural networks, and so on.
Importantly, this reduction or abstraction is constrained: structures or representations used at one level should have equivalent translations or implementations at the more concrete or biologically inspired levels (e.g. SHRUTI binding via temporal synchrony).
111 Complex phenomena within reach
Radial categories / prototype effects (Rosch 1973, 1978; Lakoff 1985): mother: birth / adoptive / surrogate / genetic, …
Profiling (Langacker 1989, 1991; cf. Fillmore XX): hypotenuse, buy/sell (Commercial Event frame)
Metaphor and metonymy (Lakoff & Johnson 1980): ANGER IS HEAT, MORE IS UP; The ham sandwich wants his check. / All hands on deck.
Mental spaces (Fauconnier 1994): The girl with blue eyes in the painting really has green eyes.
Conceptual blending (Fauconnier & Turner 2002, inter alia): workaholic, information highway, fake guns
112 Embodiment in language
Perceptual and motor systems play a central role in language production and comprehension.
Theoretical proposals:
Linguistics: Lakoff, Langacker, Talmy
Neuroscience: Damasio, Edelman
Cognitive psychology: Barsalou, Gibbs, Glenberg, MacWhinney
Computer science: Steels, Brooks, Siskind, Feldman, Roy