
The Neural Basis of Thought and Language Final Review Session

Administrivia
Final in class Thursday, May 8th
– Be there on time!
– Format: closed books, closed notes; short answers, no blue books
A9 due tonight
Final paper due on Monday, May 12

Resources
– Textbook!
– Class slides
– Section slides
– Joe Makin's class notes from 2006 (on the notes page)

The Second Half
[Diagram: levels of abstraction, from Biology up through Computational Neurobiology, Structured Connectionism, and Computation to Cognition and Language. The midterm covered the lower half (Motor Control, the Bailey Model, Reinforcement Learning, Bayes Nets); the final covers the upper half (Metaphor/KARMA, Grammar/ECG, ECG Learning).]

Overview
Bailey Model
– feature structures
– Bayesian model merging
Bayes Nets
KARMA
– X-schemas, frames
– aspect
– event-structure metaphor
– inference
FrameNet
– frames
– image schemas
Reinforcement Learning
ECG
– SemSpecs
– parsing
– constructions
– learning algorithm

Bailey's VerbLearn Model
Three levels of representation:
1. cognitive: words, concepts
2. computational: f-structs, x-schemas
3. connectionist: structured models, learning rules
Input: labeled hand motions (f-structs)
Learning:
1. the correct number of senses for each verb,
2. the relevant features in each sense, and
3. the probability distributions on each included feature
Execution: perform a hand motion based on a label
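A minimal sketch in Python of the scoring idea behind this: a verb sense as probability distributions over motor features, scored against an observed f-struct. The feature names and numbers are invented for illustration; this is not Bailey's actual implementation.

import math

# Hypothetical verb sense: independent distributions over
# motor-control features (values invented for illustration).
sense_push = {
    "elbow_motion": {"extend": 0.9, "flex": 0.1},
    "force": {"high": 0.7, "low": 0.3},
}

def sense_log_likelihood(sense, fstruct):
    # log P(observed f-struct | sense), assuming independent features
    return sum(math.log(dist[fstruct[feat]]) for feat, dist in sense.items())

observed = {"elbow_motion": "extend", "force": "high"}
print(sense_log_likelihood(sense_push, observed))  # about -0.462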

Bayes Nets
Probability: the product rule and Bayes' rule
– P(x, y) = P(x) P(y|x) = P(y) P(x|y)
– P(x|y) = P(x) P(y|x) / P(y)
Write a factored distribution: P(x, y, z, ...) = P(x) P(y|x) ...
Infer distributions over variables given evidence
– variable elimination (by summation: P(x) = Σ_y P(x, y))
Temporal Bayes' nets
– replicate the variables across time slices
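A small runnable sketch of these identities with two binary variables (the numbers are made up):

P_x = {True: 0.3, False: 0.7}
P_y_given_x = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}

def joint(x, y):
    # product rule: P(x, y) = P(x) P(y|x)
    return P_x[x] * P_y_given_x[x][y]

# eliminate y by summation: P(x) = sum_y P(x, y)
P_x_marginal = {x: sum(joint(x, y) for y in (True, False)) for x in (True, False)}

# Bayes' rule: P(x|y) = P(x) P(y|x) / P(y)
P_y = {y: sum(joint(x, y) for x in (True, False)) for y in (True, False)}
print(P_x_marginal[True], joint(True, True) / P_y[True])  # 0.3, ~0.659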

Event Structure Metaphor
– States are Locations
– Changes are Movements
– Causes are Forces
– Causation is Forced Movement
– Actions are Self-propelled Movements
– Purposes are Destinations
– Means are Paths
– Difficulties are Impediments to Motion
– External Events are Large, Moving Objects
– Long-term, Purposeful Activities are Journeys

Ego Moving versus Time Moving

Results
PRIME            Meeting is Friday   Meeting is Monday
Object Moving    30.8%               69.2%
Ego Moving       73.3%               26.7%

KARMA
– DBN to represent target-domain knowledge
– Metaphor maps link the target and source domains
– X-schemas to represent source-domain knowledge

X-Schemas
– Active representation
– Hierarchical actions, defined by network structure
– Actions have execution structure (e.g. ready, iterating, ongoing, failed, complete), also defined by network structure
– Properly-designed nets will be goal-directed
  – take the best actions to reach the goal, given the current context
  – related to "reinforcement learning"
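A tiny, hypothetical sketch of the execution-status part of an x-schema. The transition logic here is invented for illustration and is far simpler than a real Petri-net-style x-schema:

class XSchema:
    """Minimal action controller with x-schema-style status states."""
    def __init__(self, name):
        self.name, self.state = name, "ready"

    def step(self, at_goal):
        # invented transitions: ready -> ongoing -> complete
        if self.state == "ready":
            self.state = "ongoing"
        elif self.state == "ongoing" and at_goal:
            self.state = "complete"
        return self.state

walk = XSchema("WALK")
for at_goal in (False, False, True):
    print(walk.step(at_goal))  # ongoing, ongoing, complete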

Reinforcement Learning
– unsupervised learning
– learn behaviors
– reward
  – discounting
  – use estimated future value

Reinforcement Learning
Learning methods
– value iteration
– Q-learning
Biology
– dopamine = reward difference
  – only for reward, not punishment
– non-exponential discounting
  – preference switching
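A minimal tabular Q-learning sketch on a made-up two-state problem (the states, actions, rewards, and parameters are all invented for illustration):

import random

alpha, gamma, eps = 0.1, 0.9, 0.1          # step size, discount, exploration
states, actions = ("A", "B"), ("left", "right")
Q = {(s, a): 0.0 for s in states for a in actions}

def env_step(s, a):
    # toy dynamics: going "right" from A reaches B and pays reward 1
    return ("B", 1.0) if (s, a) == ("A", "right") else ("A", 0.0)

s = "A"
for _ in range(1000):
    a = random.choice(actions) if random.random() < eps \
        else max(actions, key=lambda act: Q[(s, act)])
    s2, r = env_step(s, a)
    # move Q(s, a) toward reward + discounted estimated future value
    Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, a2)] for a2 in actions) - Q[(s, a)])
    s = s2
print(Q)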

Language
– Grammar / Syntax
– Semantics
– Metaphor
– Simulation
– Unification

Grammar
A grammar is a set of rules defining a formal language. A common example is a Context-Free Grammar, with rules of the form α → β, where:
– α is a single non-terminal
– β is any combination of terminals and non-terminals

S → NP VP
NP → Det Noun | ProperNoun
VP → Verb NP | Verb PP
PP → Preposition NP
Noun → kiwi | orange | store
ProperNoun → Pat | I
Det → a | an | the
Verb → ate | went | shop
Preposition → to | at

Sentence generation: "Pat ate the kiwi"
Start from S and apply any applicable rule (forward expansion), using the grammar above:

S
NP VP
Det Noun VP        ProperNoun VP
a Noun VP    an Noun VP    the Noun VP
a kiwi VP    a orange VP    a store VP
…
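The same grammar and forward expansion can be written as a short program. This is a generic illustration in Python (with a random choice among applicable rules), not course code:

import random

grammar = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "Noun"], ["ProperNoun"]],
    "VP": [["Verb", "NP"], ["Verb", "PP"]],
    "PP": [["Preposition", "NP"]],
    "Noun": [["kiwi"], ["orange"], ["store"]],
    "ProperNoun": [["Pat"], ["I"]],
    "Det": [["a"], ["an"], ["the"]],
    "Verb": [["ate"], ["went"], ["shop"]],
    "Preposition": [["to"], ["at"]],
}

def generate(symbol):
    # terminals are symbols with no rule; non-terminals expand recursively
    if symbol not in grammar:
        return [symbol]
    return [word for sym in random.choice(grammar[symbol])
                 for word in generate(sym)]

print(" ".join(generate("S")))  # e.g. "Pat ate the kiwi"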

Unification Grammar
Basic idea: capture agreement features for each non-terminal in feature structures, and enforce constraints on those features using unification rules.

Pat   [agreement: [number: SG, person: 3rd]]
I     [agreement: [number: SG, person: 1st]]
went  [agreement: [ ]]
shop  [agreement: [number: (unspecified), person: 1st]]

S → NP VP, with NP.agreement ↔ VP.agreement
VP → Verb NP, with VP.agreement ↔ Verb.agreement
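A minimal sketch of unification over feature structures represented as nested dicts (simplified: no reentrancy or variables; the lexical entries are the ones above):

def unify(a, b):
    if a is None: return b
    if b is None: return a
    if not (isinstance(a, dict) and isinstance(b, dict)):
        return a if a == b else "FAIL"     # atomic values must match
    result = dict(a)
    for key, b_val in b.items():
        merged = unify(result.get(key), b_val)
        if merged == "FAIL":
            return "FAIL"
        result[key] = merged
    return result

pat = {"number": "SG", "person": "3rd"}
went = {}                                  # unconstrained agreement
shop = {"person": "1st"}
print(unify(pat, went))   # {'number': 'SG', 'person': '3rd'}
print(unify(pat, shop))   # FAIL (person 3rd vs 1st)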

Poverty and Opulence
Poverty of the stimulus
– coined to suggest how little information children have to learn from
Opulence of the substrate
– opulence = "richness"
– coined in response, to suggest how much background information children bring

Analysis Process
Utterance + Constructions → Analysis → Semantic Specification → Simulation, drawing on General Knowledge and the Belief State
Example: "Harry walked into the café."

The INTO construction

construction INTO
  subcase of Spatial-Relation
  form
    self_f.orth ← "into"
  meaning: Trajector-Landmark
    evokes Container as cont
    evokes Source-Path-Goal as spg
    trajector ↔ spg.trajector
    landmark ↔ cont
    cont.interior ↔ spg.goal
    cont.exterior ↔ spg.source

The Spatial-Phrase construction

construction SPATIAL-PHRASE
  constructional
    constituents
      sr : Spatial-Relation
      lm : Ref-Expr
  form
    sr_f before lm_f
  meaning
    sr_m.landmark ↔ lm_m

The Directed-Motion construction

construction DIRECTED-MOTION
  constructional
    constituents
      a : Ref-Expr
      m : Motion-Verb
      p : Spatial-Phrase
  form
    a_f before m_f
    m_f before p_f
  meaning
    evokes Directed-Motion as dm
    self_m.scene ↔ dm
    dm.agent ↔ a_m
    dm.motion ↔ m_m
    dm.path ↔ p_m

schema Directed-Motion
  roles
    agent : Entity
    motion : Motion
    path : SPG

Do not forget the SemSpec!

What exactly is simulation?
Belief update and/or X-schema execution
[Diagram: a WALK x-schema (ready → start → ongoing → finish → done, with an iterate loop and an "at goal" test), linked to belief nodes such as hungry, meeting, cafe, and time of day.]

Learning-Analysis Cycle (Chang, 2004)
1. The learner passes the input (Utterance + Situation) and the current grammar to the Analyzer.
2. The Analyzer produces a Semantic Specification (SemSpec) and a Constructional Analysis.
3. The learner updates the grammar:
   a. Hypothesize a new map.
   b. Reorganize the grammar (merge or compose).
   c. Reinforce (based on usage).

Three ways to get new constructions
– Relational mapping
  – throw the ball → THROW < BALL
– Merging
  – throw the block, throwing the ball → THROW < OBJECT
– Composing
  – throw the ball, ball off, you throw the ball off → THROW < BALL < OFF

Grammar merging
How can we measure description length?
– complicated rules are bad
– lots of rules are bad
Measure "derivation length":
  alpha * size(rules) + derivationCost(rules, sentences)
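A minimal sketch of such a description-length score (the size and cost measures here are simplistic placeholders, not the model's exact definitions):

import math

def grammar_size(rules):
    # total number of right-hand-side symbols across all rules
    return sum(len(rhs) for expansions in rules.values() for rhs in expansions)

def derivation_cost(derivations):
    # each derivation is the list of rule-choice probabilities it used;
    # its cost is the total -log2 probability
    return sum(-math.log2(p) for deriv in derivations for p in deriv)

def description_length(rules, derivations, alpha=1.0):
    return alpha * grammar_size(rules) + derivation_cost(derivations)

rules = {"S": [["NP", "VP"]], "NP": [["Det", "Noun"], ["ProperNoun"]]}
derivations = [[1.0, 0.5], [1.0, 0.5]]
print(description_length(rules, derivations))  # 5 + 2.0 = 7.0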

How do you learn…
– the meanings of spatial relations? That's the Regier model.
– the meanings of verbs? That's Bailey's model.
– the metaphors? The conflation hypothesis (primary metaphors).
– the constructions? Construction learning.

Further questions?