Chapter 3 Language Acquisition: A Linguistic Treatment Jang, HaYoung Biointelligence Laboratory Seoul National University.



Arguments in the Previous Chapters Languages have a formal structure and in this sense may be viewed as sets of well-formed expressions. On exposure to finite amounts of data, children are able to learn a language. Language acquisition does not depend upon the order of presentation of sentences, and it largely proceeds on the basis of positive examples alone. All naturally occurring languages are learnable by children.

Arguments in the Previous Chapters These arguments were framed in terms of the Gold theorem and the Chomsky hierarchy.

Objectives of This Chapter In the complete absence of prior information, successful generalization to novel expressions is impossible. This chapter explains how successful language acquisition may nonetheless come about, by characterizing:  The range of grammatical hypotheses  The learning algorithm that children may plausibly use

Contents 3.1 Language Learning and the Poverty of Stimulus 3.2 Constrained Grammars – Principles and Parameters 3.3 Learning in the Principles and Parameters Framework 3.4 Formal Analysis of the Triggering Learning Algorithm 3.5 Conclusions

3.1 Language Learning and the Poverty of Stimulus Question formation in English  John is bald.  Is John bald? Two rules that the learner may logically infer  Move the second word in the sentence to the front  Move the first ‘is’ to the front
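The two candidate rules can be sketched as short functions to show that they agree on simple sentences. This is a minimal sketch of my own; the function names and the word-list encoding are illustrative assumptions, not anything from the chapter.

```python
# Two hypothetical question-formation rules that agree on simple
# sentences such as "John is bald" (encoding is an illustrative sketch).

def move_second_word(sentence):
    # Rule 1: move the second word of the sentence to the front.
    words = sentence.split()
    return " ".join([words[1], words[0]] + words[2:])

def move_first_is(sentence):
    # Rule 2: move the first occurrence of "is" to the front.
    words = sentence.split()
    i = words.index("is")
    return " ".join([words[i]] + words[:i] + words[i + 1:])

print(move_second_word("John is bald"))  # is John bald
print(move_first_is("John is bald"))     # is John bald
```

Both rules yield "is John bald", so this datum alone cannot tell the learner which rule is correct.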

3.1 Language Learning and the Poverty of Stimulus Novel statement where there are multiple instances of ‘is’  The man who is running is bald. What is the appropriate interrogative form?  Is the man who is running bald?  Is the man who running is bald? Grammatical sentences have an internal structure: the correct rule targets the ‘is’ of the main clause, not the first ‘is’ in linear order.
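Running the structure-blind "move the first ‘is’" rule on the multi-clause sentence makes the failure concrete. A minimal sketch; the function name and encoding are my own.

```python
def move_first_is(sentence):
    # The structure-blind rule: front the first "is" in linear order.
    words = sentence.split()
    i = words.index("is")
    return " ".join([words[i]] + words[:i] + words[i + 1:])

# The first "is" belongs to the relative clause, so the output is ungrammatical.
print(move_first_is("the man who is running is bald"))
# is the man who running is bald
```

The rule fronts the relative-clause ‘is’ and produces the ungrammatical form, exactly the error that children never make.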

Poverty of Stimulus Two main conclusions from the examples  Given a finite amount of data, there are always many grammatical rules consistent with the data.  Sentences have an internal structure, and their constituents play an important role in grammatical rules. Poverty of Stimulus  There are patterns in all natural languages that cannot be learned by children using positive evidence alone, yet children are only ever presented with positive evidence for these particular patterns.  Children do learn the correct grammars for their native languages.  Conclusion: human beings must have some form of innate linguistic capacity that provides additional knowledge to language learners.

3.2 Constrained Grammars – Principles and Parameters Principles and Parameters (Chomsky 1981)  A finite set of fundamental principles that are common to all languages.  A finite set of parameters that determine syntactic variability amongst languages.

Example: A Three Parameter System from Syntax X-bar theory (Chomsky 1970)  Parameterized phrase structure grammar  It claims that among their phrasal categories, all languages share certain structural similarities. Two X-bar parameters  Spec: short for specifier  Roughly like ‘the old’ in the noun phrase ‘the old book’  Comp: short for complement, roughly a phrase’s argument  Like ‘an ice-cream’ in the verb phrase ‘ate an ice-cream’  Or ‘with envy’ in the adjective phrase ‘green with envy’

Example: A Three Parameter System from Syntax Parameterized production rules  Parameter settings of English are Spec-first and Comp-final (p1 = 0, p2 = 1) Example X-bar structure (figure)
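The two X-bar parameters can be read as switches on the phrase-structure rules. The encoding below is a sketch of my own, not the book's notation.

```python
def xbar_rules(p1, p2):
    # p1 = 0: specifier first (XP -> Spec X'); p1 = 1: specifier last.
    # p2 = 0: complement first (X' -> Comp X); p2 = 1: complement final.
    xp = ["Spec", "X'"] if p1 == 0 else ["X'", "Spec"]
    xbar = ["X", "Comp"] if p2 == 1 else ["Comp", "X"]
    return {"XP": xp, "X'": xbar}

# English: Spec-first, Comp-final (p1 = 0, p2 = 1)
print(xbar_rules(0, 1))
```

Flipping p2 to 0 yields the Comp-first rule used for Bengali in the next slide, while leaving the Spec-first setting unchanged.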

Example: A Three Parameter System from Syntax Different derivation trees for English (Spec-first, Comp-final) and Bengali (Spec-first, Comp-first) (figure)

Example: A Three Parameter System from Syntax One transformational parameter (V2)  Finite verbs must move so as to appear in exactly the second position in root declarative clauses (p3 = 1)  German: p3 = 1  English: p3 = 0

Example: Parameterized Metrical Stress in Phonology

3.3 Learning in the Principles and Parameters Framework This section examines the learnability and sample complexity of the finite hypothesis classes suggested by the Principles and Parameters theory, considering:  Parameterization of the language space  Distribution of the input data  Noise in examples  Type of learning algorithm  Use of memory

3.4 Formal Analysis of the Triggering Learning Algorithm Triggering learning algorithm
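A single update step of the TLA can be sketched as follows, assuming a grammar is a tuple of binary parameters and `analyzable(state, sentence)` is a hypothetical parsing oracle (both are assumptions of this sketch, not the book's notation).

```python
import random

def tla_step(state, sentence, analyzable, n):
    # Greediness: if the current grammar analyzes the sentence, keep it.
    if analyzable(state, sentence):
        return state
    # Single-value constraint: flip one randomly chosen parameter and
    # adopt the flipped grammar only if it analyzes the sentence.
    i = random.randrange(n)
    flipped = state[:i] + (1 - state[i],) + state[i + 1:]
    return flipped if analyzable(flipped, sentence) else state
```

Iterating this step over sentences drawn from the target language is what the Markov chain in the following slides models.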

Markov Formulation Parameterized grammar family with 3 parameters  Target language  Absorbing state: a state that loops to itself and has no exit arcs  Closed set of states: no arc leads from any state in the set to a state outside it  An absorbing state is a closed set with exactly one state

Markov Chain Criteria for Learnability Memoryless learning system  Triple (A, G, g_f)  A: memoryless learning algorithm  G: family of grammars  g_f: target grammar Gold learnability

Markov Chain Criteria for Learnability Proof  Gold learnable ↔ every closed set includes the target state  →: by contradiction  ←: first, show that every non-target state must be transient; then show that the learner converges to the target grammar in the limit with probability 1
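Since "every closed set includes the target state" is equivalent to the target being reachable from every state, the criterion can be checked mechanically on a small chain. A sketch under my own encoding: states are arbitrary labels and `chain[s]` is the set of states reachable from s in one step with positive probability.

```python
def can_reach(chain, start, target):
    # Depth-first search over arcs with positive probability.
    seen, stack = set(), [start]
    while stack:
        s = stack.pop()
        if s == target:
            return True
        if s not in seen:
            seen.add(s)
            stack.extend(chain[s])
    return False

def gold_learnable(chain, target):
    # Every closed set contains the target state iff the target is
    # reachable from every state of the chain.
    return all(can_reach(chain, s, target) for s in chain)
```

For example, a chain in which some non-target state only loops to itself forms a closed set without the target, and the check returns False.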

The Markov Chain for the Three-Parameter Example

Derivation of the Transition Probabilities for the Markov TLA Structure The target language L_t consists of a set of strings (listed in the figure). The TLA will move from state s to state k only if both of the following conditions are met  The next sentence ω (drawn with probability P(ω)) is analyzable by the parameter settings corresponding to k but not by the parameter settings corresponding to s  The TLA happens to pick and change the one parameter (out of n) that would move it to state k

Derivation of the Transition Probabilities for the Markov TLA Structure Total probability of transition from s to k after one step: P(s → k) = (1/n) · P({ω ∈ L_t : ω ∈ L_k, ω ∉ L_s}) Probability of remaining in s after one step: P(s → s) = 1 − Σ_{k ≠ s} P(s → k)
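These transition probabilities can be computed directly from the languages and the sentence distribution. The dictionary encoding below (states as bit tuples, languages as sets of sentences) is an illustrative assumption of this sketch.

```python
def hamming(s, k):
    # Number of parameter values in which states s and k differ.
    return sum(a != b for a, b in zip(s, k))

def tla_transitions(languages, P, target, n):
    # languages: state -> set of sentences its grammar analyzes.
    # P: sentence -> probability (distribution over the target language).
    states = list(languages)
    T = {s: dict.fromkeys(states, 0.0) for s in states}
    for s in states:
        for k in states:
            if hamming(s, k) == 1:
                # Move s -> k: the sentence is unanalyzable in s but
                # analyzable in k, and the right one of n parameters is
                # picked (hence the factor 1/n).
                T[s][k] = sum(P[w] for w in languages[target]
                              if w not in languages[s] and w in languages[k]) / n
        T[s][s] = 1.0 - sum(p for k, p in T[s].items() if k != s)
    return T
```

Running this over all 2^n states reproduces the chain of the three-parameter example, with the target state absorbing (its self-transition probability is 1).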

Derivation of the Transition Probabilities for the Markov TLA Structure Procedure for constructing the Markov chain (figure)

Conclusions In order to learn language, certain kinds of prior information are required. How can we formulate such information? Under what conditions is a grammar learnable? Many variants remain to be explored: alternative algorithms, the possibility of noise, the use of memory, and so on.