Word order and phrases in a network

Word order and phrases in a network Dick Hudson Humboldt University Berlin, January 2019

Summary
My aim is to model language as general cognition.
- Language is a network.
- Taxonomies allow inheritance, but partonomies don't.
- Link types can be invented as needed, e.g. word-word dependencies.
- Link types are classified, so we can distinguish grammatical functions.
- Types, tokens and sub-tokens form a taxonomy.
- So what? (1): word order, including raising, lowering and long-distance extraction.
- More on the free creation of ad hoc links, including phrase boundaries.
- Coordination.
- So what? (2)

1. Language as general cognition
Is language unique? I.e. is language a special module of the mind? Or is it just like the rest of cognition? How to find out? Assume it's like the rest of cognition, and describe it in those terms. Failure → it's a special module. Success → it's not. Word Grammar (my theory) makes this assumption. The answer so far? Success.

So …
Linguistic descriptions use only formal apparatus found outside language: no special apparatus such as PS rules, transformations or optimality grids. Conversely, linguistic descriptions have access to all the formal apparatus found outside language: every general-purpose device comes for free. We don't need to limit the devices used, and there's no pressure for elegance or parsimony. E.g. we can create new relations freely.

2. The network notion of cognition
General cognition is a network of concepts, which are mental and classified, but which are implemented in neurons, which carry activation, which spills over onto network neighbours, which causes priming: e.g. doctor primes nurse, i.e. makes the word and its meaning more active.
[Diagram: a partonomy of 'arm' (parts: upper arm, forearm, hand; joiners: elbow, wrist), and a 'colleague' link between doctor and nurse.]
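The priming idea can be sketched in a few lines of Python (my own illustration, not part of the talk; the node names and the decay factor are invented): activation placed on one concept spills over onto its network neighbours, leaving them partially active.

```python
# Illustrative sketch of spreading activation / priming in a concept network.
# The network, the strength value and the decay factor are all invented for
# illustration; they are not claims about actual Word Grammar machinery.

network = {
    "doctor": ["nurse", "hospital"],
    "nurse": ["doctor"],
    "hospital": ["doctor"],
}

def prime(source, strength=1.0, decay=0.5):
    """Spread activation from `source` one link outwards."""
    activation = {source: strength}
    for neighbour in network.get(source, []):
        activation[neighbour] = strength * decay
    return activation

# 'doctor' primes 'nurse': the neighbour ends up partially activated.
print(prime("doctor"))
```

On this toy model, priming is nothing but leftover activation on a neighbour, which is exactly the sense in which doctor "makes nurse more active".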

2.1 'Isa' and inheritance
General cognition allows generalisations, e.g. if it's a bird, it has two legs. In other words, examples are linked to general types (by 'isa'), and examples inherit properties across isa links. So the network is an 'inheritance network' containing 'isa' links and other links.
[Diagram: an example isa 'bird', so it inherits 'has two legs'.]

Taxonomies and partonomies
Isa relations define a taxonomy, or classification. Part-whole relations define a partonomy (= meronomy; 'partonomy' is not in the Oxford English Dictionary). This difference is important: taxonomies are based on similarity; partonomies are based on size and containment. The difference is fundamental in syntax!
[Diagram: 'has four fingers' is inherited down a taxonomy, but not down a partonomy.]

Taxonomies and partonomies in syntax
Beware of unclassified links in tree diagrams! Part-whole relations are logically different from isa relations.
[Diagram: in 'the man', partonomy links connect NP to its parts Det and Noun, while taxonomy links classify 'the' as Det and 'man' as Noun.]

Default inheritance and exceptions
The logic is default inheritance. A (typical) bird flies. A sparrow isa bird, so it flies. A penguin isa bird, but a penguin doesn't fly. Some people object that default inheritance is non-monotonic: an earlier inference may be overridden by a later one. But in Word Grammar, default inheritance is monotonic because it's part of node-creation, which creates and enriches token nodes.
[Diagram: 'flying' holds of bird by default, is inherited by sparrow (and its tokens), and is overridden by penguin.]
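Default inheritance has a natural computational reading. The sketch below is my own toy encoding, not Word Grammar's actual formalism: each concept stores its local properties, and a lookup walks up the isa chain and returns the most specific value it finds, so the penguin exception overrides the bird default.

```python
# A minimal sketch of default inheritance over 'isa' links.
# The representation (two dicts) is invented for illustration.

isa = {"sparrow": "bird", "penguin": "bird"}   # the taxonomy
properties = {
    "bird": {"flies": True},                    # the default
    "penguin": {"flies": False},                # the exception
}

def inherit(concept, prop):
    """Return the most specific value of `prop`, walking up the isa chain."""
    while concept is not None:
        if prop in properties.get(concept, {}):
            return properties[concept][prop]
        concept = isa.get(concept)
    return None

print(inherit("sparrow", "flies"))   # True  (inherited from 'bird')
print(inherit("penguin", "flies"))   # False (local exception wins)
```

Because the most specific value always wins, a token node created for a particular penguin would likewise inherit 'flies = False', which is the slide's point about overriding happening during node-creation.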

2.2 Free linkage
In general cognition, anything can link to anything else: cognition reflects experience, and relation-types can be created as needed, e.g. 'joiner', 'part', 'colleague'. Relations are a kind of concept, and form a taxonomy.
[Diagram: 'concept' subsumes 'relation' and 'entity'; 'relation' subsumes 'part', 'colleague', 'centre' and 'boss'.]

Free linkage in syntax
Presumably free linkage also applies to syntax, subject to functional constraints. Then words can link directly to each other. This is dependency analysis. But PS denies this, and only recognises part-whole relations. Why deny word-word links?
[Example: We are in Berlin.]

Leonard Bloomfield and Wilhelm Wundt
When in Germany, Bloomfield followed Wundt (1832-1920), who created the first laboratory of experimental psychology in Leipzig and supervised 184 PhD students. Wundt also wrote about syntax: a proposition = subject + predicate, with propositions all the way down.
[Diagram: Ein redlich denkender Mensch verschmäht die Täuschung ('An honestly thinking person scorns deception'); G = Gesamtvorstellung (total representation), A = subject, B = predicate; e.g. A1 = ein Mensch, B1 = denkt redlich; A3 = der Gedanke, B3 = ist redlich.]

and not Franz Kern
Franz Kern (1830-1894), Gymnasialdirektor and philologist, wrote Grundriss der Deutschen Satzlehre (1884), the first pure dependency grammar after Brassai (1873) and Dmitrievskij (1877). His stemma is not a partonomy.

2.3 Classified links and grammatical functions
We can, and do, classify relations in general cognition. So why not in syntax? Why not recognise grammatical functions? Dependency types are concepts, so they form a taxonomy.
[Diagram: 'dependent' subsumes 'valent' and 'adjunct'; 'valent' subsumes 'subject' and 'complement'; 'complement' subsumes 'predicative'. Example: We are in Berlin.]
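The claim that dependency types themselves form a taxonomy can be illustrated with a toy lookup (a sketch using the labels from this slide; the encoding is mine): any generalisation stated over 'dependent' automatically covers 'subject', because 'subject' isa 'valent' isa 'dependent'.

```python
# Illustrative sketch: grammatical functions as a taxonomy of relation types.
# The dict encoding is invented; the labels follow the slide.

relation_isa = {
    "valent": "dependent",
    "adjunct": "dependent",
    "subject": "valent",
    "complement": "valent",
    "predicative": "complement",
}

def is_a(relation, ancestor):
    """True if `relation` is (a subtype of) `ancestor` in the taxonomy."""
    while relation is not None:
        if relation == ancestor:
            return True
        relation = relation_isa.get(relation)
    return False

# A rule stated over 'dependent' covers 'subject'; adjuncts are not valents.
print(is_a("subject", "dependent"))   # True
print(is_a("adjunct", "valent"))      # False
```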

2.4 Types and tokens
Types are permanent, in long-term memory. Tokens are temporary, in working memory. Types and tokens are connected by isa, so tokens are also part of the cognitive network of concepts.
[Diagram: in 'Some big boys came in. All the boys were ...', the tokens boys1 and boys2 both isa the type BOY: plural, which isa BOY.]

Tokens and sub-tokens outside language
Token = example/exemplar. One token may be represented by a series of sub-concepts, reflecting mental changes and linked by 'isa'. General principle: different properties → different concepts.
[Diagram: one token reclassified in turn as 'bird', 'plane', 'flying saucer': a series of sub-concepts of the same concept.]

Sub-tokens in unary branching
Classification shows similarities. NPs can be subjects (etc.) and can contain Det; Ns can follow Det and can be inside NP. But bare Ns are both, so use unary branching; but this requires separate tokens of one word.
[Diagram: 'the man' as [Det + N]NP; bare 'men' as [N]NP, with men1 an NP and men2 an N.]

Sub-tokens in processing
We create new sub-examples (sub-tokens) when we correct an error, e.g. in garden-path sentences such as 'The horse raced past the barn fell', where raced is reclassified as passive. But maybe we do the same each time we modify a word. The sub-examples form a taxonomy, not a partonomy.
[Diagram: in 'We met there yesterday', met gains a sub-token at each step (met+we, met+there, met+yesterday), linked by isa, with 'past tense' inherited down the chain.]

3. So what? (1)
Dependencies between words are psychologically plausible, contrary to Chomsky's PSG. They are relations, so they form a taxonomy of grammatical functions, contrary to most versions of HPSG. But we do recognise a sub-token of the head for every dependent: the same nodes as in Bare PSG, but in a taxonomy, not a partonomy!
[Diagram: British linguists hate Brexit, with 'British' an adjunct of linguists, 'linguists' the subject of hate and 'Brexit' its object, via sub-tokens hate+linguists and hate+Brexit.]
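As a hedged illustration (the encoding is mine, not the talk's notation), the dependency analysis of 'British linguists hate Brexit' is just a set of classified word-word links, with no phrase nodes anywhere:

```python
# Illustrative sketch: a sentence as labelled word-word dependencies.
# Each triple is (dependent, grammatical function, head).

dependencies = [
    ("British", "adjunct", "linguists"),
    ("linguists", "subject", "hate"),
    ("Brexit", "object", "hate"),
]

def dependents_of(head):
    """All (dependent, function) pairs attached to `head`."""
    return [(dep, fn) for dep, fn, h in dependencies if h == head]

print(dependents_of("hate"))   # [('linguists', 'subject'), ('Brexit', 'object')]
```

Each pair returned here corresponds to one sub-token of the head in the slide's analysis (hate+linguists, hate+Brexit).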

4. Word order
As in general cognition, a network has no left-right dimension, so position is defined in terms of landmarks. Expected position follows from more abstract relations, but it may follow a chain of relations.
[Diagram: a cellar located via a chain of 'landmark' and 'owner' relations; British linguists hate Brexit, with landmark (lm) links between the words.]

Positions
More precisely, X's position is defined by:
- X's position P
- X's landmark L
- a relation between P and L
This allows positions to be treated as properties and overridden by default inheritance.
[Diagram: British linguists hate Brexit, each word with a position (pos) and landmark (lm) link, and a before (<) or after (>) relation.]
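The position/landmark idea can be made concrete with a small sketch (again my own encoding, reusing the '<'/'>' symbols from the slide): each word stores only its landmark and whether it precedes or follows it, and the surface order falls out by recursive placement.

```python
# Illustrative sketch of landmark-based linearization for
# "British linguists hate Brexit". Each word maps to (landmark, direction),
# where "<" means 'precedes its landmark' and ">" means 'follows it'.

landmarks = {
    "British": ("linguists", "<"),
    "linguists": ("hate", "<"),
    "Brexit": ("hate", ">"),
}

def linearize(root):
    """Order words by recursively placing each relative to its landmark."""
    before = [w for w, (lm, d) in landmarks.items() if lm == root and d == "<"]
    after = [w for w, (lm, d) in landmarks.items() if lm == root and d == ">"]
    order = []
    for w in before:
        order += linearize(w)
    order.append(root)
    for w in after:
        order += linearize(w)
    return order

print(linearize("hate"))   # ['British', 'linguists', 'hate', 'Brexit']
```

Because each (landmark, direction) pair is just a property of the word, an exceptional construction can override it by default inheritance, which is what the raising and lowering slides exploit.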

Raising
In 'It is raining', it is the subject of both is and raining. So why is raining irrelevant to its position? (Compare: Raining it is. Is it raining?) Because each subject link defines a different sub-token of it, it+is and it+raining, each with its positional properties; and it+is isa it+raining, so default inheritance overrides the position of it+raining. This is the default for structure sharing.
[Diagram: It is raining, with is as raiser, subject links from both verbs to the two sub-tokens of it, and landmark/position links.]

Lowering
But lowering is also found, e.g. in German partial VP fronting: Eine Concorde gelandet ist hier nie ('A Concorde has never landed here'). Lowering is exceptional, but allowed by default inheritance.
[Diagram: Eine Concorde gelandet ist hier nie, with sub-tokens Eine Concorde+ist and Eine Concorde+gelandet.]

Long-distance dependencies in extraction
We need a new dependency: 'extractee' ('x'), plus structure sharing as in raising.
[Diagram: What do you think we should say that we want _? — what carries an 'x' link down the chain of heads (do, think, should, say, that, want), ending as 'x+o' (extractee and object) of want.]
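As an illustrative sketch (the list encoding is mine; the chain of heads and the 'x'/'x+o' labels follow the slide), the extraction can be modelled as the filler 'what' being structure-shared down a list of heads, with the final link doubling as object:

```python
# Illustrative sketch of a long-distance extraction chain for
# "What do you think we should say that we want _?".
# Each head on the chain gets an 'x' (extractee) link to the filler;
# the last link is 'x+o' (extractee and object at once).

chain = ["do", "think", "should", "say", "that", "want"]

def extractee_links(filler, heads):
    """Pair the filler with every head; the final link is labelled 'x+o'."""
    links = [(filler, "x", h) for h in heads[:-1]]
    links.append((filler, "x+o", heads[-1]))
    return links

for link in extractee_links("what", chain):
    print(link)
```

Each triple corresponds to one sub-token of 'what' in the slide's diagram, so the long-distance dependency is just ordinary structure sharing iterated down the chain.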

5. Free creation of links
Even if we have dependencies, we can have other kinds of link in syntax as well. E.g. in pied-piping we have a 'pipee'.
[Example: house in which we live, with a 'pipee' link.]

Phrase boundaries
Similarly, phrase boundaries are important in some languages, e.g. Welsh, where llawn > lawn by soft mutation at the start of a phrase:
Dw i [lawn mor grac â chi].
be.PRES.1S I full as angry as you
'I'm just as angry as you.'
We can freely invent a relation 'start', especially since this relation already exists in ordinary cognition.
[Diagram: Dw i lawn mor grac â chi, with a 'start' link and dependency labels (s, a, c).]

6. Coordination
Here the crucial notion is 'set', which is freely available in general cognition. A relation can apply to a set. But: what is the status of and? And: what about incomplete conjuncts?
[Examples: I read books about [linguistics on weekdays] and [astronomy at weekends]. Bill and Ann bought and ate apples and oranges.]

So what? (2)
Dependency structure is available for free, so why not use it? It allows analyses of complex structures without going beyond the apparatus of general cognition. But it's not enough in itself. We also need:
- sub-tokens, creating multiple nodes per head word
- positions and landmarks
- (in some languages) phrase boundaries (but not phrase nodes)
- sets for coordination
- occasional extra relations such as 'pipee'
More generally, ...

Between PS and DS?
Word Grammar is between PS and DS. Like DS, no units longer than words. Like PS, a separate head node for each dependency. But dominance is logical isa, not part-whole.
[Diagram: I love grammar, with sub-tokens love+I (subject) and love+grammar (object).]

Between competence and performance?
Performance is the temporary network created for an utterance. Competence is the permanent network of lexico-grammar. But they're part of the same network. And arguably competence is just performance that's been made permanent.
[Diagram: the performance tokens of 'We met there yesterday' (met+we, met+there, met+yesterday), linked by isa to the competence entries meet, verb, past and /met/.]

Between language and thought?
(Not allowed when I was a boy ....) Language uses the apparatus of ordinary thinking and nothing else. So language is part of thinking, and language really is a 'window into the human mind'. Words don't just express concepts: they are concepts. So they can combine with non-language, as in The train went !!!

Thank you for listening These slides can be downloaded from dickhudson.com/talks I’m very happy to discuss them at r.hudson@ucl.ac.uk.