Dependency structure and cognition

Presentation transcript:

Dependency structure and cognition Richard Hudson Depling2013, Prague

The question What is syntactic structure like? Does it include dependencies between words (dependency structure)? Or does it only contain part-whole links (phrase structure)? (Diagrams: She looked after him analysed with dependency links and with phrase structure.)
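The contrast can be sketched as data structures. This is a minimal sketch in Python; the particular analysis of She looked after him is my own illustration, not the talk's:

```python
# Dependency structure: each word points to its head (None = root).
# One hypothetical analysis: "looked" is the root, "She" and "after"
# depend on it, and "him" depends on "after".
dependencies = {
    "She": "looked",
    "looked": None,      # root
    "after": "looked",
    "him": "after",
}

# Phrase structure: only part-whole links, via nested constituents.
phrase = ("S", ("NP", "She"),
               ("VP", ("V", "looked"),
                      ("PP", ("P", "after"), ("NP", "him"))))

# Dependency analysis needs one node per word; the phrase-structure
# tree adds phrasal nodes on top of the words.
word_nodes = len(dependencies)

def count_nodes(tree):
    """Count every node (phrasal and lexical) in a nested-tuple tree."""
    if isinstance(tree, str):
        return 1
    return 1 + sum(count_nodes(child) for child in tree[1:])

ps_nodes = count_nodes(phrase)  # 4 words + 7 phrasal/category nodes
```

The node counts make the "few nodes" point from the convenience argument concrete: four nodes versus eleven for the same sentence.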

Relevant evidence: familiarity University courses teach only one approach. School grammar sometimes offers one too, usually dependency structure: even in the USA (Reed-Kellogg sentence-diagramming), especially in Europe, and especially in the Czech Republic!

What Czech children do at school (diagram: a dependency tree for a Czech sentence, with words glossed blossomed out, kingcups, by stream, yellow, near). Jirka Hana & Barbora Hladká 2012

or even …

Relevant evidence: convenience Dependency structure is popular in computational linguistics, maybe because of its simplicity: few nodes, little but orthographic words, and it is good for lexical co-occurrence relations.

Relevant evidence: cognition Language competence is memory; language processing is thinking; memory and thinking are part of cognition. So what do we know about cognition? A. Very generally, cognition is not simple, so maybe syntactic structures aren't in fact simple either?

B. Knowledge is a network (diagram: a family network linking Gretta, John, Colin, me, Gaynor, Lucy and Peter)

C. Links are classified relations (diagram: an is-a taxonomy of relations between persons: 'parent' and 'child' is-a 'relative'; 'mother' and 'father' is-a 'parent', restricted to woman and man)

D. Nodes are richly related (diagram: the family network again, with each link labelled by its relation type: m(other), f(ather), s(on), d(aughter), b(rother), w(ife), h(usband), g(rand)f(ather))
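A network with classified links can be sketched as a set of labelled edges. The specific family facts below are invented for illustration, echoing the names in the diagrams:

```python
# A tiny semantic network: nodes are people, and each link carries a
# relation type -- the labels, not just the connections, do the work.
network = {
    ("Gretta", "mother", "me"),
    ("John", "father", "me"),
    ("Colin", "brother", "me"),
    ("Gaynor", "wife", "me"),
    ("Lucy", "daughter", "me"),
}

def related(relation):
    """All (x, y) pairs linked by a given relation type."""
    return {(a, b) for (a, rel, b) in network if rel == relation}
```

Because links are classified, the same pair of nodes could carry several distinct relations, which a plain unlabelled graph could not distinguish.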

E. Is-a allows default inheritance Is-a forms taxonomies. e.g. 'linguist is-a person', 'Dick is-a linguist' Properties 'inherit' down a taxonomy. But only 'by default' – exceptions are ok. e.g. birds (normally) fly but penguins don't.

Penguins (diagram: 'bird' flies; 'robin' is-a bird and inherits 'flies'; 'penguin' is-a bird but 'doesn't fly')
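Default inheritance down an is-a taxonomy can be sketched in a few lines; this is a toy implementation of the mechanism the slide describes:

```python
# Default inheritance over an is-a taxonomy: properties inherit down
# the chain unless a more specific node overrides them.
isa = {"robin": "bird", "penguin": "bird", "bird": None}
local = {
    "bird": {"flies": True},       # the default
    "penguin": {"flies": False},   # the exception
    "robin": {},                   # nothing special: inherits
}

def inherit(concept, prop):
    """Walk up the is-a chain; the most specific value wins."""
    while concept is not None:
        if prop in local.get(concept, {}):
            return local[concept][prop]
        concept = isa.get(concept)
    return None
```

The exception costs nothing: penguins simply override the default, while robins inherit it unchanged.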

Cognitivism 'Language is an example of ordinary cognition.' So all our general cognitive abilities are available for language, and we have no special language abilities. Cognitivism matters for linguistic theory.

Some consequences of cognitivism 1. Word-word dependencies are real. 2. 'Deep' and 'surface' properties combine. 3. Mutual dependency is ok. 4. Dependents create new word tokens. 5. Extra word tokens allow raising. 6. But lowering may be ok too.

1. Word-word dependencies are real Do word-word dependencies exist (in our minds)? Why not? Compare social relations between individuals. What about phrases? But maybe only their boundaries are relevant? They're not classified, so no unary branching.

Punctuation marks boundaries At the end of the road, turn right. Not: At the end of the, road turn right. At the end, of the road turn right. At the end of the road turn right, How do we learn to punctuate if we can't recognise boundaries?

No unary branching If S → NP + VP, then even a one-word subject needs an NP node above the N ('Cows') and a VP above the V ('moo'). But if a verb's subject is simply a noun, the diagram needs only N 'Cows' and V 'moo'.

2. 'Deep' and 'surface' properties combine. Dependencies are relational concepts. Concepts record bundles of properties that tend to coincide e.g. 'bird': beak, flying, feathers, two legs, eggs 'mother': bearer, carer So one dependency has many properties: semantic, syntactic, morphosyntactic e.g. 'subject' ….

'subject' The typical subject is defined by: meaning (typically 'actor' or …); word order and/or case (typically before the verb, and/or nominative); agreement (typically the verb agrees with it); status (obligatory or optional, according to finiteness).
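A dependency as a bundle of default properties can be sketched as follows; the attribute names and the example override are my own, for illustration:

```python
# One dependency, many properties: a default 'subject' bundle.
subject = {
    "meaning": "actor",
    "position": "before the verb",
    "case": "nominative",
    "agreement": "verb agrees with it",
    "status": "obligatory if finite",
}

def specialise(default, overrides):
    """Default inheritance for property bundles: overrides win,
    everything else is inherited unchanged."""
    return {**default, **overrides}

# A hypothetical exceptional subtype, overriding one property:
imperative_subject = specialise(subject, {"status": "optional"})
```

The exceptional subtype keeps every inherited property it does not override, which is exactly how default inheritance lets 'deep' and 'surface' properties stay bundled even when one of them is out of harmony.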

So … Cognition suggests that 'deep' and 'surface' properties should be combined, not separated. They are in harmony by default, but exceptionally they may be out of harmony; this is allowed by default inheritance.

3. Mutual dependency is ok. Mutual dependency is formally impossible in standard notation, and formally impossible in phrase structure theory. So if it exists, we need to resist PS theory and change the standard notation.

Mutual dependency exists I wonder who came? Who is subject of came, so who depends on came. But who depends on wonder and came can be omitted: e.g. Someone came – I wonder who. So came depends on who.
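A sketch of why this matters formally: a function from word to single head (a tree) cannot hold both links at once, but a set of directed edges can. The edge direction (dependent, head) is my own convention:

```python
# Directed links run from dependent to head, for "I wonder who came".
edges = {
    ("I", "wonder"),
    ("who", "wonder"),
    ("who", "came"),   # who is the subject of came
    ("came", "who"),   # came can be omitted, so it depends on who
}

def heads(word):
    """All heads a word depends on."""
    return {h for d, h in edges if d == word}

# 'who' has two heads, and who/came form a cycle -- both are
# impossible in a single-head, acyclic (tree-shaped) structure.
mutual = "came" in heads("who") and "who" in heads("came")
```

So representing the example at all forces a general graph rather than a tree, which is the notational change the previous slide calls for.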

Standard notation A 'dominates' B, so A is written above B; the same diagram therefore cannot also show B 'dominating' A.

4. Dependents create new word tokens. General cognition: every exemplar needs a mental node; no node carries contradictory properties; so some exemplars need two nodes, e.g. when we re-classify things. NB we can remember both classifications.

What kind of bird? (diagram: an unknown bird B is-a 'bird'; on seeing its mate, a blackbird, we re-classify it via a new token B*)

And in language … (diagram: in I like …, the word like is-a LIKE-verb; recognising the subject I creates a new token like*). NB like* is a token of a token.

The effect of a dependent When we recognise a dependent for W, we change W into a new token W*. The classification of W* may change. W* also has a new meaning: normally a hyponym of W, but it may be idiomatic. If we add dependents singly, this gives a kind of phrase structure!
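The token-spawning step can be sketched as a function; the starred naming and the meaning-combination rule here are toy inventions, not the theory's own formalism:

```python
# Each added dependent spawns a new token with a narrower meaning,
# so adding dependents one at a time builds a phrase-structure-like
# chain of tokens without any phrasal nodes.
def add_dependent(token, dependent):
    """Return a new token: name gets a star, meaning narrows."""
    name, meaning = token
    return (name + "*", dependent + " " + meaning)

house = ("house", "house")
french_house = add_dependent(house, "French")
typical_french_house = add_dependent(french_house, "typical")
```

Two single additions yield house* ('French house') and then house** ('typical French house'), mirroring the next slides' example.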

typical French house (diagram: HOUSE has the meaning 'house'; with the dependent French, the new token means 'French house')

Notation (diagram: house* is the token of house with the dependent French; house** adds the dependent typical, giving 'typical French house')

5. Extra word tokens allow raising. (diagram: in it rains, it is the subject of rains; in it keeps raining, the token it* is the subject of keeps and also of the predicative raining)

Raising in the grammar A* is-a A, so A* wins. (diagram: a shared dependent B has a higher parent A* and a lower parent A)

6. But lowering may be ok too. Raising is helpful for processing: the higher parent is nearer to the sentence root. But sometimes lowering is helpful too, e.g. if it allows a new meaning-unit. Eine Concorde gelandet ist hier nie ('a Concorde landed has here never', i.e. a-Concorde-landing has never happened here).

German partial VP fronting (diagram: in Eine Concorde gelandet ist hier nie, the token Eine Concorde* is lowered from its higher parent ist to the lower parent gelandet)

Conclusions Language is just part of cognition. So syntactic dependencies are: psychologically real; rich (combining 'deep' and 'surface' properties); complex (e.g. mutual, multiple). And dependency combines with default inheritance and multiple tokens.