
Anaphora, Discourse and Information Structure. Oana Postolache, EGK Colloquium, April 29, 2004.


Slide 1: Anaphora, Discourse and Information Structure. Oana Postolache (oana@coli.uni-sb.de), EGK Colloquium, April 29, 2004.

Slide 2: Overview
- Anaphora Resolution
- Discourse (parsing)
- Balkanet
- Information Structure
Joint work with Prof. Dan Cristea & Prof. Dan Tufis, Univ. of Iasi.

Slide 3: Anaphora Resolution
“If an incendiary bomb drops next to you, don’t lose your head. Put it in a bucket and cover it with sand.” Ruslan Mitkov (p.c.)

Slide 4: Anaphora Resolution
“Anaphora represents the relation between a term (named anaphor) and another (named antecedent), when the interpretation of the anaphor is somehow determined by the interpretation of the antecedent.” Barbara Lust, Introduction to Studies of Anaphora Acquisition, D. Reidel, 1986.

Slide 5: Anaphora Resolution Types
- Coreference resolution: the anaphor and the antecedent refer to the same entity in the real world. (Three blind mice, three blind mice. See how they run!)
- Functional anaphora resolution: the anaphor and the antecedent refer to two distinct entities that are in a certain relation. (When the car stopped, the driver got scared.)
Halliday & Hasan 1976.

Slide 6: Types of Coreference
- Pronominal coreference: The butterflies were dancing in the air. They offered an amazing coloured show.
- Common nouns with different lemmas: Amenophis IV's wife was looking through the window. The beautiful queen was sad.
- Common nouns with different lemmas and number: A patrol was marching in the street. The soldiers were very well trained.
- Proper names: The President of the U.S. gave a very touching speech. Bush talked about the antiterrorist war.
- Appositions: Mrs. Parson, the wife of a neighbour on the same floor, was looking for help.
- Nominal predicates: Maria is the best student of the whole class.
- Function-value coreference: The visitors agreed on the ticket price. They concluded that $100 was not that much.

Slide 7: RARE – Robust Anaphora Resolution Engine
[Architecture diagram: the input text and several anaphora-resolution models (AR-model 1, 2, 3) feed the RARE engine, which outputs coreference chains.]

Slide 8: RARE: Two Main Principles
1. Coreferential relations are semantic, not textual. [Diagram: on the text layer, expression a proposes a center a on the semantic layer; a later expression b evokes the same center a, so the coreferential anaphoric relation holds between semantic entities, not between text spans.]

Slide 9: RARE: Two Main Principles
2. Processing is incremental. [Diagram: each referential expression RE_a, RE_b on the text layer projects a structure PS_a, PS_b on the projection layer; PS_a proposes center a on the semantic layer, and PS_b later evokes the same center a.]

Slide 10: Terminology
[Three-layer diagram: referential expressions (RE_a ... RE_x) on the text layer; projected structures (PS_x) on the projection layer; discourse entities (DE_1 ... DE_m) on the semantic layer.]

Slide 11: What Is an AR-model?
[The same three-layer diagram, annotated with the four components of an AR-model: knowledge sources, primary attributes, heuristics/rules, and the domain of referential accessibility.]

Slide 12: Primary Attributes
1. Morphological: number, lexical gender, person.
2. Syntactic: REs as constituents of a syntactic tree; quality of being adjunct, embedded or complement of a preposition; inclusion or not in an existential construction; syntactic patterns in which the RE is involved.
3. Semantic and lexical: the RE's head position in a conceptual hierarchy, animacy, sex/natural gender, concreteness, inclusion in a synonymy class, semantic roles.
4. Positional: the RE's offset in the text, inclusion in a discourse unit.
5. Surface realisation: zero/clitic/full/reflexive/possessive/demonstrative/reciprocal pronoun, expletive "it", bare noun, indefinite NP, definite NP, proper noun.
6. Other: domain concept, frequency of the term in the text, occurrence of the term in a heading.
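A projected structure's primary attributes can be pictured as a plain record. The sketch below is illustrative only: the field names and defaults are invented for this example and are not RARE's actual schema.

```python
# Hypothetical record for a projected structure's primary attributes;
# the field names are illustrative, not RARE's actual schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrimaryAttributes:
    # morphological
    number: Optional[str] = None          # "sg" / "pl"
    lexical_gender: Optional[str] = None
    person: Optional[int] = None
    # semantic and lexical
    animacy: Optional[bool] = None
    synonymy_class: Optional[str] = None
    # positional
    offset: int = 0
    discourse_unit: Optional[int] = None
    # surface realisation
    realisation: str = "definite NP"      # e.g. "clitic pronoun", "proper noun"

# A third-person singular pronoun projection:
it = PrimaryAttributes(number="sg", person=3, realisation="full pronoun")
print(it.number, it.realisation)
```

Keeping the attributes in one record is what lets different knowledge sources fill in different fields independently, as the next slide describes.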

Slide 13: Knowledge Sources
A knowledge source is a (virtual) processor able to supply values for attributes on the projection layer. Minimum set: POS-tagger + shallow parser.

Slide 14: Matching Rules
- Certifying rules (applied first): certify a candidate without ambiguity.
- Demolishing rules (applied afterwards): rule out a candidate.
- Scored rules: increase/decrease a resolution score associated with an anaphor-candidate pair.
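The three rule families amount to a filter-then-score pipeline. The sketch below illustrates that pipeline with made-up rules (name identity, number/gender agreement, animacy, recency); the slide does not list RARE's actual rule inventory, so treat these as stand-ins.

```python
# Sketch of the certify -> demolish -> score pipeline; the concrete rules
# here are illustrative stand-ins, not RARE's actual rule inventory.
# Anaphors and candidates are plain dicts of primary attributes.

def resolve(anaphor, candidates):
    # 1. Certifying rules: an unambiguous match short-circuits resolution
    #    (e.g. a repeated proper name with the same lemma).
    certified = [c for c in candidates
                 if c.get("proper_name") and c.get("lemma") == anaphor.get("lemma")]
    if len(certified) == 1:
        return certified[0]

    # 2. Demolishing rules: rule out incompatible candidates
    #    (here: number must match, gender must not clash).
    viable = [c for c in candidates
              if c.get("number") == anaphor.get("number")
              and c.get("gender") in (None, anaphor.get("gender"))]

    # 3. Scored rules: adjust a resolution score per anaphor-candidate pair
    def score(c):
        s = 1.0 if c.get("animate") == anaphor.get("animate") else -0.5
        s -= 0.1 * (anaphor["offset"] - c["offset"])   # prefer recent mentions
        return s

    return max(viable, key=score, default=None)
```

For instance, resolving a singular masculine pronoun against a singular and a plural candidate demolishes the plural one and scores the rest.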

Slide 15: Domain of Referential Accessibility
Filter and order the candidate discourse entities:
a. Linearly: Dorepaal, Mitkov, ...
b. Hierarchically: Grosz & Sidner; Cristea, Ide & Romary, ...

Slide 16: The Engine
for_each RE in RESequence:
    projection(RE)
    proposing/evoking(PS)
    completion(DE, PS)
    re-evaluation

Slide 17: The Engine: Projection
[The same loop, with projection(RE) highlighted. Diagram: the knowledge sources fill in the primary attributes of PS_x, the structure projected on the projection layer by the current expression RE_x.]

Slide 18: The Engine: Proposing
[The same loop, with proposing/evoking(PS) highlighted. Diagram: the heuristics/rules compare PS_x against the discourse entities within its domain of referential accessibility.]

Slide 19: The Engine: Proposing (2)
Within the proposing/evoking(PS) step:
- apply certifying rules
- apply demolishing rules
- apply scored rules
- sort candidates in descending order of scores
- use thresholds to either propose a new DE, link the current PS to an existing DE, or postpone the decision
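The threshold step boils down to a three-way decision on the best candidate's score. A minimal sketch, with the caveat that the two cutoff values below are invented for illustration, not RARE's tuned parameters:

```python
# Hypothetical threshold logic for the proposing/evoking step; the
# cutoffs LINK_THRESHOLD and NEW_THRESHOLD are illustrative values.
LINK_THRESHOLD = 0.5    # at or above: evoke (link PS to an existing DE)
NEW_THRESHOLD = -0.5    # at or below: propose a brand-new DE

def decide(best_score):
    if best_score >= LINK_THRESHOLD:
        return "link"        # the current PS evokes an existing DE
    if best_score <= NEW_THRESHOLD:
        return "propose"     # the current PS proposes a new DE
    return "postpone"        # ambiguous: defer until more text is processed

print(decide(0.9), decide(-0.9), decide(0.0))
```

The middle band is what makes postponing possible: an ambiguous pair is neither linked nor declared new until later evidence pushes its score past a threshold.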

Slide 20: The Engine: Completion
[The same loop, with completion(DE, PS) highlighted. Diagram: the winning discourse entity DE_n is completed with the content of the current projected structure PS_x.]

Slide 21: The Engine: Completion (2)
[Diagram: after completion, the projected structure has been merged into DE_n on the semantic layer.]

Slide 22: The Engine: Re-evaluation
[The same loop, with re-evaluation highlighted. Diagram: earlier decisions, such as the link between PS_d and DE_n, are reconsidered in the light of the new information.]

Slide 23: The Engine: Re-evaluation (2)
[Diagram: the revised layer configuration after re-evaluation.]

Slide 24: The Coref Corpus
- Four chapters from George Orwell's novel "1984", summing up to approx. 19,500 words.
- Preprocessed using a POS-tagger and an FDG-parser.
- The NPs were automatically extracted from the FDG structure (some manual corrections were necessary, as well as adding other types of referential expressions).
- Manual annotation of the coreferential links (each text was assigned to two annotators).
- Interannotator agreement: as low as 60%.
- Our annotation is conformant with MUC & ACE.

Slide 25: The Coref Corpus

                               Text 1   Text 2   Text 3   Text 4    Total
No. of sentences                  311      175      169      328      983
No. of words                     6935     3317     3260     6008    19520
No. of REs                       1942      914      916     1702     5472
Avg. no. of REs per sentence      6.2      5.2      5.4      5.1      5.4
Pronouns                          645      281      362      614     1902
No. of DEs                        921      520      464      863

Slide 26: Evaluation
Success rate = number of correctly solved anaphors / number of all anaphors (Mitkov 2000).
For the four texts we obtained values between 60% and 70%.

Slide 27: Road Map
- Anaphora Resolution
- Discourse (parsing)
- Balkanet
- Information Structure

Slide 28: Discourse Parsing
Input: plain text.
Goal:
- Automatically obtain a discourse structure of the text (resembling RST trees).
- Apply Veins Theory (Cristea, Ide & Romary 1998) to produce focused summaries.

Slide 29: Veins Theory: Quick Intro (Cristea, Ide & Romary 1998)
Head expression: the sequence of the most important units within the corresponding span of text.
Vein expression: the sequence of units that are required to understand the span of text covered by the node, in the context of the whole discourse.
[Diagram: a binary discourse tree over units 1-5, with head expressions such as H = 1 3 5 at the root and H = 1 3 below it, and vein expressions at the leaves such as V = 1 3 5, V = 1 2 3 5, and V = 1 3 4 5.]
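The head/vein computation can be sketched with two tree walks: heads bottom-up (a node's head is the union of its nuclear children's heads), veins top-down (a nuclear child inherits its parent's vein; a satellite adds its own head to it). This is a simplified sketch, assuming a hypothetical tree encoding and ignoring the left/right-satellite parenthesization refinements of the full theory; the five-unit example matches the slide's diagram.

```python
# Simplified sketch of Veins Theory head/vein computation; the Node
# encoding is hypothetical and the left/right-satellite refinements
# of Cristea, Ide & Romary (1998) are omitted.

class Node:
    def __init__(self, label=None, children=None, nuclear=True):
        self.label = label              # unit id for leaves, None otherwise
        self.children = children or []
        self.nuclear = nuclear          # nuclearity w.r.t. the parent relation
        self.head = None
        self.vein = None

def compute_heads(node):
    if not node.children:
        node.head = {node.label}
    else:
        node.head = set()
        for c in node.children:
            compute_heads(c)
            if c.nuclear:               # only nuclear children contribute
                node.head |= c.head

def compute_veins(node, parent_vein=None):
    if parent_vein is None:             # root: vein = head
        node.vein = set(node.head)
    elif node.nuclear:                  # nuclear child inherits parent's vein
        node.vein = set(parent_vein)
    else:                               # satellite: own head + parent's vein
        node.vein = node.head | parent_vein
    for c in node.children:
        compute_veins(c, node.vein)

# Five-unit example: (1 nuc, 2 sat), 3 nuc, (4 sat, 5 nuc)
u = {i: Node(label=i) for i in range(1, 6)}
u[2].nuclear = False
u[4].nuclear = False
root = Node(children=[Node(children=[Node(children=[u[1], u[2]]), u[3]]),
                      Node(children=[u[4], u[5]])])
compute_heads(root)
compute_veins(root)
print(sorted(root.head))    # [1, 3, 5]
print(sorted(u[2].vein))    # [1, 2, 3, 5]
print(sorted(u[4].vein))    # [1, 3, 4, 5]
```

Even these simplified rules reproduce the slide's figures: unit 2's vein is 1 2 3 5 and unit 4's is 1 3 4 5.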

Slide 30: Focused Summaries
We call a focused summary on an entity X a coherent excerpt presenting how X is involved in the story that constitutes the content of the text. It is given by the vein expression of the unit to which X belongs.
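Once the vein expressions are available, extracting a focused summary is just selecting the units in the vein of X's unit. A minimal sketch; the unit texts and vein sets below are made up for illustration:

```python
# Minimal sketch: a focused summary on entity X is the text of the units
# in the vein expression of the unit where X occurs. The unit texts and
# vein sets here are invented for illustration.

def focused_summary(entity_unit, veins, unit_texts):
    """veins: unit id -> set of unit ids (its vein expression)."""
    return " ".join(unit_texts[u] for u in sorted(veins[entity_unit]))

unit_texts = {1: "Winston entered the room.",
              2: "It was cold.",
              3: "He sat down."}
veins = {1: {1, 3}, 2: {1, 2, 3}, 3: {1, 3}}
print(focused_summary(2, veins, unit_texts))
# Winston entered the room. It was cold. He sat down.
```

Because a vein always contains the units needed to interpret its own unit, the resulting excerpt is coherent by construction.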

Slide 31: The Method
[Pipeline diagram: plain text goes into the FDG parser; the NP detector feeds the AR-engine (producing tagged corefs), and the segments detector feeds the sentence tree extractor (producing s-trees); both outputs feed the discourse parser, whose discourse structure is turned into a focused summary via Veins Theory.]

Slide 32: The Method — FDG parser
The Conexor FDG parser: http://www.connexor.com/m_syntax.html

Slide 33: The Method — NP detector
Extracts NPs from the FDG structure.

Slide 34: The Method — AR-engine
The AR-engine is RARE (described above).

Slide 35: The Method — segments detector
Detects the boundaries of clauses, based on learning methods. Georgiana Puscasu (2004): A Multilingual Method for Clause Splitting.

Slide 36: The Method — sentence tree extractor
- Proposes one or more tree structures at the sentence level.
- The leaves are the clauses previously detected.
- Uses the FDG structure and the cue phrases.

Slide 37: The Method
[The pipeline diagram repeated, leading into the discourse parser.]

Slide 38: The Discourse Parser
- We have trees for each sentence.
- The goal is to incrementally integrate these trees into a single structure corresponding to the entire text.
- The current tree is inserted at each node on the right frontier; each resulting structure is scored considering the coreference links, Centering Theory, and Veins Theory.
Cristea, Postolache, Pistol (2004): Summarization through Discourse Structure (submitted to COLING).
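The right-frontier insertion step can be sketched compactly if a discourse tree is encoded as either a clause id (leaf) or a ("rel", left, right) tuple: the attachment sites are the root plus, recursively, every node down the rightmost branch. The scoring function below is a stub standing in for the coreference/Centering/Veins criteria on the slide, and the tuple encoding is an assumption of this sketch.

```python
# Sketch of incremental right-frontier attachment. A tree is a clause id
# (leaf) or a ("rel", left, right) tuple; score() is a stub for the
# coreference / Centering Theory / Veins Theory criteria on the slide.

def candidates(tree, new):
    """Every tree obtained by adjoining `new` at a right-frontier node."""
    yield ("rel", tree, new)              # adjoin above the current node
    if isinstance(tree, tuple):           # internal node: recurse rightwards
        rel, left, right = tree
        for r in candidates(right, new):
            yield (rel, left, r)

def leaves(t):
    return [t] if not isinstance(t, tuple) else leaves(t[1]) + leaves(t[2])

def score(t, depth=0):
    # Stub: favour flat structures. A real parser would score coreference
    # links, Centering transitions and vein-based referential accessibility.
    if not isinstance(t, tuple):
        return -depth
    return score(t[1], depth + 1) + score(t[2], depth + 1)

def parse(sentence_trees):
    tree = sentence_trees[0]
    for nxt in sentence_trees[1:]:
        tree = max(candidates(tree, nxt), key=score)   # keep the best structure
    return tree

t = parse([1, 2, 3, 4])
print(leaves(t))    # clause order is preserved: [1, 2, 3, 4]
```

Note that attachment only ever happens on the right frontier, so the left-to-right order of clauses is preserved in every candidate, and the number of candidates at each step equals the length of the current right frontier.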

Slide 39: The Discourse Parser (2)
- At the end of the process: a set of trees corresponding to the input text, each with a score.
- T* = argmax_i score(T_i)
- Compute Veins(T*).
- Extract the summary.

Slide 40: Discussion & Evaluation
- We do obtain coherent summaries automatically!
- How to evaluate? We have 90 summaries made by humans...
1. Construct a golden summary out of the 90 summaries and compare it with the system output?
2. Compare the system output with all 90 summaries and take the best result?

Slide 41: Road Map
- Anaphora Resolution
- Discourse (parsing)
- Balkanet
- Information Structure

Slide 42: Information Structure
Many approaches to IS:
- Prague School approach (Sgall et al.);
- Formal account of English intonation (Steedman);
- Integrating different means of IS realization within one grammar framework (Kruijff);
- Formal semantics of focus (Krifka, Rooth);
- Formal semantics of topic (Hendriks);
- Integrating IS within a theory of discourse interpretation (Vallduvi);
- IS-sensitive discourse context updating (Kruijff-Korbayova).

Slide 43: Information Structure
Goals:
- Improve/create/enlarge a corpus annotated with IS (and not only);
- Investigate means of continuing the annotation (at least partially) automatically;
- Investigate how the (major) NLP tasks can benefit from IS;
- Find correlations between different features;
- Build a system that detects IS.

Slide 44: Summary
- Anaphora Resolution: RARE
- Discourse Parsing: Veins Theory
- Balkanet: multilingual WordNet
- Information Structure

Slide 45: References
Postolache, Oana. 2004. "A Coreference Resolution Model on Excerpts from a Novel". ESSLLI'04, to appear.
Postolache, Oana. 2004. "RARE: Robust Anaphora Resolution Engine". M.Sc. thesis, Univ. of Iasi.

