
Presentation on theme: "A SHORT GUIDE TO THE MEANING-TEXT LINGUISTIC THEORY JASMINA MILIĆEVIĆ DALHOUSIE UNIVERSITY - HALIFAX (CANADA) 2006, Journal of Koralex, vol. 8: 187-233."— Presentation transcript:


2 Contents
0. Introduction (1-2)
1. Postulates and methodological principle (2-4)
2. Meaning-Text models (4-6)
3. Illustration of the linguistic synthesis in the Meaning-Text framework (6-27)
4. Summary of MTT’s main features (27-30)
5. Basic Meaning-Text bibliography (30-36)

3 0. Introduction
MTT = theoretical framework for the construction of models of languages
Launched in Moscow (Žolkovskij & Mel’čuk 1967)
Developed in Russia, Canada, Europe
Formal character ⇒ computer applications
Relatively marginal

4 1. Postulate 1
“Natural language is (considered as) a many-to-many correspondence between an infinite denumerable set of meanings and an infinite denumerable set of texts.” (2)
{SemRi} ⇔ {PhonRj} | 0 < i, j < ∞
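Postulate 1 can be made concrete with a toy sketch: the same correspondence runs one meaning to many texts (synonymy) and one text to many meanings (ambiguity). The pairs below are hypothetical illustrations, not part of the theory's formalism.

```python
# Toy model of Postulate 1: a many-to-many correspondence between
# meanings (SemRs) and texts (PhonRs). The pairs are invented examples.
PAIRS = [
    ("'harsh criticism'", "harshly criticized"),
    ("'harsh criticism'", "leveled harsh criticism at"),  # one meaning, many texts (synonymy)
    ("'increase'", "raise"),
    ("'erect'", "raise"),                                 # one text, many meanings (ambiguity)
]

def texts(meaning):
    """All texts expressing a given meaning."""
    return [t for m, t in PAIRS if m == meaning]

def meanings(text):
    """All meanings expressible by a given text."""
    return [m for m, t in PAIRS if t == text]
```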

5 Postulate 2 “The Meaning-Text correspondence is described by a formal device which simulates the linguistic activity of the native speaker—a Meaning-Text Model.”(3)

6 Postulate 3 “Given the complexity of the Meaning-Text correspondence, intermediate levels of (utterance) representation have to be distinguished: more specifically, a Syntactic and a Morphological level.”(3)

7 Methodological principle “The Meaning-Text correspondence should be described in the direction of synthesis, i.e., from Meaning to Text (rather than in that of analysis, i.e., from Text to Meaning).” (3)

8 WHY?
1. Producing speech is an activity that is more linguistic than understanding speech;
2. Some linguistic phenomena can be discovered only from the viewpoint of synthesis (ex.: lexical co-occurrence = collocations).
Corollary: the study of paraphrases (and of the lexicon) occupies a central place in the M-T framework.

9 Paraphrase
Synonymy = fundamental semantic relation in natural language ⇒ “to model a language means to describe its synonymic means and the ways it puts them in use”.
Meaning = invariant of paraphrases
Text = “virtual paraphrasing”
Lexical paraphrase ⇒ semantic decomposition of lexical meanings

10 Semantic decomposition of ‘criticize’
(definiendum): ‘X criticizes Y for Z’ ≈ (definiens):
– ‘Y having done2.1 Z, which X considers2 bad2 for Y or other people1,
– and X believing3 that X has good1.1 reasons1.2 for considering2 Z bad2, ||
– X expresses3.1 X’s negative1.1 opinion1 of Y because of Z(Y),
– specifying what X considers2 bad2 about Z,
– with the intention2 to cause2 that people1 (including Y) do not do2.1 Z.’

11 2. Meaning-Text Models: Characteristics
Equative = transductive ≠ generative (Postulate 1)
Completely formalized (Postulate 2)
Stratificational model (Postulate 3)

12 MTM Architecture (Neuvel)

13 Representations (adapted from Mel’čuk 1988: 49)

14 2. MTM: peripheral structures
Reflect different characterizations of the central entity = provide additional information relevant at each level.
Peripheral: they do not exist independently of the central structure.
Purpose: to articulate the SemS into a specific message, by specifying the way it will be ‘packaged’ for communication.

15 Central and peripheral S / level of R
SemR = ⟨SemS; Sem-CommS, Sem-RhetS, Sem-RefS⟩
DSyntR = ⟨DSyntS; DSynt-CommS, DSynt-ProsS, DSynt-AnaphS⟩
SSyntR = ⟨SSyntS; SSynt-CommS, SSynt-ProsS, SSynt-AnaphS⟩
DMorphR = ⟨DMorphS; DMorph-ProsS⟩

16 2. MTM: rules

17 3. Illustration: Linguistic Synthesis
Synthesis: 1 SemR (×2) ⇒ 3 PhonR (×2)
SemR [1]: Theme = media ⇒ PhonR (1a, b, c)
SemR [2]: Theme = decision ⇒ PhonR (2a, b, c)

18 SemR’s central structure = SemS
A SemS represents the propositional meaning of a set of paraphrases.
SemS = network: nodes and arcs.
Nodes: labeled with semantemes.
Arcs: labeled with numbers (predicate-argument relations).
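The network just described can be sketched as data: nodes carry semantemes, arcs carry argument numbers. The encoding below is an assumption for illustration (not MTT's official notation), using the 'criticize' example where X = media, Y = government, Z = decision.

```python
# Minimal sketch of a SemS: a labeled network. Node ids and the
# tuple layout (source, arc number, target) are assumed conventions.
sems_nodes = {1: "'criticize'", 2: "'media'", 3: "'government'", 4: "'decision'"}
sems_arcs = [
    (1, 1, 2),  # arc 1 of 'criticize' -> 'media'       (X, the criticizer)
    (1, 2, 3),  # arc 2 of 'criticize' -> 'government'  (Y, the criticized)
    (1, 3, 4),  # arc 3 of 'criticize' -> 'decision'    (Z, the reason)
]

def arguments(pred):
    """Arguments of a predicate node, ordered by arc number."""
    return [tgt for src, num, tgt in sorted(a for a in sems_arcs if a[0] == pred)]
```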

19 SemS (example)

20 Peripheral structure Sem-CommS
Sem-CommS represents the communicative intent of the Speaker.
Formally, Sem-CommS = division of the SemS into communicative areas, each marked with one value from a set of mutually exclusive values.

21 Eight communicative oppositions
Thematicity = {Theme, Rheme, Specifier}
Givenness = {Given, New}
Focalization = {Focalized, Non-Focalized}
Perspective = {Backgrounded, Foregrounded, Neutral}
Emphasis = {Emphasized, Neutral}
Assertiveness = {Asserted, Presupposed}
Unitariness = {Unitary, Articulated}
Locutionality = {Communicated, Signaled, Performed}
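Since each communicative area carries exactly one value per opposition, the oppositions form a small table of mutually exclusive choices. A sketch (the labels follow the slide; the dictionary encoding is assumed):

```python
# The eight communicative oppositions as sets of mutually exclusive values.
OPPOSITIONS = {
    "Thematicity":   {"Theme", "Rheme", "Specifier"},
    "Givenness":     {"Given", "New"},
    "Focalization":  {"Focalized", "Non-Focalized"},
    "Perspective":   {"Backgrounded", "Foregrounded", "Neutral"},
    "Emphasis":      {"Emphasized", "Neutral"},
    "Assertiveness": {"Asserted", "Presupposed"},
    "Unitariness":   {"Unitary", "Articulated"},
    "Locutionality": {"Communicated", "Signaled", "Performed"},
}

def valid_marking(opposition, value):
    """A communicative area may carry exactly one value per opposition."""
    return value in OPPOSITIONS.get(opposition, set())
```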

22 Other peripheral Sem-structures Sem-RhetS represents the Speaker’s rhetorical intent. Sem-RefS = set of pointers from semantic configurations to the corresponding entities in the real world.

23 Theme: media
(1) a. [The media]T [harshly criticized the Government for its decision to increase income taxes]R
b. [The media]T [seriously criticized the Government for its decision to raise income taxes]R
c. [The media]T [leveled harsh criticism at the Government for its decision to increase income taxes]R

24 Theme = Media

25 Theme = government’s decision
(1) a. [The government’s decision to increase income taxes]T [was severely criticized by the media]R
b. [The government’s decision to raise income taxes]T [drew harsh criticism from the media]R
c. [The government’s decision to increase income taxes]T [came under harsh criticism from the media]R

26 Theme = government’s decision

27 Syntactic dependency Relation of strict hierarchy Characteristics: – Antireflexive – Antisymmetric – Antitransitive

28 Syntactic structure Tree Nodes labeled with lexical units; not linearly ordered Top node does not depend on any lexical unit in the structure, while all other units depend on it, directly or indirectly. Arcs (= branches) labeled with dependency relations
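The tree properties just listed (unordered nodes, a unique top node, a single governor per node) can be sketched as a small data structure. The class and ids below are assumptions for illustration, not MTT's official formalism:

```python
# Minimal sketch of a syntactic dependency tree: unordered nodes
# carrying lexical units, branches carrying relation names.
class DepTree:
    def __init__(self):
        self.nodes = {}      # node id -> lexical unit label
        self.governor = {}   # dependent id -> (governor id, relation name)

    def add_node(self, nid, label):
        self.nodes[nid] = label

    def attach(self, dep, gov, rel):
        # Each node has at most one governor: the structure is a tree.
        assert dep not in self.governor, "a node may have only one governor"
        self.governor[dep] = (gov, rel)

    def top(self):
        # The unique node that depends on no other lexical unit.
        tops = [n for n in self.nodes if n not in self.governor]
        assert len(tops) == 1, "a dependency tree has exactly one top node"
        return tops[0]
```

For sentence (1a), for instance, CRITICIZE would be the top node, with MEDIA attached by relation I and GOVERNMENT by relation II.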

29 DSyntS Nodes: labeled with deep lexical units (≠ pronouns and ‘structural words’) subscripted for all meaning-bearing inflections. Branches: labeled with names of deep syntactic dependency relations. Deep lexical unit = lexeme, (full) phraseme or name of a lexical function.

30 Lexical functions
LF = formal tools used to model lexical relations, i.e., restricted lexical co-occurrence (= collocations) and semantic derivation.
LFs have different lexical expressions contingent on the keyword.
An LF corresponds to a meaning whose expression is phraseologically bound by a particular lexeme L (= the argument, or keyword, of the LF).

31 Lexical functions: examples
Magn ≈ ‘intense/very’
– Magn(wind) = strong, powerful
– Magn(rain(N)) = heavy, torrential // downpour
– Magn(rain(V)) = heavily, cats and dogs
S1 ≈ ‘person/object doing L’
– S1(crime) = author, perpetrator [of ART ˷] // criminal
– S1(kill) = killer
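An LF is thus a function from a keyword to a set of bound lexical expressions, which suggests a two-level lookup. The entries below are copied from the slide; the dictionary layout itself is an assumed illustration:

```python
# Lexical functions as a lookup: (LF name, keyword) -> values.
# '//' marks fused values in the source; kept here as a string prefix.
LF = {
    ("Magn", "wind"):    ["strong", "powerful"],
    ("Magn", "rain(N)"): ["heavy", "torrential", "//downpour"],
    ("Magn", "rain(V)"): ["heavily", "cats and dogs"],
    ("S1", "crime"):     ["author", "perpetrator", "//criminal"],
    ("S1", "kill"):      ["killer"],
}

def apply_lf(name, keyword):
    """Values of an LF: phraseologically bound by the keyword."""
    return LF.get((name, keyword), [])
```

Note how the same LF (Magn) yields entirely different expressions for different keywords: the value cannot be predicted, only listed in the lexicon.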

32 Lexical functions: classification
1. According to their capacity to appear in the text alongside the keywords: syntagmatic (normally do) and paradigmatic (normally do not)
2. According to their generality/universality: standard (general/universal) and non-standard (neither general nor universal)
3. According to their formal structure: simple and complex

33 Examples
Magn: syntagmatic, standard, simple LF
S1: paradigmatic, standard, simple LF
A YEAR that has 366 days = leap [˷] = non-standard LF: it only applies to one keyword (year) and has just one value (leap); not universal (not valid cross-linguistically)
CausPredPlus: complex LF

34 LFs realized in (1) and (2)
Magn(criticize) = bitterly, harshly, seriously, strongly // blast
Magn(criticism) = bitter, harsh, serious, severe, strong
CausPredPlus(taxes) = increase, raise
S0(criticize) = criticism
S0(decide) = decision
Oper1(criticism) = level [˷ at N | N denotes a person], raise [˷ against N], voice [˷]
Oper2(criticism) = come [under ˷], draw [˷ from N], meet [with ˷]

35 Deep lexical units Do not correspond one-to-one to the surface lexemes: in the transition towards surface syntax, some deep lexical units may get deleted or pronominalized and some surface lexemes may be added.

36 12 Deep-Syntactic Relations
6 actantial DSyntRels (I, II, III, …, VI) + 1 DSyntRel for representing direct speech (= variant of DSyntRel II)
2 attributive DSyntRels: ATTRrestr(ictive) and ATTRqual(ificative)
1 appenditive DSyntRel (APPEND): links the Main Verb to ‘extra-structural’ sentence elements (sentential adverbs, interjections, …)
2 coordinative DSyntRels: COORD and QUASI-COORD

37 DSyntR – (1a)

38 DSyntR – (1b)

39 DSyntR – (1c)

40 Semantic module: correspondence rules Lexicalization rules Morphologization rules Arborization rules Communicative rules Prosodic rules

41 SemR[1]  DSyntRs (1a) and (1b) Figure 10: A lex.-funct. rule Figure 11: Arbor. rule 1 Figure 12: Arbor. rule 2

42 Semantic module: equivalence rules = paraphrasing rules
1. Semantic equivalence rules ⇒ equivalence between (fragments of) 2 SemRs
2. Lexico-syntactic rules: formulated in terms of lexical functions ⇒ equivalence between (fragments of) 2 DSyntRs.
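One classic lexico-syntactic rule replaces a verb by a support-verb construction: V ≈ Oper1(S0(V)) + S0(V), which turns criticize into level criticism (at), linking (1a) to (1c). A sketch under assumed encoding (the LF values follow the deck's examples; the function names are hypothetical):

```python
# Sketch of one lexico-syntactic paraphrasing rule stated with LFs:
#   V  ≈  Oper1(S0(V)) + S0(V)   (verb -> support verb + deverbal noun)
S0 = {"criticize": "criticism", "decide": "decision"}
OPER1 = {"criticism": "level"}

def verb_to_support_construction(verb):
    noun = S0[verb]        # S0(criticize) = criticism
    support = OPER1[noun]  # Oper1(criticism) = level
    return support, noun   # realized as "level criticism (at ...)"
```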

43 Ex.: lexico-syntactic equivalence rule

44 From DSyntR to SSyntR: the Deep-Syntactic module
SSyntS: dependency tree; nodes labeled with actual lexemes; branches labeled with names of language-specific surface-syntactic dependency relations.
DSyntS ≠ SSyntS:
1. Lexically: only semantically full lexemes vs. all lexemes (including full and structural words + pronouns)
2. Syntactically: only universal dependency relations vs. language-specific dependency relations

45 DSyntR / SSyntR (1a)

46 SSyntR (1b)

47 SSyntR (1c)

48 Deep-Syntactic module: major types of rules
1. Phrasemic rules
2. Deep-Syntactic rules
3. Pronominalization rules
4. Ellipsis rules
5. Communicative rules
6. Prosodic rules

49 6 phrasemic rules (1a-c)
SSyntS (1a)
– 1) Magn(CRITICIZE) ⇒ harshly;
– 2) CausPredPlus(TAXES) ⇒ increase
SSyntS (1b)
– 3) Magn(CRITICIZE) ⇒ seriously;
– 4) CausPredPlus(TAXES) ⇒ raise
SSyntS (1c)
– 5) Oper1(CRITICISM) ⇒ level;
– 6) Magn(CRITICISM) ⇒ harsh

50 Constraints: examples (3) a. The media raised harsh criticism against the Government for its decision to impose higher taxes. / The media leveled harsh criticism at the Government for its decision to impose higher taxes. b. The media raised harsh criticism against the Government’s decision to impose higher taxes. vs. *The media leveled harsh criticism at the Government’s decision to impose higher taxes. (4) ?The media raised harsh criticism against the Government for its decision to raise taxes.

51 DSynt-rule 1 (1a – 1b)

52 DSynt-rule 2 (1a-1b)

53 From SSyntR to DMorphR: the Surface-Syntactic Module DMorphS = string of fully ordered lexemes subscripted with all inflectional values DMorph-ProsS = specification of semantically + syntactically induced prosodies

54 DMorphRs (1) Sentence (1a) – THE MEDIApl | HARSHLY CRITICIZEact, ind, past, 3(?)sg THE GOVERNMENTsg || FOR ITSsg DECISIONsg | TO INCREASEinf INCOMEsg TAXpl ||| Sentence (1b) – THE MEDIApl || SERIOUSLY CRITICIZEact, ind, past, 3 (?)sg THE GOVERNMENTsg, possessive DECISIONsg | TO RAISEinf INCOMEsg TAXpl ||| Sentence (1c) – THE MEDIApl | LEVELact, ind, past, 3 (?)sg HARSH CRITICISMsg AT THE GOVERNMENTsg || FOR ITS DECISIONsg |TO INCREASEinf INCOMEsg TAXpl |||

55 SSynt-module: major types of rules
1. Linearization rules
– Local (and semi-local):
(5) a. [the government’s] [decision] [to increase] [taxes]
b. [[the Government’s decision]complex ph. [to increase taxes]complex ph.]complex ph.
– Global
2. Morphologization rules
3. Prosodization rules
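The local linearization step in (5) can be sketched as placing each dependent phrase on the left or right of its head and concatenating. The procedure below is an assumed illustration, not the actual SSynt-rule format:

```python
# Sketch of local linearization: side-annotated dependents of a head
# are ordered into a phrase. Data follows example (5).
deps = {"decision": [("the government's", "left"),
                     ("to increase taxes", "right")]}

def linearize(head):
    """Order a head with its dependent phrases: left ones, head, right ones."""
    left = [p for p, side in deps.get(head, []) if side == "left"]
    right = [p for p, side in deps.get(head, []) if side == "right"]
    return " ".join(left + [head] + right)
```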

56 Example: local linearization rule (1c)

57 4. Main features of the MTT
1. Globality, descriptive orientation
2. Semantic bases and synthesis orientation, essential role of the paraphrase and of communicative organization
3. Strong emphasis on the lexicon
4. Relational approach to language: the use of dependencies at all levels of linguistic description
5. Formal character
6. Stratificational and modular organization of MTMs
7. Implementability: the MTT lends itself well to computer applications

58 5.7 Computational Linguistics and NLP Applications
Apresjan, Ju. et al. (2003). ETAP-3 Linguistic Processor: a Full-Fledged Implementation of the MTT. In: Kahane, S. & Nasr, A., eds. (2003), 279-288.
– (1992). Lingvističeskii processor dlja složnyx informacionnyx sistem [A Linguistic Processor for Complex Information Systems]. Moskva: Nauka.
– (1989). Lingvističeskoe obespečenie sistemy ÈTAP-2 [Linguistic Software for the System ETAP-2]. Moskva: Nauka.
Apresjan, Ju. & Tsinman, L. (1998). Perifrazirovanie na kompjutere [Paraphrasing on the Computer]. Semiotika i informatika 36, 177-202.
Boguslavskij, I., Iomdin, L. & Sizov, V. (2004). Multilinguality in ETAP-3: Reuse of Linguistic Resources. In: Proceedings of the Conference Multilingual Linguistic Resources, 20th International Conference on Computational Linguistics, Geneva 2004, 7-14.

59 5.7 Computational Linguistics and NLP Applications
Boyer, M. & Lapalme, G. (1985). Generating Paraphrases from Meaning-Text Semantic Networks. Montreal: Université de Montréal.
CoGenTex (1992). Bilingual Text Synthesis System for Statistics Canada Database Reports: Design of Retail Trade Statistics (RTS) Prototype. Technical Report 8. CoGenTex Inc., Montreal.
Iordanskaja, L., Kim, M., Kittredge, R., Lavoie, B. & Polguère, A. (1992). Generation of Extended Bilingual Statistical Reports. In: COLING-92, Nantes, 1019-1022.
Iordanskaja, L., Kim, M. & Polguère, A. (1996). Some Procedural Problems in the Implementation of Lexical Functions for Text Generation. In: Wanner, L., ed. (1996), 279-297.

60 5.7 Computational Linguistics and NLP Applications Iordanskaja, L., Kittredge, R. & Polguère, A. (1991). Lexical Selection and Paraphrase in a Meaning-Text Generation Model. In: Paris, C. L., Swartout, W. R. & Mann, W. C., eds., Natural Language Generation in Artificial Intelligence and Computational Linguistics. Boston: Kluwer, 293-312. Iordanskaja, L. & Polguère, A. (1988). Semantic Processing for Text Generation. In: Proceedings of the First International Computer Science Conference-88, Hong Kong, 19-21 December 1988, 310-318. Kahane, S. & Mel’čuk, I. (1999). Synthèse des phrases à extraction en français contemporain (Du graphe sémantique à l’arbre de dépendance). T.A.L., 40:2, 25-85. Kittredge, R. (2002). Paraphrasing for Condensation in Journal Abstracting. Journal of Biomedical Informatics 35: 4, 265-277.

61 Bibliography
MILIĆEVIĆ, Jasmina (2006): « A Short Guide to the Meaning-Text Linguistic Theory », Journal of Koralex, vol. 8: 187-233.
NEUVEL, Sylvain: Linguistic Theories > Meaning-Text Linguistics > Introduction (accessed 8/5/2011)

