
1 INTRODUCTION Most of the interesting language applications would require obtaining a representation of the meaning of sentences.

2 Predicate-argument structure The predicate-argument structure describes the semantic relations that hold among the entities appearing in the sentence: who does what to whom; how, where, why?

3 Predicate-argument structure I eat sushi. PRED: eat; ARG1: I; ARG2: sushi.

4 A more complex example In complex sentences we have more than one proposition. –Mary loves the man who bought the blue car –P1: Mary loves the man. –P2: The man bought the car. –P3: blue car.

5 What is phrase structure for? To obtain the predicate-argument structure it is necessary to first compute the phrase structure, or at least a dependency structure. The phrase structure and the corresponding 'rules' are what make language compositional.

6

7

8 But is this true?

9 Summary History (to understand the goals of the theories). Why phrase structures? (problems) Why dependency grammars? (problems) Why a probabilistic approach? (brute force vs. theory) Current state of our model and future research.

10 A preliminary question: how can results be improved? By increasing the training set size? With more efficient statistical methods? Or by improving the theories?

11 History Grammars as computational theories

12 Cognition is computation. A grammar is a form of computation.

13 Computational theories (Marr, 1982) What is the goal of the computation? Why is it appropriate? What is the logic of the strategy by which it can be carried out?

14 Chomsky's Goal A syntactic theory aims to explain the capacity of speakers to judge well-formed sentences as acceptable (or to 'generate' them) and to rule out ill-formed ones.

15 Justification Syntax is independent of semantics. Speakers can judge as well- or ill-formed new sentences that they have never heard before.

16 What is the origin of phrases? It is not semantic. An NP is not an NP because it corresponds to a semantic argument. It is an NP on purely syntactic grounds: regularities in the distribution of words within sentences. Tests determine what is a constituent (a phrase) and what is not.

17 Constituency Tests “Tests of constituency are basic components of the syntactician’s toolbox. By investigating which strings of words can and cannot be moved, deleted, coordinated or stand in coreference relations, it is possible to draw inferences about the internal structure of sentences.” (Phillips, 1998, p. 1)

18 Chomsky assumed that, given the independence of syntax, a theory of syntax can be developed without a semantic theory, ignoring the mapping process and pursuing only the well-formedness goal.

19 Mapping Goal A syntactic theory aims to explain the capacity of native speakers to map sentences onto the corresponding conceptual representations and vice versa.

20 Mapping Goal The mapping goal tries to figure out how linguistic expressions can be mapped onto the corresponding propositional representations in the simplest and most direct way.

21 Mapping Goal (3.a) IBMP gave the company the patent. (3.b) PRED: gave; ARG1: IBMP; ARG2: the patent; ARG3: the company. (4.a) Low prices. (4.b) PRED: low; ARG1: prices.

22 Well-Formedness Goal (3.a) IBMP gave the company the patent. (3.b)** IBMP company gave the the patent. (4.a) Low prices. (4.b)** Prices low

23 Direct mapping The carpenter gave the nurse the book. PRED: gave; ARG1: the carpenter; ARG2: the book; ARG3: the nurse.

24 Mapping can be direct in simple expressions This is true for simple sentences. Culicover, Peter W. and Andrzej Nowak. Dynamical Grammar. Volume Two of Foundations of Syntax. Oxford University Press, 2003. Roger Schank and collaborators in the 1970s.

25 [CoNLL-style semantic role annotation of PTB sentence 5941: "Mr. Nakamura cites the case of a customer who wants to build a giant tourism complex in Baja and has been trying for eight years to get around Mexican restrictions on foreign ownership of beachfront property." The phrase "a customer" is the A0 of the four predicates want, build, try, and get: four non-local dependencies (4 NLDs).]

26 Direct mapping For Culicover, this is not possible in more complex sentences. –Mary loves the man who bought the blue car –P1: Mary loves the man. –P2: The man bought the car. –P3: blue car.

27 Direct mapping Is it really not possible? –Mary loves the man who bought the blue car –P1: PRED: loves; ARG1: Mary; ARG2: the man. –P2: PRED: bought; ARG1: the man; ARG2: the car. –P3: PRED: blue; ARG1: car.
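As a concrete illustration, a minimal sketch (in Python, with hypothetical names; this is not the D-SemMap code) of such a multi-proposition representation:

```python
from dataclasses import dataclass, field

@dataclass
class Proposition:
    """One predicate with its numbered arguments (ARG1, ARG2, ...)."""
    pred: str
    args: dict = field(default_factory=dict)

# "Mary loves the man who bought the blue car" as three propositions;
# repeating an entity across propositions ("the man") links them.
meaning = [
    Proposition("loves", {"ARG1": "Mary", "ARG2": "the man"}),
    Proposition("bought", {"ARG1": "the man", "ARG2": "the car"}),
    Proposition("blue", {"ARG1": "car"}),
]

for p in meaning:
    print(p.pred, p.args)
```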

28 Why phrase structures? Why dependency grammars?

29 They are not necessary Compositionality can be achieved without computing phrase structure. A direct mapping to the predicate-argument structure can be performed without computing either a dependency structure or a phrase structure. This considerably simplifies the parsing process and the treatment of ambiguity.

30 Topics to examine in depth Why phrase structures? Why dependency grammars? Why a probabilistic approach? (at least the 'quick-and-dirty brute force' version)

31 D-SemMap V1.0

32 Vectors and propositions A proposition can be represented by a vector of features (Hinton, 1981). In order to represent the proposition, the vector is divided into "slots". Each element of the proposition is represented in one slot.
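A minimal sketch of the idea, assuming a made-up feature inventory (the real model's features are the POS tags and semantic classes shown on the next slides):

```python
import numpy as np

FEATURES = ["action", "human", "artifact", "entity"]  # hypothetical inventory
WIDTH = len(FEATURES)
N_SLOTS = 4  # SLOT 0 holds the predicate, SLOTS 1-3 the arguments

def encode(slot_fillers):
    """Concatenate one feature bundle per slot into a single vector."""
    vec = np.zeros(N_SLOTS * WIDTH)
    for slot, feats in slot_fillers.items():
        for f in feats:
            vec[slot * WIDTH + FEATURES.index(f)] = 1.0
    return vec

# "Mary drives a bus": drives -> SLOT 0, Mary -> SLOT 1, a bus -> SLOT 2.
v = encode({0: ["action"], 1: ["human", "entity"], 2: ["artifact", "entity"]})
print(v.reshape(N_SLOTS, WIDTH))  # one row per slot
```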

33 Vectors and propositions [Figure: slot-based encoding of "Mary drives a bus". Module 1 fills SLOT 0 to SLOT 3 (plus Types & Backs) with POS features (V MA, N, DT, N); Module 2 fills the same slots with semantic classes (action, human, artifact, entity).]

34 [Figure: Module 1 network. Input layer: input word. Output layer: Slot 0, Slot 1, Slot 2, Slot 3, Type S, Back, Test & Subcat. Related work: Magerman (1994), Ratnaparkhi (1999), Yamada (2003), Nivre (2004).]

35 [Figure: Module 1 parsing "The carpenter bought a shirt with credit-card". Per-word outputs: The|DT → PUT1; carpenter|N C → PUT1; bought|V MA PE PA → PUT0; a|DT → PUT2; shirt|N C; with|IIN → PUT3; Credit-card|N C → PUT3. Output slots: Slot 0 to Slot 3, plus subcategorization and backtracking; one hidden layer.]

36 Module 2 supervises argument position MODULE 1: P1) PRED: V MA PE PA (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt); ARG3: IIN N C (with pockets). MODULE 2: P1) PRED: get, transfer, give, pay; ARG1: entity, person; ARG2: entity, object, artifact (shirt); ARG3: artifact, part-of-dress. Subcategorization and selectional restrictions. Parsing strategy: attaches first to the current proposition.

37 Binding problem

38

39 "Mary bought a shirt with pockets" MODULE 1: P1) PRED: V MA PE PA (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt); ARG3: IIN N C (with pockets). MODULE 2: P1) PRED: get, transfer, give, pay; ARG1: entity, person; ARG2: entity, object, artifact; ARG3: artifact, part-of-dress. Operations: BACK, CLEARP, IZ_IN0. Parsing strategy: attaches first to the current proposition.

40 "Mary bought a shirt with pockets" MODULE 1: P1: PRED: V MA PE (bought); ARG1: N PR (Mary); ARG2: DT N C (a shirt). P2: PRED: (empty); ARG1: DT N C (a shirt); ARG2: IIN N C (with pockets). MODULE 2: P1: PRED: get, transfer, pay, accept...; ARG1: entity, person...; ARG2: entity, object, artifact, shirt. P2: ARG1: entity, object, artifact, shirt; ARG2: artifact, part-of-dress.
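A minimal sketch of how Module 2's selectional restrictions could arbitrate between the one-proposition analysis of slide 39 and the two-proposition analysis above (the class inventories are hypothetical, not the model's lexicon):

```python
# Verb ARG3 restrictions and noun classes, standing in for Module 2.
VERB_ARG3 = {"bought": {"instrument", "co-agent"}}   # e.g. "with a credit card"
NOUN_CLASSES = {
    "credit-card": {"artifact", "instrument"},
    "pockets": {"artifact", "part-of-dress"},
}

def attach_with_pp(verb, head_noun, pp_noun):
    """Attach "with <pp_noun>" to the current (verb) proposition if the noun's
    classes satisfy the verb's ARG3 restrictions; otherwise open a second
    proposition headed by the noun, as in slide 40."""
    if NOUN_CLASSES.get(pp_noun, set()) & VERB_ARG3.get(verb, set()):
        return f"P1: ... ARG3: with {pp_noun}"
    return f"P2: ARG1: {head_noun}; ARG2: with {pp_noun}"

print(attach_with_pp("bought", "a shirt", "credit-card"))  # verb attachment
print(attach_with_pp("bought", "a shirt", "pockets"))      # noun attachment
```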

41 PARSING COMPLEX SENTENCES

42 Elementary expressions

43 "a blue shirt" MODULE 1: PRED: JJ (blue) (SLOT 0); ARG1: DT N C (a shirt) (SLOT 1). MODULE 2: PRED: colour, blue (SLOT 0); ARG1: entity, object, artifact (SLOT 1). "the government's minister" MODULE 1: TYPE: POS (SLOT type); ARG1: DT N C (the minister) (SLOT 1); ARG2: N C POS (government's) (SLOT 2). MODULE 2: TYPE: POS (SLOT type); ARG1: entity, person... (SLOT 1); ARG2: entity, person... (SLOT 2).

44 Complex sentences A complex sentence is any sentence formed by more than one elementary expression. A complex sentence requires more than one proposition for its semantic representation.

45 [Figure: Module 1 network, as in slide 34. Input layer: input word. Output layer: Slot 0 to Slot 3, Type S, Back, Test & Subcat.]

46 [Figure: the non-invariant solution. Input layer: input word. Output layer: complete sentence structure (operations with vectors).]

47 [Figure: the invariant solution (a kind of shift-reduce parser). Input layer: input word. Output layer: Slot 0 to Slot 3, Type S, Back, Test & Subcat. The focus of attention (current context) is supplemented by a STACK holding stored context.]
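A minimal sketch of the invariant control structure, with hypothetical method names: the focus of attention holds the proposition currently being filled, and the stack stores interrupted propositions.

```python
class FocusAndStack:
    """Shift-reduce-style state: one current proposition plus a stack."""
    def __init__(self):
        self.stack = []
        self.current = {}          # the focus of attention

    def put(self, slot, word):     # fill one slot of the current proposition
        self.current[slot] = (self.current.get(slot, "") + " " + word).strip()

    def push(self):                # an embedded expression begins
        self.stack.append(self.current)
        self.current = {}

    def pop(self):                 # the embedded expression is finished
        finished, self.current = self.current, self.stack.pop()
        return finished
```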

48 [Figure: the same invariant architecture labeled as MODULE 1: focus of attention (current context) plus STACK (stored context).]

49 Concentric models (Cowan, 1988, 1995, 1999; Oberauer, 2002) [Figure: the focus of attention inside the activated part of long-term memory, inside long-term memory (LTM).]

50 [Figure: retinotopic visual neurons (as found in V1 and V2) versus neurons whose receptive fields are invariant to translation and scale (higher visual areas, inferotemporal cortex); covert attention selects the attended item.]

51 [Figure: slide 50 repeated.]

52 Attention and invariance Shift-reduce parsing: Stolcke (1990), Sopena (1993), Miikkulainen (1996).

53

54 Sentence: The main manager bought some old cars with three wheels. Input word: The|DT. M1: PUT1. Current proposition: empty.

55 Generalized Role Labeling using Propositional Representations. Input word: The|DT. M1: NEXT. Current: A1: The; Flags: @1.

56 Input word: main|JJ_PR. M1: *IZ-IN. Current: A1: The; Flags: @NEXT @1.

57 Input word: main|JJ_PR. M1: PUT0. Current: empty. Top of stack: A1: The; Flags: @1.

58 Input word: main|JJ_PR. M1: NEXT. Current: Pred: main. Top of stack: A1: The; Flags: @1.

59 Input word: manager|DT_N. M1: PUT1. Current: Pred: main; Flags: @NEXT. Top of stack: A1: The; Flags: @1.

60 Input word: manager|DT_N. M1: OZ-OUT. Current: Pred: main; A1: manager; Flags: @NEXT. Top of stack: A1: The; Flags: @1.

61 Input word: manager|DT_N. M1: PUT1. Current: A1: The; Flags: @1 @OZ-OUT P:main|A1:manager.

62 Input word: manager|DT_N. M1: NEXT. Current: A1: The manager; Flags: @1 @OZ-OUT P:main|A1:manager.

63 Input word: bought|V_MA. M1: PUT0. Current: A1: The manager; Flags: @1 @NEXT P:main|A1:manager.

64 Input word: bought|V_MA. M1: NEXT. Current: Pred: bought; A1: The manager; Flags: @0 P:main|A1:manager.

65 Input word: some|DT. M1: PUT2. Current: Pred: bought; A1: The manager; Flags: @0 @NEXT P:main|A1:manager.

66 Input word: some|DT. M1: NEXT. Current: Pred: bought; A1: The manager; A2: some; Flags: @2 P:main|A1:manager.

67 Input word: old|JJ_PR. M1: *IZ-IN. Current: Pred: bought; A1: The manager; A2: some; Flags: @2 @NEXT P:main|A1:manager.

68 Input word: old|JJ_PR. M1: PUT0. Current: empty; Flags: @IZ-IN P:main|A1:manager. Top of stack: Pred: bought; A1: The manager; A2: some; Flags: @2.
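Read as a whole, the trace can be replayed with a tiny interpreter. The sketch below assumes a reduced reading of the actions (PUT0 sets the predicate, PUT1-PUT3 fill arguments, IZ-IN pushes the current proposition, OZ-OUT closes the embedded one) and ignores the flags:

```python
def replay(trace):
    """Replay (word, M1 action) pairs with a reduced action set."""
    stack, cur = [], {}
    for word, act in trace:
        if act == "PUT0":
            cur["PRED"] = word
        elif act.startswith("PUT"):                  # PUT1..PUT3 -> A1..A3
            slot = "A" + act[3]
            cur[slot] = (cur.get(slot, "") + " " + word).strip()
        elif act == "IZ-IN":                         # open embedded proposition
            stack.append(cur)
            cur = {}
        elif act == "OZ-OUT":                        # close it, resume outer one
            print("closed:", cur)
            cur = stack.pop()
    print("current:", cur)

replay([("The", "PUT1"), ("main", "IZ-IN"), ("main", "PUT0"),
        ("manager", "PUT1"), ("manager", "OZ-OUT"), ("manager", "PUT1"),
        ("bought", "PUT0"), ("some", "PUT2")])
# closed: {'PRED': 'main', 'A1': 'manager'}
# current: {'A1': 'The manager', 'PRED': 'bought', 'A2': 'some'}
```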

69 Compositionality “We now turn to what I think was an important mistake at the core of generative grammar, one that in retrospect lies behind much of the alienation of linguistic theory from the cognitive sciences. Chomsky did demonstrate that language requires a generative system that makes possible an infinite variety of sentences. However, he explicitly assumed, without argument (1965: 16, 17, 75, 198), that generativity is localized in the syntactic component of the grammar” (Jackendoff, 2002)

70 Compositionality The fact that semantics is ‘purely interpretive’ has as a consequence that thought has no ‘independent status’ and it cannot be creative or have an independent capacity of combinatoriality outside of language. As Phillips (2004) points out, that thought is purely interpretative and not creative is “a consequence that is likely to be uncomfortable for many, including Jackendoff” (Phillips, 2004 p. 574).

71 Compositionality Semantics (or thought) is "purely interpretative"; only syntax is creative. I think that belief in "absolute free will" is the main cause of all types of fundamentalism.

72 Minimalist training
SS-1-1- (DT The WAIT NEXT)
SS-1-2- (DT_N man PUT1 NEXT)
SS-1-3- (V_MA sold PUT0 NEXT)
SS-1-4- (DT some WAIT NEXT)
SS-1-5- (DT_N offerings PUT2 NEXT)
SS-1-6- (IIN_DT to WAIT NEXT)
SS-1-7- (DT the WAIT NEXT)
SS-1-8- (DT_N president PUT3 NEXT)
SS-1-9- (.. OZ-OUT NEXT)
SS-1-10- (FIN)

73 Minimalist training (8-10 words maximum)
RL-22-1- (DT a NADA NEXT)
RL-22-2- (DT_N land PUT1 NEXT)
RL-22-3- (CC , PUTtypeCC NEXT)
RL-22-4- (WP2 where TESTARG &NOTEST CLEARmodeCC IZ-IN2 PUTtypeWDT NEXT)
RL-22-5- (DT a NADA NEXT)
RL-22-6- (DT_N saying PUT1 NEXT)
RL-22-7- (V_MA says PUT0 &BACK2 MV23 &BACK_ADJ IZ-INE2 PUTtypeADJ OZ-OUT NEXT)
RL-22-8- (.. OZ-OUT OZ-OUT NEXT)
RL-22-9- (FIN)

74 A real test, PTB II (55-26) s5974: But predictions that central banks of the Group of Seven - G-7 - major industrial nations would continue their massive dollar sales went astray, as the market drove the dollar downward on its own, reacting to Wall Street's plunge and subsequent price volatility, lower U.S. interest rates and signs of a slowing U.S. economy.

75 Test We have tested it on 254 sentences from the PTB II. The results are very good. The goal is 0% errors, and I think it can be achieved. What is the problem?

76 NLDs, coordination, comparatives, punctuation Dependency grammars and parsers often ignore some classes of dependencies. Punctuation (dashes, parentheses, commas, colons, ...).

77 NLDs

78 [The semantic role annotation of sentence 5941 from slide 25 again: "a customer" is the A0 of the predicates want, build, try, and get, giving four non-local dependencies (4 NLDs).]

79 [CoNLL-style semantic role annotation of PTB sentence 5961: "Fed funds is the rate banks charge each other on overnight loans; the Fed influences the rate by adding or draining reserves from the banking system." The phrase "the Fed" is the A0 of the three predicates influence, add, and drain.]

80 [CoNLL-style semantic role annotation of PTB sentence 5920: "For the PRI to stand a chance, Mr. Salinas has to press on with an economic program that so far has succeeded in lowering inflation and providing moderate economic growth." The phrase "an economic program" is the A0 of succeed, lower, and provide, via the relative pronoun "that" (R-A0): three non-local dependencies (3 NLDs).]

81 NLDs (Johnson 2002) Broad coverage syntactic parsers with good performance have recently become available (Charniak, Collins), but these typically produce as output a parse tree that only encodes local syntactic information, i.e., a tree that does not include any "empty nodes".

82 NLDs (Dienes 2003) Intuitively, the problem of parsing with NLDs is that the empty elements (EEs) representing these dependencies are not in the input. Therefore, the parser has to hypothesise where these EEs might occur – in the worst case, it might end up suggesting exponentially many traces, rendering parsing infeasible (Johnson and Kay 1994).

83 NLDs From the point of view of the dependency structure, NLDs are difficult because they violate the assumption that dependency structures are represented as directed trees. Specifically, NLDs give rise to re-entrancies in the dependency graph, i.e., it is no longer a directed tree but a directed graph, with nodes possibly having multiple parents (e.g. apple in Figure 2.2). Now, the parser has to explore a much larger search space.
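A minimal illustration of the re-entrancy point, with a hypothetical toy structure (not Dienes's Figure 2.2): one dependent with two heads turns the tree into a directed graph.

```python
# "Mary loves the man who bought the car": "man" is the object of "loves"
# locally and, non-locally, the subject of "bought".
edges = [
    ("loves", "Mary", "subj"),
    ("loves", "man", "obj"),
    ("bought", "man", "subj"),   # the non-local dependency (re-entrancy)
    ("bought", "car", "obj"),
]

heads = {}
for head, dep, rel in edges:
    heads.setdefault(dep, []).append((head, rel))

print(heads["man"])  # two heads -> a directed graph, not a directed tree
```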

84 NLDs Arguably, the search space is much more restricted by an actual grammar that exploits, for instance, the knowledge that buy is a transitive verb and thus requires a direct object. Nevertheless, the problem does not disappear. Consider the following example: When demand is stronger than suppliers can handle and delivery times lengthen, prices tend to rise. (wsj_0036.mrg)

85 NLDs Non-local dependencies and displacement phenomena have been a central topic of generative linguistics since its inception half a century ago. However … Many current linguistic theories of non-local dependencies are extremely complex, and would be difficult to apply with the kind of broad coverage described here.

86 Why a probabilistic approach? “Ambiguity and underspecification are ubiquitous in human language utterances, at all levels (lexical, syntactic, semantic, etc.), and how to resolve these ambiguities is a key communicative task for both human and computer natural language understanding” (Manning, 2003).

87 At the highest level, the probabilistic approach to natural language understanding is to view the task as trying to learn the probability distribution P(meaning | utterance; context): a mapping from form to meaning conditioned on context.

88 Why a probabilistic approach? Quantum mechanics (uncertainty) vs. classical physics (underspecification).

89 Why a probabilistic approach? Collins (1996): ambiguity in PP-attachment and coordination.

90 PP-attachment Pierre Vinken, 61 years old, joined the board as a nonexecutive director.

91 PP-attachment The 4-tuple: joined, board, as, director; that is, V = joined, N1 = board, P = as, N2 = director. Estimate p(A = l | V = v, N1 = n1, P = p, N2 = n2), abbreviated p(l | v, n1, p, n2).
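A sketch of backed-off estimation in the spirit of Collins and Brooks (1995); their model averages counts over all contexts at each back-off level, while this simplified version just takes the first context with data:

```python
from collections import Counter

counts, verb_counts = Counter(), Counter()

def contexts(v, n1, p, n2):
    """Back-off chain from the full 4-tuple to the preposition alone."""
    return [(v, n1, p, n2), (v, n1, p), (v, p, n2), (n1, p, n2),
            (v, p), (n1, p), (p, n2), (p,)]

def observe(v, n1, p, n2, verb_attach):
    for c in contexts(v, n1, p, n2):
        counts[c] += 1
        verb_counts[c] += verb_attach

def p_verb_attach(v, n1, p, n2):
    for c in contexts(v, n1, p, n2):
        if counts[c]:
            return verb_counts[c] / counts[c]
    return 1.0  # unseen preposition: default to verb attachment

observe("joined", "board", "as", "director", verb_attach=1)
print(p_verb_attach("joined", "board", "as", "director"))  # 1.0
```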

92 Results on PP attachment, ordered by accuracy:
Ratnaparkhi (1994), ID3: 77.70%
Zavrel et al. (1996), Neural Networks: 80.00%
Ratnaparkhi (1994), Maximum Entropy Model: 81.60%
Yeh and Vilain (1998), Error-driven learning: 83.10%
Abney et al. (1999), Boosting: 83.10%
Zavrel et al. (1997), Memory-Based Learning: 84.40%
Takahashi (2001), Neural Networks: 84.40%
Collins and Brooks (1995), Backed-Off Model: 84.50%
Krymolowski and Roth (1998), SNOW: 84.80%
Committee Machines 1 (Alegre et al., 1999): 86.08%
Committee Machines 2 (Alegre, 2004): 88.01%
Average Human Expert (Ratnaparkhi, 1994): 88.20%

93 [Slide 39 repeated: the one-proposition analysis of "Mary bought a shirt with pockets", with ARG3: IIN N C (with pockets) and the operations BACK, CLEARP, IZ_IN0.]

94 [Slide 40 repeated: the two-proposition analysis of "Mary bought a shirt with pockets", with P2: ARG1: a shirt; ARG2: with pockets.]

95 Best results Alegre (2002): PTB I 89.8% (PTB II 92.3%). Olteanu and Moldovan (2005): PTB II 94%.

96 Dependency parsers?

97 Ambiguity Should all the "artillery" needed to resolve PP-attachment be used to decide where to place each word of the sentence in the structure? For which phenomena would it actually be needed? The answer: very few.

