Sense clusters versus sense relations Irina Chugur, Julio Gonzalo UNED (Spain)


1 Sense clusters versus sense relations Irina Chugur, Julio Gonzalo UNED (Spain)

2 Sense clusters vs. sense relations
Arguments for sense clustering:
– Subtle distinctions produce noise in applications
– WN too fine-grained
– Remove predictable sense extensions?

3 Sense clusters vs. sense relations
Arguments for sense clustering:
– Subtle distinctions produce noise in applications
– WN too fine-grained
– Remove predictable sense extensions?
But...
– Clusters are not absolute (e.g. metaphors in IR/MT)
– Not really! Use them to infer and study systematic polysemy
– Polysemy relations are more informative and predictive
– WN's rich sense distinctions permit empirical/quantitative studies of polysemy phenomena

4 Sense clusters vs. sense relations
Arguments for sense clustering:
– Subtle distinctions produce noise in applications
– WN too fine-grained
– Remove predictable sense extensions?
But...
– Clusters are not absolute (e.g., are metaphors close?)
– Not really! Use them to infer and study systematic polysemy
– Polysemy relations are more informative and predictive
Annotation of semantic relations in 1000 WN nouns

5 Sense distinctions for IR
Helpful distinctions:
– Spring: season / fountain / metal device / to jump
– Bank: river bank / seat / financial institution
Useless distinctions:
– Bet: act of gambling / money risked on a gamble / to gamble
– Bother: something or someone that causes trouble, a source of unhappiness / an angry disturbance

6 1) Cluster evidence from Semcor
Hypothesis: if two senses tend to co-occur in the same documents, they are not good IR discriminators.
Criterion: cluster senses that co-occur frequently in the IR-Semcor collection.
Example: fact 1 and fact 2 co-occur in 13 out of 171 docs.
– Fact 1: a piece of information about circumstances that exist or events that have occurred
– Fact 2: a statement or assertion of verified information about something that is the case or has happened
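The co-occurrence criterion can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: the function and sense IDs are invented for the example, and the real criterion additionally requires a similar distribution of the two senses across documents.

```python
from itertools import combinations

def cooccurrence_clusters(doc_senses, min_docs=2):
    """Return sense pairs that co-occur in at least `min_docs` documents.

    doc_senses: list of sets, one set of sense IDs per document.
    """
    counts = {}
    for senses in doc_senses:
        # Count every unordered pair of senses appearing in the same document.
        for pair in combinations(sorted(senses), 2):
            counts[pair] = counts.get(pair, 0) + 1
    return {pair: n for pair, n in counts.items() if n >= min_docs}

# Toy collection: fact_1 and fact_2 co-occur in two documents.
docs = [
    {"fact_1", "fact_2", "bank_1"},
    {"fact_1", "fact_2"},
    {"fact_2", "spring_1"},
]
print(cooccurrence_clusters(docs))  # → {('fact_1', 'fact_2'): 2}
```

With `min_docs=2` only the `fact_1`/`fact_2` pair survives, mirroring the slide's positive-cluster threshold of at least two shared documents.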

7 Clusters from Semcor: results
Positive clusters: 507 (630 sense pairs)
– Threshold: #docs ≥ 2 with similar distribution of senses
– Precision: 70% (directly related to threshold)
Negative clusters: 530
– Threshold: #sense occurrences ≥ 8
– Precision: 80%

8 2) Cluster evidence from parallel polysemy
English:
– Band 1: instrumentalists not including string players
– Band 2: a group of musicians playing popular music for dancing
Spanish: both senses translate as orquesta 4

9 2) Cluster evidence from parallel polysemy
English:
– Band 1: instrumentalists not including string players
– Band 2: a group of musicians playing popular music for dancing
Parallel translations: Spanish orquesta 4; French groupe 9, groupe 6; German Band 2

10 Parallel polysemy in EuroWordNet
English {child, kid} → Spanish {niño, crío, menor} → French {enfant, mineur} → German {Kind}
English {male child, boy, child} → Spanish {niño} → French {enfant} → German {Kind, Spross}
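The EuroWordNet evidence amounts to checking whether two senses share a lexicalization in every language considered. A minimal sketch, using the slide's child/Kind data; the function name and the dictionary layout are my own, not from the paper:

```python
def share_lexicalization(translations, sense_a, sense_b, languages):
    """True if the two senses translate to at least one common word
    in every language (parallel-polysemy evidence for clustering)."""
    return all(
        translations[(sense_a, lang)] & translations[(sense_b, lang)]
        for lang in languages
    )

# Translation sets from the EuroWordNet example on this slide.
tr = {
    ("child_1", "es"): {"niño", "crío", "menor"},
    ("child_2", "es"): {"niño"},
    ("child_1", "fr"): {"enfant", "mineur"},
    ("child_2", "fr"): {"enfant"},
    ("child_1", "de"): {"Kind"},
    ("child_2", "de"): {"Kind", "Spross"},
}
print(share_lexicalization(tr, "child_1", "child_2", ["es", "fr", "de"]))  # → True
```

The two {child} senses overlap on niño, enfant, and Kind, so they would be proposed as a cluster under this criterion.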

11 Comparison of clustering criteria

12 Clusters vs. semantic relations
Polysemy relations are more predictive!

13 Characterization of sense inventories for WSD
Given two senses of a word:
– How are they related? (polysemy relations)
– How closely? (sense proximity)
– In what applications should they be distinguished?
Given an individual sense of a word:
– Should it be split into subsenses? (sense stability)

14 Cross-linguistic evidence
Example for "fine" (instance 40129): "Mountains on the other side of the valley rose from the mist like islands, and here and there flecks of cloud as pale and fine as sea-spray, trailed across their sombre, wooded slopes."
TRANSLATION: * *

15 Sense proximity (Resnik & Yarowsky)

P_L(same lexicalization | w_i, w_j) ≈ (1 / (|w_i| · |w_j|)) · Σ_{x ∈ examples(w_i)} Σ_{y ∈ examples(w_j)} [tr_L(x) = tr_L(y)]

Proximity(w_i, w_j) ≈ (1 / |languages|) · Σ_{L ∈ languages} P_L(same lexicalization | w_i, w_j)

(Here |w_i| is the number of tagged examples of sense w_i, tr_L(x) is the translation of example x into language L, and [·] is 1 if the equality holds and 0 otherwise.)
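The proximity formula above translates directly into code: for each language, count how often an example of sense i and an example of sense j get the same translation, normalize, then average over languages. A sketch with invented example IDs and translations (not data from the study):

```python
def proximity(examples_i, examples_j, translations):
    """Average over languages of the probability that a random example of
    sense i and a random example of sense j receive the same translation.

    translations: dict mapping language -> dict of example ID -> word.
    """
    total = 0.0
    for tr in translations.values():
        # Fraction of cross-sense example pairs sharing a lexicalization.
        same = sum(tr[x] == tr[y] for x in examples_i for y in examples_j)
        total += same / (len(examples_i) * len(examples_j))
    return total / len(translations)

ex_i, ex_j = ["i1", "i2"], ["j1"]
translations = {
    "es": {"i1": "banda", "i2": "orquesta", "j1": "banda"},
    "fr": {"i1": "groupe", "i2": "groupe", "j1": "groupe"},
}
print(proximity(ex_i, ex_j, translations))  # → 0.75
```

Spanish lexicalizes the pair identically half the time (1 of 2 pairs), French always, giving an average proximity of 0.75.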

16 Sense stability

Stability(w_i) ≈ (1 / |languages|) · Σ_{L ∈ languages} (1 / |{(x, y) : x, y ∈ examples(w_i), x ≠ y}|) · Σ_{x, y ∈ examples(w_i), x ≠ y} [tr_L(x) = tr_L(y)]
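Stability is the single-sense analogue of proximity: over pairs of distinct examples of one sense, how often do they receive the same translation, averaged over languages? A minimal sketch with hypothetical data, mirroring the formula above:

```python
def stability(examples, translations):
    """Average over languages of the fraction of distinct example pairs of
    a single sense that receive the same translation.

    translations: dict mapping language -> dict of example ID -> word.
    """
    # All ordered pairs of distinct examples of this sense.
    pairs = [(x, y) for x in examples for y in examples if x != y]
    total = 0.0
    for tr in translations.values():
        total += sum(tr[x] == tr[y] for x, y in pairs) / len(pairs)
    return total / len(translations)

examples = ["e1", "e2", "e3"]
translations = {"es": {"e1": "banco", "e2": "banco", "e3": "orilla"}}
print(round(stability(examples, translations), 3))  # → 0.333
```

Only the e1/e2 pair shares a translation (2 of the 6 ordered pairs), so this sense is unstable; a stability near 1, as in the 0.80 average reported later, means translators rarely split the sense.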

17 Experiment design
MAIN SET:
– 44 Senseval-2 words (nouns and adjectives)
– 182 senses
– 508 examples
– 11 native/bilingual speakers of 4 languages: Bulgarian, Russian, Spanish, Urdu
(Control set: 12 languages, 5 families, 28 subjects)

18 Results: distribution of proximity indexes
Average proximity = 0.29: same as Hector in Senseval-1!

19 Results: distribution of stability indexes
Average stability = 0.80

20 Distribution of homonyms?

21 Distribution of metaphors

22 Distribution of metonymy
Average proximity: target-in-source 0.64, source-in-target 0.37

23 Systematic polysemy → sense proximity
Onion:
1. Bulbous plant (kind of alliaceous plant)
2. Pungent bulb (kind of vegetable)
3. Edible bulb of an onion plant (kind of bulb)
Systematic polysemy (plant/food) → IR cluster?
Positive and negative rules: container/quantity, music/dance, animal/food, language/people

24 Distribution of specialization/generalization

25 Annotation of 1000 WN nouns

Relation        % sense pairs
Homonymy        41.2
Metonymy        32.5
Metaphor        13.0
Specialization   7.7
Generalization   1.7
Equivalence      3.1
Fuzzy            0.8

Need for cluster here!

26 Typology of sense relations
Homonymy / Metonymy / Metaphor / Specialization / Generalization / Equivalence / Fuzzy

27 Typology of sense relations: metonymy
– Target in source: animal-meat, animal-fur, tree-wood, object-color, plant-fruit, people-language, action-duration, recipient-quantity, ...
– Source in target: action-object, action-result, shape-object, plant-food/beverage, material-product, ...
– Co-metonymy: substance-agent

28 Typology of sense relations: metaphor (182)
Action/state/entity (source domain) → action/state/entity (target domain)
– object → object / person (47)
– person → person (21)
– physical action → abstract action (16)
– physical property → abstract property (11)
– animal → person (10)
– ...

29 Typology of sense relations: metaphor
object → object / person (47); person → person (21)
– Source: historical, mythological, biblical character; profession, occupation, position...
– Target: prototype person
e.g. Adonis (Greek mythology / handsome)

30 Conclusions
Let's annotate semantic relations between WN word senses!

