Recognizing Partial Textual Entailment


1 Recognizing Partial Textual Entailment
Torsten Zesch, Iryna Gurevych (Ubiquitous Knowledge Processing Lab, Technische Universität Darmstadt); Omer Levy, Ido Dagan (Natural Language Processing Lab, Bar-Ilan University). Good morning. I’m Omer Levy, and I’ll be talking to you today about what we did in the pilot task, from a textual entailment point of view. This is joint work with my advisor, Ido Dagan, and our colleagues Torsten Zesch and Iryna Gurevych from the UKP lab in Darmstadt.

2 Textual Entailment We’ve all heard quite a bit about recognizing textual entailment.

3 T: muscles move bones Given a text T…

4 T: muscles move bones H: muscles generate movement
…and a hypothesis H, we need to find out…

5 T: muscles move bones H: muscles generate movement
…whether a human would infer H from T. So if muscles move bones, then muscles generate movement, right?

6 T: muscles move bones H: the muscles’ main job is to move bones
? T: muscles move bones H: the muscles’ main job is to move bones So how about this example? Well, we know that the muscles move bones, but we don’t know that that’s their main job. So, according to the definition of textual entailment…

7 T: muscles move bones H: the muscles’ main job is to move bones
…this pair doesn’t count.

8 Complete Textual Entailment
So maybe, instead of…

9 Complete Textual Entailment
…complete textual entailment, we need something more fine-grained…

10 Complete Textual Entailment
Partial …partial textual entailment.

11 Complete Textual Entailment
(Nielsen et al., 2009) Partial Complete Textual Entailment And this idea was introduced by Rodney Nielsen.

12 T: muscles move bones H: the muscles’ main job is to move bones
So once again, we have a text and a hypothesis, but this time...

13 T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones …we decompose the hypothesis into semantic components…

14 T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones …and check whether each one is entailed – on its own. Now, my next point is aimed at all the teaching assistants in the audience.

15 Student Response Analysis
T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones Partial entailment could be used for checking exams. You know all those “partially correct” answers? Well, partial entailment breaks it down for you. Student Response Analysis

16 T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones It can also be used in summarization and information extraction, where you often need to take “a bit of this” and “a bit of that” to get a complete overall result. So, in this task… Student Response Analysis Summarization Information Extraction

17 Facets T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones …facets were used to represent different semantic components. Here are a few examples:

18 Facets T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones (main job, muscles) (muscles, move) (move, bones) muscles’ main job

19 Facets T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones (main job, muscles) (muscles, move) (move, bones) muscles move <something>

20 Facets T: muscles move bones (main job of muscles) (muscles move) (bones are moved) H: the muscles’ main job is to move bones (main job, muscles) (muscles, move) (move, bones) <something> moves bones. So what exactly is a facet?

21 Facet: a pair of terms and their implied semantic relation.
A facet is a pair of…

22 Facet: a pair of terms and their implied semantic relation.
(move, bones) Facet: a pair of terms and their implied semantic relation. …terms, and their…

23 Facet: a pair of terms and their implied semantic relation.
(move, bones) Facet: a pair of terms and their implied semantic relation. …implied semantic relation. Now, in Nielsen’s original model, the relations were given as part of the facets. But in SemEval, the relation isn’t stated explicitly; it is only implied by the original sentence. … is to move bones (theme)
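To make the representation concrete, here is a minimal sketch of a facet as a plain pair of words from the hypothesis, with the relation left implicit as in SemEval. The field names `head` and `modifier` are my own, not the task’s official schema.

```python
# A facet as a bare word pair; the semantic relation stays implicit.
# Field names are illustrative, not the SemEval schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class Facet:
    head: str       # the governing word, e.g. "move"
    modifier: str   # the dependent word, e.g. "bones"

facet = Facet("move", "bones")
print(facet)  # Facet(head='move', modifier='bones')
```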

24 Recognizing Faceted Entailment
So to sum it up, we’re dealing with the task of recognizing faceted entailment…

25 T: muscles move bones H: the muscles’ main job is to move bones
Recognizing Faceted Entailment T: muscles move bones H: the muscles’ main job is to move bones …which is basically “given a text…

26 T: muscles move bones H: the muscles’ main job is to move bones
Recognizing Faceted Entailment T: muscles move bones H: the muscles’ main job is to move bones …a hypothesis…

27 T: muscles move bones H: the muscles’ main job is to move bones
Recognizing Faceted Entailment T: muscles move bones H: the muscles’ main job is to move bones …and a facet…

28 T: muscles move bones H: the muscles’ main job is to move bones
Recognizing Faceted Entailment T: muscles move bones H: the muscles’ main job is to move bones …find out if the facet can be inferred from the text”.

29 What did we do? Up until now I’ve been describing the task as it was defined in SemEval. Let’s see how we tried to solve it.

30 Approach: Leverage Existing Methods
So, we were interested in a simple and generic approach…

31 Approach: Leverage Existing Methods
… and the best way to keep it simple, is to use existing methods of textual entailment, and adapt them to the partial task.

32 Approach: Leverage Existing Methods
Exact Match (P: 96%, R: 30%): penicillin cures pneumonia ⇒ (penicillin, cure) Lexical Inference: pneumonia is treated by penicillin ⇒ (penicillin, cure) Lexical-Syntactic Inference We start with the simplest method – exact match of words and lemmas. This works surprisingly well on our dataset…

33 Approach: Leverage Existing Methods
Exact Match (P: 96%, R: 30%): penicillin cures pneumonia ⇒ (penicillin, cure) Lexical Inference: pneumonia is treated by penicillin ⇒ (penicillin, cure) Lexical-Syntactic Inference …and yields 96% precision and 30% recall. Obviously, this is more of a dataset characteristic than something that you’d expect of a naive method.
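The exact-match method can be sketched as follows. This toy version matches lowercase whole tokens only; the actual system also matched lemmas.

```python
# Exact-match baseline: a facet counts as entailed when both of its
# terms appear (here, lowercased and verbatim) among the text's tokens.

def exact_match(text: str, facet: tuple) -> bool:
    tokens = set(text.lower().split())
    return all(term.lower() in tokens for term in facet)

print(exact_match("penicillin cures pneumonia", ("penicillin", "cures")))        # True
print(exact_match("pneumonia is treated by penicillin", ("penicillin", "cure"))) # False
```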

34 Approach: Leverage Existing Methods
Exact Match penicillin cures pneumonia ⇒ (penicillin, cure) Lexical Inference WordNet pneumonia is treated by penicillin ⇒ (penicillin, cure) Lexical-Syntactic Inference The second method is lexical inference, which tries to expand words like “cure” to similar terms such as “treat”.

35 Approach: Leverage Existing Methods
Exact Match penicillin cures pneumonia ⇒ (penicillin, cure) Lexical Inference WordNet pneumonia is treated by penicillin ⇒ (penicillin, cure) Lexical-Syntactic Inference We used WordNet similarity for these expansions.
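The lexical-inference step can be sketched like this. The tiny synonym table and lemma map below are hand-written stand-ins for WordNet and a real lemmatizer (the system itself used WordNet similarity), so the example is illustrative only.

```python
# Lexical inference: expand each facet term with related words before
# matching. SYNONYMS and LEMMA are toy stand-ins for WordNet and a
# lemmatizer; real coverage would come from those resources.

SYNONYMS = {"cure": {"cure", "treat", "heal"}}   # toy WordNet stand-in
LEMMA = {"cures": "cure", "treated": "treat"}    # toy lemmatizer

def lemmas(text: str) -> set:
    return {LEMMA.get(tok, tok) for tok in text.lower().split()}

def expand(term: str) -> set:
    return SYNONYMS.get(term, set()) | {term}

def lexical_match(text: str, facet: tuple) -> bool:
    toks = lemmas(text)
    return all(toks & expand(term) for term in facet)

# Exact match fails here ("cure" never appears verbatim), but the
# expansion recognizes "treated" as related to "cure".
print(lexical_match("pneumonia is treated by penicillin", ("penicillin", "cure")))  # True
```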

36 Approach: Leverage Existing Methods
Exact Match penicillin cures pneumonia ⇒ (penicillin, cure) Lexical Inference pneumonia is treated by penicillin ⇒ (penicillin, cure) Lexical-Syntactic Inference Finally, we tried lexical-syntactic inference, which can recognize things like “if penicillin cures pneumonia, then pneumonia is treated by penicillin”. penicillin cures pneumonia pneumonia is treated by penicillin

37 Bar-Ilan University Textual Entailment Engine (Stern and Dagan, 2011)
Now, for the final kind of inference, we used “the Bar-Ilan University Textual Entailment Engine”…

38 BIUTEE …a.k.a. BIUTEE.

BIUTEE T: [dependency tree of “penicillin cures pneumonia”]
BIUTEE uses a dependency parse of the text…

BIUTEE T: [dependency tree of “penicillin cures pneumonia”] H: [dependency tree of “pneumonia is treated by penicillin”]
…and the hypothesis, as inputs…

BIUTEE T: [dependency tree of “penicillin cures pneumonia”] H: [dependency tree of “pneumonia is treated by penicillin”]
…and tries to convert the text to the hypothesis by applying a series of knowledge-based rules. Let’s see how it works.

42 [dependency tree of “penicillin cures pneumonia”]
We start with the text, “penicillin cures pneumonia”…

43 [dependency tree of “penicillin cures pneumonia”] cure → treat (lexical) [dependency tree of “penicillin treats pneumonia”]
…we switch “cures” with “treats”…

44 [dependency tree of “penicillin cures pneumonia”] cure → treat (lexical) [dependency tree of “penicillin treats pneumonia”] active → passive (syntactic) [dependency tree of “pneumonia is treated by penicillin”]
…and then transform the sentence from active to passive, and voila! BIUTEE has many more transformations, such as coreference, hybrid lexical-syntactic rules, and more.

45 Facets to Dependencies
[dependency tree of “pneumonia is treated by penicillin”] But if facets are represented as a pair of words, what are we going to feed BIUTEE?

46 Facets to Dependencies
[dependency tree of “pneumonia is treated by penicillin”] As you can see, we can take the path between the two words, and that’s going to be our hypothesis.
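Taking the path between the two facet words can be sketched with a small breadth-first search over a dependency edge list. The parse below is written out by hand for “pneumonia is treated by penicillin”; a real system would obtain it from a parser.

```python
# Find the dependency path between the two facet words; the word
# sequence along that path serves as the hypothesis fed to the engine.
from collections import deque

edges = [("treated", "pneumonia"), ("treated", "is"),
         ("treated", "by"), ("by", "penicillin")]  # hand-written parse

def path(start, goal, edges):
    adj = {}
    for a, b in edges:            # treat the tree as an undirected graph
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    prev, queue, seen = {}, deque([start]), {start}
    while queue:
        node = queue.popleft()
        if node == goal:          # walk back through predecessors
            out = [node]
            while out[-1] != start:
                out.append(prev[out[-1]])
            return out[::-1]
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt); prev[nxt] = node; queue.append(nxt)
    return None

print(path("penicillin", "pneumonia", edges))
# ['penicillin', 'by', 'treated', 'pneumonia']
```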

47 Decision Mechanisms
Baseline = Exact Match; BaseLex = Exact Match OR Lexical Inference; BaseSyn = Exact Match OR Syntactic Inference; Majority = Exact Match OR (Lexical Inference AND Syntactic Inference). So, we saw 3 methods for recognizing textual entailment – how are we going to combine them?

48 Decision Mechanisms
Baseline = Exact Match; BaseLex = Exact Match OR Lexical Inference; BaseSyn = Exact Match OR Syntactic Inference; Majority = Exact Match OR (Lexical Inference AND Syntactic Inference). First of all, we want to take the simplest method, exact match, and make that our baseline.

49 Decision Mechanisms
Baseline = Exact Match; BaseLex = Exact Match OR WordNet; BaseSyn = Exact Match OR BIUTEE; Majority = Exact Match OR (Lexical Inference AND Syntactic Inference). Now, on top of that we add each of the other methods separately. These combinations basically mean “try exact match, if that doesn’t work, try something smarter”.

50 Decision Mechanisms
Baseline = Exact Match; BaseLex = Exact Match OR WordNet; BaseSyn = Exact Match OR BIUTEE; Majority = Exact Match OR (WordNet AND BIUTEE). Finally, we’ve got the majority rule, which requires both the lexical and the syntactic methods to agree in case it isn’t a trivial exact match.
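The four decision mechanisms can be sketched as boolean combinations of the three recognizers. Each recognizer stands in here as a precomputed boolean; wiring in the actual methods is left out.

```python
# The four decision mechanisms as boolean combinations of the three
# recognizers (exact match, lexical inference, syntactic inference).

def decide(exact: bool, lexical: bool, syntactic: bool, mechanism: str) -> bool:
    if mechanism == "Baseline":
        return exact
    if mechanism == "BaseLex":
        return exact or lexical
    if mechanism == "BaseSyn":
        return exact or syntactic
    if mechanism == "Majority":
        # non-trivial cases need the two smarter methods to agree
        return exact or (lexical and syntactic)
    raise ValueError(mechanism)

print(decide(False, True, True, "Majority"))   # True
print(decide(False, True, False, "Majority"))  # False
```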

51 SemEval 2013 Task 7 Let’s see how we were evaluated in SemEval.

52 Dataset SciEntsBank (Dzikovska et al., 2012) 15 Domains 5,106 Student Responses 16,263 Facets

53 Dataset SciEntsBank (Dzikovska et al., 2012) 15 Domains 5,106 Student Responses 16,263 Facets The challenge used the SciEntsBank dataset…

54 Dataset SciEntsBank (Dzikovska et al., 2012) 15 Domains 5,106 Student Responses 16,263 Facets With 15 domains,

55 Dataset SciEntsBank (Dzikovska et al., 2012) 15 Domains 13,145 Train Facets 16,263 Test Facets About 13,000 training examples,

56 Dataset SciEntsBank (Dzikovska et al., 2012) 15 Domains 13,145 Train Facets 16,263 Test Facets And 16,000 test instances.

57 Unseen Answers 9% Unseen Questions 12% Unseen Domains 79%
Scenarios Unseen Answers 9% Unseen Questions 12% Unseen Domains 79% The test instances were divided into 3 scenarios:

58 Unseen Answers 9% Unseen Questions 12% Unseen Domains 79%
Scenarios Unseen Answers 9% Unseen Questions 12% Unseen Domains 79% Unseen answers from questions that already appeared in the training data,

59 Unseen Answers 9% Unseen Questions 12% Unseen Domains 79%
Scenarios Unseen Answers 9% Unseen Questions 12% Unseen Domains 79% Unseen questions from previously seen domains,

60 Unseen Answers 9% Unseen Questions 12% Unseen Domains 79%
Scenarios Unseen Answers 9% Unseen Questions 12% Unseen Domains 79% And some entirely new domains.

61 Results So, how well did we do?

62 Performance (Weighted F1)
So, what we can see here is the weighted F1 measure of each combination, and each scenario. As you can see, the majority rule dominates the rest, and shows an increase of 8-11% from the baseline. In terms of error reduction, that’s about one third.
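The weighted F1 used here can be sketched as the per-class F1 averaged with weights proportional to each class’s support. This is a plain reimplementation for illustration, not the official SemEval scorer.

```python
# Weighted F1: compute precision/recall/F1 per class, then average the
# per-class F1 scores weighted by each class's share of the gold labels.
from collections import Counter

def weighted_f1(gold, pred):
    total, score = len(gold), 0.0
    for label, support in Counter(gold).items():
        tp = sum(g == p == label for g, p in zip(gold, pred))
        fp = sum(p == label != g for g, p in zip(gold, pred))
        fn = sum(g == label != p for g, p in zip(gold, pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += support / total * f1
    return score

gold = ["yes", "yes", "no", "no"]
pred = ["yes", "no", "no", "no"]
print(round(weighted_f1(gold, pred), 3))  # 0.733
```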

63 Summary So, to sum it up:

64 Problem: Need fine-grained entailment Solution:
Problem: Need fine-grained entailment Solution: Partial Textual Entailment Approach: Leverage existing methods Results: Significant improvement We needed a fine-grained version of textual entailment.

65 Problem: Need fine-grained entailment Solution:
Problem: Need fine-grained entailment Solution: Partial Textual Entailment Approach: Leverage existing methods Results: Significant improvement Partial textual entailment seemed like the way to go.

66 Problem: Need fine-grained entailment Solution:
Problem: Need fine-grained entailment Solution: Partial Textual Entailment Approach: Leverage existing methods Results: Significant improvement We tried modifying existing methods to suit the task.

67 Problem: Need fine-grained entailment Solution:
Problem: Need fine-grained entailment Solution: Partial Textual Entailment Approach: Leverage existing methods Results: Significant improvement And indeed, it paid off. We won! Well, to be fair, we were the only contenders…

68 Partial Textual Entailment is interesting
So, I guess the take-home message is…

69 Partial Textual Entailment is useful
Partial entailment is useful for a bunch of applications,

70 Partial Textual Entailment is practical
It’s practical to get it working,

71 Partial Textual Entailment is interesting
And I really believe it’s one of the most interesting tasks in NLP today. I’d like you to join us in working on partial textual entailment.

72 Questions? Thank you

