
Exploiting Background Knowledge for Relation Extraction. Yee Seng Chan and Dan Roth, University of Illinois at Urbana-Champaign.


2 Relation Extraction
Relation extraction (RE): "David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"
Supervised RE:
- Train on sentences annotated with entity mentions and predefined target relations
- Common features: BOW, POS tags, syntactic/dependency parses, kernel functions based on structured representations of the sentence

3 Background Knowledge
Features employed are usually restricted to being defined on the various representations of the target sentences. Humans, however, rely on background knowledge to recognize relations.
Overall aim of this work: propose methods of using knowledge or resources that exist beyond the sentence:
- Wikipedia, word clusters, hierarchy of relations, entity type constraints, coreference
- Used as additional features, or under the Constrained Conditional Model (CCM) framework with Integer Linear Programming (ILP)

4 Using Background Knowledge
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"


8 Using Background Knowledge
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"
Wikipedia excerpt: "David Brian Cone (born January 2, 1963) is a former Major League Baseball pitcher. He compiled an 8–3 postseason record over 21 postseason starts and was a part of five World Series championship teams (1992 with the Toronto Blue Jays and 1996, 1998, 1999 & 2000 with the New York Yankees). He had a career postseason ERA of 3.80. He is the subject of the book A Pitcher's Story: Innings With David Cone by Roger Angell. Fans of David are known as 'Cone-Heads.' Cone lives in Stamford, Connecticut, and is formerly a color commentator for the Yankees on the YES Network. ... Partly because of the resulting lack of leadership, after the 1994 season the Royals decided to reduce payroll by trading pitcher David Cone and outfielder Brian McRae, then continued their salary dump in the 1995 season. In fact, the team payroll, which was always among the league's highest, was sliced in half from $40.5 million in 1994 (fourth-highest in the major leagues) to $18.5 million in 1996 (second-lowest in the major leagues)."

9 Using Background Knowledge
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"

fine-grained
Employment:Staff      0.20
Employment:Executive  0.15
Personal:Family       0.10
Personal:Business     0.10
Affiliation:Citizen   0.20
Affiliation:Based-in  0.25

10 Using Background Knowledge
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"

fine-grained                 coarse-grained
Employment:Staff      0.20   Employment    0.35
Employment:Executive  0.15
Personal:Family       0.10   Personal      0.40
Personal:Business     0.10
Affiliation:Citizen   0.20   Affiliation   0.25
Affiliation:Based-in  0.25


12 Using Background Knowledge
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"

fine-grained                 coarse-grained
Employment:Staff      0.20   Employment    0.35
Employment:Executive  0.15
Personal:Family       0.10   Personal      0.40
Personal:Business     0.10
Affiliation:Citizen   0.20   Affiliation   0.25
Affiliation:Based-in  0.25

Combined coherent assignment: Employment + Employment:Staff = 0.35 + 0.20 = 0.55

13 Basic Relation Extraction (RE) System
Our BasicRE system:
- Given a sentence "... m1 ... m2 ...", predict whether any predefined relation holds
- Relations are asymmetric, e.g. m1:r:m2 vs m2:r:m1


15 Basic Relation Extraction (RE) System
Most of the features are based on the work in (Zhou et al., 2005):
- Lexical: hw, BOW, bigrams, ...
- Collocations: words to the left/right of the mentions, ...
- Structural: m1-in-m2, #mentions between m1 and m2, ...
- Entity typing: m1, m2 entity types, ...
- Dependency: dep-path between m1 and m2, ...

16 Knowledge Sources
As additional features:
- Wikipedia
- Word clusters
As constraints:
- Hierarchy of relations
- Entity type constraints
- Coreference

17 Knowledge 1: Wikipedia (1) (as additional feature)
We use a Wikifier system (Ratinov et al., 2010) which performs context-sensitive mapping of mentions to Wikipedia pages.
Introduce a new feature based on the Wikifier output, then combine it with the coarse-grained entity types of m_i, m_j.
[Figure: mention pair m_i, m_j with unknown relation r]
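A minimal sketch of forming such a combined feature as a string. The function name, the "linked" signal, and the type tags below are hypothetical stand-ins for illustration, not the actual system's API:

```python
def wiki_feature(wiki_signal, coarse_type_i, coarse_type_j):
    """Combine a Wikipedia-derived signal for a mention pair with the
    coarse-grained entity types of the two mentions into one feature
    string, so the classifier can weight the combination directly."""
    return "wiki:{}|{}-{}".format(wiki_signal, coarse_type_i, coarse_type_j)

# Hypothetical example: the Wikifier relates the two mentions,
# and the mentions are typed PER and ORG.
print(wiki_feature("linked", "PER", "ORG"))  # wiki:linked|PER-ORG
```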

18 Knowledge 1: Wikipedia (2) (as additional feature)
Given m_i, m_j, we use a Parent-Child system (Do and Roth, 2010) to predict whether they have a parent-child relation.
Introduce a new feature based on this prediction, combined with the coarse-grained entity types of m_i, m_j.
[Figure: mention pair m_i, m_j: parent-child?]

19 Knowledge 2: Word Class Information (as additional feature)
Supervised systems face an issue of data sparseness (of lexical features).
Use class information of words to support better generalization, instantiated as word clusters in our work:
- Automatically generated from unlabeled text using the algorithm of (Brown et al., 1992)
[Figure: binary cluster tree with 0/1-labeled branches over the words apple, pear, Apple, IBM, bought, run, of, in]


22 Knowledge 2: Word Class Information
All lexical features consisting of single words are duplicated with their corresponding bit-string representations.
[Figure: cluster tree as before; e.g. the path to IBM yields the bit string 011]
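A sketch of the duplication step, assuming a small hypothetical cluster table mapping words to Brown bit strings (real clusters come from running the Brown et al. (1992) algorithm on unlabeled text):

```python
# Hypothetical Brown-cluster bit strings for the words in the figure.
CLUSTERS = {"apple": "000", "pear": "001", "Apple": "010", "IBM": "011",
            "bought": "10", "run": "11"}

def add_cluster_features(lexical_feats):
    """Duplicate each single-word lexical feature (name=word) with a
    cluster-level copy, so rare words can share features through
    their bit-string cluster id."""
    out = list(lexical_feats)
    for f in lexical_feats:
        name, _, word = f.partition("=")
        bits = CLUSTERS.get(word)
        if bits is not None:
            out.append("{}=cluster:{}".format(name, bits))
    return out

feats = add_cluster_features(["hw=bought", "bow=IBM"])
print(feats)  # ['hw=bought', 'bow=IBM', 'hw=cluster:10', 'bow=cluster:011']
```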

23 Knowledge Sources
As additional features:
- Wikipedia
- Word clusters
As constraints:
- Hierarchy of relations
- Entity type constraints
- Coreference

24 Constrained Conditional Models (CCMs) (Roth and Yih, 2007; Chang et al., 2008)
Score an assignment with a weight vector for "local" models (a collection of classifiers).

25 Constrained Conditional Models (CCMs)
argmax_y  w · φ(x, y) − Σ_i ρ_i · d(y, 1_{C_i})
- w: weight vector for the "local" models (a collection of classifiers)
- ρ_i: penalty for violating constraint C_i
- d(y, 1_{C_i}): how far y is from a "legal" assignment

26 Constrained Conditional Models (CCMs) (Roth and Yih, 2007; Chang et al., 2008)
Knowledge injected: Wikipedia, word clusters, hierarchy of relations, entity type constraints, coreference

27 Constrained Conditional Models (CCMs) (Roth and Yih, 2007; Chang et al., 2008)
Goal of CCM: predict multiple related variables jointly.
- Encode knowledge as constraints to exploit the interaction between the multiple predictions
- Impose constraints on the predictions of the various models; this is a global inference problem
We learn separate models and then perform joint global inference to arrive at the final predictions.

28 Constrained Conditional Models (CCMs)
"David Cone, a Kansas City native, was originally signed by the Royals and broke into the majors with the team"

fine-grained                 coarse-grained
Employment:Staff      0.20   Employment    0.35
Employment:Executive  0.15
Personal:Family       0.10   Personal      0.40
Personal:Business     0.10
Affiliation:Citizen   0.20   Affiliation   0.25
Affiliation:Based-in  0.25

29 Constrained Conditional Models (CCMs) (Roth and Yih, 2007; Chang et al., 2008)
Key steps:
1. Write down a linear objective function
2. Write down constraints as linear inequalities
3. Solve using integer linear programming (ILP) packages
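For label sets as small as the ones in this deck, the joint inference can be sketched without an ILP solver by brute-force search over coherent assignments. The function below is an illustrative stand-in (not the paper's implementation), using the scores from the earlier slides and the hierarchy as a hard constraint:

```python
from itertools import product

def ccm_infer(coarse_scores, fine_scores, parent_of):
    """Pick the (coarse, fine) label pair maximizing the summed
    scores, subject to the hard constraint that the fine label's
    parent equals the coarse label -- a toy stand-in for ILP."""
    best, best_score = None, float("-inf")
    for c, f in product(coarse_scores, fine_scores):
        if parent_of[f] != c:      # constraint violated: skip
            continue
        s = coarse_scores[c] + fine_scores[f]
        if s > best_score:
            best, best_score = (c, f), s
    return best

coarse = {"Employment": 0.35, "Personal": 0.40, "Affiliation": 0.25}
fine = {"Employment:Staff": 0.20, "Employment:Executive": 0.15,
        "Personal:Family": 0.10, "Personal:Business": 0.10,
        "Affiliation:Citizen": 0.20, "Affiliation:Based-in": 0.25}
parent = {f: f.split(":")[0] for f in fine}
print(ccm_infer(coarse, fine, parent))  # ('Employment', 'Employment:Staff')
```

Note that the winner is Employment + Employment:Staff with joint score 0.55, even though Personal has the highest coarse score on its own: the hierarchy constraint makes the two predictions cohere.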

30 Knowledge 3: Relations between our target relations
[Figure: relation hierarchy; coarse-grained relations (e.g. personal, employment, ...) with fine-grained children (family, biz; executive, staff)]

31 Knowledge 3: Hierarchy of Relations
[Figure: the same hierarchy; a coarse-grained classifier predicts over the coarse labels (personal, employment, ...), a fine-grained classifier over their children (family, biz, executive, staff, ...)]

32 Knowledge 3: Hierarchy of Relations
Given mentions m_i, m_j: which coarse-grained relation? which fine-grained relation?
[Figure: hierarchy as before]


38 Knowledge 3: Hierarchy of Relations
Write down a linear objective function over the coarse-grained and fine-grained prediction probabilities.

39 Knowledge 3: Hierarchy of Relations
Write down a linear objective function over the coarse-grained and fine-grained prediction probabilities, with a binary indicator variable for each candidate label (an indicator variable set to 1 == that relation label is assigned).
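Concretely, the objective can be sketched as follows, writing p for the classifiers' prediction probabilities and x for the binary indicator variables, and assuming each relation takes exactly one label (possibly null) at each granularity:

```latex
\max_{x}\;
  \sum_{r_c \in \mathcal{R}_{\mathrm{coarse}}} p(r_c)\, x_{r_c}
  \;+\;
  \sum_{r_f \in \mathcal{R}_{\mathrm{fine}}} p(r_f)\, x_{r_f}
\qquad \text{s.t.}\;
  \sum_{r_c} x_{r_c} = 1,\;
  \sum_{r_f} x_{r_f} = 1,\;
  x \in \{0, 1\}
```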

40 Knowledge 3: Hierarchy of Relations
Write down constraints:
- If a relation R is assigned a coarse-grained label rc, then we must also assign to R a fine-grained label rf which is a child of rc.
- (Capturing the inverse relationship) If we assign rf to R, then we must also assign to R the parent of rf, i.e. the corresponding coarse-grained label.
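In indicator-variable form, the two constraints above correspond to linear inequalities of roughly this shape (a sketch, using the same binary indicators x as in the objective):

```latex
x_{r_c} \;\le\; \sum_{r_f \in \mathrm{children}(r_c)} x_{r_f}
\qquad\text{and}\qquad
x_{r_f} \;\le\; x_{\mathrm{parent}(r_f)}
```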

41 Knowledge 4: Entity Type Constraints (Roth and Yih, 2004, 2007)
Entity types are useful for constraining the possible labels that a relation R can assume.
[Figure: mention pair m_i, m_j; candidate labels: Employment:Staff, Employment:Executive, Personal:Family, Personal:Business, Affiliation:Citizen, Affiliation:Based-in]

42 Knowledge 4: Entity Type Constraints (Roth and Yih, 2004, 2007)
Entity types are useful for constraining the possible labels that a relation R can assume.
[Figure: the same candidate labels, each annotated with an entity-type signature such as per-org or gpe-per]

43 Knowledge 4: Entity Type Constraints (Roth and Yih, 2004, 2007)
We gather entity type constraints from the ACE-2004 documentation and impose them on the coarse-grained relations.
- By improving the coarse-grained predictions and combining them with the hierarchical constraints defined earlier, the improvements propagate to the fine-grained predictions.
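A sketch of applying such constraints as a label filter. The signature table below is illustrative only; the paper reads the actual constraints off the ACE-2004 documentation:

```python
# Hypothetical type signatures per coarse relation (ordered pairs).
ALLOWED = {
    "Employment": {("per", "org")},
    "Personal": {("per", "per")},
    "Affiliation": {("per", "org"), ("per", "gpe")},
}

def compatible_labels(type_i, type_j):
    """Return the coarse relation labels whose entity-type signature
    admits this ordered pair of mention entity types."""
    return sorted(r for r, sigs in ALLOWED.items()
                  if (type_i, type_j) in sigs)

print(compatible_labels("per", "org"))  # ['Affiliation', 'Employment']
```

In the ILP, these filters become hard constraints fixing the indicator variables of incompatible labels to 0.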

44 Knowledge 5: Coreference
[Figure: mention pair m_i, m_j; candidate labels: Employment:Staff, Employment:Executive, Personal:Family, Personal:Business, Affiliation:Citizen, Affiliation:Based-in]

45 Knowledge 5: Coreference
In this work, we assume that we are given the coreference information, which is available from the ACE annotation.
[Figure: if m_i and m_j corefer, the candidate labels collapse to null]
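One way to sketch this constraint is as a hard filter that forces coreferent mention pairs to the null label. This is an illustrative stand-in: in the deck, coreference is imposed as a constraint during the global inference, and the coreference test itself is taken from the ACE annotation:

```python
def apply_coref_constraint(label_scores, corefer):
    """If the two mentions corefer, force the null label with full
    confidence; otherwise leave the classifier's scores untouched."""
    if corefer:
        return {"null": 1.0}
    return label_scores

scores = {"Employment:Staff": 0.6, "null": 0.4}
print(apply_coref_constraint(scores, corefer=True))  # {'null': 1.0}
```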

46 Experiments
Used the ACE-2004 dataset for our experiments:
- Relations do not cross sentence boundaries
- We model the argument order (of the mentions): m1:r:m2 vs m2:r:m1
- Allow a null label prediction when mentions are not related
Classifiers:
- Regularized averaged perceptrons implemented within SNoW (Carlson et al., 1999)
- Followed prior work (Jiang and Zhai, 2007) and performed 5-fold cross validation

47 Performance of the Basic RE System
Build a BasicRE system using only the basic features. Compare against the state-of-the-art feature-based RE system of Jiang and Zhai (2007):
- The authors performed their evaluation using undirected coarse-grained relations (7 relation labels + 1 null label)
- Evaluation on the nwire and bnews corpora of ACE-2004
Performance (F1%):
  Jiang and Zhai (2007): 71.5%    BasicRE: 71.2%

48 Experimental Settings
ACE-2004: 7 coarse-grained and 23 fine-grained relations. Trained two classifiers:
- coarse-grained (15 relation labels)
- fine-grained (47 relation labels)
Focus on evaluation of fine-grained relations. Use the nwire corpus for our experiments:
- Two of our knowledge sources (the Wiki system and word clusters) assume mixed-case input text; the bnews corpus is in lower-cased text
- 28,943 relation instances, of which 2,226 are non-null

49 Evaluation Settings
- Prior work: train-test data splits at mention level; evaluation at mention level
- Our work: train-test data splits at document level; evaluation at entity level (more realistic)

50 Experimental Settings
Evaluate our performance at the entity level:
- Prior work calculated RE performance at the level of mentions
- ACE annotators rarely duplicate a relation link for coreferent mentions (e.g. if r is annotated between m_i and m_j, and m_j corefers with m_k, the m_i-m_k pair may be left as null)
- Given a pair of entities, we establish the set of relation types existing between them, based on their mention annotations
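A sketch of rolling mention-level predictions up to the entity level as described above (the entity ids, mention names, and predictions are illustrative):

```python
from collections import defaultdict

def entity_level_relations(mention_preds, entity_of):
    """Map each predicted mention-pair relation to the pair of
    entities the mentions belong to, collecting the set of relation
    types holding between each entity pair (null is dropped)."""
    rels = defaultdict(set)
    for (mi, mj), label in mention_preds:
        if label != "null":
            rels[(entity_of[mi], entity_of[mj])].add(label)
    return dict(rels)

entity_of = {"m_i": "E1", "m_j": "E2", "m_k": "E2"}  # m_j and m_k corefer
preds = [(("m_i", "m_j"), "Employment:Staff"),
         (("m_i", "m_k"), "null")]
print(entity_level_relations(preds, entity_of))
# {('E1', 'E2'): {'Employment:Staff'}}
```

With this rollup, the unannotated m_i-m_k link no longer counts against the system, since the E1-E2 entity pair already carries the relation.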

51 Experiment Results (F1%): fine-grained relations
            All nwire   10% of nwire
BasicRE     50.5%       31.0%

52 Experiment Results (F1%): fine-grained relations
            All nwire   10% of nwire
BasicRE     50.5%       31.0%
[Figure: F1% improvement from using each knowledge source]

53 Related Work
Ji et al. (2005), Zhou et al. (2005, 2008), Jiang (2009)

54 Conclusion
We proposed a broad range of methods to inject background knowledge into an RE system. Some methods (e.g. exploiting the relation hierarchy) are general in nature.
To combine the various relation predictions, we perform global inference within an ILP framework:
- Allows for easy injection of knowledge as constraints
- Ensures globally coherent models and predictions

