
Distant Supervision for Relation Extraction without Labeled Data CSE 5539.


1 Distant Supervision for Relation Extraction without Labeled Data CSE 5539

2 A few administrative notes…
– I sent an email to the class; let me know if you're not able to get it
– Logging into Carmen? Let me know if you continue to have trouble
– Email me to reserve a presentation slot if you haven't already
– Email me with project groups (there is a forum for finding group-mates on Carmen)

3 Dan Jurafsky: Extracting relations from text
Company report: "International Business Machines Corporation (IBM or the company) was incorporated in the State of New York on June 16, 1911, as the Computing-Tabulating-Recording Co. (C-T-R)…"
Extracted complex relation: Company-Founding
– Company: IBM
– Location: New York
– Date: June 16, 1911
– Original-Name: Computing-Tabulating-Recording Co.
But we will focus on the simpler task of extracting relation triples:
– Founding-year(IBM, 1911)
– Founding-location(IBM, New York)

4 Traditional Relation Extraction
Approaches:
– Supervised (ACE)
– Rule-based (Hearst patterns, etc.)
Advantages:
– Good in-domain performance
Disadvantages:
– Requires lots of human effort for each new relation
– Doesn't generalize much beyond the domain

5 Dan Jurafsky: Bootstrapping
Start from a seed tuple, then grep (Google) for the environments of the seed tuple:
– "Mark Twain is buried in Elmira, NY." → X is buried in Y
– "The grave of Mark Twain is in Elmira" → The grave of X is in Y
– "Elmira is Mark Twain's final resting place" → Y is X's final resting place
Use those patterns to grep for new tuples, then iterate.
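The bootstrapping loop above can be sketched in a few lines. This is a toy illustration, not the actual system: real bootstrappers use web-scale search and careful pattern scoring, and the entity-span heuristic here (everything to either side of the middle string) is deliberately crude.

```python
def pattern_from(sentence, x, y):
    """Reduce a sentence mentioning both seed entities to an
    (X, middle, Y) template, keeping the text between them."""
    i, j = sentence.find(x), sentence.find(y)
    if i < 0 or j < 0:
        return None
    if i < j:
        return ("X", sentence[i + len(x):j], "Y")
    return ("Y", sentence[j + len(y):i], "X")

def apply_pattern(middle, sentence):
    """Match 'X <middle> Y' against a new sentence; crudely treat the
    spans on either side of the middle string as candidate entities."""
    if middle not in sentence:
        return None
    left, _, right = sentence.partition(middle)
    return left.strip(), right.strip(" .")

# Extract a pattern from the seed, then use it to find a new tuple.
seed = pattern_from("Mark Twain is buried in Elmira, NY.",
                    "Mark Twain", "Elmira")
new = apply_pattern(seed[1],
                    "Edgar Allan Poe is buried in Baltimore, MD.")
```

Iterating means feeding `new` back in as a seed tuple and repeating, which is exactly where semantic drift creeps in if patterns are not filtered.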

6 Dan Jurafsky: DIPRE: Extract (author, book) pairs
Start with 5 seeds:
Author: Isaac Asimov / Book: The Robots of Dawn
Author: David Brin / Book: Startide Rising
Author: James Gleick / Book: Chaos: Making a New Science
Author: Charles Dickens / Book: Great Expectations
Author: William Shakespeare / Book: The Comedy of Errors
Find instances:
– The Comedy of Errors, by William Shakespeare, was
– The Comedy of Errors, by William Shakespeare, is
– The Comedy of Errors, one of William Shakespeare's earliest attempts
– The Comedy of Errors, one of William Shakespeare's most
Extract patterns (group by middle, take longest common prefix/suffix):
– ?x, by ?y,
– ?x, one of ?y's
Now iterate, finding new seeds that match the patterns.
Brin, Sergey. 1998. Extracting Patterns and Relations from the World Wide Web.
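DIPRE's "group by middle, take longest common prefix/suffix" step can be sketched directly. The occurrence triples below are hand-built from the slide's Shakespeare examples; a real run would collect them from web pages.

```python
import os

def common_suffix(strings):
    """Longest common suffix, computed as a reversed common prefix."""
    return os.path.commonprefix([s[::-1] for s in strings])[::-1]

def dipre_pattern(occurrences):
    """occurrences: (prefix, middle, suffix) triples sharing one middle
    string. Generalize the context: longest common suffix of the
    prefixes, the shared middle, longest common prefix of the suffixes."""
    middle = occurrences[0][1]
    pre = common_suffix([o[0] for o in occurrences])
    suf = os.path.commonprefix([o[2] for o in occurrences])
    return pre + "?x" + middle + "?y" + suf

# Two occurrences of "<book>, by <author>, ..." grouped by their middle:
occs = [("", ", by ", ", was"),
        ("", ", by ", ", is")]
print(dipre_pattern(occs))  # the slide's pattern: ?x, by ?y,
```

Note that `os.path.commonprefix` compares character by character despite its name, which is exactly what this generalization step needs.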

7 Dan Jurafsky: Distant Supervision
Combine bootstrapping with supervised learning:
– Instead of 5 seeds, use a large database to get a huge number of seed examples
– Create lots of features from all these examples
– Combine them in a supervised classifier
Snow, Jurafsky, Ng. 2005. Learning syntactic patterns for automatic hypernym discovery. NIPS 17.
Fei Wu and Daniel S. Weld. 2007. Autonomously Semantifying Wikipedia. CIKM 2007.
Mintz, Bills, Snow, Jurafsky. 2009. Distant supervision for relation extraction without labeled data. ACL 2009.

8 Dan Jurafsky: Distant supervision paradigm
Like supervised classification:
– Uses a classifier with lots of features
– Supervised by detailed hand-created knowledge
– Doesn't require iteratively expanding patterns
Like unsupervised classification:
– Uses very large amounts of unlabeled data
– Not sensitive to genre issues in the training corpus

9 Dan Jurafsky: Distantly supervised learning of relation extraction patterns
1. For each relation, for each tuple in a big database
2. Find sentences in a large corpus containing both entities
3. Extract frequent features (parse paths, words, etc.)
4. Train a supervised classifier using thousands of patterns
Example (Born-In):
– "Hubble was born in Marshfield" → PER was born in LOC
– "Einstein, born (1879), Ulm" → PER, born (XXXX), LOC
– "Hubble's birthplace in Marshfield" → PER's birthplace in LOC
Classifier: P(born-in | f1, f2, f3, …, f70000)
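The data-construction step (match KB tuples against a corpus, turn each matching sentence into a labeled feature vector) can be sketched as follows. The KB, corpus, and single lexical feature template are toy stand-ins: Mintz et al. use Freebase, full Wikipedia, and tens of thousands of lexical and dependency-path features.

```python
# Toy KB of (entity1, entity2) -> relation, mimicking Freebase tuples.
kb = {("Hubble", "Marshfield"): "born-in",
      ("Einstein", "Ulm"): "born-in"}

corpus = [
    "Hubble was born in Marshfield",
    "Einstein, born (1879), Ulm",
    "Hubble's birthplace in Marshfield",
]

def lexical_feature(sentence, e1, e2):
    """One feature type: the words between the two entities, with
    placeholders. Real systems add POS tags, windows, parse paths."""
    i, j = sentence.find(e1), sentence.find(e2)
    if i < 0 or j < 0 or i >= j:
        return None
    return "E1 " + sentence[i + len(e1):j].strip() + " E2"

def build_training_set(kb, corpus):
    """The distant-supervision assumption: any sentence containing a
    KB pair is treated as a positive example of that pair's relation."""
    examples = []
    for (e1, e2), rel in kb.items():
        for sent in corpus:
            feat = lexical_feature(sent, e1, e2)
            if feat is not None:
                examples.append((feat, rel))
    return examples
```

The resulting `(feature, relation)` pairs feed a standard multiclass classifier; the slide's P(born-in | f1, …, f70000) is that classifier's output for one relation.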

10 – 12 (image-only slides; no transcript text)

13 Automatic Evaluation
Hold out facts from Freebase:
– Evaluate precision and recall against the held-out facts
Problems:
– Extractions often missing from Freebase get marked as precision errors
– These are the extractions we really care about: new facts, not contained in Freebase

14 Automatic Evaluation

15 Automatic Evaluation: Discussion
Correct predictions will be missing from the DB:
– Underestimates precision
This evaluation is biased:
– Systems which make predictions for more frequent entity pairs will do better
– Hard constraints => explicitly trained to predict facts already in Freebase
[Riedel et al. 2013]

16 Human Evaluation

