
1 Natural Language Knowledge Graphs Open-IE meets Knowledge Representation Ido Dagan Bar-Ilan University, Israel

2 Knowledge Representation (KR)
Two complementary frameworks:
- Knowledge graphs: formal pre-specified schema & predicates; require (supervised) IE to populate from text; targeting established knowledge
- Open IE: arbitrary propositions found in text (anything said); represented in natural language terms
Our research line: extend Open IE towards a richer KR framework

3 From Berant et al., 2014. Appeal: complex aggregation queries via semantic parsing (beyond text-QA scope), e.g. politicians' spouses who lived in Chicago.

4 Open IE

5 What’s missing in Open IE?

6 Enriching Proposition Structure and Coverage. Gabriel Stanovsky, Jessica Ficler, Ido Dagan, Yoav Goldberg. Based on a paper at the Semantic Parsing Workshop @ ACL 2014 (work in progress)

7 Open IE produces tuples of a predicate and its arguments:
(Curiosity, is a, rover)
(Curiosity, is a, science lab)
(Curiosity, landed on, Mars)
(Curiosity, explores, Mars)
(NASA, launched, Curiosity)
(Curiosity, surveys, Mars’ surface)
(Curiosity, collects, rock samples)
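To make the tuple format concrete, here is a toy sketch of extracting such tuples from a dependency parse with spaCy. This is not the extractor behind these slides; the subject/object rules and the Proposition type are illustrative assumptions, and the en_core_web_sm model is assumed to be installed.

```python
from collections import namedtuple
import spacy  # assumes spaCy and its small English model are installed

# One Open IE-style tuple: a predicate with two arguments.
Proposition = namedtuple("Proposition", ["arg1", "predicate", "arg2"])

nlp = spacy.load("en_core_web_sm")

def extract_svo(text):
    """Toy extractor: emit (subject, verb, object) tuples from the dependency parse."""
    props = []
    for sent in nlp(text).sents:
        for token in sent:
            if token.pos_ != "VERB":
                continue
            subjects = [c for c in token.children if c.dep_ in ("nsubj", "nsubjpass")]
            objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
            for s in subjects:
                for o in objects:
                    props.append(Proposition(s.text, token.lemma_, o.text))
    return props

print(extract_svo("NASA launched Curiosity. Curiosity explores Mars."))
# e.g. [Proposition(arg1='NASA', predicate='launch', arg2='Curiosity'),
#       Proposition(arg1='Curiosity', predicate='explore', arg2='Mars')]
```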

8 Limitations of Current Proposition Structure
- Falls short of capturing all information conveyed
- Falls short of representing the internal structure of information
→ Enrich proposition representation and extraction

9 Extracting Implied Propositions. Propositions can be implied by syntax:
- Apposition: “Curiosity, the Mars rover, landed on Mars” → Curiosity is the Mars rover
- Possessives: “Curiosity’s robotic arm is used to collect samples” → Curiosity has a robotic arm
Also implied by adjectives, nominalizations, conjunctions, etc.
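A minimal sketch of the kind of syntactic rules that could surface the two implied propositions above (apposition and possessives), again using spaCy. The rules are illustrative assumptions, not the rules used in this work, and they only recover the relevant phrases approximately.

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def implied_propositions(doc):
    """Toy rules: apposition -> 'X is Y', possessive -> 'X has Y'."""
    props = []
    for token in doc:
        if token.dep_ == "appos":
            # "Curiosity, the Mars rover, ..."  ->  (Curiosity, is, the Mars rover)
            phrase = " ".join(t.text for t in token.subtree if not t.is_punct)
            props.append((token.head.text, "is", phrase))
        elif token.dep_ == "poss":
            # "Curiosity's robotic arm ..."  ->  (Curiosity, has, robotic arm)
            owner_ids = {t.i for t in token.subtree}  # the possessor and its 's marker
            owned = " ".join(t.text for t in token.head.subtree
                             if t.i not in owner_ids and not t.is_punct)
            props.append((token.text, "has", owned))
    return props

doc = nlp("Curiosity, the Mars rover, landed on Mars. "
          "Curiosity's robotic arm is used to collect samples.")
print(implied_propositions(doc))
# e.g. [('Curiosity', 'is', 'the Mars rover'), ('Curiosity', 'has', 'robotic arm')]
```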

10 Enriching Structure. Propositions can be embedded; arguments and predicates may have internal structure: NASA utilizes Curiosity to survey Mars; Curiosity examines rock samples from Mars.

11 Proposition Structures. Sentence: NASA utilizes the Mars rover, Curiosity, to examine rock samples from Mars
- Predicate: is; Subject: Curiosity; Object: the Mars rover (Q: “What is Curiosity?”)
- Predicate: utilize; Subject: NASA; Object: the Mars rover; Comp: examine (Q: “Who utilizes the Mars rover?”)
- Predicate: examine; Subject: NASA; Object: rock samples; Modifier: from Mars (Q: “What did NASA examine?”)

12 Proposition Structures. Sentence: NASA utilizes the Mars rover, Curiosity, to examine rock samples from Mars
- Predicate: is; Subject: Curiosity; Object: the Mars rover
- Predicate: utilize; Subject: NASA; Object: the Mars rover; Comp: examine
- Predicate: examine; Subject: the Mars rover; Object: rock samples; Modifier: from Mars
Explicitly represent implied propositions and embedded structure
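One way to make these nested structures concrete is a small recursive data type. This is only a sketch mirroring the slide; the field names (subject, obj, modifiers, comp) are assumptions, not the schema of an existing tool.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Prop:
    """A proposition whose complement may itself be a proposition (comp)."""
    predicate: str
    subject: str
    obj: Optional[str] = None
    modifiers: List[str] = field(default_factory=list)
    comp: Optional["Prop"] = None  # embedded complement proposition

# NASA utilizes the Mars rover, Curiosity, to examine rock samples from Mars
is_a    = Prop(predicate="is", subject="Curiosity", obj="the Mars rover")
examine = Prop(predicate="examine", subject="the Mars rover",
               obj="rock samples", modifiers=["from Mars"])
utilize = Prop(predicate="utilize", subject="NASA",
               obj="the Mars rover", comp=examine)

print(utilize.comp.modifiers)  # ['from Mars']
```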

13 Further Steps
- Soon: a tool that produces proposition structures – generically, a better “syntax wrapper” for semantic processing (vs. dependency trees)
- Add sub-proposition factuality (truth assertion): TruthTeller (Lotan et al., NAACL 2012), OLLIE (Mausam et al., EMNLP 2012)
- Extract implied arguments – from discourse rather than syntax (Stern and Dagan, ACL 2014)

14 TruthTeller: Predicate Truth Value Annotation – add features to nodes (pt+)

15 Predicate Truth Value Annotation – add features to nodes (pt-)

16 Predicate Truth Value Annotation – add features to nodes (pt?)

17 Algorithm Overview
1. Main phenomena addressed: presupposition; factives, implicatives; implication signature changes; negation (verbal, nominal, double…); conjunctions, apposition…
2. Annotation rules identifying the above phenomena
3. Large lexicon of factives and implicatives
4. Recursive algorithm for a Natural Logic style calculus
(Kiparsky & Kiparsky 1970; Karttunen 1971, 2012; Lakoff 1970; Nairn et al. 2006; MacCartney & Manning 2009)
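A toy illustration of a recursive, Natural Logic style truth-value calculus like the one listed in step 4; it is not TruthTeller itself. The implication signatures below are illustrative entries (a real lexicon is far larger), and the pt+/pt-/pt? labels mirror the annotations on slides 14–16.

```python
# Hypothetical implication signatures: what an embedding predicate does to its
# complement's truth value in a positive ("+") vs. negated ("-") context.
SIGNATURES = {
    "manage": {"+": "+", "-": "-"},  # implicative: "managed to X" -> X happened
    "fail":   {"+": "-", "-": "+"},  # implicative with a flipped signature
    "want":   {"+": "?", "-": "?"},  # non-factive: no commitment about X
    "know":   {"+": "+", "-": "+"},  # factive: X presupposed true even under negation
}

def predicate_truth(chain, negated=False):
    """Recursively propagate truth down a chain of embedding predicates,
    e.g. ["fail", "manage", "land"], returning pt+ / pt- / pt? for the innermost one."""
    value = "-" if negated else "+"
    for outer in chain[:-1]:
        signature = SIGNATURES.get(outer, {"+": "?", "-": "?"})
        value = signature[value]
        if value == "?":
            break
    return "pt" + value

print(predicate_truth(["fail", "manage", "land"]))      # pt- : the landing did not happen
print(predicate_truth(["know", "land"], negated=True))  # pt+ : factive survives negation
```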

18 Focused Entailment Graphs for Open IE Propositions. Omer Levy, Ido Dagan, Jacob Goldberger. CoNLL 2014

19 Adding Inter-proposition Structure to Open IE

20 Original Open IE Output
aspirin, eliminate, headache
aspirin, cure, headache
headache, control with, aspirin
drug, relieve, headache
drug, treat, headache
analgesic, banish, headache
headache, respond to, painkiller
headache, treat with, caffeine
coffee, help, headache
tea, soothe, headache

21 Consolidated Open IE Output
aspirin, eliminate, headache
aspirin, cure, headache
headache, control with, aspirin
drug, relieve, headache
drug, treat, headache
analgesic, banish, headache
headache, respond to, painkiller
headache, treat with, caffeine
coffee, help, headache
tea, soothe, headache

22 Semantic Applications. Example: structured queries – “What relieves headaches?”
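A minimal sketch of answering such a structured query once propositions are linked by entailment edges. The graph below is a hand-built illustration over the headache example (the edges are assumptions, not system output), using networkx.

```python
import networkx as nx

# Hypothetical entailment edges: more specific proposition -> proposition it entails.
G = nx.DiGraph()
G.add_edges_from([
    (("aspirin", "cure", "headache"),      ("drug", "relieve", "headache")),
    (("aspirin", "eliminate", "headache"), ("drug", "relieve", "headache")),
    (("analgesic", "banish", "headache"),  ("drug", "relieve", "headache")),
    (("tea", "soothe", "headache"),        ("drug", "relieve", "headache")),  # illustrative
])

def answers(query_proposition):
    """'What relieves headaches?' -> every proposition that entails the query node."""
    if query_proposition not in G:
        return set()
    return nx.ancestors(G, query_proposition)

for arg1, _pred, _arg2 in answers(("drug", "relieve", "headache")):
    print(arg1)  # aspirin, analgesic, tea, ...
```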

23 Semantic Applications

24–25 (Figure slides: the same propositions, connected into an entailment graph)

26 aspirin, drug, analgesic, painkiller, caffeine, coffee, tea

27 aspirin, drug, analgesic, painkiller, caffeine, coffee, tea. Next step – graph aggregation: “Which drinks relieve headache?”

28 Our Contributions
- Structuring Open IE with Proposition Entailment Graphs
- Dataset: 30 gold-standard graphs, 1.5 million entailment annotations
- Algorithm for constructing Focused Proposition Entailment Graphs
- Analysis: predicate entailment is not quite what we thought

29 Algorithm

30 How do we recognize proposition entailment?

31–33 (figure-only slides)

34 (Diagram: lexical entailment features feed a logistic classifier that predicts lexical entailment)

35 (Diagram, expanded: lexical entailment features – WordNet relations, UMLS, distributional similarity, string edit distance – combined with supervision in a logistic lexical entailment classifier)
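A minimal sketch of a supervised lexical-entailment classifier in this spirit, combining a few of the listed features in a logistic regression. The feature set is partial (UMLS is omitted and the distributional-similarity score is a stubbed placeholder), the training pairs are toy data, and it assumes scikit-learn plus NLTK with the WordNet corpus downloaded.

```python
import numpy as np
from difflib import SequenceMatcher
from nltk.corpus import wordnet as wn            # assumes nltk.download("wordnet") was run
from sklearn.linear_model import LogisticRegression

def wordnet_features(w1, w2):
    """Two binary WordNet indicators: shared synset, and w2 is a hypernym of w1."""
    s1, s2 = set(wn.synsets(w1)), set(wn.synsets(w2))
    synonym = bool(s1 & s2)
    hypernym = any(h in s2 for s in s1 for h in s.closure(lambda x: x.hypernyms()))
    return [float(synonym), float(hypernym)]

def features(w1, w2, dist_sim=0.0):
    """WordNet indicators + string edit similarity + placeholder distributional similarity."""
    edit_sim = SequenceMatcher(None, w1, w2).ratio()
    return wordnet_features(w1, w2) + [edit_sim, dist_sim]

# Supervision: labeled word pairs (1 = entails, 0 = does not) -- toy data only.
pairs = [("cure", "treat", 1), ("eliminate", "remove", 1), ("cause", "relieve", 0)]
X = np.array([features(a, b) for a, b, _ in pairs])
y = np.array([label for _a, _b, label in pairs])

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([features("soothe", "relieve")])[0, 1])  # P(entailment)
```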

36 Are WordNet relations capturing real-world predicate entailments?

37 Predicate Entailment vs WordNet Relations. Why isn’t WordNet capturing predicate entailment? Over a predicate inference subset, how many predicate entailments are covered by WordNet? Positive indicators: synonyms, hypernyms, entailment. Negative indicators: antonyms, hyponyms, cohyponyms.
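A sketch of checking one predicate pair against the positive and negative indicators above with NLTK's WordNet (verb synsets only). The ordering of the checks and the helper name are arbitrary illustration choices.

```python
from nltk.corpus import wordnet as wn  # assumes the WordNet corpus is downloaded

def wordnet_indicator(p1, p2):
    """Positive: synonym / hypernym / entailment; negative: antonym / hyponym / cohyponym."""
    s1 = set(wn.synsets(p1, pos=wn.VERB))
    s2 = set(wn.synsets(p2, pos=wn.VERB))
    if s1 & s2:
        return "positive: synonyms"
    if any(h in s2 for s in s1 for h in s.hypernyms()):
        return "positive: hypernym"
    if any(e in s2 for s in s1 for e in s.entailments()):
        return "positive: entailment"
    if any(a.synset() in s2 for s in s1 for lem in s.lemmas() for a in lem.antonyms()):
        return "negative: antonyms"
    if any(h in s1 for s in s2 for h in s.hypernyms()):
        return "negative: hyponym"
    if any(set(a.hypernyms()) & set(b.hypernyms()) for a in s1 for b in s2):
        return "negative: cohyponyms"
    return "not covered by WordNet"

print(wordnet_indicator("snore", "sleep"))  # 'snore' entails 'sleep' in WordNet's verb hierarchy
print(wordnet_indicator("cure", "treat"))
```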

38 Predicate Entailment is Context-Sensitive

39 Appeal of NL KR
- Scalable – in principle unlimited coverage
- Easy to communicate with people: understand, supervise – add knowledge (vs. in a logic representation)
- May add additional links between propositions: causality, temporal, argumentative
- Supports at least some useful inferences

40 Integration with Logic-based Approaches
- Integrate with logical/formal representations for concrete phenomena, e.g. temporal, arithmetic, spatial
- Borrow ideas/methods from logic to apply over NL KR – which are relevant and applicable?

41 Text Exploration via NL Knowledge Graphs: customer interactions, exploratory search

42 Example: Service issues
not happy with the catering; coffee is awful; they have horrible coffee; disgusting coffee is served; coffee in economy is awful; no refreshments; food on train is too expensive; sandwiches are too expensive; sandwiches are overpriced; you charge too much for sandwiches; food is bad; food quality is disappointing; bad food in premier; not enough food selection; expand meal options; no vegetarian food; provide veggie meals; not happy with the service; journey is too slow; no clear information; not happy with the staff; staff is unfriendly

43 Customer Interactions Entailment Graph
not happy with the catering; coffee is awful; coffee in economy is awful; no refreshments; food on train is too expensive; sandwiches are too expensive; food is bad; bad food in premier; not enough food selection; no vegetarian food
not happy with the service; journey is too slow; no clear information; not happy with the staff; staff is unfriendly
not happy with the toilets; toilets are dirty; toilets are smelly; missing hygienic supplies; no soap in toilets; no toilet paper
not happy with train facilities; seats are uncomfortable; missing facilities; no children section; no WIFI; no AC in cars; facilities are bad; no AC; no AC in station
station is too crowded; Marshfield station is too crowded; cars are congested; pathway is too narrow; lack of personal space; lack of storage space
improve the website; online booking can be better; webpage shows old timetables; no website for android; can't find FAQ page
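A small sketch of how such a consolidated complaint hierarchy could be explored, drilling down from a general issue to the more specific complaints that entail it. The edges are a hand-picked subset read off the example above (their exact structure in the real graph is not shown in the transcript), and networkx is used only for illustration.

```python
import networkx as nx

# Hypothetical edges: specific complaint -> the more general issue it entails.
complaints = nx.DiGraph()
complaints.add_edges_from([
    ("coffee is awful", "not happy with the catering"),
    ("coffee in economy is awful", "coffee is awful"),
    ("food on train is too expensive", "not happy with the catering"),
    ("sandwiches are too expensive", "food on train is too expensive"),
    ("staff is unfriendly", "not happy with the staff"),
    ("not happy with the staff", "not happy with the service"),
])

def drill_down(issue, depth=0):
    """Exploratory search: print an issue and, indented, the complaints entailing it."""
    print("  " * depth + issue)
    for specific in complaints.predecessors(issue):
        drill_down(specific, depth + 1)

drill_down("not happy with the catering")
# not happy with the catering
#   coffee is awful
#     coffee in economy is awful
#   food on train is too expensive
#     sandwiches are too expensive
```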

44 Conclusion: an exciting research area
- Extend Open IE to become an NL-based knowledge graph: NL proposition structure; graph of inter-proposition relations
- Entailment – consolidation and hierarchy for propositions
- Other relations desired – causal, temporal, argumentative, …
- How does it integrate with formal/artificial language KR?
Thank You!

