
1 Explanation Producing Combination of NLP and Logical Reasoning through Translation of Text to KR Formalisms. Chitta Baral, School of Computing, Informatics, and Decision Systems Engineering, Arizona State University.

2 Our Assumptions and Approaches
- Shonan Meeting title: Towards Explanation Production Combining NLP and Logical Reasoning.
- Broader goals:
  - Natural language understanding: answer questions with respect to natural language text and be able to explain the answers in a logical manner; build NL interfaces for various systems.
  - Make "reasoning" (with knowledge) more widespread in intelligent systems; a big bottleneck is knowledge acquisition.
- Methodology: translate natural language (both question and text) to knowledge representation formalisms and follow it up with reasoning, interfacing, etc.
- Two approaches:
  - NL2KR: a platform that can be used to translate NL text to a formal representation of choice via training.
  - Kparser: a Knowledge Parser that translates text to a collection of triplets (RDF style).

3 Some Guiding Problems
- Winograd Schema Challenge (Kparser)
- Natural language interfaces to various systems (NL2KR):
  - Geoquery
  - CLang (the RoboCup command language)
  - Datanet policy generation
- Solving logic puzzles described in English (NL2KR)

4 Winograd Schema Challenge Example
The town councilors refused to give the demonstrators a permit because they feared violence. Who feared violence?
The town councilors refused to give the demonstrators a permit because they advocated violence. Who advocated violence?
A Winograd schema:
- contains a pair of sentences that differ in only one or two words;
- the sentences contain an ambiguity that is resolved in opposite ways in the two sentences;
- requires the use of world knowledge and reasoning for its resolution.

5 Winograd Schema Challenge
- A question answering test.
- A collection of 141 Winograd schemas, 282 sentences in total.
- A question about each sentence.
Example: The town councilors refused to give the demonstrators a permit because they feared violence. Who feared violence? / The town councilors refused to give the demonstrators a permit because they advocated violence. Who advocated violence?

6 (figure-only slide; no text)

7 NL2KR Platform

8 The NL2KR Framework: a framework for developing translation systems that translate natural language to a wide variety of formal language representations. Inspired by Montague's approach.
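As a textbook Montague-style illustration (not necessarily NL2KR's actual lexicon or grammar), each word is assigned a λ-calculus meaning and meanings combine by function application as dictated by the syntax:

[[\text{John}]] = \mathit{john}, \quad
[[\text{walks}]] = \lambda x.\,\mathit{walks}(x), \quad
[[\text{John walks}]] = [[\text{walks}]]\,@\,[[\text{John}]] = \mathit{walks}(\mathit{john})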

9 Learning Algorithms
- Inverse Lambda: computes a λ-expression F such that H = F@G or H = G@F, given H and G.
- Generalization: learns from syntactically similar words with known meanings; can learn meanings of words that are not even present in training.
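A minimal sketch of the left-application case (F such that F@G = H), assuming λ-terms are plain strings; this is a toy reconstruction for illustration, not the actual NL2KR inverse-λ algorithm, and the names inverse_left and apply_left are placeholders:

def inverse_left(h, g, var="x"):
    """Toy inverse lambda: given target H and argument G (both plain strings),
    return F with F @ G reducing back to H, by abstracting literal
    occurrences of G out of H. Handles only this simple case."""
    if g not in h:
        raise ValueError("G does not occur literally in H; this toy case fails")
    return "λ" + var + "." + h.replace(g, var)

def apply_left(f, g):
    """Beta-reduce (F @ G) for the string terms produced by inverse_left."""
    head, body = f.split(".", 1)
    return body.replace(head.lstrip("λ"), g)

H = "loves(john, mary)"
G = "mary"
F = inverse_left(H, G)        # -> "λx.loves(john, x)"
assert apply_left(F, G) == H  # applying F to G gives back H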

10 NL2KR Architecture

11 (figure-only slide; no text)

12 (figure-only slide; no text)

13 Kparser and a System to Address the Winograd Schema Challenge

14 Solving the Winograd Schema Challenge Using Semantic Parsing, Automatic Knowledge Acquisition and Logical Reasoning: The Workflow (diagram). The given sentence and question go to the Semantic Parser and the Pronoun Extractor; the Automatic Background Knowledge Extractor retrieves background sentences, which are also semantically parsed; the Logical Reasoning Module takes the semantic representations of the sentence, question, and background sentences, together with the pronoun to be resolved, and produces the answer.

15 Semantic Parser & Pronoun Extractor

16 What is K-Parser? A system that converts English text to a formal representation; an attempt to overcome challenges in representation. Example sentence: "John loves the smell of daisies".
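For illustration only, the RDF-style triple output described for K-Parser might look roughly like the following for this sentence; the exact node and relation names below are assumptions, not K-Parser's actual output:

triples = [                 # hypothetical output; node and relation names are assumed
    ("loves-2",   "agent",       "John-1"),
    ("loves-2",   "recipient",   "smell-4"),
    ("smell-4",   "related_to",  "daisies-6"),
    ("John-1",    "instance_of", "person"),
    ("loves-2",   "instance_of", "love"),
    ("smell-4",   "instance_of", "smell"),
]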

17 Semantic Parser: Initial Version
- Represents text in an expressive formal representation.
- Preserves grammatical structure (syntactic dependency parse).
- Disambiguates words using an ontology (WordNet).
- Uses a general set of relations: the Knowledge Machine (KM) slot dictionary.

18 Syntactic Dependency Parse
Stanford dependency parse of "The man loves his hat": det(man, The), nsubj(loves, man), dobj(loves, hat), poss(hat, his). POS tags: The/DT, man/NN, loves/VBZ, his/PRP$, hat/NN.
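A quick way to reproduce a comparable parse, using spaCy here as a stand-in for the Stanford dependency parser (label sets differ slightly between parsers):

import spacy

nlp = spacy.load("en_core_web_sm")   # assumes the small English model is installed
doc = nlp("The man loves his hat")
for tok in doc:
    # print each relation as relation(head, dependent)
    print(f"{tok.dep_}({tok.head.text}, {tok.text})")
# Expected to include relations of the kind shown above:
# det(man, The), nsubj(loves, man), poss(hat, his), dobj(loves, hat)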

19 Knowledge Machine Slot Dictionary Mapping
The KM slot dictionary is a set of relations: event-event, event-entity, entity-entity, event-role, entity-value relations, etc. It contains approximately 100 relations; examples are agent, recipient, causes, and trait. We map Stanford dependency relations to KM slot dictionary relations.

20 Knowledge Machine Slot Dictionary Mapping
Semantic parse of "The man loves his hat" after mapping the Stanford dependency parse to KM slots: agent(loves, man), recipient(loves, hat), possessed_by(hat, his).
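A sketch of this mapping step under the assumption that, at its simplest, it is a lookup from dependency labels to KM slots; the table below covers only the three relations in this example and is not the full mapping:

DEP_TO_KM = {          # hypothetical subset of the full Stanford-to-KM mapping
    "nsubj": "agent",
    "dobj": "recipient",
    "poss": "possessed_by",
}

def to_km(dep_triples):
    """Relabel (relation, head, dependent) triples with KM slots, dropping
    relations (such as 'det') that have no KM counterpart in this subset."""
    return [(DEP_TO_KM[r], h, d) for r, h, d in dep_triples if r in DEP_TO_KM]

deps = [("det", "man", "The"), ("nsubj", "loves", "man"),
        ("dobj", "loves", "hat"), ("poss", "hat", "his")]
print(to_km(deps))
# [('agent', 'loves', 'man'), ('recipient', 'loves', 'hat'), ('possessed_by', 'hat', 'his')]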

21 Ontology Addition: ontology information added to the semantic parse of "Tom loves Mary because Mary loves him".

22 Kparser Graph Properties
- Roots: KM relations are represented as directed edges; the nodes with no incoming edge are the roots. A root can be an event or an entity.
- Other nodes: leaf nodes represent classes; the nodes with an incoming instance_of edge form the first level of classes, obtained by lemmatizing the words.
- Edges: edges represent KM relations.
- The Kparser graph is acyclic.
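These properties are easy to check on a small graph; a sketch using networkx, where node and edge names reuse the earlier example and are purely illustrative:

import networkx as nx

G = nx.DiGraph()
G.add_edge("loves-3", "man-2", label="agent")
G.add_edge("loves-3", "hat-5", label="recipient")
G.add_edge("hat-5", "his-4", label="possessed_by")
G.add_edge("loves-3", "love", label="instance_of")   # class node obtained by lemmatization
G.add_edge("man-2", "man", label="instance_of")

roots = [n for n in G.nodes if G.in_degree(n) == 0]   # nodes with no incoming edge
print(roots)                                          # ['loves-3']
print(nx.is_directed_acyclic_graph(G))                # True: the graph is acyclic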

23 Word Sense Disambiguation: WSD is used to determine the correct superclass of words. Example: "Tom walks slowly" ("walks" gets the superclass motion).
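The superclass in this example corresponds to WordNet's lexicographer file for the chosen sense. A quick check with NLTK (assuming the WordNet data is installed; the first listed synset is used here only for illustration, not as the output of an actual WSD step):

from nltk.corpus import wordnet as wn   # assumes nltk and its WordNet corpus are installed

senses = wn.synsets("walk", pos=wn.VERB)
print(senses[0].name(), senses[0].lexname())   # e.g. walk.v.01 verb.motion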

24 Named Entity Recognition: NER is used to get the correct superclass of named entities. Example: "John loves Mia" (John and Mia are named entities, hence the superclass person).
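The same idea with spaCy's named-entity recognizer as a stand-in (the slide does not name the NER tool actually used):

import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("John loves Mia")
print([(ent.text, ent.label_) for ent in doc.ents])
# expected: something like [('John', 'PERSON'), ('Mia', 'PERSON')] -> superclass "person"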

25 Semantic Roles of Entities: the agents and recipients of verbs are labeled with their semantic roles, extracted from the PropBank frame sets.
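As an illustration, the PropBank roleset love.01 defines roughly the core roles below; the dictionary is a hand-written approximation of the frame file, and the helper label_roles is a hypothetical name for the labeling step:

LOVE_01 = {                      # hand-copied approximation of PropBank roleset love.01
    "ARG0": "lover",
    "ARG1": "loved (thing or person)",
}

def label_roles(verb_node, agent_node, recipient_node, roleset=LOVE_01):
    """Attach PropBank-style role labels to the agent and recipient of a verb node."""
    return [(verb_node, "ARG0 (" + roleset["ARG0"] + ")", agent_node),
            (verb_node, "ARG1 (" + roleset["ARG1"] + ")", recipient_node)]

print(label_roles("loves-2", "John-1", "Mia-3"))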

26 Representing Quantifiers: quantifiers such as "every" and "a" are represented explicitly. Examples: "Every boxer walks" and "A boxer walks".
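The standard first-order readings of the two examples, shown here as plain logic (which may differ from K-Parser's internal graph encoding of quantifiers):

\text{Every boxer walks:}\quad \forall x\,(\mathit{boxer}(x) \rightarrow \mathit{walks}(x))
\text{A boxer walks:}\quad \exists x\,(\mathit{boxer}(x) \land \mathit{walks}(x))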

27 Co-reference Resolution: co-reference resolution is available as an option. Example: "Fish ate the worm because it was tasty".

28-33 (figure-only slides; no text)

34 Automatic Background Knowledge Extractor

35 Automatic Background Knowledge Extractor: the idea is to learn the usage of English words and the contexts in which they are used. That is done by (1) creating a query using the formal representation of the given sentence and the question, and (2) extracting background knowledge sentences from a large source of raw text.

36 Automatic Background Knowledge Extractor: Creating Queries
Example: The man could not lift his son because he was so weak. Who was weak?
Query set 1 (Q1): ".*not.*lift.*because.*weak.*" and ".*not.*lift.*because.*so.*weak.*"
Steps:
- Use the semantic graphs of the given sentence and the question.
- Trace all nodes of the question into the given sentence (except "wh" nodes).
- Extract the semantically important words (except entities); also consider the connective words.
- Combine the words in their order of occurrence in the sentence and join them using the wildcard (.*), wrapping the query in quotes.
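A minimal sketch of the query-construction step as described on this slide (joining the selected words, in sentence order, with the .* wildcard); the function name build_query is a placeholder:

import re

def build_query(words):
    """Join the selected words, in sentence order, with the '.*' wildcard."""
    return ".*" + ".*".join(words) + ".*"

query = build_query(["not", "lift", "because", "weak"])
print(query)   # .*not.*lift.*because.*weak.*
assert re.match(query, "She could not lift it off the floor because she is a weak girl")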

37 Automatic Background Knowledge Extractor: Extracting Background Knowledge Sentences
- Use a large source of raw text.
- Use a search engine.

38 Extracting Background Knowledge Sentences (continued)
- There are two ways in which sentences are extracted from the WWW.
- Example query: ".*not.*lift.*because.*weak.*"

39 Extracting Background Knowledge Sentences: Filtering
The extracted sentences are filtered; a sentence is kept only if it
- does not contain the original sentence,
- contains all the words in the query (in any form), and
- is not a partial sentence.
Original sentence: The man could not lift his son because he was so weak. Query: ".*not.*lift.*because.*weak.*"
Filtered sentences:
- She could not lift it off the floor because she is a weak girl
- She could not even lift her head because she was so weak
- I could not even lift my leg to turn over because the muscles were weak after surgery
- ...
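A rough sketch of these three filters; the exact matching rules (for example, what counts as "in any form" or as a partial sentence) are assumptions here, not the authors' implementation:

def keep_sentence(candidate, query_words, original_sentence):
    """Return True if a retrieved sentence passes the three filters above."""
    c = candidate.lower().strip()
    if original_sentence.lower().strip() in c:        # must not contain the original sentence
        return False
    if not all(w.lower() in c for w in query_words):  # must contain all query words (crude check)
        return False
    return bool(c) and c[-1] in ".!?"                 # crude stand-in for "no partial sentences"

print(keep_sentence("She could not even lift her head because she was so weak.",
                    ["not", "lift", "because", "weak"],
                    "The man could not lift his son because he was so weak."))   # True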

40 Parsing the Background Sentences
Example background sentence: She could not lift it off the floor because she is a weak girl.

41 Logical Reasoning Engine

42 Logical Reasoning Engine (diagram): the given sentence, the pronoun to be resolved, and the background knowledge sentences are fed to the logical reasoning engine (ASP rules), which produces the answer.

43 Logical Reasoning Engine
- Uses Answer Set Programming (ASP).
- The semantic representations of the given sentence and the background sentences are represented as ASP predicates.
- ASP reasoning rules are then applied to produce the answer.
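A sketch of the encoding step, under the assumption that each semantic-graph triple becomes one ASP fact; the predicate name has_s, the argument layout, and the relation names are hypothetical rather than the authors' actual encoding, and the reasoning rules themselves are not reproduced here:

def to_asp_facts(triples, source):
    """Turn (node, relation, node) triples into ASP facts, tagged with the graph
    they come from ('given' sentence vs. a 'background' sentence)."""
    return ['has_s({},"{}","{}","{}").'.format(source, a, r, b) for a, r, b in triples]

given = [("lift-4", "agent", "man-2"), ("weak-10", "trait_of", "he-8")]
for fact in to_asp_facts(given, "given"):
    print(fact)
# has_s(given,"lift-4","agent","man-2").
# has_s(given,"weak-10","trait_of","he-8").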

44 System Evaluation & Error Analysis

45 System Evaluation
- There are 282 sentences in total in the WSC; the causal category has more than 200 of them.
- The causal sub-categories, Type 1 and Type 2, together have 100 sentences; these were evaluated.
- Results:
  Total number of sentences evaluated: 100
  Answered: 80
  Background knowledge not found: 20
  Answered correctly: 70
  Answered incorrectly: 10
  Percentage correct (among attempted): 87.5 (70 of 80)

46 Error Analysis
- 20 out of 100 sentences were not answered because suitable background knowledge was not found.
- Example: Mark ceded the presidency to John because he was less popular.

47 Error Analysis (continued)
- 10 out of the 80 attempted sentences were answered incorrectly; a deeper analysis of the background knowledge is required.
- Winograd sentence: Bob paid for Charlie’s college education, he is very grateful.
- Background sentence: I paid the price for my stupidity. How grateful I am.

48 Conclusion and Future Directions

49 Conclusion
- For greater use of knowledge representation and reasoning, we need ways to obtain knowledge automatically; hand-crafted knowledge is not scalable, though it may be needed for some parts.
- For explainable reasoning with natural language, it is important to translate natural language into some formal language.
- The Winograd Schema Challenge has been proposed as a way to test the reasoning ability of systems (as opposed to black-box learning).
- We are pursuing two approaches toward these goals: the NL2KR platform and Kparser.

50 THANK YOU!

