
Slide 1: Automatic Discovery of Intentions in Text and its Application to Question Answering (ACL 2005 Student Research Workshop)

Slide 2: Abstract
- Idea: semantic relations between text concepts denote the core elements of lexical semantics.
- Model: automatic detection of the INTENTION semantic relation.
- Method: syntactic and semantic features feeding an SVM learning classifier.
- Results: 90.41% accuracy.
- Application: Question Answering.

Slide 3: Introduction (1/2)
- Intentions express a human's goal-oriented private states of mind, including intents, objectives, aims, and purposes (which might not be explicitly stated).
- Examples:
  - "Mary is going to buy a TV set."
  - Customer: "Where do you have the $1 cups?" Salesman: "How many do you want?"
- Problem: current state-of-the-art NLP systems cannot extract intentions from open text.

Slide 4: Introduction (2/2)
- Intentions: the expression of a particular action that shall take place in the future, in which the speaker is some sort of agent.
- Method: identify intentions in domain-independent texts; employ machine learning algorithms to create models that locate intentions in a given paragraph, using a set of six syntactic and semantic features.

Slide 5: Previous Work
- Purely probabilistic models, empirical methods, hand-coded constraints, etc.
- Decision trees, neural networks, memory-based learning, support vector machines, etc.
- Wiebe et al. (2004) focused on the detection of subjective language, such as opinions, evaluations, or emotions, in text.
- Numerous philosophical studies discuss how intentions relate to other psychological concepts, such as beliefs, desires, hopes, or expectations.

Slide 6: Syntax and Semantics of Intention (1/2)
- Syntactic patterns: in the SemCor text collection, the first 2,700 sentences were manually classified into those that do and do not contain intentions; 46 examples of intentions were identified.
- Focus: detecting intentions encoded by the pattern VB1 to VB2.
  - "John intends to meet Mary today." → intention (O)
  - "Mary began to play with the dog." → no intention (X)
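To make the VB1-to-VB2 pattern concrete, here is a minimal sketch of candidate extraction (not the authors' code): it tags a sentence with NLTK and collects every verb + to + verb sequence, leaving it to the classifier described later to decide which candidates really encode intentions. NLTK and its default tagger are assumptions here.

```python
# Minimal sketch, assuming NLTK with the 'punkt' and
# 'averaged_perceptron_tagger' data packages installed.
import nltk

def find_vb_to_vb(sentence):
    """Collect (VB1, VB2) pairs for every 'verb + to + verb' sequence."""
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
    pairs = []
    for (w1, t1), (w2, t2), (w3, t3) in zip(tagged, tagged[1:], tagged[2:]):
        # VB* covers VB, VBD, VBZ, ...; infinitival 'to' is tagged TO.
        if t1.startswith("VB") and t2 == "TO" and t3.startswith("VB"):
            pairs.append((w1, w3))
    return pairs

print(find_vb_to_vb("John intends to meet Mary today."))  # [('intends', 'meet')]
print(find_vb_to_vb("Mary began to play with the dog."))  # [('began', 'play')] -- candidate only
```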

Slide 7: Syntax and Semantics of Intention (2/2)
- Semantics of intentions: the intention semantic relation is represented as INT(e1, x1, e2), where
  - e1: the event denoting the intention
  - x1: the person that has the intention
  - e2: the intended action or event
- In the full representation, INT appears alongside other semantic relations such as THEME, LOCATION, MANNER, SOURCE, etc.
- Example: "John intends to meet Mary today." (the full representation appeared as a figure on the slide)
- The representation allows the derivation of inference rules.
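As a rough sketch of how such a triple might be held in code (the type and field names are illustrative, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass
class IntentionRelation:
    e1: str  # the event denoting the intention
    x1: str  # the person that has the intention
    e2: str  # the intended action or event

rel = IntentionRelation(e1="intend", x1="John", e2="meet")
print(f"INT({rel.e1}, {rel.x1}, {rel.e2})")  # INT(intend, John, meet)
```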

Slide 8: Learning Model (1/4)
- Experimental data: the first 10,000 sentences of SemCor 2.0 → 1,873 sentences extracted → 115 intentions (manually identified).
- 5-fold cross-validation over 115 positive and 258 negative examples: each fold uses 80% of the data for training and 20% for testing.
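A hedged sketch of that split, using scikit-learn for illustration (the paper used the LIBSVM package directly, and the feature values here are dummies):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

X = np.zeros((115 + 258, 6))         # 373 examples, 6 features (dummy values)
y = np.array([1] * 115 + [0] * 258)  # 1 = intention, 0 = no intention

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(skf.split(X, y)):
    # roughly 80% of the examples train each fold, 20% test it
    print(f"fold {fold}: {len(train_idx)} train / {len(test_idx)} test")
```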

Slide 9: Learning Model (2/4)
Six features for the intention pattern VB1 to VB2, illustrated on "Mary intends to revise the paper.":
1. The semantic class of the VB1 verb's agent, or specializations of it.
   - An in-house semantic parser retrieves the AGENT of the VB1 verb.
   - Value: a WordNet semantic class.
   - Specialization: the semantic class s of a concept w is replaced with its immediate hyponym h that subsumes w.
   - Mary names a person → entity#1.
2. The semantic class of the VB1 verb, or its specializations: intend#1 → wish#3.
A WordNet-based sketch of the specialization step follows.
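A minimal sketch of specialization, assuming NLTK's WordNet interface (the authors' semantic parser is not public, so the synsets below are illustrative):

```python
# Requires the NLTK 'wordnet' data package.
from nltk.corpus import wordnet as wn

def specialize(concept, cls):
    """Replace class `cls` with its immediate hyponym that still subsumes `concept`."""
    ancestors = set(concept.closure(lambda s: s.hypernyms()))
    for h in cls.hyponyms():
        if h == concept or h in ancestors:
            return h
    return cls  # nothing below `cls` subsumes the concept; keep it

mary = wn.synset("person.n.01")   # "Mary names a person"
cls = wn.synset("entity.n.01")    # the coarse class entity#1
print(specialize(mary, cls))      # Synset('physical_entity.n.01'), one level down
```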

Slide 10: Learning Model (3/4)
Six features for the intention pattern VB1 to VB2, illustrated on "Mary intends to revise the paper.":
3. The semantic class of the VB2 verb's agent: Mary names a person → entity#1.
4. The semantic class of the VB2 verb, or its specializations: revise#1 → act#1.
5. A flag indicating whether the VB1 verb has an affirmative or a negative form.
   - "John wants to go for a walk." → intention (O)
   - "John doesn't want to go for a walk." → no intention (X)

Slide 11: Learning Model (4/4)
Six features for the intention pattern VB1 to VB2, illustrated on "Mary intends to revise the paper.":
6. The type of the analyzed sentence (primarily concerned with questions).
   - "Where do you plan to go for a walk?"
   - "Do you plan to go for a walk?"
   - Values: the wh-word that begins a question, or n/a for other sentence types.
Why no feature for the negative form of the VB2 verb? Because it does not affect the objective attribute of the intention:
   - "John intends not to go for a walk." → intention (O)
   - "John doesn't intend to go for a walk." → no intention (X)
A sketch assembling all six feature values for the running example follows.
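Putting the six features together for the running example (values hand-filled from the slides; a real system would obtain them from the semantic parser and WordNet, and the key names are invented for this sketch):

```python
# Illustrative feature vector for "Mary intends to revise the paper."
example = {
    "vb1_agent_class": "entity#1",  # 1: class of VB1's agent (Mary names a person)
    "vb1_class":       "wish#3",    # 2: specialization of intend#1
    "vb2_agent_class": "entity#1",  # 3: class of VB2's agent
    "vb2_class":       "act#1",     # 4: specialization of revise#1
    "vb1_affirmative": True,        # 5: VB1 is not negated
    "sentence_type":   "n/a",       # 6: wh-word for questions, n/a otherwise
}
label = 1  # 1 = the sentence encodes an intention
```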

Slide 12: Experimental Results (1/3)
Five experiments:
1. Impact of specialization
2. Learning curves
3. Feature impact on the SVM models
4. Impact of word sense disambiguation
5. C5 results

Slide 13: Experimental Results (2/3)
1. Impact of specialization:
   - Uses the LIBSVM package and the WordNet semantic classes.
   - Specializing the semantic class of VB2's agent leaves accuracy constant, so this specialization does not influence performance; the 3rd feature's value is therefore fixed at the 2nd specialization level.
2. Learning curves:
   - How many training examples are needed to reach 90.41% accuracy?
   - All three models exhibit similar behavior with respect to changes in the training-set size.
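A hedged sketch of the learning-curve experiment, using scikit-learn's SVC in place of raw LIBSVM (the kernel and parameters are assumptions, and random features stand in for the real ones):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import learning_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(373, 6))        # dummy stand-ins for the 6 real features
y = np.array([1] * 115 + [0] * 258)

sizes, _, test_scores = learning_curve(
    SVC(kernel="rbf"), X, y, cv=5, train_sizes=np.linspace(0.2, 1.0, 5))
for n, score in zip(sizes, test_scores.mean(axis=1)):
    print(f"{n:3d} training examples -> {score:.2%} mean CV accuracy")
```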

Slide 14: Experimental Results (3/3)
3. Feature impact on the SVM models:
   - Experiments that use only 5 out of the 6 features.
   - The most influential attribute: the VB1 verb's semantic class or its specializations.
   - The syntactic features bring an average increase in accuracy of 3.50%.
4. Impact of word sense disambiguation.
5. C5 results: because of the relatively small number of training instances, C5 ignores some of the features and makes wrong decisions.
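The leave-one-feature-out ablation can be sketched like this (again with dummy data; scikit-learn is an assumption):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(373, 6))
y = np.array([1] * 115 + [0] * 258)

full = cross_val_score(SVC(), X, y, cv=5).mean()
for f in range(6):
    reduced = np.delete(X, f, axis=1)  # drop feature f and retrain
    acc = cross_val_score(SVC(), reduced, y, cv=5).mean()
    print(f"without feature {f + 1}: {acc:.2%} (all 6 features: {full:.2%})")
```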

Slide 15: Application to QA
- Questions involving intentions cannot be answered by keyword-based or simple surface-level matching techniques alone.
- Examples: see the backup slide "Applications to QA".
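A toy illustration of the point: once INT(e1, x1, e2) triples have been extracted, an intention question becomes a relation lookup rather than a keyword match (the triple store and question form below are invented for this sketch):

```python
# Extracted intention triples: (e1, x1, e2)
relations = [("intend", "John", "meet Mary today")]

def what_does_x_intend(person):
    """Answer 'What does <person> intend to do?' by relation lookup."""
    return [e2 for (_e1, x1, e2) in relations if x1 == person]

print(what_does_x_intend("John"))  # ['meet Mary today']
```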

Slide 16: Conclusions
- A method to detect the INTENT relation encoded by the sentence-level pattern VB1 to VB2 with 90.41% accuracy.
- Future work: investigate the other INTENTION patterns as well as other semantic relations such as MOTIVE, IMPLICATION, or MEANING; incorporate the INTENTION detection module into a QA system and show its impact.

Slide 17: ThanQ

Slide 18 (backup): Syntactic patterns, which cover 95.65% of the 46 examples.

Slide 19 (backup): Semantics: implications
1. John intends to start his car to go to the park. ⇒ John intends to go to the park.
2. John intends to buy a car. ⇒ John intends to pay for it.
4. John intends to go to the park. He's starting his car right now. ⇒ John intends to start his car.
5. John intends to swim in the pool. He knows he's going to catch a cold in the cold water. ⇏ John intends to catch a cold (a foreseen side effect is not intended).

Slide 20 (backup): 1. Impact of specialization

Slide 21 (backup): 2. Learning curves

Slide 22 (backup): 3. Feature impact on the SVM models

Slide 23 (backup): 4. Impact of word sense disambiguation; 5. C5 results

Slide 24 (backup): Applications to QA

