Presentation on theme: "Topic: Semantic Text Mining" — Presentation transcript:

1 Topic: Semantic Text Mining
Bin Li

2 Outline
Paper 1: Table Cell Search for Question Answering
Background
Table cell search framework
Evaluation

3 Background
More complex questions
Unstructured knowledge resources
Limits of querying traditional knowledge bases

4 Background
More complex questions
Unstructured knowledge resources
Limits of querying traditional knowledge bases
Goal: precisely retrieve table cells from the web to answer a user question
Measure

5 Table cell search framework
Formulate the question chain and relation chains
Find the best-matching answer (using deep neural networks)
Extract the corresponding answer (a minimal glue-code sketch of these steps follows below)
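A minimal glue-code sketch of how the three steps could be wired together; the callables and the "ending_cell" key are illustrative assumptions, not the paper's actual interfaces.

```python
# Minimal glue code for the three framework steps. The callables passed in
# (build_chain, generate_candidates, score_chain) stand in for the components
# described on the following slides; the "ending_cell" key is assumed here.

def answer_question(question, tables, build_chain, generate_candidates, score_chain):
    """Return the ending cell of the best-scoring candidate chain, or None."""
    question_chain = build_chain(question)                      # step 1: question chain
    candidates = [chain
                  for table in tables
                  for chain in generate_candidates(question_chain, table)]
    if not candidates:
        return None
    best = max(candidates, key=lambda c: score_chain(question_chain, c))  # step 2: neural scoring
    return best["ending_cell"]                                  # step 3: extract the answer
```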

6 what languages do people in france speak?
Natural language question

7 Question chain
Natural language question: what languages do people in france speak?
The question chain is formed from a topic entity and a question pattern.

9 Question chain
Topic entity: france
Question pattern: what languages do people in <e> speak?
The topic entity is replaced with the placeholder <e> to form the question pattern.
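A toy illustration of forming the question chain by detecting the topic entity and replacing it with <e>; the hard-coded entity set stands in for whatever entity linker the paper uses.

```python
# Toy construction of a question chain: detect the topic entity and replace it
# with the <e> placeholder to obtain the question pattern. KNOWN_ENTITIES is a
# stand-in for a real entity linker.

KNOWN_ENTITIES = {"france", "germany", "china"}   # illustrative only

def build_question_chain(question):
    tokens = question.lower().rstrip("?").split()
    topic_entity = next((t for t in tokens if t in KNOWN_ENTITIES), None)
    pattern = " ".join("<e>" if t == topic_entity else t for t in tokens) + "?"
    return {"topic_entity": topic_entity, "question_pattern": pattern}

print(build_question_chain("what languages do people in france speak?"))
# -> {'topic_entity': 'france', 'question_pattern': 'what languages do people in <e> speak?'}
```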

10 Question chain and relevant table
Question chain: France -> what languages do people in <e> speak? -> ?X
A relevant table is retrieved for the question.

11 Question chain and relevant table
Question chain: France -> what languages do people in <e> speak? -> ?X
Each row of the relevant table is represented as a row graph.

12 Relation chain
Question chain: France -> what languages do people in <e> speak? -> ?X
In the row graph, a relation chain connects a topic cell to an ending cell via column names.

13 Relation chain
Question chain: France -> what languages do people in <e> speak? -> ?X
In the row graph, a relation chain connects a topic cell to an ending cell via column names.
Matching against the question chain implies inward and outward relations.

14 Candidate chains
Question chain: France -> what languages do people in <e> speak? -> ?X
Matching the question chain against the row graph (column names, topic cells, ending cells) yields candidate relation chains, implying inward and outward relations.

15 Get a large set of candidate chains via string matching (a toy sketch follows below).
Evaluate the relevance of each candidate chain to the input question to obtain more accurate candidates.
Explore the information carried by the candidate chains.
Use deep neural networks to evaluate the matching degree.
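A toy sketch of the string-matching step on a single table row; the row/column layout and the overlap heuristic are illustrative assumptions, and real candidate generation would run over many web tables.

```python
# Toy candidate-chain generation over one table row: every ordered pair of
# cells (topic cell -> column name -> ending cell) is kept if its topic cell
# matches the question's topic entity or its column name shares a word with
# the question pattern. Row format and overlap test are illustrative.

def generate_candidate_chains(question_chain, columns, row):
    question_words = set(question_chain["question_pattern"].lower().split())
    topic = (question_chain["topic_entity"] or "").lower()
    chains = []
    for i, cell_i in enumerate(row):
        for j, cell_j in enumerate(row):
            if i == j:
                continue
            if topic in cell_i.lower() or set(columns[j].lower().split()) & question_words:
                chains.append({"topic_cell": cell_i, "column": columns[j], "ending_cell": cell_j})
    return chains

columns = ["Country", "Official languages", "Capital"]
row = ["France", "French", "Paris"]
qc = {"topic_entity": "France", "question_pattern": "what languages do people in <e> speak?"}
for chain in generate_candidate_chains(qc, columns, row):
    print(chain)   # a deliberately over-generated set; the neural matcher prunes it later
```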

16 Chain inference: semantic representation
Input: what languages do people in <e> speak?
Letter trigrams per word, e.g. {#-s-p, s-p-e, p-e-a, e-a-k, a-k-#} for "speak", are concatenated.
A convolution layer produces local contextual feature vectors.
The most salient local features are extracted into a fixed-length global feature vector.
A non-linear feed-forward neural network yields the semantic representation.

17 Chain inference: semantic representation
As inputs, the same network encodes: the question pattern (word sequence), the answer type, the pseudo-predicate, and entity pairs.
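The letter-trigram step can be reproduced directly from the slide's example for the word "speak"; mapping trigrams to vocabulary indices (as input to the convolution) is an illustrative assumption, and the convolution and feed-forward layers themselves are not sketched here.

```python
# Letter-trigram extraction as shown on the slide for the word "speak":
# the word is padded with '#' boundary markers and split into 3-character
# windows. Indexing trigrams into a vocabulary (for the convolution layer
# that follows) is an illustrative assumption.

def letter_trigrams(word):
    padded = "#" + word + "#"
    return [padded[i:i + 3] for i in range(len(padded) - 2)]

print(letter_trigrams("speak"))
# ['#sp', 'spe', 'pea', 'eak', 'ak#']  i.e. {#-s-p, s-p-e, p-e-a, e-a-k, a-k-#}

def trigram_indices(word, vocab):
    """Map a word's trigrams to indices in a (hypothetical) trigram vocabulary."""
    return [vocab.setdefault(tri, len(vocab)) for tri in letter_trigrams(word)]

vocab = {}
print([trigram_indices(w, vocab) for w in "what languages do people in <e> speak".split()])
```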

18 Features
Shallow features: word-level matching degree between the question and the candidate chain.
Deep features: answer type, pseudo-predicate, entity pairs; calculate the cosine similarity between the question and the candidate chain in the semantic space.
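The deep feature boils down to a cosine similarity between the two semantic vectors; a minimal numpy version, with random placeholders standing in for the network outputs:

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between a question vector and a candidate-chain vector."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Random placeholders standing in for the semantic representations produced
# by the network sketched above.
rng = np.random.default_rng(0)
question_vec, chain_vec = rng.standard_normal(128), rng.standard_normal(128)
print(cosine_similarity(question_vec, chain_vec))
```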

19 Evaluation

20 Evaluation measures
Precision (P)
Recall (R)
F-measure (harmonic mean of P and R)
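The formula itself is left to the slide's figure; assuming the standard balanced F-measure, the harmonic mean of P and R is:

```latex
F = \frac{2 \cdot P \cdot R}{P + R}
```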

21 Paper 2: Dynamic Collective Entity Representations for Entity Ranking
Outline
Background
Dynamic collective entity representations (DCER)
Evaluation

22 Background
Mismatch between manually issued queries and an entity's description in the knowledge base
Context dependency
Time dependency

23 Dynamic collective entity representation
Collective intelligence (e.g. user queries) is combined with the knowledge base entity description.
Fielded documents represent entities and continuously incorporate new descriptions.
The ranking model is continuously updated: retraining the model adjusts the weights associated with the entity's fields.

24 Dynamic collective entity representations (DCER)
Problem: given a query q and a knowledge base KB consisting of entities e, find the entity e that best matches q.
Approach:
Expand entity representations (fielded documents) to reduce the vocabulary gap between queries and entities.
Train a classification-based ranker to combine content from each field (a sketch of the fielded representation follows below).
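A minimal sketch of a dynamically expanded fielded entity document; the field names are examples consistent with the description sources on the next slide, not a prescribed schema.

```python
# Fielded entity representation: the knowledge-base description is one field,
# and collective signals (anchors, tweets, queries, ...) are appended to their
# own fields as they arrive. Field names are illustrative.

entity = {
    "id": "France",
    "fields": {
        "kb_description": ["France is a country in Western Europe."],
        "anchors": [],
        "tweets": [],
        "queries": [],
    },
}

def add_description(entity, field, text):
    """Expand the entity representation with a new description for one field."""
    entity["fields"].setdefault(field, []).append(text)

add_description(entity, "queries", "what languages do people in france speak")
add_description(entity, "tweets", "Visiting France next week!")
print({field: len(texts) for field, texts in entity["fields"].items()})
```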

25 Description sources
Knowledge base
External description sources (static: web archives, web anchors, ...; dynamic: tweets, query logs, ...)
Adaptive entity ranking
A supervised entity ranker learns to weight the fields, but constructing DCER raises two challenges:
- Heterogeneity (volume, quality, quantity, type, ...) within and between entities
- Dynamicness: a static ranker cannot capture the evolving, continually changing representations, hence "adaptive entity ranking" (continuously updated)

26 Model
Entity representation: each entity e is a fielded document.
Field term vector: represents the content of e in one field.
Fields are updated as new descriptions arrive.
The relevance of e to a query q is estimated with a supervised single-field weighting model (a minimal scoring sketch follows below).
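A minimal scoring sketch under the assumption of a linear, field-weighted term-overlap model; the paper's actual similarity function and learned weights are not reproduced here.

```python
# Illustrative field-weighted relevance: each field gets a per-field term
# overlap with the query, and the field weights combine these into one score.
# Term overlap stands in for whatever single-field retrieval model is used.

def field_similarity(query, texts):
    """Fraction of query terms that occur in the field's concatenated text."""
    terms = query.lower().split()
    field_text = " ".join(texts).lower()
    return sum(term in field_text for term in terms) / max(len(terms), 1)

def score(query, entity, field_weights):
    return sum(weight * field_similarity(query, entity["fields"].get(field, []))
               for field, weight in field_weights.items())

entity = {"fields": {
    "kb_description": ["France is a country in Western Europe."],
    "queries": ["what languages do people in france speak"],
}}
weights = {"kb_description": 1.0, "queries": 0.5}   # illustrative, normally learned
print(score("france languages", entity, weights))   # 1.0*0.5 + 0.5*1.0 = 1.0
```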

27 Three feature groups express the importance of fields and entities
Field similarity: query-field similarity score
Field importance: status of the field at a point in time
Entity importance: favors recently updated entities
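One way to assemble the three feature groups into a single vector for the ranker; the recency-based proxies for field and entity importance are illustrative assumptions.

```python
# Illustrative per (query, entity) feature vector combining the three groups:
# field similarity, field importance (state of the field at this point in
# time), and entity importance (recently updated entities). Recency in days
# is used as a stand-in importance signal.

def recency(last_update_day, today):
    """Decays from 1.0 (updated today) towards 0; purely illustrative."""
    return 1.0 / (1.0 + max(today - last_update_day, 0))

def feature_vector(query, entity, today, similarity):
    features = []
    for field, texts in sorted(entity["fields"].items()):
        features.append(similarity(query, texts))                        # field similarity
        features.append(recency(entity["field_updated"][field], today))  # field importance
    features.append(recency(entity["entity_updated"], today))            # entity importance
    return features

entity = {
    "fields": {"kb_description": ["france western europe"], "queries": ["france languages"]},
    "field_updated": {"kb_description": 0, "queries": 9},
    "entity_updated": 9,
}
overlap = lambda q, texts: float(any(q.lower() in text for text in texts))
print(feature_vector("france languages", entity, today=10, similarity=overlap))
```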

28 Ranker
A supervised (machine learning) ranker is employed to learn the optimal feature weights for retrieval.
Weight vector: the weights of each of the field features and the entity importance feature.
Pipeline: a query yields top-K candidate entities; user interactions (i.e., clicks) provide labels; feature vectors (x) are fed to the ranker; the classifier's confidence score is used for ranking (a training sketch follows below).
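A minimal training sketch assuming a logistic-regression classifier over click labels; the slide only says "machine learning", so the choice of learner is an assumption.

```python
# Train a classifier on (feature vector, clicked-or-not) pairs collected from
# user interactions, then rank a new query's top-K candidate entities by the
# classifier's confidence score. Logistic regression is an assumption; the
# slide only specifies a supervised machine-learning ranker.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: each row is the feature vector x of one (query, entity)
# pair; y records whether the user clicked that entity.
X = np.array([[0.9, 0.8, 0.7], [0.2, 0.1, 0.9], [0.8, 0.9, 0.2], [0.1, 0.2, 0.1]])
y = np.array([1, 0, 1, 0])
ranker = LogisticRegression().fit(X, y)

# Rank candidate entities of a new query by classification confidence.
candidates = {"France": [0.85, 0.70, 0.60], "Paris": [0.30, 0.20, 0.40]}
confidence = ranker.predict_proba(np.array(list(candidates.values())))[:, 1]
for name, conf in sorted(zip(candidates, confidence), key=lambda item: -item[1]):
    print(name, round(float(conf), 3))
```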

29 Evaluation
Q1: Does entity ranking effectiveness increase using DCER? Compare the baseline and DCER.
Q2: Does entity ranking effectiveness increase when employing field and entity features? Compare the baseline, KBER (incorporating the entity and field importance features), and DCER.
Q3: Does entity ranking effectiveness increase when we continuously learn the optimal entity representation? Compare DCER and its non-adaptive variant.

30 Thank you for your attention!

