1 Special Topics in Computer Science: Advanced Topics in Information Retrieval
Lecture 11: Natural Language Processing and IR. Semantics and Semantically-rich representations
Alexander Gelbukh

2 Previous Lecture: Conclusions
- Syntactic structure is one of the intermediate representations of a text for its processing
- Helps text understanding, thus reasoning, question answering, ...
- Directly helps POS tagging: resolves lexical ambiguity of part of speech, but not WSD-type ambiguities
- A big science in itself, with 50 (2000?) years of history

3 Previous Lecture: Research topics
- Faster algorithms, e.g. parallel
- Handling linguistic phenomena not handled by current approaches
- Ambiguity resolution!
- Statistical methods: a lot can be done

4 Contents
- Semantic representations
- Semantic networks
- Conceptual graphs
- Simpler representations: Head-Modifier pairs
- Tasks beyond IR: Question Answering, Summarization, Information Extraction
- Cross-language IR

5 Syntactic representation: a sequence of syntactic trees.

6 Semantic analysis

7 Semantic representation: complex structure of the whole text

8 Semantic representation
- Expresses the (direct) meaning of the text, not what is implied
- Free of the means of communication:
  - morphological cases (transformed to semantic links)
  - word order, passive/active
  - sentences and paragraphs
  - pronouns (resolved)
- Free of the means of expression:
  - synonyms (reduced to a common ID)
  - lexical functions

9 Lexical Functions
- The same meaning expressed by different words; the choice of the word is a function of the other words
- Few standard meanings
- Example: Magn = "much, very":
  - strong wind, tea, desire
  - thick soup
  - high temperature, potential, sea; highly expensive
  - hard work; hardcore porno
  - deep understanding, knowledge, appreciation
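
The idea of Magn as a function from a keyword to the word that expresses "much, very" for it can be sketched as a plain lookup table. The Python snippet below is a toy illustration covering only the slide's examples; the mini-lexicon and function names are invented, not the lecture's actual dictionary resource.

```python
# Toy sketch of the lexical function Magn ("much, very") as a lookup table.
# The mini-lexicon below only covers the slide's examples; a real system
# would use a large combinatorial dictionary.
MAGN = {
    "wind": "strong", "tea": "strong", "desire": "strong",
    "soup": "thick",
    "temperature": "high", "potential": "high", "sea": "high",
    "expensive": "highly",
    "work": "hard", "porno": "hardcore",
    "understanding": "deep", "knowledge": "deep", "appreciation": "deep",
}

def magn(keyword: str) -> str:
    """Return the collocation expressing Magn(keyword)."""
    return f"{MAGN[keyword]} {keyword}"

for word in ("wind", "soup", "temperature", "work", "knowledge"):
    print(magn(word))   # strong wind, thick soup, high temperature, ...
```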

10 ...Lexical Functions
- give: pay attention, provide help, adjudge a prize, yield the word, confer a degree, deliver a lecture
- get: attract attention, obtain help, receive a degree, attend a lecture

11 ...Syntagmatic lexical functions
- In the semantic representation, the specific words are transformed to the function name:
  - Magn wind, tea, desire
  - Magn soup
  - Magn temperature, potential, sea; Magn expensive
  - Magn work; Magn porno
  - Magn understanding, knowledge, appreciation
- In different languages, different words are used (Russian: dense soup; Spanish: loaded tea, lend attention)... but the same function names.

12 Example: Translation

13 ...Paradigmatic lexical functions
- Used for synonymic rephrasing: need to reduce the meaning to a standard form
- Example: Syn, hyponyms, hypernyms: W -> Syn(W), e.g. complex apparatus ~ complex mechanism
- Example: Conv31, Conv24, ...: A V B C -> C Conv31(V) B A
  - John sold the book to Mary for $5
  - Mary bought the book from John for $5
  - The book cost Mary $5

14 Semantic network
- Representation of the text as a directed graph
- Nodes are situations and entities
- Edges are the participation of an entity in a situation
  - also of a situation in a situation: begin reading a book, John died yesterday
- A situation can be expressed with a noun: Professor delivered a lecture to students ~ Professor lectured to students; lecture on history, memorial to heroes
- A node can participate in many situations!
- No division into sentences

15 Situations
- Situations with different participants are different situations
  - John reads a book and Mary reads a newspaper. He asks her whether the newspaper is interesting.
  - Here there are two different situations of reading, but the same entities (John, Mary, newspaper) participating in different situations
- Tense and number are described as situations
  - John reads a book: now(reading(John, book)) & quantity(book, one)

16 Semantic valencies
- A situation can have a few participants (up to ~5)
- Their meaning is usually very general
- They are usually naturally ordered: who (agent), what (patient, object), to whom (receiver), with what (instrument, ...)
  - John sold the book to Mary for $5
- So, in the network the outgoing arcs of a node are numbered
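
One plausible way to store such a network in a program is to give every situation node outgoing arcs indexed by valency number. The sketch below only illustrates that idea; the class, the role numbering, and all names are assumptions, not the lecture's formalism.

```python
from dataclasses import dataclass

# Illustrative encoding: entity nodes are plain strings, situation nodes carry
# numbered outgoing arcs (1 = agent, 2 = patient, 3 = receiver,
# 4 = price/instrument, ...), as suggested on the slide.

@dataclass
class Situation:
    predicate: str
    arcs: dict          # valency number -> entity or situation it points to

# "John sold the book to Mary for $5"
sold = Situation("sell", {1: "John", 2: "book", 3: "Mary", 4: "$5"})

# Tense is itself a situation that takes another situation as its argument.
past = Situation("before-now", {1: sold})

print(sold.arcs[1], sold.predicate, sold.arcs[2])   # John sell book
```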

17 Semantic representation: complex structure of the whole text
[Figure: example semantic network with entity nodes SCIENCE, IMPORTANT, COUNTRY, WE, GOVERNMENT, ATTENTION and situation nodes Give, Possess, Now, Quantity, connected by numbered arcs.]

18 Reasoning and common-sense info
- One can reason on the network: if John sold a book, he does not have it
- For this, additional knowledge is needed!
- A huge amount of knowledge is needed to reason: a 9-year-old child knows some 10,000,000 simple facts
- Probably some of them can be inferred, but not (yet) automatically
- There were attempts to compile such knowledge manually
- There is hope to compile it automatically...

19 Semantic representation... and common-sense knowledge

20 Computer representation
- Logical predicates; arcs are arguments
- In AI, allows reasoning
- In IR, can allow comparison even without reasoning

21 Conceptual Graphs
- A CG is a bipartite graph
- Concept nodes represent entities, attributes, or events (actions)
- Relation nodes denote the kinds of relationships between the concept nodes
- [John] -> (agnt) -> [love] -> (ptnt) -> [Mary]
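
A minimal Python sketch of this bipartite structure, assuming the chain notation shown above; the encoding is invented for illustration and is not a standard CG library.

```python
# Concept nodes (entities, attributes, events) on one side, relation nodes on
# the other; every arc connects a concept to a relation. Illustrative only.

concept_nodes = {"John", "love", "Mary"}
relation_edges = [
    ("John", "agnt", "love"),    # [John] -> (agnt) -> [love]
    ("love", "ptnt", "Mary"),    # [love] -> (ptnt) -> [Mary]
]

def linear_form(edges):
    """Render the graph in the linear notation used on the slide."""
    return "  ".join(f"[{a}] -> ({rel}) -> [{b}]" for a, rel, b in edges)

print(linear_form(relation_edges))
# [John] -> (agnt) -> [love]  [love] -> (ptnt) -> [Mary]
```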

22 [Figure: example conceptual graphs built from text, with concept nodes such as program:{*}, Invariant:{*}, Implication:{*}, analyze, examine, diagnosis, automatic error correction, and relation nodes such as ptn, mnr, attr, of, for.]

23 Use in IR
- Restrict the search to specific situations: where John loves Mary, but not vice versa
- Or soften the comparison (approximate search): look for John loves Mary, get someone loves Mary
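
Both uses can be sketched as matching a query situation against (agent, action, patient) triples extracted from documents: the exact mode keeps all roles fixed, the approximate mode ignores who fills the agent role. The data and function names below are invented for illustration.

```python
# Sketch of the two search modes from the slide over (agent, action, patient)
# triples; toy data, invented for illustration.
docs = {
    "d1": ("John", "love", "Mary"),
    "d2": ("Mary", "love", "John"),
    "d3": ("Peter", "love", "Mary"),
}

def exact(query, triples):
    """Restrict the search: John loves Mary, but not vice versa."""
    return [d for d, t in triples.items() if t == query]

def approximate(query, triples):
    """Soften the comparison: anyone loves Mary."""
    _, action, patient = query
    return [d for d, (a, act, p) in triples.items() if (act, p) == (action, patient)]

query = ("John", "love", "Mary")
print(exact(query, docs))        # ['d1']        -- only "John loves Mary"
print(approximate(query, docs))  # ['d1', 'd3']  -- also "someone loves Mary"
```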

24 Obtaining CGs from texts: tagging, parsing, graph generation
- Text: Algebraic formulation of flow diagrams
- Tagging: Algebraic|JJ formulation|NN of|IN flow|NN diagrams|NNS
- Parsing: [[np, [n, [formulation, sg]], [adj, [algebraic]], [of, [np, [n, [diagram, pl]], [n_pos, [np, [n, [flow, sg]]]]]]]]
- Graph generation: [algebraically] -> (manr) -> [formulate] -> (ptn) -> [flow-diagram]
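
The pipeline can be mimicked on the slide's own example with a couple of hand-written rules. The snippet hard-codes the tagged sentence, a toy lemma table, and two rules, so it only illustrates the shape of the graph-generation step, not the grammar actually used by the system.

```python
# Toy tagging -> graph-generation illustration on the slide's example.
tagged = [("Algebraic", "JJ"), ("formulation", "NN"),
          ("of", "IN"), ("flow", "NN"), ("diagrams", "NNS")]

# Hand-written normalization to the labels shown on the slide.
LEMMA = {"Algebraic": "algebraically", "formulation": "formulate",
         "flow": "flow", "diagrams": "diagram"}

def generate_graph(tokens):
    """Turn two simple syntactic attachments into CG relation triples."""
    triples = []
    head = LEMMA[tokens[1][0]]                 # deverbal noun "formulation" -> "formulate"
    if tokens[0][1] == "JJ":                   # adjective modifier -> manner of the action
        triples.append((head, "manr", LEMMA[tokens[0][0]]))
    if tokens[2][0] == "of":                   # "of" complement -> patient of the action
        compound = "-".join(LEMMA[w] for w, _ in tokens[3:])
        triples.append((head, "ptn", compound))
    return triples

print(generate_graph(tagged))
# [('formulate', 'manr', 'algebraically'), ('formulate', 'ptn', 'flow-diagram')]
```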

25 Steps of comparison
- Determine the common elements (overlap) between the two graphs
  - based on CG theory: compatible common generalizations
- Measure their similarity
  - the similarity must be proportional to the size of their overlap

26 An overlap
Given two conceptual graphs G1 and G2, the set of their common generalizations O = {g1, g2, ..., gn} is an overlap if:
- all the common generalizations gi are compatible, and
- the set O is maximal.

27 An example of overlap
[Figure: G1: candidate:Gore -> (Agnt) -> criticize -> (Ptnt) -> candidate:Bush and G2: candidate:Bush -> (Agnt) -> criticize -> (Ptnt) -> candidate:Gore, with two alternative overlaps (a) and (b): one keeps the relations but generalizes the concepts, candidate -> (Agnt) -> criticize -> (Ptnt) -> candidate; the other keeps the specific common concepts candidate:Gore, criticize, candidate:Bush without the relations.]

28 Similarity measure
- Conceptual similarity: indicates the amount of information contained in the common concepts of G1 and G2. Do they mention similar concepts?
- Relational similarity: indicates how similar the contexts of the common concepts in both graphs are. Do they mention similar things about the common concepts?

29 Conceptual similarity
- Analogous to the Dice coefficient
- Considers different weights for the different kinds of concepts
- Considers the level of generalization of the common concepts (of the overlap)
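
The slides do not give the formula, so the snippet below is only a plausible weighted-Dice sketch: the weight of the shared concepts relative to the total concept weight of both graphs, with per-kind weights in the spirit of the We/Wa/Wv parameters mentioned in the experiment slides. The graphs and weights are illustrative.

```python
# Plausible weighted-Dice sketch of conceptual similarity; weights and graphs
# are illustrative, the exact published formula may differ.
WEIGHT = {"entity": 10.0, "action": 1.0, "attribute": 1.0}

def conceptual_similarity(g1, g2):
    """g1, g2: dicts mapping concept -> kind ('entity', 'action', 'attribute')."""
    common = set(g1) & set(g2)
    total = lambda g: sum(WEIGHT[kind] for kind in g.values())
    overlap = sum(WEIGHT[g1[c]] for c in common)
    return 2.0 * overlap / (total(g1) + total(g2))   # Dice-style normalization

g1 = {"Gore": "entity", "Bush": "entity", "candidate": "entity", "criticize": "action"}
g2 = {"Bush": "entity", "Gore": "entity", "candidate": "entity", "criticize": "action"}
print(conceptual_similarity(g1, g2))   # 1.0 -- the two graphs mention the same concepts
```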

30 Relational similarity
- Analogous to the Dice coefficient
- Considers just the neighbors of the common concepts
- Considers different weights for the different kinds of conceptual relations

31 Similarity measure
- Combines the conceptual and relational similarities
- Multiplicative combination: a similarity roughly proportional to each of the two components
- Relational similarity has secondary importance: even if no common relations exist, the two pieces of knowledge are still similar to some degree
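
One plausible reading of this combination, under the assumption that the conceptual similarity is scaled by a factor a + b * (relational similarity) so the result never drops to zero; the constants echo the a=0.3, b=0.7 used in the experiment slides, but this is a sketch, not necessarily the exact published formula.

```python
# Relational similarity as a Dice coefficient over the relation triples around
# the common concepts, and a multiplicative combination with the conceptual
# similarity. Illustrative sketch only.

def relational_similarity(rels1, rels2):
    if not rels1 or not rels2:
        return 0.0
    common = rels1 & rels2
    return 2.0 * len(common) / (len(rels1) + len(rels2))

def combined_similarity(sc, sr, a=0.3, b=0.7):
    """Roughly proportional to sc; sr only modulates it between a and a + b."""
    return sc * (a + b * sr)

rels1 = {("criticize", "agnt", "Gore"), ("criticize", "ptnt", "Bush")}
rels2 = {("criticize", "agnt", "Bush"), ("criticize", "ptnt", "Gore")}
sc = 1.0                                    # same concepts (see previous sketch)
sr = relational_similarity(rels1, rels2)    # 0.0 -- nothing is said in common
print(combined_similarity(sc, sr))          # 0.3 -- still similar to some degree
```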

32 Flexibility of the comparison
- Configurable by the user
- Use different concept hierarchies
- Designate the importance of the different kinds of concepts
- Manipulate the importance of the conceptual and relational similarities

33 Example of the flexibility: Gore criticizes Bush vs. Bush criticizes Gore

34 An experiment
- Used the CACM-3204 collection (computer science articles)
- Built the conceptual graphs from the document titles
- Query: Description of a fast procedure for solving a system of linear equations

35 The results
- Focus on the structural similarity, basically the part caused by the entities and attributes (a=0.3, b=0.7, We=Wa=10, Wv=1)
- One of the best matches: Description of a fast algorithm for copying list structures

36 The results (2)
- Focus on the structural similarity, basically the part caused by the entities and actions (a=0.3, b=0.7, We=Wv=10, Wa=1)
- One of the best matches: Solution of an overdetermined system of equations in the L1 norm

37 Advantages of CGs
- Well-known strategies for text comparison (the Dice coefficient) combined with new characteristics derived from the CG structure
- The similarity is a combination of two sources: the conceptual similarity and the relational similarity
- Appropriate for comparing small pieces of knowledge, where methods based on topical statistics do not work
- Two interesting characteristics: uses domain knowledge and allows a direct influence of the user
- Analyzes the similarity between two CGs from different points of view; selects the best interpretation in accordance with the user's interests

38 Simpler representations: Head-Modifier pairs
- John sold Mary an interesting book for a very low price:
  - John sold, sold Mary, sold book, sold for price
  - interesting book, low price
- A paper in CICLing-2004: restrict the semantic representation to pairs of only two words
- Shallow syntax; semantics improves this representation
- Standard form: Mary bought -> John sold, etc.
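
Head-Modifier pair extraction can be sketched from a dependency parse. The parse below is hand-coded for the slide's example (in practice it would come from a parser), and the stop list is an assumption made for the illustration.

```python
# (modifier, head) dependency arcs, hand-coded for:
# "John sold Mary an interesting book for a very low price"
arcs = [
    ("John", "sold"), ("Mary", "sold"), ("book", "sold"), ("price", "sold"),
    ("interesting", "book"), ("low", "price"), ("very", "low"),
    ("an", "book"), ("a", "price"), ("for", "price"),
]

STOP = {"an", "a", "for", "very"}   # keep only content-word pairs, as on the slide

pairs = [(head, mod) for mod, head in arcs if mod not in STOP]
print(pairs)
# [('sold', 'John'), ('sold', 'Mary'), ('sold', 'book'), ('sold', 'price'),
#  ('book', 'interesting'), ('price', 'low')]
```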

39 Tasks beyond IR: Question Answering
- User information need: an answer to a question, not a bunch of docs
- Who won the Nobel Peace Prize in 1992? (35,500 docs)

40 ...QA
- Answer: Rigoberta Menchú Tum
- Logical methods: understand the text, reason on it, construct the answer, generate the text expressing it
- Statistical methods (no or little semantics): look at what word is repeated in the docs; perhaps try to understand something around it
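
The statistical idea, "look what word is repeated in the docs", can be sketched by counting capitalized candidate strings across retrieved snippets. The snippets are invented for illustration (accents dropped so the toy regex works), and this is not a real QA system.

```python
from collections import Counter
import re

# Invented snippets standing in for retrieved documents.
snippets = [
    "The Nobel Peace Prize 1992 was awarded to Rigoberta Menchu Tum.",
    "In 1992 Rigoberta Menchu Tum received the Nobel Peace Prize.",
    "Rigoberta Menchu Tum, Guatemalan activist, Nobel Peace Prize laureate 1992.",
]
question_words = {"Nobel", "Peace", "Prize"}

# Candidate answers: capitalized word sequences that contain no question word.
candidates = Counter()
for text in snippets:
    for match in re.findall(r"(?:[A-Z][a-z]+ ?)+", text):
        phrase = match.strip()
        if not set(phrase.split()) & question_words:
            candidates[phrase] += 1

print(candidates.most_common(1))   # [('Rigoberta Menchu Tum', 3)]
```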

41 ...Better QA
- What if the info is not in a single document?
  - Who is the queen of Spain?
  - The king of Spain is Juan Carlos; the wife of Juan Carlos is Sofía; (the wife of a king is a queen)
- Logical reasoning may prove useful
- In practice, the degree of understanding is not yet enough; we are working to improve it

42 Tasks beyond IR: Passage Extraction
- If the answer is long (a story): What do you know about wars between England and France?
- Or if we cannot detect the simple answer
- Then find short pieces of the text where the answer is
- Can be done even with keywords: find passages with many keywords
- (Kang et al. 2004): choose passages with the greatest vector similarity; too short means few keywords, too long must be normalized
- Awful quality; reasoning can help
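
Keyword-based passage scoring can be sketched with a sliding window scored by keyword hits and normalized by window length, in the spirit of the slide; this is an illustration, not the method of Kang et al. (2004).

```python
# Slide a fixed-size window over the document and score each window by the
# number of distinct query keywords it contains, normalized by its length.

def best_passages(words, keywords, window=8, top=2):
    scored = []
    for start in range(max(1, len(words) - window + 1)):
        passage = words[start:start + window]
        hits = len({w.lower() for w in passage} & keywords)
        scored.append((hits / len(passage), " ".join(passage)))
    return sorted(scored, reverse=True)[:top]

doc = ("England and France fought many wars ; the Hundred Years War between "
       "England and France lasted from 1337 to 1453 .").split()
keywords = {"wars", "war", "england", "france"}

for score, passage in best_passages(doc, keywords):
    print(f"{score:.2f}  {passage}")
```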

43 Tasks beyond IR: Summarization
- And what if the answer is not in a short passage?
- Summarize: say the same (without unimportant details) but in fewer words
- Now: statistical methods; reasoning can help

44 Tasks beyond IR: Information Extraction
- Question answering on a massive basis: fill a database with the answers
- Example: what company bought what company and when? A database of three columns
- Now: (statistical) patterns; reasoning can help
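
The three-column database for "what company bought what company and when" can be sketched with one hand-written extraction pattern; the pattern and the sentences are invented for illustration, and real IE systems learn or refine many such patterns.

```python
import re

# One hand-written pattern: BUYER bought/acquired TARGET in YEAR.
PATTERN = re.compile(
    r"(?P<buyer>[A-Z]\w+) (?:bought|acquired) (?P<target>[A-Z]\w+) in (?P<when>\d{4})"
)

sentences = [
    "Oracle acquired PeopleSoft in 2005 after a long battle.",
    "In other news, Lenovo bought Motorola in 2014.",
]

# Fill the three-column table (buyer, target, when).
table = [m.groupdict() for s in sentences for m in PATTERN.finditer(s)]
for row in table:
    print(row["buyer"], row["target"], row["when"], sep=" | ")
# Oracle | PeopleSoft | 2005
# Lenovo | Motorola | 2014
```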

45 Cross-lingual IR
- Question in one language, answer in another language
- Or: question and summary of the answer in English, over a database in Chinese
- A kind of translation, but simpler; thus can be done more reliably
- A transformation into a semantic network can greatly help

46 Research topics
- Recognition of the semantic structure: converting text to conceptual graphs
- All kinds of disambiguation
- Shallow semantic representations
- Application of semantic representations to specific tasks
- Similarity measures on semantic representations
- Reasoning and IR

47 Conclusions
- Semantic representation gives the meaning
  - language-specific constructions used only in the process of communication are removed
  - a network of entities / situations and predicates
- Allows for translation and logical reasoning
- Can improve IR:
  - compare the query with the doc by meaning, not words
  - search for a specific situation
  - search for an approximate situation
  - QA, summarization, IE
  - cross-lingual IR

48 Thank you! Till June 15? 6 pm. Thesis presentation? Oral test?
