
1 Tensor Query Expansion: a cognitively motivated relevance model. Mike Symonds, Peter Bruza, Laurianne Sitbon and Ian Turner, Queensland University of Technology.

2 Introduction
We use a formal model of word meaning to simulate the cognitive processes a user engages when formulating a query, and apply this approach to query expansion in an ad hoc retrieval task. Our approach shows significant improvements in retrieval effectiveness over the state of the art for short queries and newswire TREC data sets.

3 Query Expansion (QE)
Geometric representations: Rocchio (Rocchio, 1971).
Probabilistic representations: relevance models (Lavrenko and Croft, 2001), which estimate P(w|R).
Term dependency approaches: latent concept expansion (Metzler and Croft, 2007) and the positional relevance model (Lv and Zhai, 2010).
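As background for the relevance-model family named above, the sketch below estimates P(w|R) in the classic RM1 style: each feedback document's language model is weighted by its query likelihood and the weighted models are summed. This is a minimal illustration, not the paper's implementation; the Dirichlet smoothing value, tokenisation and candidate cutoff are placeholder assumptions.

```python
from collections import Counter

def relevance_model(query, docs, mu=2000, top_k=10):
    """Toy RM1-style estimate of P(w|R) = sum_D P(w|D) * P(Q|D).

    `query` is a list of terms, `docs` a list of token lists from the
    top-ranked feedback documents; `mu` is an assumed Dirichlet prior."""
    # Collection statistics for Dirichlet smoothing
    coll = Counter(t for d in docs for t in d)
    coll_len = sum(coll.values())

    p_w_r = Counter()
    for d in docs:
        dl, tf = len(d), Counter(d)
        # Query likelihood P(Q|D) under a Dirichlet-smoothed unigram model
        p_q_d = 1.0
        for q in query:
            p_q_d *= (tf[q] + mu * coll[q] / coll_len) / (dl + mu)
        # Accumulate P(w|D) * P(Q|D) for every word in the document
        for w, c in tf.items():
            p_w_r[w] += (c / dl) * p_q_d

    total = sum(p_w_r.values()) or 1.0
    return [(w, s / total) for w, s in p_w_r.most_common(top_k)]
```

The top-weighted terms would then be added to the original query, typically interpolated with the original query model.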

4 Motivation
The user's information need is a cognitive construct, yet the use of cognitive models in query expansion has not been extensively studied. The trend in QE research is that term dependency approaches outperform term-independent ones; however, the semantic features they use have little, if any, linguistic meaning.

5 Hypothesis
Using a cognitively motivated model of word meaning within the query expansion process can significantly improve retrieval effectiveness.
Model of word meaning: the Tensor Encoding model (Symonds, 2011).
Structural linguistic theory (Ferdinand de Saussure, 1916): syntagmatic associations (hot-sun) and paradigmatic associations (quick-fast).

6 Modeling word meaning
Syntagmatic associations: measured with an efficient cosine measure.
Paradigmatic associations: measured with a probability-based measure.
[The slide shows an equation for the syntagmatic score S_syn(Q, w); it does not survive in the transcript.]
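The transcript does not preserve the Tensor Encoding model's actual formulas, so the sketch below only illustrates the general shape of the two association types: a cosine-style score for syntagmatic association (words that occur together, as in hot-sun) and a probability-overlap score for paradigmatic association (words that occur in similar contexts, as in quick-fast). The window size, helper names and exact measures are illustrative assumptions, not the model's definitions.

```python
import math
from collections import Counter, defaultdict

def cooccurrence_vectors(docs, window=2):
    """Per-word co-occurrence count vectors from tokenised documents
    (the window size is an assumption for illustration)."""
    vecs = defaultdict(Counter)
    for doc in docs:
        for i, w in enumerate(doc):
            lo, hi = max(0, i - window), min(len(doc), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vecs[w][doc[j]] += 1
    return vecs

def cosine(v1, v2):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(v1[t] * v2[t] for t in v1)
    n1 = math.sqrt(sum(x * x for x in v1.values()))
    n2 = math.sqrt(sum(x * x for x in v2.values()))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def syntagmatic(query, w, vecs):
    """Illustrative cosine-style syntagmatic score: high when the
    candidate word w tends to co-occur with the query terms."""
    return cosine(vecs[w], Counter(query))

def paradigmatic(w1, w2, vecs):
    """Illustrative probability-based paradigmatic score: overlap of the
    two words' context distributions (high for near-substitutes)."""
    v1, v2 = vecs[w1], vecs[w2]
    n1, n2 = sum(v1.values()) or 1, sum(v2.values()) or 1
    return sum(min(v1[c] / n1, v2[c] / n2) for c in v1 if c in v2)
```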

7 Tensor Query Expansion
Formally combine the syntagmatic and paradigmatic features using a Markov random field, and fit this into the relevance modeling framework by replacing P(w|R) with P_{G,Γ}(w|Q).
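The exact form of P_{G,Γ}(w|Q) is not given in the transcript. As a hedged sketch, and reusing the illustrative syntagmatic and paradigmatic helpers above, the snippet below simply mixes the two kinds of evidence with a single parameter gamma (consistent with the later slide stating that TQE has one free parameter) and normalises over a candidate vocabulary; the combination rule is an assumption, not the paper's MRF estimator.

```python
def tqe_scores(query, candidates, vecs, gamma=0.5):
    """Illustrative stand-in for P_{G,Gamma}(w|Q): one mixing parameter
    gamma trades off syntagmatic against paradigmatic evidence."""
    raw = {}
    for w in candidates:
        syn = syntagmatic(query, w, vecs)
        # Average paradigmatic association between w and the query terms
        par = sum(paradigmatic(w, q, vecs) for q in query) / len(query)
        raw[w] = gamma * syn + (1.0 - gamma) * par
    total = sum(raw.values()) or 1.0
    return {w: s / total for w, s in raw.items()}
```

In the relevance modeling framework, these pseudo-probabilities would take the place of P(w|R) when choosing and weighting expansion terms.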

8 Ad Hoc Retrieval Results
Mean average precision (MAP). [The MAP results shown on the slide are not preserved in the transcript.]
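For reference, mean average precision, the measure reported on this slide, can be computed as follows; the run and relevance-judgment structures are placeholder inputs.

```python
def average_precision(ranked_doc_ids, relevant_ids):
    """Average precision for one query: mean of the precision values at
    the ranks where relevant documents appear."""
    hits, precisions = 0, []
    for rank, doc_id in enumerate(ranked_doc_ids, start=1):
        if doc_id in relevant_ids:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant_ids) if relevant_ids else 0.0

def mean_average_precision(run, qrels):
    """MAP over queries; `run` maps query id -> ranked doc ids and
    `qrels` maps query id -> set of relevant doc ids."""
    aps = [average_precision(docs, qrels.get(qid, set())) for qid, docs in run.items()]
    return sum(aps) / len(aps) if aps else 0.0
```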

9 Ad Hoc Retrieval Results
Robustness. [Figures for the Associated Press and Wall Street Journal collections.]

10 Ad Hoc Retrieval Results
Parameter sensitivity: observe the change in MAP for different mixes of syntagmatic and paradigmatic information (i.e., different values of gamma). [Figures for the Associated Press and Wall Street Journal collections.]
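A sensitivity sweep of the kind plotted on this slide might look like the sketch below, reusing the illustrative tqe_scores and mean_average_precision functions from the earlier sketches; evaluate_with_expansion is a hypothetical callback that runs retrieval with each expanded query and returns the ranked lists.

```python
def gamma_sweep(queries, qrels, candidates, vecs, evaluate_with_expansion):
    """MAP as a function of the syntagmatic/paradigmatic mixing parameter
    gamma, mirroring the parameter-sensitivity plots on this slide."""
    results = {}
    for gamma in [i / 10 for i in range(11)]:
        expansions = {qid: tqe_scores(q, candidates, vecs, gamma)
                      for qid, q in queries.items()}
        run = evaluate_with_expansion(expansions)  # assumed: qid -> ranked doc ids
        results[gamma] = mean_average_precision(run, qrels)
    return results
```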

11 Summary of contribution
A cognitively motivated approach to performing query expansion.
Use of semantic features that have explicit linguistic meaning.
Demonstrated significant improvement in retrieval effectiveness over the unigram relevance model.

12 Future Work
Evaluate on larger data sets (TREC GOV2, ClueWeb).
Compare against the positional relevance model and LCE.
Evaluate on verbose queries, which carry more semantic information.
Questions?

13 Advantages over LCE, PRM
1. TQE has a strong link to the cognitive motivation behind a user's real information need.
2. TQE uses semantic features from a formal model of word meaning.
3. TQE has only one parameter (gamma), whereas the positional relevance model has two parameters and LCE has more than two.

