
1 Seesaw Personalized Web Search Jaime Teevan, MIT with Susan T. Dumais and Eric Horvitz, MSR

2 [Image-only slide]

3 Personalization Algorithms [Diagram: the design space of approaches, labeled standard IR, query expansion, document, query, user, server, and client]

4 Personalization Algorithms [Same diagram, now contrasting query expansion v. result re-ranking]

5 Result Re-Ranking
- Ensures privacy
- Good evaluation framework
- Can look at a rich user profile
Compared with lightweight user models:
- Collected on the server side
- Sent as query expansion

6 Seesaw Search Engine [Diagram: Seesaw's client-side user model as term counts: dog 1, cat 10, india 2, mit 4, search 93, amherst 12, vegas 1]

7 Seesaw Search Engine [Same diagram: a query is issued against the user model's term counts]

8 Seesaw Search Engine [Same diagram: candidate results represented by their snippet terms, e.g. "dog cat monkey banana food", "baby infant child boy girl", "forest hiking walking gorp", "csail mit artificial research robot", "web search retrieval ir hunt"]

9 Seesaw Search Engine [Same diagram: each result is scored against the user model (1.6, 0.2, 6.0, 0.2, 2.7, 1.3) and re-ranked into the search results page]
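
As a rough Python sketch of that pipeline (the additive toy score and all names here are assumptions, not the slides' code; the actual scoring formula follows on the next slides):

    def rerank(results, user_model, score):
        # Re-order the server's result list by a personalized score.
        # results:    list of (url, snippet_terms) pairs from the web engine
        # user_model: dict mapping term -> count of times the user has seen it
        # score:      function(snippet_terms, user_model) -> float
        return sorted(results, key=lambda r: score(r[1], user_model), reverse=True)

    def toy_score(terms, model):
        # Toy additive score: sum of user-model counts for snippet terms.
        return sum(model.get(t, 0) for t in terms)

    user_model = {"dog": 1, "cat": 10, "india": 2, "mit": 4,
                  "search": 93, "amherst": 12, "vegas": 1}

    results = [
        ("a.example", ["dog", "cat", "monkey", "banana", "food"]),
        ("b.example", ["web", "search", "retrieval", "ir", "hunt"]),
        ("c.example", ["csail", "mit", "artificial", "research", "robot"]),
    ]

    for url, terms in rerank(results, user_model, toy_score):
        print(url, toy_score(terms, user_model))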

10 Calculating a Document's Score
Based on standard tf.idf [Diagram: the example result's snippet terms "web search retrieval ir hunt" and its score of 1.3]

11 Calculating a Document's Score
Based on standard tf.idf, with the user as relevance feedback:

    w_i = log [ (r_i + 0.5)(N - n_i - R + r_i + 0.5) / ((n_i - r_i + 0.5)(R - r_i + 0.5)) ]

- User as relevance feedback
- Stuff I've Seen index
- More is better
[Diagram: per-term weights for the example result (0.1, 0.5, 0.05, 0.35, 0.3) summing to its score of 1.3]
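
A minimal Python sketch of this weight, assuming N and n_i come from corpus statistics and R and r_i from the user's index (variable names follow the slide; this is not the authors' code):

    import math

    def relevance_weight(N, n_i, R, r_i):
        # Relevance-feedback term weight from the slide:
        #   w_i = log[(r_i + 0.5)(N - n_i - R + r_i + 0.5)
        #             / ((n_i - r_i + 0.5)(R - r_i + 0.5))]
        # N:   documents in the corpus
        # n_i: corpus documents containing term i
        # R:   documents in the user's index (treated as relevant)
        # r_i: user-index documents containing term i
        return math.log((r_i + 0.5) * (N - n_i - R + r_i + 0.5) /
                        ((n_i - r_i + 0.5) * (R - r_i + 0.5)))

A document's score is then the sum of w_i over its matching terms, which is why the per-term weights on the slide add up to the example result's score of 1.3.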

12 Finding the Score Efficiently
Corpus representation (N, n_i):
- Web statistics
- Result set
Document representation:
- Download the document
- Use the result-set snippet
Efficiency hacks are generally OK!
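
A hedged sketch of the "result set" option above, assuming the corpus statistics are approximated from the returned snippets alone (the function name is made up for illustration):

    from collections import Counter

    def result_set_stats(snippets):
        # Treat the result set itself as the corpus: N is the number of
        # snippets, n_i the number of snippets containing each term.
        n = Counter()
        for terms in snippets:
            n.update(set(terms))
        return len(snippets), n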

13 Evaluating Personalized Search
- 15 evaluators
- Each evaluated 50 results for a query as highly relevant, relevant, or irrelevant
- Algorithm quality measured with discounted cumulative gain:

    DCG(i) = Gain(i)                          if i = 1
    DCG(i) = DCG(i-1) + Gain(i) / log(i)      otherwise
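
In Python, the slide's recurrence unrolls to the following (the gain values and log base are not specified on the slide; the 2/1/0 gain mapping and natural log below are assumptions):

    import math

    def dcg(gains):
        # DCG(1) = Gain(1); DCG(i) = DCG(i-1) + Gain(i)/log(i) for i > 1.
        total = 0.0
        for i, gain in enumerate(gains, start=1):
            total += gain if i == 1 else gain / math.log(i)
        return total

    # E.g. highly relevant = 2, relevant = 1, irrelevant = 0:
    print(dcg([2, 1, 0, 1]))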

14 Evaluating Personalized Search
Query selection:
- Chose from 10 pre-selected queries (e.g. cancer, Microsoft, traffic, Las Vegas, rice, McDonalds, ...)
- Or a previously issued query (e.g. bison frise, Red Sox, airlines, ...)
53 pre-selected (2-9 per query); total: 137.
[Diagram: example users Joe and Mary with their query lists]

15 Seesaw Improves Text Retrieval [Chart: retrieval quality for Random, Relevance Feedback, and Seesaw]

16 Text Features Not Enough

17 Take Advantage of Web Ranking

18 Further Exploration
- Explore a larger parameter space
- Learn parameters
  - Based on the individual
  - Based on the query
  - Based on the results
- Give the user control?

19 Making Seesaw Practical
- Learn the most about personalization by deploying a system
- The best algorithm is reasonably efficient
- Merging server and client:
  - Query expansion: get more relevant results into the set to be re-ranked (see the sketch below)
  - Design snippets for personalization
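
A speculative sketch of that query-expansion step, assuming the client simply appends its strongest user-model terms before sending the query to the server (the term-selection rule is an assumption, not the slides' method):

    def expand_query(query, user_model, k=3):
        # Append the k highest-count user-model terms not already in the query.
        top = sorted(user_model, key=user_model.get, reverse=True)[:k]
        extra = [t for t in top if t not in query.split()]
        return query + " " + " ".join(extra)

    print(expand_query("jaguar", {"dog": 1, "search": 93, "mit": 4, "amherst": 12}))
    # -> "jaguar search amherst mit"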

20 User Interface Issues
- Make personalization transparent
- Give the user control over personalization
  - Slider between Web and personalized results
  - Allows for background computation
- Creates a problem with re-finding
  - Results change as the user model changes
  - Thesis research: the Re:Search Engine

21 Thank you! teevan@csail.mit.edu

