Potential for Personalization
Transactions on Computer-Human Interaction, 17(1), March 2010
Data Mining for Understanding User Needs
Jaime Teevan, Susan Dumais, and Eric Horvitz
Microsoft Research

CFP Paper

Questions
– How good are search results?
– Do people want the same results for a query?
– How to capture variation in user intent?
  – Explicitly
  – Implicitly
– How can we use what we learn?

Personalization Research
– Ask the searcher: Is this relevant?
– Look at the searcher’s clicks
– Similarity to content the searcher has seen before

Ask the Searcher
Explicit indicator of relevance
Benefits
– Direct insight
Drawbacks
– Amount of data limited
– Hard to get answers for the same query
– Unlikely to be available in a real system

Searcher’s Clicks
Implicit behavior-based indicator of relevance
Benefits
– Possible to collect from all users
Drawbacks
– People click by mistake or get sidetracked
– Biased towards what is presented
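The presentation-bias drawback can be made concrete. Below is a minimal sketch, not from the talk, of turning a click log into a relevance signal: each click is up-weighted by an assumed probability that the user even examined that rank, so clicks on low-ranked results count for more. The EXAM_PROB values and the toy log are illustrative assumptions.

```python
from collections import defaultdict

# Assumed examination probabilities by rank (illustrative, not measured):
# users rarely look past the first few results, so raw click counts are
# biased toward whatever the engine happened to present near the top.
EXAM_PROB = {1: 1.00, 2: 0.85, 3: 0.70, 4: 0.55, 5: 0.45,
             6: 0.40, 7: 0.35, 8: 0.30, 9: 0.28, 10: 0.25}

def click_relevance(log):
    """Estimate per-(query, url) relevance from (query, url, rank, clicked)
    records, weighting each click by 1 / P(examined at that rank) to
    partially offset presentation bias."""
    clicks = defaultdict(float)
    impressions = defaultdict(int)
    for query, url, rank, clicked in log:
        impressions[(query, url)] += 1
        if clicked:
            clicks[(query, url)] += 1.0 / EXAM_PROB.get(rank, 0.25)
    return {key: clicks[key] / impressions[key] for key in impressions}

# Toy log: the single click at rank 4 ends up counting for more than the
# single click at rank 1, because fewer users ever looked at rank 4.
log = [("jaguar", "jaguar.com", 1, True),
       ("jaguar", "en.wikipedia.org/wiki/Jaguar", 4, True),
       ("jaguar", "jaguar.com", 1, False)]
print(click_relevance(log))
```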

Similarity to Seen Content
Implicit content-based indicator of relevance
Benefits
– Can collect from all users
– Can collect for all queries
Drawbacks
– Privacy considerations
– Measures of textual similarity noisy
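A minimal sketch of the content-based idea, under the assumption that the searcher's profile is just term frequencies over previously seen text and that similarity is term-frequency cosine; the tokenizer and the toy data are made up. It also hints at why such measures are noisy: unrelated texts still share common words.

```python
import math
import re
from collections import Counter

def tokens(text):
    """Crude lowercase word tokenizer, standing in for real text processing."""
    return re.findall(r"[a-z]+", text.lower())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors (Counters)."""
    dot = sum(count * b[term] for term, count in a.items() if term in b)
    norm = math.sqrt(sum(v * v for v in a.values()))
    norm *= math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical profile: term frequencies over content the searcher has seen.
seen = ["jaguar price list and local dealerships",
        "luxury car reviews: the new jaguar road test"]
profile = Counter(t for doc in seen for t in tokens(doc))

# Score candidate results for the ambiguous query "jaguar" against the profile.
results = {"jaguar.com": "jaguar luxury cars and suvs",
           "en.wikipedia.org/wiki/Jaguar": "the jaguar is a large cat"}
for url, snippet in results.items():
    print(url, round(cosine(profile, Counter(tokens(snippet))), 3))
```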

Summary of Data Sets

                 Explicit    Implicit: Behavior    Implicit: Content
  # Users           …                …M                   59
  # Queries        119              44 K                  24
  >5 Users          17              44 K                  24
  # Instances       …                …M                  822

Questions
– How good are search results?
– Do people want the same results for a query?
– How to capture variation in user intent?
  – Explicitly
  – Implicitly
– How can we use what we learn?

How Good Are Search Results?
– Lots of relevant results ranked low
– Content data also identifies low-ranked results
– Behavior data has presentation bias
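The first bullet can be quantified with a rank-discounted quality measure such as DCG, which is standard in search evaluation (the talk does not specify its exact metric); a short sketch with made-up relevance gains:

```python
import math

def dcg(gains):
    """Discounted cumulative gain: a result's relevance gain is divided by
    log2(rank + 1), so relevant results ranked low contribute very little."""
    return sum(g / math.log2(rank + 2) for rank, g in enumerate(gains))

# Made-up gains (2 = highly relevant, 0 = not) in presented order: the
# relevant results sit at ranks 3, 5, and 8.
presented = [0, 0, 2, 0, 1, 0, 0, 2, 0, 0]
ideal = sorted(presented, reverse=True)
print(round(dcg(presented) / dcg(ideal), 3))  # nDCG well below 1.0
```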

Do People Want the Same Results?
– What’s best: for you? For everyone?
– When it’s just you, results can be ranked perfectly
– With many people, the ranking must be a compromise

Do People Want the Same Results?
(Figure: Potential for Personalization curves)
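Operationally, each user's own ideal ranking scores a normalized DCG of 1.0, while a single shared ranking must average over everyone, so the gap between the two curves is 1 minus the best group nDCG. A toy sketch of that gap, assuming nDCG as the quality measure and brute-forcing the best shared ranking over three documents (fine for a toy example, not how one would do it at scale):

```python
import math
from itertools import permutations

def ndcg(ranking, gains):
    """Normalized DCG of a ranking (list of doc ids) for one user's gains."""
    dcg = sum(gains.get(d, 0) / math.log2(i + 2) for i, d in enumerate(ranking))
    ideal = sorted(gains.values(), reverse=True)
    idcg = sum(g / math.log2(i + 2) for i, g in enumerate(ideal))
    return dcg / idcg if idcg else 0.0

def potential_for_personalization(judgments, docs):
    """1 - (best average nDCG achievable by a single shared ranking).
    Each user's personal ideal ranking scores 1.0, so this is the gap a
    group ranking can never close when users disagree."""
    best_group = max(
        sum(ndcg(list(r), g) for g in judgments.values()) / len(judgments)
        for r in permutations(docs))
    return 1.0 - best_group

# Made-up judgments for one query: three users who want different results.
judgments = {"u1": {"a": 2, "c": 1},
             "u2": {"b": 2, "c": 1},
             "u3": {"c": 2}}
print(round(potential_for_personalization(judgments, ["a", "b", "c"]), 3))
```

Because no shared ranking satisfies all three users at once, the printed potential is positive; for a query where everyone agrees it would be 0.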

How to Capture Variation?
– Content data shows more variation than explicit judgments
– Behavior gap smaller because of presentation bias

How to Use What We Have Learned?
– Identify ambiguous queries
– Solicit more information about the need
– Personalize search using content- and behavior-based measures
(Figure: Web vs. Personalized results)
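As a sketch of the personalization bullet above, one simple way to combine the two implicit measures with the original web ranking is a linear blend with hand-picked weights; this is an assumption for illustration, not the paper's actual model.

```python
def personalize(web_ranking, behavior, content,
                w_rank=0.5, w_behavior=0.3, w_content=0.2):
    """Re-rank web results by blending the normalized original position with
    behavior-based (e.g., bias-corrected clicks) and content-based (e.g.,
    profile similarity) scores. Weights are illustrative, not tuned."""
    n = len(web_ranking)
    scored = []
    for i, url in enumerate(web_ranking):
        score = (w_rank * (n - i) / n
                 + w_behavior * behavior.get(url, 0.0)
                 + w_content * content.get(url, 0.0))
        scored.append((score, url))
    return [url for score, url in sorted(scored, reverse=True)]

web = ["a.com", "b.com", "c.com"]
behavior = {"c.com": 0.9}   # this user reliably clicks c.com for this query
content = {"b.com": 0.4}    # b.com resembles content the user has seen
print(personalize(web, behavior, content))  # c.com moves up past b.com
```

Here the behavior score pulls c.com above b.com; shifting weight toward w_content would let the profile similarity dominate instead.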

Answers
– Lots of relevant content ranked low
– Potential for personalization high
– Implicit measures capture explicit variation
  – Behavior-based: highly accurate
  – Content-based: lots of variation
– Example: personalized search
  – Behavior + content work best together
  – Improves search result clickthrough

THANK YOU!
Potential for Personalization