Evaluating Implicit Measures to Improve the Search Experience (SIGIR 2003) Steve Fox

Outline
- Background
- Approach
- Data Analysis
- Value-Add Contributions
- Result-Level Findings
- Session-Level Findings

Background
- Interested in implicit measures to improve users' search experience:
  - What the user wants
  - What satisfies them
- Believed the implicit measures were significant, but needed to prove it
- Two goals:
  1. Test the association between implicit measures and user satisfaction
  2. Understand which implicit measures were useful within this association

Approach
- Architecture:
  - Internet Explorer add-in, client-server
  - Configured for MSN Search and Google
  - SQL Server back-end
- Deployment: internal MS employees (n = 146), in their work environment
- Collected implicit measures and explicit feedback
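The architecture above pairs a logging client with a SQL Server back-end. As a minimal sketch of the kind of record such an add-in might log per clicked result, the following uses a dataclass; all field names here are assumptions for illustration, not the study's actual schema.

```python
# Hypothetical sketch of one implicit-measure event record, as the add-in
# might log it to the SQL Server back-end. Field names are assumptions.
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ResultEvent:
    user_id: int            # one of the n = 146 participants
    query: str              # query text submitted to MSN Search or Google
    position: int           # rank of the clicked result
    dwell_secs: float       # time spent on the result page
    scroll_count: int       # number of scroll events on the page
    exit_type: str          # how the user left the page, e.g. "back"
    sat: Optional[str]      # explicit feedback: "VSAT", "PSAT", "DSAT", or None

event = ResultEvent(7, "implicit feedback", 3, 42.5, 4, "back", "VSAT")
row = asdict(event)         # dict ready for a parameterized INSERT
```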

Approach, cont'd

Data Analysis
- Bayesian modeling at the result and session level
- Trained on 80% of the data, tested on 20%
- Three levels of SAT: VSAT, PSAT, and DSAT
- Implicit measures:
  - Result-level: Diff Secs, Duration Secs; Scrolled, ScrollCnt, AvgSecsBetweenScroll, TotalScrollTime, MaxScroll; TimeToFirstClick, TimeToFirstScroll; Page, Page Position, Absolute Position; Visits; Exit Type; ImageCnt, PageSize, ScriptCnt; Added to Favorites, Printed
  - Session-level: averages of result-level measures (Dwell Time and Position); Query count; Results set count; Results visited; End action
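The setup above can be sketched end to end: predict the three SAT levels from implicit measures with an 80/20 train/test split. The data below is synthetic and the classifier is a simple Gaussian naive Bayes standing in for the paper's Bayesian models; the feature choices (dwell time, click position) and class separations are assumptions for illustration.

```python
# Synthetic sketch: 80/20 split and a Gaussian naive Bayes predicting
# VSAT/PSAT/DSAT from two implicit measures (dwell time, click position).
import math
import random

random.seed(0)
LEVELS = ["VSAT", "PSAT", "DSAT"]

def make_example(level):
    # Assumption: higher satisfaction means longer dwell and a better rank.
    dwell = random.gauss({"VSAT": 60.0, "PSAT": 30.0, "DSAT": 10.0}[level], 5.0)
    position = random.gauss({"VSAT": 2.0, "PSAT": 4.0, "DSAT": 7.0}[level], 1.0)
    return ([dwell, position], level)

data = [make_example(random.choice(LEVELS)) for _ in range(500)]
split = int(0.8 * len(data))                  # 80% train / 20% test
train, test = data[:split], data[split:]

# Fit per-class mean and variance for each feature.
stats = {}
for level in LEVELS:
    rows = [x for x, y in train if y == level]
    per_feature = []
    for col in zip(*rows):
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        per_feature.append((mean, var))
    stats[level] = per_feature

def log_gauss(x, mean, var):
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def predict(features):
    return max(LEVELS, key=lambda lv: sum(
        log_gauss(f, m, v) for f, (m, v) in zip(features, stats[lv])))

accuracy = sum(predict(x) == y for x, y in test) / len(test)
```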

Data Analysis, cont'd

Result-Level Findings
1. Dwell time, clickthrough, and exit type were the strongest predictors of SAT
2. Printing and Adding to Favorites were highly predictive of SAT when present
3. Combined measures predict SAT better than clickthrough alone

Result-Level Findings, cont'd (80/20 train/test split)
- Clickthrough only
- Combined measures
- Combined measures with confidence > 0.5

Session-Level Findings
Four findings:
1. The strongest predictor of session-level SAT was result-level SAT
2. Dwell time was a strong predictor of SAT
3. A combination of (slightly different) implicit measures predicted SAT better than clickthrough alone
4. Some gene sequences predict SAT (preliminary and descriptive)
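The session-level features implied by the Data Analysis slide can be sketched as simple aggregates: average the result-level measures (dwell time, position) over a session and add counts such as the number of queries. Field and function names below are assumptions, not the study's code.

```python
# Sketch of session-level feature construction: averages of result-level
# measures plus session counts. Names are illustrative assumptions.
def session_features(result_events, query_count):
    dwells = [e["dwell_secs"] for e in result_events]
    positions = [e["position"] for e in result_events]
    return {
        "avg_dwell_secs": sum(dwells) / len(dwells),
        "avg_position": sum(positions) / len(positions),
        "query_count": query_count,
        "results_visited": len(result_events),
    }

feats = session_features(
    [{"dwell_secs": 40.0, "position": 1}, {"dwell_secs": 20.0, "position": 3}],
    query_count=2)
```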

Session-Level Findings, cont'd
Common patterns in the gene analysis, e.g. SqLrZ:
- S: session starts
- q: submit a query
- L: result list returned
- r: click a result
- Z: exit on result
(Table of per-pattern statistics: Pattern, Frequency, %VSAT, %PSAT, %DSAT, Avg. VSAT/PSAT/DSAT Dwell Time; the values for SqLrZ did not survive transcription.)

Value-Add Contributions
- Deployed in the work setting; collected data in the context of web search
- Rich user behavior data stream, annotated with explicit judgments
- Used a new methodology to analyze the data: gene analysis of usage patterns
- Mapped usage patterns to SAT

Question(s)