IST 497G Ron Grzywacz November 2002 Personalization of Information Retrieval.

Similar presentations
Critical Reading Strategies: Overview of Research Process

CSE594 Fall 2009 Jennifer Wong Oct. 14, 2009
What is a Professional Literature Review? Not to be confused with a book review, a literature review surveys scholarly articles, books and other sources.
Project Proposal.
The Literature Review in 3 Key Steps
Copyright © 2011 Wolters Kluwer Health | Lippincott Williams & Wilkins Chapter 6 Finding the Evidence: Informational Sources, Search Strategies, and Critical.
Best Web Directories and Search Engines Order Out of Chaos on the World Wide Web.
Automated Reference Assistance: Reference for a New Generation Denise Troll Covey Associate University Librarian Carnegie Mellon CNI Meeting – April 2002.
CSCD 555 Research Methods for Computer Science
INFO 624 Week 3 Retrieval System Evaluation
21 21 Web Content Management Architectures Vagan Terziyan MIT Department, University of Jyvaskyla, AI Department, Kharkov National University of Radioelectronics.
WebMiningResearchASurvey Web Mining Research: A Survey Raymond Kosala and Hendrik Blockeel ACM SIGKDD, July 2000 Presented by Shan Huang, 4/24/2007 Revised.
Interface for the University Library Catalogue Implementing Direct Manipulation Proposal 4.
Copyright c 2001 The McGraw-Hill Companies, Inc.1 Chapter 2 The Research Process: Getting Started Researcher as a detective Seeking answers to questions.
Personalized Ontologies for Web Search and Caching Susan Gauch Information and Telecommunications Technology Center Electrical Engineering and Computer.
August 15 click! 1 Basics Kitsap Regional Library.
Welcome to the CINAHL* tutorial By the end of this tutorial you should be able to: Do a basic search to find references Use search techniques to make your.
What’s The Difference??  Subject Directory  Search Engine  Deep Web Search.
Lesson 46: Using Information From the Web copy and paste information from a Web site print a Web page download information from a Web site customize Web.
Databases & Data Warehouses Chapter 3 Database Processing.
AdWords Instructor: Dawn Rauscher. Quality Score in Action 0a2PVhPQhttp:// 0a2PVhPQ.
Periodical Databases Full-text article – entire textual contents of article in online format Abstract – brief summary of article Citation – basic information.
Citation Recommendation 1 Web Technology Laboratory Ferdowsi University of Mashhad.
Personalization features to accelerate research Presented by: Armond DiRado Account Development Manager
THOMSON SCIENTIFIC Web of Science Using the specialized search and analyze features Jackie Stapleton, librarian Fall 2006.
Introduction to Microsoft Access 2003 Mr. A. Craig Dixon CIS 100: Introduction to Computers Spring 2006.
Recommendation system MOPSI project KAROL WAGA
UOS 1 Ontology Based Personalized Search Zhang Tao The University of Seoul.
Chapter Chapter 3 Internet Agents. Chapter Contents Background Web Search Agents Information Filtering Agents Notification Agents Other Service.
Software Requirements (Advanced Topics) “Walking on water and developing software from a specification are easy if both are frozen.” --Edward V Berard.
Search and Navigation Based on the paper, “Improved Search Engines and Navigation Preference in Personal Information Management” Ofer Bergman, Ruth Beyth-Marom,
1 Information Retrieval Acknowledgements: Dr Mounia Lalmas (QMW) Dr Joemon Jose (Glasgow)
TOPIC CENTRIC QUERY ROUTING Research Methods (CS689) 11/21/00 By Anupam Khanal.
How to read a scientific paper
XP New Perspectives on The Internet, Sixth Edition— Comprehensive Tutorial 3 1 Searching the Web Using Search Engines and Directories Effectively Tutorial.
Personalized Search Xiao Liu
CSM06 Information Retrieval Lecture 6: Visualising the Results Set Dr Andrew Salway
2007. Software Engineering Laboratory, School of Computer Science S E Web-Harvest Web-Harvest: Open Source Web Data Extraction tool 이재정 Software Engineering.
Data Mining for Web Intelligence Presentation by Julia Erdman.
Before we begin… Go to the LHS webpage. Click “Departments”. Click “Media Center”. Or go directly to
22 November Databases. Presentations Tega: news 1954 Prediction.
LOGO A comparison of two web-based document management systems ShaoxinYu Columbia University March 31, 2009.
Exercise Your your Library ® RefWorks: The Basics October 10, 2006.
Personalized Interaction With Semantic Information Portals Eric Schwarzkopf DFKI
Welcome to the Business Source Premier tutorial By the end of this tutorial you should be able to: Do a basic search to find references Use search techniques.
Lindsey Aldrich Website Analysis: EBSCOhost An Outstanding Research Engine and Academic Tool.
WEB 2.0 PATTERNS Carolina Marin. Content  Introduction  The Participation-Collaboration Pattern  The Collaborative Tagging Pattern.
Digital Libraries1 David Rashty. Digital Libraries2 “A library is an arsenal of liberty” Anonymous.
Information Retrieval
Web Information Retrieval Prof. Alessandro Agostini 1 Context in Web Search Steve Lawrence Speaker: Antonella Delmestri IEEE Data Engineering Bulletin.
Augmenting (personal) IR Readings Review Evaluation Papers returned & discussed Papers and Projects checkin time.
Chapter. 3: Retrieval Evaluation 1/2/2016Dr. Almetwally Mostafa 1.
A System for Automatic Personalized Tracking of Scientific Literature on the Web Tzachi Perlstein Yael Nir.
1 FollowMyLink Individual APT Presentation First Talk February 2006.
Finding similar items by leveraging social tag clouds Speaker: Po-Hsien Shih Advisor: Jia-Ling Koh Source: SAC 2012’ Date: October 4, 2012.
What is Multimedia Anyway? David Millard and Paul Lewis.
Protecting your search privacy A lesson plan created & presented by Maria Bernhey (MLS) Adjunct Information Literacy Instructor
Thomas Grandell April 8 th, 2016 This work is licensed under the Creative Commons Attribution 4.0 International.
 Project Team: Suzana Vaserman David Fleish Moran Zafir Tzvika Stein  Academic adviser: Dr. Mayer Goldberg  Technical adviser: Mr. Guy Wiener.
SEMINAR ON INTERNET SEARCHING PRESENTED BY:- AVIPSA PUROHIT REGD NO GUIDED BY:- Lect. ANANYA MISHRA.
Searching the Web for academic information Ruth Stubbings.
Client-Side Internet and Web Programming
OARE Module 5A: Scopus (Elsevier)
Metasearch Thanks to Eric Glover NEC Research Institute.
User Characterization in Search Personalization
Augmenting (personal) IR
Prepared by Rao Umar Anwar For Detail information Visit my blog:
IST 497E Information Retrieval and Organization
Information Retrieval and Web Design
Presentation transcript:

IST 497G Ron Grzywacz November 2002 Personalization of Information Retrieval

Overview
- The topic
- Issues
- Importance
- What has been done, and how
- The next step

Personalization of IR
This refers to the automatic adjustment of information content, structure, and presentation, tailored to an individual user. Characteristics that can drive this tailoring include:
- Age
- Gender
- Special interest groups
- Topic
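
As a minimal illustrative sketch (not something from the original slides), these characteristics could be represented in a simple user-profile record; the field names below are assumptions made for the example, not the schema of any real system.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class UserProfile:
        """Hypothetical container for the traits a personalized IR system
        might tailor results to; every field here is an assumption."""
        user_id: str
        age: Optional[int] = None
        gender: Optional[str] = None
        interest_groups: list = field(default_factory=list)
        topics: list = field(default_factory=list)

    # Example: a researcher whose profile favours machine-learning material
    profile = UserProfile(user_id="u42",
                          interest_groups=["graduate students"],
                          topics=["machine learning", "information retrieval"])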

Issues: Why use personalization?
One size does not fit all:
- It limits diversity
- It limits functionality
- It limits competition
Different users have different needs.

Issues: How do we customize the process?
- Which characteristics should be used? Not all people in a given group are the same.
- It is virtually impossible to create one unique search engine for every individual.
- It would not make sense to build 100 million different versions of Google from the ground up, each based on one user's needs.

Importance: Why is this important?
- Not every solution is ideal for each person
- Certain people do not understand how to use certain systems
- Current systems are not tailored for a specific type of user

The Example
- Suppose we query a system for "Michael Jordan"
- Most popular engines would return information about the famous basketball player
- Suppose instead that we were looking for computer science papers written by Michael Jordan
- The query "Michael Jordan" alone carries no information about the context of our desired search

Context Search
This presents a problem: we have no way to infer or assume the context of a user's desired search, so returning contextually relevant results is currently a hit-or-miss process. What if we tried to infer contextual information automatically?

Automatic Inference of Contextual Information
By monitoring user patterns, we could gather information about the user and the possible context of a search. This raises privacy issues:
- Do you want some company building up a record of your preferences?
- What happens if this information is released, or misused against you?
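
As a hedged sketch of what "monitoring user patterns" could mean in practice (an assumption for illustration, not the method of any system named in these slides), a client could keep running term counts over the pages a user reads and treat the most frequent terms as a rough long-term interest profile:

    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "is", "for", "on"}

    def update_profile(profile, page_text):
        """Fold the words of one viewed page into a running interest profile."""
        for word in re.findall(r"[a-z]+", page_text.lower()):
            if word not in STOPWORDS and len(word) > 2:
                profile[word] += 1
        return profile

    def top_interests(profile, k=10):
        """Return the k most frequent terms as a crude model of the user's interests."""
        return [term for term, _ in profile.most_common(k)]

    # Example usage with two hypothetical visited pages
    profile = Counter()
    update_profile(profile, "NBA playoffs: Jordan and the Bulls win again")
    update_profile(profile, "Chicago Bulls basketball season recap")
    print(top_interests(profile, 5))

Keeping such a profile on the client, rather than on some company's server, is one way to soften the privacy concerns raised above.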

Personalized Search
By combining the previous items we can build a personalized search service. The example "Michael Jordan" query could return data about the basketball player to sports enthusiasts, and information about computer science papers to researchers.
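
A minimal sketch of that combination step, assuming a simple overlap score between each result's text and the user's interest terms (the scoring rule and the sample results are illustrative assumptions, not how any engine mentioned here actually ranks):

    def personalized_rank(results, interest_terms):
        """Re-rank (title, snippet) pairs by word overlap with the user's interests."""
        interests = {t.lower() for t in interest_terms}

        def score(result):
            title, snippet = result
            words = set((title + " " + snippet).lower().split())
            return len(words & interests)

        return sorted(results, key=score, reverse=True)

    results = [
        ("Michael Jordan career stats", "basketball legend of the Chicago Bulls"),
        ("Michael I. Jordan publications", "papers on machine learning and graphical models"),
    ]

    # A researcher's interest terms push the publications page to the top
    print(personalized_rank(results, ["machine", "learning", "papers"]))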

What's Been Done? Inquirus / Inquirus 2
- A meta-search engine developed by a team at NEC (which included Prof. Giles)
- It attempts to attach a category, i.e. contextual information, to a keyword search
- Example categories: "personal homepage", "research paper", and "general introductory information"
- It uses this information to query relevant search engines, modify queries, and select an ordering policy
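
The slide describes a category-driven rewrite of the query; the mapping below is purely a guess at how such a front end might expand a query before dispatching it to other engines, and is not Inquirus 2's actual rule set:

    # Hypothetical category -> extra query terms (illustrative only)
    CATEGORY_HINTS = {
        "research paper": ["abstract", "references"],
        "personal homepage": ["homepage", "cv"],
        "general introductory information": ["introduction", "overview"],
    }

    def modify_query(keywords, category):
        """Append category-specific hint terms to a plain keyword query."""
        return " ".join(keywords + CATEGORY_HINTS.get(category, []))

    print(modify_query(["michael", "jordan"], "research paper"))
    # -> "michael jordan abstract references"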

What's Been Done? My experiences with Inquirus
- The service offers three search options: Web Search, Google, and Groups (Google Groups)
- Both Google and Groups returned valid results
- Web Search also returned valid results, but the rank order was not as good
- It also included words such as "of" in the search terms; this could be problematic, since such words sometimes matched the most results

What's Been Done? The Watson Project (Northwestern Univ.)
Suppose someone is searching for "information about cats". The same words can come from many different scenarios, each calling for different results:
- A veterinary student writing a term paper on animal cancer → feline cancer, diagnosis, treatment
- A contractor working on a proposal for a new building → Caterpillar Corp. machinery
- A grade-school student writing a paper about Egypt → pictures and stories about cat mummies and gods

What's Been Done? The problems with the query
- Relevance of active goals: the active goals of the user contribute significantly to the interpretation of the query and to the criteria for judging a resource relevant to the query.
- Word-sense ambiguity: the intended sense of "cat" differs in each scenario; the context of the request provides a clear choice of word sense.
- Audience appropriateness: the audiences in each of the scenarios also constrain the choice of results. Sources appropriate for a veterinarian probably will not be appropriate for a grade-school student.

What's Been Done? The Watson Project solution
- A system that collects contextual information from everyday computer use
- Watson is a client-side application that monitors your daily use of applications such as word processors, web browsers, and other clients
- By knowing about you and your work, Watson can help you find information that is relevant to you
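
As a rough approximation of this just-in-time idea (assumed for illustration; this is not Watson's actual algorithm), a client-side helper could pull salient terms from the document the user is currently working on and attach them to whatever query the user issues:

    import re
    from collections import Counter

    def context_terms(active_document_text, k=5):
        """Pick the k most frequent longer words from the document being edited."""
        words = re.findall(r"[a-z]{4,}", active_document_text.lower())
        return [w for w, _ in Counter(words).most_common(k)]

    def contextualized_query(user_query, active_document_text):
        """Augment the user's query with terms describing their current work."""
        return user_query + " " + " ".join(context_terms(active_document_text))

    draft = "Feline cancer rates, diagnosis and treatment in modern veterinary practice"
    print(contextualized_query("information about cats", draft))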

What's Been Done? My experience with Watson
- A small download and a brief install loaded the Java-based application
- I used it while creating this presentation
- It managed to return results about search engines and their development, but it did not return anything relevant to my specific topic
- It did manage to generate a search in CiteSeer for me

What's Been Done: Context already assumed
Although we cannot yet automatically infer the context of a user's search, people have built engines that work within a given context.
- CiteSeer: a search engine for research papers in the scientific literature

What's Been Done
- PubMed: a customized science and medicine database of journals and articles
- Questia: a search engine for students who are doing research and writing papers

What's Been Done: A different approach using personalized IR
In what other ways can we use this type of technology?
- KnowledgeFlow Inc.'s Web Angel: a browser plug-in that stores your preferences and returns customized advertisements as you browse the web
- A practical consumer/commercial application of personalized IR

What's Been Done: Other ways to personalize web content
Recommender systems: a system that monitors your habits, or takes input about your preferences, and suggests things you might be interested in. Examples:
- Amazon.com
- Barnes and Noble

Other Recommender Systems
- MovieLens: a movie selection service. It gives you a survey to gauge your preferences and makes recommendations based on your likes.
- Book Forager: a novel selection service. It lets you choose a variety of book characteristics and makes a recommendation based on your current choices.
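
To make the recommender idea concrete, here is a small assumed sketch of preference-based recommendation in a nearest-neighbour style; it is not the actual algorithm behind MovieLens, Book Forager, or the retail sites above. It suggests items liked by the other user whose ratings most resemble yours:

    def similarity(ratings_a, ratings_b):
        """Count the co-rated items on which two users gave the same rating (crude)."""
        shared = set(ratings_a) & set(ratings_b)
        return sum(1 for item in shared if ratings_a[item] == ratings_b[item])

    def recommend(target, others):
        """Suggest items rated highly (>= 4) by the most similar other user."""
        neighbour = max(others, key=lambda name: similarity(target, others[name]))
        return [item for item, rating in others[neighbour].items()
                if rating >= 4 and item not in target]

    me = {"Toy Story": 5, "Heat": 2}
    community = {
        "alice": {"Toy Story": 5, "Heat": 2, "Shrek": 5},
        "bob":   {"Toy Story": 1, "Heat": 5, "Die Hard": 4},
    }
    print(recommend(me, community))   # -> ['Shrek']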

What's Next? How can we build upon current services?
- Use AI to evaluate the query and determine contextual information
- Use user-provided information to generate specific data
- Build upon user-provided data by monitoring browsing preferences

Summary
- The topic
- Issues
- Importance
- What has been done, and how
- The next step

Citations
- Budzik, J., and Hammond, K. User Interactions with Everyday Applications as Context for Just-in-time Information Access. In Proceedings of Intelligent User Interfaces, ACM Press, 2000. (Nominated for Best Paper Award.)
- Lawrence, S. Context in Web Search. IEEE Data Engineering Bulletin, Volume 23, Number 3, pp. 25–32, 2000.
- Ramakrishnan, N., and Perugini, S. The Partial Evaluation Approach to Information Personalization. ACM Transactions on Information Systems, August 2001.