Recuperação de Informação (Information Retrieval). IR: representation, storage, organization of, and access to information items. Emphasis is on the retrieval of information (not data).

Similar presentations
Chapter 5: Introduction to Information Retrieval
Modern information retrieval Modelling. Introduction IR systems usually adopt index terms to process queries IR systems usually adopt index terms to process.
Multimedia Database Systems
Basic IR: Modeling Basic IR Task: Slightly more complex:
The Probabilistic Model. Probabilistic Model n Objective: to capture the IR problem using a probabilistic framework; n Given a user query, there is an.
Introduction to Information Retrieval (Part 2) By Evren Ermis.
Motivation and Outline
IR Models: Overview, Boolean, and Vector
Information Retrieval Ling573 NLP Systems and Applications April 26, 2011.
ISP 433/533 Week 2 IR Models.
Information Retrieval Modeling CS 652 Information Extraction and Integration.
T.Sharon - A.Frank 1 Internet Resources Discovery (IRD) IR Queries.
Information Retrieval Concerned with the: Representation of Storage of Organization of, and Access to Information items.
Ch 4: Information Retrieval and Text Mining
Modern Information Retrieval Chapter 2 Modeling. Can keywords be used to represent a document or a query? keywords as query and matching as query processing.
Chapter 2 Modeling CS 4B 陳建勳. Introduction. Traditional information retrieval systems usually adopt index terms to index and retrieve documents.
1/22 --Are you getting emails sent to the class mailing list? (if not, send me email) Agenda: --Intro wrapup --Start on text retrieval Link du jour: vivisimo.com.
Modeling Modern Information Retrieval
Evaluating the Performance of IR Sytems
Vector Space Model CS 652 Information Extraction and Integration.
The Vector Space Model …and applications in Information Retrieval.
Modern Information Retrieval Chapter 2 Modeling. Can keywords be used to represent a document or a query? keywords as query and matching as query processing.
IR Models: Review Vector Model and Probabilistic.
Chapter 5: Information Retrieval and Web Search
Information Retrieval: Foundation to Web Search Zachary G. Ives University of Pennsylvania CIS 455 / 555 – Internet and Web Systems August 13, 2015 Some.
 IR: representation, storage, organization of, and access to information items  Focus is on the user information need  User information need:  Find.
Modeling (Chap. 2) Modern Information Retrieval Spring 2000.
CS344: Introduction to Artificial Intelligence Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 32-33: Information Retrieval: Basic concepts and Model.
Basics of Information Retrieval Lillian N. Cassel Some of these slides are taken or adapted from Source:
Modern Information Retrieval Computer engineering department Fall 2005.
Information Retrieval Introduction/Overview Material for these slides obtained from: Modern Information Retrieval by Ricardo Baeza-Yates and Berthier Ribeiro-Neto.
PrasadL2IRModels1 Models for IR Adapted from Lectures by Berthier Ribeiro-Neto (Brazil), Prabhakar Raghavan (Yahoo and Stanford) and Christopher Manning.
Information Retrieval Chapter 2: Modeling 2.1, 2.2, 2.3, 2.4, 2.5.1, 2.5.2, Slides provided by the author, modified by L N Cassel September 2003.
Information Retrieval Models - 1 Boolean. Introduction IR systems usually adopt index terms to process queries Index terms:  A keyword or group of selected.
Xiaoying Gao Computer Science Victoria University of Wellington Intelligent Agents COMP 423.
Chapter 6: Information Retrieval and Web Search
Introduction to Digital Libraries hussein suleman uct cs honours 2003.
CSCE 5300 Information Retrieval and Web Search Introduction to IR models and methods Instructor: Rada Mihalcea Class web page:
Information Retrieval CSE 8337 Spring 2005 Modeling Material for these slides obtained from: Modern Information Retrieval by Ricardo Baeza-Yates and Berthier.
Comparing and Ranking Documents Once our search engine has retrieved a set of documents, we may want to Rank them by relevance –Which are the best fit.
1 University of Palestine Topics In CIS ITBS 3202 Ms. Eman Alajrami 2 nd Semester
Introduction to Information Retrieval Aj. Khuanlux MitsophonsiriCS.426 INFORMATION RETRIEVAL.
Vector Space Models.
Information Retrieval CSE 8337 Spring 2007 Introduction/Overview Some Material for these slides obtained from: Modern Information Retrieval by Ricardo.
Recuperação de Informação Cap. 01: Introdução 21 de Fevereiro de 1999 Berthier Ribeiro-Neto.
Information Retrieval
Information Retrieval and Web Search Introduction to IR models and methods Rada Mihalcea (Some of the slides in this slide set come from IR courses taught.
Data Integration and Information Retrieval Zachary G. Ives University of Pennsylvania CIS 455 / 555 – Internet and Web Systems February 18, 2008 Some slides.
Recuperação de Informação B Cap. 02: Modeling (Latent Semantic Indexing & Neural Network Model) 2.7.2, September 27, 1999.
Introduction n IR systems usually adopt index terms to process queries n Index term: u a keyword or group of selected words u any word (more general) n.
Probabilistic Model n Objective: to capture the IR problem using a probabilistic framework n Given a user query, there is an ideal answer set n Querying.
Xiaoying Gao Computer Science Victoria University of Wellington COMP307 NLP 4 Information Retrieval.
Information Retrieval Models School of Informatics Dept. of Library and Information Studies Dr. Miguel E. Ruiz.
Modern Information Retrieval
Information Retrieval
Information Retrieval and Web Search
Representation of documents and queries
CSE 635 Multimedia Information Retrieval
Chapter 5: Information Retrieval and Web Search
4. Boolean and Vector Space Retrieval Models
Recuperação de Informação B
Boolean and Vector Space Retrieval Models
Recuperação de Informação B
Recuperação de Informação B
Information Retrieval and Web Design
Recuperação de Informação B
Recuperação de Informação
Recuperação de Informação B
Information Retrieval and Web Design
Advanced information retrieval
Presentation transcript:

Recuperação de Informação (Information Retrieval)

Introduction to IR
- IR: representation, storage, organization of, and access to information items
- Emphasis is on the retrieval of information (not data)
- Focus is on the user information need

- Data retrieval
  - well-defined semantics
  - a single erroneous object implies failure!
- Information retrieval
  - information about a subject or topic
  - semantics is frequently loose
  - small errors are tolerated
- An IR system must:
  - interpret the contents of information items
  - generate a ranking which reflects relevance
  - the notion of relevance is most important

IR at the center of the stage
- IR in the last 20 years:
  - classification and categorization
  - systems and languages
  - user interfaces and visualization
- Still, the area was seen as of narrow interest
- The advent of the Web changed this perception once and for all:
  - a universal repository of knowledge
  - free (low-cost) universal access
  - no central editorial board
  - many problems though: IR is seen as key to finding the solutions!

Logical view of the documents
[Figure: text operations take a document from its full text to a set of index terms — accents and spacing normalization, stopword removal, noun-group detection, stemming, and (optionally) manual indexing; document structure, full text, and index terms are successive logical views of the docs]
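To make the pipeline concrete, here is a minimal Python sketch of these text operations (lowercasing, stopword removal, and a crude suffix-stripping stemmer). The stopword list and the stemming rules are illustrative assumptions, not the ones used in the book; a real system would use a full stopword list and a proper stemmer such as Porter's algorithm.

```python
import re

# Illustrative stopword list and suffix rules; real systems use much larger
# lists and a proper stemming algorithm.
STOPWORDS = {"the", "a", "an", "of", "and", "to", "is", "in", "for", "on"}

def crude_stem(word):
    """Strip a few common English suffixes (a very rough stand-in for stemming)."""
    for suffix in ("ions", "ion", "ings", "ing", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def index_terms(text):
    """Full text -> index terms: lowercase, tokenize, drop stopwords, stem."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [crude_stem(t) for t in tokens if t not in STOPWORDS]

print(index_terms("Connecting connections: an introduction to information retrieval"))
# ['connect', 'connect', 'introduct', 'informat', 'retrieval']
```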

The Retrieval Process
[Figure: a user need is expressed as a text query at the user interface; text operations and query operations produce the logical view of the query; indexing builds an inverted file over the text database, maintained by the DB manager module; searching and ranking return retrieved docs and ranked docs, which may be refined through user feedback]

Basic Concepts
- IR systems usually adopt index terms to process queries
- Index term:
  - a keyword or group of selected words
  - any word (more general)
- Stemming might be used:
  - connect: connecting, connection, connections
- An inverted file is built for the chosen index terms, as sketched below
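As a small illustration of the last point, this sketch builds an inverted file mapping each index term to the documents that contain it. The simple whitespace tokenizer is a stand-in for the full text-operations pipeline, and the document ids and names are invented for the example.

```python
from collections import defaultdict

def build_inverted_file(docs, tokenize=lambda text: text.lower().split()):
    """Map each index term to the set of document ids containing it.
    tokenize is a stand-in for the full text-operations pipeline."""
    inverted = defaultdict(set)
    for doc_id, text in docs.items():
        for term in tokenize(text):
            inverted[term].add(doc_id)
    return inverted

docs = {
    1: "information retrieval and web search",
    2: "probabilistic models for information retrieval",
    3: "multimedia database systems",
}
inverted = build_inverted_file(docs)
print(sorted(inverted["retrieval"]))  # [1, 2]
```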

Basic Concepts
- Matching at the index-term level is quite imprecise
- No surprise that users are frequently unsatisfied
- Since most users have no training in query formulation, the problem is even worse
- Hence the frequent dissatisfaction of Web users
- The issue of deciding relevance is critical for IR systems: ranking

Basic Concepts
- A ranking is an ordering of the retrieved documents that (hopefully) reflects their relevance to the user query
- A ranking is based on fundamental premises regarding the notion of relevance, such as:
  - common sets of index terms
  - sharing of weighted terms
  - likelihood of relevance
- Each set of premises leads to a distinct IR model

Basic Concepts
- Each document is represented by a set of representative keywords or index terms
- An index term is a document word useful for remembering the document's main themes
- Usually, index terms are nouns, because nouns have meaning by themselves
- However, some search engines assume that all words are index terms (full-text representation)

Basic Concepts
- Not all terms are equally useful for representing the document contents: less frequent terms allow identifying a narrower set of documents
- The importance of the index terms is represented by weights associated with them
- Let
  - ki be an index term
  - dj be a document
  - wij be a weight associated with (ki, dj)
- The weight wij quantifies the importance of the index term for describing the document contents

Basic Concepts
- ki is an index term
- dj is a document
- t is the total number of index terms
- K = (k1, k2, ..., kt) is the set of all index terms
- wij >= 0 is a weight associated with (ki, dj)
- wij = 0 indicates that the term does not belong to the doc
- vec(dj) = (w1j, w2j, ..., wtj) is the weighted vector associated with the document dj
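A toy illustration of this representation, assuming a four-term vocabulary; the terms, weights, and names are invented for the example.

```python
# Toy vocabulary K = (k1, ..., kt); here t = 4.
K = ["information", "retrieval", "web", "database"]

# Term weights per document: a missing term has w_ij = 0.
doc_weights = {
    "d1": {"information": 0.8, "retrieval": 0.7},
    "d2": {"database": 1.0, "information": 0.2},
}

def to_vector(weights, vocabulary):
    """vec(dj) = (w1j, ..., wtj), one component per index term in K."""
    return [weights.get(term, 0.0) for term in vocabulary]

print(to_vector(doc_weights["d1"], K))  # [0.8, 0.7, 0.0, 0.0]
```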

The Vector Model
- Use of binary weights is too limiting
- Non-binary weights allow for partial matching
- These term weights are used to compute a degree of similarity between a query and each document
- The ranked set of documents provides for better matching

The Vector Model
- Define:
  - wij > 0 whenever ki ∈ dj
  - wiq >= 0 is associated with the pair (ki, q)
  - vec(dj) = (w1j, w2j, ..., wtj) and vec(q) = (w1q, w2q, ..., wtq)
  - to each term ki is associated a unit vector vec(i)
  - the unit vectors vec(i) and vec(j) are assumed to be orthonormal (i.e., index terms are assumed to occur independently within the documents)
- The t unit vectors vec(i) form an orthonormal basis for a t-dimensional space
- In this space, queries and documents are represented as weighted vectors

The Vector Model
- sim(q, dj) = cos(θ) = (vec(dj) • vec(q)) / (|dj| * |q|) = (Σi wij * wiq) / (|dj| * |q|)
- Since wij >= 0 and wiq >= 0, we have 0 <= sim(q, dj) <= 1
- A document is retrieved even if it matches the query terms only partially
[Figure: the angle θ between the query vector q and the document vector dj in term space]
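A minimal sketch of this cosine ranking formula, with hypothetical weight vectors over a three-term vocabulary (k1, k2, k3):

```python
import math

def cosine_similarity(d, q):
    """sim(q, d) = (d . q) / (|d| * |q|), for weight vectors over the same term set."""
    dot = sum(wd * wq for wd, wq in zip(d, q))
    norm_d = math.sqrt(sum(w * w for w in d))
    norm_q = math.sqrt(sum(w * w for w in q))
    if norm_d == 0 or norm_q == 0:
        return 0.0
    return dot / (norm_d * norm_q)

# Toy weights over terms (k1, k2, k3)
d1 = [0.8, 0.0, 0.3]   # document vector
q  = [0.5, 0.5, 0.0]   # query vector
print(round(cosine_similarity(d1, q), 3))  # a partial match still gets a nonzero score
```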

The Vector Model
- sim(q, dj) = (Σi wij * wiq) / (|dj| * |q|)
- How to compute the weights wij and wiq?
- A good weight must take into account two effects:
  - quantification of intra-document contents (similarity): the tf factor, the term frequency within a document
  - quantification of inter-document separation (dissimilarity): the idf factor, the inverse document frequency
- wij = tf(i,j) * idf(i)

The Vector Model
- Let
  - N be the total number of docs in the collection
  - ni be the number of docs which contain ki
  - freq(i,j) be the raw frequency of ki within dj
- A normalized tf factor is given by
  - f(i,j) = freq(i,j) / max_l(freq(l,j))
  - where the maximum is computed over all terms which occur within the document dj
- The idf factor is computed as
  - idf(i) = log(N/ni)
  - the log is used to make the values of tf and idf comparable; it can also be interpreted as the amount of information associated with the term ki
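The sketch below computes wij = f(i,j) * log(N/ni) as defined above, assuming the documents have already been reduced to index terms. The natural logarithm is used here (the slide does not fix a base), and the toy collection is invented for the example.

```python
import math
from collections import Counter

def tf_idf_weights(doc_tokens_by_id):
    """w_ij = f(i,j) * log(N / n_i), with f(i,j) = freq(i,j) / max_l freq(l,j).
    doc_tokens_by_id: {doc_id: [index terms]} -- text operations already applied."""
    N = len(doc_tokens_by_id)
    doc_freq = Counter()                       # n_i: number of docs containing k_i
    for tokens in doc_tokens_by_id.values():
        doc_freq.update(set(tokens))

    weights = {}
    for doc_id, tokens in doc_tokens_by_id.items():
        freq = Counter(tokens)                 # freq(i,j) for this document
        max_freq = max(freq.values())
        weights[doc_id] = {
            term: (f / max_freq) * math.log(N / doc_freq[term])
            for term, f in freq.items()
        }
    return weights

docs = {
    "d1": ["information", "retrieval", "information"],
    "d2": ["database", "systems"],
    "d3": ["information", "systems"],
}
print(tf_idf_weights(docs)["d1"])
```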

The Vector Model
- The best term-weighting schemes use weights which are given by
  - wij = f(i,j) * log(N/ni)
  - this strategy is called a tf-idf weighting scheme
- For the query term weights, a suggestion is
  - wiq = (0.5 + 0.5 * freq(i,q) / max_l(freq(l,q))) * log(N/ni)
- The vector model with tf-idf weights is a good ranking strategy for general collections
- The vector model is usually as good as the known ranking alternatives; it is also simple and fast to compute
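A small sketch of the query-weighting suggestion above. The 0.5 smoothing constants come from the formula on the slide; the function name, the natural-log base, and the toy document frequencies are assumptions.

```python
import math
from collections import Counter

def query_weights(query_terms, doc_freq, N):
    """w_iq = (0.5 + 0.5 * freq(i,q) / max_l freq(l,q)) * log(N / n_i).
    doc_freq: {term: n_i}; terms absent from the collection are skipped."""
    freq = Counter(query_terms)
    max_freq = max(freq.values())
    return {
        term: (0.5 + 0.5 * f / max_freq) * math.log(N / doc_freq[term])
        for term, f in freq.items()
        if term in doc_freq
    }

doc_freq = {"information": 2, "retrieval": 1, "systems": 2}  # n_i from a toy 3-doc collection
print(query_weights(["information", "retrieval"], doc_freq, N=3))
```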

The Vector Model
- Advantages:
  - term weighting improves the quality of the answer set
  - partial matching allows retrieval of docs that approximate the query conditions
  - the cosine ranking formula sorts documents according to their degree of similarity to the query
- Disadvantages:
  - assumes independence of index terms (??); not clear that this is bad, though

The Vector Model: Examples I, II, and III
[Figures: three worked examples ranking documents d1-d7 against queries over the index terms k1, k2, k3; the term-weight tables are not recoverable from this transcript]

Evaluation
- Precision: of the returned docs, how many are relevant
- Recall: of all relevant docs, how many were returned

Evaluation
- For a query over a collection, let R be the set of relevant docs, A the answer set returned, and Ra the set of relevant docs in the answer set
- Recall = |Ra| / |R|
- Precision = |Ra| / |A|
[Figure: the collection, the relevant docs R, the answer set A, and their intersection Ra]
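A minimal sketch of these two measures, with hypothetical relevance judgments and answer set:

```python
def precision_recall(relevant, answer):
    """relevant: set R of relevant doc ids; answer: set A returned by the system."""
    ra = relevant & answer                              # Ra: relevant docs in the answer set
    precision = len(ra) / len(answer) if answer else 0.0
    recall = len(ra) / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical judgments: 4 relevant docs; the engine returned 5 docs, 3 of them relevant.
R = {1, 2, 3, 4}
A = {2, 3, 4, 8, 9}
print(precision_recall(R, A))  # (0.6, 0.75)
```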