IFLA FRBR & MIC Metadata Evaluation. Ying Zhang, Yuelin Li. October 14, 2003.

Similar presentations
MOVING IMAGE COLLECTIONS: A Window to the World's Moving Images. Grace Agnew – March 2004.
MICS4 Survey Design Workshop Multiple Indicator Cluster Surveys Survey Design Workshop Survey Quality Control.
Survey design. What is a survey? Asking questions – questionnaires Finding out things about people Simple things – lots of people What things? What people?
Robin L. Donaldson May 5, 2010 Prospectus Defense Florida State University College of Communication and Information.
QUANTITATIVE DESIGN AND ANALYSIS MARK 2048 Instructor: Armand Gervais
Literacy Assessment and Monitoring Programme (LAMP) UNESCO Institute for Statistics.
User Mediation & the Reference Interview IS 530 Fall 2009 Dr. D. Bilal.
Azra Rafique Khalid Mahmood. Introduction “To learn each and everything in a limited time frame of degree course is not possible for students”. (Mahmood,
Usability Process for eBP at Intel Eric Townsend, Intel.
Customer: Rosalva Gallardo Team members: Susan Lin Buda Chiou Jim Milewski Marcos Mercado November 23, 2010.
© Tefko Saracevic, Rutgers University. Digital libraries and human information behavior. Tefko Saracevic, Ph.D. School of Communication, Information and.
Adaptive Management Portal April
© Tefko Saracevic, Rutgers University. Search strategy & tactics. Governed by effectiveness & feedback.
Chapter 13 Survey Designs
© Tefko Saracevic, Rutgers University. Interaction in information retrieval. There is MUCH more to searching than knowing computers, networks & commands,
Midterm Review Evaluation & Research Concepts Proposals & Research Design Measurement Sampling Survey methods.
Measuring the quality of academic library electronic services and resources Jillian R Griffiths Research Associate CERLIM – Centre for Research in Library.
© Tefko Saracevic, Rutgers University. EVALUATION in searching: IR systems, digital libraries, reference sources, Web sources.
Usability & Usability Engineering. Usability What is usability Easy to use? User Friendly?
Formative and Summative Evaluations
Design of metadata surrogates in search result interfaces of learning object repositories: Linear versus clustered metadata design Panos Balatsoukas Anne.
Rest of Course Proposals & Research Design Measurement Sampling Survey methods Basic Statistics for Survey Analysis Experiments Other Approaches- Observation,
Usability 2004, J T Burns. Usability & Usability Engineering.
RDA: Resource Description and Access A New Cataloging Standard for a Digital Future Jennifer Bowen OLAC 2006 Conference October 27, 2006
© Tefko Saracevic, Rutgers University. Presentation of search results: a search is not finished with the search. Guidelines for deliverables.
Chapter Three Research Design.
Needs Analysis Instructor: Dr. Mavis Shang
Usability Evaluation Methods Computer-Mediated Communication Lab week 10; Tuesday 7/11/06.
Survey Designs EDUC 640- Dr. William M. Bauer
Quality Improvement. Prepared by Dr. Manal Moussa.
Evaluation of Digital Libraries: Criteria and problems from users' perspectives. Article by Hong (Iris) Xie; discussion by Pam Pagels.
Business and Management Research
What is a Usable Library Website? Results from a Nationwide Study Anthony Chow, Ph.D., Assistant Professor Michelle Bridges, Patricia Commander, Amy Figley,
RESEARCH A systematic quest for undiscovered truth A way of thinking
Literature Review and Parts of Proposal
Research methodology Data Collection tools and Techniques.
CERN library stage report. Group meeting – 5 Oct 2004. Giuseppina Vullo – Administrative student (Aug-Nov 2004)
S556 SYSTEMS ANALYSIS & DESIGN Week 11. Creating a Vision (Solution). Visioning: encourages you to think more systemically about your redesign.
An Online Knowledge Base for Sustainable Military Facilities & Infrastructure Dr. Annie R. Pearce, Branch Head Sustainable Facilities & Infrastructure.
Understanding MYP Criteria
Usability Evaluation June 8, Why do we need to do usability evaluation?
Chapter Four: Managing Marketing Information. Copyright 2007, Prentice Hall, Inc. The Importance of Marketing Information: companies need information.
Copyright 2007, Prentice Hall, Inc. Principles of Marketing, Fall Term, MKTG 220. Dr. Abdullah Sultan.
Research Methodology Lecture No :14 (Sampling Design)
Chapter 13. Reviewing, Evaluating, and Testing. © 2010 by Bedford/St. Martin's. Usability relates to five factors of use: ease of learning, efficiency of.
SCIENTIFIC COMMUNICATION Preparing a Research Proposal. Dr. Asmaa Al-Saleh. Office no. 5T201. Location:
8. Observation Jin-Wan Seo, Professor Dept. of Public Administration, University of Incheon.
Software Engineering: User Interface Design.
Intellectual Works and their Manifestations Representation of Information Objects IR Systems & Information objects Spring January, 2006 Bharat.
Research Methodology For AEP Assoc. Prof. Dr. Nguyen Thi Tuyet Mai HÀ NỘI 12/2015.
REPRESENTING CONTEXT IN AN ARCHIVE OF EDUCATIONAL EVALUATIONS PROJECT ACTIVITIES The project team canvassed opinion across the.
United Nations Oslo City Group on Energy Statistics, OG7, Helsinki, Finland, October 2012. ESCM Chapter 8: Data Quality and Meta Data
REPRESENTING CONTEXT IN AN ARCHIVE OF EDUCATIONAL EVALUATIONS The project has constructed a permanent archive of significant.
Usability Testing TECM 4180 Dr. Lam. What is Usability? A quality attribute that assesses how easy user interfaces are to use Learnability – Ease of use.
Principals of Research Writing. What is Research Writing? Process of communicating your research  Before the fact  Research proposal  After the fact.
Rest of Course Proposals & Research Design Measurement Sampling
Chapter Two Copyright © 2006 McGraw-Hill/Irwin The Marketing Research Process.
Session 6: Data Flow, Data Management, and Data Quality.
Definition, purposes/functions, elements of IR systems Lesson 1.
Chapter 10 (3.8) Marketing Research.  What is Marketing Research? Marketing research is the systematic design, collection, analysis, and reporting of.
Principles of Marketing, Spring Term, MKTG 220. Dr. Abdullah Sultan.
Online Information and Education Conference 2004, Bangkok Dr. Britta Woldering, German National Library Metadata development in The European Library.
School of Engineering and Information and Communication Technology KIT305/607 Mobile Application Development Week 7: Usability (think-alouds) Dr. Rainer.
Day 8 Usability testing.
Evaluation of an Information System in an Information Seeking Process Lena Blomgren, Helena Vallo and Katriina Byström The Swedish School of Library and.
Web Design. 3.2 The Web. Focus on Reading: Main Ideas. A URL is an address that identifies a specific Web page. Web browsers have varying capabilities.
IB Environmental Systems and Societies
Presentation transcript:

Slide 1: IFLA FRBR & MIC Metadata Evaluation
Ying Zhang, Yuelin Li
October 14, 2003

Slide 2: MIC Metadata Evaluation
– Framework: the IFLA FRBR (Functional Requirements for Bibliographic Records)
– MIC evaluation cases
– Experiences and lessons

Slide 3: The IFLA FRBR – MIC Eval Framework
– Find: can a user enter a search and retrieve records relevant to that search?
– Identify: once the user retrieves a record, can he/she interpret the information in it well enough to judge whether the source material is relevant to his/her needs?
– Select: can the user compare the information in multiple records and determine the most relevant record?
– Obtain: can the user obtain the original artifacts, based on the information provided in the record?
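To make the framework concrete for instrument design, the sketch below shows one way the four generic tasks and their guiding questions could be encoded so that survey items can be generated per task. It is a minimal illustration, not MIC project code; the names FRBRTask, EVALUATION_QUESTIONS, and survey_item are our own.

```python
from enum import Enum

class FRBRTask(Enum):
    """The four IFLA FRBR generic user tasks used as the evaluation framework."""
    FIND = "find"
    IDENTIFY = "identify"
    SELECT = "select"
    OBTAIN = "obtain"

# Guiding question for each task, paraphrased from the slide above.
EVALUATION_QUESTIONS = {
    FRBRTask.FIND: "Can a user enter a search and retrieve relevant records?",
    FRBRTask.IDENTIFY: "Can the user interpret a retrieved record to judge its relevance?",
    FRBRTask.SELECT: "Can the user compare multiple records and pick the most relevant?",
    FRBRTask.OBTAIN: "Can the user obtain the original artifact from the record's information?",
}

def survey_item(element: str, task: FRBRTask) -> str:
    """Render one survey question asking how useful a metadata element is for a task."""
    return (f"How useful is the '{element}' element for the task: "
            f"{EVALUATION_QUESTIONS[task]}")

if __name__ == "__main__":
    print(survey_item("Organization Name", FRBRTask.FIND))
```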

Slide 4: Directory Schema Evaluation – Questions/Methods
Usefulness assessment: How useful is each directory element in terms of helping target users to find, identify, select, and obtain source information?
Criterion: perceived usefulness
Methodology: online survey
– Embed the FRBR framework
– Provide situational information
– Sampling frame (science educators, archivists)
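As an illustration of how perceived-usefulness percentages (like those reported on slide 6) could be tabulated from survey responses, here is a minimal sketch. It is our own construction under an assumed response format of (task, element, is_useful) tuples; the deck does not publish the actual instrument or analysis code.

```python
from collections import defaultdict

def usefulness_percentages(responses):
    """Tabulate the share of respondents who rated each (task, element) pair useful.

    `responses` is an iterable of (task, element, is_useful) tuples, one per
    survey answer. Returns {(task, element): percent rated useful}.
    """
    useful = defaultdict(int)
    total = defaultdict(int)
    for task, element, is_useful in responses:
        total[(task, element)] += 1
        useful[(task, element)] += bool(is_useful)
    return {key: 100.0 * useful[key] / total[key] for key in total}

# Toy usage: three respondents rating "Org Name" for the FIND task.
demo = [("find", "Org Name", True), ("find", "Org Name", True), ("find", "Org Name", False)]
print(usefulness_percentages(demo))  # {('find', 'Org Name'): 66.66...}
```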

Slide 5: Directory Schema Evaluation – Sample Section Head
3. SELECT – Confirm that the record describes the organization most appropriate to the user's needs, based on conformance to important criteria for comparing one organization to others retrieved in a search.
USE: These fields display in the short listing when multiple records result from a search, enabling a user to quickly select the most useful records among those retrieved.
Example (prototype screen for illustration)

Slide 6: Directory Schema Evaluation – Results/Applications
Top three elements per FRBR task, by perceived usefulness:
– FIND (Organization): #1 Org Name (93.9%); #2 State/region (93.9%); #3 Org type (90.9%)
– FIND (Collection): #1 Predominant subjects in collection (90.9%); #2 Classes of materials in collection (87.9%); #3 General physical format (84.8%)
– IDENTIFY: #1 Org Name (84.0%); #2 Country (81.8%); #3 Services provided (81.8%)
– SELECT: #1 Org's URL (94.0%); #2 Org Name (93.9%); #3 Org's address (87.9%)
– OBTAIN: #1 Primary contact for obtaining (97.0%); #2 Primary address (97.0%); #3 URL for obtaining (96.9%)
Applications:
– Determine useful directory elements for the user community
– Identify potential elements missing from the current schema
– Improve the search and result display interfaces
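Continuing the illustrative tabulation from slide 4, a few lines turn the per-element percentages into a top-three listing per task like the one above. Again, this is a sketch under our assumed data format, not the project's code; the numbers in the demo are made up.

```python
def top_elements(percentages, task, n=3):
    """Return the n highest-rated elements for one FRBR task.

    `percentages` maps (task, element) -> percent rated useful, e.g. the
    output of usefulness_percentages() from the earlier sketch.
    """
    rated = [(element, pct) for (t, element), pct in percentages.items() if t == task]
    return sorted(rated, key=lambda pair: pair[1], reverse=True)[:n]

# Toy usage with made-up numbers (not the survey's actual data):
demo = {("find", "Org Name"): 93.9, ("find", "Org type"): 90.9, ("find", "State/region"): 93.9}
print(top_elements(demo, "find"))  # [('Org Name', 93.9), ('State/region', 93.9), ('Org type', 90.9)]
```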

Slide 7: Metadata Schema Evaluation – Proposal
Usability test: How usable is the MIC metadata schema in terms of helping target users to find, identify, select, and obtain source information?
Measures:
– Information adequacy
– Information accuracy
– Ease of understanding
– Helpfulness
– Physical item accessibility
– Precision
– Error rate
– Satisfaction
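The deck does not define these measures formally. Under one plausible reading, precision can be computed over a user's retrieved set, and error rate as disagreement between users' relevance judgments and the evaluators' judgments (cf. "users' relevance judgment vs. evaluators' false judgment detection" on the next slide). The sketch below illustrates those two readings; the function names and the judgment format are our assumptions.

```python
def precision(retrieved_ids, relevant_ids):
    """Fraction of retrieved records that are actually relevant."""
    retrieved = set(retrieved_ids)
    if not retrieved:
        return 0.0
    return len(retrieved & set(relevant_ids)) / len(retrieved)

def error_rate(user_judgments, gold_judgments):
    """Fraction of the user's relevance judgments that disagree with the
    evaluators' judgments (one reading of 'false judgment detection')."""
    keys = user_judgments.keys() & gold_judgments.keys()
    if not keys:
        return 0.0
    errors = sum(user_judgments[k] != gold_judgments[k] for k in keys)
    return errors / len(keys)

# Toy usage:
print(precision(["r1", "r2", "r3"], ["r1", "r3", "r9"]))                # 0.666...
print(error_rate({"r1": True, "r2": True}, {"r1": True, "r2": False}))  # 0.5
```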

Slide 8: Metadata Schema Evaluation – Embedment of FRBR
[Diagram: the IFLA FRBR "generic tasks" (Find, Identify, Select, Obtain) are linked to the usability measures (information adequacy, information accuracy, ease of understanding, helpfulness, error rate, precision, physical item accessibility) through the treatments (query modification, users' relevance judgment vs. evaluators' false judgment detection, physical accessibility check).]

Slide 9: Metadata Schema Evaluation – Methods/Treatments
– Stratified and purposive sampling
– Training and practicing
– Demographic questionnaire
– Simulated topical scenario
– Query modification using metadata records as the source of relevance feedback
– Post-test questionnaire
– Lab observation (audio/video taping, observation notes)
– Think-aloud protocol
– Exit interview
– …
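The slides do not say how query modification was implemented. A standard technique for using judged records as relevance feedback is Rocchio-style query expansion, sketched below purely for illustration; the weighting scheme and parameter values are our assumptions, not the project's method.

```python
from collections import Counter

def rocchio_expand(query_terms, relevant_records, alpha=1.0, beta=0.75, top_k=5):
    """Rocchio-style query modification: boost terms from records the user
    judged relevant, then keep the strongest terms as the new query.

    `relevant_records` is a list of metadata records, each a list of terms.
    """
    weights = Counter({t: alpha for t in query_terms})
    for record in relevant_records:
        for term, tf in Counter(record).items():
            weights[term] += beta * tf / len(relevant_records)
    return [term for term, _ in weights.most_common(top_k)]

# Toy usage: feedback from one relevant record's subject terms.
print(rocchio_expand(["newsreel"], [["newsreel", "archive", "1940s", "archive"]]))
# ['newsreel', 'archive', '1940s']
```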

Slide 10: MIC Evaluation – Experiences/Lessons
[Diagram: evaluation questions are developed into criteria, measures, and instruments through brainstorming, literature review, and communication; MIC analysis and the FRBR's four generic tasks feed the MIC evaluation approach through adaptation and embedment.]
– Embed the IFLA FRBR
– Adapt measures & treatments
– Provide situational information

Slide 11: Acknowledgements
– Thanks to Ms. Grace Agnew for her innovative idea of applying the IFLA FRBR as the framework for the evaluation project
– Thanks to Dr. Tefko Saracevic for his excellent leadership of our evaluation team
– Thanks to Ms. Judy Jeng for her fine work as a team member

Slide 12: MIC Evaluation Team
– Tefko Saracevic, Ph.D., Evaluation Investigator
– Ying Zhang, Doctoral Student, Evaluation Coordinator
– Yuelin Li, Doctoral Student
– Judy Jeng, Ph.D. Candidate
School of Communication, Information and Library Studies
Rutgers, the State University of New Jersey
4 Huntington Street, New Brunswick, NJ, U.S.A.
Tel.: (732) / Extension 8222
Fax: (732)
URL: