
Slide 1: IFLA FRBR & MIC Metadata Evaluation
http://www.scils.rutgers.edu/~miceval
Ying Zhang, Yuelin Li
October 14, 2003

Slide 2: Overview
- MIC metadata evaluation framework: the IFLA FRBR (Functional Requirements for Bibliographic Records)
- MIC evaluation cases
- Experiences and lessons

Slide 3: The IFLA FRBR – MICEval Framework
- Find: can a user enter a search and retrieve records relevant to that search?
- Identify: once the user retrieves a record, can he or she interpret the information in the record to judge whether the source information is relevant to his or her needs?
- Select: can the user compare the information in multiple records and determine the most relevant record?
- Obtain: can the user successfully obtain the original artifacts, based on the information provided in the record?
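The four tasks above can be pictured as operations over directory records. The following is a minimal, hypothetical Python sketch: the record fields, sample data, and matching logic are invented for illustration and are not the MIC schema itself.

```python
# Hypothetical sketch of the four IFLA FRBR user tasks over directory
# records. Field names and data are invented for illustration only.

records = [
    {"org_name": "State Film Archive", "subjects": ["ecology", "wildlife"],
     "url": "http://example.org/sfa", "contact": "archivist@example.org"},
    {"org_name": "University Media Lab", "subjects": ["physics", "ecology"],
     "url": "http://example.org/uml", "contact": "lab@example.org"},
]

def find(query):
    """FIND: retrieve records relevant to the user's search term."""
    return [r for r in records if query in r["subjects"]]

def identify(record):
    """IDENTIFY: summarize a record so the user can judge its relevance."""
    return f'{record["org_name"]}: {", ".join(record["subjects"])}'

def select(candidates, query_terms):
    """SELECT: compare candidates and pick the best match (here, the
    record sharing the most subjects with the user's terms)."""
    return max(candidates,
               key=lambda r: len(set(r["subjects"]) & set(query_terms)))

def obtain(record):
    """OBTAIN: return the access points needed to reach the artifact."""
    return record["url"], record["contact"]

hits = find("ecology")                       # both records match
best = select(hits, ["ecology", "wildlife"]) # the archive matches twice
```

The point of the sketch is that each metadata element earns its place by supporting at least one of the four tasks, which is exactly the question the evaluation puts to users.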

Slide 4: Directory Schema Evaluation – Questions and Methods
Usefulness assessment: how useful is each directory element in helping target users find, identify, select, and obtain source information?
- Criterion: perceived usefulness
- Methodology: online survey
  - Embed the FRBR framework
  - Provide situational information
  - Sampling frame: science educators, archivists

Slide 5: Directory Schema Evaluation – Sample Section Head
3. SELECT – Confirm that the record describes the organization most appropriate to the user's needs, based on conformance to important criteria for comparing one organization to others retrieved in a search.
USE: These fields display in the short listing when multiple records result from a search, enabling a user to quickly select the most useful records among those retrieved.
Example: prototype screen for illustration.

Slide 6: Directory Schema Evaluation – Results and Applications

Top three elements per FRBR task (perceived-usefulness percentages):

Rank | FIND (organization) | FIND (collection) | IDENTIFY | SELECT | OBTAIN
#1 | Org name (93.9%) | Predominant subjects in collection (90.9%) | Org name (84.0%) | Org's URL (94.0%) | Primary contact for obtaining (97.0%)
#2 | State/region (93.9%) | Classes of materials in collection (87.9%) | Country (81.8%) | Org name (93.9%) | Primary email address (97.0%)
#3 | Org type (90.9%) | General physical format (84.8%) | Services provided (81.8%) | Org's address (87.9%) | URL for obtaining (96.9%)

Applications:
- Determine useful directory elements for the user community
- Identify potential elements missing from the current schema
- Improve the search and result-display interfaces

Slide 7: Metadata Schema Evaluation – Proposal
Usability test: how usable is the MIC metadata schema in helping target users find, identify, select, and obtain source information?
Measures:
- Information adequacy
- Information accuracy
- Ease of understanding
- Helpfulness
- Physical item accessibility
- Precision
- Error rate
- Satisfaction
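Two of the listed measures, precision and error rate, are directly computable from users' relevance judgments. The slide gives no formulas, so the sketch below assumes the common information-retrieval conventions (precision = relevant retrieved / total retrieved; error rate = incorrect judgments / total judgments), which may differ from MIC's exact operationalization.

```python
# Illustrative computation of two usability measures from the slide.
# Formulas are assumed IR conventions, not MIC's documented definitions.

def precision(retrieved, relevant):
    """Fraction of retrieved records that are actually relevant."""
    return len(set(retrieved) & set(relevant)) / len(retrieved)

def error_rate(user_judgments, true_judgments):
    """Fraction of records the user judged differently from ground truth."""
    wrong = sum(1 for doc, judged in user_judgments.items()
                if judged != true_judgments[doc])
    return wrong / len(user_judgments)

retrieved = ["r1", "r2", "r3", "r4"]
relevant  = ["r1", "r3", "r7"]
print(precision(retrieved, relevant))   # 0.5 (two of four retrieved are relevant)

user  = {"r1": True, "r2": True, "r3": False}
truth = {"r1": True, "r2": False, "r3": False}
print(error_rate(user, truth))          # one of three judgments is wrong
```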

Slide 8: Metadata Schema Evaluation
Treatments:
- Embedment of the FRBR
- Query modification
- Users' relevance judgments vs. evaluators' false-judgment detection
- Physical accessibility check
[Diagram: the usability measures (information adequacy, information accuracy, ease of understanding, helpfulness, precision, error rate, physical item accessibility) mapped onto the IFLA FRBR "generic tasks" (find, identify, select, obtain)]

Slide 9: Metadata Schema Evaluation – Methods and Treatments
- Stratified and purposive sampling
- Training and practice
- Demographic questionnaire
- Simulated topical scenario
- Query modification using metadata records as the source of relevance feedback
- Post-test questionnaire
- Lab observation (audio/video taping, observation notes)
- Think-aloud protocol
- Exit interview
- …
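The first item above, stratified sampling, can be sketched in a few lines: draw a fixed number of participants from each user stratum so every group is represented. The strata and participant identifiers below are hypothetical; the actual MIC sampling frame (e.g., science educators and archivists, per slide 4) and quotas may differ.

```python
# Minimal sketch of stratified sampling as named on the slide: a fixed
# quota is drawn at random from each stratum. Strata are hypothetical.
import random

pool = {
    "science_educators": ["se1", "se2", "se3", "se4", "se5"],
    "archivists":        ["ar1", "ar2", "ar3", "ar4"],
}

def stratified_sample(pool, per_stratum, seed=0):
    """Randomly pick `per_stratum` participants from every stratum."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    return {stratum: rng.sample(members, per_stratum)
            for stratum, members in pool.items()}

sample = stratified_sample(pool, per_stratum=2)
# two participants from each stratum, four in total
```

Purposive sampling, by contrast, would replace the random draw with a deliberate choice of participants who fit the study's criteria.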

Slide 10: MIC Evaluation – Experiences and Lessons
[Diagram: evaluation questions, criteria/measures, and instruments developed through brainstorming, literature review, and communication; MIC analysis and the FRBR's four generic tasks feed the MIC evaluation approach through adaptation and embedment]
- Embed the IFLA FRBR
- Adapt measures and treatments
- Provide situational information

Slide 11: Acknowledgements
- Ms. Grace Agnew, for her innovative idea of applying the IFLA FRBR as the framework for the evaluation project
- Dr. Tefko Saracevic, for his excellent leadership of our evaluation team
- Ms. Judy Jeng, for her fine work as a team member

Slide 12: MIC Evaluation Team
Tefko Saracevic, Ph.D., Evaluation Investigator
Ying Zhang, Doctoral Student, Evaluation Coordinator
Yuelin Li, Doctoral Student
Judy Jeng, Ph.D. Candidate

School of Communication, Information and Library Studies
Rutgers, the State University of New Jersey
4 Huntington Street, New Brunswick, NJ 08901, U.S.A.
Tel.: (732) 932-7500, ext. 8222
Fax: (732) 932-2644
Email: miceval@scils.rutgers.edu
URL: http://www.scils.rutgers.edu/~miceval

