1 August 1, 2011. ALCTS Continuing Education Committee Webinar. Aiming for a Robust Metadata Infrastructure for the Future: RDA Component. Beacher Wiggins, Director, Acquisitions & Bibliographic Access, Library of Congress

2 Why a Test?
- The report of the LC Working Group on the Future of Bibliographic Control had recommended suspending work on RDA
- Were the JSC's goals for RDA being met?
  - General feasibility
  - Technical feasibility
  - Financial feasibility

3 RDA Goals and Test Findings
- Provide a consistent, flexible, and extensible framework for all types of resources and all types of content: goal was met
- Be compatible with internationally established principles and standards: goal was partially met; increased harmonization is needed among the JSC, ISBD, and ISSN communities
- Be usable primarily within the library community, but able to be used by other communities: the test did not cover this goal

4 RDA Goals and Test Findings
- Enable users to find, identify, select, and obtain resources appropriate to their information needs: goal was partially met
- Be compatible with descriptions and access points in existing catalogs and databases: goal was mostly met
- Be independent of the format, medium, or system used to store or communicate the data: goal was met

5 RDA Goals and Test Findings
- Be readily adaptable to newly emerging database structures: the test did not verify this goal
- Be optimized for use as an online tool: goal was not met
- Be written in plain English, and able to be used in other language communities: goal was not met
- Be easy and efficient to use, both as a working tool and for training purposes: goal was not met

6 Test Partners
- 26 formal test partners, including LC, NAL, and NLM
- Partners included a cross-section:
  - Diverse types and sizes of institutions
  - Libraries, consortia, educators, and vendors
  - Institutions describing different formats and content
  - Program for Cooperative Cataloging libraries

7 Methodology: Materials Tested
- Common original set (COS)
- Common copy set (CCS)
- Extra original set (EOS)
- Extra copy set (ECS)

8 Methodology: Common Original Set
- 25 titles cataloged twice by different catalogers:
  - Once using RDA
  - Once using the current content code (AACR2)
  - No subject analysis or classification
- Range of analog and digital content:
  - Textual monographs (10)
  - AV materials (5)
  - Serials (5; print and other)
  - Integrating resources (5)

9 Methodology: Common Copy Set
- 5 resources copy cataloged
- All printed text, in English:
  - Monograph
  - Serial
  - Translation
  - Compilation
  - Novel

10 Methodology: Extra Sets
- Test partners cataloged regular receipts using RDA (at least 25 original items), including:
  - Foreign-language materials
  - Cartographic materials
  - Music materials
  - Law materials
- Authority data was created, where normally done, for both common and extra set titles

11 Summary of numbers of RDA records collected

Set                  | Bibliographic | Authority
Common original set  | 1,509         | 1,226
Common copy set      | 123           | N/A
Extra set            | 7,786         | 10,184
Informal testers     | 1,148         | 1,390
Totals               | 10,566        | 12,800
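As a quick arithmetic check on the table above, a short Python sketch (the variable names are illustrative, not from the report) confirms that the reported totals match the per-set counts:

```python
# Per-set RDA record counts as reported in the table above.
bibliographic = {
    "Common original set": 1509,
    "Common copy set": 123,
    "Extra set": 7786,
    "Informal testers": 1148,
}
authority = {
    "Common original set": 1226,
    "Extra set": 10184,
    "Informal testers": 1390,
}

# The reported totals (10,566 bibliographic and 12,800 authority records)
# should equal the sums of the individual set counts.
assert sum(bibliographic.values()) == 10566
assert sum(authority.values()) == 12800
print(sum(bibliographic.values()), sum(authority.values()))
```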

12 Methodology: Survey Instruments
- 4 surveys with questions related to the test sets:
  - Common original set (COS)
  - Common copy set (CCS)
  - Extra original set (EOS)
  - Extra copy set (ECS)
- 4 additional surveys:
  - Record creator profile (RCP)
  - Record user (RU)
  - Institutional questionnaire (IQ)
  - Informal testers (IT)

13 Number of Surveys Received [chart: response counts per survey instrument were 29, 219, 163, 1200, 5908, 111, 801, and 80]

14 The Challenge: Interpreting the mountain of data (Photo: Jorg Mollowitz)

15 Record Review
- Evaluate records in depth
- Compare AACR2 and RDA records
- Possible only with the Common Original Set:
  - Surrogates were available
  - Titles were cataloged using both rule sets

16 Record Review Findings
Regina Reynolds and Barbara Bushman, members of the U.S. RDA Test Coordinating Committee, will present the test findings in more detail in a follow-on ALCTS Continuing Education Committee webinar on August 31.

17 Record Review Findings
- RDA and AACR2 records were equivalent in their consistency and error rate
- RDA errors tended to cluster around providing the required access points for the works and expressions manifested
- RDA record creators expressed concern about whether they had found all the applicable rules and interpreted them correctly

18 Findings: Record Creation Times
- All times were self-reported and ranged from a low of 1 minute to a high of 720 minutes
- The overall average time to create an original RDA bibliographic record for the Extra Original Set was 31 minutes

19 The Learning Curve: record creation times dropped by about 50% after the first 20 records
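The 50% figure amounts to a before/after comparison of average self-reported creation times around a cataloger's 20th record. A minimal Python sketch of that kind of calculation, using invented times rather than actual test data:

```python
# Hypothetical self-reported creation times (minutes), in the order the
# records were cataloged by one cataloger; not actual test data.
times = [72, 68, 65, 62, 60, 58, 55, 53, 51, 49,
         47, 45, 43, 41, 39, 37, 36, 34, 33, 32,
         26, 25, 25, 24, 24, 25, 23, 24, 24, 25]

first_20 = times[:20]
later = times[20:]

avg_first = sum(first_20) / len(first_20)
avg_later = sum(later) / len(later)

# "Dropped about 50%" means the later average is roughly half the early one.
drop = 1 - avg_later / avg_first
print(f"first 20 records: {avg_first:.1f} min, later records: {avg_later:.1f} min, "
      f"drop: {drop:.0%}")
```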

21 User Survey: Positive and Negative Features

Positive features                               | Negative features
Content/carrier/media elements in place of GMD  | Content/carrier/media elements difficult to understand; no GMD
Fuller records                                  | Too much information
Spelling out of previously abbreviated words    | Spelling out of universally known abbreviations
Rule of three dropped                           | Confusing when publication and copyright dates are the same
Elimination of Latin terms                      | Elimination of "sic" in a title indicating there is a problem on the piece
More access points                              | FRBR terminology on the record
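To make the first row concrete: AACR2 placed a general material designation (GMD) such as "[electronic resource]" in the title statement (MARC 245 $h), while RDA drops the GMD in favor of separate content, media, and carrier type elements (MARC 336, 337, and 338). A minimal Python sketch of the difference, using plain dictionaries and an invented title rather than any actual test record or MARC library:

```python
# AACR2-style description: a single GMD qualifier in the title statement.
aacr2_record = {
    "245": "Climate data explorer [electronic resource]",  # GMD carried in subfield $h
}

# RDA-style description: the GMD is dropped and replaced by three separate
# elements drawn from controlled RDA vocabularies.
rda_record = {
    "245": "Climate data explorer",
    "336": "text",             # content type
    "337": "computer",         # media type
    "338": "online resource",  # carrier type
}

# The survey's split verdict: more precise elements, but harder for some users to read.
for tag in ("336", "337", "338"):
    print(tag, rda_record[tag])
```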

22 Findings: Costs and Benefits
Costs:
- Subscription to the RDA Toolkit
- Development of training materials
- Creation/revision of documentation
- Loss of production time during initial training and implementation
- Impacts on cataloging contracts

23 Findings: Costs and Benefits
Benefits:
- Change in how characteristics of things are identified
- Focus on user tasks
- New abilities to use and re-use bibliographic metadata
- Encouragement of new encoding schemas and better systems for resource discovery

24 Key Survey Findings

25 Key Survey Findings

26 Key Survey Findings

27 Decision
Contingent on the satisfactory progress/completion of the identified tasks and action items, the Coordinating Committee recommends that RDA should be implemented by LC, NAL, and NLM no sooner than January 2013. The three national libraries should commit resources to ensure progress is made on these activities, which will require significant effort from many in and beyond the library community.

28 Recommendations
Recommendations to various communities:
- U.S. library community (including PCC)
- Joint Steering Committee
- Vendors
- RDA co-publishers

29 Recommendations to specific groups (& suggested completion timeframes)
To the Joint Steering Committee:
- Rewrite (i.e., reword) RDA in clear, plain English (within 18 months)
- Define a process for updating RDA in the online environment (within 3 months; the JSC had already begun work on this issue)

30 Recommendations to specific groups (& suggested completion timeframes)
To ALA Publishing:
- Enhance and improve RDA Toolkit functionality and navigation (within 3 months; ALA had already begun work on this issue)
- Provide complete RDA record examples in MARC and other encoding schemas (within 6 months)

31 Recommendations to specific groups (& suggested completion timeframes)
To the Library of Congress:
- Begin the transition to a MARC replacement (within 18 to 24 months)
- Involve the community in the process (within 12 months)
- Lead and coordinate training (within 18 months)

32 Next Steps
U.S. RDA Test Coordinating Committee:
- Determine the plan for overseeing and monitoring the recommended changes and action items
- Issue a communications plan to alert the community to the status of the recommendations and action items

33 Next Steps
LC:
- Keep "RDA Transition: Frequently Asked Questions" updated
- Create a new web page to share information: "Information and Resources in Preparation for RDA: Resource Description and Access"
- Work and coordinate with the PCC (Program for Cooperative Cataloging) on training and documentation

34 Next Steps
LC timeline in preparation for RDA: Resource Description and Access at the Library of Congress:
- October 2011: RDA catalogers/technicians (former LC testers) prepare to return to RDA cataloging through classroom sessions and practice-record discussions
- November 2011: RDA catalogers/technicians return to creating RDA authority and bibliographic records
- No sooner than July 2012: LC begins to train the remaining catalogers to apply RDA

35 Thank you! Questions? Beacher Wiggins, Director for Acquisitions & Bibliographic Access (bwig@loc.gov)

