Evaluation of Digital Libraries. Anna Maria Tammaro, University of Parma. University of Tbilisi, 5-15 July 2010.



Outline
 What is a digital library?
 Why evaluate?
 Evaluation cycle: what? how?
 Good practices

What is a digital library?
 What is encompassed? Visions of the library
 Which elements to take?
 What is critical?

Why evaluate?
 Evaluation is fact finding and evidence-based value measuring, integrated into the management process of digital libraries
 Accountability: evidence of resources spent
 Effectiveness: understanding basic phenomena (such as information seeking)
 Impact: e.g. increased learning, research, dissemination

Four major questions for evaluation
 What actually occurred?
 How can it be improved?
 Did it accomplish its objectives?
 What impact did it have?

Description
 What actually occurred?
 Documentation evaluation, MIS (management information systems)

Improvement
 How can it be improved?
 Formative evaluation

Fit for purpose?
 Did it accomplish its objectives?
 Effectiveness evaluation

Impact of the digital library
 What impact did it have?
 The ultimate question for evaluation is: "How are digital libraries transforming research, education, learning and living?" (Saracevic 2002, p. 368)

What to evaluate?
 Content
 Services/system
 Users and uses

Content evaluation
 Content quality (subject coverage, relevance)
 Content scope (what is included? online journals, e-books)
 Content organisation (metadata, bibliographic organisation, indexing)
 Effectiveness (management, user support)
 Efficiency (cost)

System interface
 Interface (usability, design, accessibility)
 System performance (interactivity, algorithms for searching, processing time)
 System configuration (networks, security, authentication)

Outcomes
 The ways in which library users are changed as a result of their contact with the library's resources and programs (ARL 1998)

Outcomes-based evaluation
 Have audiences been sufficiently identified?
 Are outcomes clearly written?
 Are outcomes sufficient to describe what you hope will happen?
 Are data collection methods cost-efficient?
 Do they provide the data you want and need?

Users
 Who are they? (researchers, students, remote users, etc.; what is their context?)
 How do they access the digital library? (information seeking behaviour, usability)
 Why do they need the digital library? (activities, expectations)
 What types of resources do they need? (subject, etc.)
 What is the value of the digital library? (impact, outcomes, potential for community building)

European Minerva Project
 Minerva: Handbook on cultural web user interaction
 ns/handbookwebusers.htm

How to evaluate?
 Surveys
 Focus groups
 Interviews
 Transaction logs
 Observation
 Ethnographic evaluation
 Usability testing
 Combined methods
 Longitudinal studies
 Cross-cultural assessment
 Benchmarking
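As a minimal illustration of the transaction-log method listed above, the sketch below computes simple usage indicators (unique users, action counts, most viewed item) from a log. The log format and field names here are assumptions for illustration, not part of the presentation.

```python
from collections import Counter

# Hypothetical transaction-log lines: "timestamp user_id action target"
# (this format is an assumption made up for the example)
LOG = """\
2010-07-05T09:01 u1 search metadata
2010-07-05T09:02 u1 view doc42
2010-07-05T09:05 u2 search usability
2010-07-05T09:06 u2 view doc42
2010-07-05T09:09 u1 search evaluation
"""

def usage_stats(log_text):
    """Return unique-user count, per-action counts, and the most viewed item."""
    actions = Counter()
    views = Counter()
    users = set()
    for line in log_text.strip().splitlines():
        _ts, user, action, target = line.split()
        users.add(user)
        actions[action] += 1
        if action == "view":
            views[target] += 1
    return {"unique_users": len(users),
            "actions": dict(actions),
            "top_viewed": views.most_common(1)}

stats = usage_stats(LOG)
print(stats)
```

In a real study these raw counts would only be a starting point; log analysis is usually combined with the other methods above (surveys, interviews) to interpret what the usage patterns mean.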

Standards: COUNTER, SUSHI (NISO Standardized Usage Statistics Harvesting Initiative)
 No benchmarking or longitudinal studies (for the rate of change)
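COUNTER reports tabulate usage (for example, monthly full-text requests per journal). A minimal sketch of totalling such a report might look like the following; the CSV layout is a made-up simplification, not the actual COUNTER report format.

```python
import csv
import io

# Made-up CSV loosely inspired by a COUNTER journal report;
# the column layout is an illustrative assumption.
REPORT = """\
Journal,Jan-2010,Feb-2010,Mar-2010
Journal of Digital Libraries,120,95,140
Library Quarterly,60,72,58
"""

def total_requests(report_csv):
    """Sum monthly request counts per journal across all months."""
    reader = csv.DictReader(io.StringIO(report_csv))
    totals = {}
    for row in reader:
        journal = row.pop("Journal")
        totals[journal] = sum(int(v) for v in row.values())
    return totals

print(total_requests(REPORT))
```

Standardized reports like this are what make cross-library comparison possible; SUSHI then automates retrieving them from vendors instead of downloading each report by hand.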

Good practice
 DigiQUAL – www.digiqual.org/
 PEAK – www.dlib.org/dlib/june99/06bonn.html
 E-valued – www.evalued.uce.ac.uk

Bad news
 There is no single, easy-to-administer, inexpensive, reliable, and valid approach to evaluating interactive learning from digital libraries.

Good news
 There are practical strategies for documenting the development and use of interactive learning, improving it, and building a case for its effectiveness and impact.

Questions?
 Thanks for your attention!