
1 Standardized Survey Tools for Assessment in Archives and Special Collections
Elizabeth Yakel, School of Information, University of Michigan
Library Assessment Conference, August 4, 2008

2 Archival Metrics
Drs. Wendy Duff and Joan Cherry, University of Toronto
Dr. Helen Tibbo, University of North Carolina
Aprille McKay, J.D., University of Michigan
Andrew W. Mellon Foundation grant (June 2005 – March 2008): Toward the Development of Archival Metrics in College and University Archives and Special Collections

3 Archival Metrics Toolkits
On-site Researcher
Archival Website
Online Finding Aids
Student Researcher
Teaching Support

4 Archives and Special Collections
Weak culture of assessment
Need the ability to perform user-based evaluation on concepts that are specific to archives and special collections
No consistency in user-based evaluation tools
No established reliability for existing survey tools
No means of comparing data across repositories

5 Developing the Archival Metrics Toolkits
Analysis of other survey instruments
Interviews with archivists, faculty, and students to identify core concepts for evaluation
Concept map
Creation of the questionnaires
Testing the questionnaires, survey administration procedures, and the instructions for use and analysis

6 Analysis of Other Instruments
LibQUAL+
E-Metrics (Charles McClure and David Lankes, Florida State University)
Definitions
Question phraseology

7 Interviews
Develop concepts for evaluation
Identify areas of greatest concern
Understand the feasibility of deploying surveys using different methods

8 Interviews: Availability (1)
Archivist: This is a very big place and people are busy. If somebody doesn’t stop you to ask you a question and you don’t even see that they’re there because they don’t… we’re actually thinking about finding some mechanism like a bell rings when somebody walks in the door, because you get so focused on your computer screen that you don’t even know somebody’s there. We understand why they’re a little intimidated. (MAM02)

9 Interviews: Availability (2)
Student: I’ve said this a million times, but the access to someone who is open to helping. So not just someone who is supposed to be doing it, but who actually wants to. (MSM02)
Professor: And they'll even do it after hours, almost every one of them. In fact they'd rather do it after hours because they don't have a crush of business in there. (MPM04)

10 Defining Terms: Accessibility of Service
"Accessibility of Service is a measure of how easily potential users are able to avail themselves of the service and includes (but is certainly not limited to) such factors as: availability (both time and day of the week); site design (simplicity of interface); ADA compliance; ease of use; placement in website hierarchy if using web submission form or link from the website; use of metatags for digital reference websites (indexed in major search tools, etc.); or multilingual capabilities in both interface and staff, if warranted based on target population." (E-Metrics)

11 Conceptual Framework
Quality of the Interaction
Access to systems and services
Physical Information Space
Learning Outcomes

12 Quality of the Interaction
Perceived expertise of the archivist
Availability of staff to assist researchers
Instructors and students in the interviews returned to these dimensions again and again as important in a successful visit.

13 Learning Outcomes
Student Researcher
Cognitive and affective outcomes
Confidence

14 Testing of the Tools
Iterative design and initial user feedback
Small-scale deployment
Larger-scale testing

15 Pilot Testing
Two phases
Single archives / special collections
Walked people through the questionnaire
Post-test interview:
Clarity of questions
Length of instrument
Comprehensiveness of questions
Willingness to complete a questionnaire of this type
Single implementations of the questionnaire and administration procedures until the instrument was stable

16 Pilot Testing Outcomes
Lots of feedback on question wording
Decision to include a definition of finding aids
Decision not to pursue a question bank approach
Move away from expectation questions
Decisions about how to administer the surveys:
On-site researcher and student researcher surveys became paper-based
Online finding aids and website surveys remained online

17 Larger Scale Testing
Toolkit: number of tests, total respondents
Researcher: 6 tests, 230 respondents
Student Researcher: 4 tests, 214 respondents
Online Finding Aids: 4 tests, 198 respondents
Website: 2 tests, 86 respondents
Teaching Support: 2 tests, 12 respondents

18 Testing Administration Procedures
Researcher: when to administer?
During the visit is problematic due to archival procedures
Student: at the end of term
Needs instructor cooperation
Administering early, with an online survey link, had a poor response rate

19 Administration of the Web-based Tools
Email to recent on-site researchers
Rolling invitations immediately after a response to an email reference question
Retrospective invitations to email reference requestors
Across all the surveys tested, we received an average response rate of 65%.

20 Accumulating Reference Requests
Archives, survey, number of reference requests, days for accumulation:
B: Finding Aids survey, 52 requests accumulated over 81 days
C: Finding Aids survey, 36 requests accumulated over 65 days
E: Website survey, 50 requests accumulated over 64 days

21 Administration Recommendations
Email reference and on-site researchers represent two different populations
Few reference requestors had visited the repository in person

22 Differences in Samples
Last accessed finding aids, B (In-house Researchers) % (n=22) vs. B (Email Reference) % (n=24):
Less than a day: 9.1% vs. 12.5%
Less than a week: 4.5% vs. 16.7%
Less than a month: 31.8% vs. 33.3%
Less than six months: 27.3% vs. 0.0%
Less than one year: 4.5% vs. 0.0%
More than one year: 0.0% vs. 0.0%
I have never accessed your online finding aids: 22.7% vs. 37.5%

23 Reliability Testing
Alpha coefficients (Cronbach's alpha) were computed per construct, along with the number of items and sample size for each questionnaire (see the worked sketch below):
Researcher: Interaction Quality
Researcher: Usability (Web Catalog)
Researcher: Usability (Paper Finding Aids)
Researcher: Usability (Digital Finding Aids)
Researcher: Information Space
Website: Usability
Website: Information Quality
Online Finding Aids: Usability
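The alpha coefficient here is internal-consistency reliability, computed with the standard coefficient-alpha formula. The sketch below is only an illustration of that calculation; the cronbach_alpha helper and the five-respondent, four-item "interaction quality" ratings matrix are invented, not data or code from the study.

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the construct
    item_vars = items.var(axis=0, ddof=1)       # variance of each item across respondents
    total_var = items.sum(axis=1).var(ddof=1)   # variance of each respondent's total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 respondents rating 4 "interaction quality" items on a 1-5 scale.
ratings = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
]
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```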

24 Reliability of Other Questions
Free-text and multiple-choice questions generate consistent responses
Student survey focuses on learning outcomes, confidence, and the development of transferable skills
Contradictory results in Questions 12 (‘Is archival research valuable for your goals?’) and 13 (‘Have you developed any skills by doing research in archives that help you in other areas of your work or studies?’)
The two items were moderately correlated: the phi coefficient was .296; χ²(1, N = 426) = 37.39, p < .05 (see the sketch below)
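For a 2×2 cross-tabulation of two yes/no items, the phi coefficient and chi-square are linked by φ = √(χ² / N), and the reported figures are consistent with that relation (√(37.39 / 426) ≈ .296). The sketch below runs the same kind of test in Python; the counts in the table are invented for illustration (chosen only to yield an association of roughly the reported size) and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 cross-tabulation of yes/no answers to Questions 12 and 13.
# The counts are invented for illustration; they are not the study's data.
table = np.array([
    [206, 74],   # Q12 = yes: Q13 yes, Q13 no
    [ 64, 82],   # Q12 = no:  Q13 yes, Q13 no
])

chi2, p, dof, _ = chi2_contingency(table, correction=False)  # Pearson chi-square, no Yates correction
n = table.sum()
phi = np.sqrt(chi2 / n)          # for a 2x2 table, phi = sqrt(chi-square / N)
print(f"chi2({dof}, N={n}) = {chi2:.2f}, p = {p:.2g}, phi = {phi:.3f}")

# Sanity check against the reported figures: sqrt(37.39 / 426) is about 0.296.
print(f"phi implied by the reported chi-square: {np.sqrt(37.39 / 426):.3f}")
```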

25 Archival Metrics Toolkits
1. Questionnaire
   Word document / PDF
   Can transfer Survey Monkey version to other Survey Monkey accounts
2. Administering the Survey (instructions)
3. Preparing your data for analysis
4. Excel spreadsheet pre-formatted for data from the Questionnaire
5. Pre-coded questionnaire
6. SPSS file pre-formatted for data from the Website Questionnaire
7. Sample report

26 Administration Instructions
How the questionnaires can and cannot be amended
Identifying a survey population
Sampling strategies (a simple sketch follows below)
Soliciting subjects
Applying to their university's Institutional Review Board (IRB) or ethics panel
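As a loose illustration of the sampling and solicitation steps, the sketch below draws a simple random sample of email reference requestors to invite. The file name "reference_requests.csv", its "email" column, and the sample size are all hypothetical; the toolkit's own administration instructions should take precedence over this sketch.

```python
import csv
import random

# Hypothetical workflow: invite a simple random sample of email reference requestors.
SAMPLE_SIZE = 50

with open("reference_requests.csv", newline="") as f:
    # De-duplicate addresses so each requestor can be invited at most once.
    requestors = sorted({row["email"].strip().lower() for row in csv.DictReader(f)})

random.seed(2008)  # fixed seed so the draw can be reproduced and documented
invitees = random.sample(requestors, min(SAMPLE_SIZE, len(requestors)))
print(f"Inviting {len(invitees)} of {len(requestors)} unique requestors")
```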

27 Analysis Instructions
Instructions
Preformatted Excel spreadsheet (a minimal analysis sketch follows below)
Codebook
Preformatted SPSS file
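As a minimal sketch of what "preparing your data for analysis" can look like outside Excel or SPSS, the snippet below tabulates one pre-coded Likert item with pandas. The file name "researcher_survey_responses.xlsx", the column "staff_expertise", and the value labels are assumptions for illustration; the toolkit's own spreadsheet and codebook define the real layout.

```python
import pandas as pd

# Hypothetical export of pre-coded responses; adapt names to the toolkit's layout.
df = pd.read_excel("researcher_survey_responses.xlsx")

# Codebook-style mapping from pre-coded values back to their Likert labels.
likert_labels = {1: "Strongly disagree", 2: "Disagree", 3: "Neutral",
                 4: "Agree", 5: "Strongly agree"}

item = df["staff_expertise"]                      # one pre-coded questionnaire item
counts = (item.map(likert_labels)
              .value_counts()
              .reindex(likert_labels.values(), fill_value=0))
print(counts)
print(f"Mean rating: {item.mean():.2f} (n = {item.notna().sum()})")
```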

28 Archival Metrics Toolkits
Free and available for download
Creative Commons License
Must ‘register’, but this information will not be shared
We will be doing our own user-based evaluation of their use

30 Thank You and Questions

