MAC Fall Symposium: Learning What You Want to Know and Implementing Change
Elizabeth Yakel, Ph.D.
October 22, 2010

Outline
- Asking good research questions
- Identifying the right method to get the answers
- Operationalizing your concepts
- Piloting your instruments
- Implementing the study
  - Identifying subjects
  - Sample size
- Human subjects issues

Asking Good Research Questions
- What do you want to know?
- Selecting the right method for what you want to answer

Implications of Questions
- Reference: satisfaction levels; user needs
- Collections usage: measuring use; citation analysis
- Website or finding aids: ease of use; usage patterns
- Instruction: evaluation of teaching; impact of primary sources on learning

Reference
- User-based evaluation
  - How satisfied are researchers with public services in my repository?
  - Getting researchers' opinions about operations in order to improve service
- User needs
  - How do researchers' questions evolve during an archives visit, and how can I better support their resource discovery?
  - Understanding general information-seeking patterns and research questions

Collections Usage
- Measuring use
  - What are the patterns in collection requests, and do my retrieval schedule and reading room staffing support this level of use?
  - Assess workflow around collection retrieval (rate and times for retrieval, time from request to ...)
  - Assess levels of staffing in the reading room
- Quality / nature of use
  - How do researchers actually use collections in my repository?
  - Citation patterns / rates

Website
- Ease of use
  - What problems do users encounter on my website?
  - Make changes to the website to improve the user experience
- Patterns of use
  - How do visitors find my website, and what do committed visitors do when they are there?

Instruction
- Evaluation of teaching
  - How can I improve my lecture on using the archives?
- Assessment of learning
  - What is the impact of using the archives on students' critical thinking skills?

METHODS (non-invasive vs. invasive, by area)
- Reference
  - Non-invasive: observation; statistics (# of readers, registrants, questions); content analysis (online chat or reference)
  - Invasive: surveys; focus groups; interviews
- Web site
  - Non-invasive: web analytics; content analysis of search terms
  - Invasive: surveys; usability tests; focus groups; interviews
- Collections usage
  - Non-invasive: statistics (circulation); call slips; citation analysis
  - Invasive: surveys; focus groups; interviews
- Finding aids
  - Non-invasive: web analytics; content analysis of search terms; observation; Web 2.0 (analysis of comments/tags)
  - Invasive: surveys; focus groups; interviews; usability tests
- Instruction
  - Non-invasive: statistics (# of classes taught, # of participants); # of archives visits after session; survey grades / student assessment
  - Invasive: surveys; focus groups; interviews; field experiment
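For the non-invasive web rows, "content analysis of search terms" can start from an exported query log. A minimal Python sketch, assuming a hypothetical search_log.csv export with a query column; adapt the file name and column to whatever your analytics tool produces:

```python
# Tally the most frequent search terms from an exported query log.
# "search_log.csv" and its "query" column are assumptions; adjust to your export.
import csv
from collections import Counter

terms = Counter()
with open("search_log.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        query = row["query"].strip().lower()
        if query:
            terms[query] += 1

for query, count in terms.most_common(20):
    print(f"{count:5d}  {query}")
```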

Operationalizing Concepts
- Reference: satisfaction with services
  - Which services? What is the nature of the satisfaction?
    - Friendliness of the staff
    - Time to retrieve materials
    - Reference process or procedures
    - Find what you were looking for?

Operationalizing Concepts (2)
- Ease of use
  - Ability to use online finding aids
  - Ability to locate key information in < 1 minute (hours, parking)
  - Specific tasks that you want researchers to do on the website

Operationalizing Concepts (3)
- Instruction: How does archives use contribute to critical thinking skills?
  - Confidence
  - Use of skills learned in the archives
  - Explain archives to a peer
  - Assessment

Measuring Your Concepts
- LibQUAL+ gap measurement
  - Minimum service level, desired service level, and perceived service performance
  - Service superiority = perceived minus desired
  - Service adequacy = perceived minus minimum
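A minimal sketch of how the two gap scores combine the three ratings collected for each item (the example values are illustrative only):

```python
# LibQUAL-style gap scores for a single item (ratings are typically on a 1-9 scale).
def gap_scores(minimum, desired, perceived):
    return {
        "service_adequacy": perceived - minimum,     # positive = above the minimum acceptable level
        "service_superiority": perceived - desired,  # usually negative; >= 0 means desires are exceeded
    }

print(gap_scores(minimum=5, desired=8, perceived=7))
# {'service_adequacy': 2, 'service_superiority': -1}
```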

Measuring Your Concepts
- Explore your options
- Qualitative measures
  - Find what you were looking for?
- Quantitative measures
  - Approachability of the reference staff: yes/no, or a rating scale

Concepts and Instruments (concept → type of instrument)
- Reference: satisfaction with services, work processes → survey, focus groups
- Web site: ease of use → surveys
- Collections usage: nature of use → interviews / focus groups
- Finding aids: ease of use (search, navigation, locating important information) → usability tests
- Instruction: evaluation of orientation, learning → survey

Administration Issues
- Paper
  - Pros: immediate; size readily recognizable; allows for a wider set of data
  - Cons: have to enter data
- Online
  - Pros: data is already entered; allows for skip logic
  - Cons: gives people constraints; hard to assess length; need more contextual cues
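A minimal sketch of what "skip logic" means in practice; real online survey tools configure this in their form builders, and the question IDs here are hypothetical:

```python
# Route respondents who never visited the reading room past the in-person questions.
def next_question(answers):
    if answers.get("visited_reading_room") == "no":
        return "q_website_only"
    return "q_staff_satisfaction"

print(next_question({"visited_reading_room": "no"}))   # q_website_only
```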

Pre-Implementation
- Pilot testing
  - Staff
  - Targeted participants: pilot with 1-2, make changes; pilot with 1-2 more, make further changes

Population / Sample
- Reference: all researchers using the archives / special collections, or researchers in the past month
- Collections usage: all call slips, or the past year of call slips
- Website: all search terms used, or the last 1,000 searches
- Instruction: students in the classes to which you gave talks, or students with a requirement to use the archives
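A minimal sketch of drawing a simple random sample from one of these frames, assuming the past year of call slips has been exported to a hypothetical call_slips.csv:

```python
# Draw a reproducible simple random sample of 100 call slips (or all of them if fewer exist).
import csv
import random

with open("call_slips.csv", newline="", encoding="utf-8") as f:
    frame = list(csv.DictReader(f))

random.seed(42)   # fixed seed so the draw can be reproduced
sample = random.sample(frame, k=min(100, len(frame)))
print(f"Sampled {len(sample)} of {len(frame)} call slips")
```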

Sampling
- Getting enough "n"
- Recruiting participants
- Representativeness
- Response rate
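One rough way to put a number on "enough n" is the standard sample-size formula for a proportion with a finite population correction; the population size and margin of error below are placeholders:

```python
# Estimate how many completed responses are needed for a given margin of error.
import math

def sample_size(population, margin=0.05, confidence_z=1.96, p=0.5):
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))     # finite population correction

print(sample_size(600))   # about 235 completed responses for 600 registered researchers
```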

Administration
- Instructions
- Getting staff on board
- Creating a script / message
- System for recruiting
  - Date range
  - Type of user
  - Random sampling
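A minimal sketch of a recruiting system that combines a date range, a type of user, and systematic selection; register.csv, its visit_date and user_type columns, and the filter values are all hypothetical:

```python
# Invite every 3rd qualifying visitor from the researcher register within a date range.
import csv
from datetime import date

start, end, step = date(2010, 9, 1), date(2010, 10, 31), 3

with open("register.csv", newline="", encoding="utf-8") as f:
    visits = [r for r in csv.DictReader(f)
              if start <= date.fromisoformat(r["visit_date"]) <= end
              and r["user_type"] == "onsite researcher"]

invitees = visits[::step]   # systematic selection: every 3rd qualifying visit
print(f"Inviting {len(invitees)} of {len(visits)} qualifying researchers")
```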

Common Issues in User-based Research
- Memory
  - Recency of contact: 3-6 months
  - Frequency of contact: several times a year
- Knowledge of archives
  - What prompts do you need to provide?

Recruiting
- Website survey
  - Link on website
  - Send link to recent reference requestors
- Recent onsite researchers
  - In-house researchers
  - Stakeholders
  - Partner groups (genealogical societies)
- Focus groups
  - Composition

Recruiting (continued)
- Usability testing
  - Single individuals representing specific types
  - "Friendly dyads"
- Interviews
  - Single researchers
  - Researchers working together

Representativeness
- Does the sample mirror or approximate the population?
- Does the sample truly represent the group you are targeting?
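A minimal sketch of one representativeness check: lay the sample shares next to known population shares. All figures below are made-up placeholders:

```python
# Compare the composition of the survey sample with the known user population.
population_share = {"faculty": 0.25, "graduate": 0.35, "undergraduate": 0.20, "genealogist": 0.20}
sample_counts = {"faculty": 18, "graduate": 22, "undergraduate": 5, "genealogist": 15}

total = sum(sample_counts.values())
for group, pop in population_share.items():
    print(f"{group:15s} population {pop:5.0%}   sample {sample_counts[group] / total:5.0%}")
```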

Response Rate
- Surveys
  - Large surveys: 15%-20%
  - Archives surveys: n = 50
  - Does the percentage of respondents mirror the percentage of like users?
  - Responses per question
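A minimal sketch of computing a response rate and spotting per-question drop-off (all numbers are placeholders):

```python
# Overall response rate, plus how many respondents answered each question.
invited, completed = 320, 52
print(f"Response rate: {completed / invited:.0%}")         # 16%

answered = {"q1": 52, "q2": 51, "q3": 33}                  # q3 drop-off is worth investigating
for question, n in answered.items():
    print(f"{question}: {n}/{completed} answered ({n / completed:.0%})")
```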

Human Subjects
- Check with your IRB if you have one
  - Research for internal improvement
  - Publication
- Incentives
- Consent
- Responsible data management