Using Rubrics to Collect Evidence for Decision-Making: What do Librarians Need to Learn? Megan Oakleaf, MLS, PhD School of Information Studies Syracuse University 4 th International Evidence Based Library & Information Practice Conference May 2007
Rubric for a Library Open House Event for First Year Students
Rubric created by: Katherine Thurston & Jennifer Bibbens

Attendance (Data source: Staff [Committee and Volunteers] records)
- Beginning: Attendance rates are similar to the 2006 Open House.
- Developing: Attendance rates increase by 20% from the 2006 Open House.
- Exemplary: Attendance rates increase by 50% from the 2006 Open House.

Staff Participation (Data source: Staff [Committee and Volunteers] records)
- Beginning: Staff participation is similar to the 2006 Open House; no volunteers.
- Developing: Increase in participation by library staff [librarians and paraprofessionals] and student volunteers.
- Exemplary: Increase in participation by library staff [librarians and paraprofessionals], student volunteers, student workers, and academic faculty.

Budget (Data source: Budget, financial statements)
- Beginning: Budget is the same as the 2006 Open House ($200).
- Developing: Budget increases by $100 from the 2006 Open House.
- Exemplary: Budget increases by $300 from the 2006 Open House.

Reference Statistics (Data source: Library Reference Department statistics)
- Beginning: Reference statistics are similar to 2006.
- Developing: Reference statistics increase by 20% from 2006.
- Exemplary: Reference statistics increase by 50% from 2006.

Student Attitudes (Data source: Survey)
- Beginning: Students are pleased with the Open House.
- Developing: Students enjoy the Open House and are satisfied with the information.
- Exemplary: Students are excited about the Open House and volunteer to participate in the next year's event.
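The growth-based bands in the rubric above can be encoded as a small scoring helper. This is only an illustrative sketch: the function name `attendance_band` and the baseline figure passed to it are assumptions, not part of the rubric; the 20% and 50% thresholds come from the Attendance row.

```python
def attendance_band(current: int, baseline_2006: int) -> str:
    """Classify attendance growth against the Open House rubric.

    Similar to 2006 -> Beginning; +20% -> Developing; +50% -> Exemplary.
    Thresholds are taken from the rubric's Attendance row.
    """
    growth = (current - baseline_2006) / baseline_2006
    if growth >= 0.50:
        return "Exemplary"
    if growth >= 0.20:
        return "Developing"
    return "Beginning"

print(attendance_band(180, 120))  # 50% growth -> Exemplary
```

The same pattern would serve for the Budget and Reference Statistics rows, which use the same relative-increase structure.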
Rubric for a Virtual Reference Service
Rubric created by: Ana Guimaraes & Katie Hayduke

Transactions (Data source: Transaction logs)
- Beginning: 0–4 reference transactions per week.
- Developing: 5–7 reference transactions per week.
- Exemplary: 8+ reference transactions per week.

User Satisfaction (Data source: User surveys)
- Beginning: Students, faculty, and staff report they are dissatisfied or very dissatisfied with reference transactions.
- Developing: Students, faculty, and staff report they are neutral about reference transactions.
- Exemplary: Students, faculty, and staff report they are satisfied or very satisfied with reference transactions.

Training (Data source: Post-training surveys)
- Beginning: Librarians report they are uncomfortable or very uncomfortable with providing virtual reference service.
- Developing: Librarians report they are neutral about providing virtual reference service.
- Exemplary: Librarians report they are comfortable or very comfortable with providing virtual reference service.

Technology (Data source: System transcripts)
- Beginning: Between 75% and 100% of transactions a week report dropped calls or technical difficulties.
- Developing: Between 25% and 74% of transactions a week report dropped calls or technical difficulties.
- Exemplary: Between 0% and 24% of transactions a week report dropped calls or technical difficulties.

Electronic Resources (Data source: Systems analysis logs)
- Beginning: 0–50 hits on electronic resources a week.
- Developing: 50–100 hits on electronic resources a week.
- Exemplary: 100+ hits on electronic resources a week.
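Because the Virtual Reference rubric uses fixed numeric cut-offs rather than growth over a baseline, its rows map directly to threshold checks. A minimal sketch, assuming hypothetical helper names (`transaction_band`, `technology_band`) and using the thresholds from the Transactions and Technology rows:

```python
def transaction_band(per_week: int) -> str:
    """Transactions row: 0-4 Beginning, 5-7 Developing, 8+ Exemplary."""
    if per_week >= 8:
        return "Exemplary"
    if per_week >= 5:
        return "Developing"
    return "Beginning"

def technology_band(problem_rate: float) -> str:
    """Technology row: share of weekly transactions reporting dropped
    calls or technical difficulties (0.0-1.0). Lower is better:
    under 25% Exemplary, 25-74% Developing, 75% and up Beginning."""
    if problem_rate < 0.25:
        return "Exemplary"
    if problem_rate < 0.75:
        return "Developing"
    return "Beginning"

print(transaction_band(9))     # -> Exemplary
print(technology_band(0.30))   # -> Developing
```

Note that the Technology indicator inverts the usual direction: the lowest numeric value earns the highest band, which is why the comparisons run the other way.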