Creating An Architecture of Assessment: using benchmarks to measure library instruction progress and success. Candice Benjes-Small, Eric Ackermann, Radford University.


“So, Candice, how many library sessions have we taught this year?”

Look at all these instruction librarians!

But… Curricular changes; librarian burnout; students reported BI (bibliographic instruction) overload

On the other hand University administration wants to see progress

Looking for alternatives Number of sessions had plateaued Scoured the literature Attended conferences Networked with colleagues

Our environment Public university students Courses not sequenced Instruction built on one-shots

Macro look at program Focus on us, not students Search for improvements over time Student evaluations as basis

A little bit about our evaluation form

Goals Provide data to satisfy three constituents –Instruction librarians: immediate feedback –Instruction team leader: annual evaluations –Library Admin: justify instruction program

Background Began in 2005 Iterative process

Development 4-point Likert scale Originally had a comment box at end Major concern: linking comments to scale responses

Solution: Linked score and comment responses Q1. I learned something useful from this workshop. Q2. I think this librarian was a good teacher.

Inspiration for benchmarks University of Virginia library system's use of metrics to determine success Targets outlined We would do one department rather than the entire library To learn more about UVA's efforts, visit

Benchmark baby steps Look at just one small part of instruction program Begin with a single benchmark Identify one area to assess Decided to do one particular class

Introduction to Psychology Teach fall and spring, beginning sections of 60+ students Shared script and PPT Everyone teaches over 2 days To see our shared PPT, visit

Developing benchmarks Selected a comment-based metric for the Instruction Team Chose a class of comments: responses to "What did you dislike about the teaching?" (Question #2)

Current benchmarks Partial success: negative comments are at least 5% but less than 10% of total comments for Question 2 Total success: negative comments are less than 5% of total comments for Question 2
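The benchmark thresholds above amount to a simple percentage check. A minimal sketch of that calculation, assuming hypothetical comment tallies and the function name `benchmark_status` (not from the original presentation):

```python
def benchmark_status(negative: int, total: int) -> str:
    """Classify benchmark attainment from Question 2 comment counts.

    negative: number of negative comments for Question 2
    total: total number of comments for Question 2
    """
    pct = negative / total * 100
    if pct < 5:
        return "total success"
    if pct < 10:
        return "partial success"
    return "benchmark not met"

# Example tallies (illustrative numbers only):
print(benchmark_status(3, 100))   # total success
print(benchmark_status(7, 100))   # partial success
print(benchmark_status(15, 100))  # benchmark not met
```

Any program with a clear numeric target could plug its own thresholds into the two `if` tests.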

How did we do?

Results

Success? Reached our desired benchmark for partial success, but never quite went below 5% Tweaking the script again Continuous improvement

Scaling for your program Adjust the benchmark levels Only look at score responses (quantitative) instead of comments (qualitative) Adjust the number of benchmarks used

Sharing with administrators Team annual reports Stress evidence-based nature Use percentages, not a 4-point scale
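Reporting percentages rather than raw 4-point scores can be done by counting favorable responses. A minimal sketch, assuming a cutoff of 3 ("agree" or better) counts as favorable; the cutoff, function name, and sample responses are illustrative assumptions, not the authors' exact method:

```python
def percent_favorable(responses: list[int], cutoff: int = 3) -> float:
    """Percentage of 4-point Likert responses at or above the cutoff."""
    favorable = sum(1 for r in responses if r >= cutoff)
    return round(100 * favorable / len(responses), 1)

# Illustrative responses from one workshop's evaluation forms:
print(percent_favorable([4, 4, 3, 2, 4, 3, 1, 4]))  # 75.0
```

A single "75% of students rated the session favorably" travels better in an annual report than a mean of 3.1 on a 4-point scale.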

Disadvantages Time intensive Follow through required Evaluation forms not easy to change

More disadvantages Labor intensive to analyze comments Results may reveal your failures

Advantages Flexibility to measure what you want to know Provides a structured goal Evidence-based results are more convincing

More advantages Continuous evaluation results over time Data-driven decisions about instruction program Do-able

Contact Candice Benjes-Small Eric Ackermann