Assessing Information Literacy Skills: A Look at Project SAILS
Joseph A. Salem, Jr., Kent State University
ARL New Measures Initiatives
CREPUQ, February 11, 2005

Context
- Explosion of interest in information literacy
- Accountability
- Assessment
  - Formative: for planning/improvement
  - Summative: evidence/documenting

What Is an Information Literate Person?
- Information Power (American Association of School Librarians)
  - 9 standards
- Big6
- Information Literacy Competency Standards for Higher Education (ACRL)
  - 5 standards, 22 performance indicators, 87 outcomes, 138 objectives

Our Questions
- Does information literacy make a difference to student success?
- Does the library contribute to information literacy?
- How do we know if a student is information literate?

The Idea of SAILS
- Perceived need: no tool available
- Project goal: make a tool
  - Programmatic evaluation
  - Valid
  - Reliable
  - Cross-institutional comparison
  - Easy to administer for wide delivery
  - Acceptable to university administrators

The Project Structure
- Kent State team
  - Librarians, programmer, measurement expert
- Association of Research Libraries partnership
- Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS)
- IMLS National Leadership Grant
  - Working with many institutions

Project Parameters
- Test based on ACRL document
- Test development model: systems design approach
- Measurement model: Item Response Theory
- Tests cohorts of students (not individuals) – programmatic assessment
- A name: Standardized Assessment of Information Literacy Skills

Test Development
Systems design approach:
1. Determine instructional goal
2. Analyze instructional goal
3. Analyze learners and contexts
4. Write performance objectives
5. Develop assessment instrument

Source: Dick, Walter, Lou Carey, and James O. Carey. The Systematic Design of Instruction. 6th ed. Boston: Pearson/Allyn and Bacon, 2005.

ACRL Standards – Significant Challenges
- Breadth and depth
- Objectives
  - Multi-part
  - Multi-level
- Habits/behaviors versus knowledge

Consider Skill Sets
- Regrouping the ACRL objectives (and some outcomes)
- 12 sets of skills organized around activities/concepts
- More closely mirrors instructional efforts?

Item Development Process
- Review competencies and draft some items: "How can I know that a student has achieved this competency?"
- Formulate a question and answers. This may take several iterations.
- Develop additional responses that are incorrect, yet plausible.
- Aim for five answers total (see the sketch below).
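As a hedged illustration of the item format this process aims for, here is a minimal sketch of how a drafted item might be represented during review; the question text, skill-set label, and field names are hypothetical, not actual SAILS content.

```python
# Hypothetical structure for a drafted multiple-choice item under review.
sample_item = {
    "skill_set": "Developing a research strategy",  # hypothetical label
    "stem": "Which source is most appropriate for locating "
            "peer-reviewed articles on a topic?",
    "options": [
        "A subject-specific research database",  # intended correct answer
        "A general web search engine",
        "The library's online catalog",
        "A print encyclopedia",
        "A daily newspaper",
    ],
    "correct_index": 0,
}

# Item writers aim for five options total: one correct answer
# plus four incorrect but plausible distractors.
assert len(sample_item["options"]) == 5
```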

Testing the Test Items
- Conduct one-on-one trials
  - Meet with individual students, talk through test items
- Conduct small-group trials
  - Administer a set of items to the group, discuss afterward
- Conduct field trials
  - Administer a set of items to 500+ students, analyze the data
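One thing field-trial data supports is a quick look at how hard each item is before formal calibration. A minimal sketch with made-up responses (this is a classical proportion-correct statistic, not the project's IRT analysis):

```python
# responses[student][item] = 1 if correct, 0 if not (made-up data).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
]

n_students = len(responses)
n_items = len(responses[0])

# Proportion correct per item: a rough difficulty indicator
# computed from field-trial data before IRT calibration.
for item in range(n_items):
    p_correct = sum(r[item] for r in responses) / n_students
    print(f"Item {item + 1}: {p_correct:.2f} answered correctly")
```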

Measurement Model
- Item Response Theory
  - Also called Latent Trait Theory
  - Measures ability levels
  - Looks at patterns of responses
    - For test-takers
    - For items
  - Rasch measurement using the software program "Winsteps"
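For context on what Rasch measurement computes, the sketch below implements the standard one-parameter (Rasch) IRT formula relating a test-taker's ability to an item's difficulty. The ability and difficulty values are invented for illustration; this is not Winsteps code.

```python
import math

def rasch_probability(ability, difficulty):
    """Rasch (one-parameter IRT) model: probability that a test-taker
    with the given ability answers an item of the given difficulty
    correctly. Both parameters are on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Invented values: a test-taker of average ability (0.0 logits)
# attempting an easy item (-1.0) and a hard item (+2.0).
print(rasch_probability(0.0, -1.0))  # ~0.73 -> likely correct
print(rasch_probability(0.0, 2.0))   # ~0.12 -> likely incorrect
```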

Response Pattern Example
(easier questions on the left, harder questions on the right)

            Q1  Q2  Q3  Q4  Q5  Q6  Q7  Q8
Person A    C   C   C   C       C
Person B    C   C       C   C       C
Person C        C       C   C       C   C
Person D    C   C   C   C   C   C   C   C

C = gave correct answer

Data Reports
- Based on standards and skill sets
- Looking at cohorts, not individuals
- Show areas of strength and areas of weakness
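A skill-set report can be thought of as a simple aggregation: each item carries a skill-set tag, and cohort-level averages are computed per tag. A minimal sketch with hypothetical skill-set names and scores:

```python
from collections import defaultdict

# (skill set, proportion of the cohort answering correctly) - hypothetical.
item_results = [
    ("Developing a research strategy", 0.62),
    ("Developing a research strategy", 0.71),
    ("Evaluating sources", 0.48),
    ("Evaluating sources", 0.55),
    ("Documenting sources", 0.80),
]

totals = defaultdict(list)
for skill_set, score in item_results:
    totals[skill_set].append(score)

# Cohort-level averages highlight relative strengths and weaknesses.
for skill_set, scores in totals.items():
    print(f"{skill_set}: {sum(scores) / len(scores):.2f}")
```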

The Person-Item Map
- Plots items according to difficulty level
- Plots test-takers according to their patterns of responses
- Can mark average score for cohorts
  - Cross-institutional average
  - Specific institution average
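To show the idea of a person-item (Wright) map, the sketch below plots invented item difficulties and person abilities on a shared logit scale with matplotlib; it is an illustration only, not the SAILS reporting code.

```python
import matplotlib.pyplot as plt

# Invented logit-scale values for illustration only.
item_difficulties = [-2.1, -1.0, -0.3, 0.4, 1.2, 2.0]
person_abilities = [-1.5, -0.8, -0.2, 0.1, 0.6, 0.9, 1.4]
institution_mean = 0.1
cross_institution_mean = 0.0

fig, ax = plt.subplots(figsize=(3, 6))
# Persons on the left, items on the right, same vertical logit scale.
ax.scatter([0] * len(person_abilities), person_abilities, label="Persons")
ax.scatter([1] * len(item_difficulties), item_difficulties, label="Items")
ax.axhline(institution_mean, linestyle="--", label="Institution mean")
ax.axhline(cross_institution_mean, linestyle=":", label="Cross-institution mean")
ax.set_xticks([0, 1])
ax.set_xticklabels(["Persons", "Items"])
ax.set_ylabel("Logits")
ax.legend()
plt.show()
```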

The Bar Chart
- Another representation of the information
- Group averages
  - Major, class standing, etc.
  - Which groups are important to measure?
- How do you know which differences in means are important?
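The question of which differences in means matter can be approached with a significance test plus an effect size. A minimal sketch using scipy with made-up cohort scores (not SAILS output):

```python
from statistics import mean, stdev
from scipy import stats

# Made-up cohort scores for two groups (e.g., first-year vs. senior).
group_a = [48, 52, 55, 60, 47, 58, 53, 50]
group_b = [58, 62, 65, 60, 59, 67, 63, 61]

# Two-sample t-test: is the difference in means likely real?
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# Cohen's d: is the difference large enough to matter in practice?
pooled_sd = (stdev(group_a) ** 2 / 2 + stdev(group_b) ** 2 / 2) ** 0.5
cohens_d = (mean(group_b) - mean(group_a)) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```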

Current Instrument Status
- 158 items developed, tested, and in use
- Most ACRL learning outcomes covered
  - Not Standard 4: uses information effectively to accomplish a specific purpose
- 12 skill sets developed based on the ACRL document

IMLS Grant Status
- Phase I complete
  - 6 institutions participated
  - Feedback from institutions
- Phase II underway
  - 36 institutions participated
- Phase III started June 2004
  - About 70 institutions participating
  - Wrap-up summer 2005

Project Highlights
- Discipline-specific modules
- Canadian version of the instrument
- Automated survey generation
- Automated report generation

Next Steps for SAILS
- IMLS grant period ends on September 30, 2005
- Stop administering SAILS to allow analysis of the instrument
  - Does the instrument measure what we want it to?
  - Are institutions getting what they need?

Next Steps for SAILS
- Analyze data and input from institutions
- Validate the instrument
  - Factor analysis and skill sets (see the sketch below)
  - Outside criterion testing through performance testing
  - Test-taker characteristics: sex, ethnicity, class standing, GPA
  - Test administration methods
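As one hedged example of what the planned validation work could look like, an exploratory factor analysis can check whether items load together the way the skill sets predict. The sketch below uses scikit-learn on a random placeholder matrix; a real analysis would use actual SAILS response data and a method suited to dichotomous items.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Placeholder response matrix: 500 test-takers x 20 items (0/1 scores).
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(500, 20)).astype(float)

# Fit a small number of factors and inspect which items load together;
# items intended for the same skill set should show similar loadings.
fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(responses)
print(fa.components_.shape)  # (3 factors, 20 items)
```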

Next Steps for SAILS
- Re-think how results can be presented or used
  - Scoring for the individual
  - Pre- and post-testing
  - Cut scores
- Administrative challenges
  - Automate data analysis
  - Re-engineer administrative tools
  - Create customer interface
- Test development
  - Develop new items

Summary
- Vision: a standardized, cross-institutional instrument that measures what we think it does
- To answer the question: does information literacy make a difference to student success?

For More Information
- Joseph Salem
- Mary Thompson, project coordinator

Questions?