Project SAILS: Facing the Challenges of Information Literacy Assessment. Julie Gedeon, Carolyn Radcliff, Rick Wiggins, Kent State University. EDUCAUSE 2004.

Project SAILS: Facing the Challenges of Information Literacy Assessment. Julie Gedeon, Carolyn Radcliff, Rick Wiggins. Kent State University. EDUCAUSE 2004 Conference, Denver, Colorado.

2 What is information literacy?
- The ability to locate, access, use, and evaluate information efficiently and effectively.
- Guiding document: "Information Literacy Competency Standards for Higher Education," Association of College & Research Libraries ( competency.htm)

3 Our questions
- Does information literacy make a difference to student success?
- Does the library contribute to information literacy?
- How do we know if a student is information literate?

4 The Idea of SAILS
- Perceived need: no tool available
- Project goal: make a tool
  - Program evaluation
  - Valid
  - Reliable
  - Enables cross-institutional comparison
  - Easy to administer for wide delivery
  - Acceptable to university administrators

5 Project parameters
- A test
  - Systems design approach
  - Measurement model: Item Response Theory
  - Tests cohorts of students (not individuals)
- A name: Standardized Assessment of Information Literacy Skills

6 The project structure
- Kent State team
- Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS)
- IMLS National Leadership Grant
- Association of Research Libraries partnership

7 Technical components
- Environment
- Item builder
- Survey builder
- Survey generator
- Report generation
- Challenges

8 Environment
- Linux (Red Hat)
- Apache
- MySQL
- PHP

9 Survey process
- Create survey questions (items)
- Create survey for this phase
- Add schools for this phase
- Schools create web front-end
- Collect data

10 Item Builder

11 Item maintenance

12 Survey Builder

13 Survey items

14 Random selection of items
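The random draw of items for a survey form can be sketched as follows. This is an illustrative Python sketch, not the production code (the system was built in PHP); the 45-item form length is an assumption, and the real builder may stratify the draw by standard or skill set rather than sampling uniformly.

```python
import random

# Hypothetical item bank of 126 items (the count reported on the status slide).
ITEM_BANK = [f"item{i:03d}" for i in range(1, 127)]

def select_items(bank, n, seed=None):
    """Draw n distinct items at random from the bank.

    A seed makes the draw reproducible for testing; production draws
    would typically be unseeded.
    """
    rng = random.Random(seed)
    return rng.sample(bank, n)

form = select_items(ITEM_BANK, 45, seed=2004)  # 45-item form length is an assumption
```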

15 School information

16 SAILS front-end

17 Redirection to SAILS web site
Parameters passed:
- Unique student identifier
- School code
- Authorization code
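Building the redirect from a school's front-end to the SAILS site can be sketched as below. The parameter names (sid, school, auth) and the URL are illustrative assumptions; the slide only lists which three values are passed, and the production system used PHP rather than Python.

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_redirect_url(base, student_id, school_code, auth_code):
    """Attach the three parameters the slide lists to the survey URL.

    Parameter names here are hypothetical; the actual names are not given.
    """
    query = urlencode({"sid": student_id, "school": school_code, "auth": auth_code})
    return f"{base}?{query}"

url = build_redirect_url("https://sails.example.edu/survey", "A1B2C3", "KSU", "tok-42")
params = parse_qs(urlsplit(url).query)  # the SAILS side would parse these back out
```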

18 Link test

19 Demographic data

20 Survey questions

21 Report process
- Send schools unique identifiers
- Upload demographics
- Scan & upload paper surveys
- Generate entire dataset file
- Offline IRT analysis
- Upload IRT results
- Generate reports
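The "generate entire dataset file" step joins the uploaded demographics to the survey responses on the unique student identifier before the offline IRT run. A minimal sketch, with hypothetical field names (the actual schema is not given, and the system was PHP-based):

```python
def merge_dataset(demographics, responses):
    """Join demographic rows to response rows on the unique student
    identifier, yielding one flat row per respondent for the IRT analysis.
    """
    demo_by_id = {d["student_id"]: d for d in demographics}
    dataset = []
    for resp in responses:
        row = dict(demo_by_id.get(resp["student_id"], {}))  # {} if no demographics uploaded
        row.update(resp)
        dataset.append(row)
    return dataset

demographics = [{"student_id": "A1", "class_rank": "senior"}]
responses = [{"student_id": "A1", "item001": 1, "item002": 0}]
dataset = merge_dataset(demographics, responses)
```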

22 Sample report text

23 Sample report graph

24 Technical challenges
- Creation of the front-end
- Customizations for schools
- Automating the data analysis
- Supporting different languages

25 Data analysis
- Item Response Theory
  - Measures ability levels
  - Looks at patterns of responses
    - For test-takers
    - For items (questions)
- Based on standards and skill sets
- Shows areas of strength and areas of weakness
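The slides do not name which IRT model SAILS used, but the simplest one, the one-parameter (Rasch) model, shows the core idea: ability and item difficulty sit on the same scale, and their difference determines the probability of a correct response.

```python
import math

def rasch_probability(theta, b):
    """P(correct) under the one-parameter (Rasch) IRT model: theta is the
    test-taker's ability, b the item's difficulty, both on the logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the chance of a correct answer is 50%.
p_even = rasch_probability(0.0, 0.0)
# A more able cohort has a higher chance on the same item, a weaker one a lower chance.
p_strong = rasch_probability(1.0, 0.0)
p_weak = rasch_probability(-1.0, 0.0)
```

Fitting such a model to the response patterns yields ability estimates for cohorts and difficulty estimates for items, which is what the offline analysis step produces.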

26 Status
- Instrument
  - 126 items developed, tested, and in use
  - Web-based and paper-based administration
- Grant project (IMLS)
  - Phase I complete: 6 institutions
  - Phase II complete: 34 institutions
  - Phase III began in June

27 Next steps for SAILS
- Analyze data and other input
- Administrative challenges
  - Self-reported demographic data
  - Testing environment
  - Report generation
- Does the instrument measure what we want it to?
- Are institutions getting what they need?

28 Summary
Vision: a standardized, cross-institutional instrument that measures what we think it does
To answer the questions:
- Do students gain information literacy skills?
- Does information literacy make a difference to student success?

29 For more information
- Julie Gedeon
- Carolyn Radcliff
- Rick Wiggins
- Mary Thompson, project coordinator