Project SAILS: Facing the Challenges of Information Literacy Assessment
Julie Gedeon, Carolyn Radcliff, Rick Wiggins
Kent State University
EDUCAUSE 2004 Conference, Denver, Colorado
What is information literacy?
The ability to locate, access, use, and evaluate information efficiently and effectively.
Guiding document: "Information Literacy Competency Standards for Higher Education," Association of College & Research Libraries (competency.htm)
Our questions
- Does information literacy make a difference to student success?
- Does the library contribute to information literacy?
- How do we know if a student is information literate?
The idea of SAILS
Perceived need: no tool available.
Project goal: build a tool for program evaluation that is
- valid
- reliable
- enables cross-institutional comparison
- easy to administer for wide delivery
- acceptable to university administrators
Project parameters
- A test
- Systems design approach
- Measurement model: Item Response Theory (IRT)
- Tests cohorts of students (not individuals)
- A name: Standardized Assessment of Information Literacy Skills
The project structure
- Kent State team
- Ohio Board of Regents collaborative grant with Bowling Green State University (part for SAILS)
- IMLS National Leadership Grant
- Association of Research Libraries partnership
Technical components
- Environment
- Item builder
- Survey builder
- Survey generator
- Report generation
- Challenges
Environment
- Linux (Red Hat)
- Apache
- MySQL
- PHP
Survey process
1. Create survey questions (items)
2. Create the survey for this phase
3. Add schools for this phase
4. Schools create a web front-end
5. Collect data
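The workflow above implies a small relational schema linking items, surveys, schools, and collected responses. A minimal sketch follows; the table and column names are hypothetical (not the actual SAILS schema), and Python's sqlite3 stands in for the project's MySQL database:

```python
import sqlite3

# Hypothetical schema illustrating the survey workflow. The real SAILS
# system ran on MySQL; these table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE items (
    item_id   INTEGER PRIMARY KEY,
    skill_set TEXT NOT NULL,      -- standard / skill set the item tests
    text      TEXT NOT NULL
);
CREATE TABLE surveys (
    survey_id INTEGER PRIMARY KEY,
    phase     TEXT NOT NULL       -- e.g. 'Phase III'
);
CREATE TABLE survey_items (       -- items selected for a given survey
    survey_id INTEGER REFERENCES surveys(survey_id),
    item_id   INTEGER REFERENCES items(item_id)
);
CREATE TABLE schools (            -- schools added for this phase
    school_code TEXT PRIMARY KEY,
    survey_id   INTEGER REFERENCES surveys(survey_id)
);
CREATE TABLE responses (          -- data collected from students
    student_id  TEXT,
    school_code TEXT REFERENCES schools(school_code),
    item_id     INTEGER REFERENCES items(item_id),
    answer      TEXT
);
""")
```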
Item Builder
Item maintenance
Survey Builder
Survey items
Random selection of items
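Random selection of items can be sketched as stratified sampling: drawing a fixed number of items from each skill set so every generated survey form covers the standards. The function and field names below are illustrative, not the actual SAILS selection code:

```python
import random

def select_items(item_pool, per_skill_set, rng=None):
    """Randomly draw `per_skill_set` items from each skill set in the pool.

    item_pool: list of dicts like {"id": 1, "skill_set": "Standard 1"}.
    Illustrative sketch only; the real selection logic ran in PHP/MySQL.
    """
    rng = rng or random.Random()
    by_skill = {}
    for item in item_pool:
        by_skill.setdefault(item["skill_set"], []).append(item)
    survey = []
    for skill in sorted(by_skill):
        items = by_skill[skill]
        survey.extend(rng.sample(items, min(per_skill_set, len(items))))
    return survey

# Hypothetical pool: 30 items spread across three skill sets.
pool = [{"id": i, "skill_set": f"Standard {1 + i % 3}"} for i in range(30)]
form = select_items(pool, per_skill_set=4, rng=random.Random(42))
```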
School information
SAILS front-end
Redirection to SAILS web site
Parameters passed:
- Unique student identifier
- School code
- Authorization code
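The redirect above can be sketched as a URL carrying the three parameters in its query string. The endpoint and parameter names here are hypothetical, not the actual SAILS interface:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_redirect(base_url, student_id, school_code, auth_code):
    """Compose the redirect URL carrying the three passed parameters.

    Endpoint and parameter names are illustrative assumptions.
    """
    query = urlencode({
        "student": student_id,    # unique student identifier
        "school": school_code,    # school code
        "auth": auth_code,        # authorization code
    })
    return f"{base_url}?{query}"

url = build_redirect("https://sails.example.edu/survey.php",
                     "S123456", "KSU", "a1b2c3")
```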
Link test
Demographic data
Survey questions
Report process
1. Send schools unique identifiers
2. Upload demographics
3. Scan & upload paper surveys
4. Generate entire dataset file
5. Offline IRT analysis
6. Upload IRT results
7. Generate reports
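Step 4, generating the dataset file, amounts to joining the uploaded demographics with the collected responses on the unique student identifier. A sketch, with illustrative field names (not the actual SAILS file layout):

```python
import csv
import io

def merge_dataset(demographics_csv, responses_csv):
    """Join demographics onto responses by student_id.

    Hypothetical sketch of the dataset-generation step; column names
    are assumptions, not the real SAILS formats.
    """
    demo = {row["student_id"]: row
            for row in csv.DictReader(io.StringIO(demographics_csv))}
    merged = []
    for row in csv.DictReader(io.StringIO(responses_csv)):
        record = dict(demo.get(row["student_id"], {}))  # demographics, if any
        record.update(row)                              # response fields win
        merged.append(record)
    return merged

demographics = "student_id,class_standing\nS1,freshman\nS2,senior\n"
responses = "student_id,item_id,answer\nS1,10,B\nS2,10,C\n"
dataset = merge_dataset(demographics, responses)
```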
Sample report text
Sample report graph
Technical challenges
- Creation of the front-end
- Customizations for schools
- Automating the data analysis
- Supporting different languages
Data analysis
Item Response Theory (IRT):
- Measures ability levels
- Looks at patterns of responses, both for test-takers and for items (questions)
- Based on standards and skill sets
- Shows areas of strength and areas of weakness
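The core of an IRT analysis can be illustrated with the one-parameter (Rasch) item response function: the probability that a test-taker of ability theta answers an item of difficulty b correctly. This is a generic sketch of the model family, not necessarily the specific IRT model SAILS used:

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) item response function.

    Probability that a test-taker with ability `theta` answers an item
    of difficulty `b` correctly. Illustrative of IRT generally; the
    actual SAILS model specification is not shown in the talk.
    """
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, the chance of a correct answer is 50%.
p_match = p_correct(0.0, 0.0)

# Ability well above the item's difficulty raises the probability.
p_high = p_correct(2.0, 0.0)
```

Fitting theta per cohort and b per item from the observed response patterns is what the "offline IRT analysis" step produces.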
Status
Instrument:
- 126 items developed, tested, and in use
- Web-based and paper-based administration
Grant project (IMLS):
- Phase I complete: 6 institutions
- Phase II complete: 34 institutions
- Phase III began in June
Next steps for SAILS
- Analyze data and other input
- Administrative challenges: self-reported demographic data, testing environment, report generation
- Does the instrument measure what we want it to?
- Are institutions getting what they need?
Summary
Vision: a standardized, cross-institutional instrument that measures what we think it does, to answer the questions:
- Do students gain information literacy skills?
- Does information literacy make a difference to student success?
For more information
Julie Gedeon
Carolyn Radcliff
Rick Wiggins
Mary Thompson, project coordinator