
1 Information Literacy Assessment: What Works, What Doesn’t, and the San Jose State University Experience CARL 2006 Conference Presented by Toby Matoush SJSU Library

2 Abstract This poster session will detail the San Jose State University experience using two national information literacy assessment tests currently being developed: the ETS Information and Communication Technology Literacy (ICT) Test and the Standardized Assessment of Information Literacy Skills (SAILS) Test.

3 Standardized Assessment of Information Literacy Skills (SAILS) Test
- Tests information literacy concepts based on ACRL information literacy objectives
- 50 multiple-choice questions; no tasks
- Available both online and in print (monitored or unmonitored testing environment)
- Tests cognitive but not performative skills

4 ETS Information & Communication Technology Literacy (ICT) Test
- Tests both information literacy and information technology (computer skills)
- Scenario-based questions; 15 tasks
- Available online only, in a highly monitored testing center
- Tests not only cognitive skills but also performative skills

5 Test Similarities
- Based on ACRL information literacy objectives
- Test skill sets
- Available online
- Can be used to determine student research ability and identify research skills that need work

6 Test Differences
- Cost: ETS ICT is $18.75-$25 per test (may increase); SAILS is $2,000 per semester
- Testing environment: ETS ICT is highly monitored only; SAILS can be monitored or unmonitored
- Test length: ETS ICT is 90 minutes; SAILS is 25 minutes
- Scoring: ETS ICT uses automatic scoring based on sub-proficiencies; SAILS is scored using Winsteps software administered by Kent State faculty/IT

7 The SJSU Experience: SAILS
- Participated in all 3 phases; tested approx. the same number of students per phase
- Test can be taken either from home or in a monitored library classroom
- SJSU is not using test scores
- Unable to add questions to the test (demographics, library); may change in 2006

8 The SJSU Experience: ETS ICT Test
- Participated in beta-testing Spring 2005 and 2006; tested 115 students (2005) and 650 students (2006)
- Test must be taken in a highly monitored environment with proctors
- Campus assessment project; scores will be used in WASC & GE assessment
- Ability to add any additional questions

9 What SJSU Learned: What Works
- Faculty recruitment and testing of classes
- Campus buy-in; SJSU received a grant for ETS ICT testing for Spring 2006
- Recruiting faculty through other faculty; the Library was assisted in recruitment by faculty
- A university culture of assessment; SJSU has new GE information literacy objectives

10 What SJSU Learned: What Doesn't Work
- Student recruitment via phone or email
- Student testing without incentives (extra credit)
- Random sampling (most statistically accurate) versus testing classes of interested professors
- Testing without faculty and campus buy-in

11 SAILS Challenges
- Difficult to recruit without campus buy-in
- Score reports difficult to read
- Validity and usability of the test are questionable: test reveals little variability between test groups (freshmen vs. seniors, etc.)
- Testing only really works if the campus uses it for assessment
- Faculty will usually only assign the test if students can take it from home

12 ETS ICT Challenges
- Little flexibility with test times, testing environment, methodology
- Schedules testing during the busiest time of the semester (first 2 months)
- Test scores available only online
- Highly proprietary about test material
- Test is extremely time-consuming to administer

