Information Literacy Assessment, 2014
SPECIAL THANKS TO JIM WAUGH, OPIE!


Introduction
 Information Literacy is one of seven general education abilities
 Initial assessment occurred in spring 2011
 Most recent iteration was administered in spring 2014
 Four primary Information Literacy competencies were included:
 Framing the Research Question
 Accessing Sources
 Evaluation of Information Resources
 Create Original Work

Methodology
 Library faculty reviewed and scored the student work
 Library faculty also helped SAAC design an Evaluation Rubric
 The individual competencies were assessed using a clearly defined three-level scale:
 Level 1 / Beginner
 Level 2 / Satisfactory
 Level 3 / Proficient

Data Collected
Data was collected from 13 courses in 2011 and from 20 courses in 2014 (* = also assessed in 2011):
MAT102, NUR251, NUR271, PHY101, PSY290AB *, SOC212 *, AJS101, CIS105 *, COM225 *, EDU112, EDU220, EDU221, EDU222 *, EDU230, EDU236, EED215, ENG091, ENG101 *, ENG102 *, ENH285 *

Data Collected (continued)
About Assessments (2011 vs. 2014, with % change):
 Number of Instructors Involved *
 Number of Sections Involved
 Number of Students Assessed
* Five instructors provided assessment data for both the 2011 and 2014 assessment cycles

Data Collected (continued)
Assessment of materials from: 2011 (n=346) vs. 2014 (n=488)
 In-Person courses: 71% vs. 77% (Up 6%)
 Internet courses: 14% vs. 14% (Stable)
 Hybrid courses: 15% vs. 9% (Down 6%)
 Developmental Education courses: 0% vs. 11% (Up 11%)
 100-Level courses: 62% vs. 34% (Down 28%)
 200-Level courses: 38% vs. 55% (Up 17%)
 Freshmen: 68% vs. 52% (Down 16%)
 Sophomores: 32% vs. 48% (Up 16%)

2011 and 2014 Comparative Highlights
 Five instructors assessed Information Literacy in both 2011 and 2014. Changes in 2014 that may have contributed to improved student Information Literacy performance include:
 Increased emphasis on instructor and student engagement in the classroom
 Increased access to Information Literacy presentations, Library staff, and resources in and out of the classroom
 Better-equipped classrooms (use of a Learning Studio vs. a non-computer-equipped classroom) to better support Information Literacy skills

Framing the Research Question (All Participants)
 Level 1 / Beginner (Recognizes the need for information to answer a question): 13% (n=45) in 2011; 3% (n=14) in 2014
 Level 2 / Satisfactory (Recognizes the information need for the appropriate topic; identifies key concepts & related terms): 59% (n=205) in 2011; 65% (n=316) in 2014
 Level 3 / Proficient (Identifies key concepts & related terms and locates quality resources to meet that need): 28% (n=96) in 2011; 32% (n=158) in 2014
 Total: Mean = 2.15 (n=346) in 2011; Mean = 2.30 (n=488) in 2014 *
* Statistically significant difference in means
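The slide flags this change as statistically significant but does not name the test used. As a rough check (an assumption, not necessarily the committee's method), the reported means can be reproduced from the level counts above, and a Welch-style two-sample t statistic can be computed by treating the 1–3 rubric levels as numeric scores:

```python
from math import sqrt

def rubric_stats(counts):
    """Mean and sample variance of rubric scores from (Level 1, Level 2, Level 3) counts."""
    n = sum(counts)
    mean = sum(level * c for level, c in zip((1, 2, 3), counts)) / n
    var = sum(c * (level - mean) ** 2
              for level, c in zip((1, 2, 3), counts)) / (n - 1)
    return n, mean, var

def welch_t(counts_a, counts_b):
    """Welch's t statistic for the difference in mean rubric scores between two groups."""
    na, ma, va = rubric_stats(counts_a)
    nb, mb, vb = rubric_stats(counts_b)
    return (mb - ma) / sqrt(va / na + vb / nb)

# Level 1/2/3 counts for Framing the Research Question, from the slide above
framing_2011 = (45, 205, 96)    # n = 346, mean rounds to 2.15
framing_2014 = (14, 316, 158)   # n = 488, mean rounds to 2.30
t = welch_t(framing_2011, framing_2014)
```

Running this gives t ≈ 3.6, consistent with the significance flag on this slide; the same sketch applies to the level counts in the other three competency tables.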

Framing the Research Question (Comparing Participants)
 Sophomores (mean = 2.39) outperformed freshmen (mean = 2.21) in 2014
 Sophomores in 2014 (mean = 2.39) outperformed sophomores in 2011 (mean = 2.17)
 EMCC students in 2014 (mean = 2.30) outperformed EMCC students in 2011 (mean = 2.15)

Accessing Resources (All Participants)
 Level 1 / Beginner (Uses a minimal number and/or types of sources to retrieve information): 34% (n=118) in 2011; 14% (n=67) in 2014
 Level 2 / Satisfactory (Uses various types of information sources: databases, books, newspapers, etc.): 38% (n=133) in 2011; 64% (n=310) in 2014
 Level 3 / Proficient (Uses a significant number of sources, including primary & secondary): 28% (n=95) in 2011; 22% (n=109) in 2014
 Total: Mean = 1.93 (n=346) in 2011; Mean = 2.09 (n=486) in 2014 *
* Statistically significant difference in means

Accessing Resources (Comparing Participants)
 Sophomores (mean = 2.17) outperformed freshmen (mean = 1.77) in 2014
 Freshmen in 2014 (mean = 2.01) outperformed freshmen in 2011 (mean = 1.88)
 EMCC students in 2014 (mean = 2.09) outperformed EMCC students in 2011 (mean = 1.93)

Evaluation of Information Resources (All Participants)
 Level 1 / Beginner (Uncertain as to whether the original information need has been satisfied): 34% (n=117) in 2011; 14% (n=70) in 2014
 Level 2 / Satisfactory (Information need appears to have been satisfied; uses various sources from differing viewpoints): 37% (n=129) in 2011; 70% (n=336) in 2014
 Level 3 / Proficient (Meets requirements of Level 2 & uses a variety of peer-reviewed sources): 29% (n=100) in 2011; 16% (n=77) in 2014
 Total: Mean = 1.95 (n=346) in 2011; Mean = 2.01 (n=483) in 2014 **
** Not a statistically significant difference in means

Evaluation of Information Resources (Comparing Participants)
 Sophomores (mean = 2.12) outperformed freshmen (mean = 1.92) in 2014
 EMCC students in 2014 (mean = 2.01) scored higher than EMCC students in 2011 (mean = 1.95), but not at a statistically significant level

Create Original Work (All Participants)
 Level 1 / Beginner (Uncertain if cited sources support the thesis or informational need of the original work): 19% (n=65) in 2011; 10% (n=47) in 2014
 Level 2 / Satisfactory (Cited sources seem to support the original work; investigates differing viewpoints): 55% (n=190) in 2011; 63% (n=307) in 2014
 Level 3 / Proficient (Meets requirements of Level 2, uses a formal citation format, & cites a variety of strong sources): 26% (n=91) in 2011; 27% (n=134) in 2014
 Total: Mean = 2.08 (n=346) in 2011; Mean = 2.18 (n=488) in 2014 *
* Statistically significant difference in means

Create Original Work (Comparing Participants)
 Sophomores (mean = 2.29) outperformed freshmen (mean = 2.07) in 2014
 Freshmen in 2014 (mean = 2.29) outperformed freshmen in 2011 (mean = 2.11)
 EMCC students in 2014 (mean = 2.18) outperformed EMCC students in 2011 (mean = 2.08)

Takeaways
 The number of faculty participants, sections, and students assessed all increased significantly from 2011 to 2014
 SAAC will encourage faculty to refocus on improving Accessing Resources and Evaluation of Information Resources
 2014 sophomores outperformed 2014 freshmen in every category
 2014 EMCC students outperformed 2011 EMCC students in three of four categories (and scored higher in the fourth, but not at a statistically significant level)