
1 Demonstrate your value: Choosing and using a blend of methods to assess the effectiveness of information literacy instruction
Carrie Leatherman, Natural Sciences Librarian, Western Michigan University
May 17, 2013

2 Overview
- Background
- The Basics
- Why assess instruction?
- Planning for assessment
- Assessment examples
- Exercise

3 Background
How I got interested in assessment:
- Provide information literacy instruction
- New emphasis on assessment at my institution
- Value of Academic Libraries report

4 The Basics: Assessment
“[S]ystematically examining patterns of student learning across courses and programs and using this information to improve educational practices.” (“What is the difference…,” [2013])

5 The Basics: Formative vs. Summative Assessment
“Formative assessment refers to the gathering of information or data about student learning during a course or program that is used to guide improvements in teaching and learning.” (“What is the difference…,” [2013])

6 The Basics: Formative vs. Summative Assessment
Summative assessment: “The gathering of information at the conclusion of a course, program, or undergraduate career to improve learning or to meet accountability demands.” (“What is the difference…,” [2013])

7 The Basics: Direct vs. Indirect Assessment
“Direct assessment is when measures of learning are based on student performance or demonstrates the learning itself.” (“Common assessment terms,” [2013])

8 The Basics: Direct vs. Indirect Assessment
“Indirect assessments use perceptions, reflections or secondary evidence to make inferences about student learning.” (“Common assessment terms,” [2013])

9 The Basics: Learning Outcomes
“Learning outcomes are statements of what students will learn in a class... The statements are focused on student learning (What will students learn today?) rather than instructor teaching (What am I going to teach today?). These statements should include a verb phrase and an impact ("in order to") phrase -- what students will do/be able to do and how they will apply that skill or knowledge.” (“Tips on Writing Learning Outcomes,” 2009)

10 The Basics: Learning Outcome Example
Students will be able to search a database using relevant search terms in order to retrieve articles that are on-target and topic-relevant.
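The two-part structure above (a verb phrase plus an "in order to" impact phrase) can be sketched as a small template. This is an illustrative helper, not from the presentation; the function name and parameters are hypothetical.

```python
# Hypothetical helper illustrating the two-part structure of a learning
# outcome: a verb phrase (what students will do) plus an "in order to"
# impact phrase (how they will apply that skill or knowledge).
def learning_outcome(verb_phrase, impact_phrase):
    return f"Students will be able to {verb_phrase} in order to {impact_phrase}."

# Reproduces the example from this slide.
print(learning_outcome(
    "search a database using relevant search terms",
    "retrieve articles that are on-target and topic-relevant",
))
```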

11 Why Assess Instruction?
- Evaluate the effectiveness of instructional practices
- Evaluate students' mastery of skills
- Prove the value of library programs
- Accountability
(“Assessment Issues,” 2013; Oakleaf, 2010)

12 Planning for Assessment
Ask yourself:
1. Am I ready to do assessment?
2. Why am I doing assessment?
3. What are stakeholders’ needs?
4. Will the assessment tell me what I want to know?
5. What are the costs of assessment?
6. What are the assessment’s institutional implications?
(Oakleaf and Kaske, 2009)

13 Planning for Assessment
Think about assessment approaches:
- Fixed-choice tests
- Performance assessments
- Rubrics
(Oakleaf, 2008)

14 Planning for Assessment: Fixed-Choice Tests
Indirect assessment; measures acquisition of facts.
Pros:
- Collect a lot of data quickly; easy to score
- Can use for pre- and post-instruction comparison
Cons:
- Don’t test higher-level thinking skills
- Difficult to construct and analyze
(Oakleaf, 2008)

15 Planning for Assessment: Performance Assessment
Direct assessment; measures real-life application of knowledge and skills.
Pros:
- Captures higher-order thinking skills
- Supports learning and assessment in authentic contexts
Cons:
- More difficult to create, administer, and score
(Oakleaf, 2008)

16 Planning for Assessment: Rubric
“A rubric is a scoring tool that explicitly represents the performance expectations for an assignment or piece of work. A rubric divides the assigned work into component parts and provides clear descriptions of the characteristics of the work associated with each component, at varying levels of mastery.” (“Common assessment terms,” [2013])

17 Planning for Assessment: Rubrics
Measure quality of student performance.
Pros:
- Means to create common instruction values
- Focus on higher-order thinking skills
- Produce data that is easy to understand, defend, and convey
Cons:
- Design flaws can affect data quality
- Time-intensive to develop, use, and train raters to use
(Oakleaf, 2008)

18 Example 1: One-Shot Instruction
Background:
- Intro biology class for biology majors
- Past classes used poor-quality sources for class project
Class project requirements:
- Group presentation about an area of molecular biology
- Required sources: an empirical research article, a review article, and three other sources

19 Example 1: One-Shot Instruction
- Collaborated with instructor and TAs
- Students’ information literacy skills needed to improve
- Formulated learning objectives
- One-shot instruction session

20 Example 1: One-Shot Instruction
Desired learning outcomes of instruction. Students will be able to:
- Define key terms, relevant to the research topic, from a detailed research question in order to use them as article database search terms.
- Identify components of an empirical research article in order to locate one relevant to the project.
- Identify components of a review article in order to locate one relevant to the project.

21 Example 1: One-Shot Instruction
Before instruction, I asked myself:
1. Am I ready to do assessment? Yes!
2. Why am I doing assessment? Increase student learning; accountability; improve instruction
3. What are stakeholders’ needs? Non-anecdotal; unobtrusive
4. Will the assessment tell me what I want to know? Baseline; new info; trustworthy; meets needs
5. What are the costs of assessment? Time; personnel costs
6. What are the institutional implications of the assessment? Information literacy goal of libraries, university; what if the instruction isn’t effective?

22 Example 1: One-Shot Instruction
Assessment methods selected:
- Pre- and post-instruction quizzes (fixed-choice)
- Open-ended questionnaire (performance; rubric)
- Bibliography analysis (performance; rubric)
All administered within the class e-learning shell.

23 Example 1: One-Shot Instruction
Pre- and post-instruction quizzes

24 Example 1: One-Shot Instruction
Open-ended questionnaire

25 Example 1: One-Shot Instruction
Bibliography analysis

26 Example 1: One-Shot Instruction
Rubric for open-ended questionnaire

27 Example 1: One-Shot Instruction
Rubric for bibliography analysis

28 Example 2: Online Tutorial
Background:
- Anecdotal evidence that the online information literacy tutorial was outdated
- Created a new version of the tutorial: updated content and interface, same concepts
Is the new tutorial more effective? Is it preferred by students?

29 Example 2: Online Tutorial
Desired learning outcomes of instruction. Students will be able to:
- Define an appropriate research question given assignment criteria in order to select a manageable focus within a general topic.
- Construct effective search queries in order to locate information resources for a given research question.
- Apply evaluation criteria to resources in order to identify the ones most appropriate to the research question.
- Recognize when it is necessary to provide a citation for information sources in order to avoid plagiarism.

30 Example 2: Online Tutorial
Before instruction, I asked myself:
1. Am I ready to do assessment? Yes!
2. Why am I doing assessment? Increase student learning; accountability; improve instruction
3. What are stakeholders’ needs? Non-anecdotal; time-consuming but compensated
4. Will the assessment tell me what I want to know? Baseline; new info; trustworthy; meets needs
5. What are the costs of assessment? Time; personnel costs
6. What are the institutional implications of the assessment? Information literacy goal of libraries, university; what if the instruction isn’t effective?

31 Example 2: Online Tutorial
Assessment methods:
- Quizzes (fixed-choice; direct; summative)
- Hypothetical research project (performance; direct; summative)
- Focus groups: satisfaction with the new tutorial (not learning outcomes-based assessment!)

32 Example 2: Online Tutorial
Quizzes

33 Example 2: Online Tutorial
Hypothetical research project

34 Example 2: Online Tutorial
Rubric for hypothetical research project

35 Miscellaneous
- Triangulating data via multiple assessment methods
- Statistical analysis increases validity
- Research that involves human subjects: what campus resources are available?
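The pre- and post-instruction quiz design in Example 1 lends itself to the statistical analysis mentioned above. A minimal sketch, using invented scores for illustration: a paired t statistic computed by hand checks whether the gain from pre to post is larger than chance variation would suggest (the data, scale, and class size here are all hypothetical).

```python
# Hypothetical pre- and post-instruction quiz scores (out of 10) for the
# same eight students. A paired comparison computes each student's gain,
# then tests whether the mean gain stands out against its variability.
from math import sqrt
from statistics import mean, stdev

pre  = [4, 5, 3, 6, 4, 5, 2, 5]   # before the one-shot session
post = [7, 6, 5, 8, 6, 7, 5, 6]   # same students afterward

diffs = [b - a for a, b in zip(pre, post)]     # per-student gain
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))  # paired t statistic

print(f"mean gain: {mean(diffs):.2f} points, paired t = {t_stat:.2f} (df = {n - 1})")
```

In practice a library such as SciPy (`scipy.stats.ttest_rel`) would also supply the p-value; the hand computation above just shows what the test measures.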

36 Lessons Learned
Assessment:
- Is increasingly important
- Takes planning
- Offers various methods and approaches
- Is ongoing
- Is worthwhile!

37 Exercise
Think: Pick an instructional activity you would like to assess. Ask yourself:
1. Am I ready to do assessment?
2. Why am I doing assessment?
3. What do stakeholders want to know?
4. Will the assessment tell us this?
5. What are the costs of assessment?
6. What are the assessment’s institutional implications?
Pair: With a partner’s help, choose an assessment method that would work best for your situation. Then help your partner choose an assessment method.
Share!

38 References
Association of College and Research Libraries. (2013). Assessment issues. Retrieved from http://www.ala.org/acrl/issues/infolit/resources/assess/issues
Carnegie Mellon Eberly Center for Teaching Excellence and Educational Innovation. [2013]. Common assessment terms. Retrieved from http://www.cmu.edu/teaching/assessment/basics/glossary.html
Carnegie Mellon Eberly Center for Teaching Excellence and Educational Innovation. [2013]. What is the difference between assessment and grading? Retrieved from http://www.cmu.edu/teaching/assessment/basics/grading-assessment.html
Connaway, L. S., & Powell, R. R. (2010). Basic research methods for librarians (5th ed.). Santa Barbara, CA: Libraries Unlimited.
Oakleaf, M. (2008). Dangers and opportunities: A conceptual map of information literacy assessment approaches. portal: Libraries & the Academy, 8(3), 233-253.
Oakleaf, M. (2009). Writing information literacy assessment plans: A guide to best practice. Communications in Information Literacy, 3(2), 80-89.
Oakleaf, M. (2010). Value of academic libraries: A comprehensive research review and report. Chicago: Association of College and Research Libraries.
Oakleaf, M., & Kaske, N. (2009). Guiding questions for assessing information literacy in higher education. portal: Libraries & the Academy, 9(2), 273-286.
University Library, University of Illinois at Urbana-Champaign. (2009). Tips on writing learning outcomes. Retrieved from http://www.library.illinois.edu/infolit/learningoutcomes.html

39 Questions?

40 Thank you!
Carrie Leatherman
carrie.leatherman@wmich.edu
http://scholarworks.wmich.edu/
