
Assessment of Information Skills

“There is more teaching going on around here than learning, and you ought to do something about that.”
Graduating Senior, King’s College, 1968

Assessment
A systematic, ongoing process of
- setting goals or asking questions,
- gathering information,
- interpreting it, and
- using it to improve student learning.
Barbara D. Wright

Why not use grades?
“A grade is an inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material.”
Bill Boyle & Tom Christie, Issues in Setting Standards, Falmer Press, 1996

Grades may reflect many things besides mastery of course objectives:
- Verbal ability
- Participation
- Cooperation
- Extra credit
- Attendance
- Effort
Criterion performance vs. value added
Myths regarding student evaluations

Pay me now or Pay me later

Align your Outcomes
- Institutional outcomes
- Program outcomes
- Course outcomes
- Class outcomes
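One way to make this alignment concrete is to record it as a simple curriculum map. Below is a minimal Python sketch, assuming hypothetical outcome names (none of them come from these slides), that traces each course-level outcome up through its program and institutional outcomes:

```python
# A minimal sketch of a curriculum map. All outcome names here are
# hypothetical placeholders, not taken from the original slides.

# Each lower-level outcome names the higher-level outcome it supports.
course_to_program = {
    "Cite sources in APA style": "Information literacy",
    "Evaluate the credibility of sources": "Information literacy",
    "Synthesize sources in a literature review": "Written communication",
}
program_to_institutional = {
    "Information literacy": "Critical thinking",
    "Written communication": "Effective communication",
}

# Check that every course outcome traces all the way up.
for course_oc, program_oc in course_to_program.items():
    institutional_oc = program_to_institutional.get(program_oc)
    assert institutional_oc is not None, f"{course_oc!r} is not aligned upward"
    print(f"{course_oc} -> {program_oc} -> {institutional_oc}")
```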

Dimensions of Assessment  Formative - Process Feedback for improvement Feedback for improvement Corrective & diagnostic Corrective & diagnostic  Summative – Evaluative Judgmental Judgmental Decisional Decisional

Dimensions of Assessment  An individual  A cohort  A class  A course  A program  An institution

Quantitative Data  Numerical  Statistically reliable  Structured techniques  Objective  Assumes static reality  Usually can be generalized  Allows for comparison and correlation  Looks for facts and causes

Qualitative Data  Not numerical  Provides “richness”  Allows for ambiguities  Structured or semi-structured  Written, verbal, visual, etc  Often subjective  Can be exploratory  Assumes dynamic reality  Looks for motivations and points of view

Assumptions vs. Assessment

Assessment  Enhance student learning and institutional effectiveness  Student learning rather than faculty evaluation  Assess the process, rather than the outcome, for the purpose of improving the outcome

Measure What Matters
- Measure what matters, NOT what is measurable
- Because what you measure becomes what you focus on

Assessment  Knowing what you are doing  Knowing why you are doing it  Knowing what students are learning as a result  Changing because of the information Debra Gilchrist

Indirect Evidence of Student Learning
- Surveys, self-reports & journals
- Focus groups & interviews
- Alumni & employer surveys
- Percent of students entering graduate or professional schools

Limitations of Indirect Measures
- Provide data on factors that predict or mediate learning
- DO NOT evaluate learning per se
- DO NOT NECESSARILY imply that value-added learning has or has not occurred (e.g., enthusiasm or lack of interest may be caused by factors not related to the course)
Oswald Ratteray

Direct Evidence of Student Learning  Student assignments  Standardized tests  Course embedded assessments  Portfolios of students’ work  Capstone experiences  Student performances & exhibits  Other observations of student behavior

Direct Evidence of Student Learning  Rubrics & exemplars  Concept maps  Juried/peer review of student projects  Performance on a case study or problem  Locally devised tests  Commercially produced tests & exams

Limitations of Direct Measures
- Indicate:
  - WHAT students learned
  - HOW MUCH they learned
  - What they DID NOT learn
- DO NOT indicate:
  - WHY students learned or did not learn
- DO NOT necessarily indicate:
  - Whether VALUE-ADDED learning occurred
Oswald Ratteray

Develop Rubrics to Assess Work:
- Levels of achievement
- Criteria that distinguish good work from poor work
- Descriptions of criteria at each level of achievement
Peggy Maki

Basic Rubric Grid Format

Title / Task Description
              | Scale Level 1 | Scale Level 2 | Scale Level 3
Dimension 1   |               |               |
Dimension 2   |               |               |
Dimension 3   |               |               |
Dimension 4   |               |               |
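A grid of this shape can also be captured directly as data. The following is a minimal Python sketch; the task, dimensions, and level descriptions are hypothetical examples rather than anything prescribed by these slides:

```python
# A minimal sketch of the rubric grid as data. The task, dimensions,
# and level descriptions below are hypothetical, not from the slides.
RUBRIC = {
    "task": "Annotated bibliography",
    "levels": ["1 - Beginning", "2 - Developing", "3 - Proficient"],
    "dimensions": {
        "Source quality":  ["Mostly popular sources", "Mix of sources", "Scholarly sources"],
        "Citation format": ["Frequent errors", "Minor errors", "Consistently correct"],
        "Evaluation":      ["Summary only", "Some critique", "Substantive critique"],
        "Relevance":       ["Off topic", "Loosely related", "Directly on topic"],
    },
}

def overall_score(ratings: dict) -> float:
    """Average the scale level (1-3) a rater assigned to each dimension."""
    return sum(ratings.values()) / len(ratings)

# One student's work, rated on each dimension:
print(overall_score({"Source quality": 3, "Citation format": 2,
                     "Evaluation": 2, "Relevance": 3}))  # -> 2.5
```

Keeping the level descriptions alongside the scores is what makes the result criterion-referenced: every rating points back to an explicit description of the work.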

Why use a Rubric?
- Examines complex work efficiently
- Clarifies faculty expectations
- Communicates expectations
- Improves students’ work
- Produces criterion-referenced rather than norm-referenced grades
- Facilitates course or program assessment

Examples

Portfolios  Teaching portfolio Documents teaching over time Documents teaching over time  Course portfolio Reflection & review of a single course Reflection & review of a single course  Student portfolio Highlights student learning Highlights student learning

Portfolio Elements  Course design, philosophy, rationale  Implementation  Results  Reflection

Portfolio Elements  Focus on the match between assessment & course goals  Use existing assessments & data, when possible  Include a variety of evidence types  Be purposeful when selecting evidence about learning

Assessment is not about us; it is about student learning.

Assessment is like scotch: it is an acquired taste.

Questions?

Seniors’ Information Skills by Standard: Mean Scores/Percentage Correct, Fall 2004*
[Table: N, composite scores, and mean scores per standard (Std 1 through Std 5) for groups A through G]
a,b: Within-column comparisons where the MEAN scores of group a are significantly (P<.05) higher than the MEAN scores of group b.
*Please note that in some cases the small group size and number of questions per standard may preclude more meaningful statistical comparisons.

Seniors’ Information Skills by Standard: Mean Scores/Percentage Correct, Fall 2004*
[Table: N, composite, Knowledge, and Application scores for senior groups A through G]
a,b: Within-column comparisons where the MEAN scores of group a are significantly (P<.05) higher than the MEAN scores of group b.
*Please note that in some cases the small group size and number of questions per standard may preclude more meaningful statistical comparisons.

Information Skills of Seniors & Graduate Students from Three Northeast Pennsylvania Colleges by Standard: Mean Scores/Percentage, Fall 2004*
[Table: N, composite scores, and mean scores per standard (Std 1 through Std 5) for College 1 (seniors), College 2 (graduate students), and College 3 (seniors)]
a,b: Within-column comparisons where the MEAN scores of group a are significantly (P<.05) higher than the MEAN scores of group b.
*Please note that in some cases the small group size and number of questions per standard may preclude more meaningful statistical comparisons.

Information Skills of Seniors & Graduate Students from Three Northeast Pennsylvania Colleges by Standard: Mean Scores/Percentage, Fall 2004*
[Table: N, composite, Knowledge, and Application scores for College 1 (seniors), College 2 (graduate students), and College 3 (seniors)]
a,b: Within-column comparisons where the MEAN scores of group a are significantly (P<.05) higher than the MEAN scores of group b.
*Please note that in some cases the small group size and number of questions per standard may preclude more meaningful statistical comparisons.
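The a/b superscripts in the tables above rest on pairwise comparisons of group means at the .05 level. As a rough illustration only, here is a minimal Python sketch of one such comparison using a Welch t-test on hypothetical score lists; the slides do not say which statistical test was actually used:

```python
# A minimal sketch of the kind of comparison behind the a/b footnotes:
# testing whether one group's mean score is significantly (P < .05)
# higher than another's. Scores are hypothetical placeholders, and the
# test choice (Welch t-test) is an assumption, not from the slides.
from scipy import stats

group_a = [62.5, 71.0, 58.3, 66.7, 70.8]  # hypothetical percent-correct scores
group_b = [50.0, 45.8, 54.2, 48.1, 52.9]

t_stat, p_two_sided = stats.ttest_ind(group_a, group_b, equal_var=False)
p_one_sided = p_two_sided / 2  # directional: is group a's mean higher?

if t_stat > 0 and p_one_sided < 0.05:
    print(f"Group a significantly higher (t = {t_stat:.2f}, p = {p_one_sided:.3f})")
else:
    print("No significant difference at the .05 level")
```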

Five Stages of Assessment (from Elisabeth Kübler-Ross)
1. Denial: “No, not me”
2. Anger or resentment: “Why me?”
3. Bargaining: “Yes, me, but…”
4. Depression: “Yes, me”
5. Acceptance