Taft College’s Fall Assessment Team November 7, 2008.

Presentation transcript:

Taft College’s Fall Assessment Team November 7, 2008

Carefully consider what you want to learn from your individual assessment report
Examine authentic, formative, summative, and embedded assessments
Discuss types of rubrics used to score and measure
Work toward finalizing assessment plans

What are you hoping to learn from your assessment project?
How will you measure it?
How do you know the assessment will measure what you hope it will?
How could the results be useful?

Janet Fulks defines authentic assessment this way: “Authentic assessment. Assessment that evaluates the student’s ability to use their knowledge and to perform tasks that approximate those found in the workplace or other venues outside of the classroom setting.” Please refer to page 2 of today’s handout.

It is an opportunity for students to perform the outcome in a real-world setting
You get a more accurate picture of students’ abilities
It is mandated by ACCJC (see page 4 of today’s packet)

Summative assessments measure learned knowledge, skills, and attitudes, and happen at the end of a course. Formative assessments let instructors know how to meet the needs of individual students within a course. When do you think it would be most useful to use formative assessment? Summative? How could either help us improve our students’ learning experience? Please refer to page 5 of today’s handout.

Instructors at many campuses have chosen to embed direct assessments into exams. Selected questions are designed to measure specific learning outcomes. This process involves careful discussion of what the question will measure and how. To be accurate, the process requires validation. Please refer to pages 6 through 11 of today’s handout.

An evaluation or scoring instrument
Can be used for grading and assessment
Explains expectations for student work
Can come in a variety of forms

Holistic rubrics reduce the entire assignment to one score or rank
Primary traits of the assignment are grouped together into one score
Are often used to evaluate standardized writing tests
Do not provide the same type of specific, useful data as other rubrics
Can be misleading if one score does not reflect all primary traits of the assignment
See page 3 of the Rubric Examples packet

Analytic rubrics allow you to indicate a specific range of achievement for each primary trait or criterion of an assignment. They can, for a given assignment, provide more specific feedback. They detail a range of expectations. They can create more specific, meaningful data in assessment than simply one score. Without careful explanation, they can seem overwhelming to students. See page 4 of Rubric Examples.

In the classroom, primary traits of a rubric can be connected to points or percentages of an assignment. This requires careful planning, since every primary trait in your rubric may have a different value in the assignment. It may provide students with a tangible motivator to focus on your grading criteria for a given category. It can also be frustrating if you discover the point or percentage values don’t match the grade you feel the work deserves.
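The arithmetic behind attaching points to rubric traits can be sketched in a few lines. This is a hypothetical illustration, not from the presentation: the trait names, point weights, and the 1–4 performance scale are all invented for the example.

```python
# Hypothetical example: converting analytic-rubric ratings into a
# point-based grade. Each primary trait carries its own share of the
# assignment's total points, as the slide describes.
weights = {"thesis": 30, "evidence": 40, "mechanics": 30}  # sums to 100

# One student's rating on each trait, on an assumed 1 (low) to 4 (high) scale.
ratings = {"thesis": 4, "evidence": 3, "mechanics": 2}

MAX_RATING = 4

def rubric_score(weights, ratings, max_rating=MAX_RATING):
    """Scale each trait's point value by the fraction of the rating earned."""
    return sum(weights[t] * ratings[t] / max_rating for t in weights)

print(rubric_score(weights, ratings))  # 30*1.0 + 40*0.75 + 30*0.5 = 75.0
```

Working the numbers by hand like this, before committing to a rubric, is one way to catch the mismatch the slide warns about, where the computed points don’t match the grade you feel the work deserves.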

Another approach is to use the rubric without attaching points to it. Expectations are still clearly defined. Students can still understand the basis of their grade. Many instructors find this easier than trying to match point or percentage values appropriately with primary traits.

Homegrown rubrics allow you to include any criteria for any assignment you want. They allow instructors great flexibility, but are incredibly time consuming to construct well. It is easy to leave an important criterion out of a rubric by mistake. It is easy to create a rubric that doesn’t actually match your grading practices. Sometimes you don’t know until you try; then it is too late.

Some rubrics have been validated and made reliable through faculty discussion, research, and trial and error. A rubric’s validity is how accurately it classifies student work; its reliability is how likely two different faculty using the same rubric are to reach the same results. This takes a lot of work, but much of it has already been done if we know where to look.
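One simple check of inter-rater reliability is percent agreement: how often two faculty assign the same rubric level to the same piece of student work. The sketch below is illustrative only; the scores are made up, and real validation studies typically use stronger statistics (such as Cohen’s kappa) that correct for chance agreement.

```python
# Illustrative sketch: percent agreement between two hypothetical raters
# scoring the same eight student papers on a shared rubric (1-4 scale).
rater_a = [4, 3, 3, 2, 4, 1, 3, 2]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3]

def percent_agreement(a, b):
    """Fraction of papers on which both raters chose the same level."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

print(percent_agreement(rater_a, rater_b))  # 6 of 8 papers match -> 0.75
```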

Think of the criteria you will use to assess an SLO or grade an assignment as your primary traits. They can become categories in an analytic rubric. Creating a performance range for each category allows you to rank performance of the specified tasks. Go to Janet Fulks’ training manual: sment/Section_4_Assessment_Tools/Section4_6 PTA.htm

Consider what you want to measure, and how. Consider the type of data that you want your assessment to produce. An analytic rubric will yield more ranges of data, thereby illuminating key areas of success and weakness in performance. Consider how you will track the data during your assessment. Keep the student in mind. Refer to Allen’s “A Rubric for Rubrics,” page 60 of Rubric Examples.
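Tracking analytic-rubric data across a class can be as simple as averaging each criterion separately, which is what surfaces the areas of success and weakness mentioned above. The criterion names and scores below are invented for illustration.

```python
# Hypothetical sketch: averaging analytic-rubric results per criterion
# across a class to see which areas students struggle with.
from statistics import mean

# Each row is one student's ratings on three criteria (assumed 1-4 scale).
results = [
    {"organization": 4, "evidence": 2, "mechanics": 3},
    {"organization": 3, "evidence": 2, "mechanics": 4},
    {"organization": 4, "evidence": 3, "mechanics": 3},
]

# A low class average on one criterion flags it for instructional attention.
for criterion in results[0]:
    avg = mean(r[criterion] for r in results)
    print(f"{criterion}: {avg:.2f}")
```

A holistic rubric, by contrast, would yield only one column of numbers, which is why the slide favors analytic rubrics when you want data you can act on.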

Janet Fulks’ Training Manual: ction_4_Assessment_Tools/Section4_7rubrics.htm
RubiStar (rubric-building web page):
Assessment Workshop Materials, Mary Allen: WorkshopHandoutJan06.pdf

Consider how best to assess student learning in your course so that you get meaningful data to interpret for improvement. Solidify your assessment plan. Communicate the status of your assessment plan to the SLO coordinator.