
MODULE 3: The Backward Design

Learning Objectives
- What is the purpose of doing an assessment?
- How do we determine what kind of evidence to look for?
- What kinds of methods can be used, and when?
- How do we create assessment tasks and evaluation criteria?
- How do we make sure the assessment is valid and reliable?

Why assess? The purpose of assessment is to measure understanding, not to generate grades!
Assessment provides the professor with:
- Reliable information from which to draw inferences about student learning
- Feedback to improve their teaching methods
Assessment provides students with:
- Feedback on how well they understand the content
- Feedback to improve their learning

How to create assessments? Five steps:
1. Assessment Objectives
2. Evidence of Learning
3. Assessment
4. Evaluation Criteria
5. Validity and Reliability

Step 1: Assessment Objectives. Prioritize what will be assessed into three levels:
- Worth being familiar with (superficial knowledge)
- Important to know and do (core concepts)
- Big Ideas

Step 2: Evidence of Learning. EVIDENCE refers to something that can be DEMONSTRATED! Each priority level calls for different evidence:
- Worth being familiar with (superficial knowledge): knowing concepts and definitions; micro, descriptive
- Important to know and do (core concepts): related to the ABILITY to do something, such as applying a specified framework to contexts approached in class; micro, domain-specific
- Big Ideas: the ability to transfer knowledge to different contexts; macro, across domains, multi-disciplinary

Evidence of learning, ordered by increasing cognitive complexity (Bloom, 2001):
- Recall definitions
- Summarize ideas, explain concepts
- Apply concepts to situations similar to the ones approached in class
- Break concepts into parts and understand their relationships
- Apply concepts to situations different from the ones approached in class; create new applications or interpretations of the concepts
- Judge the results of applying concepts and make decisions about the quality of the application

Step 3: Assessment. This step defines the assessment tasks by answering two questions: when to assess, and which method to use?

When to assess? Think "photo album" versus "snapshot": a single summative assessment is a snapshot, while formative plus summative assessment builds a photo album of student learning.

Formative and Summative Assessment:
- Formative: the objective is to give feedback to students and build learning; students can adjust.
- Summative: more focused on the grade; given at the end of the grading period, when there is no opportunity to adjust and show improvement.
Both are necessary, at least 50% of each! The combination of both leads to a good result.

Continuous assessment: different moments and different methods!

Assessment Tasks. Match the task to the priority level: as content moves from superficial knowledge (worth being familiar with) through core concepts toward Big Ideas, tasks move from traditional quizzes and tests (paper-and-pencil, multiple-choice, constructed response) toward performance tasks and projects (complex, open-ended, authentic). (Adapted from Understanding by Design, Wiggins and McTighe)

Assessment tasks by increasing complexity (paralleling Bloom, 2001):
- Quizzes and traditional tests: ask about definitions
- Open-ended questions
- Simple performance tasks: straightforward application, exercises
- Analytical tasks: experiments, scenarios, simulations, cases
- Complex performance tasks (authentic tasks): application to new contexts and situations; creating an artifact or project
- Result-of-analysis tasks: reaching a decision by weighing pros vs. cons or costs vs. benefits; reflection

An Authentic Task (from Understanding by Design, Wiggins and McTighe):
- Is realistic and contextualized; reflects possible real-world challenges
- Replicates key challenging real-life situations
- Requires judgment and innovation
- Asks students to "do" the subject
- Assesses students' ability to integrate concepts and ideas
- Gives the opportunity to practice and get feedback
It is a performance-based assessment. It is problem-based, NOT an exercise!

Authentic Task vs. Exercise (from Understanding by Design, Wiggins and McTighe):
- Exercise: accuracy is what matters; there is a right approach and a right solution and answer; typically done in class, as formative assessment.
- Authentic Task: the question is "noisy" and complicated; various approaches can be used; it requires integration of concepts and skills; an appropriate solution backed by sound arguments is what matters; typically done out of class, as summative assessment.

How to formulate an Authentic Task? Use the GRASPS template (from Understanding by Design, Wiggins and McTighe), illustrated in the sketch after this list:
- Goal: What is the goal of the task? What is the problem that has to be solved?
- Role: What is the student's role? What will students be asked to do?
- Audience: Who is the audience? Who is the client? Whom do students need to convince?
- Situation: What is the situation or context? What are the challenges involved?
- Performance: What product or performance will students create?
- Standards: Against which standards or criteria will the work be judged?
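To make the template concrete, here is a GRASPS task captured as data. This is only an illustration: the scenario (a water-quality consulting brief) and every field value are invented, not taken from the original module.

```python
# Hypothetical GRASPS task definition; the scenario is invented purely
# to illustrate the template.
grasps_task = {
    "Goal": "Recommend whether the town should upgrade its water treatment plant.",
    "Role": "Environmental consultant hired by the town council.",
    "Audience": "The town council, the client who must be convinced.",
    "Situation": "The budget is limited and residents are concerned about safety.",
    "Performance": "A written report plus a short oral briefing.",
    "Standards": "Accuracy of the analysis and clarity of the argument.",
}

for element, answer in grasps_task.items():
    print(f"{element:<12} {answer}")
```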

Step 4: Evaluation Criteria. The criteria must:
- Provide feedback for students
- Be clear and communicated in advance
- Consist of independent variables
- Focus on the central cause of performance
- Focus on the understanding and use of the Big Idea

Types of evaluation criteria: check lists and rubrics.

Check List. There are two types of check lists:
1. A list of questions and their correct answers
2. A list of individual traits with the maximum points associated with each of them

Check List: Questions and Answers. This type is used for multiple-choice, true/false, and similar formats; in other words, wherever there is a correct answer. Example answer key: 1. A, 2. C, 3. D, 4. B, 5. B, 6. D. A scoring sketch follows.
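As a minimal sketch, scoring against this kind of check list can be automated. The answer key is the example from the slide; the student responses and the function name are hypothetical.

```python
# Score an objective test against a question-and-answer check list.
# The key matches the slide's example; the student responses are made up.
ANSWER_KEY = {1: "A", 2: "C", 3: "D", 4: "B", 5: "B", 6: "D"}

def score_test(responses: dict[int, str]) -> float:
    """Return the fraction of questions answered correctly."""
    correct = sum(
        1 for question, answer in ANSWER_KEY.items()
        if responses.get(question, "").strip().upper() == answer
    )
    return correct / len(ANSWER_KEY)

student = {1: "A", 2: "C", 3: "B", 4: "B", 5: "B", 6: "D"}
print(f"{score_test(student):.0%}")  # 83% (5 of 6 correct)
```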

Check List: Traits and Their Value. Performance is decomposed into traits (Trait 1, Trait 2, ...), each with a weight (%) or a number of points. Grade = weighted average of the trait scores, or Grade = sum of points. See the sketch below.
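A minimal sketch of the weighted-average variant, assuming hypothetical trait names and weights, with each trait scored on a 0-100 scale:

```python
# Grade = weighted average of trait scores. Trait names and weights are
# hypothetical; the weights must sum to 1.
TRAIT_WEIGHTS = {"ideas": 0.5, "organization": 0.3, "grammar": 0.2}

def weighted_grade(scores: dict[str, float]) -> float:
    """Weighted average of per-trait scores (each on a 0-100 scale)."""
    assert abs(sum(TRAIT_WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weight * scores[trait] for trait, weight in TRAIT_WEIGHTS.items())

print(weighted_grade({"ideas": 90, "organization": 80, "grammar": 70}))  # 83.0
```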

An Analytic Rubric is better. It:
- Provides more detailed feedback for students
- Provides students with information about how they will be evaluated
- Is clearer
- Evaluates independently each characteristic that composes the performance
On the other hand, a Holistic Rubric is used when only an overall impression is required.

Analytic Rubrics: how to create them?

How to create Analytic Rubrics? A rubric is a matrix of traits (rows) by levels of achievement (columns). Example: a simple rubric to evaluate an essay, with the traits Ideas, Organization, and Grammar, each rated Excellent, Satisfactory, or Poor. One way to represent such a rubric is sketched below.
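A sketch of the essay rubric as a data structure: the traits and levels come from the slide, while the level descriptors and point values are hypothetical placeholders for the detailed descriptions a real rubric would carry.

```python
# Analytic rubric = traits (rows) x levels of achievement (columns).
# Descriptors and point values below are hypothetical placeholders.
LEVEL_POINTS = {"Excellent": 3, "Satisfactory": 2, "Poor": 1}

ESSAY_RUBRIC = {
    "Ideas": {
        "Excellent": "Original, insightful ideas, fully supported.",
        "Satisfactory": "Relevant ideas with partial support.",
        "Poor": "Ideas unclear or unsupported.",
    },
    "Organization": {
        "Excellent": "Logical flow with clear transitions.",
        "Satisfactory": "Mostly logical; some abrupt transitions.",
        "Poor": "No discernible structure.",
    },
    "Grammar": {
        "Excellent": "Virtually error-free.",
        "Satisfactory": "Minor errors that do not impede reading.",
        "Poor": "Frequent errors that impede reading.",
    },
}

def rubric_total(ratings: dict[str, str]) -> int:
    """Sum the points of the level chosen for each trait."""
    return sum(LEVEL_POINTS[level] for level in ratings.values())

ratings = {"Ideas": "Excellent", "Organization": "Satisfactory", "Grammar": "Poor"}
print(rubric_total(ratings))  # 6 out of a maximum of 9
```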

An analytic rubric can be created from a check list! Start from the trait-based check list (Trait 1, Trait 2, ..., each with a weight in % or points); the difference is that each trait is broken down into levels of achievement (e.g., Excellent, Acceptable, Unacceptable), each with a detailed description. A grading sketch that combines trait weights with achievement levels follows.
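A sketch of the conversion, reusing the same hypothetical weights as above: each trait keeps its check-list weight, but its score now comes from the achievement level the grader selects. The numeric value of each level is also a hypothetical choice.

```python
# Check list -> analytic rubric: trait weights stay, but each trait is
# scored by an achievement level. Weights and level values are hypothetical.
TRAIT_WEIGHTS = {"ideas": 0.5, "organization": 0.3, "grammar": 0.2}
LEVEL_VALUES = {"Excellent": 1.0, "Acceptable": 0.7, "Unacceptable": 0.0}

def rubric_grade(levels: dict[str, str], scale: float = 100.0) -> float:
    """Weighted average over traits, each scored by its chosen level."""
    return scale * sum(
        weight * LEVEL_VALUES[levels[trait]]
        for trait, weight in TRAIT_WEIGHTS.items()
    )

print(rubric_grade({"ideas": "Excellent", "organization": "Acceptable",
                    "grammar": "Unacceptable"}))  # 71.0
```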

How to define traits? They can be defined based on experience or on historical data:
1. Get samples of students' previous work
2. Classify the samples into different levels (strong, middle, poor, ...) and write down the reasons
3. Cluster the reasons into traits
4. Write down the definition of each trait
5. Select among the samples the ones that illustrate each trait
6. Continuously refine the traits' definitions
Traits can also be defined based on specific objectives and learning questions. (From Understanding by Design, Wiggins and McTighe)

How to build an Analytic Rubric? The following website is a free tool that helps create rubrics:

Step 5: Validity and Reliability

Picture a target: the target is the desired understandings (the objectives); the shots are the assessment outcomes.

Checking for Validity. Self-assess the assessment tasks by asking yourself the following questions (for both, consider the task characteristics and the rubrics used for evaluation):
- Could a student do well on the assessment task without really demonstrating the understandings you are after?
- Could a student do poorly yet still have significant understanding of the ideas, and be able to show that understanding in other ways?
If the answer is yes, the assessment is not valid: it does not provide good evidence for making any inference. (Adapted from Understanding by Design, Wiggins and McTighe)

Checking for Validity (continued). The previous questions can be broken down into more detailed ones. How likely is it that a student could do well on the assessment by:
- Making clever guesses based on limited understanding?
- Plugging in what was learned, with accurate recall but limited understanding?
- Making a good effort, with a lot of hard work, but with limited understanding?
- Producing lovely products and performances, but with limited understanding?
- Applying a natural ability to be articulate and intelligent, but with limited understanding?
(From Understanding by Design, Wiggins and McTighe)

And how likely is it that a student could do poorly on the assessment by:
- Failing to meet performance goals despite having a deep understanding of the Big Ideas?
- Failing to meet the grading criteria despite having a deep understanding of the Big Ideas?
Make sure all the answers are "very unlikely"! (From Understanding by Design, Wiggins and McTighe)

Checking for Reliability. Assess rubric reliability by asking:
- Would different professors grade the same exam similarly?
- Would the same professor give the same grade when grading the test twice, at different moments?
Assess task reliability by asking:
- If a student did well (or poorly) on one exam, would they do well (or poorly) on a similar exam?
Task reliability can be achieved by applying continuous assessment. (From Understanding by Design, Wiggins and McTighe) A sketch for quantifying the rubric side follows.
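The slide poses rubric reliability as questions; one common way to quantify it is inter-rater agreement between two graders. A minimal sketch using two standard statistics, percent agreement and Cohen's kappa, neither of which is prescribed by the original module; the ratings below are hypothetical.

```python
from collections import Counter

def percent_agreement(a: list[str], b: list[str]) -> float:
    """Fraction of items on which two graders chose the same level."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a: list[str], b: list[str]) -> float:
    """Agreement corrected for the agreement expected by chance."""
    n = len(a)
    observed = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    chance = sum(counts_a[level] * counts_b[level] for level in counts_a) / n**2
    return (observed - chance) / (1 - chance)

# Hypothetical levels assigned by two professors to the same five essays.
grader_1 = ["Excellent", "Poor", "Satisfactory", "Excellent", "Poor"]
grader_2 = ["Excellent", "Satisfactory", "Satisfactory", "Excellent", "Poor"]
print(percent_agreement(grader_1, grader_2))       # 0.8
print(round(cohens_kappa(grader_1, grader_2), 2))  # 0.71
```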

Summary: learning objectives determine the evidence of learning, which must be observable and demonstrable.

Summary: over time, evidence of learning is gathered through formative and then summative assessment tasks. Task complexity depends on the desired level of understanding. Use clear evaluation criteria (rubrics). Task and criteria must provide accurate and consistent judgments.

Learning Objectives
- What is the purpose of doing an assessment?
- How do we determine what kind of evidence to look for?
- What kinds of methods can be used, and when?
- How do we create assessment tasks and evaluation criteria?
- How do we make sure the assessment is valid and reliable?

References. The main source of information used in this module is the following book: Wiggins, Grant, and McTighe, Jay. Understanding by Design. 2nd Edition. ASCD, Virginia, 2005.