Presentation transcript: MODULE 3, The Backward Design

1 MODULE 3 1st 2nd 3rd

2 The Backward Design

3 Learning Objectives
- What is the purpose of doing an assessment?
- How do we determine what kind of evidence to look for?
- What kinds of methods can be used? When?
- How do we create assessment tasks and evaluation criteria?
- How do we make sure the assessment is valid and reliable?

4 Why assess? The purpose of assessment is to measure understanding, not to generate grades!
Assessment provides the professor with:
- Reliable information from which to draw inferences about student learning
- Feedback to improve their teaching methods
Assessment provides students with:
- Feedback on how well they understand the content
- Feedback to improve their learning

5 How to create assessments? Five steps:
1. Assessment objectives
2. Evidence of learning
3. Assessment
4. Evaluation criteria
5. Validity and reliability

6 Assessment Objectives (1)
Prioritize what the assessment targets:
- Superficial knowledge: worth being familiar with
- Core concepts: important to know and do
- Big Ideas

7 Evidence of Learning (2)
EVIDENCE refers to something that can be DEMONSTRATED; it is related to the ABILITY to do something.
- Superficial knowledge (worth being familiar with): knowing concepts and definitions (micro, descriptive)
- Core concepts (important to know and do): ability to apply a specified framework to contexts approached in class (micro, domain-specific)
- Big Ideas: ability to transfer knowledge to different contexts (macro, across domains, multi-disciplinary)

8 Evidence of Learning (2)
Levels of cognitive demand (Bloom, 2001):
- Recall definitions
- Summarize ideas, explain concepts
- Apply concepts to situations similar to the ones approached in class
- Break concepts into parts and understand their relationships
- Apply concepts to situations different from the ones approached in class
- Create new applications or interpretations of the concepts
- Judge the results of applying the concepts and decide on the quality of the application

9 Assessment (3)
- When to assess?
- Which method?
- Assessment tasks

10 When to assess? (3)
Snapshot vs. photo album: summative assessment alone gives a snapshot; formative plus summative assessment gives a photo album.

11 Formative and Summative Assessment (3)
Formative:
- The objective is to give feedback to students
- Builds learning
- Students can adjust
Summative:
- More focused on the grade
- Given at the end of the grading period, so there is no opportunity to adjust and show improvement
Both are necessary, at least 50% of each; combining them leads to a good result.

12 Continuous Assessment (3): different moments and different methods!

13 Assessment Tasks (3)
Match the task to the targeted level of understanding:
- Superficial knowledge (worth being familiar with): traditional quizzes and tests (paper-and-pencil, multiple-choice, constructed response)
- Core concepts (important to know and do)
- Big Ideas: performance tasks and projects (complex, open-ended, authentic)
Adapted from "Understanding by Design", Wiggins and McTighe

14 Assessment Tasks (3)
From lower to higher cognitive demand (Bloom, 2001):
- Quizzes and traditional tests: ask about definitions
- Open-ended questions
- Simple performance tasks: straightforward applications, exercises
- Analytical tasks: experiments, scenarios, simulations, cases
- Complex performance tasks (authentic tasks): application to new contexts and situations, creating an artifact or project
- Results of analysis and decisions: pros vs. cons, costs vs. benefits, reflection

15 Authentic Task (3)
An authentic task is a performance-based assessment. It is problem-based, NOT an exercise! It:
- Is realistic and contextualized
- Replicates key challenging real-life situations
- Reflects possible real-world challenges
- Requires judgment and innovation
- Asks students to "do" the subject
- Assesses students' ability to integrate concepts and ideas
- Gives the opportunity to practice and get feedback
From "Understanding by Design", Wiggins and McTighe

16 Authentic Task vs. Exercise (3)
Authentic task (typically out of class, summative):
- The question is "noisy" and complicated
- Various approaches can be used
- Requires integration of concepts and skills
- What matters is an appropriate solution and the arguments behind it
Exercise (typically in class, formative):
- What matters is accuracy
- There is a right approach and a right solution and answer
From "Understanding by Design", Wiggins and McTighe

17 How to formulate an Authentic Task? (3)
Use the GRASPS elements:
- Goal: What is the goal of the task? What is the problem that has to be solved?
- Role: What is the student's role? What will students be asked to do?
- Audience: Who is the audience? Who is the client? Whom do students need to convince?
- Situation: What is the situation or context? What are the challenges involved?
- Performance and Standards
From "Understanding by Design", Wiggins and McTighe

18 Evaluation Criteria (4)
Evaluation criteria must:
- Provide feedback for students
- Be clear
- Be communicated in advance
- Consist of independent variables
- Focus on the central cause of performance
- Focus on the understanding and use of the Big Idea

19 Types of Evaluation Criteria (4): check lists and rubrics

20 Check List (4)
There are two types of check lists:
1. A list of questions and their correct answers
2. A list of individual traits with the maximum points associated with each of them

21 Check List: Questions and Answers (4)
This type is used for multiple-choice, true/false, and similar items, in other words, wherever there is a single correct answer. Example answer key: 1. A, 2. C, 3. D, 4. B, 5. B, 6. D.
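A check list of this kind can be applied mechanically. The following is a minimal Python sketch added here for illustration only; the answer key reuses the example values above, and the student responses are hypothetical.

```python
# Hypothetical answer-key check list for a 6-item multiple-choice quiz
# (values taken from the example key on the slide).
ANSWER_KEY = {1: "A", 2: "C", 3: "D", 4: "B", 5: "B", 6: "D"}

def score_quiz(responses: dict[int, str]) -> float:
    """Return the fraction of items answered correctly."""
    correct = sum(
        1 for item, answer in ANSWER_KEY.items()
        if responses.get(item, "").upper() == answer
    )
    return correct / len(ANSWER_KEY)

# Example: a student who misses items 3 and 5 scores 4 out of 6.
student = {1: "A", 2: "C", 3: "B", 4: "B", 5: "A", 6: "D"}
print(f"Score: {score_quiz(student):.0%}")  # Score: 67%
```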

22 Check List: Traits and Their Values (4)
Performance is broken into traits (Trait 1, Trait 2, ...), each with a weight (%) or a number of points.
Grade = weighted average of the trait scores, or Grade = sum of the points earned.
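To make the arithmetic concrete, here is a small illustrative sketch in Python; the trait names, weights, and scores are invented examples, not values from the slides.

```python
# Hypothetical traits-and-values check list: each trait has a weight (%)
# and the score awarded to the student on a 0-10 scale.
checklist = {
    "Content":      {"weight": 0.5, "score": 8.0},
    "Organization": {"weight": 0.3, "score": 6.0},
    "Grammar":      {"weight": 0.2, "score": 9.0},
}

# Grade as a weighted average of the trait scores.
weighted_grade = sum(t["weight"] * t["score"] for t in checklist.values())

# Alternatively, grade as a plain sum of the points earned per trait.
points_grade = sum(t["score"] for t in checklist.values())

print(f"Weighted average grade: {weighted_grade:.1f}")  # 7.6
print(f"Sum of points: {points_grade:.1f}")             # 23.0
```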

23 An Analytic Rubric Is Better (4)
An analytic rubric:
- Provides more detailed feedback for students
- Provides students with information about how they will be evaluated
- Is clearer
- Evaluates each characteristic that composes performance independently
On the other hand, a holistic rubric is used when only an overall impression is required.

24 Analytic Rubrics (4): how to create them?

25 How to create Analytic Rubrics? (4)
Example: a simple rubric to evaluate an essay. The traits are Ideas, Organization, and Grammar; the levels of achievement are Excellent, Satisfactory, and Poor.

26 An analytic rubric can be created from a check list! (4)
Start from the check list of traits (Trait 1, Trait 2, ...) with their weights (%) or points. The difference is that each trait is broken down into levels of achievement (Excellent, Acceptable, Unacceptable), each with a detailed description.
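As a rough illustration of this structure (the trait names, weights, point values, and level descriptions below are invented placeholders, not from the slides), an analytic rubric could be represented as traits mapped to described levels of achievement, with the grade computed from the level awarded for each trait:

```python
# Hypothetical analytic rubric: each trait keeps its weight from the
# check list, and each level of achievement gets points and a short
# placeholder description.
LEVEL_POINTS = {"Excellent": 10, "Acceptable": 6, "Unacceptable": 2}

rubric = {
    "Ideas":        {"weight": 0.4, "Excellent": "Original, well supported",
                     "Acceptable": "Relevant, partly supported",
                     "Unacceptable": "Missing or unsupported"},
    "Organization": {"weight": 0.3, "Excellent": "Clear, logical flow",
                     "Acceptable": "Mostly ordered",
                     "Unacceptable": "No discernible structure"},
    "Grammar":      {"weight": 0.3, "Excellent": "Error-free",
                     "Acceptable": "Minor errors",
                     "Unacceptable": "Errors obscure meaning"},
}

def grade(awarded_levels: dict[str, str]) -> float:
    """Weighted average of the points for the level awarded on each trait."""
    return sum(
        rubric[trait]["weight"] * LEVEL_POINTS[level]
        for trait, level in awarded_levels.items()
    )

# Example: Excellent ideas, Acceptable organization and grammar -> 7.6
print(grade({"Ideas": "Excellent",
             "Organization": "Acceptable",
             "Grammar": "Acceptable"}))
```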

27 How to define traits? (4)
Traits can be defined based on experience or on historical data:
1. Get samples of students' previous work
2. Classify the samples into different levels (strong, middle, poor, ...) and write down the reasons
3. Cluster the reasons into traits
4. Write down the definition of each trait
5. Select, among the samples, the ones that illustrate each trait
6. Continuously refine the traits' definitions
Traits can also be defined based on specific objectives and learning questions.
From "Understanding by Design", Wiggins and McTighe

28 How to build an Analytic Rubric? (4)
The following website is a free tool that helps create rubrics: http://rubistar.4teachers.org/index.php

29 Validity and Reliability (5)

30 Validity and Reliability (5)
(Target diagram: http://ccnmtl.columbia.edu/projects/qmss/images/target.gif) The target represents the desired understandings / objectives; the shots represent the assessment outcomes.

31 Checking for Validity (5)
Self-assess the assessment tasks by asking yourself the following questions:
- Is it possible for a student to do well on the assessment task but not really demonstrate the understandings you are after?
- Is it possible for a student to do poorly but still have significant understanding of the ideas? Would this student be able to show their understanding in other ways?
If yes, the assessment is not valid: it does not provide good evidence for making any inference. (Note: for both questions, consider the task characteristics and the rubrics used for evaluation.)
Adapted from "Understanding by Design", Wiggins and McTighe

32 Checking for Validity (5)
The previous questions can be broken down into more detailed ones. How likely is it that a student could do well on the assessment by:
- Making clever guesses based on limited understanding?
- Plugging in what was learned, with accurate recall but limited understanding?
- Making a good effort, with a lot of hard work, but with limited understanding?
- Producing lovely products and performances, but with limited understanding?
- Applying a natural ability to be articulate and intelligent, but with limited understanding?
(continued on the next slide)
From "Understanding by Design", Wiggins and McTighe

33 Checking for Validity (5)
How likely is it that a student could do poorly on the assessment by:
- Failing to meet performance goals despite having a deep understanding of the Big Ideas?
- Failing to meet the grading criteria despite having a deep understanding of the Big Ideas?
Make sure the answer to every question is "very unlikely"!
From "Understanding by Design", Wiggins and McTighe

34 Checking for Reliability (5)
Assess rubric reliability by asking:
- Would different professors grade the same exam similarly?
- Would the same professor give the same grade if they graded the test twice, at different moments?
Assess task reliability by asking:
- If a student did well (or poorly) on one exam, would they do well (or poorly) on a similar exam?
Task reliability can be achieved by applying continuous assessment.
From "Understanding by Design", Wiggins and McTighe
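One informal way to explore the first rubric-reliability question is to compare grades given independently by two professors to the same set of exams. The slides do not prescribe any particular statistic, so the sketch below simply reports the mean absolute difference and the share of exams where the two grades fall within one point of each other; all numbers are hypothetical.

```python
# Minimal sketch: compare two professors' independent grades for the
# same exams, using the mean absolute difference and the share of
# exams on which they agree within one point (illustrative threshold).
def compare_graders(grades_a: list[float], grades_b: list[float]) -> None:
    assert len(grades_a) == len(grades_b), "need grades for the same exams"
    diffs = [abs(a - b) for a, b in zip(grades_a, grades_b)]
    mean_abs_diff = sum(diffs) / len(diffs)
    close = sum(d <= 1.0 for d in diffs) / len(diffs)
    print(f"Mean absolute difference: {mean_abs_diff:.2f}")
    print(f"Agreement within 1 point: {close:.0%}")

# Hypothetical grades (0-10 scale) from two professors for five exams.
compare_graders([8, 6, 9, 5, 7], [7.5, 6, 8, 5, 9])
```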

35 Summary
Learning objectives lead to evidence of learning, which must be observable and demonstrable.

36 Summary
Evidence of learning is collected over time through formative and summative assessment tasks. Task complexity depends on the desired level of understanding. Use clear evaluation criteria (rubrics); tasks and criteria must provide accurate and consistent judgments.

37 Learning Objectives
- What is the purpose of doing an assessment?
- How do we determine what kind of evidence to look for?
- What kinds of methods can be used? When?
- How do we create assessment tasks and evaluation criteria?
- How do we make sure the assessment is valid and reliable?

38 References
The main source of information used in this module is the following book:
Wiggins, Grant, and McTighe, Jay. Understanding by Design. 2nd Edition. ASCD, Virginia, 2005.
Rubrics: http://rubistar.4teachers.org/index.php

