Designing Rubrics with the Three Categories of Knowledge


1 Designing Rubrics with the Three Categories of Knowledge


3 Types of Rubrics
Holistic: provides an overall impression of the elements of quality and performance levels in a student’s work.
Analytic: indicates a student’s performance on two or more separate elements of quality (i.e., a separate rubric is designed for each criterion).

4 Effective Rubrics Include:
Anchors – purpose?
Specific criteria for teachers
Specific criteria for students
Consider novice vs. expert understanding (pp. 167 & 172)
Rubrics are not designed for skill development but for judging performance and insight related to understanding of ideas and meaning (pp. 168–169)

5 Characteristics of Effective Rubrics
Relate specific task requirements to more general performance goals
Discriminate among different degrees of understanding or proficiency (p. 170)
Do not combine independent criteria in one rubric
Are based on analysis of many specific work samples, using the widest possible range

6 Effective Rubrics cont’d
Rely on descriptive language rather than relying heavily on mere comparatives or value language (p. 171)
Lowest scores describe how novice or ineffective performance appears – they are constructively critical
Highlight judging the performance’s impact rather than the processes, content used, or good-faith effort; ultimately, the performance is about results
Complete numbers 4 and 5 on the worksheet

7 Misconceptions Regarding Rubrics
Validity applies to rubrics, not just to assessment tasks; take care to focus on the most appropriate criteria.
Focus on the purpose of the work – not the content and polish of the work.
Rubrics can only be developed after a specific task has been designed – NOT!
Rubrics are finalized prior to using them – NOT!
Scoring what is easiest to score, not what is essential.
More is not always better.

8 Validity
Valid = questioning the appropriateness of the evidence.
“As educators, we want to make sure that the specific answers or performance is logically connected to the more general understanding we seek to assess.”
Validity also involves the degree to which we can confidently generalize about what a student knows or can do, using a sample of evidence as a basis.

9 Validity Continued
Critically examine a performance task by asking the following two questions:
Could a student successfully accomplish the task but lack the desired understanding?
Could a student perform the task poorly but still possess understanding?
See example, p. 256.
By changing the standard or adjusting the assessment, the evidence can be made more or less appropriate.

10 Wrapping It Up
Helpful websites: multimedia rubrics
What are you wondering?
Complete pp. 63–65 in M&W using your unit of interest.

