How to Norm Rubrics Ursula Waln, Director of Student Learning Assessment Central New Mexico Community College.

1 How to Norm Rubrics Ursula Waln, Director of Student Learning Assessment Central New Mexico Community College

2 What is a Rubric? A rubric is a scoring guide, such as:
A checklist
A rating scale
A matrix or list containing descriptions of student work characteristics at different levels of sophistication (a.k.a. a descriptive rubric), which may be:
Holistic (containing performance levels but not separating out criteria)
Analytic (providing descriptions for each criterion at each performance level)

3 Examples
Checklist:
Sanitized hands
Verified the patient’s fasting status
Asked about latex sensitivity
Selected appropriate gloves and tourniquet
Assembled necessary supplies
Positioned the patient properly
Rating Scale (Beginning = 1, Developing = 2, Proficient = 3):
Developed key ideas
Addressed important details
Organized information logically
Used proper writing mechanics
Descriptive Rubric (criterion: Consideration of Diverse Points of View):
Beginning (1): Wholly dismisses or disparages points of view that diverge from own worldview
Developing (2): Identifies valid components of differing perspectives but responds in accordance with own worldview without reflection
Proficient (3): Analyzes the complexity and validity of differing perspectives and re-evaluates own perspectives in light of alternative worldviews

4 Objective vs. Subjective Scoring, and Selecting the Right Tool
With the exception of checklists, rubrics are used to lend a level of objectivity to evaluation that is inherently subjective
Checklists are for use when the demonstration of learning either is or is not present, with no in-between degrees of manifestation; checklists do not require norming
Rating scales are the most subjective because they rely on the scorer’s interpretation of the performance-level headings
Descriptive rubrics can essentially eliminate subjectivity by clearly identifying indicators of distinct levels of performance

5 Why Norm a Rubric? Rubric norming is called for when more than one person will be scoring students’ work and the results will be aggregated for assessment of student learning:
To develop a shared understanding of the outcome(s) assessed
To discover any need for editing of the rubric
To develop scoring consistency among raters:
Minimize the variable of individual expectations regarding rigor
Minimize the potential for differences in interpretation of criteria tied to identification of performance levels

6 Norming Rubrics or Norming Raters?
Norming rating scales = developing consensus among raters about:
What the different performance levels are intended to capture
What level of rigor should be applied in distinguishing the levels
All who will be raters should be involved in the norming session(s)
Norming descriptive rubrics = perfecting the rubric:
The better written the rubric, the less room it leaves for differences in interpretation
Identify and fix gray areas and ambiguities in the rubric
Reduce the need for scorers to conform to a group standard of interpretation

7 What You’ll Need
A rubric
A facilitator
Some work samples
Student learning outcome statements to which the rubric is tied
An outline of the steps of the norming process
Raters (the faculty who will be doing the scoring)

8 Selecting Work Samples
Real student work or mock-up samples
If using real student work, redact any identifying information
Select samples that demonstrate different performance levels
Plan to have 1 to 3 samples for each of 2 to 4 scoring sessions
Determine the number based on the time and complexity of scoring
If the rubric contains multiple criteria, select samples that display differing levels of performance on differing criteria (i.e., samples that are neither all good nor all bad)

9 Outline of the Steps
Orientation
Discuss levels and rating criteria/thought processes
Score the samples
Compare scores
Discuss (and possibly modify the rubric)
Repeat the above two steps as needed until consensus is reached

10 Orienting the Raters to the Process
Components:
The student learning outcome(s) being assessed
The purpose of the norming session
The rubric itself:
How it came to be
Its intended alignment to the SLO(s)
Its intended use
An outline of the process

11 Discussing Levels & Criteria (Part of the Orientation)
Model the thought processes involved in using the rubric
Entertain perspectives regarding:
The number of performance levels and their headings
The construct validity of the criteria that have been identified
Perceived distinctions between performance levels
Perceptions of how the faculty think most of the student work will be scored, and the discriminative value of the rubric

12 Scoring the Samples
All raters score the same samples concurrently
Raters refrain from discussing the works and their scores during the scoring sessions
In each round, the samples should represent a range of skill demonstration
Start with the most straightforward samples and work up to those that require more refined decision-making

13 Comparing Scores
Look for consistencies and inconsistencies
Confirm and summarize the rationale behind consistencies
Ask raters to articulate the rationale behind inconsistencies
Review the scoring criteria
Encourage discussion

14 Reconciling Inconsistent Scores
Descriptive rubrics: Can the criteria or descriptions be revised in a way that produces agreement?
Strive for natural delineations in performance that reflect discernible steps or benchmarks in the development of proficiency
Rating scales: Can the criteria be revised in a way that reduces the opportunity for rater bias?
Strive for consensus, or at least democratic agreement

15 If the Process Gets Stuck (Part of Reconciliation)
If disagreement persists, change course
Consider breaking to calculate percentages of agreement:
Seeing how their scores compare with those of other raters may help raters calibrate their judgments
Calculate the percentage of agreement between each pair of raters, then calculate the mean of those percentages overall (shown on the next slide)
Consider asking all raters to commit to a democratically established convention for scoring in this particular context:
The outliers agree to disagree but nonetheless concede to score in accordance with the majority’s bias for the sake of consistency

16 Percentage of Agreement: How to Calculate It
Raters Compared | Number of Agreements | Total Items | Percentage of Agreement
1 & 2           | 3                    | 7           | 14%
1 & 3           | 6                    | 7           | 86%
1 & 4           |                      | 7           |
2 & 3           | 4                    | 7           | 57%
2 & 4           |                      | 7           |
3 & 4           | 7                    | 7           | 100%
Overall         | 30                   | 42          | 71%
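The calculation behind this table can be sketched in a few lines of Python. This is not part of the original deck: the rater names and scores below are invented sample data, chosen only to illustrate the pairwise-then-average procedure the slides describe.

```python
# Sketch of the percentage-of-agreement calculation described on the slide:
# compare every pair of raters item by item, then average the pairwise rates.
from itertools import combinations

def percent_agreement(a, b):
    """Fraction of items on which two raters gave the same score."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

# Hypothetical data: each list holds one rater's scores (1 = Beginning,
# 2 = Developing, 3 = Proficient) for the same seven work samples.
scores = {
    1: [1, 2, 2, 3, 1, 2, 3],
    2: [1, 3, 2, 2, 1, 2, 3],
    3: [1, 2, 2, 3, 2, 2, 3],
    4: [2, 2, 2, 3, 1, 2, 3],
}

# Agreement rate for every pair of raters.
pairwise = {
    (r1, r2): percent_agreement(scores[r1], scores[r2])
    for r1, r2 in combinations(scores, 2)
}
for (r1, r2), pct in pairwise.items():
    print(f"Raters {r1} & {r2}: {pct:.0%}")

# Overall figure = mean of the pairwise percentages.
overall = sum(pairwise.values()) / len(pairwise)
print(f"Overall: {overall:.0%}")
```

With exact agreement counting as shown, a group of four raters yields six pairwise comparisons; the overall mean gives a single calibration figure the group can track across norming rounds.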

17 Final Notes Rubric norming is a means for creating solidarity among the faculty regarding what the shared goals are (as reflected in the student learning outcome statements) and what student performance looks like when the outcomes are partially versus wholly achieved. Rubric norming is most effective when assessment is steeped in an ethos of inquiry, scholarly analysis, and civil academic discourse that encourages faculty participation in decision-making.

