Presentation on theme: "Fairness, Accuracy, & Consistency in Assessment"— Presentation transcript:
1. Fairness, Accuracy, & Consistency in Assessment
NCATE's recommendations to reduce bias and ensure fairness to students
2. Fairness: "Assess what's been taught"
Candidates should be aware of the knowledge, skills, and dispositions that are measured in the assessments.
- Was it taught?
  - Curriculum map: shows where students learn and practice what is assessed
- Do students understand the expectations?
  - Syllabi containing instructions and timing of assessments
  - Rubrics/scoring guides shared with students
3. Fairness: According to these guidelines, are your program's assessments FAIR?
- Is your curriculum map up to date?
- Does the curriculum map link to all professional standards and GSE standards and dispositions?
- Does the curriculum map indicate any gaps?
- Do syllabi indicate the timing of assessments?
- Are rubrics/scoring guides shared with students when assigning the work?
4. Accuracy: "Assessments measure what they say they measure"
Assessments should be aligned with the standards and proficiencies they are designed to measure.
- Are assessments aligned with standards?
  - Content match
  - Complexity match
  - Appropriate degree of difficulty
- Is there corroborating evidence?
- Is there field input on the assessment?
Example: a paper-and-pencil test may be fine for content knowledge but not for classroom management skills; likewise, a classroom observation may not be the best way to judge content knowledge.
5. Accuracy: According to these guidelines, are your program's assessments ACCURATE?
- Do your program's assessment types (test, observation) match what is being assessed?
  - Dispositions = observations
  - Skills = performance assessment
- Which of your assessments "validate" (relate to) other assessments?
  - Example: work sample to lesson plan assignment
- Have you had input from working professionals?
6. Consistency: "Assessments produce dependable, trustworthy results"
Assessment results should be reliably consistent regardless of time and rater.
- Are scoring tools sufficiently descriptive?
  - Language that differentiates between components and between performance levels
  - Clear descriptors that promote accurate scoring
  - Students can tell from the scoring tool why they were rated at a certain level
- Are raters trained?
  - Agreement on what "a 3 in lesson planning" looks like
  - Raters understand the consequences of final scores
  - Programs have a plan to support/address students with insufficient performance, which alleviates rater pressure
7. Consistency: According to these guidelines, are your program's assessments CONSISTENT?
- Does your program consistently use descriptive rubrics for major assignments and performance observations?
- Does your program have regular rater training?
- Has your program engaged in rubric calibration and moderation activities?
- Are raters aware of how their assessment's scores affect the student?
- Are raters aware of how their assessment's scores contribute to program review?
- What is the plan to support struggling students?
8. Avoiding Bias: "Removing contextual and cultural bias from assessments"
The assessment itself and the assessment context should be analyzed for factors that would affect performance.
- Are assessments clear and administered in the proper environment?
  - Location/equipment
  - Clear instructions/questions
- Have assessments been reviewed for bias?
  - Racial/ethnic/cultural stereotypes
  - Disability resource center review
  - Assignments that favor one group over another
9. Avoiding Bias: According to these guidelines, are your program's assessments BIAS-FREE?
- Have all key assessments been reviewed for clarity of expectations?
- Have all your assignments been reviewed for accessibility?
- Have all your assignments been scrutinized for cultural bias and stereotypes?
- Has your program analyzed student outcomes according to sub-groups to determine if consistent scoring bias exists?
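The sub-group outcome analysis in the last question above can start as a simple comparison of mean scores per group. This is a minimal sketch in Python; the group labels and scores are invented for illustration, and a score gap is only a prompt for review, not proof of bias.

```python
from collections import defaultdict

def mean_scores_by_group(records):
    """Average assessment score per sub-group.

    records: iterable of (group_label, score) pairs.
    Returns {group_label: mean_score}.
    """
    totals = defaultdict(lambda: [0.0, 0])  # group -> [running sum, count]
    for group, score in records:
        totals[group][0] += score
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}

# Toy data: (sub-group, rubric score on a 1-4 scale)
sample = [("A", 3), ("A", 4), ("B", 2), ("B", 3), ("B", 3)]
print(mean_scores_by_group(sample))  # group A averages 3.5, group B about 2.67
```

A large, persistent gap between sub-groups on the same assessment signals that the instrument and the scoring process deserve a closer look.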
10. GSE Rubric Guidelines
Developed by the Assessment Committee, 2010
- 4 levels of competency: Unsatisfactory, Emerging, Proficient, Exemplary
- In ascending order from left to right
- If numbers are used: 1-4 from left to right

Sample rubric row (criterion: states values, beliefs, and assumptions about priorities for resource allocation):
- NEEDS IMPROVEMENT (1): Statement of values, beliefs, and assumptions is absent.
- EMERGING (2): Statement of values, beliefs, and assumptions is vague, too general, or contrived.
- PROFICIENT (3): Statement of values, beliefs, and assumptions is clearly defined and specific.
- EXEMPLARY (4): Statement of values, beliefs, and assumptions is clear, specific, convincing, and includes personal experience that promotes clarity.
11. Rubric Moderation: A process for strengthening consistency
Develops inter-rater reliability through shared examination and discussion of student work. Involves all/many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide 4 samples of work.
3. Provide the rubric and have raters "score" each sample.
4. Discuss as a group why the raters chose the scores.
5. Debrief about what was learned and what remains unanswered.
12. Rubric Calibration: A process for strengthening consistency
Develops inter-rater reliability by setting expectations of what the scores mean regarding student work. Involves all/many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide 4 samples of work that have been pre-scored (anchor papers), at least one each at a low, medium, and high level of performance.
3. Discuss rubric areas and expectations for each level and component before scoring begins.
4. Provide the rubric and have raters "score" each sample with the discussion in mind.
5. Have raters compare the scores they assigned to the "anchor" papers.
6. Debrief about what was learned and what remains unanswered.
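One simple way to quantify the inter-rater reliability these calibration sessions aim to build is percent exact agreement between each rater's scores and the anchor scores. This is a minimal sketch in Python; the rater names and scores are hypothetical.

```python
def exact_agreement(rater_scores, anchor_scores):
    """Fraction of samples where the rater's score matches the anchor score."""
    assert len(rater_scores) == len(anchor_scores)
    matches = sum(r == a for r, a in zip(rater_scores, anchor_scores))
    return matches / len(anchor_scores)

# Four pre-scored anchor papers (1-4 rubric scale) and two raters' scores
anchors = [1, 2, 3, 4]
raters = {"rater_1": [1, 2, 3, 3], "rater_2": [2, 2, 3, 4]}
for name, scores in raters.items():
    print(name, exact_agreement(scores, anchors))  # each rater matches 3 of 4
```

When rubric levels are ordered, "adjacent agreement" (counting scores within one level of the anchor as a match) is a common, more forgiving variant.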
13. Next Steps
- How is your program doing in providing fair, accurate, consistent, bias-free assessments to students?
- What work needs to be done in your program to ensure quality assessments are used?
- What do you need to be able to accomplish this?