Presentation on theme: "Assessing Student Performance"— Presentation transcript:
1 Assessing Student Performance
(Title slide showing sample feedback stamps: "Great Work!", "OK", "Try this...", "Needs work")
2 Performance Objective
Given a unit of instruction, develop a valid, reliable, criterion-referenced student assessment instrument that scores at least 70 points on the evaluation checksheet.
3 Enabling Objectives
- Distinguish among evaluation, measurement, and testing.
- Differentiate between formative and summative assessment.
- Differentiate between criterion-referenced and norm-referenced assessment.
- Explain validity in student assessment.
- Explain reliability in student assessment.
- Plan for criterion-referenced assessment of student performance.
4 Why assess student performance?
- Assign grades
- Gauge student progress and award credit for task completion
- Improve instruction
- Motivate students to work
- Provide feedback to students
5 Basics
- Evaluation - the general process of estimating student progress toward achieving performance objectives
- Measurement - the use of a specific tool to estimate an outcome
- Testing - one specific form of evaluation that uses a measurement tool to formally evaluate student performance
6 Methods of Assessment
- Testing: "objective" or "subjective" items
- Performance demonstration other than a test:
  - Psychomotor task
  - Project
  - Lab skill demonstration
  - Higher-level cognitive task
  - Paper
  - Portfolio
  - Etc.
8 Formative Testing
The process of using measurement tools to conduct evaluation for the purpose of IMPROVING student PERFORMANCE
- Student receives feedback on results
- Teacher considers results in planning subsequent instruction
- Grades are not recorded!
9 Summative Testing
The process of using measurement tools to conduct evaluation for the purpose of ASSIGNING student GRADES
- Student receives feedback on results
- Teacher considers results in planning subsequent instruction
- Grades are recorded
10 A Test Can Be
Norm-Referenced or Criterion-Referenced
11 Norm-Referenced Test
- Measures student performance against other students
- Student scores better or worse than other students
- Competition is between the student and peers
- Grade is based on location on "the curve"
- Best students get "A," poorest students fail
12 Normal Curve
- On most measures of human behavior, graphing individual results will produce a "bell-shaped," or normal, curve
- Most individual scores will fall toward the middle (mean)
- Fewer scores will fall toward the upper and lower ends
(Figure: bell curve with lowest scores at the left, average scores in the middle, highest scores at the right)
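The clustering described above can be simulated. This is a hypothetical sketch, not from the slides; the mean of 75, standard deviation of 10, and class size of 1,000 are invented for illustration:

```python
import random
import statistics

# Simulate 1,000 test scores drawn from a normal distribution
# (invented parameters: mean 75, standard deviation 10).
random.seed(42)
scores = [random.gauss(75, 10) for _ in range(1000)]

mean = statistics.mean(scores)
stdev = statistics.stdev(scores)

# Count how many scores land within one standard deviation of the mean.
# On a normal curve, roughly 68% do -- most scores cluster in the middle,
# with fewer at the upper and lower ends.
within_one_sd = sum(1 for s in scores if abs(s - mean) <= stdev)
print(f"mean={mean:.1f}, sd={stdev:.1f}, within 1 SD: {within_one_sd / len(scores):.0%}")
```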
13 Making a Test Norm-Referenced
- Make the test intentionally difficult
- Average score should be about 50%
- Strong students should tend to score high and weak students should tend to score low
- Award As for the highest scores, Fs for the lowest scores, Cs for average scores
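The curve-based grading the slide describes can be sketched in code. This is a hypothetical illustration: the cutoff percentages (top 10% earn an A, bottom 10% fail, and so on) and the student names and scores are invented, not taken from the slides:

```python
# Norm-referenced grading sketch: rank students against each other and
# award letter grades by position on the curve (cutoffs are invented).
def grade_on_curve(scores):
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    n = len(ranked)
    grades = {}
    for i, (student, _) in enumerate(ranked):
        pct = i / n  # 0.0 = top of the class
        if pct < 0.10:
            grades[student] = "A"
        elif pct < 0.30:
            grades[student] = "B"
        elif pct < 0.70:
            grades[student] = "C"
        elif pct < 0.90:
            grades[student] = "D"
        else:
            grades[student] = "F"
    return grades

scores = {"Ann": 58, "Ben": 49, "Cal": 52, "Dee": 61, "Eli": 40,
          "Fay": 55, "Gus": 47, "Hal": 50, "Ivy": 53, "Joe": 44}
grades = grade_on_curve(scores)
print(grades)  # Dee (best) gets an A; Eli (worst) fails
```

Note that grades here depend entirely on the rest of the class: the same score can earn an A in one group and a C in another.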
14 Criterion-Referenced Test
- Measures student performance against predetermined standards
- Student meets or does not meet the standard
- Competition is between the student and the skill, knowledge, or ability
- Grade is based on accomplishment
- Everybody can earn a passing grade if they meet the standard
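By contrast, criterion-referenced grading compares each score to a fixed, predetermined standard. A minimal sketch, reusing the 70-point standard from the deck's performance objective (the student names and scores are invented):

```python
# Criterion-referenced grading sketch: every student is measured against
# the same predetermined standard, not against classmates, so everybody
# who meets the standard passes.
PASSING_STANDARD = 70  # from the deck's performance objective

def grade_against_standard(scores, standard=PASSING_STANDARD):
    return {student: ("Pass" if score >= standard else "Not yet")
            for student, score in scores.items()}

scores = {"Ann": 85, "Ben": 72, "Cal": 64, "Dee": 91}
result = grade_against_standard(scores)
print(result)  # three of four meet the standard and pass
```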
15 Making Tests Criterion-Referenced
- Remember that a performance objective has a condition, task, and standard
- Criterion = standard
- Write test items using the performance objective standard statements, and your test will be criterion-referenced
- Every objective has one or more test items
- Every item maps to an objective
- Validity is assured
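The two-way mapping the slide calls for (every objective covered by at least one item, every item traceable to an objective) can be checked mechanically. A hypothetical sketch; the objective and item identifiers are invented sample data:

```python
# Alignment check sketch: flag objectives with no test items and test
# items that do not trace back to any objective.
def check_alignment(objectives, item_to_objective):
    covered = set(item_to_objective.values())
    uncovered_objectives = set(objectives) - covered
    orphan_items = {item for item, obj in item_to_objective.items()
                    if obj not in objectives}
    return uncovered_objectives, orphan_items

objectives = {"obj1", "obj2", "obj3"}
item_to_objective = {"q1": "obj1", "q2": "obj1", "q3": "obj2"}

uncovered, orphans = check_alignment(objectives, item_to_objective)
print("objectives with no items:", uncovered)  # obj3 is untested
print("items with no objective:", orphans)     # none here
```

An empty result in both sets is one quick check that the test is comprehensive and that no item strays outside what was taught.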
16 Characteristics of a Test
- Validity
- Reliability
- Objectivity
- Discrimination (applies to norm-referenced tests only)
- Comprehensiveness
- "Score-ability"
17 Validity
A valid test measures:
- what it is intended to measure
- what the teacher intended for the students to learn
- what the teacher actually taught
A valid test is FAIR
18 Questions about Validity
- Does the test actually measure what you intend it to measure?
- Did you teach the content and skills that are being tested?
- Does the test require the student to know or do something other than what you intended and/or taught?
- Does some aspect of the test prevent a student who may know the material from responding correctly?
19 Example of a Validity Problem
You taught the names and uses of hand tools using lecture with overheads and handouts. But on the test, you ask the students to describe how to maintain the tools in good condition. The problem: you taught one thing (names and uses) but tested knowledge of another (maintenance).
20 Another Example
You taught the students to write resumes in the classroom and had them hand-write their own resumes, but provided no computer instruction. But then you have them prepare their resumes on a computer and grade heavily on appearance. The problem: you are evaluating their word-processing skills at least as much as their resume-writing skills.
21 A Third Validity Problem
You intended to teach the students how to repair a small engine. You taught the lesson in the classroom using overheads, the chalkboard, and a teacher demonstration. The students never touched an engine. But on test day, you give them a disassembled engine to reassemble. The problem: you thought you taught a psychomotor skill, really taught only cognitive content, but are testing the psychomotor skill you never taught.
22 Reliability
A reliable test provides accurate and consistent results. Test reliability can be viewed from two perspectives:
- Student reliability
- Scorer reliability
23 Student Reliability
- Test items are readable and clear
- Instructions are simple and unambiguous
- Responses test only knowledge of the subject matter, not test-wiseness, reading ability, agility, or another unrelated trait
24 Scorer Reliability
- Items can be scored consistently
- The same scorer would produce similar results on repeated evaluations
- Different scorers would produce similar results if working independently
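One simple way to quantify the last point is percent agreement between two independent scorers on the same set of papers. This is a hypothetical sketch with invented ratings; more rigorous measures, such as Cohen's kappa, also correct for chance agreement:

```python
# Scorer-reliability sketch: fraction of papers on which two independent
# scorers assigned the same grade.
def percent_agreement(scorer_a, scorer_b):
    if len(scorer_a) != len(scorer_b):
        raise ValueError("scorers must rate the same papers")
    matches = sum(a == b for a, b in zip(scorer_a, scorer_b))
    return matches / len(scorer_a)

# Invented example: two scorers grade the same eight papers.
scorer_a = ["A", "B", "B", "C", "A", "C", "B", "A"]
scorer_b = ["A", "B", "C", "C", "A", "C", "B", "B"]
print(f"{percent_agreement(scorer_a, scorer_b):.0%}")  # 6 of 8 agree: 75%
```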
25 Objectivity
An objective test is:
- Objectively written
- Objectively administered
- Objectively scored
Objective items are reliable and valid.
26 Discrimination
- Important ONLY for norm-referenced testing
- The test separates more knowledgeable students from less knowledgeable students
- A discriminating test is intended to reward the best students and punish the weakest students
- Ideal for using the normal curve to interpret scores
27 Comprehensiveness
- The assessment covers or samples all of the content
- Every performance objective is represented
- Multiple items address each objective
28 Score-Ability
- The test has scorer reliability
- Scoring is easily completed
- "Objective" items are easiest to score
- "Subjective" items can be scored "objectively"
29 Review
- Evaluation vs. measurement vs. testing
- Criterion-referenced or norm-referenced
- Formative or summative
- Characteristics of a test: validity, reliability, objectivity, discrimination (norm-referenced tests only), comprehensiveness, "score-ability"
30 The Answer
It is rare to find an educator who claims to have the right answer, but in Career and Technical Education:
- Testing should be BOTH formative and summative
- AND testing should be criterion-referenced
31 So What?
- Assessment can be positive or threatening
- Do not use assessment as punishment or as a threat
- Use assessment to improve student performance and instruction
- Assign grades fairly: validly, reliably, objectively, and comprehensively