
Welcome to Assessing a Skill or Performance, FOL Part 1, 2012. Sandy Odrowski, Durham College.


1 Welcome to Assessing a Skill or Performance, FOL Part 1, 2012. Sandy Odrowski, Durham College

2 Agenda
1. Getting started
   - Welcome and session outcomes
   - Story Time (Connection Activity)
   - Identifying best assessment practices
2. Reviewing key concepts (Content Activity)
   - The What, Why, How, When and Who of Performance Assessment
3. Exploring: small group work (Practice Activity)
   - Creating, administering and evaluating checklists, rating scales and rubrics
4. Summarizing and sharing (Summary Activity)

3 Session Outcomes
During this workshop you will
- identify tools that can be used with performance assessments,
- distinguish between rubrics, checklists and rating scales,
- explore and share tips to guide the assessment of performances.




7 Betty Didn't Do So Well
Wilma was very surprised! They compared impressions… 3/5. Wilma gave Betty some advice…




11 Think-Pair-Share
- On your own, think of an assessment method (for a practical skill) you have used with students, or that you have encountered in your past, that left you feeling satisfied with the outcome.
- Record on the paper provided. (1 minute)

12 Think-Pair-Share
- Pair with a person beside you and share your answer.
- Be sure to include WHY you found it a valuable assessment tool. (2 minutes)

13 Group Brainstorming Exercise
In your group:
- Further discuss what you believe are characteristics of "best assessment practices" and record individually on the post-it notes provided.
- When you are finished, place the post-it notes on the flip chart provided at the front of the class. (5 minutes)

14 What Did You Come Up With? Brilliant!


16 The Way We Were!
- Knowledge in the hands of the experts (behaviorist paradigm)
- Teacher-centered
- Focus on assessment of discrete, isolated knowledge and skills
- Traditional face-to-face delivery
- Norm-referenced testing
- Emphasis on summative assessments
- Limited perspectives (e.g. teacher)
- Product-based

17 The Way We Are!
- Knowledge through collaboration, cooperation and community (constructivist paradigm)
- Immersive environments: learning IN technology, learning anytime, anywhere (distributive learning)
- Student-centered, outcome-based (complex, integrated learning) in a real-world context
- Criterion-referenced
- Assessment as learning (formative)
- Multiple perspectives (self, peer, teacher)
- Product- and process-based (authentic assessments)

18 Reviewing Key Concepts: Definitions
Turn to page 8 of your handout. Complete the mix-and-match with a partner. (5 minutes)
- Assessment
- Evaluation
- Performance/Authentic Assessment
- Validity
- Reliability
- Assessment Task Attributes
- Learning Outcome
- Diagnostic, Summative and Formative Evaluation
- Rubric
- Performance Scale
- Performance Checklist
- Performance Criteria (descriptors)


20 What Is Performance/Authentic Assessment?
- Performance assessment: close proximity to the "actual criterion situation". Usually measures complex skills, cognitive processes and communication important in the real world (contextualized tasks) (Palm, 2008).
- Authentic assessment: its defining features are the specific cognitive processes (disciplined inquiry) and products (knowledge beyond the mere reproduction of presented knowledge) considered important in the perspective of life beyond school (Newman & Archibald, 1992).
http://jolt.merlot.org/documents/vol1_no1_mueller_001.pdf

21 Definition for Today
Performance assessment, also known as alternative or authentic assessment, is a form of testing that requires students to perform a real-life task rather than select an answer from a ready-made list.
http://abcresource.loyalistcollege.ca/learningassessment.htm#performance

22 Assessment vs. Evaluation
There is plenty of controversy and confusion over these terms.
Definitions:
- Assessment = feedback on practice; supports learning, but may or may not generate marks/grades.
- Evaluation = a summary measurement providing some kind of grade/mark/final feedback.

23 Why and When? The Purpose of Assessment: Assessment FOR, AS and OF Learning (Earl, 2003)

Assessment FOR Learning: Diagnostic (before)
- Identifies level of knowledge and skill prior to instruction
- Enables instruction to adapt to learner needs (individualized)
- Enables focus on required instruction (efficient)

Assessment AS Learning: Formative (during)
- Provides feedback aimed at understanding expected learning and improving performance
- Self and peer assessment are a key component (multiple perspectives)
- Motivates the student to improved performance (Crooks, 2001)

Assessment OF Learning: Summative (after)
- Summarizes student attainment of expected learning
- Conducted at the end of the instructional period, e.g. determination of final grade
- Test scores and grades monitor learning but do little to motivate learning (Huba & Freed, 2000)

24 Validity and Reliability
Validity
- Validity refers to the degree to which an assessment measures the intended learning outcome, i.e. does the test measure what it is intended to measure (the learning outcome)?
Reliability
- Reliability refers to the consistency of test scores, and is affected by testing conditions and rater variability, i.e. does the test produce consistent results?
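One simple way to make rater reliability concrete is to compare the scores two assessors give the same performances. The sketch below (not from the workshop; the function name and scores are illustrative) computes percent agreement, a rough indicator of inter-rater reliability on a 1-4 rating scale:

```python
# Hypothetical example: a teacher and a peer score the same five
# student performances on a 1-4 rating scale. Percent agreement is
# one simple (if rough) indicator of inter-rater reliability.

def percent_agreement(rater_a, rater_b):
    """Fraction of performances on which two raters gave the same score."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both raters must score the same performances")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

teacher_scores = [4, 3, 3, 2, 4]
peer_scores    = [4, 3, 2, 2, 4]

print(percent_agreement(teacher_scores, peer_scores))  # 0.8
```

Low agreement would suggest the tool's criteria are ambiguous and need revision, echoing the reliability concerns above.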

25 …Is Valid and Reliable
- Measures what it is supposed to measure (valid)
- Measures the same information consistently (reliable)
- Recognizes that some aspects of learning are hard to measure or may be unplanned

26 Who Is the "Assessor"?
Multiple perspectives:
- Self
- Peer
- Teacher

27 Selecting an Assessment Tool

Checklist
- The least complex form of scoring
- Looks for the presence/absence of specific elements in the product of a performance

Rating Scale
- Unlike the typical checklist, allows for attaching quality to elements in the process or product
- Ratings can be numeric or descriptive

Rubric
- Multiple criteria are applied to the assessment
- Quality of the performance and product is typically transparent in the assessment criteria
- Appropriate for complex tasks rather than discrete activities
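The contrast between a checklist and a rating scale can be sketched as data: the same observed elements, recorded once as presence/absence and once with a quality rating attached. The task and element names below are illustrative, not from the workshop handout.

```python
# Hypothetical sketch: the same performance elements captured by a
# checklist (presence/absence only) and by a rating scale (quality
# attached to each element, 4 = outstanding ... 1 = unsatisfactory).

checklist = {
    "Washes hands": True,
    "Locates radial artery": True,
    "Counts pulse for a full interval": False,
}

rating_scale = {
    "Washes hands": 4,
    "Locates radial artery": 3,
    "Counts pulse for a full interval": 1,
}

items_met = sum(checklist.values())
average = sum(rating_scale.values()) / len(rating_scale)
print(f"Checklist: {items_met}/{len(checklist)} elements present")
print(f"Rating scale average: {average:.1f}")
```

The checklist can only say an element was missing; the rating scale also records how well the other elements were done, which is exactly the distinction the table above draws.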


29 Sample Performance/Rating Scale
Participates in group problem solving:
- 4 (Outstanding)
- 3 (Satisfactory)
- 2 (Tolerable)
- 1 (Unsatisfactory)


31 Selecting the Performance/Skill Task
- Assessment task (performance, product or process): determined to reflect the learning outcome.
- Assessment task attributes: the essential elements or characteristics of a good performance of an assessment task.
- Performance criteria (descriptors): specific behavioral descriptions of performance at each level of performance.

32 Parts of a Rubric
- Assessment task attributes
- Descriptors
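The two parts above fit together as a grid: each assessment task attribute gets a descriptor for every performance level, and the assessor selects one level per attribute. The sketch below is illustrative; the attribute names, descriptor wording and levels are assumptions based on the radial pulse task used later in the session, not from the handout.

```python
# Hypothetical sketch: a rubric maps each assessment task attribute
# to level descriptors (4 = highest, 1 = lowest). Wording is invented
# for illustration.

rubric = {
    "Accuracy of count": {
        4: "Count within 2 beats of the expert's count",
        3: "Count within 4 beats of the expert's count",
        2: "Count within 8 beats of the expert's count",
        1: "Count differs by more than 8 beats",
    },
    "Reporting": {
        4: "Reports rate, rhythm and strength clearly",
        3: "Reports rate and rhythm",
        2: "Reports rate only",
        1: "Report incomplete or inaccurate",
    },
}

# Scoring: the assessor picks one observed level per attribute.
observed = {"Accuracy of count": 3, "Reporting": 4}
total = sum(observed.values())
best = 4 * len(rubric)
print(f"Score: {total}/{best}")  # Score: 7/8
```

Because each cell carries a behavioral descriptor, two raters looking at the same performance have shared language for why a level was chosen, which is what makes rubrics suited to complex tasks.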

33 Group Assignment
Assessment task by group:
- Group 1: Checklist, Rating Scale
- Group 2: Checklist, Rating Scale
- Group 3: Rubric
- Group 4: Rubric
- Group 5: Checklist, Rating Scale
- Group 6: Rubric
- Group 7: Checklist, Rating Scale
- Group 8: Rubric
Page 6

34 Learning Activity: Collaboratively create, administer and evaluate a performance assessment tool.
Assign group roles as follows:
- Performance expert: _________________
- Student: ___________________________
- Peer (reporter): _____________________
- Teacher (timekeeper): ________________
Page 6

35 Instructions
In consultation with the performance expert, determine performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task.
Learning Outcome: Accurately measure and report a radial pulse.
OR
Learning Outcome: Design, construct and fly a paper airplane.

36 Complete the Following Steps
- In consultation with the performance expert, determine performance attributes and criteria (what you feel is essential to achieve the learning outcome) for the identified task.
- Collaboratively develop the assessment tool (checklist, rating scale or rubric) using the template provided.
- The performance expert instructs the student on the performance task, using the assessment tool as a guide.
- The student completes a "return demonstration" of the performance task.

37 Steps Continued
- The teacher and peer complete the developed assessment tool based on the observed student performance.
- Discuss the advantages and disadvantages of the developed assessment tool from a variety of perspectives (student, peer and teacher). You can record these on your handout.
- Be prepared to share your group's findings with the larger group.

38 Your Findings
Assessment            Advantages    Disadvantages
Checklists
Performance Scales
Rubrics

39 Final Tips
- Review the learning outcome and the purpose of the assessment tool (validity).
- Collect samples of student work that exemplify levels of performance (validity).
- Consult with experts in the field to develop "authentic performance tests" and to validate task attributes and performance descriptors (validity, authenticity).
- Develop performance scoring instruments collaboratively with colleagues.

40 Final Tips Continued
- Gather multiple perspectives on the scoring of the same performance (rater reliability).
- Share the scoring rubric with students in advance of the performance task (equity, transparency, accountability).
- Share performance exemplars (sample projects, assignments, video depictions) with students in advance of the performance task (transparency, accountability).
- Be prepared to review and revise the assessment tool (accountability, validity).

41 A Checklist for Choosing Performance Tasks
Adapted from Checklist for Choosing Performance Assessment Tasks, Algonquin College, Professors Resource Site. Answer Yes or No for each question:
- Does the task match the expected learning (learning outcome or course learning requirement)?
- Does the task adequately represent and elicit the content and skills you expect the student to attain?
- Does the task enable students to demonstrate their capabilities and progress?
- Does the assessment use "authentic", real-world tasks?
- Does the task require the learner to integrate their learning?
- Can the task be structured to provide a measure of several outcomes?
- Does the task match an important outcome which reflects complex thinking skills?
- Does the task pose an enduring problem type, the type the learner is likely to encounter in the future?
- Is the task fair and free of bias?
- Will the task be seen as meaningful by important stakeholders?
- Will the task be meaningful and engaging to students so that they will be motivated to show their capabilities?

42 In Conclusion
Did we achieve the learning outcomes of this session?
- identify tools that can be used with performance assessments,
- distinguish between rubrics, checklists and rating scales,
- explore and share tips to guide the assessment of performances.
