
Incorporating Authentic Assessment in the Classroom Narrowing the Gulf Conference March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 St. Petersburg College Presenters  Dr. James Coraggio, Director, Academic Effectiveness and Assessment, St. Petersburg College  Dr. Carol Weideman, Mathematics Professor, Gibbs Campus, St. Petersburg College Narrowing the Gulf Conference2 March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics Why do we assess?  To see how well we are doing  To confirm what we already know  To share our progress with others  To see where we can improve and change  In some cases to demonstrate what does not work Narrowing the Gulf Conference3 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics Why do we assess? Narrowing the Gulf Conference4 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Purpose of an Assessment  “Clearly delineate between those that know the content and those that do not.”  To determine whether the student knows the content, not whether the student is a good test-taker.  Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know the material. Narrowing the Gulf Conference5 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Types of Assessments  Objective assessments  Authentic assessments Narrowing the Gulf Conference6 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Objective Assessments  Measure several types of learning (also levels)  Wide content, short period of time  Variations for flexibility  Easy to administer, score, and analyze  Scored more reliably and quickly Narrowing the Gulf Conference7 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Types of Objective Tests  Written-response  Completion (fill-in-the-blank)  Short answer  Selected-response  Alternative response (two options)  Matching  Keyed (like matching)  Multiple choice Narrowing the Gulf Conference8 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example I  Multiple Choice Assessment  Assignment Objectives  Solving equations using addition and multiplication principles  Solving applied problems by translating to equations Narrowing the Gulf Conference9 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example I  Multiple Choice Assessment
1. Solve for x: 3(x + 8) = 4(x – 4)
   a. -8   b. 40   c. 8   d.
2. Write the sentence as an algebraic equation: The sum of 18 and a number is 5
   a. x - 18 = 5   b. = x   c. 18 = x + 5   d. x = 5
Narrowing the Gulf Conference10 C March/April 2011
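A worked solution for item 1 (not on the original slide) makes the keyed answer, choice b, easy to verify:

\[
\begin{aligned}
3(x+8) &= 4(x-4) \\
3x + 24 &= 4x - 16 \\
24 + 16 &= 4x - 3x \\
x &= 40
\end{aligned}
\]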

Incorporating Authentic Assessment in the Classroom 2011 Issues with Objective Assessments  Limited depth of content  Not able to reveal student misconceptions  Limited ability to test critical thinking skills  Students are ‘Test-wise’ Narrowing the Gulf Conference11 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Test-wise Students  Are familiar with item formats  Use informed and educated guessing  Avoid common mistakes  Have testing experience  Use time effectively  Apply various strategies to solve different problem types Narrowing the Gulf Conference12 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Authentic Experiences  Is the course aligned with the expectations for the student in the ‘real world’?  Authentic Learning  Authentic Assessment Narrowing the Gulf Conference13 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Class Without Authentic Experiences  Didactic instruction where students are presented with factual information from a textbook  Assessment is primarily multiple-choice items where students are expected to regurgitate factual information Narrowing the Gulf Conference14 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Class With Authentic Experiences  Interactive learning environment where students learn not only facts but also the relationships between facts and the application of that information  Authentic assessment where students are able to model the applications of the discipline through simulations, projects, etc. Narrowing the Gulf Conference15 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Narrowing the Gulf Conference16 Authentic Assessments  Authentic assessments serve dual purposes of  encouraging students to think critically and  providing assessment data for measuring improved student learning.  These assessment techniques fall into three general categories:  criterion-referenced rubrics,  student reports (reflection or self-assessments), and  student portfolios. J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Authentic Assessments Authentic assessments include…  Criterion-referenced rubrics. Complex, higher-order objectives can be measured only by having students create a unique product, whether written or oral [in-class essays, speeches, term papers, videos, computer programs, blueprints, or artwork] (Carey, 2000).  Student Reflection. Written reflection is said to have several important benefits: it can deepen the quality of critical thinking, increase active involvement in learning, and increase personal ownership of the new learning by the student (Moon, 1999).  Student Portfolios. Collections of students’ work over a course or a program that can be an effective method of demonstrating student progress in the area of critical thinking (Carey, 2000). Narrowing the Gulf Conference17 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example II  Authentic Assessment: Painting a Room  Assignment Objectives  Solving Equations using addition and multiplication principles  Solving applied problems by translating to equations Narrowing the Gulf Conference18 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example II You have decided to paint your family room. Assume that one coat will be sufficient. Paint is currently on sale at Home Depot for $15.99 per gallon and one gallon will cover 400 square feet. Your family room is 10 ft by 14 ft and the ceiling is 9 ft above the floor. There are 3 windows, 30 in by 50 in and 2 doorways, 36 in by 7 ft high. Answer the following questions: How many square feet of walls do you need to cover? How much paint is needed? How much will it cost? Is there any additional information that would be helpful? Narrowing the Gulf Conference19 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example II
A. Write an equation for the square feet of wall. Let x = square feet of wall, y = total cost for paint. Calculate the square footage.
B. Using the given information and the equation from Part A, how much paint do you need?
C. How much will the paint cost?
D. What other information would be helpful?
Narrowing the Gulf Conference20 C March/April 2011
Room is 10 ft wide x 14 ft long x 9 ft high.
2 walls that are 10 ft x 9 ft = 90 sq ft; x 2 = 180 sq ft
2 walls that are 14 ft x 9 ft = 126 sq ft; x 2 = 252 sq ft
Total square feet = 432 sq ft, BUT we need to adjust for windows and doors
One gallon covers 400 sq ft; we need one gallon of paint
Paint cost is $15.99
Are we painting the ceiling? Do we need supplies (brushes, drop cloths, etc.)? Do we need to add sales tax to the paint cost? If so, what is the rate?

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example II
A. Write an equation for the square feet of wall. Let x = square feet of wall, y = total cost for paint. Calculate the square footage.
B. Using the given information and the equation from Part A, how much paint do you need?
C. How much will the paint cost?
D. What other information would be helpful?
Narrowing the Gulf Conference21 C March/April 2011
3 windows: 30 in x 50 in x 3 = 4500 sq in; 4500 / 144 sq in per sq ft = 31.25 sq ft
2 doorways: 3 ft by 7 ft = 21 sq ft x 2 = 42 sq ft
Total square ft = 432 – 31.25 – 42 = 358.75 sq ft
One gallon covers 400 sq ft; we need one gallon of paint
Paint cost is $15.99
Are we painting the ceiling? Do we need supplies (brushes, drop cloths, etc.)? Do we need to add sales tax to the paint cost? If so, what is the rate?
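The arithmetic in this worked example can also be checked with a short script. The Python sketch below is not part of the presentation; it simply mirrors the solution above using the dimensions, coverage, and price stated in the assignment.

import math

# Dimensions and constants from the assignment (feet unless noted otherwise)
room_w, room_l, room_h = 10, 14, 9
window_area = 3 * (30 * 50) / 144        # three 30 in x 50 in windows, converted to sq ft
door_area = 2 * (3 * 7)                  # two 36 in (3 ft) x 7 ft doorways, in sq ft
coverage_per_gallon = 400                # sq ft covered per gallon
price_per_gallon = 15.99                 # dollars per gallon

wall_area = 2 * (room_w * room_h) + 2 * (room_l * room_h)  # 180 + 252 = 432 sq ft
paintable = wall_area - window_area - door_area            # 432 - 31.25 - 42 = 358.75 sq ft
gallons = math.ceil(paintable / coverage_per_gallon)       # rounds up to 1 whole gallon
cost = gallons * price_per_gallon                          # $15.99, before any sales tax

print(paintable, gallons, cost)                            # 358.75 1 15.99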

Incorporating Authentic Assessment in the Classroom 2011 Rubrics What is a rubric?  Scoring guidelines, consisting of specific pre-established performance criteria, used in evaluating student work on performance assessments Narrowing the Gulf Conference22 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Rubrics SPC currently uses rubrics in such programs as…  College of Education  College of Nursing  Paralegal Studies Program Narrowing the Gulf Conference23 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Rubric Development Process
1. Re-examine the learning objectives to be addressed by the task
2. Identify specific observable attributes your students should demonstrate
3. Describe the characteristics of each attribute
4. Write narrative descriptions for each level of the continuum
5. Collect samples of student work
6. Score student work and identify samples that exemplify various levels
7. Revise the rubric as needed (repeat as needed)
Narrowing the Gulf Conference24 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Profile  Designed to provide consistency and accuracy, as well as guidelines for use  A rubric is an evaluation ‘tool,’ but for a tool to be effective it must be used in the correct situation, or ‘job.’ It would be inefficient to use a machete to conduct heart surgery.  A rubric must be aligned to the most appropriate course assignment  The instructor, not the rubric, is the assessment instrument Narrowing the Gulf Conference25 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example III  Rubric: STA2023 Sampling Project  Assignment Objectives  Identify the sampling strategies commonly employed to collect data  Describe potential biases encountered with sampling strategies used in various statistical applications  Suggest strategies to avoid potential biases when using sampling to collect data for a statistical application Narrowing the Gulf Conference26 J/C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example III Management at a retail store is concerned about the possibility of drug abuse by people who work there. They decide to check on the extent of the problem by having a random sample of the employees undergo a drug test. The lawyers for the retail store have assured the management that there are no legal issues with the proposed drug testing as long as the individual test results are not identified to a specific employee. Depending on the extent of illegal drug use identified in the drug testing, drug counseling may be offered to all employees under the promise of complete confidentiality. You have been hired as the statistician who will design the sampling plan. The budget for the drug testing will cover the cost of 40 drug tests. Narrowing the Gulf Conference27 J/C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example III Management has proposed several different ideas about the best way to obtain the random sample of 40 employees who will be drug tested. There are currently 500 employees at this retail store. There are four classifications of employees: supervisors, full-time sales clerks, part-time sales clerks and maintenance staff. These sampling possibilities are listed below:  Select one of the employee classifications and sample all employees in that classification.  Choose every fourth person who clocks in for each shift.  Randomly select 10 employees from each classification.  Each employee has a three-digit employee number. Randomly select 40 employees. Narrowing the Gulf Conference28 J/C March/April 2011
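As a side note (not part of the original assignment), two of the proposed plans translate directly into a few lines of Python. The roster below is hypothetical; only the counts from the scenario (500 employees, four classifications, a sample of 40) are used.

import random

classifications = ["supervisor", "full-time sales clerk", "part-time sales clerk", "maintenance"]
# Hypothetical roster: 500 employees, each with a three-digit number and a classification
employees = [(f"{i:03d}", classifications[i % 4]) for i in range(500)]

# Proposed plan 3 (stratified): randomly select 10 employees from each classification
stratified = []
for c in classifications:
    group = [e for e in employees if e[1] == c]
    stratified.extend(random.sample(group, 10))

# Proposed plan 4 (simple random): randomly select 40 employee numbers from the full roster
simple_random = random.sample(employees, 40)

print(len(stratified), len(simple_random))   # 40 40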

Incorporating Authentic Assessment in the Classroom 2011 Assignment Example III Answer the following questions regarding this scenario:  Define this problem in your own words.  Compare and contrast the four proposed sampling plans.  Select the proposed sampling plan that you feel is most appropriate for this situation and defend your choice.  Describe any weaknesses in your selected sampling plan.  Make suggestions on ways to improve/strengthen the sampling plan. You may include information not described in the scenario above.  Reflect on your own thought process after completing the assignment. “What did you learn from this process?” “What would you do differently next time to improve?” Narrowing the Gulf Conference29 J/C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Narrowing the Gulf Conference30 Assessment Rubric for CT
I. Communication: Define problem in your own words.
 Exemplary (4): Identifies the main idea or problem with numerous supporting details and examples which are organized logically and coherently.
 Proficient (3): Identifies the main idea or problem with some supporting details and examples in an organized manner.
 Developing (2): Identifies the main idea or problem with few details or examples in a somewhat organized manner.
 Emerging (1): Identifies the main idea or problem poorly with few or no details or states the main idea or problem verbatim from the text.
 Not Present (0): Does not identify the main idea or problem.
II. Analysis: Compare & contrast the available solutions.
 Exemplary (4): Uses specific inductive or deductive reasoning to make inferences regarding premises; addresses implications and consequences; identifies facts and relevant information correctly.
 Proficient (3): Uses logical reasoning to make inferences regarding solutions; addresses implications and consequences; identifies facts and relevant information correctly.
 Developing (2): Uses superficial reasoning to make inferences regarding solutions; shows some confusion regarding facts, opinions, and relevant evidence, data, or information.
 Emerging (1): Makes unexplained, unsupported, or unreasonable inferences regarding solutions; makes multiple errors in distinguishing fact from fiction or in selecting relevant evidence.
 Not Present (0): Does not analyze multiple solutions.
III. Problem Solving: Select & defend your final solution.
 Exemplary (4): Thoroughly identifies and addresses key aspects of the problem and insightfully uses facts and relevant evidence from analysis to support and defend potentially valid solutions.
 Proficient (3): Identifies and addresses key aspects of the problem and uses facts and relevant evidence from analysis to develop potentially valid conclusions or solutions.
 Developing (2): Identifies and addresses some aspects of the problem; develops possible conclusions or solutions using some inappropriate opinions and irrelevant information from analysis.
 Emerging (1): Identifies and addresses only one aspect of the problem but develops an untestable hypothesis; or develops invalid conclusions or solutions based on opinion or irrelevant information.
 Not Present (0): Does not select and defend a solution.
March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Narrowing the Gulf Conference31 Assessment Rubric for CT
IV. Evaluation: Identify weaknesses in your final solution.
 Exemplary (4): Insightfully interprets data or information; identifies obvious as well as hidden assumptions, establishes credibility of sources on points other than authority alone, avoids fallacies in reasoning; distinguishes appropriate arguments from extraneous elements; provides sufficient logical support.
 Proficient (3): Accurately interprets data or information; identifies obvious assumptions, establishes credibility of sources on points other than authority alone, avoids fallacies in reasoning; distinguishes appropriate arguments from extraneous elements; provides sufficient logical support.
 Developing (2): Makes some errors in data or information interpretation; makes arguments using weak evidence; provides superficial support for conclusions or solutions.
 Emerging (1): Interprets data or information incorrectly; supports conclusions or solutions without evidence or logic; uses data, information, or evidence skewed by invalid assumptions; uses poor sources of information; uses fallacious arguments.
 Not Present (0): Does not evaluate data, information, or evidence related to final solution.
V. Synthesis: Suggest ways to improve/strengthen your final solution.
 Exemplary (4): Insightfully relates concepts and ideas from multiple sources; uses new information to enhance final solution; recognizes missing information; correctly identifies potential effects of new information.
 Proficient (3): Accurately relates concepts and ideas from multiple sources; uses new information to enhance final solution; correctly identifies potential effects of new information.
 Developing (2): Inaccurately or incompletely relates concepts and ideas from multiple sources; shallow determination of effect of new information on final solution.
 Emerging (1): Poorly integrates information from more than one source to support final solution; incorrectly predicts the effect of new information on final solution.
 Not Present (0): Does not identify new information for final solution.
March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Narrowing the Gulf Conference32 Assessment Rubric for CT
VI. Reflection: Reflect on your own thought process. “What did you learn from this process?” “What would you do differently next time to improve?”
 Exemplary (4): Identifies strengths and weaknesses in own thinking: recognizes personal assumptions, values and perspectives, compares to others’, and evaluates them in the context of alternate points of view.
 Proficient (3): Identifies strengths and weaknesses in own thinking: recognizes personal assumptions, values and perspectives, compares to others’, with some comparisons of alternate points of view.
 Developing (2): Identifies some personal assumptions, values, and perspectives; recognizes some assumptions, values and perspectives of others; shallow comparisons of alternate points of view.
 Emerging (1): Identifies some personal assumptions, values, and perspectives; does not consider alternate points of view.
 Not Present (0): Does not reflect on own thinking.
March/April 2011
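Although the rubric is applied by the instructor, its structure (six performance elements scored on a 0-4 scale) can be captured as a small data structure for record keeping. The Python sketch below is illustrative only; the element names come from the rubric above, and the student ratings are hypothetical.

LEVELS = {4: "Exemplary", 3: "Proficient", 2: "Developing", 1: "Emerging", 0: "Not Present"}
ELEMENTS = ["Communication", "Analysis", "Problem Solving", "Evaluation", "Synthesis", "Reflection"]

def score_report(scores):
    """scores maps each element name to a level (0-4); returns the total and the level labels."""
    total = sum(scores[e] for e in ELEMENTS)
    labels = {e: LEVELS[scores[e]] for e in ELEMENTS}
    return total, labels

# Hypothetical ratings for one student's sampling project
ratings = {"Communication": 3, "Analysis": 2, "Problem Solving": 3,
           "Evaluation": 2, "Synthesis": 2, "Reflection": 3}
total, labels = score_report(ratings)
print(total, labels["Analysis"])   # 15 Developing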

Incorporating Authentic Assessment in the Classroom 2011 Questions Narrowing the Gulf Conference33 March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics  Alignment of course objectives  Competency, clarity, bias, level of difficulty  Validity and Reliability Narrowing the Gulf Conference34 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics Alignment  Everything needs to align (objectives through assessment) Narrowing the Gulf Conference35 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Competency  Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students.  Assessing lower-division students on graduate-level material is an ‘unfair’ expectation.  The competent student should do well on an assessment; items should not be written for only the top students in the class. Narrowing the Gulf Conference36 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Clarity  Clear, precise items and instructions  Correct grammar, punctuation, and spelling  Each item addresses a single issue  Avoid extraneous material (teaching)  One correct or clearly best answer  Legible copies of the exam Narrowing the Gulf Conference37 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Bias  Tests should be free from bias…  No stereotyping  No gender bias  No racial bias  No cultural bias  No religious bias  No political bias Narrowing the Gulf Conference38 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Level of Difficulty  Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area). Narrowing the Gulf Conference39 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Trivial and Trick Questions  Avoid trivia and tricks  Avoid humorous or ludicrous responses  Items should be straightforward; they should cleanly delineate those who know the material from those who do not  Make sure every item has value and contributes to the final score Narrowing the Gulf Conference40 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics Does one size fit all?  Assessments need to be valid  Assessments need to be reliable Narrowing the Gulf Conference41 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Validity Does the assessment measure what it is supposed to measure?  “Validation is the process of accumulating evidence that supports the appropriateness of inferences that are made of student responses…” (AERA, APA, & NCME, 1999) Narrowing the Gulf Conference42 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Types of Validity Evidence  Content Related - the extent to which a student’s responses to a given assessment reflect that student’s knowledge of the content area  Construct Related - the extent to which the responses being evaluated are appropriate indicators of the underlying construct  Criterion Related - the extent to which the results of the assessment correlate with a current or future event  Consequential – the consequences or use of the assessment results Narrowing the Gulf Conference43 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Reliability Consistency of the assessment scores  Types of reliability… Interrater reliability – the degree to which scores are consistent from instructor to instructor. Intrarater reliability – the degree to which a single instructor’s scores are consistent from paper to paper.  A test can be reliable and not valid, but never valid and not reliable Narrowing the Gulf Conference44 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Reliability Concerns 1. Are the score categories well defined? 2. Are the differences between the score categories clear? 3. Would two independent raters arrive at the same score for a given student response based on the scoring rubric? Narrowing the Gulf Conference45 J March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Improving Scoring Consistency  Provide grading rubrics or scoring criteria to students prior to assessment  Grade papers anonymously  Use anchor papers to define levels of proficiency for reference  Use multiple scorers  Calculate reliability statistics during training and grading Narrowing the Gulf Conference46 J March/April 2011
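For the last bullet, “Calculate reliability statistics during training and grading,” a minimal sketch of two common interrater statistics is shown below: percent agreement and Cohen's kappa. The rubric scores are made up for illustration.

from collections import Counter

# Hypothetical rubric scores assigned by two raters to the same ten papers
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
n = len(rater_a)

# Observed agreement (p_o): proportion of papers on which the raters match exactly
p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement (p_e) from each rater's marginal score frequencies
freq_a, freq_b = Counter(rater_a), Counter(rater_b)
p_e = sum((freq_a[s] / n) * (freq_b[s] / n) for s in set(rater_a) | set(rater_b))

kappa = (p_o - p_e) / (1 - p_e)          # Cohen's kappa corrects raw agreement for chance
print(round(p_o, 2), round(kappa, 2))    # 0.8 0.71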

Incorporating Authentic Assessment in the Classroom 2011 Assessment Basics  Multiple Measures  It is always good to implement multiple measures when possible  Ideally, use both direct and indirect measures of competency Narrowing the Gulf Conference47 C March/April 2011

Incorporating Authentic Assessment in the Classroom 2011 Indirect Methods  “indirect measures …help deepen the interpretation of student learning” (Maki, 2004).  SSI is a good example of an indirect measure. Narrowing the Gulf Conference48 C March/April 2011

Incorporating Authentic Assessment in the Classroom Narrowing the Gulf Conference March/April 2011