Evidence & Preference: Bias in Scoring
TEDS-M Scoring Training Seminar, Miami Beach, Florida

Bias
Bias is part of being a thinking person. Bias is part of the way we see the world. Bias is our personal way of interacting with the world. Bias is built into the issues that are important to us. Bias is part of our disagreements with others; sometimes we don't agree with anyone. Bias informs how we evaluate our own work.

Judgments
We make judgments throughout the day about how we conduct ourselves and how we interact with others, about how we do our work and how we evaluate the work of others. In part, our judgments originate in assumptions formed through experience. As we grow, we receive messages about competence, accomplishment, ways of talking and behaving, appearance, gender, race, and class.

Influence
We use certain assumptions, or biases, often unconsciously, to make judgments that let us understand our surroundings in our own terms. Each of us prefers specific approaches and strategies for doing quality work, and we believe we know quality work and effort when we see them. We often critique in others the characteristics we do not like to see in ourselves.

Evidence
As we score responses, particularly those on the pedagogy-related tasks, we need to separate our personal biases from the evaluation of evidence of understanding, knowledge, and skill. A focus on the evidence provided in a response yields scores that are meaningful and useful for the interpretations the study will draw from them.

Bias-Free Evaluation
Not possible. We all hold specific biases, and that is in part a good thing: they hold us accountable. But at times our biases interfere with our ability to understand situations and other people, and at times they interfere with fair assessment of the performance of others.

Validity
The degree to which we have evidence that the inferences, interpretations, and uses of assessment results are meaningful, appropriate, and useful. Validity is an argument we develop to support our uses of assessment results. In a valid assessment, variation in scores is construct-relevant.
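One way to make that last point concrete, borrowed from general measurement theory rather than from the TEDS-M materials themselves, is to decompose observed score variance (assuming roughly uncorrelated components) into a construct-relevant part, a construct-irrelevant part such as scorer preference, and random error:

$$\sigma^2_{\text{observed}} \;=\; \sigma^2_{\text{construct}} \;+\; \sigma^2_{\text{irrelevant}} \;+\; \sigma^2_{\text{error}}$$

Validity is strongest when the first term dominates; scorer bias inflates the second.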

Toward Fair Assessment
We can identify our own biases. We can learn how biases interfere with our assessment of performance. We can see how bias affects judgment. We can separate bias from judgment.

Scoring & Professional Development
Scoring provides a great opportunity for professional development. However, scoring is not about you. Scoring is about assessing Future Teacher performance, and about identifying aspects of understanding, knowledge, skill, and competence.

Scoring Tools
The scoring system is designed to guide scoring toward evidence and to focus scorers on the nature of mathematics and mathematics pedagogy. Scoring training, scoring guides, and example responses are provided to focus judgments. Opportunities to uncover personal biases allow us to attend to them and keep them under control.

Skills for Teaching Mathematics
What WOULD you like to see in responses to pure or contextual mathematics problems? What WOULD you like to see in responses to problems seeking mathematical knowledge for teaching? How do you know that a teacher is prepared to teach mathematics to diverse students?

Bias Training Tool (1)
Developing a list of personal preferences allows us to identify and control their impact. The list can be started in the initial training session, and scorers can add to it as they continue scoring. It should be in front of us as we score.

Bias Training Tool (2)
Develop a list of the teaching practices that we prefer, or try to teach, in our profession. Identify those that are construct-relevant and those that are construct-irrelevant, for example in a two-column worksheet:

Teaching practices I like    Teaching practices I don't like
1.                           1.
2.                           2.

Bias Training Tool (3)
Create a list of characteristics of people who are different from us. How might these characteristics be evident in responses, or affect responses? Create a list of characteristics of competence in teaching mathematics. Which of these characteristics are superficial and should be guarded against when scoring?

Biases About Written Responses
Issues: grammar and spelling; organization; failure to follow directions; concise writing and response length; missing words; descriptive writing.
Interpretations: careless or lazy; poorly educated; does not want to demonstrate limited knowledge; limited evidence; little to say, covering up a lack of knowledge.

More Biases About Written Responses
Use of a non-standard form of the language; fluent writing; drawing ability; providing grade-level examples.

Interpreting the Scoring Guide
Some scorers will read the scoring guide literally, word for word, with no room for interpretation. Others will read it with a great deal of latitude.

Consistency & Fairness
Consistency is often seen as an important characteristic of fairness. Consistency improves comparability. But consistency may require a level of agreement that cannot be achieved, and it can restrict the opportunity to fairly judge novel, unique, or culturally appropriate approaches.
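Consistency between scorers is typically quantified as inter-rater agreement. The sketch below is illustrative only (the TEDS-M reliability procedures are not described in these slides): it computes Cohen's kappa, a chance-corrected agreement statistic, for two scorers who double-scored the same set of responses.

```python
from collections import Counter

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected agreement between two scorers on the same responses.

    scores_a, scores_b: equal-length lists of score codes
    (e.g. rubric levels 0/1/2) for the same set of responses.
    """
    assert len(scores_a) == len(scores_b) and len(scores_a) > 0
    n = len(scores_a)

    # Observed agreement: fraction of responses given the same score.
    p_o = sum(a == b for a, b in zip(scores_a, scores_b)) / n

    # Agreement expected by chance, from each scorer's marginal score rates.
    # (Undefined if both scorers each used only one identical code.)
    freq_a, freq_b = Counter(scores_a), Counter(scores_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)

    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores from two trained scorers on ten responses.
scorer_1 = [2, 1, 0, 2, 1, 1, 2, 0, 1, 2]
scorer_2 = [2, 1, 1, 2, 1, 0, 2, 0, 1, 2]
print(f"kappa = {cohens_kappa(scorer_1, scorer_2):.2f}")  # kappa = 0.69
```

Raw percent agreement overstates consistency when most responses fall into a single score category; kappa discounts the agreement the two scorers would reach by chance alone.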

Training
Provide an opportunity for scorers to identify their own biases. Provide time for discussion of how biases may influence scoring. Discuss methods for reducing the degree to which construct-irrelevant influences affect scores.
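One simple monitoring method, offered here as a generic illustration rather than a procedure taken from the TEDS-M design, is to track each scorer's severity: if responses are allocated to scorers at random, a scorer whose average score drifts far from the pool average may be applying construct-irrelevant preferences.

```python
from statistics import mean

# Hypothetical double-scored batch: scorer id -> scores awarded.
# Assumes responses were randomly allocated, so batches are comparable.
scores_by_scorer = {
    "scorer_A": [2, 1, 2, 2, 1, 2],
    "scorer_B": [1, 0, 0, 1, 1, 1],
    "scorer_C": [2, 1, 1, 2, 1, 2],
}

# Pool mean across all scorers serves as the reference point.
pool_mean = mean(s for scores in scores_by_scorer.values() for s in scores)

# A scorer whose mean sits far from the pool may be scoring severely
# or leniently; the 0.5 threshold is an arbitrary cue for discussion.
for scorer, scores in scores_by_scorer.items():
    deviation = mean(scores) - pool_mean
    flag = "  <- review" if abs(deviation) > 0.5 else ""
    print(f"{scorer}: mean {mean(scores):.2f}, deviation {deviation:+.2f}{flag}")
```

A flagged scorer is not necessarily wrong; the point is to surface the pattern so it can be discussed against the scoring guide rather than left to operate unnoticed.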