Education 325: Assessment for Classroom Teaching. G. Galy, PhD. Week 5.


Evaluation: a judgment, the determination of worth. Common evaluation tools: exams, tests, quizzes. Let's look at one evaluation tool in detail: teacher-made tests.

Teacher-made Tests

Valid and Reliable: How do we make our tests valid and reliable?

Validity and Reliability

Validity: “Measuring what we set out to test, so that we can make sound judgments.” Validity is never 100% achievable.

Process Validity: “Getting consistent results.” Factors that can affect process validity include: directions, vocabulary, time, difficulty, clues, scoring (unclear criteria), arrangement, emotions/feelings, and cheating.

Content Validity: “Alignment.” Factors that can affect content validity: irregular ‘spot checks’, gaps (material taught but left untested), alignment (teach it = test it), and appropriate value and weighting.
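The “appropriate value and weighting” point can be made concrete with a simple table of specifications. A minimal sketch in Python, with invented topic names, weights, and mark allocations, that flags untested outcomes and topics whose share of the marks drifts from their share of the teaching:

```python
# Hypothetical blueprint check: does the mark allocation match the
# instructional emphasis? All topic names and numbers are invented.
taught_weight = {          # share of instructional time per outcome
    "photosynthesis": 0.40,
    "cell respiration": 0.35,
    "plant anatomy": 0.25,
}
test_marks = {             # marks assigned to each outcome on the test
    "photosynthesis": 10,
    "cell respiration": 4,
    "plant anatomy": 6,
}

total = sum(test_marks.values())
for topic, weight in taught_weight.items():
    marks = test_marks.get(topic, 0)
    if marks == 0:
        print(f"GAP: '{topic}' was taught but is untested")
        continue
    share = marks / total
    if abs(share - weight) > 0.10:      # arbitrary 10-point tolerance
        print(f"MISMATCH: '{topic}' is {share:.0%} of the test "
              f"but {weight:.0%} of the teaching")
```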

Predictive Validity: “Future performance.” Minimal competencies: pass or fail? (e.g., a driver's licence test). Does a pass indicate mastery? 80%? Is there a pattern of achievement and performance? How confidently can you predict success for this student in the next grade, or the next course?
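One common way to put a number on predictive validity is to correlate current test scores with a later measure of success, such as grades in the next course, for the same students. A minimal sketch with invented scores (statistics.correlation requires Python 3.10+):

```python
# Hypothetical predictive-validity check: correlate this year's test
# scores with next-course grades for the same students (invented data).
from statistics import correlation  # Pearson's r; Python 3.10+

test_scores = [62, 71, 80, 55, 90, 68, 74]
next_course = [58, 75, 78, 52, 88, 70, 69]

r = correlation(test_scores, next_course)
print(f"Predictive validity coefficient: r = {r:.2f}")
# The closer r is to 1.0, the more confidently the test predicts later
# performance; values near 0 mean the test tells you little.
```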

Consequential Validity: “The ability to measure accurately, free of outside influence.” Threats include test anxiety, competitive stress, and too much teaching to the test rather than teaching for learning.

Construct Validity: “Attitudes, values, or beliefs” such as creativity, teamwork, problem-solving, safety-mindedness, and higher-order thinking. Does your test REALLY measure these? How do you measure such things? How do you accurately measure a mental process?

Reliability: “Produces consistent results, even if given to the same group a second time.” Common characteristics: re-test reliability and inter-rater reliability.
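Both characteristics can be estimated with simple statistics: re-test reliability as the correlation between two administrations of the same test, and inter-rater reliability as the level of agreement between two markers. A minimal sketch with invented data (statistics.correlation requires Python 3.10+):

```python
# Hypothetical reliability estimates; all data is invented for illustration.
from statistics import correlation  # Pearson's r; Python 3.10+

# Re-test reliability: same students, same test, two occasions.
first_attempt  = [70, 82, 65, 90, 58, 77]
second_attempt = [72, 80, 63, 88, 61, 75]
print(f"Re-test reliability: r = {correlation(first_attempt, second_attempt):.2f}")

# Inter-rater reliability: two markers score the same six essays.
rater_a = [4, 3, 5, 2, 4, 3]
rater_b = [4, 3, 4, 2, 4, 3]
agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Inter-rater agreement: {agreement:.0%}")
```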

Writing the Test: Write valid and reliable items. Arrange items in a clear, sequential order.

Alternate Response: the same item in two formats.
True/False: Bonus points can distort achievement. T F
Alternate choice: Achievement is distorted by ____________. (A) Bonus points (B) Avion Miles

Alternate Response
Pros: tests recall-level content; quick to complete.
Cons: not easy to write without ambiguity; unimportant details tend to be tested; promotes guessing; not good with complex material.

Alternate Response Guideline 1: Word the statement clearly.
Poor example: “Achievement may be distorted by bonus marks.”
Better example: “Achievement is distorted by bonus marks.”
Note: broad generalizations are rarely wrong, so they act as cues for test-wise students. Qualifiers such as ‘usually’, ‘generally’, ‘often’, and ‘sometimes’ cue TRUE; absolutes such as ‘all’, ‘never’, ‘none’, ‘only’, and ‘always’ cue FALSE. (A quick scan for such cue words is sketched below.)
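Because qualifiers and absolutes hand test-wise students a free hint, it can help to scan draft items for them before assembling the test. A minimal sketch using the cue words named above (the example item is invented):

```python
# Hypothetical cue-word scan: flag wording that lets test-wise students
# guess a True/False item without knowing the content.
QUALIFIERS = {"usually", "generally", "often", "sometimes"}   # tend to cue TRUE
ABSOLUTES  = {"all", "never", "none", "only", "always"}       # tend to cue FALSE

def flag_cues(statement: str) -> list[str]:
    words = {w.strip(".,;:").lower() for w in statement.split()}
    return sorted(words & (QUALIFIERS | ABSOLUTES))

print(flag_cues("All achievement is always distorted by bonus marks."))
# -> ['all', 'always']  (rewrite to remove the giveaway absolutes)
```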

Alternate Response Guideline 2: Avoid testing trivial or pointless information. Each item should test an important learning outcome. Ask: is the answer obvious? If so, it will undermine the content validity of the test.

Alternate Response Guideline 3: Avoid negative statements and double negatives. Example: “Neither the bonding nor the adhesive surface should be handled with an ungloved hand.” If you must use ‘not’, bold, underline, or italicize it. Double negatives (not, no, none, never, neither, nor) are confusing.

Alternate Response Guideline 4: Keep statements short; avoid marathon statements. Example: “Large bubbles in local anaesthetic carpules, with or without plungers that extend beyond the end of the carpule, may be formed from dissolved gases in the solution. These bubbles are harmless.” T F

Alternate Response Guideline 5: Avoid double-barreled statements, which ask two things at once so that a single True/False answer is ambiguous. Example: “Parks and protected areas should be maintained for the use and benefit of all British Columbians.”

Alternate Response Guideline 6: Avoid statements that contain lists. If you use one, you cannot tell whether the student knew the whole answer or guessed. Example: “Stockton, Jordan, and Robinson are in the NBA Hall of Fame.”

Alternate Response Guideline 7: Avoid trickery. Don't insert insignificant detail to deceive the student. Example: “The area of a rectangle 4 m x 3 m equals 12 square centimeters.” T F (The statement is false only because of the sneaky unit switch: 4 m x 3 m = 12 square meters, not square centimeters.)

Alternate Response Guideline 8: Have an equal number of True and False items. Why? Students tend to guess True more often than False, so skewing the balance will unduly favour or penalize some students.

Alternate Response Guideline 9: Keep True and False statements similar in length. The longer statement is most often true, largely because it contains additional qualifying detail that makes it more accurate. If possible, keep all items short and the same length. (A quick check of guidelines 8 and 9 is sketched below.)
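Guidelines 8 and 9 are easy to check mechanically once an answer key is drafted. A minimal sketch with an invented item bank that reports the True/False balance and the average statement length for each key:

```python
# Hypothetical sanity check for guidelines 8 and 9: is the answer key
# balanced, and are True and False statements of similar length?
# Items and keys are invented for illustration.
items = [
    ("Achievement is distorted by bonus marks.", True),
    ("Validity can always be fully achieved.", False),
    ("Reliability means a test produces consistent results.", True),
    ("The longer statement on a test is always false.", False),
]

true_items  = [text for text, key in items if key]
false_items = [text for text, key in items if not key]
print(f"True items: {len(true_items)}  False items: {len(false_items)}")

avg_true  = sum(len(t.split()) for t in true_items)  / len(true_items)
avg_false = sum(len(t.split()) for t in false_items) / len(false_items)
print(f"Average length (words) -- True: {avg_true:.1f}  False: {avg_false:.1f}")
```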

“Fill in the Blank” (Short Answer) Guidelines:
1. The blank is located at the end of the sentence.
2. Only important words are omitted.
3. No more than two blanks per statement.
4. Each blank asks for a single idea.

“Fill in the Blank” Guidelines (continued):
5. Blanks must be the same length.
6. Only an ‘informed’ person should be able to complete the answer.
(A quick formatting check for these rules is sketched below.)
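Several of these formatting rules can be checked automatically before the test is assembled. A minimal sketch covering guidelines 1, 3, and 5 (blank at the end, at most two blanks, blanks the same length), with invented items:

```python
# Hypothetical formatting check for fill-in-the-blank items against
# guidelines 1, 3, and 5. The example items are invented.
import re

BLANK = re.compile(r"_{3,}")   # a run of three or more underscores

items = [
    "The capital of British Columbia is ____________.",
    "____________ is distorted by bonus points.",      # blank not at the end
    "A ____ has ____ sides and ____ corners.",         # too many blanks
]

for item in items:
    blanks = BLANK.findall(item)
    problems = []
    if not item.rstrip(" .").endswith("_"):
        problems.append("blank is not at the end")
    if len(blanks) > 2:
        problems.append(f"{len(blanks)} blanks (max 2)")
    if len({len(b) for b in blanks}) > 1:
        problems.append("blanks are different lengths")
    print(item, "->", "; ".join(problems) or "OK")
```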