Narrowing the Gulf Annual Conference 2010

Similar presentations
Writing constructed response items
Alternate Choice Test Items
Test Taking Strategies
College of Nursing January 2011 Best Practices for Writing Objective Test Items.
TEST TAKING SKILLS. Read and understand all instructions before beginning the test. Look for words like ALL, ONLY, TWO OUT OF THREE Take practice tests.
MATCHING ITEMS Presenter Pema Khandu B.Ed.II(S) Sci ‘B’
Social Science Faculty Meeting January 2010 Mastering the Art of Test Writing Roundtable Discussion.
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Test Writing: Moving Away from Publisher Material
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 9 Subjective Test Items.
Classroom Assessment A Practical Guide for Educators by Craig A
RELIABILITY BY DESIGN Prepared by Marina Gvozdeva, Elena Onoprienko, Yulia Polshina, Nadezhda Shablikova.
Module 6 Test Construction & Evaluation. Lesson’s focus: Stages in Test Construction, Tasks in Test Construction, Test Evaluation.
Narrowing the Gulf Annual Conference 2010 March 2010 Mastering the Art of Writing Objective Test Items.
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
Designing and evaluating good multiple choice items Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information.
Tips for Top Tests FOSL fall September 10, 2007 Adapted from “Tools for Teaching” by Barbara Gross Davis.
Completion, Short-Answer, and True-False Items
CONSTRUCTING OBJECTIVE TEST ITEMS: MULTIPLE-CHOICE FORMS CHAPTER 8 AMY L. BLACKWELL JUNE 19, 2007.
Multiple Choice Question Design Karen Brooks & Barbara Tischler Hastie.
Exam Taking Kinds of Tests and Test Taking Strategies.
Prepare and Use Knowledge Assessments. IntroductionIntroduction Why do we give knowledge tests? What problems did you have with tests as a student? As.
Session 2 Traditional Assessments.
Writing Test Blueprints and Test Items “Software for Creating and Delivering Assessments With Powerful Reports”
Test Taking Strategies. Prepare to avoid errors: Analyze your past results and errors Arrive early and prepared for tests Be familiar with exam question.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
2009 Professional Development Day October 2009 Mastering the Art of Test Writing.
Writing Multiple Choice Questions. Types Norm-referenced –Students are ranked according to the ability being measured by the test with the average passing.
Assessment and Testing
March 11, 2013 Chicago, IL American Board of Preventive Medicine Clinical Informatics Examination Committee Measurement.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
Do not on any account attempt to write on both sides of the paper at once. W.C.Sellar English Author, 20th Century.
EVALUATION SUFFICIENCY Types of Test Items (Part I)
University of Baltimore Test Development Solutions (TDS) Thomas Fiske, M.S. - Test Development Team Lead Charles Glover, M.S. - Test Developer; Diann M.
Multiple-Choice Item Design February, 2014 Dr. Mike Atkinson, Teaching Support Centre, Assessment Series.
Assessment and the Institutional Environment Context Institutional Mission, vision and values Intended learning and Educational Experiences Impact Educational.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
Good for: knowledge level content; evaluating student understanding of popular misconceptions; concepts with two logical responses.
Assessment in Education ~ What teachers need to know.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Writing Selection Items
How to Use These Modules 1. Complete these modules with your grade level and/or content team. 2. Print the note taking sheets. 3. Read the notes as you view.
Muhammad Riaz Anjum Nasir Mahmood Irum Bajwa
Under Test Condition Techniques The Learning Centre Semester
GUST 1270 College and Career Planning
Writing Selection Items Multiple Choice
EDU 385 Session 8 Writing Selection items
Classroom test and Assessment
Exam Technique.
ENGLISH TEST 45 Minutes – 75 Questions
TEST taking strategies
Preparing for the Verbal Reasoning Measure
Greg Miller Iowa State University
Do Now: How do you learn best? Do you like to look at pictures
Constructing Exam Questions
Classification of Tests Chapter # 2
Summative Assessment Grade 6 April 2018 Develop Revise Pilot Analyze
Improving Test Taking Strategies
Multiple Choice Item (MCI) Quick Reference Guide
Testing Receptive Skills
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 8 Objective Test Items.
Test Taking Strategies
Multiple-Choice and Matching Exercises
General Multiple Choice Strategies
Tests are given for 4 primary reasons.
“QA” = quality assurance
Constructing a Test We now know what makes a good question:
Presentation transcript:

Narrowing the Gulf Annual Conference 2010 Mastering the Art of Writing Objective Test Items March 2010

Writing Objective Test Items Presenter: Dr. James Coraggio, Director, Academic Effectiveness and Assessment. Former Life… Director of Test Development, SMT; Director of Measurement and Test Development, Pearson; Taught EDF 4430 Measurement for Teachers, USF.

Purpose Are your students learning the course content, or are they just good test takers? This presentation will explain how to create effective multiple choice test questions. It will provide item-writing guidelines as well as best practices to prevent students from simply guessing the correct answers.

Agenda Purpose of a Test Advantages of Objective Tests Types of Objective Tests Writing Multiple Choice Items The Test-wise Student Test Instructions Test Validity

Purpose of a Test ‘Clearly delineate between those who know the content and those who do not.’ The purpose of an assessment is to determine whether the student knows the content, not whether the student is a good test-taker. Likewise, confusing and tricky questions should be avoided so that students who know (and understand) the material are not led into incorrect responses.

Objective Tests Measure several types (and levels) of learning Wide content coverage in a short period of time Variations for flexibility Easy to administer, score, and analyze Scored more reliably and quickly What type of learning cannot be measured?

Types of Objective Tests Written-response Completion (fill-in-the-blank) Short answer Selected-response Alternative response (two options) Matching Keyed (like matching) Multiple choice

Written-response Single questions/statements or clusters (stimuli) Advantages Measure several types of learning Minimizes guessing Points out student misconceptions Disadvantages Time to score Misspelling and writing clarity Incomplete answers More than one possible correct response (novel answers) Subjectivity in grading

Completion A word that describes a person, place or thing is a ________. Remove only ‘key’ words Blanks at end of statement Avoid multiple correct answers Eliminate clues Paraphrase statements Use answer sheets to simplify scoring

Short Answer Briefly describe the term proper noun. ____________________________ Terminology – Stimulus and Response Provide an appropriate blank (word(s) or sentence). Specify the units (inches, dollars) Ensure directions for clusters of items are appropriate for all items

Selected-response Select from provided responses Advantages Measure several types of learning Measures ability to make fine distinctions Administered quickly Cover wide range of material Reliably scored Multiple scoring options (hand, computer, scanner) Disadvantages Allows guessing Distractors can be difficult to create Student misconceptions not revealed
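The ‘reliably scored’ advantage comes from the fact that selected-response answers can be checked mechanically against a single key, whether by hand, scanner, or computer. A minimal sketch of that idea in Python (the answer key and student responses are made-up examples, not data from this presentation):

    def score_test(answer_key, responses):
        # Every scorer applying the same key gets the same result, which is
        # where the reliability of selected-response scoring comes from.
        correct = sum(1 for q, key in answer_key.items() if responses.get(q) == key)
        return correct, len(answer_key)

    answer_key = {1: "A", 2: "C", 3: "B", 4: "D"}
    student = {1: "A", 2: "C", 3: "D", 4: "D"}  # missed question 3
    correct, total = score_test(answer_key, student)
    print(f"{correct}/{total} correct")  # prints 3/4 correct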

Alternative Response T F 1. A noun is a person, place, or thing. T F 2. An adverb describes a noun. Explain the judgments to be made Ensure answer choices match Explain how to answer Only one idea to be judged Positive wording Avoid trickiness, clues, qualifiers

Matching Item Column A Column B __Person, place, or thing. a. Adjective __Describes a person, place, or thing. b. Noun Terminology – premises and responses Clear instructions Homogeneous premises Homogeneous responses (brief and ordered) Avoid one-to-one matching

Keyed Response Responses a. A noun b. A pronoun c. An adjective d. An adverb ___Person, place, or thing. ___Describes a person, place, or thing. Like matching items, but with more response options

MC Item Format What is the part of speech that is used to name a person, place, or thing? A) A noun* B) A pronoun C) An adjective D) An adverb

MC Item Terminology Stem: Sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response. Options: The possible responses, consisting of one and only one correct answer. Key: The correct response. Distractor: A wrong response; plausible but not correct, and attractive to an under-prepared student.
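As a concrete illustration of this terminology, here is a minimal sketch (Python, with hypothetical field names) of how an MC item might be represented: the stem, the full set of options, the key, and the distractors derived as every option that is not the key.

    from dataclasses import dataclass

    @dataclass
    class MCItem:
        stem: str      # the question or incomplete thought
        options: dict  # letter -> option text; exactly one is correct
        key: str       # letter of the one correct response

        def distractors(self):
            # Distractors: plausible but incorrect options
            return {k: v for k, v in self.options.items() if k != self.key}

    item = MCItem(
        stem="What is the part of speech that is used to name a person, place, or thing?",
        options={"A": "A noun", "B": "A pronoun", "C": "An adjective", "D": "An adverb"},
        key="A",
    )
    print(item.distractors())  # {'B': 'A pronoun', 'C': 'An adjective', 'D': 'An adverb'}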

Competency Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students. Assessing lower-division students on graduate-level material is an ‘unfair’ expectation. The competent student should do well on an assessment; items should not be written for only the top students in the class.

Clarity Clear, precise item and instruction Correct grammar, punctuation, spelling Address one single issue Avoid extraneous material (teaching) One correct or clearly best answer Legible copies of exam

Bias Tests should be free from bias… No stereotyping No gender bias No racial bias No cultural bias No religious bias No political bias

Level of Difficulty Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).

Level of Difficulty To make an M/C item more difficult, make the stem more specific or narrow and the options more similar. To make an M/C item less difficult, make the stem more general and the options more varied.
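A common way to quantify difficulty is the item difficulty index: the proportion of examinees who answer an item correctly. The sketch below (Python; the 0.30–0.80 target band is an illustrative assumption, not a figure from this presentation) computes that index from scored responses and flags items that come out too easy or too hard.

    def difficulty_index(item_scores):
        # Proportion answering correctly (1 = correct, 0 = incorrect)
        return sum(item_scores) / len(item_scores)

    def flag_items(score_matrix, low=0.30, high=0.80):
        # Flag items whose difficulty falls outside an assumed mid-range target
        flags = {}
        for item_id, scores in score_matrix.items():
            p = difficulty_index(scores)
            if p < low:
                flags[item_id] = f"too hard (p = {p:.2f})"
            elif p > high:
                flags[item_id] = f"too easy (p = {p:.2f})"
        return flags

    scores = {1: [1, 0, 1, 1, 0, 1], 2: [0, 0, 1, 0, 0, 1], 3: [1, 1, 1, 1, 1, 1]}
    print(flag_items(scores))  # item 3 is flagged as too easy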

Trivial and Trick Questions Avoid trivia and tricks Avoid humorous or ludicrous responses Items should be straightforward; they should cleanly delineate those who know the material from those who do not Make sure every item has value and contributes to the final score

Test Taking Guidelines http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf When you don't know the answer: As with all exams, attempt the questions that are easiest for you first. Come back and do the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank. Make a calculated guess if you are sure you don't know the answer. Here are some tips to help you guess ‘intelligently’. Use a process of elimination: Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are the options in the right range? Is the measurement unit correct? Does it sound reasonable?

Test Taking Guidelines http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Look for grammatical inconsistencies In extension-type questions a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses. If words are repeated, the option is worth considering. e.g.: The apparent distance hypothesis explains… b) The distance between the two parallel lines appears…

Test Taking Guidelines http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Be wary of options containing definitive words and generalizations Because they can’t tolerate exceptions, options containing words like ‘always’, ‘only’, ‘never’, ‘must’ tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often. Favor look-alike options If two of the alternatives are similar, give them your consideration. e.g.: A. tourism consultants B. tourists C. tourism promoters D. fairy penguins

Test Taking Guidelines http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Favor numbers in the mid-range If you have no idea what the real answer is, avoid extremes. Favor more inclusive options If in doubt, select the option that encompasses others. e.g.: A. an adaptive system B. a closed system C. an open system D. a controlled and responsive system E. an open and adaptive system. Please note: None of these strategies is foolproof, and they do not apply equally to the different types of multiple choice questions, but they are worth considering when you would otherwise leave a blank.

Test-wise Students Are familiar with item formats Use informed and educated guessing Avoid common mistakes Have testing experience Use time effectively Apply various strategies to solve different problem types

Test-wise Students Vary your keys (counters the ‘always pick option C’ strategy) Avoid ‘all of the above’ and ‘none of the above’ Avoid extraneous information: it may assist in answering another item Avoid item ‘bad pairs’ or ‘enemies’ Avoid clueing with the same word in the stem and the key
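One practical way to follow the ‘vary your keys’ advice is to tally how often each option letter serves as the key across a test form and flag positions that dominate. A minimal sketch, assuming the keys are stored as a simple list of letters (the 1.5x threshold is an arbitrary illustrative cutoff):

    from collections import Counter

    def key_balance(keys, options=("A", "B", "C", "D")):
        # Count how often each option letter serves as the key
        counts = Counter(keys)
        expected = len(keys) / len(options)  # roughly even use of each position
        return {letter: (counts.get(letter, 0),
                         "over-used" if counts.get(letter, 0) > 1.5 * expected else "ok")
                for letter in options}

    form_keys = ["C", "C", "A", "C", "B", "C", "C", "D", "C", "C"]
    for letter, (used, note) in key_balance(form_keys).items():
        print(letter, used, note)  # option C is flagged as over-used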

Test-wise Students Make options similar in terms of length, grammar, and sentence structure. Options that look different stand out. Avoid ‘clues’.

Item Format Considerations Put the information in the stem Avoid negatively stated stems and qualifiers Highlight qualifiers if used Avoid irrelevant symbols (“&”) and jargon Use a standard, set number of options (prefer only four) Ideally, tie each item to a reference

Test Directions Highlight Directions State the skill measured. Describe any resource materials required. Describe how students are to respond. Describe any special conditions. Time limits, if any

Ensure Test Validity Congruence between items and course objectives Congruence between items and student characteristics Clarity of items Accuracy of the measures Item formatting criteria Feasibility: time, resources
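One way to check the first point, congruence between items and course objectives, is to keep a simple test blueprint that maps each item to the objective it measures and then list objectives left uncovered. A minimal sketch with made-up objective labels:

    def coverage_report(blueprint, objectives):
        # blueprint maps item numbers to the objective each item measures
        covered = {}
        for item_id, objective in blueprint.items():
            covered.setdefault(objective, []).append(item_id)
        gaps = [obj for obj in objectives if obj not in covered]
        return covered, gaps

    objectives = ["Identify parts of speech", "Use correct punctuation", "Revise run-on sentences"]
    blueprint = {1: "Identify parts of speech", 2: "Identify parts of speech", 3: "Use correct punctuation"}
    covered, gaps = coverage_report(blueprint, objectives)
    print("No items written for:", gaps)  # ['Revise run-on sentences']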

Questions?

Narrowing the Gulf Annual Conference 2010 Mastering the Art of Writing Objective Test Items March 2010