
1 College of Nursing January 2011 Best Practices for Writing Objective Test Items

2 Writing Objective Test Items
Presenter: Dr. James Coraggio, Director, Academic Effectiveness and Assessment
Contributor: Alisha Vitale, Collegewide Testing Coordinator

3 Writing Objective Test Items
Former Life…
- Director of Test Development, SMT
- Director of Measurement and Test Development, Pearson
- Taught EDF 4430 Measurement for Teachers, USF

4 Purpose
- This presentation will address the importance of establishing a test purpose and developing test specifications.
- It will explain how to create effective multiple-choice test questions.
- It will also provide item-writing guidelines and best practices that keep students from simply guessing the correct answers.

5 Agenda
- Purpose of a Test
- Prior to Item Writing
- Advantages of Objective Tests
- Types of Objective Tests
- Writing Multiple Choice Items
- The Test-wise Student
- Test Instructions
- Test Validity

6 Purpose of a Test
- To clearly delineate between those who know the content and those who do not.
- To determine whether the student knows the content, not whether the student is a good test-taker.
- Accordingly, confusing and tricky questions should be avoided so that students who know (and understand) the material are not pushed into incorrect responses.

7 Prior to Writing Items
- Establish the test purpose
- Conduct the role delineation study/job analysis
- Create the test specifications

8 Establish the Test Purpose
Initial questions:
- How will the test scores be used?
- Will the test be designed for minimum competency or content mastery?
- Will the test be low-stakes, moderate-stakes, or high-stakes (consequences for examinees)?
- Will the test address multiple levels of thinking (higher order, lower order, or both)?
- Will there be time constraints?

9 Establish the Test Purpose
- Responses to those initial questions have implications for:
  - the overall length of the test,
  - the average difficulty of the items,
  - the conditions under which the test will be administered, and
  - the type of score information to be provided.
- Take the time to establish a singular purpose that is clear and focused so that goals and priorities will be effectively met.

10 Conduct the Job Analysis
- The primary purpose of a role delineation study or job analysis is to provide a strong linkage between the competencies necessary for successful performance on the job and the content on the test.
- This work has already been conducted for the National Council Licensure Examination for Registered Nurses. [See Report of Findings from the 2008 RN Practice Analysis: Linking the NCLEX-RN® Examination to Practice, NCSBN, 2009]

11 Create Test Specifications
- Test specifications are essentially the 'blueprint' used to create the test.
- Test specifications operationalize the competencies that are being assessed.
- The NCLEX-RN® Examination has established test specifications. [See 2010 NCLEX-RN® Detailed Test Plan, April 2010, Item Writer/Item Reviewer/Nurse Educator Version]

12-14 Create Test Specifications [example specification tables shown on these slides; images not captured in the transcript]

15 Create Test Specifications
Test specifications:
- Support the validity of the examination
- Provide standardized content across administrations
- Allow for subscores that can provide diagnostic feedback to students and administrators
- Inform the student (and the item writers) of the required content
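Because a blueprint fixes how many items each content area gets, it can be treated as a simple data structure and checked automatically against a draft item bank. A minimal sketch, assuming items are tagged with a content area; the areas and counts below are illustrative and are not taken from the NCLEX-RN test plan:

```python
# Hypothetical blueprint: content area -> number of items required.
blueprint = {
    "Safe and Effective Care Environment": 10,
    "Health Promotion and Maintenance": 6,
    "Physiological Integrity": 12,
    "Psychosocial Integrity": 4,
}

# Draft item bank: each item is represented only by its content-area tag.
items = (
    ["Safe and Effective Care Environment"] * 10
    + ["Health Promotion and Maintenance"] * 5   # one item short of the spec
    + ["Physiological Integrity"] * 12
    + ["Psychosocial Integrity"] * 4
)

def coverage_gaps(blueprint, items):
    """Return areas where the item count differs from the spec,
    mapped to (actual, required)."""
    counts = {area: items.count(area) for area in blueprint}
    return {area: (counts[area], needed)
            for area, needed in blueprint.items()
            if counts[area] != needed}

print(coverage_gaps(blueprint, items))
# {'Health Promotion and Maintenance': (5, 6)}
```

A check like this is how the "standardized content across administrations" bullet is enforced in practice: every form of the test is validated against the same blueprint before it is assembled.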

16 Item Development
- After developing the test specifications, item development can begin.
- The focus of the remainder of this presentation will be on creating 'appropriate' objective items.

17 Objective Tests
- Measure several types (and levels) of learning
- Cover wide content in a short period of time
- Offer variations for flexibility
- Are easy to administer, score, and analyze
- Are scored more reliably and quickly
- What type of learning cannot be measured?
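The "easy to score" advantage comes from the fact that objective scoring reduces to comparing each response against a fixed key, with no judgment involved. A minimal sketch of that idea; the key and response data are invented for illustration:

```python
def score(key, responses):
    """Score an objective test: one point per response that matches the key."""
    return sum(1 for k, r in zip(key, responses) if k == r)

key       = ["A", "C", "B", "D", "A"]
student_1 = ["A", "C", "B", "A", "A"]  # misses item 4
student_2 = ["A", "C", "B", "D", "A"]  # all correct

print(score(key, student_1))  # 4
print(score(key, student_2))  # 5
```

Because the rule is mechanical, the same responses always produce the same score whether marked by hand, computer, or scanner, which is exactly why objective formats score more reliably than written-response formats.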

18 Types of Objective Tests
- Written-response
  - Completion (fill-in-the-blank)
  - Short answer
- Selected-response
  - Alternative response (two options)
  - Matching
  - Keyed (like matching)
  - Multiple choice

19 Written-response
Single questions/statements or clusters (stimuli)
- Advantages
  - Measure several types of learning
  - Minimize guessing
  - Point out student misconceptions
- Disadvantages
  - Time to score
  - Misspelling and writing clarity
  - Incomplete answers
  - More than one possible correct response (novel answers)
  - Subjectivity in grading

20 Completion
A word that describes a person, place, or thing is a ________.
1. Remove only 'key' words
2. Place blanks at the end of the statement
3. Avoid multiple correct answers
4. Eliminate clues
5. Paraphrase statements
6. Use answer sheets to simplify scoring

21 Short Answer
Briefly describe the term proper noun. ____________________________
Terminology: stimulus and response
1. Provide an appropriate blank (word(s) or sentence).
2. Specify the units (inches, dollars).
3. Ensure directions for clusters of items are appropriate for all items in the cluster.

22 Selected-response
Select from provided responses
- Advantages
  - Measure several types of learning
  - Measure the ability to make fine distinctions
  - Administered quickly
  - Cover a wide range of material
  - Reliably scored
  - Multiple scoring options (hand, computer, scanner)
- Disadvantages
  - Allow guessing
  - Distractors can be difficult to create
  - Student misconceptions not revealed

23 Alternative Response
T F 1. A noun is a person, place, or thing.
T F 2. An adverb describes a noun.
1. Explain the judgments to be made
2. Ensure answer choices match
3. Explain how to answer
4. Present only one idea to be judged
5. Use positive wording
6. Avoid trickiness, clues, and qualifiers

24 Matching Item
Column A | Column B
__ Person, place, or thing. | a. Adjective
__ Describes a person, place, or thing. | b. Noun
Terminology: premises and responses
1. Clear instructions
2. Homogeneous premises
3. Homogeneous responses (brief and ordered)
4. Avoid one-to-one matching

25 Keyed Response
Responses:
a. A noun
b. A pronoun
c. An adjective
d. An adverb
__ Person, place, or thing.
__ Describes a person, place, or thing.
Like matching items, but with more response options

26 MC Item Format
What part of speech is used to name a person, place, or thing?
A) A noun*
B) A pronoun
C) An adjective
D) An adverb

27 MC Item Terminology
- Stem: Sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response.
- Options: The possible responses, consisting of one and only one correct answer plus distractors.
- Key: The correct response.
- Distractor: A wrong response; plausible but not correct, and attractive to an under-prepared student.

28 Competency
- Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students.
- Assessing lower-division students on graduate-level material is an 'unfair' expectation.
- The competent student should do well on an assessment; items should not be written for only the top students in the class.

29 Clarity
- Clear, precise items and instructions
- Correct grammar, punctuation, spelling
- Address one single issue
- Avoid extraneous material (teaching)
- One correct or clearly best answer
- Legible copies of the exam

30 Bias
Tests should be free from bias…
- No stereotyping
- No gender bias
- No racial bias
- No cultural bias
- No religious bias
- No political bias

31 Level of Difficulty
Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).

32 Level of Difficulty
- To make a multiple-choice item more difficult, make the stem more specific or narrow and the options more similar.
- To make a multiple-choice item less difficult, make the stem more general and the options more varied.
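Once a test has been administered, the difficulty actually achieved can be checked with the classical item difficulty index p, the proportion of examinees who answered the item correctly; values near 0.5 correspond to the middle difficulty described above. A minimal sketch with invented response data:

```python
def difficulty_index(item_scores):
    """Classical item difficulty p: the proportion of examinees answering
    correctly. item_scores is a list of 0/1 values, one per examinee."""
    return sum(item_scores) / len(item_scores)

# Invented data: ten examinees' scores (1 = correct) on two items.
easy_item = [1, 1, 1, 1, 1, 1, 1, 1, 0, 1]
hard_item = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]

print(difficulty_index(easy_item))  # 0.9
print(difficulty_index(hard_item))  # 0.3
```

Note that p runs opposite to intuition: a high p means an easy item. An item with p near 0.9 or 0.3, like the two above, discriminates less well between students than one near the middle of the range.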

33 Trivial and Trick Questions
- Avoid trivia and tricks.
- Avoid humorous or ludicrous responses.
- Items should be straightforward: they should cleanly delineate those who know the material from those who do not.
- Make sure every item has value and contributes to the final score.

34 Test-Taking Guidelines
When you don't know the answer
- As with all exams, attempt the questions that are easiest for you first, and come back to the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank; make a calculated guess if you are sure you don't know the answer. Here are some tips to help you guess 'intelligently'.
Use a process of elimination
- Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are the options in the right range? Is the measurement unit correct? Does it sound reasonable?

35 Test-Taking Guidelines
Look for grammatical inconsistencies
- In extension-type questions, a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence.
- Also look for repetition of key words from the question in the responses; if words are repeated, the option is worth considering. e.g.:
  - The apparent distance hypothesis explains…
  - b) The distance between the two parallel lines appears…

36 Test-Taking Guidelines
Be wary of options containing definitive words and generalizations
- Because they cannot tolerate exceptions, options containing words like 'always', 'only', 'never', and 'must' tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often.
Favor look-alike options
- If two of the alternatives are similar, give them your consideration. e.g.:
  A. tourism consultants
  B. tourists
  C. tourism promoters
  D. fairy penguins

37 Test-Taking Guidelines
Favor numbers in the mid-range
- If you have no idea what the real answer is, avoid extremes.
Favor more inclusive options
- If in doubt, select the option that encompasses the others. e.g.:
  A. an adaptive system
  B. a closed system
  C. an open system
  D. a controlled and responsive system
  E. an open and adaptive system
Please note: none of these strategies is foolproof, and they do not apply equally to the different types of multiple-choice questions, but they are worth considering when you would otherwise leave a blank.

38 Test-wise Students
- Are familiar with item formats
- Use informed and educated guessing
- Avoid common mistakes
- Have testing experience
- Use time effectively
- Apply various strategies to solve different problem types

39 Test-wise Students
To keep test-wiseness from paying off:
- Vary your keys: avoid patterns such as 'always pick option C.'
- Avoid 'all of the above' and 'none of the above.'
- Avoid extraneous information: it may assist in answering another item.
- Avoid item 'bad pairs' or 'enemies.'
- Avoid clueing with the same word in the stem and the key.
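The first guideline, varying the keys, can be enforced mechanically by shuffling each item's options and recording where the key lands. A minimal sketch; the item structure and function name are illustrative, not part of any particular testing system:

```python
import random

def shuffle_options(stem, options, key_index, rng=random):
    """Shuffle an item's options and return the new key position."""
    order = list(range(len(options)))
    rng.shuffle(order)                      # random permutation of positions
    shuffled = [options[i] for i in order]
    new_key = order.index(key_index)        # where the original key ended up
    return stem, shuffled, new_key

rng = random.Random(0)  # fixed seed so the shuffle is reproducible
stem = "What part of speech is used to name a person, place, or thing?"
options = ["A noun", "A pronoun", "An adjective", "An adverb"]
stem, shuffled, new_key = shuffle_options(stem, options, key_index=0, rng=rng)

# Whatever position the shuffle produces, the key text travels with it.
print(shuffled[new_key])  # A noun
```

Shuffling per form (or per item) removes positional patterns without the item writer having to track key balance by hand; the only bookkeeping is carrying the new key position into the answer key.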

40 Test-wise Students
- Make options similar in length, grammar, and sentence structure; an option that looks different stands out. Avoid 'clues.'

41 Item Format Considerations
- Put the needed information in the stem
- Avoid negatively stated stems and qualifiers
- Highlight qualifiers if they are used
- Avoid irrelevant symbols ('&') and jargon
- Use a standard, set number of options (preferably four)
- Ideally, tie each item to a reference (and rationale)

42 Test Directions
Highlight directions:
1. State the skill measured.
2. Describe any resource materials required.
3. Describe how students are to respond.
4. Describe any special conditions.
5. State time limits, if any.

43 Ensure Test Validity
- Congruence between items and course objectives
- Congruence between items and student characteristics
- Clarity of items
- Accuracy of the measures
- Item formatting criteria
- Feasibility (time, resources)

44 Questions

45 College of Nursing January 2011 Best Practices for Writing Objective Test Items

