March 11, 2013, Chicago, IL
American Board of Preventive Medicine
Clinical Informatics Examination Committee
Measurement Research Associates, a division of Measurement Incorporated

Goals and Important Measurement Considerations

Considerations
– Validity
– Reliability
– Objectivity

Validity   Does the test measure what it is supposed to measure with regard to: Construct Content

Validity
– Exam content guidelines
– Distribution of items ensures the test is representative of the field.
– A sufficient number of items covers all relevant areas of practice appropriately.
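The distribution requirement can be made concrete as a blueprint check: given target item counts per content domain, compare the pool against the targets and report shortfalls. This is an illustrative sketch only; the domain names, item format, and targets below are hypothetical, not ABPM's actual blueprint.

```python
from collections import Counter

def check_blueprint(items, blueprint):
    """Compare the domain distribution of an item pool against target
    counts; return a dict of domain -> number of items still needed."""
    counts = Counter(item["domain"] for item in items)
    return {domain: target - counts.get(domain, 0)
            for domain, target in blueprint.items()
            if counts.get(domain, 0) < target}

# Hypothetical blueprint: 3 items on data standards, 2 on decision support.
blueprint = {"data standards": 3, "decision support": 2}
items = [
    {"id": 1, "domain": "data standards"},
    {"id": 2, "domain": "data standards"},
    {"id": 3, "domain": "decision support"},
]
print(check_blueprint(items, blueprint))
# shortfall: one more item needed in each domain
```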

Reliability
– Confidence in test scores
– Measurement error: “uncontrolled noise”
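The slide does not name a statistic, but "confidence in test scores" is commonly quantified with an internal-consistency coefficient such as Cronbach's alpha (equivalent to KR-20 for right/wrong items). A minimal sketch with made-up response data:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha for a score matrix: one row per examinee,
    one column per item (1 = correct, 0 = incorrect)."""
    k = len(responses[0])  # number of items

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var([row[i] for row in responses]) for i in range(k))
    total_var = var([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Four examinees, three items (toy data).
scores = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 0],
]
print(cronbach_alpha(scores))  # 0.75 for this toy matrix
```

Higher alpha means item scores vary together rather than as uncorrelated noise, i.e., less measurement error in the total score.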

Objectivity
Criterion-referenced testing:
– Ascertains an individual’s ability level relative to performance on the content domains as represented by the test items
– The measurement of a candidate is independent of other test-takers

Criterion Standard
– The collective expectation of the level of knowledge and skills needed to practice safely and effectively in the field
– Determined by experts
– A criterion-referenced standard allows any candidate who achieves the standard to pass:
  – All candidates can pass
  – All candidates can fail
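The contrast with norm-referenced grading fits in a few lines: with a fixed criterion cut score (the value below is hypothetical), each candidate's result depends only on their own score, so an entire cohort can pass or fail.

```python
def criterion_referenced(scores, cut_score):
    """Pass/fail each candidate against a fixed standard,
    independently of how other candidates performed."""
    return ["pass" if s >= cut_score else "fail" for s in scores]

CUT = 70  # hypothetical cut score set by expert judgment

print(criterion_referenced([82, 75, 71], CUT))  # all candidates can pass
print(criterion_referenced([65, 52, 40], CUT))  # all candidates can fail
```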

Developing and Reviewing Multiple-Choice Items

Clinical Informatics Item Bank
Goal: Compile and maintain a pool of exam items appropriate for measuring the knowledge and skills necessary for safe and effective performance in the field of practice.

Taxonomy
Taxonomy refers to the level of cognitive skill required to answer the item correctly:
– Recall
– Interpretive
– Problem solving

Parts to an Item
– The stem
– The responses:
  – Correct response
  – Distractors

The Stem
Format options:
– Ask a question: “Which of the following microscopic subtypes of ameloblastoma is most common?”
– Give an incomplete statement: “The most common microscopic subtype of ameloblastoma is:”
– Give a scenario with a question or an incomplete statement: “A 25-year-old man is brought to the emergency room. He was found lying unconscious on the sidewalk. After ascertaining that the airway is open, the next step in management should be:”

Item Options
Options are all the possible answers for a stem:
– One correct (best) answer
– Three distractors
The best answer is agreed upon by experts. The distractors are logical misconceptions of the best answer.

Developing Items
– Items should have one best answer; avoid items based on opinion or for which there is no accepted answer.
– Items must focus on a single issue, fact, or problem.

Developing Items   Items should test important and pertinent material while avoiding trivial facts.   Attempt to write interpretation and problem solving items.   Items should be developed utilizing good grammar, punctuation, and spelling.

Stem Construction
– Avoid overly specific knowledge, excess information, and teaching in the stem.
– Include the central idea and most of the text in the stem.
– The stem should be stated positively; avoid negative phrasing.

Stem Construction
– Use terminology common to practice and avoid verbatim textbook phrasing.
– Avoid personal pronouns (e.g., “you”).

Response Construction
– Avoid “all of the above” and “none of the above.”
– Avoid absolutes such as “always” and “never.”
– Responses should be:
  – organized in a logical order
  – independent and not overlapping
  – fairly consistent in length
  – homogeneous or parallel in content
  – plausible
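Several of these response-construction rules are mechanically checkable before items ever reach expert review. The lint function below is my own illustration (the rule set and item format are hypothetical, not an ABPM tool); it flags the catch-all options, absolute wording, and length imbalance named on the slide:

```python
def lint_item(stem, options):
    """Flag response-construction problems: banned catch-all options,
    absolute words, and any option far longer than its siblings."""
    problems = []
    banned = {"all of the above", "none of the above"}
    absolutes = {"always", "never"}
    for opt in options:
        low = opt.lower()
        if low in banned:
            problems.append(f"banned option: {opt!r}")
        if any(word in low.split() for word in absolutes):
            problems.append(f"absolute wording in: {opt!r}")
    avg_len = sum(len(o) for o in options) / len(options)
    for opt in options:
        if len(opt) > 2 * avg_len:
            problems.append(f"option much longer than the others: {opt!r}")
    return problems

issues = lint_item(
    "The most common cause of X is:",
    ["Option A", "Option B is always correct", "All of the above", "Option D"],
)
print(issues)  # flags the "always" option and "All of the above"
```

Checks like these catch only surface violations; plausibility, homogeneity, and single best answer still require expert judgment.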

Questions or Comments