Item Analysis: What makes a question good? Answer options?

Multiple Choice Questions: Elements of a good multiple-choice question?

Item Analysis
How can we determine whether a question is good?
- Distractor power
- Item difficulty
- Item discrimination

Distractor Power
Strength of each distractor (should be roughly equal across distractors):

distractor power = (# answering incorrectly) / (# of distractors)

Example
- There are 4 possible answers; option a is correct
- If 9 people choose option b: 9/3 = 3
- If 12 people choose option c: 12/3 = 4
- If 15 people choose option d: 15/3 = 5
- Then option d is the strongest distractor
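The distractor check above can be sketched in Python. This is a minimal illustration, not part of the original slides; the response counts and option letters are the hypothetical values from the example.

```python
# Hypothetical choice counts for a 4-option item; option "a" is correct,
# so options b, c, and d are the 3 distractors.
responses = {"a": 24, "b": 9, "c": 12, "d": 15}
correct = "a"

distractors = {k: v for k, v in responses.items() if k != correct}
n_distractors = len(distractors)  # 3

# As on the slide, divide each distractor's count by the number of distractors.
power = {k: v / n_distractors for k, v in distractors.items()}
strongest = max(power, key=power.get)

print(power)      # {'b': 3.0, 'c': 4.0, 'd': 5.0}
print(strongest)  # d
```

The strongest distractor is simply the one chosen most often; dividing by the number of distractors rescales the counts without changing that ranking.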

Item Difficulty = p (proportion of people passing)

p = (number answering correctly) / (total number answering)

- If 100 people take the test and 34 get question 1 correct, what is the difficulty of that question? p1 = 34/100 = .34
- For 65 correct, p would be 65/100 = .65
- So the higher the p, the easier the question (p must be between 0 and 1)

Item Difficulty is...
- A behavioral measure
- Based on the current group
- Allows internal comparison
- Also allows external comparison

In construct-based tests
- Item difficulty is analogous to item endorsement
- The number agreeing with an item

Item endorsement is also...
- A behavioral measure
- Based on the current group
- Allows internal comparison
- Also allows external comparison

Item Discrimination = d
Does the question discriminate between those who do well overall and those who do poorly? (Or, on a psychological test, between those scoring high and those scoring low on the construct?)

Distribution
To determine group membership:
- Upper (U) = top 25 to 33% of scores
- Lower (L) = bottom 25 to 33% of scores

Determining item discrimination

d = (U - L) / n

where
- d = item discrimination
- U = number in the upper group who get the question correct
- L = number in the lower group who get the question correct
- n = number in one group (U or L)

Examples of d
With 33 in each of U and L:
- If all in U and all in L get it correct: (33 - 33) / 33 = 0
- If all in U get it correct, and none in L: (33 - 0) / 33 = 1
- If none in U get it correct, and all in L: (0 - 33) / 33 = -1
- If 30 in U and 10 in L get it correct: (30 - 10) / 33 = .61
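The four worked examples above can be checked with a short sketch (function name is mine, not from the slides):

```python
def item_discrimination(upper_correct, lower_correct, group_size):
    """d = (U - L) / n, where n is the number of examinees in one group."""
    return (upper_correct - lower_correct) / group_size

# Groups of 33, as in the slide's examples:
print(item_discrimination(33, 33, 33))           # 0.0  (no discrimination)
print(item_discrimination(33, 0, 33))            # 1.0  (perfect discrimination)
print(item_discrimination(0, 33, 33))            # -1.0 (reversed: a bad item)
print(round(item_discrimination(30, 10, 33), 2)) # 0.61
```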

Value of d
- Range of d: -1 to 1
- A higher d indicates a more discriminating item (1 = all in the upper group got it correct and none in the lower group)
- A negative d indicates a really bad question (-1 = all in the lower group got it correct and none in the upper group)