Using Multiple Choice Tests for Assessment Purposes: Designing Multiple Choice Tests to Reflect and Foster Learning Outcomes Terri Flateby, Ph.D.



Overview of Assessment Process
– Select or develop measurable learning outcomes (course or program)
– Select or develop measures consistent with the outcomes
– Measure learning outcomes
– Analyze learning results
– Make adjustments in curriculum, instructional strategies, or activities to address weaknesses
– Re-evaluate learning outcomes

Purposes of Classroom Achievement Tests
– Measure Individual Student’s Learning
– Evaluate Class Performance
– Evaluate Test and Improve Learning
– Support Course and Program Outcomes

Why Use Multiple-Choice Tests to Measure Achievement of Learning Outcomes?
Efficient:
– More content coverage in less time
– Faster to evaluate
– Methods exist to evaluate test items
In some cases, can provide a proxy for constructed-response measures

Above All, Testing and Assessment Should Promote Learning

To Promote Learning, Tests Must Be:
Valid: tests should be an accurate indicator of the content and level of learning (content validity)
Reliable: tests should produce consistent results

Validity
Tests must measure what you want your students to know and be able to do with the content (reaching the cognitive demands of the outcomes).
Tests must be consistent with instruction and assignments, which should foster those cognitive demands.

Process of Ensuring Validity
– Table of Item Specifications, also called a Test Blueprint: useful for classroom tests and for guiding assessment
– Review item performance after administering the test

Test Blueprint Reflects the Important Content and Cognitive Demands
Content/components of outcomes | Knowledge | Comprehension | Application and above | Analysis

Bloom’s Taxonomy of Educational Objectives (use to develop tests and outcomes)
– Evaluation
– Synthesis
– Analysis
– Application
– Comprehension
– Knowledge

Develop Tests to Reflect Outcomes at Program or Course Levels
– Create a summative test
– Develop sets of items to embed in courses, indicating progress toward outcomes (formative)
– Develop course-level tests that reflect program-level objectives/outcomes

Institutional Outcome/Objective: Students will demonstrate the critical thinking skills of analysis and evaluation in the general education curriculum and in the major.
Course Outcome: Students will analyze and interpret multiple choice tests and their results.

Constructing the Test Blueprint
1. List important course content or topics and link them to outcomes.
2. Identify the cognitive levels expected in the outcomes.
3. Determine the number of items for the entire test and for each cell based on emphasis, time, and importance.
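Step 3 above, allocating a fixed number of items across blueprint cells in proportion to their weight, can be sketched programmatically. This is an illustrative sketch only; the topics and weights below are hypothetical, not from the presentation.

```python
def allocate_items(weights, total_items):
    """Distribute total_items across blueprint cells proportionally to
    their weights, handing leftover items to the cells with the largest
    fractional remainders."""
    total_weight = sum(weights.values())
    raw = {cell: total_items * w / total_weight for cell, w in weights.items()}
    counts = {cell: int(r) for cell, r in raw.items()}
    leftover = total_items - sum(counts.values())
    # Give remaining items to cells with the largest fractional remainder.
    for cell in sorted(raw, key=lambda c: raw[c] - counts[c], reverse=True)[:leftover]:
        counts[cell] += 1
    return counts

# Hypothetical blueprint: (content, cognitive level) -> relative emphasis.
blueprint = allocate_items(
    {("Validity", "Comprehension"): 3,
     ("Reliability", "Comprehension"): 2,
     ("Item analysis", "Application"): 5},
    total_items=20)
```

With these example weights, a 20-item test would devote half its items to the most heavily weighted cell.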

Base the Test Blueprint on:
– Actual Instruction
– Classroom Activities
– Assignments
– Curriculum at the Program Level

Questions
– Validity
– How to use the Test Blueprint

Reliability: Repeatable or Consistent Results
If a test is administered one day and an equivalent test is administered another day, the scores should remain similar from one administration to the other. Reliability is typically based on the correlation of the two sets of scores, yet this approach is unrealistic in the classroom setting.

Internal Consistency Approach: KR-20
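As an illustrative sketch (not part of the original slides): for dichotomously scored items, KR-20 = (k/(k-1)) * (1 - Σ p_i q_i / σ²), where k is the number of items, p_i the proportion of students answering item i correctly, q_i = 1 - p_i, and σ² the variance of total scores. In Python:

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.
    responses: rows are students, columns are items; 1 = correct."""
    n_students = len(responses)
    k = len(responses[0])  # number of items
    # Proportion answering each item correctly (p); q = 1 - p.
    p = [sum(row[i] for row in responses) / n_students for i in range(k)]
    sum_pq = sum(pi * (1 - pi) for pi in p)
    # Population variance of total scores.
    totals = [sum(row) for row in responses]
    mean = sum(totals) / n_students
    variance = sum((t - mean) ** 2 for t in totals) / n_students
    return (k / (k - 1)) * (1 - sum_pq / variance)
```

With this estimate in hand, the .70-or-higher benchmark mentioned later in the presentation can be checked directly from a classroom score matrix.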

Guidelines to Increase Reliability*
– Develop longer tests with well-constructed items.
– Make sure items are positive discriminators; students who perform well on the test overall should generally answer individual questions correctly.
– Develop items of moderate difficulty; extremely easy or difficult questions do not add to reliability estimates.
* Guide for Writing and Improving Achievement Tests

Multiple Choice Items
Refer to handout

Guidelines for Developing Effective Items: Resources
– Guide for Improving Classroom Achievement Tests, T.L. Flateby
– Assessment of Student Achievement, 2008, N.E. Gronlund; Allyn and Bacon
– Developing and Validating Multiple-Choice Test Items, 2004, Thomas Haladyna; Lawrence Erlbaum Associates
Additional articles and booklets are available at

Questions
– How to ensure Reliability and Validity

Evaluate Test Results
1. KR-20: a value of .70 or higher.
2. Item discriminators should be positive.
3. Difficulty Index (p-value).
4. Analysis of distracters.
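Points 2 and 3 can be computed directly from a 0/1 response matrix (rows = students, columns = items). This is a minimal sketch; the upper-half/lower-half discrimination index used here is one common variant.

```python
def item_difficulty(responses, item):
    """Difficulty index (p-value): proportion of students answering
    the item correctly. Higher p means an easier item."""
    return sum(row[item] for row in responses) / len(responses)

def discrimination(responses, item):
    """Upper-minus-lower discrimination index: the item's p-value among
    the top-scoring half of students minus its p-value among the
    bottom-scoring half. Positive values indicate that stronger students
    answered the item correctly more often."""
    ranked = sorted(responses, key=sum, reverse=True)
    half = len(ranked) // 2
    upper, lower = ranked[:half], ranked[-half:]
    p_upper = sum(row[item] for row in upper) / half
    p_lower = sum(row[item] for row in lower) / half
    return p_upper - p_lower
```

An item with a negative discrimination index, where weaker students outperform stronger ones, is a candidate for the "faulty item" review described on the next slide.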

Item Analysis
Refer to 8-item handout

Use Results for Assessment Purposes
– Analyze performance on each item according to the outcome evaluated.
– Determine reasons for poor testing performance:
–Faulty item
–Lack of student understanding
– Make adjustments to remedy these problems.

Questions
Contact Terri Flateby at , or http://teresaflateby.com