DEVELOPING AND USING TESTS – Chapter 11 –
Copyright © Springer Publishing Company, LLC. All Rights Reserved.


Assessment Validity
 The current concept of validity focuses on the accuracy or appropriateness of inferences about test results and how those results are used
 Content validation – Extent to which test items accurately represent a particular content domain
 Construct validation – Extent to which it is possible to make inferences from test results to more general abilities and characteristics

Assessment Validity (cont.)
 Assessment–criterion relationship – Extent to which test results (the assessment) predict performance on another assessment (the criterion measure), either in the future (predictive) or at the same time (concurrent)
 Consideration of consequences – Intended and unintended consequences
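The assessment–criterion relationship is typically summarized as a correlation between the two sets of scores. As a minimal sketch (the student scores and the choice of licensure exam as the criterion are hypothetical examples, not from the chapter):

```python
# Sketch: criterion-related validity as a Pearson correlation between
# assessment scores and criterion scores. All data are hypothetical.
def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical predictive-validity check: course exam scores
# (assessment) vs. later licensure exam scores (criterion).
exam = [72, 85, 90, 65, 78]
licensure = [70, 88, 92, 60, 75]
print(round(pearson_r(exam, licensure), 2))  # → 0.99
```

For concurrent validity the criterion scores would simply be collected at the same time as the assessment rather than later.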

Assessment Reliability
 Consistency, stability, or reproducibility of test scores – boosts confidence in assessment results
 Different methods of determining assessment reliability:
  – Stability: consistency over time
  – Equivalence: consistency among different forms of the assessment
  – Internal consistency: consistency within the assessment itself (used commonly for teacher-made tests)
  – Interrater reliability: consistency of judgments among different raters
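For teacher-made tests scored right/wrong, internal consistency is often estimated with the Kuder–Richardson formula 20 (KR-20). A minimal sketch, with a hypothetical 5-student, 4-item score matrix:

```python
# Sketch: KR-20 internal-consistency estimate for dichotomously
# scored items. The score matrix below is a hypothetical example
# (rows = students; 1 = correct, 0 = incorrect).
def kr20(item_scores):
    n_students = len(item_scores)
    k = len(item_scores[0])  # number of items
    totals = [sum(row) for row in item_scores]
    mean_total = sum(totals) / n_students
    var_total = sum((t - mean_total) ** 2 for t in totals) / n_students
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in item_scores) / n_students  # proportion correct
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

scores = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print(round(kr20(scores), 2))  # → 0.8
```

Higher values (closer to 1.00) indicate that the items behave consistently with one another.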

Test Planning
 Purpose of the test and students to be tested
 Test length
  – Ideal length (i.e., number of test items or total possible points) depends on the test's purpose, ability level of the students, available testing time, and desired reliability of test results
 Difficulty and discrimination level
  – The ideal difficulty level correctly classifies students according to whether or not they met the criterion
  – Discrimination power is the ability of each test item to distinguish between students who have greater knowledge of the content domain and those who have less
 Item formats
 Scoring of items

Test Blueprint
 Also called a test plan or table of specifications
 Guides the teacher to write items at the appropriate level to test the desired content areas
 Elements include:
  1. List of major topics or learning outcomes that the test will assess
  2. Level of complexity of the assessment tasks
  3. Emphasis that each topic or learning outcome will have, indicated by the number or percentage of items or points
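The three elements above map naturally onto a small data structure. A sketch, where the topics, cognitive levels, and item counts are hypothetical examples rather than a blueprint from the chapter:

```python
# Sketch of a test blueprint (table of specifications):
# topic -> {cognitive level: number of items}. All entries hypothetical.
blueprint = {
    "Assessment validity": {"knowledge": 3, "application": 2},
    "Assessment reliability": {"knowledge": 2, "application": 3},
    "Item writing": {"application": 4, "analysis": 3},
}

total_items = sum(sum(levels.values()) for levels in blueprint.values())
for topic, levels in blueprint.items():
    n = sum(levels.values())
    print(f"{topic}: {n} items ({n / total_items:.0%} of test)")
print(f"Total: {total_items} items")
```

Summing the rows and columns this way makes it easy to check that the item counts match the intended emphasis before any items are written.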

Writing Test Items
 General rules that contribute to quality test items:
  1. Every item should measure something important
  2. Every item should have a correct answer
  3. Use simple, clear, concise, grammatically correct language
  4. Avoid jargon, slang, and unnecessary abbreviations
  5. Use positive wording
  6. Items should not contain irrelevant cues to correct answers
  7. No item should depend on another item for meaning
  8. Eliminate unnecessary information
  9. Request peer critique of test items
  10. Prepare more test items than the test blueprint calls for

Specific Item Formats
 True–false
  – A statement that the student must judge as true or false
  – Variations: requiring students to explain why the statement is true or false, or to correct false statements
 Matching exercises
  – A series of homogeneous items (premises) matched against the same set of responses
  – Most appropriate for assessing students' ability to classify and categorize information, such as definitions of terms
  – Write unequal numbers of premises and responses to avoid giving a clue to the final match

Specific Item Formats (cont.)
 Multiple-choice
  – Used to assess many types of desired learning outcomes at various levels of the cognitive taxonomy, especially application and analysis
  – Parts: stem, correct answer, and distractors
    - The stem is a question or incomplete sentence that must be completed by one of the alternatives
    - The alternatives or options include the correct answer and a number of distractors
    - Distractors are incorrect alternatives that appear plausible to students who are unsure of the correct answer
  – Guidelines for writing each part
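The parts named above can be represented explicitly, which helps when reviewing items for plausible distractors. A sketch with a hypothetical item (the question and options are illustrative, drawn loosely from the reliability terms earlier in this chapter):

```python
# Sketch: a multiple-choice item decomposed into the parts named on
# the slide (stem, correct answer, distractors). Item is hypothetical.
item = {
    "stem": "Which method estimates a test's internal consistency?",
    "options": {
        "A": "Test-retest correlation",   # distractor (measures stability)
        "B": "KR-20",                     # correct answer
        "C": "Interrater agreement",      # distractor (interrater reliability)
        "D": "Parallel-forms correlation" # distractor (measures equivalence)
    },
    "key": "B",
}

# Distractors are every option other than the keyed correct answer.
distractors = [k for k in item["options"] if k != item["key"]]
print(distractors)  # → ['A', 'C', 'D']
```

Each distractor here is a real reliability method, which is what makes it plausible to a student who is unsure of the answer.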

Specific Item Formats (cont.)
 Multiple-response
  – Students select one or more options as the correct or best answer
 Short-answer
  – Requires a word, phrase, or number as an answer
  – Two formats:
    - Question
    - Completion (also known as fill-in-the-blank)

Specific Item Formats (cont.)
 Essay
  – Requires a lengthier response than short-answer items
  – Scored by two methods:
    - Holistic scoring: reading the entire response and judging its overall quality
    - Analytic scoring: identifying the content to include and assigning a number of possible points for each content area
 Context-dependent item set
  – Also known as an interpretive exercise
  – A group of test items that relate to the same introductory material

Assembling the Test
 Arrange items in a logical sequence – by order of expected difficulty, according to the sequence in which the content was learned, or a combination of these
 Write clear general directions for the test
 Use a cover page
 Avoid crowding
 Group items of the same format together
 Facilitate scoring
 Number items continuously
 Proofread
 Prepare an answer key

Other Considerations
 Reproducing the test
 Preparing students to take tests
  – Students need information about the test to prepare for it
  – Test-taking skills
  – Test anxiety
 Administering the test

Test and Item Analysis
 Item difficulty (the P-value) is the proportion of students who responded correctly to an item
  – Values range from 0 to 1.00
 Discrimination index (D) indicates how effectively each test item measures what the entire test measures
  – D-values range from –1.00 to +1.00
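Both statistics are simple proportions. A sketch using the common upper/lower scoring-group method for D (the response data and group sizes are hypothetical):

```python
# Sketch: item difficulty (P) and discrimination index (D) for a
# single item. All response data below are hypothetical.
def item_difficulty(responses):
    """P = proportion answering correctly; ranges from 0 to 1.00."""
    return sum(responses) / len(responses)

def discrimination_index(upper_correct, n_upper, lower_correct, n_lower):
    """D = P(upper group) - P(lower group); ranges from -1.00 to +1.00."""
    return upper_correct / n_upper - lower_correct / n_lower

# 10 students' responses to one item: 1 = correct, 0 = incorrect
responses = [1, 1, 1, 1, 0, 1, 0, 1, 0, 0]
print(item_difficulty(responses))  # → 0.6 (a moderately difficult item)

# Of the 5 highest total scorers, 4 answered correctly; of the
# 5 lowest, 2 did. A positive D means the item favors high scorers.
print(round(discrimination_index(4, 5, 2, 5), 2))  # → 0.4
```

A D near zero (or negative) flags an item that high scorers answer no better than low scorers, a common prompt for revising or discarding the item.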

