
1 Michigan Assessment Consortium Common Assessment Development Series Putting Together The Test Blueprint

2 Developed and narrated by Bruce R. Fay, PhD, Assessment Consultant, Wayne RESA

3 Support
The Michigan Assessment Consortium professional development series in common assessment development is funded in part by the Michigan Association of Intermediate School Administrators in cooperation with MDE, MSU, Ingham & Ionia ISDs, Oakland Schools, and Wayne RESA.

4 What You Will Learn
- Test blueprints…what they are and why you need them
- The components of a test blueprint
- Criteria for a good test blueprint
- Test blueprint examples

5 If you don't know where you're going, any road will take you there. George Harrison (1943 - 2001) "Any Road", Brainwashed, 2002

6 Assessment with a Purpose
Educational assessment is not something incidental to teaching and learning. It is an equal partner with curriculum and instruction. It is the critical 3rd leg through which both students and teachers receive feedback about the effectiveness of the teaching and learning process in achieving desired learning outcomes. Assessment closes the loop.

7 Closed-Loop (Feedback) Systems
- Home Heating System (Teaching & Learning)
- Desired Temperature (Learning Target)
- Actual Temperature (Test Results)

8 C-I-A Alignment
Requires thoughtful alignment, ensuring that the items on a test fairly represent the…
- Intended learning targets (intended curriculum)
- Actual learning targets (taught curriculum)
Test what you teach; teach what you test.

9 Target-Level Alignment
- Relative importance of those targets
- Level of cognitive complexity associated with those targets

10 Useful feedback requires tests that are…
- Reliable (consistent; actually measure something)
- Fair (free from bias or distortions)
- Valid (contextually meaningful or interpretable; can reasonably support the decisions we make based on them)

11 Test Blueprints: The Big Idea
A simple but essential tool, used to:
- Design tests that can meet the preceding requirements
- Define the acceptable evidence to infer mastery of the targets
- Build in evidence for validity

12 The Test Blueprint (or Table of Test Specifications)
Explicitly maps test items to:
- Learning Targets
- Levels of Complexity
- Relative Importance
Provides a common definition of the test.

13 Learning Targets & Standards Frameworks
Standards are organized as structured hierarchical frameworks. Michigan's is:
- Strands
- Standards
- Domains
- Content Expectations
Detailed curriculum is usually left to local districts or classroom teachers.

14 A Simple Taxonomy of Cognitive Complexity
Norm Webb's Depth of Knowledge (1997), highest to lowest:
- Extended Thinking
- Strategic Thinking
- Skill / concept use / application
- Recall

15 Putting it all together… A Basic Test Blueprint
- Table (matrix) format (spreadsheet)
- Rows = learning targets (one for each)
- Columns = Depth of Knowledge levels
- Cells = number of items and points possible
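To make the row/column/cell structure concrete, here is a minimal sketch in Python. The language, the dict layout, and all names are illustrative assumptions, not part of the original series; any spreadsheet serves equally well.

```python
# A blueprint as a matrix: rows are learning targets, columns are
# Depth of Knowledge levels, and each cell holds the pair
# (number of items, points possible).
DOK_LEVELS = ["Recall", "Use", "Strategic", "Extended"]

# blueprint[target][dok_level] = (item_count, points_possible)
blueprint = {
    "Target 1": {"Recall": (3, 3), "Use": (2, 2)},
    "Target 2": {"Recall": (1, 1), "Use": (2, 2), "Strategic": (2, 4)},
}
```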

16 Summary Information
Number of items and points possible:
- Row Margins = for that target
- Column Margins = for that level of complexity
- Lower Right Corner = for the test
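A sketch of how those margins fall out of the matrix, continuing the hypothetical Python representation above (the function name `margins` is an invention for illustration):

```python
def margins(blueprint, dok_levels=DOK_LEVELS):
    """Row margins (per target), column margins (per DOK level),
    and the grand total for the test, each as (items, points)."""
    rows = {}
    cols = {lvl: (0, 0) for lvl in dok_levels}
    grand = (0, 0)
    for target, cells in blueprint.items():
        row = (0, 0)
        for lvl, (n, pts) in cells.items():
            row = (row[0] + n, row[1] + pts)
            cols[lvl] = (cols[lvl][0] + n, cols[lvl][1] + pts)
            grand = (grand[0] + n, grand[1] + pts)
        rows[target] = row
    return rows, cols, grand
```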

17 Example 1 – Basic Blueprint for a test with 5 learning targets

| Learning Targets | Recall # (pts) | Use # (pts) | Strategic # (pts) | Extended # (pts) | Target Totals # (pts) |
|------------------|----------------|-------------|-------------------|------------------|-----------------------|
| Target 1 | 3 (3) | 2 (2) | | | 5 (5) |
| Target 2 | 1 (1) | 2 (2) | 2 (4) | | 5 (7) |
| Target 3 | 2 (2) | 1 (3) | | | 3 (5) |
| Target 4 | | 3 (3) | 1 (2) | | 4 (5) |
| Target 5 | | 2 (4) | 1 (4) | | 3 (8) |
| Level Totals # (pts) | 6 (6) | 10 (14) | 4 (10) | | 20 (30) |
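Encoding Example 1 in the sketch above and running `margins` reproduces the printed totals. The cell placements here follow the arithmetic of the row and column totals in the table, so treat the encoding as an inference, not the author's original spreadsheet:

```python
example_1 = {
    "Target 1": {"Recall": (3, 3), "Use": (2, 2)},
    "Target 2": {"Recall": (1, 1), "Use": (2, 2), "Strategic": (2, 4)},
    "Target 3": {"Recall": (2, 2), "Use": (1, 3)},
    "Target 4": {"Use": (3, 3), "Strategic": (1, 2)},
    "Target 5": {"Use": (2, 4), "Strategic": (1, 4)},
}

rows, cols, grand = margins(example_1)
print(rows["Target 2"])   # (5, 7)  -> 5 items, 7 points for that target
print(cols["Strategic"])  # (4, 10) -> 4 items, 10 points at that level
print(grand)              # (20, 30) -> the lower-right corner of the table
```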

18 Is this reasonable? Rule of Thumb Criteria…
- At least 3 items per target for reliability
- Appropriate:
  - Distribution of items over targets
  - Levels of complexity for targets/instruction
  - Distribution of items over levels of complexity (all items are NOT at the lowest or highest level)
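These rules of thumb are mechanical enough to check automatically. A sketch under the same Python assumptions as above; the threshold default and the `check_blueprint` name are illustrative choices, not prescriptions from the series:

```python
def check_blueprint(blueprint, min_items_per_target=3):
    """Return a list of warnings for rule-of-thumb violations."""
    warnings = []
    rows, cols, _ = margins(blueprint)
    # Reliability rule of thumb: at least 3 items per target.
    for target, (n_items, _) in rows.items():
        if n_items < min_items_per_target:
            warnings.append(
                f"{target} has only {n_items} item(s); "
                f"at least {min_items_per_target} suggested for reliability"
            )
    # Complexity rule of thumb: items should not all sit at one level.
    levels_used = [lvl for lvl, (n, _) in cols.items() if n > 0]
    if len(levels_used) == 1:
        warnings.append(f"every item sits at one level: {levels_used[0]}")
    return warnings

print(check_blueprint(example_1))  # [] -> Example 1 passes both checks
```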

19 Professional Judgment
Like all things in education, the development of assessments and the use of the results depend on professional judgment, which can be improved through:
- Experience
- Collaboration
- Reflection on methods & results

20 Limitations…
- Shows total points for each target/level combination, but not how those points apply to each item
- Doesn't show item types
- Doesn't indicate if partial-credit scoring can/will be used (but may be implied)
But…it was easy to construct, is still a useful blueprint, and is much better than not making one!

21 Add details on item type and format to ensure…
- Appropriate match to learning targets and associated levels of complexity
- Balanced use within tests and across tests over time
- Specification of test resources, e.g., calculators, dictionaries, measuring tools…
Track on the same or a separate spreadsheet.

22 Common item types include…
- Selected-response
  - Multiple-choice
  - Matching
- Constructed-response
  - Brief (fill-in-the-blank, short answer, sort a list)
  - Extended (outline, essay, etc.)
- Performance
  - Project
  - Portfolio

23 Complexity vs. Utility
- Your test blueprint could get complicated if you try to account for too much in one spreadsheet.
- Make sure your test blueprint covers the basics, is not a burden to create, and is useful to you.
- The following example is slightly more complicated, but still workable.

24 Example 2 – Blueprint with Explicit Items and Item Types

| Target Code | Item # | Item type SR | Item type CR-B | Item type CR-E | DOK Recall Pts | DOK Use Pts | DOK Strategic Pts | Trgt Tots (pts) |
|-------------|--------|--------------|----------------|----------------|----------------|-------------|-------------------|-----------------|
| 1.1.1.1 | 1 | x | | | 1 | | | |
| 1.1.1.1 | 2 | x | | | 1 | | | |
| 1.1.1.1 | 3 | | x | | | 2 | | 4 |
| 1.2.3.4 | 4 | | x | | | 3 | | |
| 1.2.3.4 | 5 | | | x | | | 5 | 8 |
| etc. | | | | | | | | |
| Col Tots | 5 | 2 | 2 | 1 | 1 | 7 | 5 | 12 |
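The same information can be kept as one record per item and rolled up into the target-by-level matrix used earlier. A minimal sketch under the same Python assumptions; the field names are inventions, and the DOK placement of each point value is inferred from the flattened slide, so the data is illustrative only:

```python
# One record per item: target code, item number, item type, DOK level, points.
items = [
    {"target": "1.1.1.1", "item": 1, "type": "SR",   "dok": "Recall",    "pts": 1},
    {"target": "1.1.1.1", "item": 2, "type": "SR",   "dok": "Recall",    "pts": 1},
    {"target": "1.1.1.1", "item": 3, "type": "CR-B", "dok": "Use",       "pts": 2},
    {"target": "1.2.3.4", "item": 4, "type": "CR-B", "dok": "Use",       "pts": 3},
    {"target": "1.2.3.4", "item": 5, "type": "CR-E", "dok": "Strategic", "pts": 5},
]

# Roll the item records up into the target-by-DOK matrix,
# so the margins() sketch above applies unchanged.
rollup = {}
for it in items:
    n, pts = rollup.setdefault(it["target"], {}).get(it["dok"], (0, 0))
    rollup[it["target"]][it["dok"]] = (n + 1, pts + it["pts"])
```

Keeping item-level records like this also answers the limitations of the basic blueprint: point values per item and item types are explicit rather than implied.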

25 Beyond the Test Blueprint
- Answer key (selected-response items)
- Links to scoring guides & rubrics
- Specs for external test resources
- Item numbering for alternate test forms

26 Conclusions
- Destination & Road Map
- Alignment/balance of items/types for…
  - learning targets (curriculum/content)
  - size (complexity) of targets
  - cognitive level of targets
  - relative importance of targets
- Spec or document other aspects of the test

