1 Practical Issues in Computerized Testing: A State Perspective
Patricia Reiss, Ph.D., Hawaii Department of Education

2 Agenda
- Contextual Information on CAT Implementation in Hawaii
- Shadow Tests
- Scoring of Incomplete Adaptive Tests
- Speededness

3 Contextual Information
- Hawaii State Assessment: online CAT since 2010
- Smarter Balanced Assessment: CAT and performance task (PT)
- Computer adaptive algorithm (a configuration sketch follows this list)
  - Blueprint alignment: content (claims, targets, CCSS), cognitive complexity, and item types
  - Maximize achievement-level classification accuracy for the total test and for reporting categories
  - Minimize overall measurement error
  - Stopping rule: test length
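As a rough illustration only, constraints like these might be handed to an item-selection engine along the following lines. Every class name, field, and value below is hypothetical, not Hawaii's or AIR's actual configuration; depth-of-knowledge bounds stand in for "cognitive complexity."

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class BlueprintCell:
    """Item-count bounds for one blueprint cell (claim/target); illustrative only."""
    claim: str
    target: Optional[str] = None
    min_items: int = 0
    max_items: int = 99

@dataclass
class CatConfig:
    """Hypothetical configuration for the adaptive algorithm described above."""
    test_length: int                                    # stopping rule: fixed test length
    blueprint: List[BlueprintCell] = field(default_factory=list)
    item_type_bounds: Dict[str, Tuple[int, int]] = field(default_factory=dict)
    dok_bounds: Dict[int, Tuple[int, int]] = field(default_factory=dict)

# Example: a made-up configuration, purely to show the shape of the data.
config = CatConfig(
    test_length=35,
    blueprint=[BlueprintCell(claim="Claim 1", min_items=16, max_items=20),
               BlueprintCell(claim="Claim 2", min_items=8, max_items=10)],
    item_type_bounds={"MC": (20, 30), "TE": (3, 8)},
    dok_bounds={2: (10, 35), 3: (5, 35)},
)
```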

4 Flow
[Flowchart of the adaptive item-selection loop: assign starting scores; select eligible items (not seen before); calculate a value for each eligible item; select the highest-valued item (or randomly from among the highest); administer the item and recalculate the score; branches handle field-test item selection, going back (capture the response change and recalculate the score), and items already administered; update constraints and repeat. A minimal code sketch follows.]
Used with permission from American Institutes for Research
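A minimal, self-contained sketch of this loop, assuming a 2PL item pool, maximum-information item selection, and a deliberately crude interim score update. The function names and the update rule are illustrative, not AIR's algorithm; field-test slots, go-back handling, and blueprint-constraint updates are omitted for brevity.

```python
import math
import random

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

def run_cat(pool, respond, theta=0.0, test_length=20, step=0.3):
    """One pass through the selection loop sketched on the slide.

    pool    : list of items, each a dict with 'a' (discrimination) and 'b' (difficulty)
    respond : callback taking an item and returning 1 (correct) or 0 (incorrect)
    """
    seen = set()
    for _ in range(test_length):                               # stopping rule: fixed length
        eligible = [i for i in range(len(pool)) if i not in seen]   # not seen before
        if not eligible:
            break
        values = {i: item_information(theta, pool[i]['a'], pool[i]['b']) for i in eligible}
        best = max(values.values())
        top = [i for i in eligible if math.isclose(values[i], best)]
        chosen = random.choice(top)                            # random pick among highest-valued items
        seen.add(chosen)
        score = respond(pool[chosen])                          # administer item
        # Crude interim score update; a real engine would use MLE/EAP here.
        p = 1.0 / (1.0 + math.exp(-pool[chosen]['a'] * (theta - pool[chosen]['b'])))
        theta += step * (score - p)                            # recalculate score
    return theta, sorted(seen)
```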

5 Practical Considerations
To allow for adaptation, items in a CAT are selected sequentially. But each of these practical problems implies constraints on item selection that can only be realized if all items are selected simultaneously. (Wim van der Linden)
The shadow-test approach resolves this tension: before each item is administered, a full-length test that meets all constraints and contains the items already given is reassembled, and the best not-yet-administered item from it is selected.

6 Practical Considerations
- Computing resources
  - Number and cost of servers
  - Network bandwidth
- Reduce the number of reassembled shadow tests by requiring either
  - fixed intervals between reassemblies, or
  - a certain minimal change in the interim ability estimate (both rules are sketched below)
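A minimal sketch of those two throttling rules, with purely illustrative function names and threshold values:

```python
def due_for_reassembly_by_interval(items_since_last_assembly, interval=5):
    """Rule 1: rebuild the shadow test only once every `interval` administered items."""
    return items_since_last_assembly >= interval

def due_for_reassembly_by_ability_change(theta, theta_at_last_assembly, min_change=0.3):
    """Rule 2: rebuild only after the interim ability estimate has moved by at
    least `min_change` since the last assembly; otherwise keep administering
    free items from the current shadow test."""
    return abs(theta - theta_at_last_assembly) >= min_change
```

Either rule skips most of the reassembly solves, which is where the savings in server load and bandwidth come from.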

8 AIR Summative Simulation Report for Smarter Balanced, March 2015

12 Scoring of Incomplete Adaptive Tests
For the Smarter Balanced online assessments:
- A student is required to provide a response to an item before he or she can go on to the next item
- However, the student is able to go back and change his or her answers

13 Scores for Incomplete Tests
- For incomplete Smarter Balanced assessments on which a student answers at least 10 CAT items, the Consortium derives a score from the student's actual responses and counts as incorrect any item to which the student did not respond. The difficulty of the student's omitted items, across the online computer adaptive test and the performance task, is estimated from the average difficulty of the items in the item pool. (A code sketch of this rule follows.)
- For incomplete tests on which a student answers fewer than 10 CAT items, the Consortium assigns the lowest possible scale score for the grade and content area.
- Sub-scores are not reported for incomplete tests.
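A hedged sketch of that rule. The data shapes, the `score_fn` callback, and the helper names are assumptions made for illustration, not the Consortium's actual scoring engine.

```python
MIN_CAT_ITEMS_FOR_SCORE = 10

def score_incomplete_test(cat_responses, pt_responses, pool_avg_difficulty,
                          lowest_scale_score, score_fn):
    """Apply the incomplete-test rule described above.

    cat_responses / pt_responses : lists of (item_difficulty, response) pairs,
        where response is 1 (correct), 0 (incorrect), or None (omitted).
    score_fn : hypothetical callback that turns scored (difficulty, response)
        pairs into a scale score, e.g. an IRT pattern-scoring routine.
    """
    answered_cat = sum(1 for _, r in cat_responses if r is not None)
    if answered_cat < MIN_CAT_ITEMS_FOR_SCORE:
        return lowest_scale_score                    # fewer than 10 CAT items answered

    scored = []
    for difficulty, response in cat_responses + pt_responses:
        if response is None:
            # Omitted item: counted as incorrect, difficulty set to the pool average.
            scored.append((pool_avg_difficulty, 0))
        else:
            scored.append((difficulty, response))
    return score_fn(scored)
```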

14 Differential Speededness
"Test items vary considerably in the amount of time they require." Examinees, likewise, vary in how much time they need to answer test items.

15 Actual Testing Times

16 Implementation Issues Related to "Time"
- Number of computers
- Bandwidth prioritization and management
  - Highest priority given to assessments
  - Gaming, audio, and video sites were blocked
  - Start times of tests staggered
School-level feedback:
  - Impact on duty-free time for teachers
  - Bathroom breaks for teachers
  - Move students who require more time, or keep all students in the same room
  - Conflict with lunch schedules
  - Loss of instructional time, extra-curricular activities, assemblies, field trips, etc.
