Copyright © 2012 by Educational Testing Service. Computer Based Achievement Assessment of Young Students (Grades 1, 2 and 3) Christine M. Mills ETS, Princeton,


1 Computer Based Achievement Assessment of Young Students (Grades 1, 2 and 3)
Christine M. Mills, ETS, Princeton, NJ
Paper presented at the National Conference on Student Assessment, June 19-22, 2013, National Harbor, Maryland.
Unpublished Work Copyright © 2012 by Educational Testing Service. All Rights Reserved. These materials are an unpublished, proprietary work of ETS. Any limited distribution shall not constitute publication. This work may not be reproduced or distributed to third parties without ETS's prior written consent. Submit all requests through www.ets.org/legal/index.html. Educational Testing Service, ETS, the ETS logo, and Listening. Learning. Leading. are registered trademarks of Educational Testing Service (ETS).

2 What problem did we try to solve?
The client wanted to transition a battery of diagnostic assessments measuring student achievement in English language arts and mathematics to a computer delivery system. These studies were completed within the scope of that transition to inform the design decisions for the tests assessing students in Grades 1, 2, and 3.

3 Three Small Studies
– General computer usability study: informing design and test presentation decisions
– Modality comparability study: are average scores comparable on the computer-based test (CBT) and the paper-based test (PBT)?
– Audio delivery usability study: informing decisions regarding audio-delivered tests (those typically read aloud by teachers)

4 Computer Usability Study
Inform design and presentation decisions and provide evidence that young students could manage completing the CBT assessment.
– 15 students (4 in Grade 1, 8 in Grade 2, and 3 in Grade 3) participated in individual interviews.
– Researchers followed a scripted protocol to walk students through the exercises.
– Students recorded their responses using the mouse or keyboard and were interviewed about their experience.

5 Computer Usability Study
Results:
– Most students thought both the CBT and PBT were easy to use, and most preferred taking the test on a computer.
– General hardware requirements should replicate what students use daily.
– Practice makes perfect.
– A different font was required for the CBT.
– Students came with different assumptions about practice items.

6 Modality Comparability Study
Are average scores for schools comparable on the CBT and PBT?
– Schools were recruited on a voluntary basis and received $20 per student for participation.
– Schools were provided guidance and asked to randomly assign students within a classroom to either a PBT group or a CBT group, each taking a subset of the assessment battery.
– A t-test was performed to compare the CBT and PBT group means.
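The group-mean comparison described above can be sketched as an independent two-sample t-test with pooled variance. The score lists below are purely hypothetical placeholders for illustration; the study's actual data are not published in these slides.

```python
# Sketch of an independent two-sample t-test (pooled variance), the kind of
# comparison the study ran on CBT vs. PBT group means. All scores here are
# hypothetical, illustrative data only.
import math
import statistics

def two_sample_t(a, b):
    """Student's t statistic for two independent samples, pooled variance."""
    na, nb = len(a), len(b)
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variances
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    standard_error = math.sqrt(pooled * (1 / na + 1 / nb))
    return (mean_a - mean_b) / standard_error

cbt_scores = [12, 14, 15, 13, 16, 14]  # hypothetical CBT group
pbt_scores = [11, 13, 14, 12, 13, 12]  # hypothetical PBT group
print(round(two_sample_t(cbt_scores, pbt_scores), 2))  # t statistic
```

In practice one would compare the resulting t statistic against a t distribution with n_a + n_b - 2 degrees of freedom (or use a library routine such as SciPy's ttest_ind) to obtain a p-value.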

7 Modality Comparability Study
Results:
– Across the three grades, mean scores tended to be slightly higher for the CBT group.
– More variability was observed in the scores for the PBT group, with the exception of Grade 1, where we observed the opposite result.
– The only statistically significant differences in mean scores were for Grade 1 Auditory Comprehension and Grade 3 Mathematics; in both cases the CBT group had a slightly higher mean score.
– Effect sizes ranged from 0.00 to 0.41, with the largest values for Grade 1 Auditory Comprehension (0.41) and Grade 3 Mathematics (0.25); all others were between 0.00 and 0.22.
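The slides report effect sizes but do not name the measure; a standardized mean difference such as Cohen's d (mean difference divided by the pooled standard deviation) is a common choice and is sketched below on hypothetical data. This is an assumption about the metric, not a statement of the study's actual method.

```python
# Cohen's d: standardized mean difference between two independent groups.
# The measure and the score lists are assumptions for illustration only;
# the slides do not specify which effect-size statistic was used.
import math
import statistics

def cohens_d(a, b):
    """Mean difference divided by the pooled sample standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.fmean(a) - statistics.fmean(b)) / math.sqrt(pooled_var)

cbt_scores = [12, 14, 15, 13, 16, 14]  # hypothetical CBT group
pbt_scores = [11, 13, 14, 12, 13, 12]  # hypothetical PBT group
print(round(cohens_d(cbt_scores, pbt_scores), 2))
```

By the usual rough benchmarks, values near 0.2 are considered small and near 0.5 medium, which puts the study's reported 0.00-0.41 range in small-to-moderate territory.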

8 Audio Delivery Usability Study
Inform design decisions for audio delivery.
– Students were presented with item instructions, stimulus, and responses (where applicable) and asked to adjust the audio settings and/or determine how to have the audio stimulus repeated.
– One kindergartner and four Grade 1 students responded to the directions, sample items, 12 live items assessing Word Analysis, and 5 mathematics items.
– Six Grade 2 students were asked to answer sample questions and 6 live items assessing writing mechanics.

9 Audio Delivery Usability Study
– The students followed the audio-delivered instructions with ease.
– They were able to answer the sample questions and in most cases understood how to follow the directions to answer the questions.
– When students had trouble answering questions, it did not appear to be due to difficulty understanding the speech of the narrators who had recorded the stimulus.
– Students were able to adjust their headsets with ease. Only one student had trouble understanding the tutorial's description of how to adjust the volume; however, that student had no trouble during the test.
– Ten minutes was allotted for each student to complete the tutorial, which covered adjusting the narrator's volume, adjusting the headset, and replaying the audio presentation of an item; this proved to be plenty of time.
– More time was needed after practice questions for students to respond to the item. However, most students used "Replay" spontaneously, or with veiled coaching such as "Pretend you wanted to listen to the lady again," when they wanted to hear the item text again.

10 Conclusions
– With deliberate, well-thought-out development decisions, young students can take computer-based assessments.
– With appropriate item and stimulus presentation, the scores from the PBT and CBT versions are comparable for this low-stakes test battery.
– Continued investigation is warranted as more schools transition to CBT.
– Future work should consider the emerging literature on teacher preparation for implementing technology use in classrooms and how this affects what students are able to do.

11 Limitations
– The study included only traditional selected-response item types.
– Sample sizes were small, so caution should be used; more study is warranted as data become available to substantiate findings and evaluate subpopulations.
– There was no comparison of audio usability between the CBT and PBT; such a comparison would provide useful information.

