Presentation on theme: "South Carolina Alternate Assessment (SC-Alt) Advisory Committee, September 28, 2011"— Presentation transcript:

1 South Carolina Alternate Assessment (SC-Alt) Advisory Committee September 28, 2011

2 South Carolina Alternate Assessment (SC-Alt) Overview

3 Advisory Committee Role and Purpose

4 American Institutes for Research (AIR) SC-Alt Contractor

5 AIR Staff: DeeAnn Wagner, Project Director; Jennifer Chou, Project Manager; Lynnett Wright, Alternate Assessment Specialist; Emma Hannon, Research Assistant

6 2011 Administration

7 Discussion of New Procedures: Print Manipulatives; Packaging of Manipulatives and Test Booklets; Answer Folders and Security Affidavits

8 Biology Operational Administration 2011

9 Scoring Fidelity

10 Videotaping of SC-Alt Administrations: Implemented to monitor test administration effectiveness and scoring consistency. Annual implementation allows monitoring of consistency across testing years as adjustments are made in training, as new tasks/items replace previous tasks, and as new content areas are added (i.e., high school biology).

11 Videotaping Sampling Procedures for 2011: One student per sampled teacher was videotaped (ELA only for elementary and middle school forms; ELA and biology for high school forms).

12 Videotaping Sampling Procedures: All districts were sampled. Sampling was implemented by teacher and student. Teachers were sampled according to the proportion of students in their district. Approximately one third of teachers and 10% of students were sampled.
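
The exact AIR sampling specifications are not given in the slides; the sketch below is only a rough illustration of how a teacher sample proportional to each district's share of students, with one student drawn per sampled teacher, might be assembled. The roster field names and the one-third target are assumptions.

```python
# Rough illustration only: allocate a teacher sample across districts in
# proportion to each district's share of SC-Alt students, then draw one
# student per sampled teacher. Field names ('district', 'teacher',
# 'student') and the 1/3 target are assumptions, not AIR's procedure.
import random
from collections import defaultdict

def sample_for_videotaping(roster, teacher_fraction=1/3, seed=2011):
    rng = random.Random(seed)

    teachers_by_district = defaultdict(set)
    students_by_teacher = defaultdict(list)
    students_by_district = defaultdict(int)
    for row in roster:
        teachers_by_district[row["district"]].add(row["teacher"])
        students_by_teacher[row["teacher"]].append(row["student"])
        students_by_district[row["district"]] += 1

    total_students = sum(students_by_district.values())
    total_teachers = sum(len(t) for t in teachers_by_district.values())
    target_teachers = round(total_teachers * teacher_fraction)

    sampled = []
    for district, teachers in teachers_by_district.items():
        # Every district is sampled; each district's share of the teacher
        # sample follows its proportion of SC-Alt students.
        share = students_by_district[district] / total_students
        n = max(1, min(len(teachers), round(target_teachers * share)))
        for teacher in rng.sample(sorted(teachers), n):
            # One student per sampled teacher is videotaped.
            sampled.append((district, teacher, rng.choice(students_by_teacher[teacher])))
    return sampled
```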

13 Review and Analysis of Videotaped Administrations: All recordings were reviewed by trained AIR raters for fidelity of administration and accuracy of scoring. The teacher's score is used for reporting purposes. A 10% sample was also reviewed by the AIR alternate assessment specialist.

14 Videotaping Results: Previous results have indicated consistently high rates of scoring agreement at all three form levels (elementary, middle, and high school). For 2011, the average item agreement statistics for the ELA videotaped samples were 96.0 for the elementary form, 94.9 for the middle school form, and 95.9 for the high school form. The item agreement statistic for high school biology was 94.3. These results are consistent with those of previous years and confirm a high level of scoring consistency for the new high school biology assessment.
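
The slides report average item agreement statistics but do not define how they are computed; a common reading is the percentage of items on which the independent rater's score exactly matches the teacher's score, averaged across administrations. The sketch below assumes that definition and is illustrative only.

```python
# Illustrative only: percent of items on which two raters give the same
# score, averaged over administrations. The slides do not specify the
# exact agreement formula, so exact-match agreement is assumed here.

def item_agreement(teacher_scores, rater_scores):
    """Percent of items scored identically by teacher and independent rater."""
    if len(teacher_scores) != len(rater_scores):
        raise ValueError("score lists must cover the same items")
    matches = sum(t == r for t, r in zip(teacher_scores, rater_scores))
    return 100.0 * matches / len(teacher_scores)

def average_agreement(administrations):
    """administrations: iterable of (teacher_scores, rater_scores) pairs."""
    values = [item_agreement(t, r) for t, r in administrations]
    return sum(values) / len(values)

# Example: one disagreement on four items -> 75.0
print(item_agreement([2, 1, 0, 3], [2, 1, 1, 3]))
```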

15 Second Rater Pilot Study 2011 Administration

16 Second Rater Pilot: A second rater procedure may also be used to obtain data on scorer fidelity. A pilot of the second rater procedure was conducted for the 2011 administration. Participation in the pilot was voluntary. We are seeking feedback and suggestions from you today as we review the outcomes of the pilot study.

17 Second Rater Pilot Procedures: The DTC-Alt volunteered district participation. The second rater pilot was limited to elementary ELA administrations. The DTC-Alt could select a teacher (and the specified student) already identified for videotaping to implement the pilot second rater session; the second rater procedure took the place of videotaping for that teacher. The second rater could also serve as the test administration monitor.

18 Second Rater Pilot Procedures: The participating district was required to implement the procedure with only one teacher. The second rater scored the student's responses on a separate answer document marked "Second Rater," which was submitted to AIR separately. Second rater pilot participants (teachers, second raters, and DTCs-Alt) were asked to complete a brief questionnaire.

19 Second Rater Qualifications: Must meet the test administrator criteria (certified teacher; administrator such as a school administrator, district-level special education consultant, or other administrator; or related services personnel). Must participate in test administration training.

20 Second Rater Pilot Study Outcomes

21 Second Rater Scoring Consistency Results: The second rater (observer) scores were compared to the teacher scores to calculate scoring agreement in the same manner used for the videotape data. Since both the second rater and videotaping procedures were used for samples of elementary ELA administrations, the results of the two methods could be compared. Analyzable data were obtained for 48 second rater administrations and 70 videotaped administrations.

22 Second Rater Scoring Consistency Results: The average item agreement statistics were 96.9% for the second rater data and 96.0% for the videotape data. The comparable results for these two procedures support the effectiveness of the second rater procedure.
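
As a sketch of the comparison described above, the per-administration agreement values for the two samples can simply be averaged and compared; the lists below are placeholders, not the actual 2011 pilot data.

```python
# Placeholder data: one agreement percentage per administration
# (48 second rater and 70 videotaped administrations in the actual pilot).
from statistics import mean

second_rater_agreement = [97.5, 96.2, 97.0]
videotape_agreement = [95.8, 96.4, 95.8]

sr_mean = mean(second_rater_agreement)
vt_mean = mean(videotape_agreement)
print(f"second rater: {sr_mean:.1f}%  videotape: {vt_mean:.1f}%  "
      f"difference: {sr_mean - vt_mean:+.1f} points")
```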

23 Second Rater Pilot Questionnaire Results: The questionnaires completed by the pilot participants were used to obtain information on the experience of the teachers and observers, the staff positions of the observers, and the recommendations and preferences regarding the second rater procedure from the three groups of respondents (teachers, observers, and DTCs-Alt). Survey responses were obtained from 41 teachers, 45 observers, and 8 DTCs-Alt. Since 25 districts participated in the pilot, the response rate for DTCs-Alt was low.

24 Second Rater Pilot Questionnaire Results: Respondents reported a very strong preference for the second rater procedure over videotaping: 93% of teachers and 87% of observers preferred the second rater procedure; 5% and 9%, respectively, indicated no preference; and only 2% and 4% preferred videotaping. These results did not differ by teacher or observer experience or by observer staff position.

25 Second Rater Pilot Questionnaire Results: 75% of the DTCs-Alt (6 of the 8 respondents) reported a preference for the second rater procedure over the videotaping procedure; 25% (2 DTCs-Alt) indicated a preference for videotaping.

26 Questionnaire Results: Problems Encountered. Students using eye-gaze responses were difficult to observe and rate. A few districts reported some planning issues, e.g., determining who should be the second rater. Because this was a pilot, all materials were sent to the DTC-Alt.

27 Questionnaire Results: Reported Benefits. The second rater was able to observe some administration problems related to teacher preparation. Teachers reported the procedure to be less stressful than videotaping and less distracting to students.

28 Questionnaire Results: Suggestions to Improve the Process. Provide a test booklet to the second rater. Identify second rater teachers prior to TA training. Include documentation of mode of response on the second rater answer folder. Other suggestions addressed second rater materials, packaging, and return procedures.

29 2011 Student Participation and Performance

30 Alternate Assessment Participation 2006 – 2011*

Group          2006   2007   2008   2009   2010   2011   Increase 2010 to 2011
                                                          Number    Percent
Mod_MD          973    992    987    975    913    868      -45      -4.9%
Autism          277    406    452    519    614    700       86      14.0%
Sev_MD          265    273    274    325    321    301      -20      -6.2%
Mild_MD         194    546    640    673    723    846      123      17.0%
Orth             65     82     59     77     68     68        0       0.0%
OHI              55     78    102     90    134    134        0       0.0%
Other            71     99    137    100    169    258       89      52.7%
All Students   1900   2476   2651   2759   2942   3175      233       7.9%

*PACT-Alt/HSAP-Alt 2006; SC-Alt 2007–2011

31 Changes in Rates of Participation: The overall number of students increased by 233, which was a 7.9% increase; last year's increase was 6.6%. Autism students increased by 14.0%, compared to an 18.3% increase last year. Mild MD students increased by 17.0%, compared to a 7.4% increase last year. The percentages of increase from 2007 to 2011 have been 72% for Autism and 55% for Mild MD, compared to an increase since 2007 of 7% for all other students.
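
The percentages above follow directly from the counts in the participation table; the snippet below simply reproduces that arithmetic (change in count and percent change relative to the earlier year).

```python
# Reproduces the change arithmetic from the participation table:
# absolute change and percent change relative to the earlier year.
def change(old, new):
    return new - old, round(100.0 * (new - old) / old, 1)

print(change(2942, 3175))  # all students, 2010 to 2011 -> (233, 7.9)
print(change(614, 700))    # autism, 2010 to 2011 -> (86, 14.0)
print(change(723, 846))    # mild MD, 2010 to 2011 -> (123, 17.0)
print(change(406, 700))    # autism, 2007 to 2011 -> (294, 72.4)
print(change(546, 846))    # mild MD, 2007 to 2011 -> (300, 54.9)
```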

32 SC-Alt 2011 Participation by Primary Disability

Group                  Number   Percent
Moderate MD               868     27.3%
Autism                    700     22.1%
Severe MD                 301      9.5%
Mild MD                   846     26.7%
OHI                       134      4.2%
Developmental Delay        73      2.3%
Orthopedic                 68      2.1%
Other                     185      5.8%
All Students             3175

33 ELA Performance for All Students 2007 - 2011

34 Math Performance for All Students 2007 - 2011

35 Science Performance for All Students 2007 - 2011

36 Social Studies Performance for All Students 2008 - 2011

37 Increases in SC-Alt Participation: A Continuing Concern. The SC-Alt was designed for students with the "most significant cognitive disabilities." It was not intended for higher-level autism students or even higher-level moderate MD students. Only a very small number of students classified as "mild MD" would be expected to be included. There is evidence that some districts and schools are gaming the accountability system by identifying SC-Alt students inappropriately.

38 2011 SC-Alt Students with Previous PASS Scores: A search for previous PASS scores was run against the 2009 and 2010 PASS data files. 2010 PASS scores were identified for 203 students; 2009 PASS scores were identified for an additional 142 students. These numbers approximate the student increases from the 2009 to 2010 and the 2010 to 2011 administrations.

39 PASS Test Grade Prior to SC-Alt Placement

Last PASS Test Grade   Number of Students
3                                      91
4                                      46
5                                      89
6                                      42
7                                      20
8                                      43
All Grades Total                      331
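
The prior-score search described on slide 38 amounts to matching the 2011 SC-Alt roster against the 2010 PASS file first and then the 2009 file; the sketch below shows that kind of lookup. The file layout and the 'student_id' and 'grade' field names are hypothetical.

```python
# Hypothetical sketch of the prior-PASS-score search: check the 2010 PASS
# file first, then the 2009 file for students not found in 2010.
# File layout and the 'student_id'/'grade' field names are assumptions.
import csv

def load_pass_records(path):
    with open(path, newline="") as f:
        return {row["student_id"]: row for row in csv.DictReader(f)}

def find_prior_pass_scores(scalt_ids, pass_2010_path, pass_2009_path):
    pass_2010 = load_pass_records(pass_2010_path)
    pass_2009 = load_pass_records(pass_2009_path)

    matched = {}
    for sid in scalt_ids:
        if sid in pass_2010:
            matched[sid] = ("2010", pass_2010[sid]["grade"])
        elif sid in pass_2009:
            matched[sid] = ("2009", pass_2009[sid]["grade"])
    # Tallying matched values by last PASS grade yields a table like the one above.
    return matched
```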

40 Score Reports: Supporting Student Learning; Parent Suggestions

41 2012 Administration

42 Discussion and Feedback: District Level Training; Scoring Worksheets; Other

43 Office of Exceptional Children Update

44 Common Core State Standards Assessment Consortia

45 Peer Review

46 AA-MAS Update on Projects

47 http://www.ed.sc.gov Suzanne Swaffield, sswaffie@ed.sc.gov, 803-734-8274; Douglas Alexander, dgalexan@ed.sc.gov, 803-734-3923

