1 2013-2014 Test Irregularity Summary Report

2 Agenda
Overview, Purpose, and Background
Test Irregularity Summary Report

3 Overview and Purpose
This presentation is designed to provide district staff with an overview and understanding of the Test Irregularity Summary Report.
Purpose: Through this overview, district staff will be able to understand:
(a) the components and data that comprise the summary report
(b) how the summary report can be used to plan and improve

4 Background: Test Security
The Department has annually provided districts with reports of testing irregularities and voids during testing and scoring windows. These reports have traditionally been specific to each assessment. In prior years, the Department has also worked to provide schools and districts greater detail as needed for planning and improvement purposes. This year, to better support LEAs in ensuring test security and integrity, the Department will provide annual summary reports in addition to in-window void reports. The Test Irregularity Summary should be used to inform test security training. Test Irregularity Summary Reports can be found in FTP folders.

5 Bulletin 118
Bulletin 118 requires LEAs to develop and adopt district test security policies that provide for:
- the security of test materials, including storage
- training of test coordinators and administrators
- investigation of irregularities
- procedures for monitoring of test sites

6 Agenda
Overview, Purpose, and Background
Test Irregularity Summary Report

7 Test Irregularity Summary Report: Purpose
The Test Irregularity Summary Report was designed to provide district leaders with the data and information necessary to improve test security and integrity. The components in the report include:
- voids resulting from plagiarism, reported incidents of test security violations, and administrative errors
- voids and flags revealed by erasure analysis
- retests administered due to administrative error
- EOC test session reopenings

8 Using Summary Reports
Leaders can use Summary Reports to:
- identify areas of security concern (e.g., plagiarism, erasure)
- identify areas of administrative concern (e.g., administrative errors, technology)
Such information is helpful in:
- evaluating current policies and procedures related to test security
- providing appropriate support and training in test administration procedures
- addressing technology issues as appropriate

9 Test Irregularity Summary Report: Sections
The Summary Report is divided into the following sections:
1. Test Scores Voided for Students and Schools
2. Erasure Voids/Flags for Schools
3. Administrative Error Retests
4. EOC Session Reopened Reports

10 Section 1: Test Scores Voided
Student test scores are voided when violations of test administration policy occur. Student test scores may be voided due to: (a) plagiarism, (b) reported test security violations, or (c) administrative errors that occur during the testing process. The following tables are examples of summary data providing the number of schools that had voided test scores, the number of tests voided, and the number of voids by reason and by testing program.

11 Section 1: District Summary
Component: Description
Number of Schools: Number of schools with voids
Number of Tests: Number of tests voided
Plagiarism Voids: Number of tests voided due to plagiarism identified in the scoring process
Reported Prohibited Behavior: Number of voids self-reported by the district due to test irregularities
Administrative Error Voids: Number of voids resulting from administrative errors (e.g., scheduling errors, accommodation misuse)
Number Voided by Testing Program: Number of voids broken down by testing program

12 Section 1: School Level Detail
Component: Description
Number of Tests: Number of tests voided at each school
Reason for Void: Reason for the void (Admin Error, District, Plagiarism)
Test Program: Test program in which the void occurred
Grade: Grade level in which the void occurred
Subject: Subject area in which the void occurred

13 Section 2: Erasure Analysis
Erasure analysis is a data forensic technique designed to detect possible tampering with student answer documents by examining excessive wrong-to-right erasures. Statistical analyses were conducted to determine where the number of wrong-to-right erasures made was improbable (i.e., a chance of 1 in 10,000 or less). Students who had an improbable number of wrong-to-right erasures but did not meet the void criteria were flagged. By district and school, the tables below show the number of students identified as having wrong-to-right erasures significantly higher than the state average, and whose results were voided or flagged for school accountability.
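The Department's exact statistical model is not published in this report. As an illustration only, the sketch below applies the stated 1-in-10,000 criterion under a hypothetical binomial model, in which each erasure independently turns out wrong-to-right with some baseline probability; the function names, the default baseline rate, and the model itself are assumptions, not the state's actual methodology:

```python
from math import comb

def wr_pvalue(total_erasures: int, wr_erasures: int, p_wr: float) -> float:
    """Chance of seeing at least `wr_erasures` wrong-to-right erasures out of
    `total_erasures`, assuming each erasure is wrong-to-right with baseline
    probability `p_wr` (hypothetical binomial model, not the state's)."""
    return sum(
        comb(total_erasures, k) * p_wr**k * (1 - p_wr) ** (total_erasures - k)
        for k in range(wr_erasures, total_erasures + 1)
    )

def review_status(total_erasures: int, wr_erasures: int, p_wr: float = 0.3) -> str:
    # The report's stated criterion: a count whose chance is 1 in 10,000
    # or less is improbable; such tests are voided or, if the void
    # criteria are not met, flagged.
    return "improbable" if wr_pvalue(total_erasures, wr_erasures, p_wr) <= 1e-4 else "typical"
```

Under these assumptions, 19 wrong-to-right erasures out of 20 against a 0.3 baseline has a chance of roughly 2 in a billion and would be deemed improbable, while 5 out of 20 would not.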

14 Section 2: District Summary
Component: Description
Number of Schools: Total number of schools with erasure analysis voids and flags
Number of Tests Voided/Flagged: Total number of tests voided or flagged based on erasure analysis
LEAP Voided: Total number of LEAP tests voided for excessive erasures
LEAP Flagged: Total number of LEAP tests flagged for excessive erasures, but not voided
iLEAP Voided: Total number of iLEAP tests voided for excessive erasures
iLEAP Flagged: Total number of iLEAP tests flagged for excessive erasures, but not voided

15 Section 2: School Level Detail
Column Under Subject Test: Column Value
Number of Tests Voided: Total number of tests voided for excessive erasures
Number of Tests Flagged: Total number of tests flagged for excessive erasures, but not voided
Test Administration: Test administration in which the voids and/or flags occurred
Grade: Grade level at which the voids and/or flags occurred
Subject: Subject area in which the voids and/or flags occurred

16 Section 3: Administrative Error Retest
When tests are administered incorrectly, students' results are voided. However, with the high-stakes testing programs (i.e., LEAP, GEE, LAA2, and EOC), students are afforded the opportunity to take an administrative error retest. Common administrative errors include administering the wrong test, errors in accommodation administration, and scheduling errors. The following tables outline the number of schools that administered error retests and the total number of students who took administrative error retests, by testing program.

17 Section 3: District Summary
Column Under Subject Test: Column Value
Number of Schools: Total number of schools with administrative error retests
Number of Tests: Total number of tests with administrative error retests
EOC: Number of EOC tests resulting in administrative error retests
LEAP: Number of LEAP tests resulting in administrative error retests

18 Section 3: School Level Detail
Column Under Subject Test: Column Value
Number of Tests: Total number of tests with administrative error retests
Test Program: Test program with administrative error retests
Test Administration: Test administration in which the retests occurred
Grade: Grade level at which the retests occurred
Subject: Subject area in which the retests occurred

19 Section 4: EOC Sessions Reopened
End-of-Course (EOC) exams are administered online for high school subjects. During each administration, test administrators may occasionally need to reopen a test session to allow students to complete it. Reasons for reopening a session include:
- technology issues
- administration of accommodations
- allowing additional time for completion of the assessment
When an EOC test session has to be reopened, the reason for reopening must be indicated in the EOC test system. The following tables report the total number and percent of students whose test sessions were reopened during the May 2014 administration, as well as the reasons reported for the reopened sessions.

20 Section 4: Number and Percent of Sessions Reopened
Column Under Subject Test: Column Value
Number of Sessions Opened: Number of test sessions opened, listed by state, district, and school
Number of Sessions Reopened: Number of test sessions reopened, listed by state, district, and school
Percent of Sessions Reopened: Percent of test sessions reopened, listed by state, district, and school
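The percent column is presumably the simple ratio of reopened sessions to opened sessions at each reporting level; a minimal sketch (rounding to one decimal place is an assumption, since the report does not state its convention):

```python
def percent_reopened(sessions_opened: int, sessions_reopened: int) -> float:
    """Percent of test sessions reopened at a given level (state, district,
    or school). Rounding to one decimal place is an assumption."""
    if sessions_opened == 0:
        return 0.0  # no sessions opened, so none could be reopened
    return round(100 * sessions_reopened / sessions_opened, 1)

# e.g., a school that opened 480 EOC sessions and reopened 12 of them
print(percent_reopened(480, 12))  # → 2.5
```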

21 Section 4: Self-Reported Reasons for Sessions Reopened
Column Under Subject Test: Column Value
Sessions Reopened: Number of test sessions reopened, listed by state, district, and school
Lost Internet Connection: Lost internet connection reported as the reason for reopening the session
Computer Crashed: Computer crash reported as the reason for reopening the session
Lost Power: Power loss reported as the reason for reopening the session
More Time: Additional time needed by the student reported as the reason for reopening the session
Accommodation: Accommodations reported as the reason for reopening the session
Emergency: Emergency situation reported as the reason for reopening the session
Illness: Student illness reported as the reason for reopening the session
Makeup Test: Makeup test reported as the reason for reopening the session
Other Reasons: Other reasons reported for reopening the session
Multiple Reasons: Multiple reasons reported for reopening the session

22 Next Steps for Districts
Use Summary Reports to:
- identify areas of security concern (e.g., plagiarism, erasure)
- identify areas of administrative concern (e.g., administrative errors, technology)
Such information is helpful in:
- evaluating current policies and procedures related to test security
- providing appropriate support and training in test administration procedures
- addressing technology issues as appropriate

