Caveon Test Security Audit for Cesar Chavez Academy – Oral Report
December 5, 2009
Commissioned by the Colorado Department of Education


Overview
Onsite test security audit interviews were conducted at Cesar Chavez Academy on October 19 and 20, 2009, as well as by phone in the following two weeks.
–Review the CSAP test security policies
–Detect anomalous and inconsistent behaviors
CSAP test data were analyzed using Caveon Data Forensics.
–Describe and document anomalies, if any
–Relate anomalies to potential testing irregularities

Interview Findings
Cesar Chavez staff were cooperative and helpful throughout the process.
CSAP test administration training, policies, and procedures were explored in the interviews.
We found no evidence of:
–Answer sheet tampering
–Improper coaching
–Teaching to the test
–Unauthorized extension of testing sessions
–Cheating* on tests by students or staff
Comments from community members who wished to speak yielded nothing conclusive.
* Cheating – when someone has been given an unfair advantage. Since Caveon was unable to observe the process that determines who receives extra-time accommodations, the results on this point were inconclusive.

Interview Findings – Extra-Time Accommodation
Proctors provided extra-time accommodations for students deemed eligible for extra time.
Extra-time accommodation eligibility was determined by committee.
Interview answers did not reveal how the committee allotted extra-time accommodations to students.

Interviews
All staff involved with CSAP testing in 2009 were interviewed, including:
–All teachers (but one)
–Directors
–School Psychologist
–School Assessment Director
–Community members who identified themselves as wanting to comment
–SD 60 Assessment Director

Data Forensics Findings
At Cesar Chavez (CCA), we found no evidence of:
–Answer sheet tampering (through erasure analysis)
–Test coaching (through similar-test analysis)
–Unusual gains from prior years
–Unexpectedly high scores
At CCA, we did find evidence of unusual allotment of extra-time accommodations.
At Dolores Huerta, we found no evidence of any form of testing irregularity.
If improper assistance was provided to students while taking the test, it was done on an individual basis.

Data Forensics Findings – Extra-Time* Accommodations at CCA
Normal rates of extra time were found in 2006.
Extreme rates of extra-time accommodations were found for all grades in 2007 and 2008.
Extreme rates of extra-time accommodations were found during 2009, especially for grades 7 and 3.
The process for granting the extra-time accommodation was inconsistent from 2008 to 2009.
The process for granting the extra-time accommodation at CCA in 2008 was also inconsistent with the process used by other schools in 2009.
* As determined by the CSAP Accommodations Flag

Data Forensics Method
Compute statistics by test for the entire state:
–Similarity
–Erasures
–Gains
–Accommodations: Extra Time and Oral Help
Tabulate flags by school to detect concentrations.
In-depth analysis of data from Cesar Chavez.

Interpretation of Data
Extreme data are expressed as index values: Probability = 10^-index.
An index value of 5 is usually “extreme” and represents one chance in 100,000.
An index value of 10 represents one chance in 10,000,000,000.
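
As a quick illustration of the index arithmetic above (a minimal sketch; the conversion is just a base-10 logarithm):

    import math

    def index_from_probability(p):
        # Probability = 10**(-index), so index = -log10(probability).
        return -math.log10(p)

    print(index_from_probability(1e-5))   # 5.0 -> one chance in 100,000
    print(index_from_probability(1e-10))  # 10.0 -> one chance in 10,000,000,000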

School Tabulation (2009)
1,713 schools; 1,563,204 test instances
–Subjects: Math, Reading, Writing, Science
–All grades and subjects are pooled
CCA index values:
–Mean score, 67.6%
–Overall index,
–M4 Similarity index, 0.5
–Erasures index, 0.3
–Gains index, 0.2
–Extra Time index,
–Oral Help index, 1.8
Conclusion: CCA is extreme for an excessive number of tests with extra-time accommodations in 2009. (A similar pattern was found in 2007 and 2008 but not 2006.)

School-Grade Tabulation (2009)
5,608 school-grade combinations (subjects pooled)
Probability = 10^-index; Extreme = 5.75; Marginal = 4.73

    Grade   Observed   Expected   Index
    3       18%        4%
    4                  9%
    5                  7%
    6                  7%
    7                  5%
    8                  5%         12.0

School-Grade-Subject Tabulation (2009)
18,731 school-grade-subject combinations
Probability = 10^-index; Extreme = 6.27; Marginal = 5.25

    Grade/Subj.   Observed   Expected   Index
    3/M           24%        2%         19.0
    3/R           9%         7%         0.8
    3/W           20%        4%         10.6
    4/M           24%        4%         14.3
    4/R           24%        11%        4.4
    4/W           24%        11%        4.4
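
The report does not spell out how these index values are computed. One plausible reading (an assumption, not Caveon's documented method) is a binomial tail probability: how likely is it to see at least the observed number of flagged tests at one school, given the statewide rate? A sketch using the 3/M row above and a hypothetical count of 50 tests:

    import math
    from scipy.stats import binom

    n = 50                      # hypothetical number of grade-3 math tests at CCA
    state_rate = 0.02           # expected rate from the table above (2%)
    flagged = round(0.24 * n)   # observed rate from the table above (24%)

    p_tail = binom.sf(flagged - 1, n, state_rate)  # P(X >= flagged)
    index = -math.log10(p_tail)
    print(index)  # well above the 6.27 "extreme" threshold for this tabulation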

School-Grade-Subject Tabulation (2009)
(Continued; Extreme = 6.27; Marginal = 5.25)

    Grade/Subj.   Observed   Expected   Index
    5/M           11%        5%         2.1
    5/R           15%        11%        1.0
    5/Sci         11%        3%         4.9
    5/W           15%        11%        1.0
    6/M           21%        4%         11.8
    6/R           17%        8%         3.0
    6/W           17%        8%         3.0

School-Grade-Subject Tabulation (2009)
(Continued; Extreme = 6.27; Marginal = 5.25)

    Grade/Subj.   Observed   Expected   Index
    7/M           28%        4%         22.9
    7/R           57%        6%         61.6
    7/W           57%        6%         61.4
    8/M           25%        4%         13.4
    8/R           8%         6%         0.8
    8/Sci         10%        2%         4.3
    8/W           8%         6%         0.8

Year-by-Year Tabulations – Grade-Subject Rate Comparisons
If Index > 12, the rate is highlighted in red.
If Index > 8, the rate is highlighted in gold.
If Index > 4, the rate is highlighted in tan.
2007 and 2008 were the most extreme years.
Residual effects were present in 2009.

Extra Time 2006
Red (>12); Gold (>8); Tan (>4)

                      Gr 3    Gr 4    Gr 5    Gr 6    Gr 7    Gr 8
    2006-math (ALL)   5.6%    6.6%    6.8%    5.0%    4.7%    4.5%
    2006-math (CCA)   4.1%    12.1%   14.8%   1.6%    9.9%    13.2%
    2006-read (ALL)   12.2%   14.2%   13.2%   9.3%    7.0%    5.8%
    2006-read (CCA)   28.6%   18.2%   22.2%   3.2%    9.8%    15.1%
    2006-writ (ALL)   6.8%    14.2%   13.2%   9.3%    7.0%    5.8%
    2006-writ (CCA)   2.0%    18.2%   22.2%   3.2%    9.8%    15.1%
    2006-sci (ALL)                    4.9%                    3.1%
    2006-sci (CCA)                    16.7%                   15.1%

“ALL” pertains to all schools in the state. Extra time is the % of students who received an extra-time accommodation.

Extra Time 2007
Red (>12); Gold (>8); Tan (>4)

                      Gr 3    Gr 4    Gr 5    Gr 6    Gr 7    Gr 8
    2007-math (ALL)   5.3%    6.4%    7.1%    6.0%    4.8%    4.7%
    2007-math (CCA)   77.5%   50.0%   60.8%   36.7%   40.3%   36.5%
    2007-read (ALL)   11.2%   13.0%           10.9%   7.0%    6.3%
    2007-read (CCA)   68.1%   50.0%   60.8%   36.7%   40.3%   34.4%
    2007-writ (ALL)   6.5%    13.0%           10.9%   7.0%    6.3%
    2007-writ (CCA)   77.5%   50.0%   60.8%   36.7%   40.3%   34.4%
    2007-sci (ALL)                    4.9%                    3.2%
    2007-sci (CCA)                    60.8%                   35.4%

Extra Time 2008
Red (>12); Gold (>8); Tan (>4)

                      Gr 3    Gr 4    Gr 5    Gr 6    Gr 7    Gr 8
    2008-math (ALL)   2.8%    4.6%    4.8%    4.2%    3.7%    3.4%
    2008-math (CCA)   37.4%   84.3%   86.9%   46.2%   12.0%   37.1%
    2008-read (ALL)   7.6%    10.5%   9.0%    8.2%    6.2%    4.9%
    2008-read (CCA)   25.9%   95.0%   97.0%   64.3%   24.8%   41.7%
    2008-writ (ALL)   4.2%    10.5%   9.0%    8.2%    6.2%    4.9%
    2008-writ (CCA)   36.4%   95.0%   97.0%   64.3%   24.8%   41.7%
    2008-sci (ALL)                    2.9%                    2.2%
    2008-sci (CCA)                    31.3%                   0.0%

Extra Time 2009
Red (>12); Gold (>8); Tan (>4)

                      Gr 3    Gr 4    Gr 5    Gr 6    Gr 7    Gr 8
    2009-math (ALL)   2.5%    4.2%    4.9%    4.4%    3.8%    4.3%
    2009-math (CCA)   23.9%   24.4%   10.6%   21.3%   28.0%   25.5%
    2009-read (ALL)   6.6%    10.8%   10.6%   8.4%    5.9%    5.7%
    2009-read (CCA)   9.2%    23.6%   14.6%   16.9%   57.3%   8.5%
    2009-writ (ALL)   3.5%    10.8%   10.6%   8.4%    5.9%    5.7%
    2009-writ (CCA)   19.7%   23.6%   14.6%   16.9%   57.3%   8.5%
    2009-sci (ALL)                    2.9%                    2.4%
    2009-sci (CCA)                    11.4%                   10.4%

Overall Finding
The accommodation rate decreased in 2009 but was still extreme in some grades.
The accommodation rate was not the same for all grades.
–Were some grade cohorts* in greater need of accommodation?
–Why were students flagged in 2008 but not in 2009? What changed?
* Grade cohort – refers to similarities within the same group; in this case, grade levels: 3rd graders, 4th, 5th, etc.

Looking at Accommodation Rates by Cohort and Transition Group
Method: count flags for students who took the test in both years, 2008 and 2009 (a student who receives an accommodation is flagged; a sketch of this matching step follows below).
Extra-time flags should be consistent between years:
–Those not flagged should continue to not be flagged
–Those flagged should continue to be flagged
Three transition groups:
–Students moving into CCA in 2009 (Moved In)
–Students moving out of CCA in 2009 (Moved Out)
–Students within CCA for both 2008 and 2009 (Stayed)
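
A minimal sketch of that matching step, assuming hypothetical per-year record tables keyed by a student ID with a 0/1 extra-time flag (the table and column names are illustrative, not from the report):

    import pandas as pd

    # Hypothetical records: one row per student per year; flag = 1 means an
    # extra-time accommodation was given.
    rec2008 = pd.DataFrame({"student_id": [1, 2, 3, 4], "flag": [1, 1, 0, 0]})
    rec2009 = pd.DataFrame({"student_id": [1, 2, 3, 4], "flag": [1, 0, 0, 1]})

    # Keep only students tested in both years, then compare their flags.
    both = rec2008.merge(rec2009, on="student_id", suffixes=("_2008", "_2009"))
    both["same"] = both["flag_2008"] == both["flag_2009"]

    # 2x2 table: rows = 2008 flag (0/1), columns = same/diff in 2009.
    print(pd.crosstab(both["flag_2008"], both["same"]))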

Accommodation – “Stayed” (observed (expected))

    Math       same        diff        Chi-sq
    2008 (0)   182 (117)   40 (104)
    2008 (1)   65 (129)    180 (115)

    Read       same        diff        Chi-sq
    2008 (0)   144 (90)    41 (94)
    2008 (1)   86 (139)    199 (145)

    Writ       same        diff        Chi-sq
    2008 (0)   135 (82)    38 (90)
    2008 (1)   89 (141)    208 (155)

The Chi-Square statistic tests whether the proportion of students who received an extra-time accommodation in 2008 was consistent with the proportion for 2009. The critical value at the .05 level is 3.84 (one degree of freedom). In all three subjects, for the students who remained at CCA from 2008 to 2009, an unexpectedly large number of students were not given an extra-time accommodation in 2009 even though one was given in 2008.
(0) means no extra-time accommodation in 2008; (1) means an extra-time accommodation was given in 2008.
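
The slide's expected counts can be reproduced from the observed 2x2 table alone. A minimal sketch (assuming scipy) for the Math panel; the computed chi-square, roughly 144, lands far above the 3.84 critical value:

    from scipy.stats import chi2_contingency

    observed = [[182, 40],   # 2008 (0): same, diff
                [65, 180]]   # 2008 (1): same, diff

    chi2, p, dof, expected = chi2_contingency(observed, correction=False)
    print(expected)  # approx [[117.4, 104.6], [129.6, 115.4]] -> matches the slide
    print(chi2, p)   # chi2 far above 3.84, so the 2008 and 2009 proportions differ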

Accommodation – “Moved Out” (observed (expected))

    Math       same       diff       Chi-sq
    2008 (0)   31 (6)     0 (24)
    2008 (1)   1 (25)     117 (92)

    Read       same       diff       Chi-sq
    2008 (0)   55 (30)    5 (30)
    2008 (1)   16 (41)    66 (41)

    Writ       same       diff       Chi-sq
    2008 (0)   55 (29)    4 (29)
    2008 (1)   16 (41)    66 (40)

The Chi-Square statistic tests whether the proportion of students who received an extra-time accommodation in 2008 is consistent with the proportion for 2009. The critical value at the .05 level is 3.84 (one degree of freedom). In all three subjects, for the students who left CCA and moved to another school, an unexpectedly large number of students were not given an extra-time accommodation in 2009 even though one was given in 2008.
(0) means no extra-time accommodation in 2008; (1) means an extra-time accommodation was given in 2008.

Accommodation – “Moved In” (observed (expected))

    Math       same        diff       Fisher’s
    2008 (0)   113 (111)   28 (29)
    2008 (1)   5 (6)       3 (1)

    Read       same        diff       Fisher’s
    2008 (0)   110 (107)   25 (27)
    2008 (1)   5 (7)       5 (2)

    Writ       same        diff       Fisher’s
    2008 (0)   111 (108)   25 (27)
    2008 (1)   5 (7)       4 (1)

Fisher’s Exact Test is used, in place of Chi-Square, when expected cell values are lower than 5, to see whether the proportion of students who received an extra-time accommodation in 2008 is consistent with the proportion for 2009. In all three subjects, for the students who moved into CCA from another school, the students who were given an extra-time accommodation in 2009 were also given one in 2008.
(0) means no extra-time accommodation in 2008; (1) means an extra-time accommodation was given in 2008.
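
Likewise, Fisher’s exact test can be run directly from the observed counts in the Math panel (a sketch assuming scipy; the report does not state the p-values it obtained):

    from scipy.stats import fisher_exact

    table = [[113, 28],  # 2008 (0): same, diff
             [5, 3]]     # 2008 (1): same, diff

    odds_ratio, p_value = fisher_exact(table)
    print(odds_ratio, p_value)  # a non-extreme p-value here supports the slide's
                                # conclusion that the flags were consistent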

Summary of Transitions
The method by which the extra-time accommodation was determined at CCA in 2008 was not consistent within CCA in 2009, nor for students who left CCA in 2009.
Students who moved and had extra-time changes (flags differed from 2008 to 2009):
–348 total changes
–327 of the changes (94%) were for students granted extra time at CCA but given no extra-time assignment at the other school

Observations about Transition Groups
Many students were given extra time at CCA in 2008, but the same students were not given extra time in 2009, whether at CCA or another school.
The data suggest the method of extra-time assignment at CCA was different between 2008 and 2009.

Data Forensics Summary
Large numbers of students were given the extra-time accommodation at CCA, especially in 2007 and 2008. The data were extreme for 2007, 2008, and 2009.
Evidence suggests that extra-time assignment was not granted consistently at CCA from 2008 to 2009.
There were no additional data forensics indications of testing irregularities in 2009 on the part of educators (other than inconsistent assignment of extra time from grade to grade in 2009).
If improper assistance was provided to students while taking the test in 2009, it was done on an individual basis.

Possible Next Steps
Have special services professionals review cases at CCA involving placement rules, policies, and procedures.
Involve SD 60 personnel and CCA staff in these meetings.
Determine the training needed for consistency throughout the special services program.
Invalidate suspect test scores from CCA and re-administer the CSAP to affected students.
Compare special accommodation rates for consistency between SD 60 policy and CCA procedures.
Have tests shipped from SD 60 to CCA signed for by the director and immediately locked in a secure room, rather than signed for at the front desk.
Review special student placement procedures with District 60.
Improve special student placement procedures by conducting onsite reviews.