1 Why Data Integrity? The Importance of Submitting Quality Data
THE NETWORK SUMMER SUMMIT, JUNE 30, 2015
Copyright © Texas Education Agency 2015. All rights reserved.

2 Welcome and Introductions
Presenters:
◦ Heather Christie, Program Monitoring and Interventions Division
◦ Rachel Harrington, Performance-Based Monitoring Division

3 Purpose of Today's Presentation
Provide a brief history of the evolution of data and data integrity as critical priorities.
Review identification and interventions components of the data validation monitoring system.

4 IT'S ALL ABOUT DATA! Why Is the Data So Important?
Data's chief benefit is to provide the metrics leaders need to make decisions. It may serve other important functions, but that is the most important one. Data may come from many sources, and depending on the data integrity of those sources, different levels of cleaning may be needed to achieve "grade A" data. To support those decisions, quality data must be captured. Incorrect or incomplete data will only produce inaccurate analysis and reporting, and inaccuracies in reporting can cause leaders to make poor decisions or lead the entity in the wrong direction. Even a small misstep can take a lot of time and manpower to correct.
Common Data Sources
◦ External: Records are sent from a vendor or partner. Make sure that records contain all the fields needed; that the data is clean, complete, and correctly ordered; and that there are keys to join this data to other related datasets. It may be difficult to get corrections made to data at this source.
◦ Internal: Records are sent from another department or division within the organization. Make sure that records contain all the fields needed; that the data is clean and correctly ordered; and that there are keys to join this data to other related datasets. It should be easier to work with these source owners if corrections are needed.
One common solution to data problems is to push them off until later in the project because deliverables and timelines have to be met. "We'll come back and fix that later" or "It's not that big of a problem. It will be fixed in phase 2" are often costly oversights on management's part. When that data issue is not fixed, or the project is rushed into production early, these seemingly little issues can cause major problems. (A minimal validation sketch follows below.)
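
To make the "all fields present, clean, and joinable" checks above concrete, here is a minimal sketch of incoming-record validation in Python. The field names (student_id, campus_id, leaver_code) and the specific rules are illustrative assumptions for this presentation's context, not TEA specifications.

```python
# Minimal sketch of incoming-record validation: required fields present,
# no empty values, and a join key that matches a related dataset.
# Field names below are hypothetical examples, not PEIMS element names.

REQUIRED_FIELDS = {"student_id", "campus_id", "leaver_code"}

def validate_records(records, known_student_ids):
    """Return a list of (row_index, problem) pairs for records that fail checks."""
    problems = []
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            problems.append((i, f"missing fields: {sorted(missing)}"))
            continue
        empty = [f for f in REQUIRED_FIELDS if not str(rec[f]).strip()]
        if empty:
            problems.append((i, f"empty values: {empty}"))
        # Key check: can this record be joined to the related dataset?
        if rec["student_id"] not in known_student_ids:
            problems.append((i, "student_id has no match in related dataset"))
    return problems

records = [
    {"student_id": "001", "campus_id": "A1", "leaver_code": "24"},
    {"student_id": "002", "campus_id": "", "leaver_code": "60"},
    {"student_id": "003", "campus_id": "B2"},  # leaver_code missing
]
print(validate_records(records, known_student_ids={"001", "002"}))
```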

5 When Did Data Become So Important?
Statute from the 78th Texas Legislature (2003) limited and redirected agency monitoring to focus on two critical components:
◦ Program monitoring (i.e., the Performance-Based Monitoring Analysis System [PBMAS] program monitoring of BE/ESL, CTE, NCLB, and special education programs through a data evaluation and reporting system), and
◦ Data validation monitoring.
This legislation recognized that the reliability of data evaluation and reporting systems depends on the validity and accuracy of the data that are used in those systems.

6 Statutory Excerpt Related to Data Integrity
TEC §7.028. Limitation on Compliance Monitoring. (a) Except as provided by Section 29.001(5), 29.010(a), 39.056, or 39.057, the agency may monitor compliance with requirements applicable to a process or program provided by a school district, campus, program, or school granted charters under Chapter 12, including the process described by Subchapter F, Chapter 11, or a program described by Subchapter B, C, D, E, F, H, or I, Chapter 29, Subchapter A, Chapter 37, or Section 38.003, and the use of funds provided for such a program under Subchapter C, Chapter 42, only as necessary to ensure: ... (3) data integrity for purposes of: (A) the Public Education Information Management System (PEIMS); and (B) accountability under Chapter 39. ...
Note: This is only one example of statute related to data integrity. Other examples can be found in a variety of state and federal laws, rules, and regulations.

7 Notification to Districts Regarding Focus on Data Integrity
The Commissioner's Office issued a To the Administrator Addressed letter in October 2003:
"House Bill 3459 (78th Texas Legislature, Regular Session, 2003) mandates important changes in the monitoring that the Texas Education Agency is authorized to conduct. The new legislation eliminates the DEC and IOR processes as formerly conducted by the agency and establishes clear parameters on the types and extent of allowable monitoring. In response to these changes, the agency intends to develop and implement a data-driven and performance-based monitoring system that 1) reduces—to the extent possible—

8 Notification to Districts Regarding Focus on Data Integrity (continued)
the burden of monitoring on school districts and charter schools by accurately identifying for further review only those with clear indicators of non-compliance or poor program quality and 2) encourages continuous improvement in alignment with the state's accountability system. This new data-driven system will enable the agency to monitor district and charter school performance on an ongoing, rather than cyclical, basis."

9 Notification to Districts Regarding Focus on Data Integrity (continued)
"Clearly, data collection and analysis will be an integral part of whatever monitoring the agency undertakes—whether that monitoring is conducted remotely from the agency or on-site at the district or charter school. Districts and charter schools are, therefore, strongly encouraged to verify that their internal data collection procedures and systems are effectively designed to assure data quality and integrity."

10 Why Is Data Integrity So Important?
Districts' data are used for many critical purposes:
◦ Funding
◦ Monitoring
◦ Evaluation
◦ Compliance
◦ Auditing
◦ Research
◦ Transparency and the Public's Right to Know

11 General Observations Since 2003
Districts that staffed their data collection and reporting functions/responsibilities with highly competent personnel and expertise have been the most successful at designing internal procedures and systems to assure data quality, accuracy, and integrity. Conversely, districts that have not made the ongoing development of data expertise and data capacity a priority continue to struggle with issues of data quality, accuracy, and integrity.

12 General Observations (continued)
Districts that developed, and have continued to sustain, strong communication and coordination among their data department, relevant program areas, and administrative leadership have been the most successful at designing internal procedures and systems to assure data quality, accuracy, and integrity. Conversely, districts that do not have these strong communication and coordination links continue to struggle with issues of data quality, accuracy, and integrity.

13 General Observations (continued)
Districts that approach data integrity monitoring as an ongoing, holistic endeavor have been the most successful at assuring data quality, accuracy, and integrity and implementing effective data-driven decisions. Conversely, districts that approach data integrity monitoring as an episodic, isolated activity completed by one individual continue to struggle with issues of data quality, accuracy, and integrity and are unable to implement effective data-driven decisions.

14 General Observations (continued)
Districts that regularly review, analyze, and evaluate their data have been the most successful at designing and/or modifying internal procedures and systems to assure data quality, accuracy, and integrity. Conversely, districts that only review their data when asked by TEA to do so continue to struggle with issues of data quality, accuracy, and integrity and lack the necessary internal systems, procedures, and processes.

15 General Observations (continued)
Districts that use their data to make policy, programmatic, and educational decisions have been able to effect real and lasting changes to promote program effectiveness and student achievement. Conversely, districts that rarely or never make data-driven decisions continue to struggle with issues of data quality, accuracy, and integrity as well as program effectiveness and student achievement.

16 Data Sources
There are two relevant data sources:
Public Education Information Management System (PEIMS) Data
◦ Submitted by districts to TEA four times a year
◦ The Fall and Summer submissions are the two submissions most commonly used for monitoring and accountability purposes.
Student Assessment Data
◦ Submitted by districts to the test contractor after each test administration
◦ The Fall, Spring, and Summer administrations are all used for monitoring and accountability purposes.

17 Data Monitoring
Data monitoring is implemented through TEA's Data Validation Monitoring System, which has two components:
1) Identification (Performance-Based Monitoring Division)
2) Interventions (Program Monitoring and Interventions Division)
Three data types are monitored: 1) leaver data, 2) discipline data, and 3) student assessment data.

18 Leaver Data

19 Throughout the year: a student withdraws, and the district enters a leaver code into PEIMS (leaver codes require specific documentation; see Appendix D). From PEIMS:
◦ Performance Reporting pulls the data for state accountability ratings: performance data in all 4 Indexes; public report and potential interventions.
◦ PBM pulls the data for PBMAS indicators: performance indicators in all programs; public report and potential interventions.
◦ SIS pulls the data for federal accountability identification: Focus Schools, Priority Schools, Reward Schools.
◦ DVM validates districts' leaver data and reporting process (DVM indicators #1, 3, 4, 5).

20 Leaver Data: The Identification Component
"Leaver" data are submitted for students in Grades 7-12 through the fall PEIMS submission. Every district's leaver data are evaluated every year by the PBM Division. An overall analysis is conducted through Indicator #1 (Leaver Data Analysis).
◦ This indicator evaluates the change in districts' dropout rates in relation to several components of interrelated data, including dropouts, graduates, other leavers, and underreported students.

21 Leaver Data: The Identification Component (continued)
Indicator #2 (Underreported Students) identifies districts exceeding the state standard for the count and/or percent of underreported students.
◦ An underreported student is one for whom none of the following statuses apply: graduate, previous graduate, returned on time, returned late, migrant student, mover, other leaver, GED recipient, or dropout.
◦ A district is identified under this indicator if it exceeds one or both of the following standards (a threshold sketch follows below):
  ◦ Count of underreported students: 100
  ◦ Percent of underreported students: 1.7%
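
As a concrete reading of the two standards above, here is a minimal sketch that flags a district when either threshold is exceeded. The exact comparison rules (strict vs. inclusive, rounding) and the denominator used for the percent are assumptions here; the authoritative definitions are in the Leaver Records Data Validation Manual.

```python
# Minimal sketch of the Indicator #2 check (underreported students).
# Assumption: "exceeds" means strictly greater than the standard, and the
# percent is taken against the district's total leaver pool.

COUNT_STANDARD = 100      # count of underreported students
PERCENT_STANDARD = 1.7    # percent of underreported students

def indicator_2_identified(underreported: int, total_leaver_pool: int) -> bool:
    """Flag a district that exceeds one or both standards."""
    if total_leaver_pool == 0:
        return False
    percent = 100.0 * underreported / total_leaver_pool
    return underreported > COUNT_STANDARD or percent > PERCENT_STANDARD

# Example: 45 underreported students out of 2,000 is 2.25%, over the
# percent standard even though the count standard is not exceeded.
print(indicator_2_identified(45, 2000))   # True
print(indicator_2_identified(45, 5000))   # False (0.9%, count <= 100)
```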

22 During the school start window (first day of school through September), the district tries to find "no shows" and enters leaver codes into PEIMS (leaver codes require specific documentation; see Appendix D). From PEIMS:
◦ Performance Reporting pulls the data for state accountability ratings: performance data in all 4 Indexes; public report and potential interventions.
◦ PBM pulls the data for PBMAS indicators: performance indicators in all programs; public report and potential interventions.
◦ SIS pulls the data for federal accountability identification: Focus Schools, Priority Schools, Reward Schools.
◦ DVM validates districts' leaver data and reporting process (DVM indicators #1, 3, 4, 5).
"No shows" that are not reported become "underreported students" (DVM indicator #2).

23 Leaver Data: The Identification Component (continued)
Indicator #3 (Use of Leaver Reason Codes by Districts with No Dropouts) identifies districts with no dropouts and a potentially anomalous use of certain leaver reason codes.
◦ The following leaver reason codes are evaluated collectively:
  ◦ 16 (Student returned to family's home country)
  ◦ 24 (Student entered college and is working toward an Associate's or Bachelor's degree)
  ◦ 60 (Student is home schooled)
  ◦ 81 (Student enrolled in a private school in Texas)
  ◦ 82 (Student enrolled in a public or private school outside Texas)

24 Leaver Data: The Identification Component (continued)
Indicator #4 (Use of One or More Leaver Reason Codes) identifies districts with a potentially anomalous use of one or more leaver reason codes.
◦ The following leaver reason codes are evaluated individually:
  ◦ 03 (Student died)
  ◦ 16 (Student returned to family's home country)
  ◦ 24 (Student entered college and is working toward an Associate's or Bachelor's degree)
  ◦ 60 (Student is home schooled)
  ◦ 66 (Student was removed by CPS and the district has not been informed of the student's current status or enrollment)
  ◦ 78 (Student was expelled under TEC §37.007 and cannot return to school)

25 Leaver Data: The Identification Component (continued)
Indicator #4 (Use of One or More Leaver Reason Codes), continued:
◦ 81 (Student enrolled in a private school in Texas)
◦ 82 (Student enrolled in a public or private school outside Texas)
◦ 83 (Administrative withdrawal)
◦ 85 (Student graduated outside Texas before entering a Texas public school, entered a Texas public school, and left again)
◦ 86 (Student completed the GED outside Texas)
◦ 87 (Texas Tech University ISD High School Diploma Program or the University of Texas at Austin High School Diploma Program)
◦ 90 (Student graduated from another state under provisions of the Interstate Compact on Educational Opportunity for Military Children)


27 Leaver Data: The Identification Component (continued)
Indicator #5 (Use of Certain Leaver Reason Dropout Codes) identifies districts with a potentially anomalous use of one or more leaver reason dropout codes.
◦ The following leaver reason dropout codes are evaluated individually:
  ◦ 88 (Student was ordered by a court to attend a GED program and has not earned a GED certificate)
  ◦ 89 (Student is incarcerated in a state jail or federal penitentiary as an adult or as a person certified to stand trial as an adult)


29 Leaver Data: The Identification Component (continued)
Indicator #6 (Missing PET Submission: August 17, 2015 – September 18, 2015) identifies districts that did not complete at least one PET submission during that window.
Indicator #7 (Missing PET Submission: 2014–2015 Reporting Year) identifies districts that did not complete at least one PET submission during that reporting year.

30 Leaver Data: The Identification Component (continued)
Indicator #8 (Continuing Students' Dropout Rate: Class of 2013, as of Fall 2014) identifies districts with a continuing students' dropout rate of 35% or higher.
Detailed information on all leaver data validation indicators can be found in each year's Leaver Records Data Validation Manual, available at: http://tea.texas.gov/pbm/DVManuals.aspx

31 Leaver Data: The Identification Component (continued)
Some leaver records data validation indicators may identify one or more districts that are collecting and reporting accurate data. Confirming the accuracy of data is a critical part of the process necessary to validate and safeguard the integrity of our overall monitoring, accountability, and financial systems.

32 Leaver Data: Intervention Component

33 Review: data (aggregate and student level); policies, procedures, and processes.
Verify: data is correct; processes and procedures are established and being followed.
Identify: inconsistencies and discrepancies; contributing factors to the anomaly.
Correct: revise procedures, including the monitoring system; provide training and communication for appropriate staff.

34 Stage Criteria
Stage 3:
◦ Triggered Indicator #2 this year and at least 2 and up to 5 of the last 8 years, OR
◦ Triggered Indicator #6 this year and at least 3 and up to 4 of the last 8 years
Stage 2:
◦ Triggered Indicator #2 this year and up to one year of the last 8 years, OR
◦ Triggered two or more indicators, OR
◦ Triggered Indicator #1, OR
◦ Triggered Indicator #4, leaver code 82
Stage 1:
◦ Triggered Indicator #3, OR
◦ Triggered Indicator #4, any leaver code except 82, OR
◦ Triggered Indicator #5, OR
◦ Triggered Indicator #6 this year, OR
◦ Triggered Indicator #8
(A stage-assignment sketch in code follows below.)
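
Read as decision logic, the staging rules above check the higher stages first. Here is a minimal sketch assuming top-down evaluation (Stage 3, then 2, then 1) and that the "of the last 8 years" history arrives as precomputed counts; the manual's actual evaluation order, edge cases, and tie-breaking are not specified here.

```python
# Minimal sketch of leaver DVM stage assignment, evaluated highest stage
# first. Inputs are assumptions: `triggered` is the set of indicator numbers
# triggered this year, `code_82` says whether Indicator #4 involved leaver
# code 82, and `prior2`/`prior6` count prior-year triggers of Indicators
# #2 and #6 within the last 8 years.

def leaver_stage(triggered: set, code_82: bool, prior2: int, prior6: int) -> int:
    # Stage 3
    if 2 in triggered and 2 <= prior2 <= 5:
        return 3
    if 6 in triggered and 3 <= prior6 <= 4:
        return 3
    # Stage 2
    if 2 in triggered and prior2 <= 1:
        return 2
    if len(triggered) >= 2 or 1 in triggered or (4 in triggered and code_82):
        return 2
    # Stage 1
    if triggered & {3, 5, 6, 8} or (4 in triggered and not code_82):
        return 1
    return 0  # no stage assigned

# Example: Indicator #6 triggered this year with 3 prior-year triggers -> Stage 3.
print(leaver_stage({6}, code_82=False, prior2=0, prior6=3))  # 3
```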

35 Stage 3: SLDR and/or DA; CAP if appropriate; submit to TEA; feedback on submission; additional follow-up and support.
Stage 2: SLDR and/or DA; CAP if appropriate; submit to TEA; feedback on submission.
Stage 1: SLDR and/or DA; CAP if appropriate; retain locally; random submissions.
SLDR – Student Level Data Review; DA – Data Analysis; CAP – Corrective Action Plan

36 2014-2015 DVM – Leaver Indicators
Indicator | Description | Intervention Document
1 | Leaver data analysis | DA
2 | Underreported students | SLDR and DA
3 | Use of leaver codes when no dropouts reported | SLDR and DA
4 | Use of one or more leaver codes | SLDR and DA
5 | Use of leaver codes | SLDR and DA
6 | Missing PET school start window | DA
7 | Missing PET all year | DA
8 | Continuing students drop-out rate | DA
A CAP is completed for any indicator for which a discrepancy is identified.

37 DVM – Leaver Resources
◦ DVM Leaver Manual (Performance-Based Monitoring)
  ◦ ESC Contact
  ◦ Appendix in DVM Student Assessment Manual
◦ Appendix D of PEIMS Data Standards (PEIMS)
◦ Intervention Resources (Program Monitoring and Interventions)
  ◦ Indicator workbook (SLDR and DA)
  ◦ Corrective Action Plan template
  ◦ Intervention Guidance document

38 Discipline Data

39 Discipline Data: The Identification Component
Discipline data are submitted through the PEIMS summer submission. Discipline data report what the student's conduct was (Disciplinary Action Reason Codes) and the district's subsequent response (Disciplinary Action Codes). Every district's discipline data are evaluated every year by the PBM Division.
Indicator #1 (Length of Out-of-School Suspension) identifies districts with one or more students reported with an OSS for more than three school days (regular districts) or more than 10 school days (charters). (A sketch of this check follows below.)
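
As an illustration of Indicator #1's length check, here is a minimal sketch that flags OSS actions exceeding the applicable limit. The record shape and the per-action comparison are assumptions for illustration; the Discipline Data Validation Manual defines the actual calculation.

```python
# Minimal sketch of the Indicator #1 length check: OSS longer than
# 3 school days in a regular district, or 10 in a charter. The record
# fields below are hypothetical, not PEIMS element names.

OSS_LIMIT_DAYS = {"regular": 3, "charter": 10}

def flag_long_oss(actions, district_type: str):
    """Return the disciplinary actions whose OSS length exceeds the limit."""
    limit = OSS_LIMIT_DAYS[district_type]
    return [a for a in actions
            if a["action"] == "OSS" and a["length_days"] > limit]

actions = [
    {"student_id": "001", "action": "OSS", "length_days": 2},
    {"student_id": "002", "action": "OSS", "length_days": 5},
    {"student_id": "003", "action": "ISS", "length_days": 6},
]
print(flag_long_oss(actions, "regular"))   # flags student 002 (5 > 3 days)
print(flag_long_oss(actions, "charter"))   # no flags (5 <= 10 days)
```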

40 Discipline Data: The Identification Component (continued)
Indicator #5 (Unauthorized DAEP Placement: Students under Age 6) identifies districts that reported a DAEP placement of one or more students under age 6 for a disciplinary reason other than expelling a student to a DAEP for bringing a firearm to school.
Indicator #6 (High Number of Discretionary DAEP Placements) identifies districts with a high number of discretionary DAEP placements for all students.

41 Discipline Data: The Identification Component (continued)
Indicator #7 (African American (Not Hispanic/Latino) Discretionary DAEP Placements) identifies districts with a higher rate of African American discretionary DAEP placements compared to the rate of discretionary DAEP placements for all students.
Indicator #8 (Hispanic Discretionary DAEP Placements) identifies charters with a higher rate of Hispanic discretionary DAEP placements compared to the rate of discretionary DAEP placements for all students. (A rate-comparison sketch follows below.)
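
Indicators #7 and #8 compare a group's discretionary DAEP placement rate against the all-students rate. Here is a minimal sketch of that comparison; the denominators (group enrollment) and any minimum-group-size or margin rules are assumptions, since the slides do not state them.

```python
# Minimal sketch of a group-vs-all rate comparison for discretionary DAEP
# placements (Indicators #7 and #8). The real indicators likely apply
# minimum group sizes and thresholds defined in the Discipline Data
# Validation Manual.

def placement_rate(placements: int, enrollment: int) -> float:
    return 100.0 * placements / enrollment if enrollment else 0.0

def group_rate_higher(group_placements, group_enrollment,
                      all_placements, all_enrollment) -> bool:
    """True when the group's discretionary DAEP rate exceeds the all-students rate."""
    return (placement_rate(group_placements, group_enrollment)
            > placement_rate(all_placements, all_enrollment))

# Example: group rate 6.0% vs. all-students rate 2.5%.
print(group_rate_higher(12, 200, 50, 2000))  # True
```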

42 Discipline Data: The Identification Component (continued)
Charters are not included in the following:
◦ Indicator #2 (Length of In-School Suspension)
◦ Indicator #3 (Unauthorized Expulsion: Students Age 10 and Older)
◦ Indicator #4 (Unauthorized Expulsion: Students under Age 10), and
◦ Indicator #9 (No Mandatory Expellable Incidents Reported for Multiple Years)

43 Discipline Data: The Identification Component (continued)
Detailed information on all discipline data validation indicators can be found in each year's Discipline Data Validation Manual, available at: http://tea.texas.gov/pbm/DVManuals.aspx
Some discipline data validation indicators may identify one or more districts that are collecting and reporting accurate data. Confirming the accuracy of data is a critical part of the process necessary to validate and safeguard the integrity of our overall monitoring, accountability, and financial systems.

44 Discipline Data: Intervention Component

45 Stage Criteria
Stage 3:
◦ Triggered three indicators, OR
◦ Triggered Indicator #1 this year and 5 previous years, OR
◦ Triggered Indicator #3 this year and 5 or 6 previous years, OR
◦ Triggered Indicator #6 this year and 6 previous years, OR
◦ Triggered Indicator #7 this year and 4 or 5 previous years
Stage 2:
◦ Triggered two indicators, OR
◦ Triggered Indicator #1 this year and 4 previous years, OR
◦ Triggered Indicator #3 this year and 3 or 4 previous years, OR
◦ Triggered Indicator #6 this year and 3 previous years, OR
◦ Triggered Indicator #7 this year and 2 or 3 previous years
Stage 1:
◦ Triggered Indicator #1 or #3 this year and up to 2 previous years, OR
◦ Triggered Indicator #7 this year and up to 1 previous year, OR
◦ Triggered Indicator #4, #6, #8, or #9 this year

46 2014-2015 DVM – Discipline Indicators
Indicator | Description | Intervention Document | Student Level Reports
1 | Length of OSS | SLDR and DA | DVM Report
2 | Length of ISS (Report Only) | No Documents |
3 | Unauthorized Expulsion Age 10 and Older | SLDR and DA | DVM Report
4 | Unauthorized Expulsion Under Age 10 | SLDR and DA | DVM Report
5 | Unauthorized DAEP Under 6 | SLDR and DA | DVM Report
6 | High Discretionary DAEP | SLDR and DA | PEIMS Edit+
7 | African American DAEP Placement | SLDR and DA | PEIMS Edit+
8 | Hispanic DAEP Placement | SLDR and DA | PEIMS Edit+
9 | No Mandatory Expellable Offenses | DA |
A CAP is completed for any indicator for which noncompliance is identified.

47 Student Assessment Data

48 Student Assessment Data: The Identification Component
Student assessment data are submitted to the test contractor after each test administration. Every district's student assessment data are evaluated every year by the PBM Division. One set of indicators evaluates answer documents coded "Absent," and another set evaluates answer documents coded "Other."

49 Students test, and the district sends answer documents; the data are processed and sent to TEA for use in accountability systems. From there:
◦ Performance Reporting pulls the data for state accountability ratings: performance data in all 4 Indexes; public report and potential interventions.
◦ PBM pulls the data for PBMAS indicators: performance indicators in all programs; public report and potential interventions.
◦ SIS pulls the data for federal accountability identification: Focus Schools, Priority Schools, Reward Schools.
◦ DVM validates districts' student assessment processes, specifically student inclusion.

50 Students test, and the district sends answer documents; the data are processed and sent to TEA for use in accountability systems. DVM validates districts' student assessment processes, specifically student inclusion.

51 These score code and answer document statuses result in the student not being included in the performance indicators used in accountability systems:
◦ Absent: a score code used when the student is not at school and does not test, including during the make-up window.
◦ Other: a score code used when the student's results are not to be used; its use is guided by the DCCM.
◦ Not Found: the status of an answer document when the student is reported as taking the course but no EOC document for the student is found.

52 Answer documents coded Not Found, Absent, or Other follow the same flow: Performance Reporting pulls the data for state accountability ratings, PBM pulls the data for PBMAS indicators, SIS pulls the data for federal accountability identification, and DVM validates districts' student assessment processes, specifically student inclusion.

53 These statuses map to the DVM indicators as follows:
◦ Absent: STAAR 3-8 Indicators #1-#5; TELPAS Indicator #11; EOC Indicator #13
◦ Other: STAAR 3-8 Indicators #6-#10; TELPAS Indicator #12; EOC Indicator #13
◦ Not Found: EOC Indicator #13

54 Student Assessment Data: The Identification Component (continued)
STAAR 3-8 Absent Rate Indicators #1(i-xi) - #5(i-xi)
◦ Subjects: Mathematics, Reading, Science, Social Studies, and Writing
◦ Student groups: all students, African American, American Indian, Asian, Hispanic, Pacific Islander, White, students with two or more races, economically disadvantaged, English language learners, and students served in special education
STAAR 3-8 Other Rate Indicators #6(i-xi) - #10(i-xi)
◦ Subjects: Mathematics, Reading, Science, Social Studies, and Writing
◦ Student groups: all students, African American, American Indian, Asian, Hispanic, Pacific Islander, White, students with two or more races, economically disadvantaged, English language learners, and students served in special education

55 Student Assessment Data: The Identification Component (continued)
TELPAS Absent Rate Indicator #11
◦ Subject: TELPAS Reading Test
◦ Students: Grades 2-12
TELPAS Other Rate Indicator #12
◦ Subject: TELPAS Reading Test
◦ Students: Grades 2-12

56 Student Assessment Data: The Identification Component (continued)
Indicator #13 (i-v) (STAAR EOC Test Participation Rate) evaluates discrepancies between course completion data and STAAR EOC test participation (i.e., Absent, Other, or Answer Document Not Found). (A discrepancy-detection sketch follows below.)
◦ Calculated for the following EOC assessments:
  ◦ Algebra I
  ◦ English I
  ◦ English II
  ◦ Biology
  ◦ U.S. History
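
To illustrate the kind of discrepancy Indicator #13 looks for, here is a minimal sketch that cross-references course completers against scored answer documents. The data shapes and the idea of matching on a student ID per subject are assumptions for illustration; the real indicator is defined in the Student Assessment Data Validation Manual.

```python
# Minimal sketch of an EOC participation cross-check: students reported as
# completing a course whose answer document is Absent, Other, or Not Found.

def eoc_discrepancies(course_completers, answer_docs):
    """Map each completer of a course to a discrepancy status, if any.

    course_completers: dict of subject -> set of student IDs
    answer_docs: dict of (subject, student_id) -> score code ("S" = scored)
    """
    flagged = []
    for subject, students in course_completers.items():
        for sid in sorted(students):
            code = answer_docs.get((subject, sid), "NOT FOUND")
            if code in ("ABSENT", "OTHER", "NOT FOUND"):
                flagged.append((subject, sid, code))
    return flagged

completers = {"Algebra I": {"001", "002", "003"}}
docs = {("Algebra I", "001"): "S", ("Algebra I", "002"): "ABSENT"}
print(eoc_discrepancies(completers, docs))
# [('Algebra I', '002', 'ABSENT'), ('Algebra I', '003', 'NOT FOUND')]
```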

57 Which Students Have to Meet the STAAR Graduation Requirements?
Students who were first enrolled in grade 9 or below in the 2011–2012 school year, the first year of the STAAR program, have to meet the STAAR graduation requirements to earn a high school diploma from a Texas public or charter school. Students who repeated grade 9, or were enrolled in grade 10 or above, in the 2011–2012 school year have to meet the Texas Assessment of Knowledge and Skills (TAKS) graduation requirements.

58 Which Students Have to Meet the STAAR Graduation Requirements? (continued)
A district that determined that a student reported in PEIMS in grade 9 during the 2011–2012 school year is a TAKS graduate because of prior grade 9 attendance outside the Texas public school system must be able to provide documentation, upon request from TEA, substantiating the district's designation of the student as a TAKS graduate.

59 Which Students Have to Meet the STAAR Graduation Requirements? (continued)
The information available to TEA for determining whether a student is a STAAR graduate or a TAKS graduate is the graduation rate cohort. Cohort assignment for graduation rate purposes is determined by the grade level submitted by the district in PEIMS. Grades attended outside Texas public schools are not collected in PEIMS and therefore cannot be considered when determining cohort assignment.

60 Which Students Have to Meet the STAAR Graduation Requirements? (continued)
Detailed information about graduation rate cohort assignment may be found in the technical documentation associated with graduation rate processing, available at: http://tea.texas.gov/acctres/dropcomp_index.html#documentation


65 Student Assessment Data: The Identification Component (continued)
Detailed information on all student assessment data validation indicators can be found in each year's Student Assessment Data Validation Manual, available at: http://tea.texas.gov/pbm/DVManuals.aspx
Some student assessment data validation indicators may identify one or more districts that are collecting and reporting accurate data. Confirming the accuracy of data is a critical part of the process necessary to validate and safeguard the integrity of our overall monitoring, accountability, and financial systems.

66 Student Assessment Data: Intervention Component

67 Stages of Intervention
Stage 1:
◦ Triggered one to four indicators
Stage 2:
◦ Triggered five to nine indicators, OR
◦ Triggered Indicator #13(i): Algebra I - Not Found, OR
◦ Triggered Indicator #13(iii): English II - Absent, OR
◦ Triggered Indicator #13(iv): Biology - Not Found
Stage 3:
◦ Triggered ten or more indicators, OR
◦ Triggered at least three of the five #13 indicators (i-v) - Absent, OR
◦ Triggered at least three of the five #13 indicators (i-v) - Not Found

68 DVM – Student Assessment Indicators
SA Indicator | Description | Workbook Content | Student Level Report
1 | STAAR Absent Rate – Math (i-xi) | SLDR and DA | DVM Report
2 | STAAR Absent Rate – Reading (i-xi) | SLDR and DA | DVM Report
3 | STAAR Absent Rate – Science (i-xi) | SLDR and DA | DVM Report
4 | STAAR Absent Rate – Social Studies (i-xi) | SLDR and DA | DVM Report
5 | STAAR Absent Rate – Writing (i-xi) | SLDR and DA | DVM Report
6 | STAAR Other Rate – Math (i-xi) | SLDR and DA | DVM Report
7 | STAAR Other Rate – Reading (i-xi) | SLDR and DA | DVM Report
8 | STAAR Other Rate – Science (i-xi) | SLDR and DA | DVM Report
9 | STAAR Other Rate – Social Studies (i-xi) | SLDR and DA | DVM Report
10 | STAAR Other Rate – Writing (i-xi) | SLDR and DA | DVM Report
11 | TELPAS Absent Rate | SLDR and DA | DVM Report
12 | TELPAS Other Rate | SLDR and DA | DVM Report
13 | EOC Participation Rate (i-v) | SLDR and DA | DVM Report
14 | CTE Coding Discrepancy | DA | DVM Report
Note on Indicator #13: i-v are indicators based on the EOC subject areas, and each subject area has calculations for Absent, Other, and Not Found. There are therefore 15 EOC indicators calculated within #13 (5 subject areas x 3 calculations), and each of the 15 can be "triggered."

69 Indicator 13 – Not Found
24. When should students take a STAAR EOC assessment?
Students should take a STAAR EOC assessment during the spring, summer, or fall administration, as close as possible to the completion of the corresponding course. Most students will have received instruction in an entire course or a significant portion of the course by the testing date or by the end of the school year, so they would participate in the spring administration. However, if by the end of the school year students have received instruction in only part of the course (the first half or the second half), then they would take the STAAR EOC assessment in whichever subsequent administration is closest to the time they are completing the course.

70 Indicator 13 – Not Found
24. When should students take a STAAR EOC assessment? (continued)
For students who are taking courses outside of the typical semester sequence, districts should carefully evaluate the timing of the course instruction as it relates to the STAAR EOC assessment schedule to ensure that students are provided the best opportunity to demonstrate their understanding of the course content. For example, because the spring administration of STAAR English I and English II typically occurs a month earlier than the administration of the other EOC assessments, districts should evaluate the extent to which students taking English I or English II in an accelerated block of instruction during the spring are able to complete their testing requirements. Students who do not participate in the spring STAAR administration may not be able or willing to return to school in July to take the assessments and will not have another opportunity to test until December of the following school year, months after they have completed the course.

71 DVM – Student Assessment Resources
◦ DVM Student Assessment Manual (Performance-Based Monitoring)
  ◦ ESC Contact
  ◦ Appendix in DVM Student Assessment Manual
◦ District/Campus Coordinator Manual (Student Assessment)
◦ Student Assessment FAQ (Student Assessment)
◦ Intervention Resources (Program Monitoring and Interventions)
  ◦ Indicator workbook (SLDR and DA)
  ◦ Corrective Action Plan template
  ◦ Intervention Guidance document

72 Contact Information
FOR SLIDES 4-46:
Division of Performance-Based Monitoring
Phone: (512) 936-6426
Email: pbm@tea.texas.gov
FOR SLIDES XX-XX:
Division of Program Monitoring and Interventions
Phone: (512) 463-5226
Email: PMIdivision@tea.texas.gov

