Texas Education Agency, February 15, 2012
NCES Winter Forum and 25th Annual 2012 MIS Conference
Brian Rawson & David Butler
Fictitious district, school, staff, and student names. Stock photos. Release for web use of these photos on file.

Districts spend significant time providing data to TEA for PEIMS. Data that are shared back with districts are not timely and are not in a particularly useful format. The cost to districts is extremely high, estimated at $323M annually statewide. Data rarely make their way to the educators best positioned to improve student achievement.

State-sponsored SIS:
 Opt-in, voluntary SIS
 TEA has selected two options under the model for offering a state-sponsored SIS
 TSDS will integrate with other SISs; there is no requirement or mandate to switch
Education Data Warehouse (EDW):
 Powers student, campus, and district data dashboards
 Supported by the state, but the data are available only to educators
 Will become the conduit for submitting PEIMS data
 Loading non-PEIMS data is strictly optional and at the districts' discretion
 An XML data standard will make it easier to submit and certify data
PEIMS/TPEIR:
 Realign statewide data collection standards and protocols for districts
 Expanded to link pre-K, college readiness, and workforce data
 Load college readiness test score collections (SAT, ACT, AP test data)
© 2011 Michael & Susan Dell Foundation

Architecture diagram: district SISs and the state-sponsored SIS (SSIS) submit PEIMS data (certified/validated) and voluntary data to the EDW; through PEIMS/TPEIR, TEA will connect K-12 data with pre-K, college readiness, and workforce data.

 A data system for teachers, designed by teachers
 Delivers relevant, timely, and actionable student data back to educators to continually improve performance
 A comprehensive, easy-to-use resource for student data that brings together student information from multiple sources
 Reduces the reporting and collection burden on districts
 Requires no additional data input
 User-friendly, intuitive, and accessible from any location
 Available free of charge to all Texas districts
Goal: enable 100% of educators to have access to timely, relevant, actionable data to drive classroom and student success.


SSIS Stakeholder Engagement Process: Plenary Q&A, SIS Features Feedback, SPOT Analysis, Final Q&A
The goal was to gain an understanding of district/campus SIS needs and obtain feedback on the key features required. Questions were targeted to identify the needs of district and campus administrators, educators, PEIMS coordinators, and technical staff.
Plenary Q&A: Gauged initial reaction to the TSDS vision from stakeholders in breakout groups, including overall impressions, areas that are confusing or unclear, and aspects that are most and/or least appealing.
SIS Features Feedback: Documented basic demographic information, management systems currently in use, and comments on common SIS features. Participants rated the quality of their SIS features by placing green, yellow, and red stickers.
SPOT Analysis: A SPOT (Strengths, Problems, Opportunities, and Threats) analysis of the current system versus the changes in the new solution.
Final Q&A: A final summary of group findings captured the key take-aways of each participant; any additional questions or concerns were captured as well.

Stakeholder feedback made it possible to identify the strong and weak features of their current SIS systems.
Strong: Attendance, Disciplinary, EYP Tracking, Free and Reduced Lunch, Gifted and Talented, Grade Reporting, Health Records and Reporting, Historical Records, Special Ed Services, Student Scheduling, Master Schedule Building, Address Verification, Emergency Notification, Student Demographics
Weak: Classroom Management, Parental Portal, Extra-Curricular, Graduation Plan, Guidance and Counseling, Test Scores, Lesson Planner, Cafeteria Automation, Curriculum Maps, Textbook Tracking, Fees, Attachments, Data Mining and Summary, e-Signatures, Student Management, Mobile Portal

 After an extensive RFI/RFO selection process, two (2) SSIS vendors were chosen
 SSIS contracts are umbrella agreements that allow LEAs to purchase without the need for an RFP or RFO
 Each LEA will have a separate contract with the vendor
 Contract management teams will oversee the Service Level Agreements, including: SSIS availability, performance, and business continuity
 Not-to-exceed pricing allows each LEA to negotiate with the vendor


EDW Based Upon the Texas Education Data Standards (TEDS)
TEDS: an XML-based standard providing a common data model across all LEAs; 18 interchanges in total, covering both PEIMS and dashboards; the standard is stable and was released to the public in March 2012.
Ed-Fi: a CEDS-compliant national standard; 11 interchanges; some elements are dashboard-specific, others are used for both PEIMS and dashboards.
Texas Extensions: PEIMS-specific elements that can't be fulfilled through Ed-Fi; 7 interchanges.
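As a concrete illustration of what an XML-based interchange enables, here is a minimal sketch of loading attendance events from an interchange-style document. The element names below are invented for illustration; they are not the actual TEDS/Ed-Fi schema.

```python
import xml.etree.ElementTree as ET

# A toy interchange fragment in the spirit of an XML data standard such as
# TEDS/Ed-Fi. Element names are illustrative only, not the real schema.
SAMPLE_INTERCHANGE = """\
<InterchangeStudentAttendance>
  <AttendanceEvent>
    <StudentUniqueId>1000001</StudentUniqueId>
    <EventDate>2012-02-01</EventDate>
    <EventCategory>Excused Absence</EventCategory>
  </AttendanceEvent>
  <AttendanceEvent>
    <StudentUniqueId>1000002</StudentUniqueId>
    <EventDate>2012-02-01</EventDate>
    <EventCategory>Tardy</EventCategory>
  </AttendanceEvent>
</InterchangeStudentAttendance>
"""

def parse_attendance(xml_text):
    """Return one dict per AttendanceEvent element in the interchange."""
    root = ET.fromstring(xml_text)
    return [{child.tag: child.text for child in event}
            for event in root.findall("AttendanceEvent")]

events = parse_attendance(SAMPLE_INTERCHANGE)
```

Because every LEA submits against the same schema, a single loader like this can validate and ingest data statewide, which is the point of a common data model.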


Nationwide Best Practices Audit / Academic Research / Focus Groups with 2,600 Educators
Initial dashboards were based on national education research and a review of best practices across the country. Feedback on the dashboards was received and incorporated from 2,600 educators in Texas, and enhancements were made based on that stakeholder feedback.

Stakeholder Engagement Process Overview
Stakeholder engagement meetings were conducted over a two-month period (Mar-Apr 2010) and consisted of 12 three-hour sessions at 9 different regional sites: El Paso, Amarillo, Lubbock, Richardson, Edinburg, Houston, San Antonio, Mount Pleasant, and Midland.
 ~2,000 people attended the regional forums: 204 classroom teachers; 268 campus administrators and principals; 693 LEA administrators and superintendents; 637 PEIMS/IT coordinators; 160 webinar participants, with ongoing feedback collected via the TSDS website
 9 Regional Forums; 12 Feedback Sessions
 85 breakout sessions by 4 stakeholder groups (Teachers, Principals, Superintendents/Administrators, and PEIMS/IT Coordinators)

Stakeholder Engagement Process: Plenary Presentation, Plenary Q&A, "Clean Slate" Metrics Needs, Review Existing Snapshots, Final Q&A
The engagement process enabled stakeholders to understand the TSDS vision, review progress to date, express feedback, and provide input into the design of reports and tools.
Plenary Presentation (full group): TEA presented an overview of the TSDS vision; MSDF provided an overview of performance management, including action video clips.
Plenary Q&A (breakout sessions): Gauged initial reaction to the TSDS vision from stakeholders in breakout groups, including overall impressions, areas that are confusing or unclear, and aspects that are most and/or least appealing.
"Clean Slate" Metrics Needs: Prior to viewing any TSDS snapshots, stakeholders were asked to provide the "Top 10" critical questions/pieces of data they would want in a dashboard, including timing/frequency of use and importance.
Review Existing Snapshots: Following a discussion of the "clean slate" metrics, participants were asked to provide feedback on both the student and campus snapshots that had been created based on best-practice ideals.
Final Q&A: A final summary of group findings captured the key take-aways of each participant; any additional questions or concerns were captured as well.

Stakeholder Feedback - By Role
Open-ended stakeholder feedback, lines of inquiry, and areas of concern were largely tied to the responsibilities each group faces in their current roles.
Teachers' Focus:
– Complete historical information (What is the status of my incoming class?)
– Attendance and academic progress during the year (How are my students doing now?)
– Updated student demographics/contact information (What is going on with my student? Who do I contact regarding issues?)
– Peer comparisons (How is my class/my student comparing to like classes/students?)
Campus Leaders' Focus:
– Student/teacher performance trends (What is our progress? Where is help needed?)
– Ability to analyze/correlate data (What does the data tell us?)
– Data security, quality, and timeliness (Is the data relevant, accurate, safe?)
– Implementation realities (Who will enter this data? Who sets targets?)
District Leaders' Focus:
– Secure access to/oversight of data (Who will see this info?)
– Implementation realities (Who will pay for this? Who will input data?)
– Existing vendor overlap/integration (How will vendors react?)
– Report generation and customization capabilities (How can the data be used?)
– Tie to/fit with Campus Improvement Plans and Strategic Plans (How does this fit with existing processes?)
Use Information to Improve Student Outcomes | Use System to Improve Reporting and Reduce Costs

District/LEA > Campus > Classroom > Student. Data originates at the student level and rolls up to aggregate views; users can drill down to the individual student level from any aggregate view.
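The roll-up described above can be sketched as a simple aggregation over student-level records. The field names, the attendance metric, and the values below are hypothetical, chosen only to show the shape of the computation.

```python
from collections import defaultdict

# Student-level records (fictitious) carrying the aggregation keys.
records = [
    {"district": "D1", "campus": "C1", "classroom": "R1",
     "student": "S1", "attendance_rate": 0.96},
    {"district": "D1", "campus": "C1", "classroom": "R1",
     "student": "S2", "attendance_rate": 0.88},
    {"district": "D1", "campus": "C1", "classroom": "R2",
     "student": "S3", "attendance_rate": 0.92},
]

def roll_up(records, level):
    """Average a student-level metric up to the given level
    ("classroom", "campus", or "district")."""
    buckets = defaultdict(list)
    for r in records:
        buckets[r[level]].append(r["attendance_rate"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

by_classroom = roll_up(records, "classroom")
by_campus = roll_up(records, "campus")
```

Because every aggregate is derived from the same student rows, any aggregate view can be "drilled down" simply by filtering the underlying records on the same key.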

Educator feedback themes: positive interactive involvement with students; more teaching time; informed decision making; a user-friendly data system.
"After the information has been gathered, hopefully I would have a user-friendly service that would allow me to look at the information, serve my students and meet their needs more effectively and efficiently."

Metrics by "Useful" Rating - All Stakeholders
In general, the Academic Progress and Engagement metrics were considered more useful than the Academic Challenge and College/Career Readiness metrics.

Stakeholder Feedback - Clean Slate "Top 10"
Participants provided unsolicited feedback regarding the "Top 10" critical questions or pieces of data they would ideally see on Student, Teacher, and Campus dashboards.
Students - Top 3: Special Programs/Interventions, Attendance, TAKS Results (17%, 13%)
Teachers - Top 3: Student performance by* (32%), Teacher attendance (15%), Certifications (12%)
Campus - Top 3: Expenditures, Dropouts, Graduation Rate (21%, 18%)
Feedback generated during the Clean Slate exercise mapped very closely to the metrics identified through best-practices research, allowing for state-specific nuances (e.g., ELL, career-ready, etc.).
*"Student performance by" varied, but generally included students' grades, assessment performance, longitudinal testing data, and teacher fail rate.

Attendance & Discipline: daily attendance, class period attendance, tardy rates, discipline
Assessments & Grades: state assessments, language assessment, early reading, district benchmarks, course/subject area grades, credit accumulation and 4x4
Academic Potential: advanced course potential, college entrance exams (SAT, ACT, PSAT), college readiness indicators
Student Information: program participation, enrollment dates, contact information, key transcript data (current and historical courses and grades, TAKS history)
 Highlights trends over time and flags negative trends
 Easily drill down for more detail (e.g., finer grain, historical data, student exception lists)
 Highlights where students are not meeting performance goals

[Demo video: Demo Video.wmv]

Summary view for a class/teacher; the same format is used for campus, student, and teacher list drill-downs (exception lists)
Red text = not meeting goal; trend arrows compare to the prior period
Customize your view: columns are customizable and can be sorted
All lists can be exported to a CSV file
Clicking any student name takes you to the student dashboard
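The exception-list and CSV-export behavior described above can be sketched in a few lines. The student names, the attendance metric, and the 90% goal below are all made up for illustration; they are not TSDS's actual thresholds.

```python
import csv
import io

# Fictitious student records with one dashboard metric.
students = [
    {"name": "Ana Garza", "attendance_rate": 0.97},
    {"name": "Ben Cole", "attendance_rate": 0.84},
    {"name": "Carla Diaz", "attendance_rate": 0.89},
]

def exception_list(students, metric, goal):
    """Students not meeting the goal (shown in red), worst first."""
    missed = [s for s in students if s[metric] < goal]
    return sorted(missed, key=lambda s: s[metric])

def to_csv(rows, fieldnames):
    """Serialize the exception list for the dashboard's CSV export."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

missed = exception_list(students, "attendance_rate", 0.90)
csv_text = to_csv(missed, ["name", "attendance_rate"])
```

Sorting worst-first mirrors the dashboard's intent: the students most in need of intervention appear at the top of the exported list.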

Same format for student, campus, and district dashboards; navigation tools at the top; tabs describe content and are clickable
Each metric shows its name, status, value, and trend; indicates whether the campus goal is being met; trend direction is flagged if negative
Additional detail via a 'More' option; the date of the last data capture is shown

Available for most metrics on the campus dashboard: drill down by grade (Grade Level Detail); generate exception lists of students not meeting the goal (Student Exception List); historical trend detail

Available for most metrics on student dashboards: historical trend detail (Historical Detail); metric details for attendance, TAKS, and credits (Metric Details)

Search: quickly get to the information you need
Available on all dashboard pages; search for a student, teacher, or campus name; a drop-down list appears with the available options; results are limited by the user's access role
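A hedged sketch of a role-limited search like the one described: the directory entries, role names, and campus-scoping rule below are assumptions for illustration, not TSDS's actual access model.

```python
# Fictitious searchable directory of students and teachers.
DIRECTORY = [
    {"type": "student", "name": "Maria Lopez", "campus": "Travis HS"},
    {"type": "teacher", "name": "Mark Lowe", "campus": "Travis HS"},
    {"type": "student", "name": "Mark Nguyen", "campus": "Austin MS"},
]

def search(query, user_role, user_campus):
    """Prefix-match on name, then filter results to the user's access role.
    In this sketch, teachers only see records from their own campus."""
    query = query.lower()
    hits = [r for r in DIRECTORY if r["name"].lower().startswith(query)]
    if user_role == "teacher":
        hits = [r for r in hits if r["campus"] == user_campus]
    return hits

teacher_results = search("mar", "teacher", "Travis HS")
admin_results = search("mar", "district_admin", "")
```

Filtering after the match (rather than hiding the feature) is what "results limited to user access role" implies: the same search box serves every role, but each role sees a different slice.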


 Why measure Algebra I vs. other courses? Research indicates that students who enroll in and complete Algebra I by the 9th grade (a key prerequisite for higher math) graduate in higher numbers and are more likely to be college ready.
 Algebra I Metrics - High School:
 Taking or Have Taken: % of students who have taken Algebra I by the 9th grade. For students who have never enrolled, teachers and counselors can quickly identify who needs to be enrolled and, based on each student's academic history, provide the necessary support to register and prepare them for Algebra I.
 Passing or Have Passed: % who are passing/have passed by the 9th grade. For students currently enrolled in Algebra I, teachers can quickly identify those who need additional academic support to successfully complete the course. For former Algebra I students, teachers of advanced math courses can view Algebra I performance to evaluate the level of support students may require as they engage in more rigorous coursework.
 Algebra I Metrics - Middle School: This measure, defined as enrolling in and successfully completing Algebra I by the 8th grade, helps campuses identify students who show potential for more advanced coursework and the ability to master the higher-level math skills needed to succeed at the post-secondary level.
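The two high-school percentages defined above reduce to simple ratios over the 9th-grade roster. The student records below are fictitious.

```python
# Fictitious 9th-grade roster with the two Algebra I flags.
ninth_graders = [
    {"id": 1, "took_algebra1": True, "passed_algebra1": True},
    {"id": 2, "took_algebra1": True, "passed_algebra1": False},
    {"id": 3, "took_algebra1": False, "passed_algebra1": False},
    {"id": 4, "took_algebra1": True, "passed_algebra1": True},
]

def pct(students, flag):
    """Percentage of students for whom the boolean flag is set."""
    return 100.0 * sum(s[flag] for s in students) / len(students)

# "Taking or Have Taken" and "Passing or Have Passed" by 9th grade:
taking_or_taken = pct(ninth_graders, "took_algebra1")
passing_or_passed = pct(ninth_graders, "passed_algebra1")
```

With these toy records, 3 of 4 students have taken Algebra I (75%) and 2 of 4 are passing or have passed (50%); the gap between the two numbers is exactly the population the slide says teachers should target for support.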

Daily attendance: a common measure used by districts to track attendance and for state reporting
Class period attendance: used to pinpoint specific students negatively impacting ADA; the value is often lower than ADA
Tardy rates: catch students with attendance problems that aren't reflected in the daily attendance measures above
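A small sketch of the ratios behind these attendance measures, assuming a daily-attendance rate (as used for ADA) and a finer-grained class-period rate. The day and period counts below are invented; the point is only that the period-level rate can be lower than the daily rate for the same student.

```python
def daily_attendance_rate(days_present, days_enrolled):
    """Daily attendance: fraction of enrolled days the student was present."""
    return days_present / days_enrolled

def class_period_rate(periods_attended, periods_scheduled):
    """Class period attendance: fraction of scheduled periods attended.
    Often lower, since a student can be present for the day yet skip periods."""
    return periods_attended / periods_scheduled

# Hypothetical student: present 171 of 180 days, attending 1150 of 1260 periods.
daily = daily_attendance_rate(171, 180)
by_period = class_period_rate(1150, 1260)
```

Here the student looks fine on the daily measure (95%) while the period-level rate (about 91%) reveals missed classes, which is exactly why the slide recommends tracking both.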

 All Discipline Incidents: percentage of students with repeat occurrences (5 or more) of discipline incidents representing minor infractions (school code of conduct incidents), or one or more discipline incidents representing the most serious incidents (excluding school code of conduct incidents). Identifies students with chronic discipline issues.
 School Code of Conduct Incidents: percentage of students who have one or more minor infractions (school code of conduct incidents) in a given grading period. Identifies students with early signs of discipline problems.
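The two percentages defined above can be computed directly from per-student incident counts. The counts below are fictitious; the 5-incident and 1-incident thresholds come from the definitions on this slide.

```python
# Fictitious per-student incident counts for one grading period:
# "minor" = school code of conduct incidents, "serious" = the most
# serious incidents (excluding code of conduct incidents).
students = [
    {"minor": 6, "serious": 0},
    {"minor": 1, "serious": 0},
    {"minor": 0, "serious": 1},
    {"minor": 0, "serious": 0},
]

def all_discipline_pct(students):
    """% with 5+ minor incidents OR 1+ serious incident (chronic issues)."""
    hits = [s for s in students if s["minor"] >= 5 or s["serious"] >= 1]
    return 100.0 * len(hits) / len(students)

def code_of_conduct_pct(students):
    """% with 1+ minor incident in the period (early warning signs)."""
    hits = [s for s in students if s["minor"] >= 1]
    return 100.0 * len(hits) / len(students)
```

Note the two metrics flag overlapping but different students: the first student trips both measures, while the second trips only the early-warning measure.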


EDW Projected Timeline (FY10-FY14)
Milestones included: SLDS 2009 ARRA grant awarded; draft data standards for the EDW; draft data dashboards defined; initial EDW (dashboards) prototype; stakeholder engagement; dashboards finalized based on feedback; requirements defined (solution components); RFI for the DW solution issued and responses evaluated; 3 RFOs issued and vendors selected; data collection business processes re-engineered; limited production releases 1-5 with dashboard enhancements; PEIMS data standards published; limited production release 6 with integration with the SSIS and other SISs; interfaces with other source systems; hosted EDW; begin production deployment; PEIMS collection through the EDW; continued deployment to all districts.

Lessons learned
The importance of prototyping and piloting activities was reinforced, as significant learning and discovery continued during subsequent phases of the project. Sufficient time needs to be allocated for iterative change and refinement. The high level of stakeholder feedback is worth the additional effort associated with the refinements.

 Contact us about any additional TSDS questions.