Reporting & Evaluation Workshop Lauren Amos, Liann Seiter, and Dory Seidel

2 Workshop Objectives Learn how to use a data dashboard to support your Title I, Part D, reporting and evaluation responsibilities.

3 Agenda: Part I
1. Presentation: Data Quality Overview and Common Issues
2. Dashboard Demonstration: Exploring National CSPR Data With the NDTAC Dashboard
3. Activity: Use the Dashboard To Conduct a CSPR Data Quality Review of Your State Data
4. Small-Group Discussion

4 Agenda: Part II
1. Dashboard Demonstration: GPRA Measures Used To Inform Federal Decisionmaking
2. Activity: Use the Dashboard To Drive Decisionmaking With Your State Data
3. Whole-Group Discussion

5 Presentation: Data Quality Overview and Common Issues

6 Why Is Data Quality Important?
You need to trust your data because they inform:
Data-driven decisionmaking
Technical assistance (TA) needs
Subgrantee monitoring
Application reviews
Program evaluation

7 What Makes “High-Quality Data”? High-quality data are:
Accurate
Consistent
Unbiased
Understandable
Transparent

8 Individual Programs: Where Data Quality Begins
If data quality is not a priority at the local level, the problems become harder to identify as the data are rolled up—problems can become hidden.
If data issues are recognized late in the process, it is more difficult (and less cost-effective) to identify where the issues are and rectify them in time.
(Data roll up from Individual Programs to the SA or LEA, to the SEA, and finally to ED.)

9 Role of the Part D Coordinator
Ultimately, coordinators cannot “make” the data high quality, but they can implement systems that make quality likely:
Understand the collection process.
Provide TA in advance.
Develop relationships.
Develop multilevel verification processes (see the sketch below).
Track problems over time.
Use the data.
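To make “multilevel verification” concrete, here is a minimal sketch, assuming each level reports its own total alongside the counts it rolls up. The function and the sample counts are hypothetical illustrations, not part of any NDTAC or CSPR system.

```python
# Minimal sketch of a multilevel verification check. The function and the
# sample counts are hypothetical illustrations, not an official CSPR tool.

def check_rollup(reported_total, component_counts, level):
    """Flag a level whose reported total differs from the sum of its parts."""
    actual = sum(component_counts)
    if reported_total != actual:
        return [f"{level}: reported total {reported_total} != sum of components {actual}"]
    return []

# Example: an LEA total that silently absorbs a local double-count.
program_counts = [42, 58]                           # counts from two individual programs
issues = check_rollup(105, program_counts, "LEA")   # LEA entered 105 instead of 100
print(issues)  # ['LEA: reported total 105 != sum of components 100']
```

Running the same check at each hand-off (program to SA/LEA, SA/LEA to SEA) surfaces a problem at the level where it entered the data, before it becomes hidden in the roll-up.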

10 Method #1: Use the Data!
The fastest way to motivate attention to data quality is to use the data that programs provide. The best way to increase data quality is to promote data use at the local level.

11 Should You Use Data That Have Data Quality Problems? Yes! You can use these data to…
Become familiar with the data and readily identify problems.
Know when the data are ready to be used or how they can be used.
Incentivize and motivate others.

12 Method #2: Incentivize and Motivate
1. Know who is involved in the process and their roles.
2. Identify what is important to you and your data coordinators.
3. Select motivational strategies that align with your priorities (and ideally encourage teamwork):
Reward – Provide bonuses/incentives for good data quality (at the individual or team level).
Provide Control – Set goals, but allow freedom for how to get there.
Belong – Communicate vision and goals at all levels.
Compare – Publish rankings, and make data visible (to individuals or to everyone).
Learn – Provide training and tools on data quality and data usage.
Punish – Withhold funding.

13 Method #3: Prioritize
Consider targeting only:
Top problem areas among all subgrantees
The most crucial data for the State
Struggling programs

14 Method #4: Know the Data Quality Pitfalls
Recognize and respond proactively to the things that can hinder progress:
Changes to indicators
Changes to submission processes
Staff turnover
Funding availability

15 Method #5: Renew, Reuse, Recycle
Develop materials up front. Look to existing resources and make them your own. Where to look:
NDTAC
ED
Your ND community
The Web

16 NDTAC: Tools for Proactive TA
Consolidated State Performance Report (CSPR) Guide
– Sample CSPR tables
– In-depth instructions
– Data quality checklists
Data Collection List
CSPR Tools and Other Resources
EDFacts File Specifications

17 Tools for Reviewing Data and Motivating Providers
EDFacts summary reports
– Review the quality of data entered via EDFacts
ED Data Express
– Compare data at the SA and LEA level to the nation or similar states
Data Dashboard
– Conduct data quality reviews or performance evaluations
– Make data-driven decisions

18 Common CSPR Data Quality Issues
Academic and vocational outcome counts exceed the counts of age-eligible students.
The sum of students by race, age, or gender does not equal the unduplicated count of students.
The number of long-term students testing below grade level is greater than the number of long-term students for reading and math.
The number of students demonstrating results (sum of rows 3–7) does not equal the number of students with complete pre- and posttests (row 2) for reading and math.
Average length of stay is 0 or more than 365 days.
(Many of these checks can be automated; see the sketch below.)
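Several of the issues above are arithmetic rules that can be checked automatically before submission. Below is a hedged sketch of such checks applied to one facility’s record; the dictionary keys are illustrative stand-ins for the corresponding CSPR table cells, not official field names.

```python
# Hedged sketch of automated checks for the issues listed above, applied to
# one facility's record. All keys are illustrative, not official CSPR fields.

def cspr_quality_flags(rec):
    flags = []
    # Race, age, and gender breakdowns should each sum to the unduplicated count.
    for breakdown in ("by_race", "by_age", "by_gender"):
        if sum(rec[breakdown].values()) != rec["unduplicated_count"]:
            flags.append(f"{breakdown} does not sum to the unduplicated count")
    for subject in ("reading", "math"):
        subj = rec[subject]
        # Long-term students below grade level cannot exceed long-term students.
        if subj["long_term_below_grade"] > rec["long_term_students"]:
            flags.append(f"{subject}: below-grade count exceeds long-term count")
        # Students demonstrating results (rows 3-7) should equal students with
        # complete pre- and posttests (row 2).
        if sum(subj["results_rows_3_to_7"]) != subj["complete_pre_post"]:
            flags.append(f"{subject}: rows 3-7 do not sum to row 2")
    # Average length of stay should fall in a plausible range of days.
    if not 0 < rec["avg_length_of_stay_days"] <= 365:
        flags.append("average length of stay is 0 or more than 365 days")
    return flags
```

Running checks like these at the facility or LEA level, before the data roll up, catches problems while they are still easy to trace.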

19 Dashboard Demonstration: Exploring National CSPR Data With the NDTAC Dashboard

20 Activity: Use the Dashboard To Conduct a CSPR Data Quality Review of Your State Data

21 Activity: CSPR Data Quality Review

22 Strategies for Improving Data Quality
Technical assistance: newsletters, webinars, phone calls, one-pagers
Monitoring: on-site verification (e.g., document reviews, a demonstration of the data collection process/system)
Interim data collection and quality control (QC) reviews
Data collection templates
Annual CSPR data show
Other ideas?

23 Whole-Group Discussion: Data Quality
Is your current data collection tool sound?
Do your subgrantees know how to use your data collection tool?
− How do you know?
− What TA do you provide to support their use of the tool?
Do your facilities have a data collection system, process, or tool that allows them to track student-level data?
How do your current subgrantee monitoring activities support CSPR data quality?

24 Dashboard Demonstration: GPRA Measures Used To Inform Federal Decisionmaking (or the Federal Perspective on State Performance)

25 Activity: GPRA Measures

26 Guiding Questions
1. GPRA Indicators
» In what areas is your state underperforming relative to the nation? (A minimal comparison sketch follows this list.)
» What contextual factors may explain why?
» What questions do you have about your state’s performance in this area?
» Is this cause for concern?
» If so, what are your next steps?
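One way to start on the first question: export your state’s and the nation’s GPRA indicator values from the dashboard and flag where the state trails the nation. This is a minimal sketch only; the indicator names and numbers below are invented for illustration.

```python
# Minimal sketch of a state-vs-nation comparison on GPRA-style indicators.
# Indicator names and values are invented for illustration only.

state = {
    "reading_improvement_pct": 31.0,
    "math_improvement_pct": 29.5,
    "hs_credits_earned_pct": 18.2,
}
nation = {
    "reading_improvement_pct": 35.4,
    "math_improvement_pct": 33.1,
    "hs_credits_earned_pct": 17.0,
}

for indicator, national_value in nation.items():
    gap = state[indicator] - national_value
    if gap < 0:
        print(f"{indicator}: {gap:+.1f} points vs. the nation -- worth a closer look")
```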

27 Activity: Use the Dashboard To Drive Decisionmaking With Your State Data

28 Activity: Decisionmaking

29 Guiding Questions
2. Program Performance
» Are certain demographic groups overrepresented in a particular program, and if so, are those programs taking measures to serve the unique needs of those populations?
» Are all students being served equally well, or are certain program types outperforming (or underperforming) their peers?
» Which program type(s) appear to be effective? Why might they appear to outperform the other program types (e.g., higher student performance upon entry, more highly qualified teachers, a smaller student-teacher ratio, low teacher attrition)?
» What questions do you have about your state’s performance in this area?
» Any causes for concern?
» If so, what are your next steps?

30 Whole-Group Discussion
1. Did these activities hold any surprises for you? Any major takeaways?
2. Are your subgrantees meeting the needs of your State’s youth who are neglected and delinquent?
3. Are youth needs being consistently met across program types?
4. What did the dashboard activity suggest about how well Title I, Part D (TIPD), funds are being used in your State?
5. Are the funds you’ve allocated to your subgrantees reflected in program outcomes?

31 Whole-Group Discussion (cont.)
1. “It’s Complicated”: What’s your Facebook status with your EDFacts Coordinator? What might you do to forge better ties with your State data person?
2. What are your next steps? What might you do differently moving forward to improve data quality and data-driven decisionmaking in your State?
3. How can NDTAC support you in these next steps?