
1 Rural Dropout Prevention Project
Tools for measuring the use of Early Warning data
October 6, 2015

2 About the Rural Dropout Prevention Project
Goal: Provide technical assistance, as determined through collaboration with the state education agency (SEA), to states, districts, and schools, particularly those in rural communities, to support dropout prevention efforts.

3 Participating States
Fifteen states selected to participate (MT, WY, ND, SD, NE, IA, AR, OK, NH, VT, WV, NC, ME, AK, MS)

4 Project Priority Areas
Reengaging out-of-school youth
Assigning adult advocates to at-risk students
Implementing early-warning indicator systems
Implementing effective credit recovery and acceleration opportunities
Instituting “summer bridge” transition programs
Improving services and instruction for English language learners (ELL) and new immigrants
Improving services and instruction for migrant, American Indian, Alaska Native, minority, and at-risk students and youth
Supporting in-service leadership and professional development programs
Addressing other topics suggested by the SEAs that relate to or enhance the other priority areas

5 Gathering Input to Drive TA Planning
Asked for input about:
Unique challenges faced by rural communities
What dropout prevention looks like in rural areas and what approaches are successful
Types of support that would be helpful

6 Examples of What We Heard: Approaches to Dropout Prevention
Early Warning Tool available from the state that uses predictive analytics to identify at-risk students
Supporting whole-child needs
Partnering with communities and businesses
Utilizing flexibility provided through innovation zone grants
Focusing on school climate and student engagement
Conducting site visits to colleges starting in middle school
Providing alternative pathways through career and technical education (CTE) opportunities and connecting CTE and the core curriculum

7 Examples of What We Heard: Challenges
Using data to inform practice (not collecting data for data’s sake)
Grant writing
Changing community contexts

8 Focus Based on What We Heard
To increase the use of data to inform dropout prevention activities in West Virginia

9 Early Warning Implementation Pathway
Synthesize Research
Validate Indicators
Customize and Develop Tools and Supports
Launch and Implement
Assess and Improve Processes
West Virginia’s Early Warning Tool uses predictive analytics to identify students at risk. Justin will be providing a short update about this next.
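As a rough illustration of the kind of indicator check an early warning tool performs, the sketch below flags students against a few thresholds. The indicators (attendance, course failures, behavior incidents) are commonly cited early warning indicators; the field names and cutoffs are illustrative assumptions and do not represent West Virginia’s actual predictive analytics model.

```python
# Illustrative sketch only: common early warning indicators with assumed
# thresholds. West Virginia's actual tool uses predictive analytics; the
# cutoffs and field names below are hypothetical.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    attendance_rate: float     # fraction of days attended, 0.0-1.0
    course_failures: int       # failed core courses this term
    behavior_incidents: int    # office referrals/suspensions this term

def flag_risk(record: StudentRecord) -> list[str]:
    """Return the early warning indicators a student has tripped."""
    flags = []
    if record.attendance_rate < 0.90:    # assumed attendance threshold
        flags.append("attendance")
    if record.course_failures >= 1:      # assumed course performance threshold
        flags.append("course performance")
    if record.behavior_incidents >= 2:   # assumed behavior threshold
        flags.append("behavior")
    return flags

# Example: this student trips the attendance and course performance flags.
student = StudentRecord("S001", attendance_rate=0.85, course_failures=1, behavior_incidents=0)
print(flag_risk(student))   # -> ['attendance', 'course performance']
```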

10 Collecting Data Is Great, but Using It Is Critical
Synthesize Research
Validate Indicators
Customize and Develop Tools and Supports
Launch and Implement
Assess and Improve Processes

11 Early Warning Intervention Monitoring System Implementation Cycle
STEP 1: Establish roles and responsibilities
STEP 2: Use the EW tool
STEP 3: Review the EW data
STEP 4: Interpret the EW data
STEP 5: Assign and provide interventions
STEP 6: Monitor students
STEP 7: Evaluate and refine the process
Developed by the National High School Center in collaboration with
For more information on this process visit:

12 Tools to Measure Early Warning Implementation: Rubric and Interview
Rubric and worksheet

13 Sections of the Rubric and Worksheet
The sections are aligned with the implementation cycle, and each includes subsections:
System Features to Support Readiness and Implementation: Resources and organizational structures necessary to identify, support, and intervene with students at risk of dropping out of high school
Data Team and Structure: Team structure and process in place to review EWS data and other data in order to assign and monitor interventions
EWS Tool Capabilities: Components and capabilities of the tool (e.g., indicators and thresholds, reports) and maintenance of the tool to support identification of students at risk of dropping out
Review of EWS Data: Data review process to identify at-risk students, groups of students, or schoolwide patterns, and to explore underlying root causes of risk and develop a hypothesis for intervention and support
Interventions and Supports: Interventions and supports of varying focus (e.g., attendance, course performance, behavior) and intensity, aligned to address identified root causes of risk, provided to support dropout prevention efforts
Progress Monitoring: Ongoing and frequent monitoring of student progress across levels of interventions to track responsiveness to interventions and supports and to make adjustments and adaptations as needed
Continuous Improvement: System for collecting and analyzing data to measure fidelity and effectiveness of EWS implementation

14 Implementation Rubric
Includes a rating scale of 1–5 points, with anchors for:
1 point = little or no implementation
3 points = partial or inconsistent implementation
5 points = complete and consistent implementation

15 Implementation Interview
Goal is to create dialogue
Script and note-taking template for gathering information
Aligned with the rubric and includes sample questions
The Implementation Interview is intended to create dialogue about implementation. Reviewers do not need to use all of the available prompts and can streamline based on the context of the school. Reviewers may also consider whether any of the language needs to change based on the school’s context.

16 Goal of the Tools
The goal is not to be evaluative but to inform identification of implementation strengths, detect barriers that teams have encountered, and identify areas in need of additional focus or professional development.

17 Use of Tools: Administration
Self-assessment: The school-based team works through the questions and rates itself based on the descriptions in the rubric. It may be beneficial for team members to rate themselves individually, compare results, and discuss any differences.
External review: Independent reviewers hold a discussion with the school-based team using the Implementation Interview and then independently score using the rubric. Using two facilitators is recommended to allow for a check of interrater reliability.

18 Use of Tools: Scoring
Score against the specific language/definitions listed in the rubric
When in doubt, round scores down
Use scores of “2” and “4” when implementation does not exactly match the definitions of a “1,” “3,” or “5”
Average scores across each implementation area
Score components not in place as a “1”
Consider evidence when determining scores
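For readers who want to see these scoring rules in one place, here is a minimal sketch, assuming a simple mapping of implementation areas to component scores. The area and component values are hypothetical; only the rules themselves (whole-number scores from 1 to 5, components not in place counted as a “1,” and averaging within each implementation area) come from the slide. The “round down when in doubt” guidance applies to the rater’s choice of a score, so it appears only as a comment.

```python
# Sketch of the rubric scoring rules described above. Area/component data
# are hypothetical; the rules (1-5 whole-number scores, missing components
# scored as 1, averaging within each implementation area) come from the slide.
# "When in doubt, round scores down" applies to the rater's choice of a
# component score, not to the computed average.
from statistics import mean

def score_area(component_scores):
    """Average the component scores for one implementation area.

    Components not in place are passed as None and scored as a 1.
    Valid scores are whole numbers 1-5; 2 and 4 are used when
    implementation falls between the 1/3/5 anchors.
    """
    cleaned = [1 if s is None else s for s in component_scores]
    if any(s not in {1, 2, 3, 4, 5} for s in cleaned):
        raise ValueError("Rubric scores must be whole numbers from 1 to 5")
    return round(mean(cleaned), 2)

# Hypothetical example: one area with three components, one not yet in place.
example = {"Data Team and Structure": [3, 4, None]}
for area, scores in example.items():
    print(area, score_area(scores))   # -> Data Team and Structure 2.67
```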

19 Use of Tools: Time & Frequency
Estimated to take about 2 hours to conduct the interview with school sites, though this will depend on the depth of the conversations. If time is limited, reviewers could target specific sections of the rubric and interview.
Frequency: Collecting over time allows you to see change in implementation; consider collecting annually.

20 Status of Tools
Currently being customized to fit language for West Virginia
Look for additional information about access coming soon!

21 Questions

22 Video: Sneak Peek

23 Thank you! Amy Peterson

