Doing more with less: Evaluation with the Rapid Cycle Evaluation Coach


1 Doing more with less: Evaluation with the Rapid Cycle Evaluation Coach
Erin Dillon, Mathematica Policy Research
Matthew Lenard, Wake County Public Schools

2 Session Overview
What is Rapid Cycle Evaluation?
Rapid Cycle Evaluation Coach demo
Activity to practice using the RCE Coach
Discussion on the RCE Coach and ‘Openness’
Q & A

3 What does evaluation look like in your agency?

4 What is rapid cycle evaluation?
Focused on measuring the impact of changes to existing program operations and services
Uses experimental or quasi-experimental methods to identify a causal relationship
Relies predominantly on administrative data to measure impacts (a minimal example follows this list)
Results can be observed quickly (within one year)
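Because rapid cycle evaluation leans on existing administrative data and simple designs, the core analysis is often little more than a difference in mean outcomes between a treatment and comparison group. The sketch below is a minimal illustration only, assuming a hypothetical administrative extract with a used_tech indicator and an end_score outcome; it is not the RCE Coach's own code.

```python
# Minimal sketch of a rapid-cycle impact estimate from administrative data.
# Assumes a hypothetical CSV with columns: student_id, used_tech (0/1), end_score.
import pandas as pd
from scipy import stats

df = pd.read_csv("admin_extract.csv")  # hypothetical file name

treated = df.loc[df["used_tech"] == 1, "end_score"]
comparison = df.loc[df["used_tech"] == 0, "end_score"]

# Impact estimate: simple difference in mean outcomes
impact = treated.mean() - comparison.mean()

# Frequentist check: Welch's two-sample t-test (unequal variances)
t_stat, p_value = stats.ttest_ind(treated, comparison, equal_var=False)

print(f"Estimated impact: {impact:.2f} points (p = {p_value:.3f})")
```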

5 A word on the word “cycle”
An on-ramp to a continuous improvement cycle
Short-term focus: one-time, opportunistic evaluations
Long-term vision: an iterative, continuous improvement tool
Linked to data systems
Agencies running multiple experiments per year

6 Rapid-cycle v. Program evaluation
Rapid-cycle Evaluation | Program Evaluation
Incorporated into regular practice and decision making | Done outside of regular practice
Narrow, targeted research question | Broader and/or multiple research questions
Uses existing data | Collects new data
Findings in one year or less | Multiple years until findings
Low-cost, usually in-house analysis | More costly analysis done by outside experts
Results lead to change and a new round of testing | Results are the end point

7 Who is doing RCE?

8 RCE Coach background
Free, openly licensed, online tool created by the U.S. Dept. of Education Office of Educational Technology
Created to help districts make better decisions about education technology products, but can be used for other education interventions
Tailored to a non-technical audience
Available at

9 RCE Coach demonstration

10 Bayesian v. frequentist interpretations
Bayesian Interpretation | Frequentist Interpretation
Assesses the probability that an intervention has the desired impact | Assesses whether results are statistically significant
Uncertainty can be framed in probabilistic terms: “There is a 77% chance that the new education technology improves student achievement, and a 23% chance that it decreases achievement.” | Uncertainty is typically framed in terms of the confidence interval: “The 95% confidence interval around the impact of the new education technology includes zero, so we cannot reject the hypothesis that there was no difference.”
User determines if results are practically significant | Results are usually presented as binary: statistically significant or not significant
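To make the contrast concrete, the sketch below computes both interpretations from the same impact estimate. It is a minimal illustration assuming a normal likelihood and a flat prior (so the posterior for the impact is approximately Normal(estimate, standard error)); the RCE Coach's internal model may differ, and the numbers are made up.

```python
# Minimal sketch contrasting frequentist and Bayesian readings of one impact estimate.
# Assumes a flat-prior normal posterior; the Coach's actual model may differ.
import numpy as np
from scipy import stats

estimate = 3.2   # hypothetical impact estimate (test-score points)
std_error = 4.5  # hypothetical standard error of that estimate

# Frequentist interpretation: 95% confidence interval and a binary significance call
ci_low, ci_high = stats.norm.interval(0.95, loc=estimate, scale=std_error)
significant = not (ci_low <= 0 <= ci_high)
print(f"95% CI: [{ci_low:.1f}, {ci_high:.1f}] -> statistically significant: {significant}")

# Bayesian interpretation: probability the impact is positive (or negative)
prob_positive = 1 - stats.norm.cdf(0, loc=estimate, scale=std_error)
print(f"Chance the technology improves achievement: {prob_positive:.0%}")
print(f"Chance it decreases achievement: {1 - prob_positive:.0%}")

# Bayesian credible interval around the impact estimate (same flat-prior posterior)
cred_low, cred_high = stats.norm.ppf([0.025, 0.975], loc=estimate, scale=std_error)
print(f"95% credible interval: [{cred_low:.1f}, {cred_high:.1f}]")
```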

11 Bayesian results in the RCE Coach
There is a 67% chance the technology has a positive impact
There is a 33% chance the technology has a negative impact
Impact estimate and credible interval

12 Activity: Using the RCE Coach
Work in groups or individually
Use scenarios provided or your own data
Scenarios are for a matched comparison design (see the matching sketch after this list); let Erin or Matt know if you want to try the random assignment tools
Download data for scenarios here: (Coach workshop resources)
Activity steps:
Create a log-in: select "Create Account"
Work through the RCE Coach set-up steps
Analyze the data using the RCE Coach analysis tool
Create a findings brief and be prepared to share your results
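For the matched comparison scenarios, the basic idea is to pair each student who used the technology with a non-user who looks similar on prior characteristics, then compare their outcomes. The sketch below is a minimal nearest-neighbor match on a single prior test score, using hypothetical column and file names; it illustrates the design rather than the Coach's own implementation.

```python
# Minimal sketch of a matched comparison group: nearest-neighbor match on prior score.
# Column names (used_tech, prior_score, end_score) and the file name are hypothetical.
# Assumes there are more non-users than users so every user can be matched.
import pandas as pd

df = pd.read_csv("scenario_data.csv")
users = df[df["used_tech"] == 1]
non_users = df[df["used_tech"] == 0].copy()

matches = []
for _, student in users.iterrows():
    # Pick the not-yet-matched non-user with the closest prior score
    gaps = (non_users["prior_score"] - student["prior_score"]).abs()
    best = gaps.idxmin()
    matches.append(non_users.loc[best])
    non_users = non_users.drop(best)  # match without replacement

comparison = pd.DataFrame(matches)
impact = users["end_score"].mean() - comparison["end_score"].mean()
print(f"Matched comparison impact estimate: {impact:.2f} points")
```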

13 Activity debrief
What did you learn in your analysis and findings brief?
Do you think rapid cycle evaluation can be useful in your agency?
What obstacles will there be to using a tool like the RCE Coach in your agency?

14 RCE Coach lessons learned and looking ahead
The Coach needs a champion to ensure RCEs are a priority
The Coach can serve as a local capacity-building tool
Practices associated with collecting, reporting, and interpreting usage data are still emergent
Ed tech developers are important partners in the RCE process
Looking ahead:
Build out non-academic achievement, teacher professional development, and staff productivity measures
Pilot with additional districts/schools
Add case studies based on pilot districts
Highlight and expand the “shared evaluation” page

15 RCE Coach and ‘Openness’
‘Share Evaluation’ page allows users to share their results publicly
Goal is to create a national bank of RCE results that describe education technology tools in different contexts
Hope to add the ability to create meta-analyses that aggregate results across districts (a minimal aggregation sketch follows this list)
Challenges in sharing results:
Districts are reluctant to share negative or null results
Ed tech developers see risk in potentially unfavorable results
Need more openness to sharing “failure”
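One common way to aggregate shared results across districts is an inverse-variance-weighted (fixed-effect) average, which gives more weight to the more precise district estimates. The sketch below uses made-up numbers and is only an illustration of that idea; it is not a feature the Coach currently offers.

```python
# Minimal fixed-effect meta-analysis sketch: inverse-variance-weighted average of
# district-level impact estimates. All numbers are made up for illustration.
import numpy as np

estimates = np.array([2.1, -0.5, 3.8])   # impact estimates from three districts
std_errors = np.array([1.2, 0.9, 2.0])   # their standard errors

weights = 1 / std_errors**2              # more precise estimates get more weight
pooled = np.sum(weights * estimates) / np.sum(weights)
pooled_se = np.sqrt(1 / np.sum(weights))

print(f"Pooled impact estimate: {pooled:.2f} (SE {pooled_se:.2f})")
```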

16 Interested in piloting the RCE Coach?
Identifying districts and schools to be pilot partners
Ready to pilot an ed tech product in summer or fall 2017
Ideally, schools or districts are:
Able to implement forward-looking evaluations, and/or
Interested in looking at student non-academic outcomes, teacher professional development, or staff productivity
Receive customized training and support from MPR or SRI
Go to and submit the form to express your interest

17 For more information
Erin Dillon
Matthew Lenard

