1 Strategies for Effective Program Evaluations U.S. Department of Education The contents of this presentation were produced by the Coalition for Evidence-Based Policy, in partnership with the National Opinion Research Center (NORC) at the University of Chicago, under a contract with the Institute of Education Sciences.

2 Step 1: Find a researcher with expertise in conducting rigorous impact evaluations to include on the study team.  Contact authors of previous, well-designed impact evaluations.  Ask the evaluator to provide a plan that describes, in non-technical language, how they would conduct the study.  Check references.

3 Step 2: Decide what research question(s) the study seeks to answer.  Evaluate a specific, well-defined MSP (Mathematics and Science Partnership) approach.  Measure the effect of the program on teachers’ content knowledge.  Measure the impact of the program on students’ knowledge.  If resources permit, ask questions about the long-term effects of the MSP projects.

4 Step 3: Decide on the study design.  Decide on the overall design: preferably a randomized controlled trial or, if that is not possible, a well-matched comparison-group study.  Decide on the sample size needed to measure teacher content knowledge – at least 90 teachers.  Decide on the sample size needed to measure student achievement – at least 60 teachers.  (The sketch below shows how such sample-size figures can be checked.)
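
A minimal power-analysis sketch for a two-group comparison of teacher content knowledge. The minimum detectable effect size (0.60 SD), 80% power, and 5% significance level are illustrative assumptions, not figures from the slides; with these inputs the required total comes out near the 90-teacher figure above. A real MSP design would also adjust for the clustering of students within teachers.

```python
from statsmodels.stats.power import TTestIndPower

# Teachers per group needed to detect an (assumed) 0.60 SD difference in
# teacher content knowledge with 80% power at a two-sided alpha of 0.05.
n_per_group = TTestIndPower().solve_power(effect_size=0.60, alpha=0.05, power=0.80)
print(f"Teachers per group: {n_per_group:.1f} (total ~{2 * n_per_group:.0f})")
```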

5 Step 3 continued:  Decide how to recruit and allocate teachers.  The simplest way to allocate teachers in an RCT is to apply random assignment to the entire sample. However, if there are large differences across the sample (e.g., high-achieving vs. low-achieving teachers), first group the teachers into blocks and then randomly assign within each block (see the sketch below).  In a comparison-group study, the groups must be very closely matched: first by the teachers’ achievement levels, and second by their students’ achievement.
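
A short sketch of the blocked assignment described above, assuming a teacher roster with a coarse achievement label. The field names and the fixed seed are hypothetical choices for illustration; only the block-then-randomize logic comes from the slide.

```python
import random

teachers = [
    {"id": "T01", "achievement": "high"},
    {"id": "T02", "achievement": "high"},
    {"id": "T03", "achievement": "low"},
    {"id": "T04", "achievement": "low"},
]

def blocked_assignment(roster, seed=42):
    """Group teachers into achievement blocks, then randomly assign
    within each block so both groups are balanced on achievement."""
    rng = random.Random(seed)  # fixed seed keeps the assignment auditable
    blocks = {}
    for t in roster:
        blocks.setdefault(t["achievement"], []).append(t["id"])
    assignment = {}
    for members in blocks.values():
        rng.shuffle(members)
        half = len(members) // 2
        for tid in members[:half]:
            assignment[tid] = "program"
        for tid in members[half:]:
            assignment[tid] = "control"
    return assignment

print(blocked_assignment(teachers))
```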

6 Step 3 continued:  Decide how to measure MSP project outcomes for student achievement. There are three conditions: --must obtain scores for individual students; --must obtain scores before the students enter the teacher’s class, and again at the end of the study; and --must be able to convert the scores so that they enable comparisons across grade levels (one common conversion is sketched below).  Decide how to measure teacher content knowledge.
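
The slides do not say which conversion to use; one common choice, shown here as an assumption, is to standardize each score within its grade level (a within-grade z-score). The column names and toy scores are hypothetical.

```python
import pandas as pd

scores = pd.DataFrame({
    "student_id": ["S1", "S2", "S3", "S4"],
    "grade":      [4, 4, 5, 5],
    "raw_score":  [310.0, 290.0, 425.0, 405.0],
})

# Within-grade z-score: (score - grade mean) / grade standard deviation,
# so a student's result is expressed relative to peers in the same grade.
by_grade = scores.groupby("grade")["raw_score"]
scores["z_score"] = (scores["raw_score"] - by_grade.transform("mean")) / by_grade.transform("std")
print(scores)
```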

7 Step 3 Example: [slide diagram contrasting the program group and the control group]

8 Step 4: Gain the cooperation of teachers and school officials.  Identify one or more senior-level advocates for the study.  Gain teachers’ cooperation by explaining the benefits of the research design.  Satisfy privacy (FERPA) concerns in your design.  Give special consideration to gaining cooperation when the matched comparison group is drawn from another school district.

9 Step 5: Allocate teachers to the program and control (or matched comparison) groups.  The evaluator, rather than the project director, should conduct the random assignment of teachers.  Ask participating teachers not to share their materials.  Ensure that students are assigned to classes using the normal procedures.

10 Step 6: Collect the data needed to measure the MSP project’s effectiveness.  Make the process as short and streamlined as possible.  Provide a unique personal identifier for each participant.  At the time of assignment, obtain pre-program performance data.  Make every effort to collect data from at least 80 percent of the original sample of teachers (a simple retention check is sketched below).
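
A small sketch of the 80-percent retention check, using the unique identifiers from the bullet above as the join key between assignment records and outcome data. The teacher IDs are hypothetical placeholders.

```python
# Hypothetical teacher IDs; in practice these come from the study roster.
original_sample = {"T01", "T02", "T03", "T04", "T05"}
with_outcome_data = {"T01", "T02", "T04", "T05"}

retention = len(with_outcome_data & original_sample) / len(original_sample)
print(f"Retention: {retention:.0%}")
if retention < 0.80:
    print("Below the 80% target: intensify follow-up before analysis.")
```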

11 Step 7: Analyze and report the study’s results.  Obtain regression-adjusted estimates of the MSP project’s effect on student achievement (see the sketch below).  Use the students’ pre-program test scores as a covariate.  Use data from all of the original participants, not just those who completed the program.
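
A minimal sketch of the regression adjustment, assuming scores have already been standardized as in Step 3. The variable names and toy data are assumptions; a full analysis would also account for the clustering of students within teachers.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy, standardized scores for eight students (assumed data).
df = pd.DataFrame({
    "posttest":  [0.4, 0.1, 0.9, 0.5, -0.2, 0.3, 0.7, 0.0],
    "pretest":   [0.2, -0.1, 0.6, 0.3, -0.3, 0.1, 0.5, -0.2],
    "treatment": [1, 0, 1, 1, 0, 0, 1, 0],
})

# OLS of the post-program score on treatment status, with the pre-program
# score as a covariate; the `treatment` coefficient is the adjusted effect.
model = smf.ols("posttest ~ treatment + pretest", data=df).fit()
print(f"Adjusted effect: {model.params['treatment']:.2f} "
      f"(p = {model.pvalues['treatment']:.3f})")
```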

12 [Closing slide pairing “Improved Intervention” with “Improved Evaluation”]

13 Help Desk Website: http://www.whatworkshelpdesk.ed.gov Phone: 1-866-WWC-9799 (8am-8pm ET Mon-Fri) Email: info@whatworkshelpdesk.ed.gov The Help Desk’s mission is to provide federal, state, and local education officials, researchers, program providers, and educators with practical, easy-to-use tools to (i) advance rigorous evaluations of educational interventions (i.e., programs, products, practices, and policies), and (ii) identify and implement evidence-based interventions.

