Preliminary Results – Not for Citation Investing in Innovation (i3) Fund Evidence & Evaluation Webinar 2015 Update Note: These slides are intended as guidance only. Please refer to the official documents published in the Federal Register.

Agenda
- Evidence Standards (eligibility requirement): requirements, review process, and What Works Clearinghouse (WWC) evidence standards
- Independent Evaluation (program requirement): guidance on qualities of high-quality evaluation plans
- Questions & answers: live, submitted via the webinar chat function; post-webinar, by email

Evidence Standards—Eligibility Requirement

i3 Evidence Eligibility Requirements
- All applications must meet the applicable evidence requirement to receive a grant award.
- Applications that do not meet the evidence requirement will not be eligible for an award, regardless of scores on the selection criteria.
- If an application does not meet the evidence standard of the grant type under which it was submitted, it will not be considered for a different type of i3 grant.
- Applicants must either ensure that all evidence is available to the Department from publicly available sources and provide links or other guidance indicating where it is available, or include copies of the evidence in Appendix D of the application.

i3 Scale-up Grant Evidence Standard: "Strong Evidence of Effectiveness"
[Table: characteristics of the cited prior evidence under Option 1 and Option 2.]
*See the What Works Clearinghouse (WWC) Procedures and Standards Handbook, which can currently be found at the following link: ies.ed.gov/ncee/wwc/DocumentSum.aspx?sid=19

Scale-up Grant Evidence Eligibility Requirements
- To be eligible for an award, an application for a Scale-up grant must be supported by strong evidence of effectiveness.
- An applicant should identify up to four study citations to be reviewed against WWC Evidence Standards for the purpose of meeting the i3 evidence standard requirement.
- An applicant should clearly identify these citations in Appendix D, under the "Other Attachments Form," of its application. The Department will not review a study citation that an applicant fails to clearly identify for the evidence review.

i3 Validation Grant Evidence Standard: "Moderate Evidence of Effectiveness"
[Table: characteristics of the cited prior evidence under Option 1 and Option 2. Greyed-out cells indicate criteria on which the standards are silent.]
*See the What Works Clearinghouse (WWC) Procedures and Standards Handbook, which can currently be found at the following link: ies.ed.gov/ncee/wwc/DocumentSum.aspx?sid=19

Validation Grant Evidence Eligibility Requirements
- To be eligible for an award, an application for a Validation grant must be supported by moderate evidence of effectiveness.
- An applicant should identify up to two study citations to be reviewed against WWC Evidence Standards for the purpose of meeting the i3 evidence standard requirement.
- An applicant should clearly identify these citations in Appendix D, under the "Other Attachments Form," of its application. The Department will not review a study citation that an applicant fails to clearly identify for the evidence review.

i3 Development Grant Evidence Standards
- Option 1 (Strong Theory): no studies required; a logic model only. Statistical significance and WWC standards are not applicable.
- Option 2 (Evidence of Promise): at least one correlational study with statistical controls for selection bias showing a statistically significant or substantively important (0.25 standard deviation or larger) positive association with a relevant outcome.
- Option 3 (Evidence of Promise): at least one study that meets WWC standards with or without reservations showing a statistically significant or substantively important (0.25 standard deviation or larger) positive impact on a relevant outcome.
Note: Greyed-out/shaded cells in the original table indicate criteria on which the standards are silent.
*See the What Works Clearinghouse (WWC) Procedures and Standards Handbook, which can currently be found at the following link: ies.ed.gov/ncee/wwc/DocumentSum.aspx?sid=19
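To make the "statistically significant or substantively important" criterion concrete, the sketch below computes a standardized mean difference (Hedges' g) and a two-sample t-test for a hypothetical treatment/comparison contrast and checks it against the 0.25 standard deviation threshold named on this slide. The data, function name, and pooled-standard-deviation choice are illustrative assumptions, not part of the i3 notice or the WWC handbook.

```python
import numpy as np
from scipy import stats

def effect_size_and_significance(treatment, comparison, alpha=0.05):
    """Hedges' g (bias-corrected standardized mean difference) and t-test p-value."""
    t, c = np.asarray(treatment, float), np.asarray(comparison, float)
    nt, nc = len(t), len(c)
    pooled_sd = np.sqrt(((nt - 1) * t.var(ddof=1) + (nc - 1) * c.var(ddof=1)) / (nt + nc - 2))
    g = (t.mean() - c.mean()) / pooled_sd
    g *= 1 - 3 / (4 * (nt + nc) - 9)   # small-sample bias correction
    p_value = stats.ttest_ind(t, c).pvalue
    # "Evidence of promise" style check: significant OR >= 0.25 SD
    return g, p_value, (p_value < alpha) or (g >= 0.25)

# Hypothetical outcome scores, for illustration only
rng = np.random.default_rng(0)
treat = rng.normal(0.30, 1.0, 60)
comp = rng.normal(0.00, 1.0, 60)
g, p, promising = effect_size_and_significance(treat, comp)
print(f"g = {g:.2f}, p = {p:.3f}, meets threshold: {promising}")
```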

Evidence Standards—Eligibility Review Process

Responsibility for the Evidence Eligibility Reviews
- The Office of Innovation and Improvement (OII) assesses whether Development applicants meet the "strong theory" standard.
- The Institute of Education Sciences (IES) conducts the other evidence reviews. IES sends evidence citations to What Works Clearinghouse (WWC) reviewers.
- IES provides the WWC reviews, as well as information about the intervention, sample, and setting for each study reviewed, to OII.
- OII makes a determination about the relevance of the intervention, sample, and setting of the study compared to what the applicant has proposed.

Evidence Citation Reminders
- ED will limit reviews to evidence explicitly cited in Appendix D of the application as supporting the eligibility requirement.
- Applicants must ensure evidence is available to ED from publicly available sources and provide links or other guidance indicating where it is available. Or an applicant may include copies of evidence in Appendix D.
- Include only citations that are relevant to the intervention being proposed.
- For evidence that will go through WWC review, include only citations that are primary analyses of the effect of the intervention being proposed.

What Works Clearinghouse Evidence Standards

How Does the WWC Assess Research Evidence?
- Type of design: does the study design allow us to draw causal conclusions?
- Strength of data: does the study focus on relevant outcomes and measure them appropriately?
- Adequacy of statistical procedures: are the data analyzed properly?

WWC Standards Apply to Causal Designs
Eligible designs:
- Randomized controlled trials (RCTs)
- Quasi-experimental designs (QEDs)
Potentially eligible designs:
- Regression discontinuity designs (RDD)
- Single-case designs (SCD)
Ineligible designs:
- Anecdotes and testimonials
- Case studies
- Descriptive studies
- Correlational studies

Process for Assessing Research Evidence
- The WWC reviews studies using a relevant topic-specific protocol where possible; review protocols guide all WWC reviews.
- If a relevant topic-specific protocol does not exist, the WWC will use the Single Study Review protocol to guide the review.

Key Elements of the WWC RCT/QED Standards
[Decision flowchart: randomization, attrition, and baseline equivalence determine whether a study Meets Evidence Standards, Meets Evidence Standards with Reservations, or Does Not Meet Evidence Standards. An RCT with low attrition meets standards; an RCT with high attrition or a QED can meet standards with reservations only if baseline equivalence is demonstrated.]

Caution 1: RCTs with high sample attrition must demonstrate baseline equivalence
[Flowchart: an RCT with high attrition Meets Evidence Standards with Reservations if baseline equivalence is demonstrated, and Does Not Meet Evidence Standards otherwise.]

Standards Account for Overall and Differential Attrition Rates
Sample maximum attrition thresholds for meeting WWC evidence standards:

  Overall Attrition    Maximum Treatment-Control Differential Attrition
  15%                  10.7%
  20%                  10.0%
  30%                  8.2%
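As a rough illustration of how thresholds like these might be applied, the sketch below computes overall and differential attrition from randomized and analyzed sample counts and compares the differential rate to the three sample ceilings shown on this slide. The thresholds here are only the illustrative values from the slide; the WWC handbook defines a full attrition boundary, so this is not the official calculation, and the function names are invented.

```python
def attrition_rates(n_treat_randomized, n_treat_analyzed,
                    n_ctrl_randomized, n_ctrl_analyzed):
    """Overall and treatment-control differential attrition rates."""
    attr_t = 1 - n_treat_analyzed / n_treat_randomized
    attr_c = 1 - n_ctrl_analyzed / n_ctrl_randomized
    overall = 1 - (n_treat_analyzed + n_ctrl_analyzed) / (n_treat_randomized + n_ctrl_randomized)
    return overall, abs(attr_t - attr_c)

# Sample thresholds from the slide: (overall attrition, max differential attrition)
SAMPLE_THRESHOLDS = [(0.15, 0.107), (0.20, 0.100), (0.30, 0.082)]

def within_sample_thresholds(overall, differential):
    """True if differential attrition is at or below the ceiling for the
    nearest listed overall-attrition level at or above the observed rate."""
    applicable = [d for o, d in SAMPLE_THRESHOLDS if overall <= o]
    return bool(applicable) and differential <= max(applicable)

overall, diff = attrition_rates(500, 420, 500, 455)   # hypothetical counts
print(f"overall={overall:.1%}, differential={diff:.1%}, "
      f"low attrition: {within_sample_thresholds(overall, diff)}")
```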

Caution 2: All QEDs must demonstrate baseline equivalence between the treatment and control groups
[Flowchart: a study without random assignment (a QED) can at most Meet Evidence Standards with Reservations, and only if baseline equivalence is demonstrated; otherwise it Does Not Meet Evidence Standards.]
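Putting the two cautions together, the rating flow shown in these flowcharts can be summarized in a small decision function. This is a paraphrase of the logic on the slides, not the WWC's official rating procedure, and the function and argument names are invented for illustration.

```python
def wwc_group_design_rating(randomized: bool,
                            low_attrition: bool,
                            baseline_equivalence: bool) -> str:
    """Summarize the RCT/QED rating flow from the i3 webinar slides."""
    if randomized and low_attrition:
        return "Meets Evidence Standards"              # well-executed RCT
    if baseline_equivalence:
        # High-attrition RCTs and all QEDs must demonstrate equivalence
        return "Meets Evidence Standards with Reservations"
    return "Does Not Meet Evidence Standards"

print(wwc_group_design_rating(randomized=True, low_attrition=False,
                              baseline_equivalence=True))
# -> Meets Evidence Standards with Reservations
```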

Baseline Equivalence Standard
Equivalence must be demonstrated on the analytic sample.

  Baseline Difference (Treatment vs. Control)   Requirement
  < 0.05 sd                                     No adjustment needed
  0.05 to 0.25 sd                               Statistical adjustment needed
  > 0.25 sd                                     Does not meet standards
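A hypothetical helper applying the cutoffs in this table: it standardizes a treatment-comparison baseline difference by the pooled standard deviation and reports which band it falls in. The pooled-SD choice and the function name are assumptions for illustration; consult the WWC handbook for the exact effect size computations it prescribes.

```python
import math

def baseline_equivalence_band(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Classify a baseline difference using the 0.05/0.25 SD cutoffs from the slide."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    diff = abs(mean_t - mean_c) / pooled_sd
    if diff < 0.05:
        return diff, "No adjustment needed"
    if diff <= 0.25:
        return diff, "Statistical adjustment needed"
    return diff, "Does not meet standards"

print(baseline_equivalence_band(50.2, 49.1, 10.0, 9.5, 180, 175))
# -> (~0.11, 'Statistical adjustment needed')
```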

Other Criteria
- Studies must measure impacts on relevant outcomes.
- Outcome measures must be reliable.
- Outcomes must not be over-aligned with the intervention.
- There must be no confound between the treatment condition and one or more other factors that could also be the cause of differences in outcomes between the treatment and comparison groups.

Independent Evaluation—Program Requirement and Guidance

Evaluation Requirements
All i3 grantees MUST:
1. Conduct an independent project evaluation of the impact of the i3-supported practice (as implemented at the proposed level of scale) on a relevant outcome.
2. Cooperate with technical assistance provided by the Department or its contractors.
3. Provide an updated evaluation plan to the Department within 100 days of grant award.
4. Share the results of any evaluation broadly.
5. Share the data for Validation and Scale-up evaluations.
Note: The quality of an applicant's project evaluation is also a selection criterion.

Selection Criterion: Quality of Project Evaluation (Evidence of Effectiveness)
- Scale-up and Validation: "The extent to which the methods of evaluation will, if well implemented, produce evidence about the project's effectiveness that would meet the WWC Evidence Standards without reservations."
- Development: "The extent to which the methods of evaluation will, if well implemented, produce evidence about the project's effectiveness that would meet the WWC Evidence Standards with reservations."

Helpful Evaluation Resources
- Designing Quasi-Experiments: Meeting What Works Clearinghouse Standards Without Random Assignment
- Designing Strong Studies

Selection Criterion: Quality of Project Evaluation
- Clarity of Questions and Appropriateness of Methods (Scale-up, Validation, and Development): "The clarity and importance of the key questions to be addressed by the project evaluation, and the appropriateness of the methods for how each question will be addressed."
- Sufficient Resources (Scale-up, Validation, and Development): "The extent to which the proposed project plan includes sufficient resources to carry out the project evaluation effectively."

Selection Criterion: Quality of Project Evaluation (Cont'd)
- Studies Project at Proposed Level of Scale (Scale-up and Validation): "The extent to which the evaluation will study the project at the proposed level of scale, including, where appropriate, generating information about the potential differential effectiveness of the project in diverse settings and for diverse student population groups."
- Clear and Credible Analysis Plan (Scale-up and Validation): "The extent to which the evaluation plan includes a clear and credible analysis plan, including a proposed sample size and minimum detectable effect size that aligns with the expected project impact, and an analytic approach for addressing the research questions."
- Clearly Articulates Key Components and Outcomes (Scale-up and Validation): "The extent to which the evaluation plan clearly articulates the key components and outcomes of the project, as well as a measurable threshold for acceptable implementation."

Guidance on Qualities of High-Quality Evaluation Plans

Evaluation Goals
- All i3 grantees are required to estimate the impact of the i3-supported practice (as implemented at the proposed level of scale) on a relevant outcome, aligned with the i3 performance measures.
- Increase the strength of evidence on the impact or promise of i3-supported interventions.
- The i3 performance measures set the expectation for all i3 independent evaluations to provide high-quality implementation data and performance feedback.
- Other expectations vary by grant type.

Evaluation Goals: Scale-up & Validation Projects
- Expectation that evaluations will meet WWC evidence standards without reservations (e.g., a well-implemented randomized controlled trial).
- Provide information on key elements of the intervention for replication or testing in other settings.
- Reflect changes to the intervention or delivery model.
- Reflect the nature of sites served, implementation settings, and participants.
- Document costs/resource use.

Evaluation Goals: Development Projects
- Expectation that evaluations will meet WWC evidence standards with reservations (e.g., a randomized controlled trial or quasi-experimental design study).
- Provide information on key elements of the intervention for further development, replication, or testing in other settings.

Evaluation Plan Tips (1 of 6)
- Provide a fully developed plan in the application:
  - Maximizes the ability to adequately address key questions.
  - Minimizes the time spent at the beginning of the grant specifying details and negotiating revisions with ED.
- Key components of a high-quality evaluation plan include:
  - Logic model (What is the intervention? Who is it intended to serve? What outcomes is it expected to produce? How?);
  - Research questions (What do you want to learn? Who will the results pertain to?);
  - Proposed methods (What is the sampling frame? How will treatment and comparison groups be formed? What is the minimum detectable effect size? A rough calculation is sketched after this list. What data collection measures and methods will be used? How will the data be analyzed and reported?); and
  - Coherent links among the above components.
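Because the analysis plan is expected to state both a proposed sample size and a minimum detectable effect size (MDES), a back-of-the-envelope check that the two align can be useful. The sketch below uses the standard normal-approximation formula for a simple two-arm, individually randomized design; the function name and example numbers are illustrative assumptions, and a clustered (school- or classroom-randomized) design would need design-effect or multilevel adjustments.

```python
import math
from scipy.stats import norm

def mdes_individual_rct(n_total, prop_treated=0.5, r_squared=0.0,
                        alpha=0.05, power=0.80):
    """Approximate minimum detectable effect size (in SD units) for a
    two-arm, individually randomized trial:
    MDES = (z_{1-alpha/2} + z_{power}) * sqrt((1 - R^2) / (P * (1 - P) * N))."""
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return multiplier * math.sqrt(
        (1 - r_squared) / (prop_treated * (1 - prop_treated) * n_total))

# Illustration: 400 students, half treated, a pretest covariate explaining
# 50 percent of outcome variance -> MDES of roughly 0.20 SD.
print(round(mdes_individual_rct(400, r_squared=0.5), 2))
```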

Evaluation Plan Tips (2 of 6)
- Provide a well-developed logic model to guide the evaluation plan, including:
  - A clear description of the i3-supported intervention/scaling mechanism;
  - Key elements of the intervention in sufficient detail to guide replication;
  - The target population for the innovation;
  - The pathways through which the innovation is expected to affect intermediate and ultimate outcomes of interest; and
  - The expected mechanisms of change, e.g., how factors such as content, organization, duration, training, and other procedures are expected to lead to the intended outcomes.

Evaluation Plan Tips (3 of 6)
- Specify and provide clarity on the key research questions in the evaluation plan. This creates shared expectations with partners and ED about what will be learned.
- Specify and provide clarity on the aspects of the logic model the evaluation will examine.
- Propose a design (methods, sample, data collection, analysis) appropriate to address the key questions. Be explicit about the population to which the proposed sample generalizes.

Evaluation Plan Tips (4 of 6)
- Specify the proposed methods. For questions about program effectiveness (i.e., causal inference questions):
  - Rely on experimental methods when possible. NOTE: it is challenging to meet WWC evidence standards with quasi-experimental methods (as noted above).
  - Describe how treatment and control groups will be formed and monitored (a simple blocked random assignment sketch follows).
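For describing how treatment and control groups will be formed, a blocked (stratified) random assignment procedure is one common approach. The sketch below is a generic illustration with invented names, not a required i3 procedure; an actual evaluation would also document the random seed, blocking variables, consent procedures, and timing.

```python
import random

def blocked_random_assignment(units, block_of, seed=20150401):
    """Randomly assign units (e.g., schools) to treatment or control,
    roughly half-and-half within each block (e.g., district), using a
    fixed seed so the assignment is reproducible and documentable."""
    rng = random.Random(seed)
    assignment = {}
    blocks = {}
    for unit in units:
        blocks.setdefault(block_of[unit], []).append(unit)
    for members in blocks.values():
        rng.shuffle(members)
        cut = len(members) // 2          # any odd unit out goes to control
        for unit in members[:cut]:
            assignment[unit] = "treatment"
        for unit in members[cut:]:
            assignment[unit] = "control"
    return assignment

schools = ["A", "B", "C", "D", "E", "F"]
districts = {"A": 1, "B": 1, "C": 1, "D": 2, "E": 2, "F": 2}
print(blocked_random_assignment(schools, districts))
```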

Evaluation Plan Tips (5 of 6)
- Summarize data collection and analysis procedures:
  - Describe data sources, samples, data collection procedures and timeline, and analysis methods for each research question; and
  - Ensure critical components of the logic model are represented in the data collection plan.
- NOTE: ensure measurement of implementation fidelity.
- NOTE: comparative studies should document the experiences of comparison participants.

Evaluation Plan Tips (6 of 6)
- Ensure a good match among the logic model, research questions, proposed methods, and i3 goals:
  - The sample design should yield a sample representative of the relevant population group for the i3 project and support strong impact analyses (relevant to the intervention as implemented under i3);
  - The analysis plan should be appropriate for addressing the study questions and well matched to the sample design; and
  - The plan for reporting the findings should be consistent with the evaluation goals and design.

Other Important Resources
Note: These slides are intended as guidance only. Please refer to the official Notices in the Federal Register.
Investing in Innovation Fund website:
- Notices of Final Priorities, Requirements, and Selection Criteria (published in the Federal Register on 3/27/2013)
- Application Packages for each competition (includes the respective Notice Inviting Applications)
- Frequently Asked Questions
What Works Clearinghouse website:
- Reference Resources, Procedures and Standards Handbook
All questions about i3 may be sent to the i3 program email address.