Moving from Development to Efficacy & Intervention Fidelity Topics
National Center for Special Education Research Grantee Meeting: June 28, 2010


Objectives

Development
- Fully develop the intervention
- Gather data on the feasibility of implementing the intervention
- Provide pilot data on the promise of the intervention for generating desired outcomes

Efficacy
- Estimate the strength or potency of the impact of the intervention
- Inform the degree to which the intervention can be feasibly or practically implemented
- Assess implementation fidelity

Grant application components
- Significance
- Research Plan
- Personnel
- Resources

Significance

Development
- Context for the intervention
- Detailed description of the intervention to be developed
- Theory of change
- Theoretical and empirical support
- Practical importance
- Rationale for why this grant should be funded

Efficacy
- Detailed description of the developed intervention
- Theory of change
- Justification for evaluating the intervention
- Practical importance
- Rationale for grant funding

Research Plan

Development
1. Sample and setting
2. Iterative development process
3. Feasibility of implementation
4. Pilot study
5. Measures

Efficacy
1. Research questions
2. Sample and setting
3. Detailed research design
4. Power analysis
5. Measures
6. Fidelity of implementation
7. Description of the comparison group
8. Mediating and moderating variables
9. Data analysis

Efficacy: Research Questions
- Clear and concise
- Stem from the Significance section
- Relate to the design and methodology
- Map onto the analysis section (e.g., if subgroups are of interest)

Efficacy: Sample Issues
- Show that the sample size is large enough to rule out the possibility that a finding of no impact is due to insufficient statistical power
- Provide support from education settings for what you are planning to do (e.g., RCTs)
- Provide sufficient assurance that you can recruit the necessary number of participants

Efficacy: Rigorous Research Design

RCT favored
- Unit of randomization and justification
- Procedures for assignment (see the sketch below)

Strong quasi-experiment
- Explain why an RCT is not possible
- Explain how the design reduces or models selection bias (e.g., RDD, IV, strong matching)
- Discuss the threats to internal validity that remain and how they will affect the conclusions drawn
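To make the assignment procedures concrete, here is a minimal sketch of blocked random assignment of schools within districts. The column names, block structure, and seed are illustrative assumptions, not part of the original presentation.

```python
# Hypothetical sketch: blocked random assignment of schools to condition.
# Assumes a table with columns "school_id" and "district" (names are illustrative).
import random

import pandas as pd

def assign_blocked(schools: pd.DataFrame, block_col: str = "district",
                   seed: int = 20100628) -> pd.DataFrame:
    """Randomly assign half the schools in each block to treatment."""
    rng = random.Random(seed)    # fixed seed so the assignment can be documented and reproduced
    out = []
    for _, block in schools.groupby(block_col):
        ids = list(block["school_id"])
        rng.shuffle(ids)
        cut = len(ids) // 2      # treated count; odd-sized blocks get the extra school in control
        treated = set(ids[:cut])
        block = block.assign(condition=["treatment" if s in treated else "control"
                                        for s in block["school_id"]])
        out.append(block)
    return pd.concat(out, ignore_index=True)

# Example usage with made-up data: eight schools in two districts.
schools = pd.DataFrame({"school_id": range(1, 9),
                        "district": ["A", "A", "A", "A", "B", "B", "B", "B"]})
print(assign_blocked(schools))
```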

Efficacy: Power Analysis
- Provide and justify all values used in calculating power
- Provide the method used to calculate power
  - Address clustering/nesting of the data
- Reviewers should be able to reproduce your results (a worked example follows below)
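As one worked illustration of a reproducible power calculation that accounts for clustering, the sketch below computes the minimum detectable effect size (MDES) for a two-level cluster-randomized design using the standard Bloom-style approximation. All parameter values are assumptions chosen for the example, not recommendations.

```python
# Minimal sketch of an MDES check for a two-level cluster-randomized trial,
# using the approximation popularized by Bloom (2006) and the Optimal Design
# software. Every numeric input below is an illustrative assumption.
from math import sqrt

from scipy.stats import t

def mdes_cluster_rct(J, n, rho, P=0.5, R2_1=0.0, R2_2=0.0, g=0,
                     alpha=0.05, power=0.80):
    """Minimum detectable effect size (in SD units) when clusters are randomized.

    J: number of clusters; n: students per cluster; rho: intraclass correlation;
    P: proportion of clusters treated; R2_1/R2_2: variance explained by
    covariates at the student/cluster level; g: number of cluster covariates.
    """
    df = J - g - 2
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    variance = (rho * (1 - R2_2) / (P * (1 - P) * J)
                + (1 - rho) * (1 - R2_1) / (P * (1 - P) * J * n))
    return multiplier * sqrt(variance)

# Example: 40 schools, 60 students each, ICC = .15, and a pretest that explains
# 50% of the school-level variance -- values a reviewer could plug in and reproduce.
print(round(mdes_cluster_rct(J=40, n=60, rho=0.15, R2_2=0.50, g=1), 3))
```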

Efficacy: Measures
- Assess feasibility
- Assess impact
  - Include measures of broad educational interest (both aligned and general)
  - Sensitive to changes in performance caused by the intervention
  - Assess proximal and distal outcomes
  - Include mediators and moderators
- Assess fidelity

Efficacy: Fidelity of Implementation
- The description of the intervention should include how it will be implemented
- Documenting and measuring implementation
  - Discuss how the core components will be captured
  - Reliability and validity of measures; coding of qualitative measures
  - How the fidelity study will be implemented
- Mediators and moderators of fidelity
- Use of fidelity data in the impact analysis (an illustrative index is sketched below)
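One deliberately simple way to turn observation ratings on an intervention's core components into a fidelity index that can later enter the impact analysis is sketched below. The component names, rating scale, and threshold are all hypothetical.

```python
# Illustrative only: aggregate classroom-level observation ratings on core
# components into an adherence index (0-1) and a high/low fidelity flag that
# could later be used descriptively or as a moderator in the impact analysis.
import pandas as pd

ratings = pd.DataFrame({
    "classroom_id": [101, 102, 103, 104],
    "component_pacing":   [3, 4, 2, 4],   # each component rated 1-4 by trained observers
    "component_feedback": [4, 4, 1, 3],
    "component_grouping": [2, 4, 2, 4],
})

components = [c for c in ratings.columns if c.startswith("component_")]
ratings["fidelity_index"] = ratings[components].mean(axis=1) / 4   # rescale to 0-1
ratings["high_fidelity"] = ratings["fidelity_index"] >= 0.75       # threshold is an assumption
print(ratings[["classroom_id", "fidelity_index", "high_fidelity"]])
```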

Efficacy: Describing the Comparison Group
- Determine whether the comparison group received a similar treatment
  - Overall approach and/or key components
- Conduct a fidelity study for the comparison group
  - Measures, reliability, and validity
  - Often "business as usual" (define it) or an alternative treatment
- Use this information to help explain impacts
- Address contamination if it is a possibility

Efficacy: Mediating and Moderating Variables
- Described in your theory of change
- Discuss how they will be measured
  - Describe the method and process for collecting and coding them
- Measure them in both the treatment and comparison groups (a moderation-test sketch follows below)
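As one possible way to operationalize a moderator analysis, the sketch below tests a treatment-by-moderator interaction in a simple OLS model on simulated data. The variable names and effect sizes are illustrative, and a full analysis would also account for clustering, as in the model sketched under Data Analysis.

```python
# Hedged sketch: testing a hypothesized moderator via a treatment x moderator
# interaction term (clustering ignored here for brevity).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),       # 0 = comparison, 1 = treatment
    "baseline_skill": rng.normal(0, 1, n),    # hypothesized moderator
})
# Simulated outcome with a modest interaction, for illustration only.
df["posttest"] = (0.3 * df["treatment"] + 0.5 * df["baseline_skill"]
                  + 0.2 * df["treatment"] * df["baseline_skill"]
                  + rng.normal(0, 1, n))

model = smf.ols("posttest ~ treatment * baseline_skill", data=df).fit()
print(model.summary().tables[1])   # the treatment:baseline_skill row tests moderation
```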

Efficacy: Detailed Data Analysis
- Provide a thorough description of the data analysis
  - Main analysis
  - Sub-analyses, including mediators and moderators (fidelity levels, subgroups)
  - Show the models
  - Address common statistical issues: clustering of units, missing data, attrition bias
  - Qualitative data analyses (often not described)
- Show that the analysis fits the design and addresses the research questions
- Explain how conclusions will be drawn from the results (a model sketch follows below)
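The sketch below shows one common way to specify the main impact model while addressing the clustering of students within schools: a two-level mixed model with a random school intercept, estimated with statsmodels. It is a generic illustration on simulated data, not the presenters' specification; the variable names are assumptions.

```python
# A minimal sketch of a two-level impact model that respects the nesting of
# students within schools. Simulated data and variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, n_students = 40, 30
school_id = np.repeat(np.arange(n_schools), n_students)
treatment = np.repeat(rng.integers(0, 2, n_schools), n_students)   # assigned at the school level
school_effect = np.repeat(rng.normal(0, 0.4, n_schools), n_students)
pretest = rng.normal(0, 1, n_schools * n_students)
posttest = 0.25 * treatment + 0.6 * pretest + school_effect + rng.normal(0, 1, len(pretest))
data = pd.DataFrame({"school_id": school_id, "treatment": treatment,
                     "pretest": pretest, "posttest": posttest})

# Random intercept for school handles the clustering; the pretest covariate improves precision.
result = smf.mixedlm("posttest ~ treatment + pretest", data, groups=data["school_id"]).fit()
print(result.summary())   # the "treatment" coefficient is the covariate-adjusted impact estimate
```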

Personnel

Development
- Developer plays a primary role
- Methodological and statistical expertise represented
- Expertise needed in working with schools (or other study settings) and with the population
- Content domain experts

Efficacy
- Developer role is prominent
- Methodological and statistical expertise prominent
- Expertise needed in working with schools (or other study settings) and with the population
- Content domain experts

Resources
In both Development and Efficacy grant applications, the resources must be sufficient to support the research activities and to provide access to the settings in which the research will be conducted.