Securing External Federal Funding Janice F. Almasi, Ph.D. Carol Lee Robertson Endowed Professor of Literacy University of Kentucky

Institute of Education Sciences: 2 grant competitions per year

Current Funding Opportunities: 14 long-term programs of research

Be Informed: subscribe to Newsflash at ies.ed.gov/newsflash

IES Research Goals
- Goal 1: Identification. Identifying programs and practices associated with better educational outcomes (secondary data analysis).
- Goal 2: Development Projects. Developing educational interventions.
- Goal 3: Efficacy and Replication Projects. Determining whether fully developed interventions are effective.
- Goal 4: Scale-Up.
- Goal 5: Measurement Projects.

Prior to the Peer Review Meeting
- Triage identifies the top 25 applications.
- Each application is assigned to at least 2 reviewers.
- Reviewers check for conflicts of interest (COIs).
- Reviewers read and rate about 8 applications each.

Review Criteria
- Significance
- Research Plan
- Personnel
- Resources

Review Criterion Ratings
Criterion scores range from Poor (1) to Excellent: lower scores indicate more weaknesses than strengths, mid-range scores a balance of strengths and weaknesses, and higher scores more strengths than weaknesses.

What Reviewers Look For, and Where Applications Tend to Be Weak

Significance
Goal 1: the theoretical and empirical rationale for the study, and the practical importance of the intervention (e.g., program, practice) that will be examined.
Goals 2 and 3: describe (a) the intervention (e.g., features, components) and the logic model for the intervention, (b) the theoretical and empirical support for the intervention, and (c) the practical importance of the intervention.

Significance in Goals 2 and 3: Context for Proposed Interventions
- Provide context for the proposed intervention by including data on, or reviewing research describing, the attributes of typical existing practices.
- Identify shortcomings of current practice and how they contribute to the rationale for the proposed intervention.
- Provide context for understanding how much of a change the proposed intervention is intended to achieve.

Significance in Goals 2 and 3: Intervention, Theory of Change, and Empirical/Theoretical Rationale
- Clearly describe the intervention.
- Clearly describe the theory of change for the intervention. How do the features or components of the intervention relate to each other temporally (or operationally), pedagogically, and theoretically (e.g., why A leads to B)?
- Provide a strong theoretical and empirical justification for the design and sequencing of the features or components of the intervention. This enables evaluation of:
  - the relation between the intervention and its theoretical and empirical foundation (e.g., is the proposed intervention a reasonable operationalization of the theory?), and
  - the relation between the intervention and the outcome measures (e.g., do the proposed measures tap the constructs that the intervention is intended to address?).
- Include a logic model.

Significance in Goals 2 and 3: Practical Importance of the Intervention
- When the proposed intervention is fully developed, will it have the potential to improve student outcomes in educationally meaningful increments if implemented over the course of a semester or school year?
- Would the proposed intervention be both affordable for and easily implemented by schools (e.g., not involve major adjustments to normal school schedules)?

Research Plan: Goal 2
Sample: the samples and settings used to assess the feasibility of the intervention and to provide pilot data on the promise of the intervention.
Iterative Development Process (implementation, observation, revision):
- How do you define "operating as intended"?
- What data will be gathered to determine how the intervention is operating?
- How will the data gathered be used to revise the intervention?
- What criteria will be used to determine whether the intervention operates as intended?

Research Plan: Goal 2
Feasibility of Implementation:
- The goal is a fully developed intervention.
- Provide data that address the feasibility of implementing the intervention in a small sample of authentic education settings, and its promise in terms of outcomes.
Pilot Study:
- Pilot data on outcome measures should be progressing in the right direction.
- Pilot data should demonstrate that implementation of the intervention is associated with behaviors consistent with the theory of change.
- Use no more than 30% of funds for the pilot study.
- Pilot data should not be a test of efficacy.

Research Plan: Goal 2
Measures:
- Clearly describe procedures for gathering data to refine and revise the intervention and to provide insight into the feasibility and usability of the proposed intervention. What needs to be observed? How will observations be gathered?
- Clearly describe the measures that will be used (and their reliability and validity, if appropriate).

Research Plan: Goal 3
Research Questions: pose clear, concise hypotheses or research questions.
Sample:
- Define the sample to be selected.
- Define sampling procedures (including justification for inclusion and exclusion).
- Describe strategies to be used to reduce attrition.

Research Plan: Goal 3
Research Design:
- Provide detail! How will threats to internal and external validity be addressed?
- Studies using random assignment are preferred where feasible.
- What is the unit of randomization, and what procedures will be used to make assignments to conditions?
Power:
- What power is needed to detect a reasonably expected and minimally important effect? How was the effect size calculated?
- If clusters are randomly assigned to treatment conditions, be sure to include the intraclass correlation and the anticipated effect size in the power analysis (see the sketch below).
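To make the cluster-randomization point concrete, here is a minimal, hypothetical sketch (in Python, using scipy) of a minimum detectable effect size (MDES) calculation for a two-level cluster-randomized design. It uses a common approximation with no covariate adjustment; the function name and the example values (40 schools, 25 students per school, ICC of 0.15) are illustrative assumptions, not IES requirements.

```python
# Hypothetical sketch: minimum detectable effect size (MDES) for a
# two-level cluster-randomized trial, using the common approximation
#   MDES ~= M_(J-2) * sqrt( rho/(P(1-P)J) + (1-rho)/(P(1-P)J*n) )
# where J = number of clusters, n = students per cluster, rho = intraclass
# correlation, and P = proportion of clusters assigned to treatment.
import math
from scipy.stats import t

def mdes_cluster_rct(n_clusters, n_per_cluster, icc,
                     alpha=0.05, power=0.80, p_treat=0.5):
    df = n_clusters - 2                      # degrees of freedom for the cluster-level test
    multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)
    variance = (icc / (p_treat * (1 - p_treat) * n_clusters)
                + (1 - icc) / (p_treat * (1 - p_treat) * n_clusters * n_per_cluster))
    return multiplier * math.sqrt(variance)

# Illustrative values only: 40 schools, 25 students per school, ICC = 0.15
print(round(mdes_cluster_rct(40, 25, 0.15), 2))   # roughly 0.39 SD
```

Reviewers will expect the ICC and the target effect size to be justified from prior literature or pilot data rather than from default values like these.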

Research Plan: Goal 3
Measures:
- Justify the appropriateness of the measures. Are the measures of practical interest to educators and not overly aligned with the intervention?
- Include reliability and validity information.
Fidelity of Implementation:
- How will implementation be documented and measured? How will factors associated with fidelity be identified and assessed?
- How will fidelity data be incorporated into analyses of impact? (A sketch of one way to summarize such data follows.)
- How do conditions in the school setting affect fidelity of implementation?
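As one illustration of how fidelity-of-implementation data might be summarized before being incorporated into impact analyses, here is a hypothetical sketch; the file name, the binary component columns, and the classroom_id grouping variable are all assumptions made for the example.

```python
# Hypothetical sketch: turning a classroom observation checklist into a
# fidelity index (proportion of intervention components observed),
# aggregated to one score per classroom.
import pandas as pd

obs = pd.read_csv("fidelity_observations.csv")   # hypothetical file: one row per observed session
components = ["modeling_observed", "guided_practice_observed", "feedback_observed"]  # 0/1 indicators

obs["fidelity"] = obs[components].mean(axis=1)   # proportion of components present in a session
fidelity_by_class = obs.groupby("classroom_id")["fidelity"].mean()
print(fidelity_by_class.describe())              # distribution of classroom-level fidelity
```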

Research Plan: Goal 3
Comparison Group:
- How does the comparison group compare to the intervention group on critical features of the intervention?
- Using a "business-as-usual" comparison is acceptable, but explain why it is acceptable.
- How will contamination be avoided?
Mediating and Moderating Variables:
- Observational, survey, or qualitative methods are encouraged to help identify factors that may explain the effect, or lack of effect, of the intervention.
Data Analysis:
- Quantitative: specify statistical procedures and include formulas where appropriate (a sketch of one common approach follows this list).
- Qualitative: identify the specific methods used to index, summarize, and interpret data.
- The relation between hypotheses, measures, and independent and dependent variables should be clear.
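For the quantitative analysis bullet above, a multilevel (mixed) model is one common way to estimate impacts when students are nested in schools. The sketch below is an illustration under stated assumptions, not a required procedure: the column names (posttest, pretest, treatment, school_id) and the file name are hypothetical, and it uses the statsmodels mixed linear model with a random intercept for school.

```python
# Hypothetical sketch: two-level impact model with students nested in schools.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_outcomes.csv")   # hypothetical analysis file

# Treatment effect estimated with a school random intercept, adjusting for pretest.
model = smf.mixedlm("posttest ~ treatment + pretest", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())                    # the 'treatment' coefficient is the adjusted impact estimate
```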

Personnel
- What role will each individual have in the project?
- What qualifications, training, and experience do key personnel possess?
- How will those qualifications be brought to bear on the research?
- Are key personnel dedicating sufficient time to competently implement the proposed research?

Resources
- Are the resources adequate to support the proposed activities in terms of facilities, equipment, supplies, and institutional support for managing/directing grants and supporting scholarship?
- Have partners shown their support for implementing the project?

Scientist Reviewer Critiques include:
- a brief description of the overall application;
- the application's key strengths and weaknesses in each of the evaluation areas, with critical, evaluative comments; and
- an integrated summary of the overall assessment of the application, including its main strengths and weaknesses.

Overall Score

Peer Review Meeting Process
1. Assigned Scientist Reviewers share overall scores: Reviewer 1 summarizes the application and its strengths and weaknesses in each evaluation area; Reviewer 2 elaborates on areas of agreement or disagreement.
2. Full Panel: discusses the application, asks questions, and offers additional critique; discusses the budget.
3. Notetaker: summarizes the discussion orally and in writing.

Peer Review Meeting Process (continued)
4. Assigned Scientist Reviewers: adjust initial recommended criteria scores and overall scores.
5. Full Panel: privately assigns criteria scores and overall scores.
6. Assigned Scientist Reviewers: may edit or revise their original written critiques based on the panel discussion.