Implementation and process evaluation: developing our approach
Ann Lendrum, University of Manchester
Neil Humphrey, University of Manchester
Gemma Moss, Institute of Education

Starting activity
What is your understanding of the terms 'process' and 'implementation'?
– Try to come up with working definitions for each in your group
– It may be helpful to think about how these terms could be applied and understood in relation to an intervention you are evaluating

Introduction
Understanding 'what works' is important, but it is equally important to know "why various programs do or do not work, for whom and under what conditions they work, what is needed to scale up proven programs, and what policy supports are needed to scale them up without losing their effectiveness" (Slavin, 2012, p. xv)

What do we mean by implementation and process evaluation?
Put very simply…
– Assessment of outcomes in trials answers the question: does it work?
– Implementation and process evaluation (IPE) helps us to understand how and why
IPE within trials explores "the implementation, receipt and setting of an intervention and help[s] in the interpretation of outcomes" (Oakley et al., 2006, p. 413) by:
– Studying how the intervention is implemented (including how and why this varies)
– Ascertaining the views of key participants (e.g. pupils, teachers) on critical issues (e.g. social validity)
– Distinguishing between different intervention components
– Investigating contextual factors that may influence the achievement of expected outcomes

What do we mean by implementation and process evaluation?
IPE can help to clarify whether assumptions in the intervention design about the causal links from intervention to impact work out in practice
By paying attention to the social processes involved in making the intervention happen, IPE can help us to:
– Know what happened in an intervention
– Establish the internal validity of the intervention and strengthen conclusions about its role in changing outcomes
– Understand the intervention better – how different elements fit together, how users interact, et cetera
– Provide ongoing feedback that can enhance subsequent delivery
– Advance knowledge on how best to replicate intervention effects in real world settings (Domitrovich and Greenberg, 2000)

A worked example
Secondary SEAL national evaluation (Humphrey, Lendrum & Wigelsworth, 2010)
– Outcomes strand: pre-test/post-test control group design (41 schools, c. 8,000 pupils); primary outcomes were social and emotional skills, behaviour, mental health
– IPE strand: longitudinal case studies in 9 SEAL schools; interviews and focus groups, observations, document analysis
Analysis of data from the outcomes strand indicated that SEAL had no measurable impact
Analysis of data from the IPE strand helped us to understand why this was the case:
– Issues with the programme theory (or lack thereof)
– Implementation failure
– Lack of understanding, resistance among staff
[Slide graphic: Theory – Implementation – Evaluation]
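The sketch below is not part of the original slides: it is a minimal, illustrative Python example of how the outcomes strand of a pre-test/post-test control group design like this might be analysed with an ANCOVA-style model. The data are simulated and the variable names (pre, post, group) are hypothetical; this is not the actual SEAL analysis.

```python
# Illustrative only: an ANCOVA-style analysis for a pre-test/post-test control
# group design, simulated to echo the outcomes strand described above.
# All data are generated here and the variable names (pre, post, group) are
# hypothetical; this is not the analysis from the actual SEAL evaluation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400                              # pupils per arm (toy size, not c. 8,000)
treatment_effect = 0.0               # a null effect, mirroring the SEAL finding

pre = rng.normal(50, 10, size=2 * n)                 # pre-test scores
group = np.repeat(["control", "intervention"], n)    # trial arm
post = (0.7 * pre
        + treatment_effect * (group == "intervention")
        + rng.normal(0, 8, size=2 * n))              # post-test scores

df = pd.DataFrame({"pre": pre, "post": post, "group": group})

# Post-test regressed on trial arm, adjusting for pre-test (ANCOVA).
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(model.summary().tables[1])
```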

Why is IPE important?
– Interventions are rarely (if ever!) delivered exactly as planned
– Variability in implementation has been consistently shown to predict variability in outcomes (see the sketch below)
– Interventions do not happen in a vacuum – so understanding context and the social processes within them is crucial
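As an aside on the second bullet, a toy sketch of what "variability in implementation predicts variability in outcomes" can look like at the school level. The figures and field names are invented for illustration; a real analysis would typically use multilevel models with pupils nested in schools rather than a school-level correlation.

```python
# Illustrative only: relating school-level implementation variability to
# school-level outcome gains. Data and field names are invented for the sketch.
import numpy as np

# Hypothetical per-school records: (proportion of sessions delivered, mean outcome gain)
schools = [
    (0.95, 4.1), (0.80, 3.2), (0.60, 1.9), (0.40, 0.8),
    (0.90, 3.8), (0.55, 1.5), (0.75, 2.6), (0.30, 0.4),
]
dosage = np.array([s[0] for s in schools])
gain = np.array([s[1] for s in schools])

# A simple correlation as a first look; not a substitute for a proper
# multilevel analysis of pupils nested in schools.
r = np.corrcoef(dosage, gain)[0, 1]
print(f"Correlation between dosage and outcome gain: {r:.2f}")
```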

Putting the I in IPE
Aspects of implementation:
– Fidelity/adherence
– Dosage
– Quality
– Participant responsiveness
– Programme differentiation
– Programme reach
– Adaptation
– Monitoring of comparison conditions
Factors affecting implementation:
– Preplanning and foundations
– Implementation support system
– Implementation environment
– Implementer factors
– Programme characteristics
See Durlak and DuPre (2008), Greenberg et al. (2005), Forman et al. (2009)
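To make the "aspects of implementation" concrete, here is one possible record structure for capturing several of them during a session observation. The field names and rating scales are assumptions made for this sketch, not taken from the slides or from any published instrument.

```python
# Illustrative only: a hypothetical record structure for capturing aspects of
# implementation during a lesson observation. Field names and scales are
# invented for this sketch.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class SessionObservation:
    school_id: str
    session: int
    fidelity: float            # proportion of prescribed components delivered (0-1)
    dosage_minutes: int        # time spent on the session
    quality: int               # observer rating, e.g. 1 (poor) to 5 (excellent)
    responsiveness: int        # participant engagement rating, 1 to 5
    adaptations: list[str] = field(default_factory=list)  # departures from the manual

observations = [
    SessionObservation("school_A", 1, 0.9, 45, 4, 4, []),
    SessionObservation("school_A", 2, 0.6, 30, 3, 2, ["skipped plenary"]),
]

print("Mean fidelity:", mean(o.fidelity for o in observations))
print("Mean dosage (min):", mean(o.dosage_minutes for o in observations))
```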

Researching IPE activity
Think about a trial you have recently completed. Did you have an IPE built into your trial?
If YES…
– What information did you collect and why?
– What did the data generated tell you about the intervention, the context in which it was being implemented, and the interaction between the two?
– One useful piece of information you were able to feed back to the intervention designers/implementers/participants/funders about the process of implementation
If NO…
– What were the main findings of the trial?
– What conclusions did you draw as a result?
– What could IPE have added?

Developing our approach to IPE
General approach – quantitative, qualitative or both?
– 62% quant, 21% qual, 17% both in health promotion research (Oakley, 2005)
Where to target resources?
– Intervention, context, and the interaction between the two
Which aspects to assess when examining implementation?
– In assessment of implementation the focus is predominantly on fidelity and dosage, but this can lead to a Type III error – drawing conclusions about an intervention's effectiveness when it was not implemented as intended (see the sketch below)
What you see and what people tell you
– Implementer self-report or independent observation?
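A minimal sketch of how IPE data might guard against a Type III error when interpreting a null result: flag schools whose implementation fell below pre-specified thresholds before drawing conclusions about effectiveness. The thresholds, school names and field names here are invented for the sketch.

```python
# Illustrative only: a minimal check against a "Type III error" (judging a
# programme ineffective when it was never adequately implemented). Thresholds
# and field names are invented for this sketch.
FIDELITY_THRESHOLD = 0.8   # minimum acceptable proportion of components delivered
DOSAGE_THRESHOLD = 0.75    # minimum acceptable proportion of sessions delivered

# Hypothetical per-school implementation summaries from the IPE strand
implementation = {
    "school_A": {"fidelity": 0.92, "dosage": 0.88},
    "school_B": {"fidelity": 0.55, "dosage": 0.40},
    "school_C": {"fidelity": 0.81, "dosage": 0.60},
}

adequate = {s for s, m in implementation.items()
            if m["fidelity"] >= FIDELITY_THRESHOLD and m["dosage"] >= DOSAGE_THRESHOLD}

print("Schools with adequate implementation:", sorted(adequate))
print("Schools to treat with caution when interpreting a null result:",
      sorted(set(implementation) - adequate))
```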

Benefits of IPE
For implementers:
– explains and illuminates findings from the impact evaluation, strengthening the conclusions impact evaluators draw
For intervention designers:
– clarifies whether assumptions embedded in the programme about "what works" are warranted
– indicates how the programme might need to adapt to unforeseen contextual factors that influence implementation
– enables designers to engage with programme participants' perspectives on the programme, giving participants a voice
For the EEF, helps to clarify:
– the relative effectiveness of different "theories of change"
– the necessary and sufficient conditions under which different programme logics might work best

Sources of further information and support
Some reading:
– Lendrum, A. & Humphrey, N. (2012). The importance of studying the implementation of school-based interventions. Oxford Review of Education, 38,
– Durlak, J.A. & DuPre, E.P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41,
– Kelly, B. & Perkins, D. (Eds.) (2012). Handbook of implementation science for psychology in education. Cambridge: CUP.
– Oakley, A. et al. (2006). Process evaluation in randomised controlled trials of complex interventions. British Medical Journal, 332,
Organisations:
– Global Implementation Initiative
– UK Implementation Network
Journals:
– Implementation Science
– Prevention Science