
1 A bright IDEA? Intervention Delivery and Evaluation Analysis in implementation and process evaluation Professor Neil Humphrey and Dr. Ann Lendrum Manchester Institute of Education neil.humphrey@manchester.ac.uk @neilhumphreyUoM

2 Workshop overview
– Implementation and process evaluation
– A bright IDEA?
– A TIDieR approach to describing interventions
– How do we get there from here?
– Assessing usual practice
– Documenting implementation
– On treatment analysis
– An integrated approach

3 Shameless self-promotion slide

4 Introductory activity Think about an EEF trial in which you have been involved* –What was the intervention? –What was the trial design? –What were the main impact analysis findings? –What were the explanations for these findings? *If you haven’t been involved in an EEF trial that has reported yet, think about a non-EEF trial.

5 Implementation and process evaluation (IPE) If RCTs tell us ‘what works’, IPE helps us to understand how and why they work (or, in the case of null impact, why an intervention appears not to have worked) IPE refers to, “the generation and analysis of data to examine how an intervention is put into practice, how it operates to achieve its intended outcomes, and the factors that influence these processes” (Humphrey et al, 2016, p.3) “There is no such thing as a typical [implementation and] process evaluation” (Evans, Scourfield & Murphy, 2015, p.1) Some core principles for EEF IPE –Dimensions of and factors affecting implementation –Mixed methods, integrated approach –TIDieR framework –Logic model and/or theory of change –IDEA workshop –Documenting implementation

6 Implementation and process evaluation (IPE)
Pilot
– Establish social validity (acceptability, feasibility, utility). In EEF terms: is it feasible? Is there evidence of promise? Is it trial-ready?
– Generate evidence to assess intervention theory
All trials
– Documenting implementation
– Assessing usual practice
– Researching adaptations
– Differential intervention benefits
Efficacy trials
– On treatment analysis
– Logic model validation
Effectiveness trials
– Critical component analysis
– Contextual influences

7 A bright IDEA? Intervention Delivery and Evaluation Analysis (IDEA) workshop
The basic premise: close collaboration between the delivery partner and evaluation team will provide a strong foundation for a high quality IPE
Aim of the IDEA workshop is to enable an in-depth exploration of the intervention and develop the IPE, including:
– Co-construct and agree the TIDieR framework content for the intervention
– Interrogate the intervention logic model (or theory of change)
– Examine intervention delivery and training materials
– Explore existing literature and evidence about the intervention
– Discuss the intervention delivery history (within or beyond the EEF context)
(Slide graphics: ‘Coming up with a catchy acronym’ vs ‘everything else’; UoM IPE literature synthesis and evaluation guidance timeline)

8 A bright IDEA?
– Identify the most salient dimensions of and factors affecting implementation and consider when and how they may be assessed
– Identify the most salient contextual factors that are likely to influence implementation and outcomes (particularly in effectiveness trials) so that these can be assessed in the IPE
– Start scoping data collection tools to provide evidence to support/challenge (in pilots) or empirically validate (in efficacy trials) the intervention logic model (or theory of change) (e.g. change mechanism indicators)
– Clarify which (if any) subgroups are likely to experience differential intervention benefits (and why) and should therefore be included in the trial protocol

9 A bright IDEA?
– Generate a definition of ‘on treatment’ status, including the data needed to determine this (in efficacy trials)
– Build a detailed IPE data generation and analysis protocol (within the constraints of the budget agreed by the EEF grants committee) for publication on the EEF website
– Consider the availability of, and specify, naturally occurring data to support the above
– Agree any procedures for data sharing between the evaluation and delivery teams
If you have any ideas about other issues that might usefully be covered in an IDEA workshop, please write them down on the post-it notes provided

10 A TIDieR approach to describing interventions “The quality of description of interventions in publications… is remarkably poor” (Hoffmann et al, 2014, p.1) Without a complete description of an intervention: –The person/s responsible for delivery cannot reliably implement it –The recipient/s do not know exactly what they are ‘signing up for’ –Researchers cannot properly replicate or build upon existing findings –Researchers cannot adequately evaluate the implementation of the intervention –It is difficult, if not impossible, to understand how and why it works The ‘Template for Intervention Description and Replication’ (TIDieR) (Hoffmann et al, 2014) offers a useful tool that can improve the quality of how interventions are described and subsequently understood –TIDieR-PHP is currently being developed

11 A TIDieR approach to describing interventions Think about an intervention with which you are very familiar - can you provide a full description of it? TIDieR (adapted version) items: 1. Brief name, 2. Why? (theory/rationale), 3. Who (recipients), 4. What (materials), 5. What (procedures), 6. Who (provider), 7. How (format), 8. Where (location), 9. When and how much (dosage), 10. Tailoring (e.g. adaptation) ‘Just a Minute’ format – responses to the 10 items in the adapted TIDieR framework Questions for reflection –Why are these 10 items important? –Are there any fundamental ways of describing an intervention that TIDieR misses? What are these?
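One way to make the ten adapted TIDieR items operational is to hold an intervention description in a structured record and flag the items that have not yet been addressed. The sketch below is only an illustration of that idea: the class name, field names, and the Good Behaviour Game details are hypothetical, not part of TIDieR or any official tooling.

```python
from dataclasses import dataclass, fields

@dataclass
class TIDieRDescription:
    """One field per item in the adapted TIDieR framework (hypothetical field names)."""
    brief_name: str        # 1. Brief name
    why: str               # 2. Why? (theory/rationale)
    who_recipients: str    # 3. Who (recipients)
    what_materials: str    # 4. What (materials)
    what_procedures: str   # 5. What (procedures)
    who_provider: str      # 6. Who (provider)
    how_format: str        # 7. How (format)
    where_location: str    # 8. Where (location)
    when_how_much: str     # 9. When and how much (dosage)
    tailoring: str         # 10. Tailoring (e.g. adaptation)

    def missing_items(self):
        """Return the items left blank, i.e. gaps in the intervention description."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

# Invented, deliberately incomplete description: the check flags what still needs describing.
example = TIDieRDescription(
    brief_name="Good Behaviour Game",
    why="Reinforcement of shared classroom behaviour norms",
    who_recipients="Primary school pupils",
    what_materials="",
    what_procedures="",
    who_provider="Class teachers trained and supported by coaches",
    how_format="Whole-class game played during ordinary lessons",
    where_location="Mainstream primary classrooms",
    when_how_much="Several short games per week, increasing in length over the year",
    tailoring="",
)
print(example.missing_items())  # ['what_materials', 'what_procedures', 'tailoring']
```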

12 How do we get there from here? “Seasoned travellers would not set out on a cross country motor trip without having a destination in mind, at least some idea of how to get there, and, preferably, a detailed map to provide direction and guide progress along the way” (Stinchcomb, 2001, p.48) Two (related) approaches –Logic model –Theory of change

13 How do we get there from here?

14 Logic model: worked example Achievement for All (pilot version) logic model (Barlow et al, 2015)

15 How do we get there from here? Try to create a basic logic model for the intervention you described in the previous activity Questions for reflection –Which component(s) of the logic model was/were the most difficult to complete? Why? –Is logic modelling better suited to certain kinds of interventions than others? If so, what kinds of interventions and why?

16 Assessing usual practice Being able to define and document what constitutes ‘usual practice’ is a critical consideration for a number of reasons: –Establishing the counterfactual is a necessary step in estimating causal effects –Usual practice may change in response to randomisation to the control arm of a trial (e.g. compensatory rivalry, aka the ‘John Henry’ effect) –It is important to establish the level of programme differentiation in the intervention group (e.g. how distinctive is the intervention? What has the intervention displaced?) A fairly typical example of how usual practice is reported:

17 Assessing usual practice How might you develop a usual practice survey relating to your chosen intervention? –What is the appropriate level at which to survey (e.g. school or classroom/teacher)? –What level of ‘granularity’ is required? (e.g. implementation status of named interventions in the same or related areas, and/or behaviours, strategies and approaches, curriculum content, and use of resources similar to aspects of the intervention being evaluated?) –What sources of information (e.g. existing research, evaluation team knowledge, delivery partner knowledge) are likely to be most useful in developing a usual practice survey that is fit for purpose?

18 Documenting implementation
“Accurate interpretation of outcomes depends on knowing what aspects of the intervention were delivered and how well they were conducted” (Durlak & DuPre, 2008, p.328)
Eight generally agreed-upon dimensions of implementation:
1. Fidelity/adherence – the extent to which implementers (e.g. teachers) adhere to the intended treatment model
2. Dosage – how much of the intended intervention has been delivered and/or received
3. Quality – how well different components of an intervention are delivered
4. Reach – the rate and scope of participation
5. Responsiveness – the degree to which participants engage with the intervention
6. Programme differentiation – the extent to which intervention activities can be distinguished from other, existing practice
7. Monitoring of control/comparison groups (in a trial context) – determination of the ‘counterfactual’ (i.e. that which is taking place in the absence of the intervention)
Both 6 and 7 require a clear understanding of ‘usual practice’
8. Adaptation – the nature and extent of changes made to the intervention
Focusing solely on a single implementation dimension (e.g. fidelity) limits the utility of IPE and can lead to a Type III error (the inaccurate attribution of cause)

19 Documenting implementation “Even if the concept of implementation is not new, the idea of developing ways of measuring it certainly is” (Ogden & Fixsen, 2014, p.8) How might you go about documenting implementation of your chosen intervention? –How might you go about assessing each of the 8 dimensions of implementation? –If you are not in a position to collect data pertaining to all dimensions, are there some that you would want to prioritise? Which and why? –What types (quantitative, qualitative), sources (e.g. researcher, implementer, delivery partner) and methods of data generation (e.g. observation, survey, interview) could be utilised? –What are the advantages and limitations of these types, sources and methods? Descriptive, comparative, relational uses of implementation data
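To make the descriptive and comparative uses of implementation data concrete, here is a minimal sketch that aggregates observation records into school-level summaries for a few of the eight dimensions and flags schools falling below a chosen fidelity benchmark. All column names, values and the 70% benchmark are invented for illustration; in a real trial they would come from the observation schedule and surveys agreed with the delivery team.

```python
import pandas as pd

# Hypothetical observation records: one row per observed session.
obs = pd.DataFrame({
    "school":         ["A", "A", "B", "B", "C", "C"],
    "fidelity_pct":   [90, 85, 60, 55, 75, 80],  # % of intended components delivered
    "dosage_minutes": [30, 25, 15, 20, 25, 30],  # length of session actually delivered
    "quality_1to5":   [4, 5, 2, 3, 4, 4],        # observer rating of delivery quality
    "responsiveness": [4, 4, 2, 2, 3, 4],        # pupil engagement rating (1-5)
})

# Descriptive use: summarise each dimension per school.
summary = obs.groupby("school").mean(numeric_only=True).round(1)
print(summary)

# Comparative use: flag schools whose mean fidelity falls below an illustrative benchmark.
print(summary[summary["fidelity_pct"] < 70])
```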

20 Documenting implementation: worked example (Good Behaviour Game) Development of a structured observation schedule to capture procedural fidelity, quality, reach and responsiveness Sources of information and inspiration: –American Institutes for Research GBG coach fidelity checklist –Literature on assessment of implementation –Other studies that included assessment of GBG implementation –Our previous experience of assessing implementation of school-based interventions (e.g. PATHS; Humphrey et al, 2015) –Video footage of GBG being implemented (recorded as part of the UK GBG pilot) High inter-rater reliability established using Cohen’s kappa (for nominal items – 0.95) and the intraclass correlation coefficient (for ordinal items – 0.96) 10% of game observations moderated by AL to guard against drift over time
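The reliability statistics reported above can be computed from paired observer ratings with standard routines: Cohen's kappa for nominal checklist items and an intraclass correlation coefficient for ordinal ratings. The snippet below is a generic sketch using made-up ratings rather than the actual GBG observation data, and assumes the scikit-learn and pingouin packages are available; statsmodels or R would serve equally well.

```python
import pandas as pd
import pingouin as pg                           # intraclass correlation (ordinal items)
from sklearn.metrics import cohen_kappa_score   # Cohen's kappa (nominal items)

# Hypothetical paired ratings from two observers coding the same eight games.
rater1_nominal = [1, 1, 0, 1, 0, 1, 1, 0]   # e.g. "teacher displayed the rules" yes/no
rater2_nominal = [1, 1, 0, 1, 0, 1, 0, 0]
print("Cohen's kappa:", round(cohen_kappa_score(rater1_nominal, rater2_nominal), 2))

# Ordinal delivery-quality ratings (1-5) for the same eight games, in long format.
ratings = pd.DataFrame({
    "game":   list(range(8)) * 2,
    "rater":  ["R1"] * 8 + ["R2"] * 8,
    "rating": [4, 5, 3, 4, 2, 5, 4, 3,
               4, 5, 3, 4, 3, 5, 4, 3],
})
icc = pg.intraclass_corr(data=ratings, targets="game", raters="rater", ratings="rating")
print(icc[["Type", "ICC"]])  # report the ICC form that matches the rating design
```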

21 Documenting implementation: worked example (Good Behaviour Game)

22 ‘On treatment’ analysis One approach to establishing the relationship between implementation variability and intervention outcomes Sometimes referred to as ‘per protocol’ or ‘adherence to protocol’ analysis Use of an implementation threshold to distinguish between participants known to have received the intervention as planned and those who have not Think about your chosen intervention and approach to documenting implementation –What data are needed in order to determine whether implementation in a given class or school could be classified as ‘on treatment’? –What is the appropriate threshold for on-treatment status?
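As a minimal sketch of how an agreed definition might then be applied: classify each intervention school as ‘on treatment’ when its recorded dosage meets the threshold, and compare that subgroup with the control group. The threshold, variable names and figures below are all hypothetical, and in practice the comparison would sit within the trial's main statistical model (with intention-to-treat analysis remaining primary) rather than being a raw difference in means.

```python
import pandas as pd

# Hypothetical school-level trial data.
df = pd.DataFrame({
    "school":         list(range(1, 9)),
    "arm":            ["intervention"] * 4 + ["control"] * 4,
    "games_per_week": [3.2, 1.0, 2.8, 0.5, None, None, None, None],  # dosage, intervention arm only
    "outcome":        [104.1, 99.8, 103.5, 100.2, 100.5, 99.0, 101.1, 98.7],
})

DOSAGE_THRESHOLD = 2.0  # illustrative 'on treatment' definition agreed at the IDEA workshop

# A school is 'on treatment' if allocated to the intervention and it met the dosage threshold.
df["on_treatment"] = (df["arm"] == "intervention") & (df["games_per_week"] >= DOSAGE_THRESHOLD)

on_treatment_mean = df.loc[df["on_treatment"], "outcome"].mean()
control_mean = df.loc[df["arm"] == "control", "outcome"].mean()
print(f"On-treatment mean: {on_treatment_mean:.1f}  Control mean: {control_mean:.1f}")
```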

23 An integrated approach “At the heart of understanding how to develop [and evaluate] interventions is the realization that no one research method in isolation will suffice” (Borglin, 2015, p.29) The RCT and IPE strands of an evaluation should not be seen as disparate, and neither should the quantitative and qualitative methods of data generation and analysis Some mixed-methods design considerations for IPE –Level of priority afforded to the different strands (e.g. equal, unbalanced) –The amount of interaction between them (e.g. independent, interactive) –The protocol for mixing them (e.g. during development, data collection, analysis, or interpretation) Some design options –Convergent parallel –Explanatory sequential –Exploratory sequential –Embedded and/or multi-phase

24 An integrated approach: worked example (AfA effectiveness trial)
(Slide diagram linking the quantitative/RCT and qualitative/IPE strands: baseline quantitative data; selection of case study sites; qualitative implementation data; development of a quantitative implementation survey; analysis of the relationship between implementation and outcomes; illustrative case study data)
An integrated approach begins at the design stage:
– Development of research questions
– Working with the delivery team (e.g. in the IDEA workshop)
– Working with the EEF: what is ‘fixed’ and what is ‘flexible’? What, who, when, how, why?

25 Shameless self-promotion slide (redux)

26 That’s all folks!

