1 Understanding RE-AIM
Barbara Resnick, PhD, CRNP, FAAN, FAANP

2 RE-AIM
The Reach, Effectiveness, Adoption, Implementation, and Maintenance framework, now 14 years old: the first RE-AIM publication appeared in the American Journal of Public Health in 1999. The model grew out of the need for improved reporting on key issues in implementation and real-world research, driven by work and teams within the BCC.

3 Definition of RE-AIM Components

Reach: the proportion and representativeness of the population targeted by the innovation.
- Indicators of Achievement: recruitment of settings and participation of champions throughout the study; number of residents potentially reached.
- Sources of Evidence: site recruitment rates and class participation; total number of residents potentially impacted by function focused care.

Efficacy: the extent to which the intervention improved outcomes of participants.
- Indicators of Achievement: evidence of change in the environment, policies, and service plans to optimize function and physical activity; decreased falls and hospitalizations of residents.
- Sources of Evidence: measurement of the environment, policies, and service plans; measurement of resident falls and hospital transfers in the month prior to and in the last month of the 12-month study period.

Adoption: the proportion of organizations or settings that adopt the innovation.
- Indicators of Achievement: setting willingness to provide a champion and work with the function focused care nurse on implementation activities; evidence of integration of function focused care.
- Sources of Evidence: setting identification of a champion, adherence of the champion to meetings, and participation in function focused care activities; evidence of changes in environment, policies, and service plans.

4 Definition of RE-AIM Components

Implementation: evidence that the intervention was implemented as intended.
- Indicators of Achievement: treatment fidelity based on delivery and receipt.
- Sources of Evidence: delivery was based on evidence that all champions received the initial face-to-face training; that champions were provided with the resources to teach and raise awareness of function focused care among their staff, residents, and families; that the environment and policy assessments were completed and appropriate changes discussed; and that champions received the weekly tidbits. Receipt was based on evidence that the champion used the Nasco gift certificate.

Maintenance: long-term adherence to the intervention and transition of the intervention into routine care.
- Indicators of Achievement: evidence of adherence to function focused care activities at the end of the 12-month study period.
- Sources of Evidence: evidence of changes in the environment and policies within settings that better reflect function focused care.

5 Challenges/Recommendations for Reach

Challenges:
- Not reporting characteristics of participants compared with nonparticipants.
- Not reporting on recruitment methods and implicit “screening/selection.”
- Not reporting on a “valid denominator.”

Recommendation: report on any criteria you can, even if it is just one characteristic (a minimal sketch of the denominator arithmetic follows below).
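To make the "valid denominator" point concrete, here is a minimal Python sketch; all counts and the age characteristic are hypothetical illustrations, not figures from the study.

```python
# Minimal sketch of a "valid denominator" reach calculation.
# All counts and characteristics below are hypothetical.

eligible = 5000   # everyone in the sampling frame who could have been reached
enrolled = 1250   # those who actually participated

reach = enrolled / eligible
print(f"Reach: {enrolled}/{eligible} = {reach:.0%}")  # 25%

# Representativeness: compare participants with nonparticipants on any
# available criterion, even just one characteristic (e.g., mean age).
mean_age_participants = 83.1     # hypothetical
mean_age_nonparticipants = 85.4  # hypothetical
print(f"Mean age difference: {mean_age_participants - mean_age_nonparticipants:+.1f} years")
```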

6 Challenges/Recommendations for Efficacy

Challenges:
- Not reporting short-term loss to follow-up, or differential rates by participant characteristic or treatment group.
- Not reporting on broader effects (e.g., quality of life or unintended consequences).

Recommendations:
- Measure the primary outcome relative to a public health goal, using national guidelines (Healthy People 2020).
- Reporting of short-term loss to follow-up can easily be added to CONSORT figures.

7 Adoption

Challenges:
- Not reporting the percent of settings approached that participated, based on a valid denominator.
- Not reporting setting recruitment details and exclusion criteria (e.g., only picking optimal sites).

Recommendations:
- A valid denominator at the setting level can be challenging; use any information you can.
- At minimum, report the sampling frame from which your settings were selected and the percentage that participated (see the sketch below).
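As a sketch of setting-level adoption arithmetic, the snippet below uses the recruitment counts reported later on the Results slide (300 invited, 99 volunteered, 38 trained); the point is that the percentage you report depends on which denominator you choose.

```python
# Minimal sketch of setting-level adoption rates. Counts are taken from
# the Results slide; the rate depends on the denominator chosen.

invited = 300      # sampling frame: settings approached
volunteered = 99   # settings that agreed to participate
trained = 38       # settings attending the initial face-to-face training

print(f"Volunteered / invited:  {volunteered / invited:.0%}")    # 33%
print(f"Trained / volunteered:  {trained / volunteered:.0%}")    # 38%
print(f"Trained / invited:      {trained / invited:.0%}")        # 13%
```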

8 Implementation

Challenges:
- Not reporting adaptations made to interventions during the study.
- Not reporting on costs and resources required.
- Not reporting differences in implementation or outcomes by different staff.

Recommendation: report any changes that made the intervention easier to deliver or to fit into real-world settings. Remember, this is not the same as fidelity.

9 Maintenance

Challenge: not reporting results of long-term broader outcomes, such as quality of life or unintended outcomes.

Recommendation: reporting broader outcomes provides a context in which to evaluate the long-term primary outcome results.

10 Results
Reach: of 300 settings invited, 99 sites (33%) volunteered and 38 attended the initial face-to-face training (28% of sites). The intervention potentially impacted 3,676 older adults.

11 Efficacy: Descriptive Outcomes at Baseline and Follow-up

Variable                          Baseline        Follow-up      F (p)
Falls                             12.00 (16.21)    9.33 (16.40)   4.10 (.05)*
Hospitalizations                   2.60 (2.61)     2.27 (4.76)     .11 (.74)
Emergency Room Visits              1.69 (1.25)     1.92 (2.43)     .09 (.76)
Policy                             4.15 (3.70)    10.79 (4.07)   78.22 (.00)*
Environment: Positive Subscale      .12 (.33)      .03 (.17)     22.34 (.001)*
Environment: Negative Subscale      .12 (.33)                     3.19 (.08)

Values are mean (SD); * indicates a statistically significant change.
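The F (p) column compares baseline and follow-up means. As a hedged sketch of one way such a comparison could be computed (the study's actual model may differ), the Python below runs a paired comparison on simulated site-level fall counts; with only two time points, the repeated-measures F equals the squared paired t statistic.

```python
# Sketch of a baseline vs. follow-up comparison on simulated site-level
# data; this illustrates the analysis style, not the study's actual model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=0)
n_sites = 79  # sites implementing all components, per the Results slide

falls_baseline = rng.poisson(lam=12, size=n_sites).astype(float)  # hypothetical
falls_followup = rng.poisson(lam=9, size=n_sites).astype(float)   # hypothetical

t_stat, p_value = stats.ttest_rel(falls_baseline, falls_followup)
print(f"F = {t_stat**2:.2f}, p = {p_value:.3f}")  # F = t^2 for two time points
```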

12 RE-AIM Results
Adoption: 21 settings (21%) did not participate.
Implementation: all components were implemented in 79 sites; costs were not considered.
Maintenance: positive qualitative findings through 12 months, with enduring environment and policy changes.

13 Challenges/Opportunities Identified With Regard to Dissemination and Implementation Work
- Have to be flexible and meet the needs of each setting (e.g., we revised materials for settings and wrote policies).
- Utilize measures that are practical and real-world (e.g., falls and hospitalizations versus actigraphy).
- Have to have champion and site buy-in.

