Evidence-based Practice v. Practice-based Evidence


1 Evidence-based Practice v. Practice-based Evidence
With Angela Beggan @AngelaBeggan #LSA2014

2 Introduction
Physical activity can be both an outcome and a mechanism within community health improvement. Interventions are the main mode of delivery for such mechanisms, which are largely concerned with lifestyle and behavioural change supported by empirically derived evidence. The design and implementation of complex, real-world interventions are a pragmatic mix of research- and practice-based evidence.

3 Mismatch?
“Change is conceptualized as a linear, deterministic process… Consistent with this framework, the associated statistical models have almost exclusively assumed a linear relationship between psychosocial predictors and behavior (change); i.e., greater increases in knowledge, attitudes and intentions will lead to greater change in behavior” (Resnicow and Vaughan, 2006, p. 2).

“A recognition of emergence means that we cannot understand things simply in terms of their components, the essence of the reductionist approach that underpins positivist science. Instead, we have to think about parts and wholes and we must recognize that causality does not run in any one direction” (Byrne, 2013, p. 218).

4 Interventions are Natural Experiments
“When the subject of new intervention development is one with such epidemiologic urgency as obesity and with such a paucity of evidence-based practices, practitioners and communities cannot sit idly while science develops refined interventions. Action is a political, economic, and public health necessity, and such actions must be taken in the absence of absolute confidence in their efficacy, much less their effectiveness in the particular communities, settings, and populations” (Sallis and Green, 2012, p. S411).

5 Translation
How can research and practice communicate better?
RE-AIM Framework (Glasgow, Vogt, and Boles, 1999)

6 Evaluation Application
Dimension (outcome objectives) and measures:

Reach (1, 7): representativeness of participants to intended target groups; number of participants accessing AC activities in each area; rate of penetration into available target groups in each area.

Effectiveness (1, 2, 6): self-report survey measures on increases in physical activity and in awareness of health benefits; health-related quality of life (SF-12v2); 7-day recall of physical activity.

Adoption (3, 4): representativeness of partners engaging the services; uptake of partner organisations in each area.

Implementation (1, 3, 5, 8, 9): interviews with partners; interviews with instructors; reporting on processes; document review.

Maintenance (5, 8): rate of attrition; reporting on volunteer development.

Context: a community intervention targeting hard-to-reach groups as defined by stakeholders, with multiple outcome objectives across several theoretical and practice-based domains. Evaluation was conducted post-implementation only (retrospective).

7 Selected Findings

Reach
Practice implications: specific barriers were not targeted.
Research implications: why did it work anyway in some groups and locations?

Effectiveness
Practice implications: specific behavioural mechanisms were not employed.
Research implications: how can evidence be more readily communicated and applied in practice (expediency)?

Adoption
Practice implications: partners have good working links, leading to good uptake.
Research implications: potential for wide dissemination to practice through partners.

Implementation
Practice implications: varying levels of buy-in affected outcomes in some areas.
Research implications: competing agendas affect programme fidelity.

Maintenance
Practice implications: failure to engage instructors affected sustainability.
Research implications: why didn’t participants engage better with the activities and training opportunities?

8 Implications
Theory-based design and evaluation: Realist Evaluation; Theories of Change; Complexity Theories.
Participatory methods involving practice, academics and participants.
We need to think about interventions differently: instead of empirically derived linear causal relationships, collaborative, insightful development with pragmatic measures of effectiveness.

9 Co-Production
“As such action rolls out, the opportunities to evaluate its development, application and effects become the stuff of practice-based evidence that will contribute to and make more robust the long-awaited evidence-based practice” (Sallis and Green, 2012, p. S411).

Questions/Comments?

10 References
Byrne, D (2013). Evaluating complex social interventions in a complex world. Evaluation 19(3).
Glasgow, R; Vogt, T; Boles, S (1999). Evaluating the public health impact of health promotion interventions: the RE-AIM framework. American Journal of Public Health 89(9).
Resnicow, K and Vaughan, R (2006). A chaotic view of behavior change: a quantum leap for health promotion. International Journal of Behavioral Nutrition and Physical Activity 3(1): 25.
Sallis, J and Green, L (2012). Active living by design and its evaluation: contributions to science. American Journal of Preventive Medicine 43(5): S410-S412.

