Designing a Survey Instrument to Evaluate Implementation of Complex Health Interventions: Lessons Learned
Eunice Chong, Adrienne Alayli-Goebbels, Lori Webel-Edgar


Designing a Survey Instrument to Evaluate Implementation of Complex Health Interventions: Lessons Learned
Eunice Chong, Adrienne Alayli-Goebbels, Lori Webel-Edgar, Jennifer Robertson, Helen Cerigo, Anne Philipneri, Carly Heung, Quynh Huynh, Heather Manson
The Canadian Public Health Association Conference, May 28th, 2014

PublicHealthOntario.ca

Why do we need to study implementation?
Implementation is a complex process involving multiple steps, at multiple levels, by multiple people. Only a small fraction of evidence-based interventions are actually implemented in the real world.
Implementation research helps:
- Understand interactions with the local context
- Identify factors that hinder or facilitate implementation

Challenges in evaluating implementation
There is a growing body of evidence on factors that affect implementation, but:
- Different theories/frameworks and different terminology/definitions are used
- Knowledge about relationships between factors is limited
- Available tools tend to focus on a limited number of factors and have weak reliability and validity
How do we decide what to measure?

Objective of this presentation
To share lessons learned from identifying and measuring factors contributing to implementation of the enhanced Ontario Healthy Babies Healthy Children (HBHC) Program.

The HBHC Program
Aims to help children in Ontario have a healthy start in life; targets children from the prenatal period through the transition to school. Funded by the Ministry of Children and Youth Services (MCYS).
In early 2013, MCYS introduced mandatory enhancements to the HBHC program, to be implemented by 36 public health units.
Three key elements of the enhancements:
- Funding of 36 screening liaison nurse positions
- Introduction of a new validated screening tool
- Introduction of training to use evidence-informed interventions during home visiting
February to April 2013: staggered implementation process

Aims of the process implementation evaluation
To what extent has the program been implemented as planned? (→ administrative database analysis)
- Fidelity to protocol and program goals
- Reach of target population for each program component
Which factors hinder/facilitate the implementation? (→ surveys and focus groups with HBHC staff)
- Multiple program components
- Multiple HBHC staff and community partners
- Multiple organizations
- Different local settings and population characteristics

Evaluation framework: multi-level factors contributing to implementation outcomes
[Framework diagram] Contributing factors at five levels (system; organization (public health unit); provider; program participants; innovation) feed into implementation outcomes (reach; fidelity; local adaptations; impact on program change goals), with facilitation and feedback as cross-cutting elements.
Adapted from: Chaudoir et al., Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implementation Science 2013;8:22.

How did the framework help in developing the survey?
- Identified multi-level factors that have been shown to contribute to implementation outcomes
- Encouraged a structured, stepwise method to identify relevant constructs to measure for each multi-level factor:
1. Review of the implementation science literature to identify measurable constructs at each level
2. Search for existing instruments that measure the identified constructs
3. Expert consultation to identify additional instruments and verify the relevance of the identified constructs

Stakeholder engagement
Purpose: to determine which of the identified constructs are most important to measure for the HBHC program implementation.
- Brainstorm groups with HBHC program managers and directors of 36 health units (May 2013)
- Question: "Based on your knowledge and experience, what aspects do you think are relevant/important to consider in the evaluation?"
- Multi-voting exercise to prioritize the identified aspects
Result: a better understanding of the constructs that are most important, in the language of the field, AND additional factors relevant to consider.

Construct examples selected for each multi-level factor ("additional" marks constructs identified through stakeholder engagement)

Innovation level (characteristics of the enhanced HBHC program):
- Adaptability & flexibility
- Consistency (additional)
- Complexity
- Evidence
- Feasibility

Provider level (characteristics of health unit staff implementing the program):
- Job stress
- Job satisfaction
- Knowledge
- Professional confidence
- Preparedness for implementation (additional)

Organizational level (characteristics of the public health units in which the program is implemented):
- Organizational culture
- Communication
- Leadership

System level (the external environment and broader socio-cultural context):
- Community characteristics (additional)
- Funding and staff resources
- Policies and mandates (additional)
- Involvement of external partners (additional)
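As an aside (not part of the presentation), the selected constructs listed above can be captured in a small data structure, which makes it easy to check, for instance, how many constructs came out of stakeholder engagement rather than the literature. This is a purely illustrative sketch; the grouping and labels simply transcribe the slide.

```python
# Illustrative sketch: the constructs selected for each level of the
# evaluation framework, as listed on the slide. "(additional)" marks
# constructs identified through stakeholder engagement.
FRAMEWORK = {
    "innovation": ["adaptability & flexibility", "consistency (additional)",
                   "complexity", "evidence", "feasibility"],
    "provider": ["job stress", "job satisfaction", "knowledge",
                 "professional confidence",
                 "preparedness for implementation (additional)"],
    "organization": ["organizational culture", "communication", "leadership"],
    "system": ["community characteristics (additional)",
               "funding and staff resources",
               "policies and mandates (additional)",
               "involvement of external partners (additional)"],
}

def constructs_added_by_stakeholders(framework):
    """Return (level, construct) pairs added via stakeholder engagement."""
    return [(level, c) for level, items in framework.items()
            for c in items if "(additional)" in c]

print(len(constructs_added_by_stakeholders(FRAMEWORK)))  # 5
```

Of the 17 constructs selected, 5 came from stakeholder input rather than the literature review, which illustrates how much the brainstorm groups contributed.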

Development of specific survey items
- Survey items (open- and closed-ended) were developed for all selected constructs in collaboration with program specialists
- Questions were derived from previously used/validated instruments where possible
- Questions were often adapted to the specific context of HBHC and to the role perspective (5 versions)
- For additional constructs, we developed new questions using 'field language'
- Pilot-testing with 2 staff from each role

Example survey items
Statements with the following response categories: Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree | Prefer not to answer

- Innovation level: "The enhanced HBHC program leaves enough room for me to make my own practice decisions" (adaptability/flexibility; adapted from the Barriers and Facilitators Assessment Instrument)
- Provider level: "I feel stressed by my new responsibilities and tasks" (job stress)
- Organizational level: "Colleagues in my health unit are willing to innovate and/or experiment to improve clinical practice" (organizational culture; adapted from ORCA)
- System level: "My health unit has sufficient support to facilitate the implementation of the enhanced HBHC program in terms of budget" (funding resources; adapted from ORCA)
- Facilitation: "Change champions in my health unit support me to implement the program changes relevant to my role" (change champions)
- Feedback: "Management and/or senior leadership in my health unit welcomes feedback from staff regarding the implementation of the new HBHC protocol" (feedback)
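For analysis, closed-ended items like these are typically coded numerically. The slides do not describe the coding scheme, so the sketch below is an assumption: one common convention assigns 1–5 to the agreement scale and treats "Prefer not to answer" as missing, excluded from any summary score.

```python
# Hedged sketch (coding scheme assumed, not stated in the presentation):
# numeric codes for the 5-point agreement scale used in the survey items.
LIKERT = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neither Agree nor Disagree": 3,
    "Agree": 4,
    "Strongly Agree": 5,
    "Prefer not to answer": None,  # treated as missing data
}

def score_item(response):
    """Map a response label to its numeric code; None means missing."""
    return LIKERT[response]

def mean_score(responses):
    """Mean over non-missing responses; None if every response is missing."""
    valid = [c for c in (score_item(r) for r in responses) if c is not None]
    return sum(valid) / len(valid) if valid else None

print(mean_score(["Agree", "Strongly Agree", "Prefer not to answer"]))  # 4.5
```

Treating "Prefer not to answer" as missing rather than as a midpoint avoids biasing item means toward neutrality; whether that is appropriate depends on the analysis plan.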

Conclusions and lessons learned
- The evaluation framework was useful for informing systematic survey development for evaluating the HBHC implementation
- Integrating factors from other theories/frameworks into the evaluation framework encouraged a more comprehensive understanding of the implementation process
- The multi-level nature of the framework allowed for multiple perspectives and the use of multiple data sources
- Stakeholder engagement was essential to ask the right questions
- The survey adds to the limited set of instruments available to measure factors that hinder and facilitate implementation at multiple levels

Limitations and challenges
- Our survey instrument drew on validated tools, but has not itself been validated
- The multi-level approach resulted in a long survey (feasibility depends on the level of commitment of the sample)
- Due to a lack of consistency in definitions and terminology for constructs, comparisons with other studies are difficult
- The richness of the data can make it challenging to make sense of the results
