
Dissemination and Use of Results from OVC Program Evaluations
Florence Nyangara, PhD, MEASURE Evaluation/Futures Group
Dissemination Meeting, September 3rd, 2009, Washington, DC

Primary Objective of OVC Program Evaluations
■ Provide evidence to guide program decisions, such as:
■ Scaling up best practices (models, strategies)
■ Modifying and improving interventions to make them more effective
Therefore, we needed to collect quality, relevant data, analyze it, and use the results to guide OVC programs: hence the Monitoring and Evaluation to Assess and Use Results (MEASURE) Evaluation project.

Overall Data Use Strategy
Employed a comprehensive data use strategy:
■ Diverse stakeholders were involved throughout the study
■ Ensured that only relevant and useful data were collected, through continuous consultation between researchers and practitioners
■ Data were packaged to meet the needs of target audiences
■ Results were used to improve programs

Comprehensive Data Use Strategy
Stakeholder engagement: to ensure support, ownership, relevance, and sustainability.
■ Diverse stakeholders were involved (e.g., beneficiaries, program staff) to capture different perspectives and information needs
■ Stakeholders were involved throughout the study to secure buy-in and promote ownership (consultation meetings)
■ Continuous communication between researchers and the various stakeholders provided updates and feedback

Comprehensive Data Use Strategy
Ensured that only relevant data were collected, by holding consultation meetings with donors, program implementers, communities, and beneficiaries. This helped to:
■ Identify key OVC program issues and information needs for service, program, and policy decisions
■ Identify program models for evaluation
■ Inform questionnaire development

Examples
Example 1: Dissemination of case studies
■ The first feedback shared information on program descriptions, implementation challenges, and opportunities
■ Involved program staff and in-country stakeholders
■ Discussed and identified issues to consider for the outcome evaluations
Example 2: Dissemination of preliminary outcome evaluation results
■ Consultations with each program and key in-country stakeholders to validate preliminary findings
■ Presentations at international conferences and other forums helped interpret the findings

Packaging Data for and Reaching Various Audiences
Information was packaged in various formats for diverse audiences.
Publications:
■ Six case study reports, one for each program evaluated¹
■ Five briefing papers, each specific to one of the programs evaluated
■ Two summary papers on the four CORE-funded programs
■ An overarching paper on key findings
■ A cost-effectiveness analysis paper
■ One summary paper of key findings from the three studies in Tanzania
■ Program-specific one-page summaries of key findings (TSA)
Dissemination meetings:
■ Program sites
■ In-country, national level
■ International level
¹ Jali Watoto was a mini-case study.

Use of Results Workshops: Tanzania Example
■ Facilitated two workshops with OVC stakeholders in Tanzania:
■ TSA program staff (field staff and managers)
■ National OVC stakeholders (implementing partners, government, donors, bilateral agencies, etc.)

Objectives of Results Use Workshops
■ Present and discuss the key findings
■ Develop actionable recommendations based on the results
■ Develop a data use action plan to implement each of the recommendations
■ Develop and agree on a mechanism to monitor the data use action plan
■ Agree on a follow-up plan

Use of Results: Program Staff
■ Findings were presented to the Salvation Army (TSA) Mama Mkubwa program staff (field supervisors, program managers, M&E staff) from all regions
■ Discussed how the TSA findings could be used to inform program improvement and the well-being of OVC
■ Developed program-specific recommendations
■ Developed a data use action plan
■ Developed and agreed on how to monitor the plan

Use of Results: National Stakeholders
■ Results were presented to national OVC stakeholders in Tanzania: service providers, policy-makers, and donors
■ Discussed how findings from the three program evaluations could be used for decision-making
■ Developed actionable recommendations for the national OVC program
■ Developed a data use action plan
■ Developed and agreed on how to monitor the plan

Example of a Recommendation
The researchers' proposed recommendations were challenged, and participants came up with their own, e.g.:
■ Researchers: review and restructure Kids' Clubs and home-visit activities to make them more effective
■ TSA staff: motivate volunteers through incentives
■ National stakeholders: greater government involvement to develop guidelines that allow for volunteer growth and recognition and ensure sustainability

Information Use Bulletin
Contains:
■ Reactions to the findings (surprises)
■ The actionable data use recommendations plan (see the Tanzania national plan)
■ Assigned responsibilities (joint plans) showing who will do what, so that the people named take action
■ Follow-up plans to assess whether the actionable recommendations are implemented as planned
■ This formalizes follow-up, which is often forgotten after dissemination
■ A qualitative follow-up assessment (January/February 2010)
Process effect: increased demand for data for decision-making; more programs want to conduct simple evaluations of this nature to find out whether their key program components (e.g., income-generating activities, IGA) are making a difference.

MEASURE Evaluation is funded by the U.S. Agency for International Development through Cooperative Agreement GHA-A and is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. Visit us online at