MER Essential Survey Indicators Jenifer Chapman, PhD & Lisa Parker, PhD February 2, 2015.

Learning outcomes
- You will become familiar with the indicators and think about what they mean for your programs
- You will know how to get the data
- You will better understand how to choose programs for outcomes monitoring from your country HKID portfolio

MER Essential Survey Indicators
- Move to outcomes, linked to program goals
- Mandatory
- Every two years

9 Essential Indicators
- Representing holistic measures of child and family wellbeing
- Linked to broader HIV response goals
- Linked to broader child protection response goals
- Vetted by broad stakeholder community

Indicator criteria
- Amenable to change due to PEPFAR OVC programs within a two-year period
- Easy to measure by trained data collectors
- Relevant over time and place
- Indicators including questions that could be validated were prioritized
- Many were pilot tested

HIV status
Percent of children whose primary caregiver knows the child’s HIV status
Rationale: If HIV status is unknown, child will not access care & treatment (proxy for testing)
Source: PEPFAR OVC TWG

Nutrition
Percent of children <5 years of age who are undernourished
Rationale: Linked to infant mortality and long-term child health and development
Source: World Health Organization

Health
Percent of children too sick to participate in daily activities
Rationale: PEPFAR supports critical linkages to health services to improve functional well-being
Source: MEASURE Evaluation

Legal protection
Percent of children who have a birth certificate
Rationale: Required to access essential services
Source: DHS

Education: School attendance
Percent of children regularly attending school
Rationale: Important for child development; children in school are less likely to acquire HIV
Source: UNESCO

Education: Progression in school
Percent of children who progressed in school during the last year
Rationale: More education is linked to better HIV awareness, higher contraceptive use, improved child well-being (among others)
Source: MEASURE Evaluation

Early childhood development
Percent of children <5 years of age who recently engaged in stimulating activities with any household member over 15 years of age
Rationale: Early childhood stimulation is linked to long-term child health and development
Source: MICS

Perception of violence
Percent of caregivers who agree that harsh physical punishment is an appropriate means of discipline or control in the home or school
Rationale: Perception of violence is linked to use of violence; children experiencing violence show greater HIV risk behaviors
Source: MEASURE Evaluation

Household economic resilience
Percent of households able to access money to pay for unexpected household expenses
Rationale: Resilience to economic shocks is linked to poverty, which impacts child and family well-being
Source: MEASURE Evaluation

Disaggregation
- By sex
- By age group (where relevant):
  - 0-4 years
  - 5-9 years
  - years
  - years
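
Each indicator above is a simple proportion over the surveyed children or households, reported with the disaggregates listed here. As a minimal sketch of how such an estimate might be tabulated, assuming a hypothetical survey extract (the column names and age bands below are illustrative only, not the toolkit's variable definitions):

```python
import pandas as pd

# Hypothetical survey extract: one row per child. Column names and age bands
# are illustrative, not the MER toolkit's variables.
children = pd.DataFrame({
    "sex": ["F", "M", "F", "M", "F", "M"],
    "age_years": [3, 7, 2, 12, 9, 4],
    "has_birth_certificate": [1, 1, 0, 1, 0, 1],
})

# Assign illustrative age bands for disaggregation.
children["age_group"] = pd.cut(
    children["age_years"],
    bins=[0, 4, 9, 17],
    labels=["0-4", "5-9", "10-17"],
    include_lowest=True,
)

# Indicator estimate: percent of children with a birth certificate,
# disaggregated by sex and age group.
estimates = (
    children.groupby(["sex", "age_group"], observed=True)["has_birth_certificate"]
    .mean()
    .mul(100)
    .round(1)
)
print(estimates)
```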

Let’s discuss (30 mins)
What questions do these indicators raise for you with respect to program implementation?
- Hint: how does this affect how you target beneficiaries?

Feedback What did you discuss?

Getting the data
- Figuring out which programs
- Figuring out which approach:
  1. Outcomes monitoring
  2. Evaluation
- Figuring out who will collect the data

Which programs?
- Appropriate proportion of budget
- Agency representation
- Appropriate program scope and timeline
- Strategic effort
- Paying heed to other data collection efforts
*Countries with total HKID funding of <1M USD/year are exempt from this requirement.

So we might have data from multiple programs in one country? Yes, and they cannot be aggregated.

What approach? Outcomes monitoring vs. Evaluation

Considerations
- The information you need
- Why you need it
- When you need it
Note that we have developed data collection tools for both approaches.

Evaluation vs. outcomes monitoring
- Outcomes may be attributed to program* vs. attribution cannot be established
- Data valid at population level vs. data may be valid at local level*
- Larger number of indicators vs. very limited number of indicators
- Usually every 3-5 years vs. every 2 years
- Complex sampling vs. simpler sampling
- Complex analysis vs. simpler analysis*
- Higher cost vs. lower cost

Methods for outcomes monitoring: cluster sample surveys vs. lot quality assurance sampling (LQAS)

Cluster sampling
- Advantages: sample large enough for sub-group analyses; simpler sampling design; no weighting*
- Disadvantages: statistician needed for sample size calculation; more expensive (larger sample)

Lot quality assurance sampling (LQAS)
- Advantages: provides information valid at supervision area (SA) level; may be cheaper, depending on the number of SAs
- Disadvantages: sampling frame needed for each SA; sample size will need to be increased for some indicators; values need to be weighted
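
The cluster-sampling entry flags that a statistician is needed for the sample size calculation. As a rough illustration only, the standard proportion-based calculation with a design effect looks like the sketch below; the defaults (p = 0.5, a ±5 percentage-point margin, design effect 2.0, 90% response) are assumptions chosen for the example, not MER requirements.

```python
import math

def cluster_sample_size(p=0.5, margin=0.05, deff=2.0, response_rate=0.90, z=1.96):
    """Respondents needed to estimate a proportion p within +/- margin at
    95% confidence, inflated by a design effect and expected non-response."""
    n = deff * (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n / response_rate)

required = cluster_sample_size()
print(required)                    # 854 respondents with the defaults above
print(math.ceil(required / 30))    # 29 clusters of 30 respondents each
```

With those defaults the target is roughly 850 respondents, or about 29 clusters of 30; an actual protocol would set the assumed prevalence, design effect, and cluster size per indicator and per country.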

Data collection tool

Which partner?
- Surveys must be undertaken by an appropriate institution that is NOT providing services under the program
- Proven institutional capacity in:
  - Survey design and sampling (all methods)
  - Ethical and safe data collection
  - Data management and analysis

When? APR FY15, and again in two years.

Implications for your work (30 min)
- What programs might require outcomes monitoring in your country?
- What are your next steps?

Feedback

Where can I find out more?
Go to our website: our-work/ovc
Contact Jenifer Chapman or Lisa Parker

The research presented here has been supported by the President’s Emergency Plan for AIDS Relief (PEPFAR) through the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L Views expressed are not necessarily those of PEPFAR, USAID or the United States government. MEASURE Evaluation is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University.