Session 5: Selecting and Operationalizing Indicators


Session Overview
• Selecting indicators
• Common difficulties with identifying indicators
• Operationalizing indicators
• Tools to use to operationalize indicators

Session Learning Objectives
By the end of the session, the participant will be able to:
• describe the process for selecting indicators;
• recognize common difficulties with identifying indicators; and
• explain how to operationalize indicators.

Selecting Indicators

Steps in Selecting Indicators
Step 1: Review the program framework
• Identify what needs to be measured.
• Good indicators start with a good program framework.
• Start with the overall goals and objectives of the program.

Steps in Selecting Indicators
Step 2: Develop a list of possible indicators
Identify indicators for each level of results. You can use:
• indicators from past years of the program
• experience of similar programs
• global/regional/national indicators
• indicator guides

Steps in Selecting Indicators
Step 3: Assess each indicator
Each indicator should be:
• measurable
• practical
• reliable
• relevant
• useful for decision-making
• precise
• sensitive
• capable of being disaggregated
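
The assessment in Step 3 can be thought of as a simple checklist applied to each candidate indicator. The sketch below is illustrative only: the criterion names come from the slide, but the pass/fail ratings are judgment calls the M&E team would make, and the screening function is an assumed convenience, not a prescribed method.

```python
# Sketch: screening candidate indicators against the eight criteria above.
# The ratings are hypothetical judgment calls, not a scoring standard.

CRITERIA = [
    "measurable", "practical", "reliable", "relevant",
    "useful for decision-making", "precise", "sensitive",
    "capable of being disaggregated",
]

def screen(indicator_name, ratings):
    """Return the list of criteria the indicator fails.

    ratings maps each criterion to True (meets it) or False (does not).
    """
    return [c for c in CRITERIA if not ratings.get(c, False)]

ratings = {c: True for c in CRITERIA}
ratings["practical"] = False  # e.g., data collection would be too costly

print(screen("Percent of facilities with a trained provider", ratings))
# → ['practical']
```

Indicators that fail one or more criteria are candidates for redefinition or removal in Step 4.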

Steps in Selecting Indicators
Step 4: Select the “best” indicators
• Narrow your list to the final indicators.
• Aim for an optimum set that meets management needs at a reasonable cost.
• Limit the number of indicators for each objective/result to 2-3.
• Remember your target audiences, both external and internal.

Common Difficulties with Selecting Indicators

Common Pitfalls in Indicator Selection
• Indicators not linked to program activities.
• Using outputs as outcomes.
• Poorly defined indicators.
• Data needed for the indicator is unavailable.
• Too many indicators!!!

Pitfalls with Selecting Indicators
Indicator not linked to program activities
IR: Expanded access to diarrhea treatment services
Activities: Train providers in treating acute diarrhea
Inappropriate indicator: Percent of facilities with adequate conditions to provide care
Better indicators: Number of clinicians trained; number of facilities with a trained provider; percent of clinicians scoring 100% on the post-test
The program is not aiming to affect facility conditions, only provider skills.

Pitfalls with Selecting Indicators
Using outputs to measure outcomes
Problem: Routine monitoring data (outputs) are available from households, but outcomes are needed for reporting.
You collect: Number of farms that practice recommended biosecurity measures
You need: Percent of farms that practice recommended biosecurity measures
Routine monitoring data should not be reported as outcome data. Combine your routine counts with information about your target area to turn outputs into outcome measures.
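
The conversion the slide describes is simple arithmetic: divide the routine count by the total for the target area. A minimal sketch, using hypothetical farm counts rather than program data:

```python
# Sketch: turning a routine output count into an outcome measure.
# The figures below are hypothetical illustrations, not program data.

def to_outcome_percent(count, target_total):
    """Express a routine count as a percent of the target-area total."""
    if target_total <= 0:
        raise ValueError("target_total must be positive")
    return 100.0 * count / target_total

# Output (routine monitoring): 180 farms practice the recommended measures.
# Target-area information: 600 farms in the program area.
print(to_outcome_percent(180, 600))  # → 30.0 (percent of farms)
```

The hard part in practice is the denominator: the target-area total must come from a census, registry, or survey, not from the monitoring system itself.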

Pitfalls with Selecting Indicators
Indicator poorly defined
Activity: A radio campaign to provide information about AI prevention methods
Inappropriate indicator: Percent of population with AI knowledge
Better indicator: Percent of population that can cite the top 3 ways to protect their families from AI
“Knowledge of AI” is vague. What knowledge is critical for preventing transmission of AI?

Pitfalls with Selecting Indicators
Data needed for indicator not available
Data issue: Information on stock-outs may not be collected daily
Inappropriate indicator: Percent of days per quarter that service delivery points have a stock-out of drugs
Better indicator: Percent of service delivery points that had a stock-out of drugs at some time during the last quarter
If relying on routine data, the indicator definition will depend on how the data are collected.
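
The "better indicator" above only needs each site's routine reports for the quarter, not daily data. A sketch of how it might be compiled, with hypothetical site records (the site names and numbers are made up for illustration):

```python
# Sketch: deriving "percent of service delivery points with a stock-out
# during the quarter" from routine monthly reports. All records are
# hypothetical illustrations.

quarter_records = {
    "clinic_A": [0, 0, 2],  # days out of stock reported in each month
    "clinic_B": [0, 0, 0],
    "clinic_C": [5, 0, 0],
    "clinic_D": [0, 1, 0],
}

# A site counts toward the indicator if it reported any stock-out days.
had_stockout = [site for site, days in quarter_records.items() if any(days)]
percent = 100.0 * len(had_stockout) / len(quarter_records)

print(f"{percent:.0f}% of sites had a stock-out this quarter")  # → 75%
```

Note that the "inappropriate indicator" (percent of days) could not be computed from these records at all if sites report only monthly totals, which is exactly the point of the slide.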

Pitfalls with Selecting Indicators
Too many indicators!!!
Rules of thumb:
• one or two indicators per key activity or result (ideally, from different data sources)
• not more than 8-10 indicators per programmatic area
• at least one indicator for every core activity (e.g., infection control, laboratory improvement, outbreak response)
• use a mix of different data collection strategies and sources
There is no set formula for how many indicators to use. Consider your budget, stakeholders, and program plans. Then ask: Is it reasonable? Is it feasible?

Adding Indicators to the Program Framework

Adding Indicators to Program Framework
Input: Quantifiable resources going into your activities (the things you budget for)
Activity/Process: What you do to accomplish your objectives
Output: Immediate results from your activity (people trained, services provided)
Outcomes: Longer-term changes in knowledge, attitudes, behavior, etc., related to the program goal
Impact: Long-term, population-level change; can relate to a program or organization's vision/mission statement
Indicators (example): How do you measure each of these?

Logical Framework: Training Activity
Input: Time and skills to develop a new biosecurity training curriculum for market owners
Process: Conduct training events
Output: Market owners trained in biosecurity methods
Outcome: Improvement in market conditions
Impact: Reduction in HPAI/H5N1 virus circulation

Adding Indicators
Input: Time and funding to develop a new biosecurity training curriculum for market owners
• Indicators: Amount of funding for curriculum development; percent of staff for curriculum development
Activity/Process: Conduct training events
• Indicator: Number of trainings conducted
Output: Market owners trained in biosecurity methods
• Indicator: Percent of market owners trained in biosecurity methods
Outcomes: Improvement in market conditions
• Indicator: Percent of markets in which proper biosecurity methods have been implemented
Impact: Reduction in HPAI/H5N1 virus circulation in targeted areas
• Indicator: Percent of poultry in targeted markets found to be infected with H5N1

Small Group Activity
Return to your small groups and to the logical frameworks from Session 3. As in the examples, fill in the indicators for your logical framework:
• You must have indicators for every column.
• REMEMBER: In general, output indicators are counts, while outcome indicators measure longer-term changes in knowledge, attitudes, or practices as percentages or proportions.
After the group work, share your logic models with the larger group. You have about 30 minutes for this activity.

Small Group Projects Present your findings to the group.

Operationalizing Indicators

What does operationalizing indicators mean?
Identify exactly how a given concept, result, or behavior will be measured. Challenges include:
• subjective judgment
• local conditions
• unclear yardsticks

Importance of Operationalizing Indicators
Operationalizing indicators helps ensure that:
• everyone is using the same definitions;
• units of measurement are the same;
• everyone understands the indicators;
• only the most significant program elements or achievements are being tracked; and
• the chosen indicators will assist in appropriate decision-making and/or action-planning.

Exercise: Operationalizing Indicators
In groups of 3-4 people, review and discuss how these indicators were operationalized:
• Percent of provinces and districts with at least 5 surveillance officers trained on disease surveillance
• Number of poultry disease reports to the central level per year
• Percent of provinces that have an avian influenza and pandemic preparedness plan
• Number (percent) of markets with active surveillance at least twice per month
• In case of an outbreak, percent of outbreaks responded to by rapid response teams within 24 hours
• Percent of provinces with an effective (staff trained and PPE provided) rapid response team
• Percent of provinces where the effectiveness of the trained rapid response team has been tested through a simulation exercise or an outbreak
• Percent of AI specimens tested within 24 hours
• Percent of targeted influenza specimens tested

Indicator Reference Sheet
The sheet provides detailed documentation for each indicator:
• basic information
• description
• plans for data acquisition
• data quality issues
• plans for data analysis, reporting, and review
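
A reference sheet is essentially a structured record per indicator. The sketch below shows one possible representation: the top-level sections follow the slide, but the individual field names, and the sample indicator filled in, are assumptions for illustration, not a standard template.

```python
# Sketch of an indicator reference sheet as a structured record.
# Section groupings follow the slide; field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class IndicatorReferenceSheet:
    # Basic information
    name: str
    result_measured: str            # objective/result the indicator tracks
    # Description
    definition: str                 # precise terms and unit of measurement
    disaggregation: list = field(default_factory=list)
    # Plans for data acquisition
    data_source: str = ""
    collection_frequency: str = ""
    # Data quality issues
    known_limitations: str = ""
    # Plans for data analysis, reporting, and review
    reporting_schedule: str = ""

sheet = IndicatorReferenceSheet(
    name="Percent of market owners trained in biosecurity methods",
    result_measured="Output: market owners trained",
    definition="Trained = completed the full curriculum and passed the post-test",
    disaggregation=["province", "sex"],
    data_source="Training attendance registers",
    collection_frequency="After each training event",
    known_limitations="Registers may undercount repeat attendees",
    reporting_schedule="Quarterly, reviewed at annual program review",
)
print(sheet.name)
```

Keeping one such record per indicator makes it easy to check that every selected indicator has a defined source, frequency, and disaggregation before data collection begins.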

Example of an Indicator Reference Sheet