Assessment Implementation


Performance of Routine Information System Management (PRISM) Assessment Training
Session 8: Assessment Implementation
MEASURE Evaluation

Session objectives
- Describe the PRISM assessment implementation steps
- Understand the criteria for the adaptation of the PRISM Tools to the local context
- Understand the sampling method
- Become familiar with electronic PRISM data entry

PRISM implementation: pre-assessment steps

Establish country leadership, buy-in, and stakeholder engagement/coordination. Principal outputs:
- HIS Advisory Group
- Assessment management and monitoring Working Group

Set priorities and plan for the assessment. Principal products:
- HIS priorities in country context: health program focus (nationwide vs. selected provinces/states); desired unit of analysis (comprehensive vs. focused)
- Customized PRISM Tools
- Assessment plan: sampling unit, sampling frame, sample size; key informants; assessment schedule; composition of the assessment team; assessment team training schedule

PRISM implementation: steps to conducting an assessment

Manage and monitor data collection:
- Assemble a core team of data collectors and supervisors
- Train the data collectors
- Select sites to conduct the assessment; identify key informants to interview
- Inform the assessment sites and key informants (preferably through official letters/communication media)
- Arrange logistics for data collectors
- Use the tools in order (Overview, Performance Diagnostic, eRHIS Performance, MAT, Office/Facility Checklist, OBAT)
- Supervise data collection; monitor the quality of data collected and/or entered using an electronic data entry tool

Principal product: assessment data (raw data)

PRISM implementation: post-data-collection steps

Analyze and assess current RHIS performance:
- Apply the PRISM data analysis plan
- Prepare tables, charts, the assessment score matrix, etc.
- Review, verify, and adjust (if anomalies/inconsistencies are found)

Principal product: PRISM Assessment Report, describing current RHIS performance and its major determinants and identifying the major areas needing attention

Translate evidence into policy, strategy, and interventions:
- RHIS performance improvement plan and policies
- Implement the plan and monitor progress

Sampling method

Probability samples are preferred over nonprobability samples: they are randomly selected and produce unbiased estimates. Two commonly used sampling techniques are:
- Lot Quality Assurance Sampling (LQAS): sample size of 19 for each supervisory area; with five supervisory areas, a total sample size of 95
- Probability Proportional to Size (PPS): minimum sample size of 100
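To make the PPS idea concrete, here is a minimal sketch of systematic PPS selection, in which larger facilities have a proportionally higher chance of being chosen. The facility names and catchment populations are invented for illustration; this is not the official PRISM sampling procedure.

```python
import random

# Hypothetical facility list with catchment population sizes (illustrative data).
facilities = {f"Facility {i}": pop for i, pop in enumerate(
    [1200, 800, 3000, 450, 2200, 950, 1700, 600, 2800, 1300], start=1)}

def pps_sample(units, n, rng):
    """Systematic PPS: lay the units end to end along a line scaled by size,
    then pick n equally spaced points from a random start."""
    names = list(units)
    sizes = [units[u] for u in names]
    total = sum(sizes)
    interval = total / n
    start = rng.uniform(0, interval)
    points = [start + k * interval for k in range(n)]
    chosen, cum, i = [], 0, 0
    for name, size in zip(names, sizes):
        cum += size
        # A unit is selected once for every sampling point falling in its span.
        while i < len(points) and points[i] <= cum:
            chosen.append(name)
            i += 1
    return chosen

sample = pps_sample(facilities, 4, random.Random(42))
print(sample)  # 4 facilities, skewed toward larger catchments
```

Note that in systematic PPS a very large unit can be selected more than once; in practice the sampling frame and interval are checked for that case.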

What is LQAS?

LQAS is a sampling method that is flexible, rapid, and inexpensive relative to other probability survey methods, and assesses performance using a smaller sample size. It can:
- Be used locally, at the level of a "supervision area," to identify priority areas or indicators that are not reaching average coverage or an established benchmark
- Provide an accurate measure of coverage or health system quality at a more aggregate level (e.g., program catchment area or district)
- Produce data for local management decision making and for sharing information across supervision areas
- Be used for quality assurance under a "minimal sample, maximal security" principle: the most frequently used size is <20 per supervision area, and larger sizes are seldom needed

What a sample of 19 can tell us

A sample of 19 provides an acceptable level of error for making management decisions. At least 92% of the time, it is good for:
- Identifying whether a coverage benchmark has been reached, or whether a supervision area is below the average coverage of a program
- Setting priorities in a supervision area
- Setting priorities among supervision areas with large differences in coverage
- Deciding which are the higher-performing supervision areas to learn from
- Deciding which are the lower-performing supervision areas
- Distinguishing knowledge/practices that have high coverage from those with low coverage

Samples larger than 19 have practically the same statistical precision as samples of 19. They do not result in better information, and they cost more.
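The "at least 92%" claim can be checked with a binomial calculation. The sketch below evaluates an illustrative LQAS decision rule: classify a supervision area as meeting an 80% coverage benchmark if at least 13 of 19 respondents report the indicator. The threshold of 13 is an example for illustration; actual decision rules should be taken from published LQAS tables.

```python
from math import comb

def prob_at_least(n, d, p):
    """P(X >= d) when X ~ Binomial(n, p): the chance that at least d of n
    sampled respondents report the indicator, given true coverage p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

n = 19
d = 13                                     # illustrative decision threshold
sensitivity = prob_at_least(n, d, 0.80)    # correctly accept when coverage is 80%
false_accept = prob_at_least(n, d, 0.50)   # wrongly accept when coverage is only 50%
print(f"P(accept | true coverage 80%) = {sensitivity:.2f}")
print(f"P(accept | true coverage 50%) = {false_accept:.2f}")
```

With this rule, an area truly at the 80% benchmark is correctly classified about 93% of the time, while an area at only 50% coverage is wrongly accepted under 10% of the time, which is consistent with the slide's "at least 92%" statement.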

What a sample of 19 cannot tell us

A sample of 19 is not good for:
- Calculating exact coverage in a supervision area (though it can be used to calculate coverage for an entire program)
- Setting priorities among supervision areas with little difference in coverage

Sampling examples

Burundi: 17 health provinces
- Country divided into 5 zones
- Random selection of 1 province in each zone (5 provinces)
- Selection of 2 districts per province: the provincial capital and 1 random district (10 districts)
- Random selection of another 5 districts across the 5 selected provinces (10 + 5 = 15 districts)
- Random selection of 9 health facilities per district, plus the district hospital (9 + 1 = 10 facilities/district)
- Total of 150 health units selected

Mali: 8 health provinces
- Selection of 6 provinces
- Selection of 2 districts per province: the provincial capital and 1 random district (12 districts)
- Random selection of another 6 districts across the 6 selected provinces (12 + 6 = 18 districts)
- Random selection of 9 health facilities per district, plus the district reference center (9 + 1 = 10 facilities/district)
- Total of 180 health units selected
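The Burundi-style multistage scheme can be sketched in a few lines. The frame below is entirely synthetic (invented zone/province/district/facility names), and the "provincial capital" and "district hospital" are stood in for by the first unit at each level; the point is only to show how the stage counts (5 provinces, 15 districts, 150 facilities) arise.

```python
import random

rng = random.Random(2024)

# Synthetic frame: 5 zones, 3 provinces each, 6 districts each, 14 facilities each.
frame = {
    f"Zone{z}": {
        f"Z{z}P{p}": {
            f"Z{z}P{p}D{d}": [f"Z{z}P{p}D{d}F{f}" for f in range(1, 15)]
            for d in range(1, 7)
        }
        for p in range(1, 4)
    }
    for z in range(1, 6)
}

# Stage 1: one random province per zone -> 5 provinces.
provinces = {zone: rng.choice(sorted(provs)) for zone, provs in frame.items()}

# Stage 2: per province, the "capital" district plus 1 random other -> 10 districts.
districts = []
for zone, prov in provinces.items():
    dlist = sorted(frame[zone][prov])
    capital = dlist[0]  # stand-in for the provincial capital
    other = rng.choice([d for d in dlist if d != capital])
    districts += [(zone, prov, capital), (zone, prov, other)]

# Stage 3: 5 more districts at random across the selected provinces -> 15 districts.
remaining = [(z, p, d) for z, p in provinces.items()
             for d in sorted(frame[z][p]) if (z, p, d) not in districts]
districts += rng.sample(remaining, 5)

# Stage 4: per district, the "district hospital" plus 9 random facilities
# -> 10 facilities/district, 150 in total.
facilities = []
for z, p, d in districts:
    flist = sorted(frame[z][p][d])
    hospital = flist[0]  # stand-in for the district hospital
    facilities += [hospital] + rng.sample([f for f in flist if f != hospital], 9)

print(len(districts), len(facilities))  # 15 150
```

The Mali example follows the same pattern with 6 provinces and 18 districts, yielding 180 facilities.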

PRISM Tools implementation

PRISM electronic data entry

Options available for electronic data entry:
- SurveyCTO, a survey platform that allows mobile data collection using phones, tablets, or computers; all of the PRISM Tools have been adapted into electronic data collection forms in SurveyCTO
- Locally built data entry modules using Microsoft Excel, Google Sheets, or another platform

Facilitator's note: If the country has decided to use SurveyCTO, refer to the guidance document for how to upload form definitions to SurveyCTO, customize the data entry forms, and download data files from SurveyCTO for further analysis. Demonstrate (on a big screen) the steps in using SurveyCTO for data collection. Encourage participants to ask clarifying questions during the demonstration.

How to access the PRISM Series

This slide deck is one of nine in the PRISM Series Training Kit, which also includes a Participant's Manual and a Facilitator's Manual. Also in the PRISM Series are a Toolkit (the centerpiece of the series) and a User's Kit. The PRISM Series is available in its entirety on MEASURE Evaluation's website: https://www.measureevaluation.org/prism

MEASURE Evaluation is funded by the United States Agency for International Development (USAID) under the terms of Cooperative Agreement AID-OAA-L-14-00004. It is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill, in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. www.measureevaluation.org