USAID Mission and Implementing Partner Monitoring and Evaluation Systems
Jordan Monitoring and Evaluation Support Project (MESP)
June 18, 2014
Agenda: Monitoring and Evaluation Support Project (MESP)
Cultivating Inclusive and Supportive Learning Environments (CISLE), 18 June 2014
- 10:00–10:10  Overview of meeting objectives and roles (USAID CISLE AOR)
- 10:10–10:20  CISLE project introduction (CISLE CoP or other)
- 10:20–11:00  Best practices in Activity Monitoring and Evaluation Plan (AMEP) content and revision (MESP M&E Specialist in Education)
- 11:00–11:45  Data Quality Assessment (DQA) discussion (CISLE M&E)
- 11:45–12:00  Q&A
What Does MESP Do?
MESP provides technical and advisory services to:
- USAID/Jordan Office of Program Management (OPM)
- USAID/Jordan Development Objective (DO) teams
- Implementing Partners (IPs)
Services in the areas of:
- Project and activity performance monitoring
- Evaluation
- Research
- Organizational learning
- Knowledge management
In a nutshell, MESP can make your life easier across all areas of Mission M&E work.
Speaker note: We are not an audit or enforcement arm of OPM, and we are not USAID – we can't take over any of your official duties, such as approval or clearance. However, we can make implementing many of your M&E-related duties much easier.
Specific Areas of MESP Support
Three overall technical areas:
- Mission performance monitoring systems (including IP M&E systems)
- Evaluation (including baselines and targets)
- Strategic communications of Mission results
Support takes the form of:
- Structured training/workshops
- One-on-one or group technical assistance
- Technical review of documents
- Tool development
- Direct implementation (evaluations, baselines, etc.)
Expected Results of this Meeting:
- CISLE will be able to develop a more complete M&E plan in line with USAID best practices
- CISLE will be able to work with USAID to identify and report on required USAID indicators relevant to the activity
- CISLE will be better prepared to support the upcoming USAID Data Quality Assessment (DQA)
Why do IP M&E Systems and Results Matter?
Better IP M&E → Better USAID DO/MGMT → Better CDCS Implementation → Better Jordan
Speaker note – other options for the text: Mission PMP implementation strengthened; provision of useful data by partners improved; improved USAID staff ability to work with partners to improve the quality of their M&E plans.
What is in the CDCS?
Speaker note: Discuss the Results Framework (RF) and show it. The CDCS is monitored annually in the Performance Plan and Report (PPR).
The Link between the CDCS Results Framework and the Activity
Most performance indicator data comes from IPs/activities. The levels cascade:
- PPR level: USAID/Washington
- Mission level: Mission
- DO level: DO 1, DO 2
- Project level: Project 1, Project 2
- Activity level: IP 1, IP 2, CISLE
- Field level: Field 1, Field 2
Speaker note: So how do you know which required indicators correspond to your project? USAID has to tell you (next slide).
Three levels of USAID M&E Plans
- Mission Performance Management Plan (PMP): completed by Mission staff within 6 months of CDCS approval; describes plans to measure performance across all of the Mission's DOs against CDCS results.
- Activity M&E Plan (AMEP): completed by Implementing Partners within 90 days of award; describes plans to measure performance of the Activity.
(A row referencing the CDCS is still to be inserted above the PMP row.)
There are expected results and indicators in all three document types.
What should your Activity M&E Plan (AMEP) tell USAID?
- How the Activity-level results contribute to the achievement of the Mission CDCS/PMP
- How, why, and when your activities will produce the desired change = the "story" of the activity
- How your M&E systems will produce, analyze, assess, and utilize data
- How data will guide project decision-making
When you look at an AMEP, especially its results framework and performance indicator table, the project's "story" should be clear.
What should be in your Activity M&E Plan?
Recommended Elements of an AMEP:
- Development Hypothesis or Theory of Change
- Results Hierarchy/Framework/Diagram
- Indicators, especially USAID-required ones
- Performance Indicator Reference Sheets (PIRS)
- Baselines and Targets
- Partner's process for data management, reporting, and quality assurance
- Partner's M&E staffing and oversight
- Gender aspects and other vulnerable groups
- Evaluation plans and questions
- M&E calendar
Development Hypothesis – Narrative
- Describes the theory of change, logic, and causal relationships between results in a framework
- Explains the relationships between each layer of results, often using if-then statements that reference the evidence supporting the causal linkages
- Based on development theory, practice, sound evidence, literature, and experience
- Country- and context-specific
Basically, this says what the project is trying to do and why it is structured/designed this way.
Results Framework/Hierarchy
Illustrates the Development Hypothesis:
- Bottom: activities (may be left off)
- Middle: sub-IRs/sub-purposes
- Top: IRs/Activity Purpose
Can illustrate linkage to the Mission RF:
- Use Mission results language
- Use the Mission DO numbering system
- Use parallel boxes with Mission results next to activity results
Can show indicators at each level. Written in "results language" (a data sketch of such a hierarchy follows below).
(Diagram: Purpose at the top, two Sub-Purposes below it, Tasks 1 and 2 at the bottom.)
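To make the hierarchy concrete, here is a minimal sketch of how an activity's results chain and its linkage to the Mission RF might be represented as data. This is an illustration only; all result statements, numbering, and indicators below are hypothetical and not drawn from the CISLE AMEP.

```python
# A minimal sketch of an activity results hierarchy linked to the Mission RF.
# All result names, numbering, and indicators are hypothetical illustrations.

results_framework = {
    "purpose": {
        "statement": "Quality of education services improved",  # results language
        "mission_link": "IR 3.2",                                # Mission DO numbering
        "indicators": ["Number of teachers/educators trained with USG support"],
        "sub_purposes": [
            {
                "statement": "Teacher capacity strengthened",
                "indicators": ["Number of teachers trained in interactive pedagogy"],
                "tasks": ["Task 1: design training modules",
                          "Task 2: deliver school-based coaching"],
            },
            {
                "statement": "Learning environment improved",
                "indicators": ["Number of classrooms repaired"],
                "tasks": ["Task 1: minor school repairs"],
            },
        ],
    }
}

def print_results(rf: dict) -> None:
    """Walk the hierarchy top-down, showing each level with its Mission linkage."""
    p = rf["purpose"]
    print(f"Purpose ({p['mission_link']}): {p['statement']}")
    for sp in p["sub_purposes"]:
        print(f"  Sub-Purpose: {sp['statement']}")
        for task in sp["tasks"]:
            print(f"    {task}")

print_results(results_framework)
```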
Country Development Cooperation Strategy (CDCS) DO3 Results Framework
Goal: Improved Prosperity, Accountability, and Equality for a Stable, Democratic Jordan
DO 3: Essential Services to the Public Improved
- IR 3.2: Quality of education services improved
  - Sub-IR: Institutional capacity strengthened to improve student learning outcomes
  - Sub-IR: Learning environment improved
- Sub-IR 3.3: Gender-responsive and relevant education services for at-risk youth increased
Speaker note: This slide format shows an overview of performance against the Mission's results framework (the original example used Health). MSI will help each DO team populate it, assuming you provide OPM with the necessary data. The vertical strip at the left-hand side of each box indicates the overall "health" of that result, based on both the indicator data reported for the result and the manager's understanding of the context. The indicators may depict underperformance while recent trends point toward the result being back on track; in that case the manager may still rate the result as Performing as Expected. The health rating of higher-level results should incorporate the information from their contributing (intermediate/sub-intermediate) results, both their overall health and their indicator data, culminating in an overall rating for the DO. The small boxes in the bottom right-hand corner of each box show the status of each of the result's individual indicators against their targets for the period. Detailed visualizations for each indicator are included in the Word document that accompanies this presentation. With this visualization, you can rapidly form a high-level picture of how things are going across a DO.
DO3 Results Framework: IR 3.2 Quality of education services improved
DO 3: Essential Services to the Public Improved
- 3b. % change in beneficiary satisfaction in quality education delivered at USG-supported schools or equivalent non-school-based settings
IR 3.2: Quality of education services improved
- 3.2.a Number of teachers/educators trained with USG support
- 3.2.b Number of learners benefiting from USG assistance
- 3.2.c Proportion of students who, by the end of two grades of primary schooling, demonstrate that they can read and understand the meaning of grade-level text
Sub-IR: Institutional capacity strengthened to improve student learning outcomes
- 3.2.1.a Number of learners receiving reading interventions at the primary level
- 3.2.1.b Number of educators trained in early grade literacy
- 3.2.1.c Percent of teachers using diagnostic and assessment tools to gauge student learning in reading and math
- 3.2.1.d Number of education pedagogy courses developed and approved
- 3.2.1.e Number of students enrolled in 'new' education pedagogy offerings at university
Sub-IR: Learning environment improved
- 3.2.2.a Number of schools achieving "Developing" on a rubrics-based benchmarking tool
- 3.2.2.b Number of students benefiting from USG infrastructure improvements
- 3.2.2.c Percentage of target population that views GBV as less acceptable after participating in or being exposed to USG programming
- 3.2.2.d Number of classrooms built or repaired with USG assistance (F)
- 3.2.2.e New design concept for school construction is effectively utilized
Sub-IR 3.3: Gender-responsive and relevant education services for at-risk youth increased
- 3.2.3.a Number of at-risk youth with increased access to education services
- 3.2.3.b Number of USG-assisted organizations serving at-risk youth
- 3.2.3.c Percent change of at-risk youth engaging in extra-curricular activities
Program Monitoring Plan (PMP): Cultivating Inclusive and Supportive Learning Environments (CISLE), 2013–2016 (23 indicators)
Goal: Ensure that all children – local residents and Syrian refugees – are afforded an equal opportunity to acquire a purposeful and meaningful education in a safe and supportive learning environment. (2 indicators)
- Objective 1: Enhance the capacity of teachers to integrate and increase the participation of displaced refugee students in Jordan's public schools (4 indicators)
- Objective 2: Increase local community awareness, responsibility, advocacy, and participation in the targeted schools (5 indicators)
- Objective 3: Promote supportive and inclusive learning environments in Model Community Schools (MCS) (6 indicators)
- Objective 4: Strengthen community-school support connections through life-long learning and extra-curricular programs (6 indicators)
Where do Mission indicators come from?
- "F" (standard) indicators
- Custom indicators
Both appear in the PPR, and all should be in the M-PMP as well.
Speaker note: It would be good to have an example of Activity XYZ with some results and indicators listed; we can make these up or borrow from other training materials or other Missions. Using the example, we can point out the specific indicators that would be included because they are in the PMEP and/or PMP, the indicators that are not in those plans but are useful for management, and the indicators that would not be in the AMEP because USAID does not need them even if the IP does.
Required Mission Indicators
The AMEP must include:
- Required Mission indicators
- Gender indicators (Washington and DO 4/cross-cutting)
- Any additional indicators that will help USAID manage or monitor the activity
USAID may give you the minimum number, but the more you can use, the better.
Not required (but may be useful): additional indicators that will (only) help the implementing partner manage its own performance.
What Are Your Required Indicators?
- Discuss any USAID guidance provided to date and/or current guidance
- We don't know; USAID has to tell us. Assume IR 1.1 and IR 1.2 for now.
Education Mission PMP and Activity Alignment
There are no specific Mission PMP or PPR indicators in the current CISLE PMP. The closest matches:
M-PMP indicator → CISLE indicator:
- 3b. % change in satisfaction with quality of education → IND 1.5 Percentage change of displaced and refugee students reporting increased satisfaction with the environment in schools (GOAL level)
- 3.2.a Number of teachers/educators trained with USG support → (no corresponding CISLE indicator)
- 3.2.b Number of learners benefiting from USG assistance → (no corresponding CISLE indicator)
PPR indicator → CISLE indicators:
- Number of teachers in USG-supported programs trained on how to support learners' psychosocial well-being →
  - IND 1.2 Number of teachers trained in psychosocial and pedagogical needs of displaced refugee children
  - IND 3.2 Number of trained teachers in psychosocial and interactive pedagogy
Potential DO3 indicators:
The full DO3 results framework and indicator list shown earlier (IR 3.2 and its sub-IRs), repeated with CISLE's IND 1.5 mapped to Mission indicator 3b (% change in beneficiary satisfaction in quality education delivered at USG-supported schools or equivalent non-school-based settings).
Education Mission PMP and Activity Alignment (continued)
Other indicator issues with the current CISLE AMEP/PMP:
- Missing baselines and disaggregation (gender, nationality, location)
- Unclear timeline for targets and actuals (year and months of each quarter)
Indicators must have Baselines
- A baseline is the value of a performance indicator when beginning implementation; it should be established before activity implementation begins
- If indicator data will be disaggregated, then baselines and targets should be set for each disaggregation
- By definition, project-produced outputs will have a baseline of zero
Indicators must have Targets
- A target is the amount of expected change in a performance indicator to be achieved within an explicit timeframe with a given level of resources
- USAID/Jordan sets targets for Mission-required indicators
- Targets should specify quantity, quality, and time, based on analysis of past trends, the experience of similar activities, expert opinion, and the existence of objective quality standards
- Within life-of-project (LOP) targets, annual targets should also relate to project stage, e.g., start-up vs. close-out phases (a worked sketch follows below)
Speaker note: Targets may be set in the RFP/proposal/contract and may flow down from the M-PMP/PPR, so the margins for setting them may be narrower. PADs, SOWs, and contracts often have LOP targets but rarely annual ones. Almost without exception, IPs accept the targets in the solicitation and agree to produce them. The trick, as noted, is to get good targets that are relevant to the higher-level DO and Mission programs and targets.
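As a worked illustration of these rules, here is a minimal sketch of deriving stage-weighted annual targets from an LOP target and setting a target per disaggregate. The LOP figure, phase weights, and sex shares are hypothetical, not CISLE's actual targets.

```python
# A minimal sketch of baseline/target records with disaggregation.
# All figures, phase weights, and shares are hypothetical illustrations.

LOP_TARGET = 2400   # life-of-project target for "teachers trained"
BASELINE = 0        # a project-produced output, so the baseline is zero

# Annual targets should reflect project stage: slower start-up, faster mid-years.
phase_weights = {"Year 1": 0.15, "Year 2": 0.40, "Year 3": 0.45}
annual_targets = {year: round(LOP_TARGET * w) for year, w in phase_weights.items()}
# {'Year 1': 360, 'Year 2': 960, 'Year 3': 1080}

# If data will be disaggregated (e.g., by sex), set a target per disaggregate too.
sex_shares = {"female": 0.6, "male": 0.4}
disaggregated_targets = {
    year: {sex: round(total * share) for sex, share in sex_shares.items()}
    for year, total in annual_targets.items()
}

assert sum(annual_targets.values()) == LOP_TARGET
print(disaggregated_targets["Year 1"])   # {'female': 216, 'male': 144}
```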
Partner's Process for M&E and Reporting
The AMEP narrative should describe:
- Who is responsible for data collection and management (M&E manager, technical specialists, others)
- In what format data will be managed (database, spreadsheets, GIS)
- Who is responsible for producing which reports
- DQA procedures
Aspects of quality control at all stages should be described (a minimal automated-check sketch follows below).
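As one illustration of what documented quality control might look like in practice, here is a minimal sketch of an automated check on incoming indicator records. The field names and rules are hypothetical, not a USAID requirement.

```python
# A minimal sketch of automated quality-control checks on indicator records
# before they enter the reporting pipeline. Field names and validation rules
# are hypothetical illustrations.

REQUIRED_FIELDS = {"indicator_id", "period", "value", "sex", "location"}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "value" in record and (not isinstance(record["value"], (int, float))
                              or record["value"] < 0):
        problems.append("value must be a non-negative number")
    if record.get("sex") not in {"female", "male", None}:
        problems.append(f"unknown sex disaggregate: {record.get('sex')}")
    return problems

record = {"indicator_id": "IND 1.2", "period": "2014-Q3", "value": -5, "sex": "female"}
print(validate_record(record))
# ["missing fields: ['location']", 'value must be a non-negative number']
```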
Partner's Process for Learning
USAID/Jordan is putting increased emphasis on the use of information for management and for learning. AMEPs should describe how the IP will use M&E data and processes for these purposes:
- Periodic reviews of monitoring data
- Adjustments to/triangulation of other reported data
- Improvements/revisions to AMEP systems based on DQAs
- Adjustments/revisions based on evaluations
Aspects of Gender and Other Vulnerable Populations
- Sex disaggregation in data collection, analysis, and reporting for people-level indicators
- Other relevant people-level disaggregations
- Specific gender-related (or other) outcomes you expect to achieve, for example:
  - Number of new female business owners, instead of number of new business owners disaggregated by sex
  - % of new banking accounts opened by women that did not require male involvement
This is more than gender-aware; it is proactively gender-active (the distinction is sketched below).
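To make the distinction between a sex-disaggregated indicator and a gender-specific outcome indicator concrete, here is a minimal sketch over hypothetical data; the records and names are invented for illustration.

```python
# A minimal sketch of disaggregating an indicator by sex versus defining a
# gender-specific outcome indicator. All records are hypothetical.

new_business_owners = [
    {"name": "A", "sex": "female"},
    {"name": "B", "sex": "male"},
    {"name": "C", "sex": "female"},
]

# Disaggregated indicator: one total, reported by sex.
by_sex: dict[str, int] = {}
for owner in new_business_owners:
    by_sex[owner["sex"]] = by_sex.get(owner["sex"], 0) + 1
print(by_sex)                        # {'female': 2, 'male': 1}

# Gender-specific outcome indicator: counts only the group of interest,
# signaling that increasing female ownership is itself the desired result.
num_new_female_owners = sum(1 for o in new_business_owners if o["sex"] == "female")
print(num_new_female_owners)         # 2
```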
Aspects of Gender and Other Vulnerable Populations (continued)
Even where activities are not obviously directed at different groups, IPs should examine whether there could be disparate effects on different sexes or groups. For example:
- Would improving the availability of water change the lives of young girls more than young men?
- Would a change in interest rates or fiscal policies affect the lives of women more than men?
Evaluation Plans and Questions
To the extent known, the AMEP should outline the plan for how the activity will be evaluated, though ultimately the Mission decides what kinds of evaluations will occur, when, and who will conduct them.
- Is the project appropriate for an impact evaluation or a performance evaluation? (assess "evaluability")
- Some projects are, by nature, appropriate for impact evaluation, and the USAID Evaluation Policy has guidelines for which kinds of projects should have one
- If an impact evaluation is possible, does the AMEP set baselines and collect data in a way that will support an eventual impact evaluation?
M&E Calendar
- Lays out when the partner will complete the various M&E tasks described earlier
- Should include all expected events in which performance information is to be reported, reviewed, and discussed, and in which important decisions about program strategy and activities are to be made, such as semi-annual or annual performance reviews
- It is useful to show how the calendar fits with the Mission's M&E calendar
Performance Indicator Reference Sheets
Performance Indicator Reference Sheets (PIRS) contain all of the details about an indicator: who, what, why, when, where, how, how much, the baseline, and other aspects. Completing a PIRS is essential, and most M&E staff find it the best way to truly understand whether, and then how, an indicator will work. It is usually a "reality test" and is critical for the DQA.
Speaker note: It is worth showing both slides and talking through a blank PIRS so that people can concentrate on what goes in each field without reading actual text. When and how should these be created/updated? The version shown is just one example of an "official" PIRS; we have several.
Performance Indicator Reference Sheets (continued)
- Should be completed for all indicators in the AMEP; will be longer than one page
- IPs complete the PIRS; clarify the relationship with the Mission PIRS
- There is a revised template tailored for IPs
- Include the USAID definition (where applicable) and a precise definition as described by the IP
- Some aspects may be "TBD" early in the activity
- For "F" and other required M-PMP/PPR indicators, the partner should detail which specific activities fit into the higher-level definition provided (a structured sketch follows below)
Speaker note: Give some examples of why definitions are so important – the DRG indicator uses "addressed"; what does this mean? Lead into the next slide.
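To give a feel for the fields a PIRS typically captures, here is a minimal sketch of a PIRS as a structured record. The field set is a plausible subset for illustration, not USAID's official template, and the sample values are invented.

```python
# A minimal sketch of a PIRS as a structured record. The field set is a
# plausible subset for illustration, not USAID's official template.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PIRS:
    indicator_name: str
    usaid_definition: str          # standard definition, where applicable
    ip_definition: str             # precise definition as applied by the IP
    unit_of_measure: str
    disaggregations: list[str]
    data_source: str
    collection_method: str
    collection_frequency: str
    baseline: Optional[float] = None   # may be "TBD" early in the activity
    lop_target: Optional[float] = None

pirs = PIRS(
    indicator_name="Number of teachers trained in psychosocial support",
    usaid_definition="Teachers in USG-supported programs trained on how to "
                     "support learners' psychosocial well-being",
    ip_definition="Teachers completing the full training module "
                  "(hypothetical IP-level definition)",
    unit_of_measure="people",
    disaggregations=["sex", "nationality", "location"],
    data_source="training attendance sheets",
    collection_method="sign-in records verified by trainers",
    collection_frequency="quarterly",
    baseline=0,   # a project-produced output, so the baseline is zero
)
print(pirs.indicator_name, pirs.baseline)
```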
Performance Data Tables (PDT)
Should include:
- The most recent indicator data available for the time period (Mission, Activity, "F", etc.)
- All required disaggregations
- Quarterly data and cumulative results
- Numerators and denominators for percentages (a short worked example follows below)
Format options: a Word or Excel table included in the body of quarterly reports or as an annex.
Speaker note: The AMEP would contain the blank table with available information, while the activity reports (quarterly, etc.) would contain the actual data. The template will include an example, but there is no set format... yet.
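Here is a minimal sketch of how one row of such a table is built: quarterly values, a running cumulative total, and a percentage reported with its numerator and denominator. All figures are hypothetical illustrations.

```python
# A minimal sketch of a performance data table row: quarterly values, a
# running cumulative total, and a percentage with numerator/denominator.
# All figures are hypothetical illustrations.

quarterly_trained = {"Q1": 80, "Q2": 150, "Q3": 120}   # teachers trained per quarter

cumulative = 0
for quarter, value in quarterly_trained.items():
    cumulative += value
    print(f"{quarter}: {value} this quarter, {cumulative} cumulative")

# Percentages should be reported with numerator and denominator so the reader
# can verify the calculation and aggregate correctly across periods.
numerator = cumulative    # teachers trained to date
denominator = 500         # teachers targeted in assisted schools (hypothetical)
percent = 100 * numerator / denominator
print(f"{numerator}/{denominator} = {percent:.1f}% of targeted teachers trained")
```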
When Should AMEPs Be Updated or Modified?
- When unknown or missing information becomes available (targets, baselines, data sources/methods)
- When the IP work plan changes
- When priorities for highlighting results shift
- When the Mission's RF and indicators change
- When current AMEP indicators aren't working
All changes to the AMEP should be recorded in an AMEP change log (a lightweight sketch follows below).
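One lightweight way to keep such a change log is a simple dated table, sketched here with hypothetical entries and a hypothetical file name.

```python
# A minimal sketch of an AMEP change log written to CSV.
# Entries and the file name are hypothetical illustrations.
from csv import DictWriter

change_log = [
    {"date": "2014-07-01", "section": "Indicator table",
     "change": "Added baseline for IND 1.5", "reason": "Baseline survey completed"},
    {"date": "2014-09-15", "section": "Results framework",
     "change": "Renumbered sub-IRs to match Mission RF", "reason": "Mission PMP update"},
]

with open("amep_change_log.csv", "w", newline="") as f:
    writer = DictWriter(f, fieldnames=["date", "section", "change", "reason"])
    writer.writeheader()
    writer.writerows(change_log)
```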
Potential Next Steps for the MESP-CISLE Partnership
Any questions? Where does CISLE go from here with its AMEP?
Ready for the USAID Data Quality Assessment (DQA)?
USAID's June/July 2014 DQA of CISLE Indicators
The purpose of a DQA is to determine the "extent to which the data... can be trusted to influence management decisions" (ADS). A DQA:
- Is mandated every 3 years for all Mission indicator data reported to Washington, i.e., PPR indicators, not every AMEP indicator
- Helps USAID understand any limitations in the data that may affect how they can be used for activity and portfolio management
- Is standardized across IPs
- Verifies that data meet USAID definitions (M-PMP and/or "F")
- Verifies that data do not have any serious errors
- Identifies steps needed to address issues (where applicable)
USAID's June/July 2014 DQA of CISLE Indicators (continued)
Key documents for the DQA:
- DQA checklist
- PIRS (Mission PMP, "F" handbooks, IP)
- IP M&E tools, processes, tracking sheets, and actual reported data
Key personnel in the DQA:
- IP M&E staff
- USAID COR
- USAID DO team M&E point of contact
- OPM staff (receive, sign off on, and monitor)
USAID's June/July 2014 DQA of CISLE Indicators (continued)
USAID's DQA process:
1. Identify indicators
2. Discuss the utility of each indicator for IP management
3. Discuss the IP's data collection, analysis, reporting, and storage procedures and compare them with USAID requirements where applicable
4. Review relevant documents required to assess and verify the described procedures
5. Complete a detailed checklist
6. Review findings with the partner
7. Agree on recommendations
8. Monitor implementation of the recommendations
Data Quality Assessment (DQA)
F-indicator (Standard Foreign Assistance Indicators): Number of teachers in USG-supported programs trained on how to support learners' psychosocial well-being
"The term 'teachers' refers to both formal and non-formal teaching personnel and can include skilled trainers that provide support to formal and non-formal learning environments. 'Training in psychosocial well-being' refers to building teachers' capacity to provide a healing and supportive environment for learners affected by crisis and conflict."
Data Quality Assessment (DQA) (continued)
CISLE indicators that are close matches:
- IND 1.2, Number of teachers trained in psychosocial and pedagogical needs of displaced refugee children: measures the number of teachers trained in the psychosocial and pedagogical needs of displaced refugee children.
- IND 3.2, Number of trained teachers in psychosocial and interactive pedagogy: measures the number of teachers trained in psychosocial and interactive pedagogy. These trainings will be held in a formal setting and will equip teachers to understand the psychosocial and pedagogical factors necessary for working with refugee children.
Open questions: What about the word "support" in the F definition? What about targets? (IND 3.2 target = 400 trained; IND 1.2 = 3,600; PPR = 2,400 – note that the two CISLE targets sum to 4,000 against a PPR target of 2,400.)
Next Steps for CISLE DQA
CISLE:
- Clarify the definition so it is clear which indicator applies and how it fits
- Complete the PIRS
- Prepare all relevant documents: M&E tools; data entry, tracking, and storage forms; descriptions of relevant M&E processes
USAID:
- Send notification
- Conduct the DQA
- Share findings with the IP
- Send the checklist to OPM