Designing for Results: Activity Results Framework
Orientation, March 2016
What are Activity Results Frameworks? What are they for?
Source: Julie Smith
Purpose of session
Improve understanding of what good Activity Results Frameworks look like, i.e. how they support:
Activity Design: agreed logic, intended results, and the approach for monitoring and evaluation
Activity Implementation/Management: learning, improvement, decision-making, accountability and communications, and 'in-flight' adjustments to ensure relevance and effectiveness in complex and dynamic contexts
3 Components of Activity Results Frameworks
1. Results Diagram
2. Results Measurement Table
3. Monitoring and Evaluation Work Plan
1. RESULTS DIAGRAM
Purpose of a results diagram:
Sets out the underlying logic or theory of change and informs the design of outputs
Shows how planned outputs are expected to contribute to desired outcomes
Is based on good problem analysis (causes and consequences)
Determines the performance information needed from monitoring and evaluation
Results chain: INPUT → OUTPUT → OUTCOME
INPUT: resources required (money, people, equipment)
OUTPUT: something being produced or delivered (services, capital goods and products); can be purchased
OUTCOME: the change or improvement that occurs as a result, e.g. improved knowledge, practices, livelihoods; several levels: short, medium and long-term; cannot be purchased
Results Chain: example (opening up the ‘black box’)
Long-term (ultimate) outcome (changes in state: social, economic, environmental, addressing the problem or need): Increased income and resilience of smallholder farmer households in South and South-West Tanna (subject to multiple influences)
Medium-term (intermediate) outcomes (e.g. changes in behaviour): Improved productivity of smallholder farmers through improved production, processing and business management; smallholder farmers adopt improved coffee and other crop production and processing techniques
Short-term (immediate) outcomes (e.g. changes in skills, awareness, knowledge, attitudes): Smallholder farmers have enhanced knowledge and understanding of improved coffee growing and processing techniques
Outputs (services and products delivered by the Activity): Smallholder farmers received training in improved coffee growing practices and processing techniques (delivered by the training provider)
Influences outside the Activity: external environmental, political, social and economic factors; other actors such as farmers and other implementing partners
(Speaker notes: Opening up the black box means examining why the intervention is or is not working. If we train farmers and production and income increase, is that due to the training in improved farming practices, or are there other possible explanations unrelated to the Activity? If production and income did not increase, why not: did the Activity (the training) fail, or were other external factors at play? Attribution/contribution: control and attribution are strongest at the input and output levels and reduce at higher levels. When is short-term, medium-term and long-term? It depends on each Activity, its design and the 'pitch' of its results. General guidance: outputs are services/products delivered during implementation; short-term outcomes are improvements progressed or achieved during implementation; medium-term outcomes are improvements progressed or achieved during or after Activity completion; long-term outcomes are improvements achieved after Activity completion.)
EXAMPLE 1
Goal: Improved livelihoods through increased protection of the environment
Outputs: communities trained in Landcare, gardening, livestock and plantations; raised garden beds built; livestock management techniques implemented; fruit and stability trees planted
Short-term outcomes: communities have increased understanding of land, crop and livestock management; communities have increased understanding of land and coastal protection; all households have raised vegetable beds with newly planted crops; tree planting initiatives are established successfully
Medium-term outcomes: increased food security for community members and crops; improved animal management and increased production of eggs, poultry meat etc.; livestock is protected from predation (e.g. poultry) and contained to prevent damage to crops (e.g. pigs); fruit trees provide sources of food; tree and plant re-growth on previously unutilised or damaged soil
Long-term outcome(s): increased protection of the immediate environment, including coastlines (where applicable) and within villages
(Speaker note: some results diagrams are less tidy than others.)
EXAMPLE 2
There are other ways to depict a results diagram, for example with a key headline for each level of result. Acknowledge the white space: the logic is not just about what's in the boxes.
Process for developing Results Diagrams
SOME TIPS
Workshop the diagram in collaboration with key stakeholders
It usually takes some back and forth; expect more than a single meeting
Test the logic: moving up the chain, ask WHY; moving down the chain, ask HOW
For example: if we successfully provide teacher training, then the effect will be increased knowledge of students, because teacher knowledge of learning techniques is important to student learning
Focus on logic, not measurability, at this stage
Aim for shared understanding of the logic with key stakeholders before developing indicators, targets, etc.
Group Exercise 1 (20 minutes)
Review a Results Diagram for an Activity (redraw if needed):
Do we have adequate understanding of the problem, its causes, and consequences?
Is the intervention logic clear? Are there gaps in the logic (leaps of faith)?
Consider assumptions and risks:
Within scope: address in design?
Outside scope: monitor and mitigate?
Complexity and predictability of outcomes?
Simple? Complicated? Complex? How complex and predictable are the intended outcomes? Are risks evolving or changing? Are there opportunities?
(Speaker note: adaptive management is not new; it was used in the 1970s. The shift back to project modalities has seen a resurgence of interest, combined with IT and the private sector developing new models with applicability to the complex context of development.)
What does it mean for how we plan, monitor, evaluate and learn?
How do we know adaptive management is appropriate? It depends on how confident we are in the solution and how stable the context is:
Confident in the solution, stable context: traditional planning, project management, monitoring and evaluation
Confident in the solution, unstable context: adaptive design and management, with monitoring, evaluation and learning supporting fast feedback and response
Not confident in the solution, stable context: adaptive design and management, with monitoring, evaluation and learning supporting experimentation and iteration
Not confident in the solution, unstable context: move to another quadrant! Positive deviance?
Programme logic and influences (schematic): outputs (A, B, C, ?, D, ?) are expected to lead to short-term outcomes (E, F, G?, H?), medium-term outcomes (X, U, V, ?, ?) and long-term outcomes (Z, Y), within a context of political, social, economic and environmental influences and with several unknowns (?).
The results diagram is our 'best guess' of how outcomes will be achieved: honesty and humility? Test and refine it with in-country stakeholders during concept, design and inception. Regularly review it based on evidence: Are the causal links and assumptions still valid? Are the outputs, indicators, and targets still appropriate?
Key questions for Activity management and Performance Assessment should drive M&E
Question: What are the MFAT Activity Quality Criteria and how do they relate to Results and Monitoring and Evaluation?
Key criteria from MFAT Activity Quality Policy
Relevance: are we doing the right things? Have there been any changes to the context that affect the relevance of the outputs?
Effectiveness:
a) Outputs: quantity, quality, relevance, timeliness, coverage?
b) Outcomes: What difference is the Activity making? Which aspects are working better, for whom, and why? Were there unintended consequences? What key factors helped or hindered achievement? What lessons were learned? What improvements can be made?
Key criteria from Activity Quality Policy (cont'd)
c) Cross-cutting issues: gender equality and women's empowerment, human rights, environment and climate change
4. Efficiency: is this Activity being managed and delivered cost-effectively? Is there harmonisation/coordination to avoid duplication or maximise synergies?
5. Impact: to what extent is the Activity contributing to long-term outcomes (often beyond the life of most Activities)?
6. Sustainability: how likely is it that benefits will continue beyond MFAT funding?
2. RESULTS MEASUREMENT TABLE
Purpose: sets out how the results identified in the results diagram will be assessed.
Suggested improvements in response to feedback, to enhance the ability to address the performance questions in the DAC criteria (relevance, effectiveness, efficiency, impact and sustainability):
Choose indicators that help assess these questions
Supplement quantitative indicators (what and how many) with qualitative information to address evaluative questions (why, how, so what)
Separate baselines from targets for clarity
Separate data source from frequency of measurement
Example Results Measurement Table
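The example table in the original slides is an image. As a purely illustrative sketch, not MFAT's actual template, one way to capture the structure described above is shown below; the field names are hypothetical and the row loosely reuses the coffee-farming example from earlier slides.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ResultMeasure:
    """One row of a results measurement table (illustrative structure only)."""
    result_level: str        # e.g. "Output", "Short-term outcome"
    result_statement: str    # the result as worded in the results diagram
    indicator: str           # clear, neutral indicator
    baseline: Optional[str]  # situation prior to the intervention, if known
    targets: dict            # time-bound targets, e.g. {"Year 1": "...", "Year 2": "..."}
    data_source: str         # where the evidence will come from
    frequency: str           # how often it will be measured


# A hypothetical row, loosely based on the coffee-farming example earlier in the slides
row = ResultMeasure(
    result_level="Output",
    result_statement=("Smallholder farmers received training in improved coffee "
                      "growing practices and processing techniques"),
    indicator="Number of farmers trained, disaggregated by sex",
    baseline="0 (no training delivered before the Activity)",
    targets={"Year 1": "150 farmers trained (80 women and 70 men)"},
    data_source="Training provider attendance records",
    frequency="After each training round",
)
print(row.indicator)
```

Keeping baseline, targets, data source and frequency as separate fields mirrors the suggested improvements listed above.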
Indicators
Clear and neutral: e.g. number of training participants; quality of water
Give a balanced view of performance (quantity and quality)
Concise and meaningful: count what counts!
Use existing indicators where relevant and available (especially for long-term outcome indicators)
Where new data collection is needed, include it in the M&E workplan
Baselines
Baselines describe the situation prior to the intervention, enabling us to measure change
Where not already available, the baseline may need to be measured early in implementation; note this in the M&E workplan
Good baseline data enables us to set realistic targets
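As a simple illustration with hypothetical figures: if a baseline survey finds that 40% of households are food secure and a later survey finds 65%, the measured change is 25 percentage points; without the baseline, the 65% figure on its own says nothing about whether the Activity has made a difference.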
Targets
The quantity or quality intended to be achieved by a stated time period (what success, or achieving the result, looks like)
Targets are always time-bound: Year 1, Year 2, Year 3, total
Often determined subjectively, so it is important to have full agreement with partners
How good is the target? Look at the baseline, similar Activities and data trends, or seek the opinions of partners and/or stakeholders
Targets should be realistic and achievable; regular review is required to ensure they remain so
(Speaker note: example from Fisheries and SBEC of quantity output targets that failed to take quality into account, which ended up working against effectiveness at the outcome level.)
Targets (cont'd)
Hard targets and soft targets: recognising that we have greater control over results at the output and short-term outcome level (compared to medium- and long-term results), it is important to treat targets accordingly.
Hard (or specific) targets should be agreed for outputs and short-term outcomes, for example: 90% of water samples; 150 people trained (80 women and 70 men). A simple progress calculation against a hard target is sketched below.
Soft (or directional) targets are usually more appropriate for medium- and long-term outcomes, for example: improving educational achievement; decreasing rates of disease.
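Because hard targets are numeric and time-bound, progress against them can be tracked mechanically. A minimal sketch, using the hypothetical training target above:

```python
def target_progress(actual: float, target: float) -> float:
    """Return the share of a hard (numeric) target achieved so far."""
    if target <= 0:
        raise ValueError("target must be a positive number")
    return actual / target


# Hypothetical: 120 of the 150 people targeted for training have been trained so far
print(f"{target_progress(120, 150):.0%}")  # prints 80%
```

Soft targets, by contrast, usually need qualitative evidence and judgement rather than a single ratio.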
Exercise 2 (20 minutes)
Discuss the quality of the Results Measurement Table of your Activity
Start with outputs, then short-term outcomes (the main focus of partner progress assessment and reporting)
Look for opportunities to align with SRF Direct Indicators where relevant
Tips:
Indicators must be neutral
Balance indicator focus between quality and quantity
Use a consistent unit of measurement
3. MONITORING AND EVALUATION WORKPLAN
Purpose: the implementation plan for M&E of an Activity, covering key tasks, responsibilities, timeframes and resources (estimated % of budget)
Baseline data, processes and tools:
Strengthening or creating monitoring systems, processes and capacity
Baseline data collection
Monitoring, review, learning and reporting:
Regular data collection and monitoring visits
Regular results and learning discussions (including review of the Results Framework) and reporting: review progress, issues and lessons based on quantitative and qualitative evidence; this informs the workplan and any adjustments needed to ensure relevance and effectiveness in a complex and dynamic aid context
Regular reporting
(Speaker notes: introduce the M&E workplan first so that people think about the M&E system more holistically and understand its purpose before selecting indicators. Rule of thumb: 4-6% of the Activity budget for M&E, depending on the nature and complexity of the Activity; make sure this covers monitoring, not just an independent evaluation at the end.)
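As a rough worked example with hypothetical figures: applying the 4-6% rule of thumb to a NZD 5 million Activity implies roughly NZD 200,000-300,000 for monitoring, evaluation and learning over the life of the Activity, to be spread across baseline data collection, ongoing monitoring and any evaluations rather than held back for a single end-of-Activity evaluation.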
Monitoring and Evaluation Workplan (cont’d)
Evaluation: dependent on the size and needs of the Activity
What is and is not working well, why, and what improvements can be made?
Formative or process evaluations: early in the life of an Activity, aiming to inform and improve implementation
Summative or outcome evaluations: at the end of an Activity (or later), aiming to make a judgement about the value of the Activity and identify lessons learned for other Activities
Evaluations may be led by the Partner or by MFAT; an independent evaluation commissioned by MFAT is mandatory for Activities over $10 million.
Example of M&E workplan template
Conclusion: Activity Results Frameworks are not…
An admin reporting burden: they pose relevant, critical questions for managing the Activity, informing decisions, and communicating results on relevance, effectiveness, efficiency and sustainability
Over-complicated and slow: they support adaptive management of complex Activities in dynamic contexts (risks and opportunities) through frequent, timely feedback and formative evaluation. Baking a cake, building a rocket, or raising a child? The predictability of outcomes and risks determines the level of monitoring and evaluation
Rigid: Activity Results Frameworks (results diagram/theory of change, outputs, indicators, targets) are regularly reviewed and updated to ensure relevance, and used for progress assessment and reporting
About counting widgets: count what counts, not what can be easily counted. Quality, not just quantity. Include qualitative evidence (including the informed perspectives of key stakeholders)