Module 8: Monitoring, evaluation and learning – for increased impact and improvement of the IEA process.


Sessions at a Glance
Session 1: Introduction to Module
Session 2: Definitions and Context
Session 3: Developing a Monitoring and Evaluation Plan
Session 4: Framework, Attributes and Measures
Session 5: Planning a Self-Assessment
Session 6: Improvement Opportunities

Exercise: Previous Experience with Monitoring and Evaluation (30 minutes)
Discuss with your neighbour (10 minutes):
When have you been involved in a monitoring or evaluation process?
What were some of the keys to success?
What were some of the challenges?
Discuss in plenary (20 minutes): share your stories, key elements of success and key challenges.

Monitoring Defined
Monitoring is a planned, systematic process that closely follows a course of activities and compares what is happening with what is expected to happen.
Monitoring the IEA process helps ensure that the assessment meets its goals while working within the scope of allocated resources (time, financial, human, informational and technical).

Evaluation Defined
Evaluation assesses an achievement against preset criteria. Evaluations can have a variety of purposes and follow distinct methodologies (process, outcome, performance, etc.).
Evaluation of the IEA process determines the extent to which achievements (outputs, outcomes and impacts) match the originally intended purpose, and what lessons can be learned for the next environmental assessment and management cycle.
The evaluation of the process is, first and foremost, a capacity-development opportunity.

Compare Monitoring and Evaluation
Monitoring: collecting data; a sense of progress; present time; attention to details; inspires motivation; occurs continuously; requires management skills; asks "What needs to happen now to achieve our goal?"
Evaluation: assessing data; a sense of achievement; past/future time; attention to the bigger picture; inspires creativity; occurs intermittently; requires leadership skills; asks "Have we achieved our goal? Can we do better next time?"

Learning Defined An emotional and/or cognitive transformation taking place during information collection and information processing, bringing about behaviour change or change in the ability to act differently.

Key Questions for Developing an Evaluation Plan
1. What is the purpose of the evaluation?
2. Who will use the evaluation results?
3. Who will do the evaluation?
4. What evaluation framework is practical?
5. What needs to be monitored and evaluated?
6. What are the steps to develop a self-assessment matrix?
7. How can I use the evaluation to enhance a learning culture that keeps improving my IEA process?

Stages of the National IEA Process

GEO as a Reporting Process

Plan-Do-Review Model

Why Monitoring and Evaluation?
Lessons learned from IEAs help improve policy-making processes, policies and the state of the environment.
The IEA process builds capacity for periodic policy revision through monitoring and evaluation.
Capacity is built among both individuals and organizations.

Conceptual Understanding of the National IEA Process, With Links to Ecosystem Health and Human Well-Being

Example: Ozone-related Treaties
The continuous improvement of ozone-related international treaties was driven by research-based evidence.
This involved close monitoring of ozone concentrations in the stratosphere, simulation of possible scenarios, and revision of international agreements.
Monitoring took into account processes, products and impacts.

Example: SoE Monitoring in India
Capacity building and preparation of the SoE report are the main activities of the SoE Monitoring Program.
National Host Institutions (NHIs) and State Host Institutions (SHIs) lead the program.
A two-stage monitoring and evaluation program is in place.

SoE Monitoring in India
Stage 1: NHI performance (evaluated by the Ministry) is indicated by the number of states that have progressed in their SoE programs and published final SoE reports.
Stage 2: SHI performance is measured by the number of SoE reports prepared.
The success of NHIs depends on SHI success, and SHIs depend on NHIs for funding.

Coupling Science and Policy
A GEO-type process facilitates communication between the cycle of scientific data collection and processing and the cycle of policy making.
It can take decades to develop the right set of policies after the first evidence of an environmental issue emerges.

Example: Lake Balaton, Hungary
The first scientific warnings of eutrophication in Lake Balaton were published in the early 1970s.
The policy response unfolded through: unusual scientific observation; scientific debate; acceptance by policy-makers; policy development; policy enforcement; and a long period before the cumulative impact became apparent.

Example: United Kingdom
The UK government has published environmental indicators since 1996.
The 1999 UK sustainable development strategy made significant use of indicators.
Headline indicators are used to communicate progress.
An interdepartmental management board coordinates indicator development.
The indicators have been more successful at communicating progress than at prompting policy review.

Example: Ministry of Environment, Hungary
The ministry has published environmental and sustainable development indicators for over a decade.
Trends are distributed to a wide audience of technical experts, policy-makers and the general public.
Regularly updated information has been a factor in the reports' policy impacts.

Discussion: Additional Examples Do you know of other examples of how research-based evidence can guide policy making and lead to improvements? What are some examples in your country at the national level?

Sessions at a Glance
Session 1: Introduction to Module
Session 2: Definitions and Context
Session 3: Developing a Monitoring and Evaluation Plan
Session 4: Framework, Attributes and Measures
Session 5: Planning a Self-Assessment
Session 6: Improvement Opportunities

Foundation Steps for Effective Monitoring and Evaluation
Identifying your purpose.
Identifying your primary users.
Deciding whether internal or external evaluators best serve your purpose.

1. Types of Purpose
Judgment: sets clear criteria and standards to judge performance; can increase the credibility of the GEO process.
Improvement: open-ended evaluation that measures change over time; often applied to cyclical activities, like the GEO process.
Knowledge-creation: identifies emerging knowledge and insights; increases the saliency of the GEO process.

2. Users of a GEO-type Evaluation
People who:
can revise the GEO process (they have the mandate, knowledge and skills); and
want to revise the GEO process (they have a vested interest in influencing the design and implementation of the GEO process).

Primary Users
The primary users of the evaluation may include:
the IEA core team (which includes policy-makers);
policy- and decision-makers who are primary users of the IEA; and
the evaluation team (internal and/or external).

Exercise: Purpose for Monitoring and Evaluation
In small groups, discuss the following:
1. Why do you need a plan for monitoring and evaluating your IEA process and its impact at the beginning of your planning process?
2. Why is improvement-oriented evaluation relevant to your IEA process?
3. Using the information above, what would you state as the purpose of your evaluation?

Exercise: Continued
4. Who are the potential users? What are their interests in using the evaluation results? Do they have a mandate to revise the GEO process?
Group discussion (30 minutes), plenary presentations (15 minutes).

3. Evaluators
Evaluators may include:
a small internal evaluation task force that includes the IEA core team (recommended);
external evaluators (consultants or the internal evaluators of another IEA); or
a combination of internal and external parties.

Plenary Discussion: Identifying Evaluators (40 minutes)
In small groups, for 20 minutes:
brainstorm key criteria for selecting evaluators; and
identify types of evaluators and possible names.
In plenary, discuss your results (20 minutes).

Plenary Discussion Continued: What other parameters should be considered for a monitoring and evaluation plan to succeed in my context?

Sessions at a Glance
Session 1: Introduction to Module
Session 2: Definitions and Context
Session 3: Steps to Developing a Monitoring and Evaluation Plan
Session 4: Framework, Attributes and Measures
Session 5: Planning a Self-Assessment
Session 6: Improvement Opportunities

Corresponding Attributes of Effective Evaluations

Conceptual Understanding of the National IEA Process, With Links to Ecosystem Health and Human Well-Being

Framework for Monitoring and Evaluating the National IEA Process

Monitoring the Timely Completion of Key Activities and Outputs

Possible Measures for Effective Knowledge Management

Discussion: Effective Knowledge Management (30 minutes)
In small groups, consider the following (15 minutes):
1. Can you think of additional important measures of effective knowledge management?
2. What do you think are reasonable targets for the measures you identified?

Possible Measures for Effective Opportunity Management

Discussion: Effective Opportunity Management (30 minutes)
In small groups, consider the following (15 minutes):
1. Can you think of important measures of effective opportunity management not included in the table above?
2. What do you think are reasonable targets for the measures you identified?

Measures for Effective Relationship Management: Identifying Relationships Based on Behaviour
Receiving (individual, organization): reports, e-mails, listserv.
Seeking (individual, organization): target users seek new information.
Acting upon (individual, organization, institution): technical expertise is sought to revise policies.
Demanding (individual, organization, institution): specific needs, such as monitoring data for the next IEA cycle.

Possible Measures for Effective Relationship Management

Discussion: Measuring Relationship Management (30 minutes)
In small groups, consider the following (15 minutes):
1. Can you think of any other important measures of effective relationship management?
2. What reasonable targets would you recommend for the various measures?

Measures for Improvements in Policy and Policy Making Measurement should relate to the change statement you identified in your impact strategy. Measurement should also track other observed improvements in policies and policy processes.

Sessions at a Glance
Session 1: Introduction to Module
Session 2: Definitions and Context
Session 3: Steps to Developing a Monitoring and Evaluation Plan
Session 4: Framework, Attributes and Measures
Session 5: Planning a Self-Assessment
Session 6: Improvement Opportunities

Planning a Self-Assessment
Step 1: Identify major issues and monitoring questions, and develop specific measures.
Step 2: Identify sources of data and data collection methods.
Step 3: Set priorities and frequency of monitoring.

Step 1: Identify Major Issues and Monitoring Questions, and Develop Specific Measures
[Framework diagram. Outcomes: your change statement; effective relationship management. Activities and outputs: effective knowledge management; effective opportunity management; timely completion of activities and outputs.]

Outcome-based Measures: Possible Organization of Your Self-Assessment Matrix
Columns: Key Issues/Questions; Specific Measures and Targets; Data Source; Data Collection Method.
Row: Your Change Statement. Have the desired improvements in policies and policy processes that you identified in your impact strategy been realized? What other improvements in policies and policy processes have you observed during and following your national IEA process?
Row: Effective Relationship Management. What changes in the thinking and actions of policy-makers and decision-makers (and other important relationships) have you observed? (See Table 4 for guidance.)

Activity and Output-based Measures: Possible Organization of Your Self-Assessment Matrix
Columns: Stage of the IEA; Key Issues/Questions; Specific Measures and Targets; Data Source; Data Collection Method.
Rows cover each stage: Stage 1 Inception, Stage 2 Institutional Setup, Stage 3 Scoping and Design, Stage 4 Planning, Stage 5 Implementation of IEA, Stage 6 Communication and Outreach, Stage 7 Evaluation.
For each stage, track the timely completion of activities and outputs (see Table 7 for guidance) and effective knowledge and opportunity management (see Tables 5 and 6 for guidance).
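A minimal sketch, assuming Python 3.7+ and only its standard library, of how the rows of such a self-assessment matrix could be captured as structured records. The class names, fields and the example row are illustrative assumptions, not terminology from the GEO Resource Book.

```python
# Illustrative sketch only: class and field names are assumptions, not GEO terminology.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class MatrixRow:
    """One row of the self-assessment matrix."""
    issue_or_question: str            # key issue or monitoring question
    measure_and_target: str           # specific measure and its target
    data_source: str                  # where the evidence will come from
    collection_method: str            # how the evidence will be gathered
    iea_stage: Optional[int] = None   # 1-7 for activity/output rows; None for outcome rows


@dataclass
class SelfAssessmentMatrix:
    rows: List[MatrixRow] = field(default_factory=list)

    def for_stage(self, stage: int) -> List[MatrixRow]:
        """Return the activity/output rows planned for a given IEA stage."""
        return [row for row in self.rows if row.iea_stage == stage]


# Example outcome-based row tied to the change statement (hypothetical values).
matrix = SelfAssessmentMatrix()
matrix.rows.append(MatrixRow(
    issue_or_question="Have the desired improvements in policies and policy processes been realized?",
    measure_and_target="Number of policy revisions that cite the IEA (target: at least two)",
    data_source="Official gazettes and ministry websites",
    collection_method="Document review every six months",
))
```

Keeping outcome-based and activity/output-based rows in one structure, distinguished by the optional stage field, mirrors the two matrix layouts described above.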

Step 2: Identify Sources of Data and Data Collection Methods

Collecting Data for Monitoring Effective Relationship Management
Requires that changes in behaviour be identified and mapped: these incremental changes will lead towards the decisions or changes you are seeking.
Can be a time-intensive process, so it is important to set up simple ways to monitor your strategy against those measures.
A small contacts database with a journaling function can be set up (see next slide).

Collecting Data for Monitoring Effective Relationship Management
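The Resource Book does not prescribe a particular tool for this; as one option, here is a minimal sketch of a contacts journal using Python's built-in sqlite3 module. The table and column names, and the reuse of the behaviour categories from the relationship-management table above, are illustrative assumptions.

```python
# Illustrative sketch only: database, table and column names are assumptions.
import sqlite3
from datetime import date

conn = sqlite3.connect("iea_contacts.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS contacts (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    role TEXT                     -- e.g., policy-maker, technical expert
);
CREATE TABLE IF NOT EXISTS journal (
    id         INTEGER PRIMARY KEY,
    contact_id INTEGER NOT NULL REFERENCES contacts(id),
    entry_date TEXT NOT NULL,
    behaviour  TEXT NOT NULL,     -- receiving, seeking, acting upon, demanding
    note       TEXT
);
""")


def log_interaction(contact_id, behaviour, note):
    """Record one observed behaviour for a contact, dated today."""
    conn.execute(
        "INSERT INTO journal (contact_id, entry_date, behaviour, note) VALUES (?, ?, ?, ?)",
        (contact_id, date.today().isoformat(), behaviour, note),
    )
    conn.commit()


# Example: a ministry contact requests monitoring data for the next IEA cycle.
conn.execute("INSERT OR IGNORE INTO contacts (id, name, role) VALUES (1, 'Ministry focal point', 'policy-maker')")
log_interaction(1, "demanding", "Requested monitoring data for the next IEA cycle.")
```

A simple journal like this keeps the evidence of incremental behaviour change in one place, so it can feed directly into the relationship-management rows of the self-assessment matrix.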

Step 3: Set Priorities and Frequency of Monitoring and Evaluation
Ensure that critical indicators are monitored.
Establish the frequency of monitoring for each indicator.
Process indicators are needed throughout the IEA.
Progress indicators are needed less frequently, but also long after the IEA is complete.
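As a small illustration of recording a monitoring frequency per indicator, the sketch below (hypothetical indicator names and intervals, assuming Python) lists which indicators are due for checking.

```python
# Illustrative sketch only: indicator names and intervals are hypothetical.
from datetime import date, timedelta

# Frequency in days: process indicators are checked often during the IEA,
# progress indicators less frequently but long after the IEA is complete.
INDICATORS = {
    "Stage outputs delivered on schedule": {"type": "process", "every_days": 30},
    "Policy revisions citing the IEA": {"type": "progress", "every_days": 365},
}


def due_indicators(last_checked, today=None):
    """Return the indicators whose monitoring interval has elapsed."""
    today = today or date.today()
    due = []
    for name, spec in INDICATORS.items():
        last = last_checked.get(name)
        if last is None or today - last >= timedelta(days=spec["every_days"]):
            due.append(name)
    return due


# Example: nothing has been checked yet, so every indicator is due.
print(due_indicators({}))
```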

Exercise: Preparing a Self-Assessment
Form working groups of 4–5 persons and complete Steps 1–3 for preparing a self-assessment matrix.
Step 1: Identify major issues and monitoring questions, and develop specific measures. For activity and output-based measures, assign different stages to different groups and compile in plenary.
Step 2: Identify sources of data and data collection methods.
Step 3: Set priorities and frequency of monitoring.
Plenary session to compile and discuss the results.

Sessions at a Glance
Session 1: Introduction to Module
Session 2: Definitions and Context
Session 3: Steps to Developing a Monitoring and Evaluation Plan
Session 4: Framework, Attributes and Measures
Session 5: Planning a Self-Assessment
Session 6: Improvement Opportunities

Improvement Opportunities Thinking of an IEA as a capacity-building process significantly increases its impact. The more that monitoring and evaluation is treated as an organizational learning opportunity (versus a value judgment), the more effectively the IEA supports improvements in policy making.

Exercise: Learning In pairs, for 10 minutes: Think of the last time you learned something new. What was it, how did you learn it and what did you do differently as a result?

What is learning?
Learning is more than knowledge creation.
Learning is demonstrated by behaviour change.
Information processing, in addition to information collection, is of paramount importance.

Learning Cycle

Exercise: Learning
A. Write what comes to mind based on the following four questions (5 minutes):
1. What did you hear during the GEO module training course (e.g., Stage 1) that you already knew?
2. What new information and insight did you gain?
3. How are you going to use this new insight?
4. How else and when could you use this new information? How could you improve policy making with this new insight?

Exercise: Learning
Discuss your findings with your neighbours (5 minutes).
In plenary, discuss the insights you have gained from this exercise. How did the group discussion help you to recognize improvement opportunities for the national IEA process and ways to have impact, such as causing changes in policy making? (15 minutes)

A Few Prerequisites for Learning
Motivation, which is often the urgency to solve a problem or to act with the support of new knowledge.
Trust to discuss values, assumptions and ideas without repercussions.
A mandate and opportunity to apply the new knowledge.
A shared understanding of the importance of learning (not only what to learn but also how to learn).

Learning Cycle with GEO-Specific Questions

Organizational Learning Cycle

The Role of Learning in Improving the IEA
Formulate your change statement (Module 3).
Identify measures for your change statement and other supporting measures for key outcomes and activities/outputs (Module 8, Tables 8 and 9).
Examine performance against the desired changes and summarize the results.
Formulate recommendations and lessons learned.
Feed recommendations and lessons learned back into the next planning cycle and other decisions.

How Can We Use Learning Opportunities? – The Monitoring Meeting
Learning opportunities naturally present themselves at the beginning and end of each IEA stage and each IEA cycle. These are the times when you need to reflect and articulate lessons learned to improve the next course of action.
Given the limited time available, we suggest that your core IEA team organize regular, mid-stage and/or stage-end monitoring and evaluation meetings to serve two purposes:
1. Monitor progress and capture lessons learned to improve the next IEA stage and the next IEA cycle.
2. Cultivate a learning, improvement-oriented approach throughout the whole IEA process.

Questions for Monitoring Meetings
1. What were the most revealing lessons for you in this stage?
2. How can you use this knowledge and attitude to improve the next stage's process, outputs or targeted changes?
3. Have you achieved what you planned for in this stage? If yes, what factors helped? If not, what factors hindered, and how can you reach the desired goal?
4. Are there any unexpected results, emerging phenomena, trends or questions you want to discuss?
5. How did this stage contribute to the perception of the saliency, legitimacy and credibility of the national IEA process?

Suggestions for General and Stage-Specific Questions and Exercises for Monitoring Progress and Promoting Learning at Specific GEO Stages
General questions for monitoring meetings (applicable at every stage):
(1) What were the most revealing lessons for you in this stage?
(2) How can you use this knowledge and attitude to improve the next stage's process, outputs or targeted changes?
(3) Have you achieved what you planned for in this stage? If yes, what factors helped? If not, what factors hindered, and how can you reach the desired goal?
(4) Are there any unexpected results, emerging phenomena, trends or questions you want to discuss?
(5) How did this stage contribute to the perception of the saliency, legitimacy and credibility of the national IEA process?
Prevailing learning conditions and stage-specific activities (see Box 7), by GEO stage:
Stage 1, Inception.
Stage 2, Institutional setup. Condition: mandate. Activity: articulate and confirm the mandate.
Stage 3, Scoping and design. Condition: trust. Activities: nurture trust; force-field analysis.
Stage 4, Planning. Condition: information processing. Activity: create opportunities for collective information processing.
Stage 5, Implementation. Condition: shared understanding of learning. Activities: remind participants of how they learn best; keep, start, stop doing.
Stage 6, Communication. Condition: motivation. Activities: revisit motivation; carousel discussion.
Stage 7, Evaluation. Condition: application. Activities: focus on lessons learned for application; Samoan circle.

Exercise: Designing a Monitoring Meeting
In groups of 4 or 5, prepare an agenda for a half-day monitoring meeting for any stage of your IEA process (30 minutes). Address the following:
What specific progress measures will be part of your meeting?
What learning opportunities might be identified for the stage of the IEA process you have selected?
In plenary, discuss the results from two groups (15 minutes).