Evaluating management effectiveness – what have we learned?


Some collected wisdom from a couple of hundred years of 'learning experiences' (mistakes/year???)

All the steps need to be right for a good result!

- Effective communication and team-building
- Support from agency and stakeholders
- Good preparation and planning
- Right questions and indicators
- Good execution
- Good analysis
- Plenty of follow-up
- Consistent framework
- Sound methods

Effective communication and team-building
- Involve stakeholders in all stages, from design to reporting
- Keep good communication all through the project
- Work as a team – try to avoid an 'us and them' evaluation, which is threatening and resented

Support from agency and stakeholders
- Evaluation should be part of an effective management cycle
- A culture that encourages reflection – learning from mistakes and successes
- Built in to business plans, legislation and reporting requirements

Consistent framework

Good preparation and planning

A clear purpose, scale, scope and objectives are needed – time taken here saves a lot of wasted time later! Evaluations range from system-wide (or even across many countries) to very local.

It is critical that management goals and objectives for the protected area or project being evaluated have been spelt out clearly. It is important at the beginning to know exactly what you expect to achieve, and to understand what levels of resourcing and support you can expect. All parties need to agree on these expectations.

Think about…

We need to understand the links between the elements or criteria being evaluated so we can interpret the results of the evaluation. The assumptions being made need to be spelled out clearly. For many evaluations, especially of specific interventions or projects, a concept model is a very useful tool!

Example concept model for a marine park project:
CONTEXT: Marine park with very high biodiversity values; poor local community dependent on marine resources.
PLANNING: Goal – to restore reef biodiversity and enhance community well-being. Objectives – to stop reef bombing; to establish an alternative income source for the local community.
INPUT: Funds from donor organisation; expertise from external scientists; involvement of local community.
PROCESS: Development of an aquaculture program to provide alternative income sources.
OUTPUT: Successful and self-sufficient fish farm established with no negative environmental effects.
OUTCOMES: Income security established for the community; no further reef bombing; conservation of the reef system.
Assumptions linking the stages: the community supports the project; donor funds will continue until the project becomes self-sustaining; the community adopts the program; environmental conditions remain relatively stable; people will not seek further illegal income if they have a basic income from the fish farm; community stewardship level is high; international laws protecting the reef can be enforced (no outside fishing vessels).
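
The logic of a concept model like this can also be written down as a simple data structure, which keeps the assumptions explicit alongside each stage. The Python sketch below is illustrative only: the Stage class and its fields are hypothetical, and the placement of assumptions against particular stages is an assumption of this example, not something specified by the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    """One element of the concept model (context, planning, input, ...)."""
    name: str
    description: str
    assumptions: list = field(default_factory=list)  # assumptions carried at this stage (illustrative placement)

# A minimal sketch of the reef fish-farm example above.
reef_model = [
    Stage("CONTEXT", "Marine park with very high biodiversity values; poor community dependent on marine resources"),
    Stage("PLANNING", "Goal: restore reef biodiversity and enhance community well-being",
          ["Community supports the project", "Donor funds continue until the project is self-sustaining"]),
    Stage("INPUT", "Donor funds, external scientific expertise, local community involvement",
          ["Community adopts the program", "Environmental conditions remain relatively stable"]),
    Stage("PROCESS", "Aquaculture program providing alternative income sources"),
    Stage("OUTPUT", "Self-sufficient fish farm with no negative environmental effects",
          ["People will not seek illegal income once they have a basic income",
           "International laws protecting the reef can be enforced"]),
    Stage("OUTCOMES", "Income security for the community; no further reef bombing; reef conservation"),
]

# Listing the assumptions stage by stage shows what the evaluation needs to test,
# not just what it needs to measure.
for stage in reef_model:
    print(stage.name, "-", stage.description)
    for a in stage.assumptions:
        print("   assumes:", a)
```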

Sound methods
The methodology needs to suit the purpose.

- Methodologies should be compatible or 'harmonised' as much as possible
- The design of the methodology needs to consider how the initial phase will relate to later phases of evaluation
- Some flexibility is good – we can improve as we learn and as things change
- We should learn from others and use or adapt existing methodologies if possible

Tools need to be appropriate and responsive to needs… but generally:
- Cost-effective
- Replicable
- Robust and statistically valid
- Simple
- Field-tested
- Documented
- Credible
- Good explanatory power
- Scaleable
- Rapid
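
One low-tech way to apply a criteria list like this is to score candidate tools against it before choosing one. The sketch below is a hypothetical illustration in Python: the tool names, the 0–2 scores and the equal weighting are all invented for the example, not recommendations from the presentation.

```python
# Hypothetical comparison of candidate evaluation tools against the criteria above.
# Scores are 0-2 per criterion (0 = poor, 2 = good); equal weighting is an assumption.
CRITERIA = ["cost-effective", "replicable", "robust", "simple", "field-tested",
            "documented", "credible", "explanatory power", "scaleable", "rapid"]

candidate_tools = {
    "Rapid scorecard": [2, 1, 0, 2, 2, 1, 1, 0, 2, 2],
    "In-depth assessment": [0, 2, 2, 0, 1, 2, 2, 2, 1, 0],
}

for tool, scores in candidate_tools.items():
    assert len(scores) == len(CRITERIA)
    # The lowest-scoring criterion flags the trade-off being made by choosing this tool.
    weakest_score, weakest_criterion = min(zip(scores, CRITERIA))
    print(f"{tool}: total {sum(scores)}/{2 * len(CRITERIA)}, weakest on '{weakest_criterion}'")
```

A total score matters less than seeing where each tool falls short; the point of the criteria is to make the trade-offs explicit.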

Right questions and indicators
Asking the right questions makes evaluation much easier!

Different layers of questions look at each dimension. Questions should proceed logically from very general questions to specific and measurable indicators. A good project or park plan makes this step much easier!

Moving from the general and broad to the specific and narrow:
Objective: to assess biodiversity conservation on the park.
Question: Is the biodiversity of the park being conserved or lost? Are the populations of all endangered species on the park stable? (Assumption: endangered species are a good proxy for biodiversity generally.)
Indicator: a particular endangered species. (Assumption: this species is a good indicator for endangered species.)
Question: Is this species declining? Over the past two years, has it been seen less frequently than in the past? Is there scientific or anecdotal evidence that the populations are declining? (Assumption: anecdotal evidence and past research will give results of sufficient accuracy.)
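
The same cascade can be recorded explicitly, so that the assumption made at each narrowing step is carried through to the analysis. This is an illustrative Python sketch only; the nested structure and the field names are hypothetical choices, not part of any evaluation framework.

```python
# Hypothetical record of the cascade from broad objective to measurable indicator,
# keeping the assumption made at each narrowing step (as in the example above).
hierarchy = {
    "objective": "Assess biodiversity conservation on the park",
    "question": "Is the biodiversity of the park being conserved or lost?",
    "assumption": "Endangered species are a good proxy for biodiversity generally",
    "narrower": {
        "question": "Are populations of endangered species on the park stable?",
        "assumption": "This species is a good indicator for endangered species",
        "narrower": {
            "indicator": "Sighting frequency of a particular endangered species over the past two years",
            "assumption": "Anecdotal evidence and past research give results of sufficient accuracy",
        },
    },
}

def walk(level, depth=0):
    """Print each level of the cascade together with the assumption it relies on."""
    for key in ("objective", "question", "indicator"):
        if key in level:
            print("  " * depth + f"{key}: {level[key]}")
    if "assumption" in level:
        print("  " * depth + f"  (assumes: {level['assumption']})")
    if "narrower" in level:
        walk(level["narrower"], depth + 1)

walk(hierarchy)
```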

Choose good indicators. Indicators should have some explanatory power, or be able to link with other indicators to explain causes and effects. Some characteristics of good indicators are:

- Measurable: able to be recorded and analysed in qualitative or quantitative terms;
- Precise: defined in the same way by all people;
- Consistent: not changing over time, so that it always measures the same thing; and
- Sensitive: changing proportionately in response to actual changes in the condition or item being measured.
(Margoluis and Salafsky 1998, p. 88)

Good indicators should also be:
- biologically relevant (reflect target health);
- socially relevant (recognized by stakeholders);
- sensitive to human-caused stress (reflect threats);
- anticipatory (early warning);
- measurable; and
- cost-effective (maximum information per unit of effort).
(TNC 2002)
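
Criteria lists like these can double as a quick screening checklist when candidate indicators are proposed. The sketch below is a hypothetical illustration: the Indicator class, the particular criteria chosen as fields and the example candidate are invented for this example, not taken from Margoluis and Salafsky or TNC.

```python
from dataclasses import dataclass, fields

@dataclass
class Indicator:
    """A candidate indicator screened against a subset of the criteria listed above."""
    name: str
    measurable: bool
    precise: bool
    consistent: bool
    sensitive: bool
    cost_effective: bool

def failed_criteria(candidate: Indicator) -> list:
    """Return the criteria this candidate fails, so gaps are explicit before fieldwork starts."""
    return [f.name for f in fields(candidate)
            if isinstance(getattr(candidate, f.name), bool) and not getattr(candidate, f.name)]

example = Indicator("Sighting frequency of a flagship species",
                    measurable=True, precise=False, consistent=True,
                    sensitive=True, cost_effective=True)
print(failed_criteria(example))  # -> ['precise'] (observers may define a 'sighting' differently)
```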

Indicators have limitations – the perfect indicator rarely exists! Information should be 'triangulated' where possible. To get more accurate results, choose:
- several different indicators for the same question,
- different sources of information, and
- different methods or tools.
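
Triangulation can be as simple as keeping the separate lines of evidence side by side and reporting whether they agree, rather than averaging them away. The sketch below is illustrative only; the sources, the ratings and the simple majority rule are hypothetical choices for the example.

```python
from collections import Counter

# Hypothetical lines of evidence for a single question ("Is the species declining?"),
# each from a different indicator, information source or method.
evidence = {
    "ranger patrol sightings (last 2 years)": "declining",
    "community interviews": "declining",
    "published survey (older data)": "stable",
}

counts = Counter(evidence.values())
verdict, support = counts.most_common(1)[0]

# Report the spread of evidence, not just the majority view: disagreement is itself a finding.
print(f"Majority view: {verdict} ({support}/{len(evidence)} lines of evidence)")
if len(counts) > 1:
    print("Sources disagree - flag for follow-up monitoring rather than drawing a firm conclusion.")
```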

Good execution

Forming the team – who should be involved?
- Different levels of management agency staff
- Indigenous people / traditional owners
- Community
- Other 'experts' – NGOs, scientists etc.
- External evaluators

- Prepare the way: gaining the approval, trust and cooperation of stakeholders is critical.
- Background research: do your homework well to gain credibility.
- Conduct workshops: take care to make sure all stakeholders have an opportunity to express their opinion (language, setting, cultural norms).
- Follow up on information gaps where possible. Check that information is correct. Establish additional monitoring where needed.

Good analysis

Go back to your concept diagrams – the better these are, the easier the interpretation of the results. Comparison over time and over space is very useful, BUT be careful about doing fancy analysis with subjective or dodgy data! Look at the links between context, input, processes, outputs and outcomes – the combination of all of these, and teasing out possible explanations, is most useful.

Data from range or biodiversity monitoring systems will never be able to test explicit a priori hypotheses, since replication and controls are not possible. Data analysis can only ever build a case for a particular interpretation of causal relationships (i.e. we rarely know all the variables, so don't jump to conclusions!).
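
In practice the analysis is often careful descriptive comparison across years or sites, with any statistics kept modest because the data are frequently subjective scores rather than replicated samples. A minimal sketch, assuming hypothetical annual effectiveness scores on a 0–4 scale; the site names, years and figures are invented.

```python
# Hypothetical management-effectiveness scores (0-4 scale) for two sites over four years.
scores = {
    "Site A": {2019: 1.8, 2020: 2.1, 2021: 2.4, 2022: 2.6},
    "Site B": {2019: 2.9, 2020: 2.8, 2021: 2.7, 2022: 2.9},
}

for site, by_year in scores.items():
    years = sorted(by_year)
    change = by_year[years[-1]] - by_year[years[0]]
    trend = "improving" if change > 0.2 else "declining" if change < -0.2 else "roughly stable"
    # Describe the direction of change; with subjective scores and no replication this
    # builds a case for an interpretation rather than testing a hypothesis.
    print(f"{site}: {by_year[years[0]]} -> {by_year[years[-1]]} ({trend})")
```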

Plenty of follow-up

Communicate well and often. Consider communication and the evaluation audiences early. The way that findings are reported must suit the intended audiences. Timeliness of reporting is critical to making it useful. Short-term benefits of evaluation should be demonstrated clearly wherever possible.

Evaluation findings must be used positively. Findings, wherever possible, should be positive, identifying challenges rather than blame. Advice from evaluation needs to be clear and specific – some short-term and some longer-term.

The evaluation process itself is a vital learning experience, which enhances and transforms management. Evaluation often has impacts on management well before a formal report is prepared.

Two key factors determine whether evaluation findings will 'make a difference':
- a high level of commitment to the evaluation by managers and owners of the protected areas; and
- adequate mechanisms, capacity and resources to address the findings and recommendations.