Evaluating management effectiveness – what have we learned?


1 Evaluating management effectiveness – what have we learned?

2 Some collected wisdom from a couple of hundred years of ‘learning experiences’ (@x mistakes/year???)

3

4

5 All the steps need to be right for a good result!

6 Effective communication and team-building; support from agency and stakeholders; good preparation and planning; right questions and indicators; good execution; good analysis; plenty of follow-up; consistent framework; sound methods

7 Effective communication and team-building: Involve stakeholders in all stages, from design to reporting. Keep good communication all through the project. Work as a team – avoid an ‘us and them’ evaluation, which is threatening and resented.

8 Effective communication and team-building | Support from agency and stakeholders: Evaluation should be part of an effective management cycle; a culture that encourages reflection, learning from mistakes and successes; built into business plans, legislation and reporting requirements.

9 Effective communication and team-building Support from agency and stakeholders Consistent framework

10 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Consistent framework

11 A clear purpose, scale, scope and objectives are needed. Time taken here saves a lot of wasted time later! Evaluations range from system-wide (or even across many countries) to very local.

12 It is critical that management goals and objectives for the protected area or project being evaluated have been spelt out clearly. It is important at the beginning to know exactly what you expect to achieve, and to understand what levels of resourcing and support you can expect. All parties need to agree on these expectations.

13 Think about…

14 We need to understand the links between the elements or criteria being evaluated so we can interpret the results of evaluation. The assumptions being made need to be spelled out clearly. For many evaluations, especially of specific interventions or projects, a concept model is a very useful tool!

15 [Concept model example: a reef fish-farm project]
CONTEXT: e.g. marine park with very high biodiversity values; poor local community dependent on marine resources.
PLANNING – Goal: to restore reef biodiversity and enhance community well-being. Objectives: to stop reef bombing; to establish an alternative income source for the local community.
INPUT: funds from donor organisation; expertise from external scientists; involvement of local community. (Assumptions: community supports project; donor funds will continue until project becomes self-sustaining.)
PROCESS: development of aquaculture program to provide alternative income sources. (Assumptions: community adopts program; environmental conditions remain relatively stable.)
OUTPUT: successful and self-sufficient fish farm established with no negative environmental effects. (Assumptions: people will not seek further illegal income if they have a basic income from the fish farm; community stewardship level is high; environmental conditions remain relatively stable; international laws protecting the reef can be enforced – no outside fishing vessels.)
OUTCOMES: income security established for community; no further reef bombing; conservation of reef system.
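The causal chain above can be written down as a simple data structure, which makes it easy to check that every stage has its assumptions spelled out before evaluation begins. A minimal sketch in Python – the stage names and content come from the fish-farm example; the structure and helper function are purely illustrative.

```python
# A concept model as an ordered chain of stages, each carrying its own
# assumptions (hypothetical structure; content from the reef example above).
concept_model = [
    {"stage": "INPUT",
     "description": "Donor funds, external expertise, community involvement",
     "assumptions": ["Community supports the project",
                     "Donor funds continue until self-sustaining"]},
    {"stage": "PROCESS",
     "description": "Aquaculture program providing alternative income",
     "assumptions": ["Community adopts the program",
                     "Environmental conditions remain stable"]},
    {"stage": "OUTPUT",
     "description": "Self-sufficient fish farm, no negative effects",
     "assumptions": ["Basic income removes incentive for illegal fishing",
                     "Reef protection laws can be enforced"]},
    {"stage": "OUTCOMES",
     "description": "Income security, no reef bombing, reef conserved",
     "assumptions": []},
]

def unstated_assumptions(model):
    """Return stages that list no explicit assumptions -
    a prompt to spell them out before evaluating."""
    return [s["stage"] for s in model if not s["assumptions"]]

print(unstated_assumptions(concept_model))  # ['OUTCOMES']
```

Listing the assumptions stage by stage mirrors the slide: if a stage comes back with none, that is a gap to fill before interpreting results.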

16 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Consistent framework Sound methods The methodology needs to suit the purpose

17 Methodologies should be compatible or ‘harmonised’ as much as possible. Design of the methodology needs to consider how the initial phase will relate to later phases of evaluation. Some flexibility is good – we can improve as we learn and as things change. We should learn from others and use or adapt existing methodologies if possible.

18 Tools need to be appropriate and responsive to needs… but generally: cost-effective; replicable; robust and statistically valid; simple; field-tested; documented; credible; good explanatory power; scalable; rapid.

19 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Right questions and indicators Consistent framework Sound methods Asking the right questions makes evaluation much easier!

20 Different layers of questions look at each dimension. Questions should proceed logically from very general to specific and measurable indicators. A good project or park plan makes this step much easier!

21 [From general and broad to specific and narrow]
Objective: to assess biodiversity conservation in the park.
Question: Is the biodiversity of the park being conserved or lost? Is the population of all endangered species in the park stable? (Assumption: endangered species are a good proxy for biodiversity generally.)
Indicator: a particular endangered species. (Assumption: this species is a good indicator for endangered species.)
Question: Is this species declining? Over the past two years, has this species been seen less frequently than in the past? Is there scientific or anecdotal evidence that the populations are declining? (Assumption: anecdotal evidence and past research will give results of sufficient accuracy.)
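The chain above, from broad objective to narrow indicator, can also be sketched as a nested structure so that the assumptions linking each layer are kept alongside the questions they support. A hypothetical sketch in Python; the field names are assumptions of the sketch, the content is from the slide.

```python
# The general-to-specific question chain as a nested structure
# (illustrative only; every key name here is hypothetical).
evaluation_chain = {
    "objective": "Assess biodiversity conservation in the park",
    "assumption": "Endangered species are a good proxy for biodiversity",
    "question": {
        "text": "Is the population of all endangered species stable?",
        "assumption": "This species is a good indicator for endangered species",
        "indicator": {
            "name": "Sighting frequency of a particular endangered species",
            "measures": ["Sightings over the past two years vs. earlier",
                         "Scientific or anecdotal evidence of decline"],
            "assumption": "Anecdotal evidence and past research are "
                          "sufficiently accurate",
        },
    },
}

def chain_of_assumptions(node):
    """Walk from the broad objective down to the narrow indicator,
    collecting every assumption the interpretation relies on."""
    found = []
    if isinstance(node, dict):
        if "assumption" in node:
            found.append(node["assumption"])
        for value in node.values():
            found.extend(chain_of_assumptions(value))
    return found

print(len(chain_of_assumptions(evaluation_chain)))  # 3
```

Reading the collected assumptions back as a list makes the point of the slide concrete: the indicator only answers the objective if every assumption in the chain holds.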

22 Choose good indicators. Indicators should have some explanatory power, or be able to link with other indicators to explain causes and effects. Some characteristics of good indicators are…

23 Measurable: able to be recorded and analysed in qualitative or quantitative terms; Precise: defined in the same way by all people; Consistent: not changing over time, so that it always measures the same thing; and Sensitive: changing proportionately in response to actual changes in the condition or item being measured. (Margoluis and Salafsky 1998, p. 88)

24 biologically relevant (reflect target health); socially relevant (recognized by stakeholders); sensitive to human-caused stress (reflect threats); anticipatory (early warning); measurable; and cost-effective (max. information/unit effort) TNC 2002.

25 Indicators have limitations! The perfect indicator rarely exists, so information should be ‘triangulated’ where possible. To get more accurate results, choose several different indicators for the same question, different sources of information, and different methods or tools.
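Triangulation amounts to asking whether independent lines of evidence agree. A toy sketch, assuming each source reports a simple verdict such as ‘declining’ or ‘stable’; the function, data, and source names are all hypothetical illustrations, not part of any published methodology.

```python
from collections import Counter

def triangulate(readings):
    """Combine independent indicator readings for one question.
    Returns the majority verdict and a rough agreement score (0-1).
    Hypothetical helper for illustration only."""
    counts = Counter(reading for _, reading in readings)
    verdict, n = counts.most_common(1)[0]
    return verdict, n / len(readings)

# Three lines of evidence on "Is this species declining?"
readings = [
    ("field survey",         "declining"),
    ("ranger observations",  "declining"),
    ("community interviews", "stable"),
]
verdict, agreement = triangulate(readings)
print(verdict, round(agreement, 2))  # declining 0.67
```

A low agreement score is itself informative: it flags questions where the indicators, sources, or methods disagree and the interpretation should be treated cautiously.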

26 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Right questions and indicators Good execution Consistent framework Sound methods

27 Forming the team – who should be involved? Different levels of management agency staff; Indigenous people/traditional owners; community; other ‘experts’ – NGOs, scientists etc.; external evaluators.

28 Prepare the way: gaining the approval, trust and cooperation of stakeholders is critical. Background research: do your homework well to gain credibility. Conduct workshops: take care to make sure all stakeholders have an opportunity to express their opinion (language, setting, cultural norms). Follow up on information gaps where possible. Check information is correct. Establish additional monitoring where needed.

29 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Right questions and indicators Good execution Good analysis Consistent framework Sound methods

30 Go back to your concept diagrams – the better these are, the easier the interpretation of the results. Comparison over time and over space is very useful, BUT be careful about doing fancy analysis with subjective or dodgy data! Look at links between context, input, processes, outputs and outcomes – the combination of all of these, and teasing out possible explanations, is most useful.

31 Data from range or biodiversity monitoring systems will never be able to test explicit a priori hypotheses, since replication and controls are not possible. Data analysis can only ever build a case for a particular interpretation of causal relationships (i.e. we rarely know all the variables, so don’t jump to conclusions!).

32 Effective communication and team-building Support from agency and stakeholders Good preparation and planning Right questions and indicators Good execution Good analysis Plenty of follow-up Consistent framework Sound methods

33 Communicate well and often. Consider communication and the evaluation audiences early. The way that findings are reported must suit the intended audiences. Timeliness of reporting is critical to making it useful. Short-term benefits of evaluation should be demonstrated clearly wherever possible.

34 Evaluation findings must be used positively. Findings, wherever possible, should be positive, identifying challenges rather than blame. Advice from evaluation needs to be clear and specific – some short-term and some longer-term.

35

36 The evaluation process itself is a vital learning experience, which enhances and transforms management. Evaluation often has impacts on management well before a formal report is prepared.

37 Two key factors that determine whether evaluation findings will ‘make a difference’ are: a high level of commitment to the evaluation by managers and owners of the protected areas; and adequate mechanisms, capacity and resources to address the findings and recommendations.

38

