Presentation on theme: "RESULTS BASED MANAGEMENT"— Presentation transcript:

Managing for results. UNEP Brown Bag Lunch, September 2006. Kabell Konsulting ApS

2 Agenda Presentation: What is RBM? The Evolution of RBM; Key Concepts; Results Reporting; Behavioural Issues. Dialogue: RBM in UNEP; Feedback

3 Introduction Objectives: To provide input to UNEP's discussions on RBM;
To share some international lessons from using RBM; To discuss where strengthening is needed in UNEP

4 What is RBM? Definition:
“RBM is a management strategy focusing on performance and achievement of outputs, outcomes and impacts” (OECD DAC 2002). It facilitates systematic thinking about three basic questions: What is our goal: “Are we doing the right thing?” How will we reach that goal: “Are we doing it right?” How do we know whether we have achieved our goal and that we are doing it right: “How do we know?” But what exactly is Results-Based Management? You can read the precise DAC definition for yourselves. Broadly speaking, Results-Based Management describes a method for systematically assessing: Are we doing the right thing? Are we doing it the right way? And how do we know whether we are doing the right thing in the right way (the M&E systems)? Synonymous with Performance Management or Outcome-Based Management.

5 What is RBM? RBM is not: “A management strategy focusing on inputs and activities”. It may include, but should not be limited to, the following elements: Performance Measurement; Performance Reporting; Performance-Based Budgeting. It is important to be clear that Results-Based Management as a whole is a comprehensive approach to management that involves more than measuring and reporting results. These elements are necessary, but it is a serious mistake to believe that implementing them in isolation amounts to RBM.

6 The Evolution of RBM In the 1980s: compliance and control.
In the 1990s: project-based, with the introduction of logframes; evaluations focus on inputs and results and are done for accountability. After 2000: a more systemic approach with results frameworks; focus on programmes rather than projects, and on outcomes and impact; evaluation for learning, often jointly with partners.

7 The Evolution of RBM Focus on performance, not compliance: “Did projects spend their budget and comply with rules and procedures?” versus “Did projects achieve their objectives and deliver results?” Focus on outcomes and impact, not inputs and activities (and outputs): “How many trees were planted and people trained?” versus “What was the development result, e.g. reduction in poverty and improved living conditions?” Broadly speaking, the last 20 years have seen a paradigm shift in the public sector from a focus on compliance to a focus on results, and from a focus on inputs and outputs to outcomes and impact. In reality, however, many organisations today still focus on their outputs. We can return to that later.

8 The Evolution of RBM Focus on management, not administration:
Client-centred and citizen-focused; emphasis on outputs rather than inputs; reduction in ex-ante controls; extensive use of outsourcing, competition and private service providers. Examples: the Next Steps Programme in the United Kingdom; Reinventing Government in the United States; the Australian and New Zealand PEM reforms; Alternative Service Delivery in Canada.

9 The Evolution of RBM RBM has gained a considerable following because:
It improves transparency; it provides the basis for stronger external and internal accountability; it rewards achievement of results; it encourages evidence-based policymaking and management; and, as a result, it improves the quality of policy and decision making. RBM is expected to contribute to greater transparency. The ambition of basing decisions on documented experience is familiar, particularly from the education sector. Depending on where one stands in that debate, one can at times question the validity of the evidence gathered, and hence of the decisions taken on the basis of it.

10 RBM in Development Cooperation
2000 MDGs; 2002 Monterrey; 2003 Rome; 2004 Marrakesh; 2005 Paris HLF-2. Mutual responsibility, coherence, partnership, harmonization, alignment, managing for shared results. As in the public sector in general, in the development field we find the same preoccupations and principles, but with one additional element: partnership. 2000: better coherence through the MDGs (results and indicators). 2002: partnership; good governance in developing countries, resources and efficiency in the management of aid, transaction costs; also the First International Roundtable on Better Measuring, Monitoring, and Managing for Development Results. Rome: harmonization, action plans, a new structure put in place (“the global architecture”). Marrakesh: managing for results. Paris 2005: the second High-Level Forum on Harmonization and Alignment for Aid Effectiveness.

11 Lessons Indicators and evaluations should go beyond compliance and accountability. M&E should help build a bridge between individual projects and sector-wide or global programmes. Indicators should measure contribution to the MDGs and intermediary outcomes. Evaluation of partners' performance should be based on results achieved towards joint objectives.

12 Key Concepts Glossary of Key Terms in Evaluation & Results Based Management

13 Key Concepts The Results Chain: Inputs, Activities, Outputs, Outcome, Impact; Indicators; Assumptions and Risks.
Consider the limitations of the logframe approach: it assumes that the necessary resources will be provided; the assumed cause-effect relations may not evolve as expected; absorbing inputs from other actors; it is static; capturing unintended consequences. To avoid misunderstandings about the use of the terms in the results chain, the OECD DAC produced a small glossary in 2002 with definitions of the key terms. Taking those definitions as the starting point, I will briefly go through the terms in the results chain.

14 Key Concepts Input: the financial, human and material resources used for the development intervention. Activities: the coordination, technical assistance or training tasks delivered. Output: the products, capital goods and services which result from a development intervention; may also include changes resulting from the intervention which are relevant to the achievement of outcomes.

15 Key Concepts Outcome (a consequence of outputs): the likely or achieved short-term and medium-term effects of an intervention's outputs. Impact: positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended. DAC and UNFPA: measurable change in behaviour, attitudes or socio-cultural values of groups; measurable change in legal, institutional and societal practices.
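The chain of definitions above (inputs, activities, outputs, outcome, impact) can be sketched as a simple data structure. This is a hypothetical Python illustration of the DAC terminology only; the class name and example values are invented, not part of any official UNEP or UNDP system:

```python
from dataclasses import dataclass

# Hypothetical sketch of the DAC results chain:
# inputs -> activities -> outputs -> outcome -> impact.
# All names and example values are illustrative.
@dataclass
class ResultsChain:
    inputs: list       # financial, human and material resources
    activities: list   # coordination, technical assistance or training tasks
    outputs: list      # products, capital goods and services delivered
    outcome: str       # short- or medium-term effect of the outputs
    impact: str        # long-term effects, intended or unintended

chain = ResultsChain(
    inputs=["grant funding", "three technical advisers"],
    activities=["train water-board staff in monitoring"],
    outputs=["120 staff trained", "monitoring manual published"],
    outcome="water boards apply the new monitoring procedures",
    impact="improved access to safe drinking water",
)
print(chain.outcome)
```

Note how each level answers a different question: outputs are what the intervention delivers, while outcome and impact describe changes the intervention can only contribute to.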

16 Key Concepts Result: the output, outcome or impact (intended or unintended, positive and/or negative) of a development intervention. Results may: appear within a short time or take years to be fully realized; be planned or unforeseen; be either positive or negative; be reflected at the level of individuals, groups, institutions or society. A result is a describable or measurable change resulting from a cause-and-effect relationship: a change that can be observed, described and measured in some way, and for which the cause can be identified.

17 Key Concepts Indicator: a quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement, to reflect the changes connected to an intervention, or to help assess the performance of a development actor. Quality test of indicators: S Specific, M Measurable, A Available, R Relevant, T Trackable. It is imperative to include stakeholders in this process: those who are going to collect the data and those who are going to use the data.
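The SMART quality test above is a checklist of stakeholder judgements, not a mechanical test. A minimal sketch of how such a checklist could be recorded, with an invented example indicator (everything here is hypothetical):

```python
# Hypothetical SMART checklist. Each criterion is a yes/no judgement
# made by stakeholders about a proposed indicator; the function simply
# reports which criteria the indicator fails.
SMART = ("specific", "measurable", "available", "relevant", "trackable")

def smart_check(judgements: dict) -> list:
    """Return the SMART criteria the indicator does not meet."""
    return [c for c in SMART if not judgements.get(c, False)]

# Invented example: a water-access indicator judged by stakeholders.
indicator = "share of households with access to safe drinking water"
judgements = {"specific": True, "measurable": True, "available": False,
              "relevant": True, "trackable": True}
print(smart_check(judgements))  # ['available']
```

An "available" failure like this one is common in practice: the indicator is well defined, but no data source exists yet, which is exactly why data collectors and data users belong in the process.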

18 Key Concepts Identify indicators for output, outcome and process:
Specify exactly what is to be measured; use a mix of qualitative and quantitative indicators; use project rating systems where aggregation is needed; consider proxy indicators where direct indicators are not available; make sure to have baselines and target values. Remember: “An imprecise indicator of an important result is better than a precise indicator of an insignificant result.” Provide an example of a project rating system. Qualitative indicators: people's judgements and perceptions about a subject, such as the confidence people have in sewing machines as instruments of financial independence. Quantitative indicators: objective or independently verifiable numbers or ratios, measuring quantity, such as the number of people who own sewing machines in a village. Perhaps there is too much focus on indicators: measuring an indicator only provides an indication of development, not evidence; there is a danger of goal displacement (managing for indicators); an indicator is normatively defined. Limitations: not everything is measurable, and indicators do not capture unintended effects.
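The point about baselines and target values can be made concrete: with both in place, progress on a quantitative indicator is simply the share of the baseline-to-target distance covered so far. A hypothetical sketch with invented figures:

```python
def progress(baseline: float, target: float, current: float) -> float:
    """Share of the baseline-to-target distance covered so far.

    Returns 0.0 at the baseline, 1.0 when the target is reached;
    values above 1.0 mean the target has been exceeded.
    """
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Illustrative figures only: 40% of households had safe water at
# baseline, the target is 70%, and monitoring now reports 52%.
print(round(progress(baseline=40, target=70, current=52), 2))  # 0.4
```

Without the baseline, the same monitoring figure of 52% would be uninterpretable, which is why the slide insists on setting baselines and targets before collecting performance data.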

19 Example Indicators in UNDP Strategic Framework
Do they say something significant? Is there a mix of qualitative and quantitative indicators? Can they be aggregated? Are there baselines or target values? Is there use of process indicators? Can you actually find data to support them? Also: no theory of change (how we move from input to impact) and no defined outcome.

20 Elements of an RBM System
A Results-Based Management System: situation and problem analysis; formulating objectives, intended results and strategies; identifying indicators; setting baselines and targets; collecting data on performance; analysing and reporting performance; integrating monitoring and evaluation findings; using performance information for future planning and decision making. These elements span Strategic Planning, Performance Measurement and Performance Management. Performance measurement = monitoring.

21 Example of a results measurement system
RBM in UNDP: 5 goals, 30 service lines, 6 cross-cutting drivers. Reporting on: the rate of achievement of annual targets towards expected outcomes; the extent to which UNDP promotes drivers of development effectiveness. Difficulties identified in a review by Dalberg: the reports do not demonstrate the actual achievements targeted; indicators are not currently standardized across offices, so there is variability in measurability and in the ability to attribute outputs and outcomes; the distinction between outputs and outcomes is in some cases not clear. Goals: achieving the MDGs and reducing human poverty; fostering democratic governance; managing energy and environment for sustainable development[1]; supporting crisis prevention and recovery; and responding to HIV/AIDS. [1] Service lines for this strategic goal are: 1) frameworks and strategies for sustainable development; 2) effective water governance; 3) access to sustainable energy services; 4) sustainable land management to combat desertification and land degradation; 5) conservation and sustainable use of biodiversity; 6) national/sectoral policy and planning to control emissions of ozone-depleting substances and persistent organic pollutants. The six drivers are: (i) developing national capacities; (ii) enhancing national ownership; (iii) advocating and fostering an enabling policy environment; (iv) seeking South-South solutions; (v) promoting gender equality; and (vi) forging partnerships for results. (UNDP/DP/2005/16, 17)

22 Results Reporting Having determined expected results, identified indicators, and set baselines and targets as described, it is time to: collect data on performance; analyse performance; report on performance, for external and internal use. Three important issues to consider: the purpose of results reporting; aggregation; attribution/contribution. Purpose: internal, external or both? Budgeting purposes? Accountability or learning?

23 Results Reporting Aggregating results: output or outcome? Project/programme outcome: it is easier to establish the link between outputs and project outcomes than between outputs and national development changes, and more relevant for internal learning purposes, but difficult to aggregate due to diversity in outcomes. One possibility is to establish standard outcomes for common programme approaches (groupings of similar projects); establish a rating system measuring success in meeting outcomes (consider the issue of validity/reliability); and establish a solid validation process.
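The rating-system idea above works because ratings on a common scale can be counted even when the underlying outcomes are diverse. A minimal sketch, with an invented four-point scale (no agency's actual scale is implied):

```python
from collections import Counter

# Hypothetical 4-point scale for outcome achievement, self-assessed
# at project level and then aggregated across a programme grouping.
SCALE = ["not achieved", "partially achieved", "achieved", "exceeded"]

def aggregate(ratings: list) -> dict:
    """Count projects per rating so diverse outcomes can be summarised."""
    counts = Counter(ratings)
    return {r: counts.get(r, 0) for r in SCALE}

# Invented portfolio of four project self-assessments.
ratings = ["achieved", "partially achieved", "achieved", "not achieved"]
print(aggregate(ratings))
```

The summary is only as trustworthy as the self-assessments behind it, which is why the slide pairs the rating system with a validation process (e.g. review of assessments and field visits).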

24 Results Reporting Attribution vs. Contribution
The move from projects and outputs to programmes and outcomes/higher-order goals makes it difficult to attribute specific outcomes to individual donors. In Denmark, even the National Audit Office has accepted this in terms of performance measurement. However, in-depth evaluations are still sometimes carried out with the specific objective of demonstrating attribution (counterfactual methodology).

25 Results Reporting Lessons learned with regard to results reporting:
Qualitative assessments/statements are more interesting for internal learning than quantitative assessments. Ratings provide a good indication of the status of projects but should be supplemented with qualitative assessment and validation. Supplement performance measurement (monitoring) with evaluations. Establish a solid validation process. Examples of performance monitoring systems at agency level: ratings are self-assessments at project/programme level and need to be supported by some sort of validation process, e.g. review of assessments and field visits (field-based auditing).

26 Behavioural Issues Different approaches to RBM are likely to affect staff behaviour quite differently. Managing By Results: performance measurement is used for accountability/reporting to internal and external stakeholders; sanctions if targets are not met (which pushes staff to focus on outputs); fixed targets; resource allocation. Managing For Results: performance measurement is used with the aim of achieving better results; stretch targets and assessments for learning, used to modify targets. Consider both external and internal demands and needs, for accountability and for learning, when establishing a performance measurement system.

27 Behavioural Issues Experiences from implementing RBM:
“What gets measured gets done.” “An imprecise indicator of a significant result is better than a precise indicator of an insignificant result.” “You become what you measure.” Organizations typically focus more attention on the methodological and technical aspects of results-oriented M&E systems than on changing mental models (World Bank 2004). It is pivotal to create an organisational culture conducive to learning and innovation and to establish incentives that promote managing for results (World Bank 2004). A strong emphasis on formal systems of tight specification and measurement fails to recognize that, for complex activities (..), a highly formalised approach to management has severe limitations (OECD 2002). The shift to a results-based approach has been largely characterized by a focus on the budgeting and programming aspects of management, without realizing or emphasizing at the outset the scope of changes required in other areas of management for effective implementation of an RBM system (JIU 2004).

28 RBM in UNEP? Enablers and constraints for introducing RBM.
Conducive conditions: the organisation has products, and those products are simple; products are uniform and isolated, and production is autonomous; the organisation is product-oriented; causalities are known; quality is definable in performance indicators; the environment is stable. Problematic conditions: the organisation has obligations and is highly value-oriented; products are multiple and generated together with others; the organisation is process-oriented; products are interwoven and causalities are unknown; quality is not definable in performance indicators; the environment is dynamic. Follow up on the exercise.

29 Self-assessment in UNEP
Conditions? Experiences? Incentives? Capacity? Plans?

30 What next?

