
1 National Evaluation and Results Management System – Sinergia – Two decades of lessons and experiences. Directorate of Monitoring and Evaluation of Public Policy, November 2013

2 The Sinergia Model (our model): evidence for the decision-making process. Components:
- Monitoring
- Evaluation
- Territorial
- Accountability
- Innovation and research

3 THE VALUE CHAIN: OUR CONCEPTUAL BASIS
- Value-chain links: goals, inputs, processes, outputs, outcomes, impacts.
- Sinergia's model is based on the value chain and is oriented to identifying bottlenecks in each link of the public policy process.
- Our portfolio includes different types of evaluations (processes, institutional, outcomes, impacts, executive) in order to respond to bottlenecks identified in each link of the value chain, as sketched below.
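The sketch below is illustrative only, not Sinergia's actual tooling: it pairs value-chain links with the evaluation types named on this slide, so that a diagnosed bottleneck suggests which kind of evaluation to commission. The specific link-to-type pairing is a hypothetical assumption made for the example.

```python
# Hypothetical pairing of value-chain links with evaluation types (illustrative only).
VALUE_CHAIN = ["goals", "inputs", "processes", "outputs", "outcomes", "impacts"]

EVALUATION_FOR_LINK = {
    "inputs": "Institutional evaluation",
    "processes": "Process evaluation",
    "outputs": "Executive (rapid) evaluation",
    "outcomes": "Outcome evaluation",
    "impacts": "Impact evaluation",
}

def suggest_evaluation(bottleneck_link: str) -> str:
    """Return an evaluation type for a bottleneck diagnosed in a given link."""
    if bottleneck_link not in VALUE_CHAIN:
        raise ValueError(f"unknown value-chain link: {bottleneck_link}")
    # Default to an executive (rapid) evaluation when no specific type applies.
    return EVALUATION_FOR_LINK.get(bottleneck_link, "Executive (rapid) evaluation")

print(suggest_evaluation("outcomes"))  # -> Outcome evaluation
```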

4 The evaluation process
- Our process brings about transparency and consistency. In order to be effective, evaluations need to: be the result of a standardized process; include the participation of all stakeholders; answer decision makers' questions; and be in line with the government agenda.
- Evaluation schedule: design, 3 months; procurement, 3 months; development, 8 months; use of results, 6 months; total, 20 months (see the sketch below).
- Process stages: selection of policies to be evaluated, evaluation design, procurement, evaluation development, and implementation of results, within the corresponding government area.
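A minimal sketch of the schedule above: it only sums the phase durations from the slide and prints the month range each phase occupies; the code itself is purely illustrative and not part of Sinergia's process.

```python
# Phase names and durations taken from the slide; totals are computed, not assumed.
phases = [
    ("Design", 3),
    ("Procurement", 3),
    ("Development", 8),
    ("Use of results", 6),
]

elapsed = 0
for name, months in phases:
    start = elapsed + 1
    elapsed += months
    print(f"{name}: months {start}-{elapsed}")

print(f"Total: {elapsed} months")  # 3 + 3 + 8 + 6 = 20
```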

5 The system's evolution. Through these years there have been changes and lessons learned:
- We are working in different sectors, going beyond the social inclusion area.
- We have a wide portfolio of evaluations.
- We have published methodological guidelines for evaluation.
- Our evaluations are public on the internet.
- Our process is part of the NDP quality management system.
[Charts: types of evaluation by year of implementation; evaluations by sector]

6 We still face new challenges. 1. Spread of the evaluation culture:
- A high-level champion is necessary, one who is aware of the importance of doing evaluations and has the capacity to disseminate their attributes at the executive level.
- An adequate legal framework is required, but first it is important to know: what should its scope be, and what should it regulate?
- It is important to develop the evaluation culture across different levels of government, as well as to improve knowledge of M&E concepts.
- It is vital to involve citizens in the evaluation process, so they can use it for social control.

7 We still face new challenges. 2. Use of evaluations:
Externally, for decision-making processes:
- Evaluated entities should be more committed to using evaluation results and to the agenda setting.
- Each evaluation must have a plan for transferring and implementing recommendations, designed jointly by Sinergia, the evaluator, and the evaluated entity.
- The databases should be public and simple to search.
- A monitoring scheme is needed for the implementation of evaluation results.
Internally, for more influence:
- Replicate evaluations in order to contrast results and evaluate evaluators.
- Improve the quality of evaluations through meta-evaluation.
- Carry out systematic reviews in order to define new lines of action based on evaluations already done.

8 We still face new challenges. 3. Quality of the evaluations:
- High-level advisory: improve the evaluation process with the support of a technical expert during the design, implementation, and use of results. In house or peer reviewer?
- Working with universities: universities should play a critical role by replicating evaluations in order to contrast and compare results. It would also be important to exchange knowledge and experiences.
- Regular training: of the evaluation team, in order to implement new methodologies and improve the quality of the existing ones.
- It's not just about quantity: it does not matter if we have a limited number of evaluations being done at the same time; what matters is an adequate number that guarantees quality and rigor.

9 We still face new challenges. 4. Improving the evaluators' market:
- Dialogue with consulting firms in order to improve the procurement process.
- Training for better presentation of proposals.
- Prioritisation of technical quality when scoring proposals.
- Improved strategies for costing evaluations.
- Promotion of the development of small consulting firms.

10 Proposed questions

11 1. Is it possible to observe any change in the quality of evaluations? How to ensure program evaluators are impartial and consistent? How to evaluate evaluators?
- Each evaluation has a Monitoring Committee in order to guarantee impartiality and consistency.
- Evaluation managers must be technically strong to guarantee the accuracy of the evaluator.
- NDP technical offices and representatives of the evaluated entity should act as a quality filter for the evaluation.
- The evaluations should also be evaluated through: replication of evaluations by universities; meta-evaluation (contrasting evaluation results); and peer reviewers.

12 2. Is there enough transparency in evaluations? (How are evaluations assigned? Are evaluations public?)
- External procurement guarantees impartiality.
- Evaluators are chosen by scoring against specific criteria.
- The consistency of the evaluation is guaranteed by its technical design.
- Evaluations are public on the official website: sinergia.dnp.gov.co
[Process diagram, as on slide 4: selection of policies to be evaluated, procurement, evaluation development, implementation of results, government area]

13 3. Is results-based budgeting still on the table, or are the M&E systems limited to recommending actions for improvement?
- Background: the National Development Plan 2010-2014 was not designed upon a strict relation between goals and budget. In Sinergia we are working on it:
- 2013: work on the design of methodological guidelines to achieve a relation between the planning and budgeting processes.
- 2014: i) a strategic formulation methodology for monitoring the National Development Plan; ii) guidelines for the design and scalability process of the National Development Plan.
- Based on the monitoring process, determine the inputs required to accomplish the planned outcomes of a public intervention (costing of inputs); a hypothetical sketch of this idea follows after this slide.
- Identify and link the planned outcomes of public interventions with their outputs. In this way, attain coordination between the design of public programs and budgeting.
- Evaluations allow the validation of causal relations between the links of the value chain.
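The following is a hypothetical sketch, not Sinergia's actual costing methodology: it links a planned outcome to the outputs expected to produce it and to an assumed unit cost per output, so that program design and the implied budget can be read off the same structure. All names and numbers are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Output:
    name: str
    unit_cost: float      # assumed cost of delivering one unit of this output
    planned_units: int    # units needed to reach the planned outcome

    def budget(self) -> float:
        return self.unit_cost * self.planned_units

@dataclass
class PlannedOutcome:
    name: str
    outputs: list[Output] = field(default_factory=list)

    def required_budget(self) -> float:
        """Costing of inputs: total budget implied by the planned outcome."""
        return sum(o.budget() for o in self.outputs)

# Illustrative numbers only.
outcome = PlannedOutcome(
    "Reduce rural dropout rate",
    [Output("School meals delivered", unit_cost=1.5, planned_units=200_000),
     Output("Teachers trained", unit_cost=300.0, planned_units=500)],
)
print(f"Required budget (USD): {outcome.required_budget():,.0f}")
```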

14 3. Is results-based budgeting still on the table, or are the M&E systems limited to recommending actions for improvement?
[Logic-model diagram: the socio-economic situation and needs inform objectives; objectives drive inputs, activities, outputs, outcomes, and impacts, subject to external factors. Performance dimensions indicated: economy of expenditures, efficiency, productivity, efficacy, effectiveness, and cost-effectiveness.]
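As a hedged reading of the ratios named in the diagram, the definitions below follow common usage in results-based management; the exact formulations on the original slide are not recoverable from the transcript and may differ.

```latex
% Common textbook readings of the performance ratios (assumed, not from the slide).
\begin{align*}
  \text{Economy}            &\approx \frac{\text{planned cost of inputs}}{\text{actual cost of inputs}} \\
  \text{Efficiency / productivity} &\approx \frac{\text{outputs}}{\text{inputs}} \\
  \text{Efficacy}           &\approx \frac{\text{achieved outcomes}}{\text{planned outcomes (objectives)}} \\
  \text{Cost-effectiveness} &\approx \frac{\text{outcomes or impacts}}{\text{expenditures}}
\end{align*}
```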

15 3. Is results-based budgeting still on the table, or are the M&E systems limited to recommending actions for improvement?
- Operative level: use of information on the actions needed to implement public interventions. This allows good practices to be developed during the productive process.
- Management level: use of information about the operation and partial results of public interventions. This allows the implementation of public policies to be designed or re-designed, budgeting decisions to be made, and population groups to be prioritized.
- Political level: use of information on the delivery of goods and services and the generation of strategic results. This allows budgeting decisions to be made, the continuity of public interventions to be approved or disapproved, and the adoption of recommendations resulting from evaluations to be influenced.
[Diagram: productive process (objectives, inputs/costs, activities, outputs/costing) and results chain (immediate, intermediate, and final results) linked to entity, sectorial, and national objectives; executive entities at the national and subnational levels, coordinating entities at the national, regional, and subnational levels.]

16 4. How to use evaluation results for the decision-making process? / How to promote the use of evaluations?
Externally, for decision-making processes:
- Evaluated entities should be more committed to using evaluation results and to the agenda setting.
- Each evaluation must have a plan for transferring and implementing recommendations, designed jointly by Sinergia, the evaluator, and the evaluated entity.
- The databases should be public and simple to search.
- A monitoring scheme is needed for the implementation of evaluation results.
Internally, for more influence:
- Replicate evaluations in order to contrast results and evaluate evaluators.
- Improve the quality of evaluations through meta-evaluation.
- Carry out systematic reviews in order to define new lines of action based on evaluations already done.

17 5. Is there a positive cost-benefit ratio in doing evaluations?
[Chart: financial resources invested in evaluations, 2010-2012, numbers in USD]
Use of evaluations:
- To design public policy (CONPES).
- To improve existing interventions.
- To improve procurement processes.
It would be worthwhile to quantify the benefits of evaluations for the public sector (see the sketch below).
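As an illustration of what such a quantification could look like (not taken from the slide), a standard benefit-cost ratio discounts the benefits attributed to evaluations, such as savings from redesigned or discontinued interventions, against the cost of commissioning them.

```latex
% Standard benefit-cost ratio; B_t and C_t are benefits and costs in year t,
% r is the discount rate, T the horizon (illustrative, not from the slide).
\[
  BCR = \frac{\sum_{t=0}^{T} B_t / (1+r)^t}{\sum_{t=0}^{T} C_t / (1+r)^t},
  \qquad BCR > 1 \;\Rightarrow\; \text{benefits exceed costs.}
\]
```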

18 Thank you www.dnp.gov.co www.sinergia.dnp.gov.co/portaldnp/ @Sinergia_DNP PBX: 3815000

