Evaluating and monitoring complex systems. Pier Francesco Moretti, JPI to Co-Work, 19 December 2012, Warsaw.

Alice in Wonderland: "if you do not know where you are going, how do you know when you get there…" A traveler needs both a map and a compass. But…

Evaluation (why): justification of spending public funds / accountability; learning process / policy advice / adaptation.
Evaluation (what): process (tactical), results (operational), impacts (strategic).
Evaluation (how): correlation / causality; quantitative (what is happening) and qualitative (why it is happening); parameters and indicators.
Monitoring: data collection, relevance, responsibility; parameters and indicators.
…and all of this in complex systems.

Complex systems: understanding the process is crucial! Dynamic complex systems are inherently chaotic and unstable, but they usually evolve into one of a number of possible steady states. These steady states are called "attractor basins". It is by causing a dynamic complex system to switch between attractor basins that control can be exercised.
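A minimal sketch of this idea, not taken from the slides: a toy bistable system with two attractor basins, where a temporary intervention (the input u, an illustrative assumption) pushes the state across the basin boundary, after which the system settles into the other steady state on its own. The model and all parameters are chosen purely for illustration.

```python
# Toy bistable system dx/dt = x - x**3 + u: two attractor basins with steady
# states near x = -1 and x = +1. A brief external pulse in u can switch the
# system from one basin to the other; afterwards it stays there by itself.
def simulate(x0, u_pulse=0.0, t_pulse=(2.0, 4.0), dt=0.01, t_end=20.0):
    """Integrate dx/dt = x - x**3 + u with a simple forward-Euler step."""
    x, t = x0, 0.0
    while t < t_end:
        u = u_pulse if t_pulse[0] <= t < t_pulse[1] else 0.0
        x += dt * (x - x**3 + u)
        t += dt
    return x

print(simulate(x0=-0.9))               # no intervention: settles near x = -1
print(simulate(x0=-0.9, u_pulse=1.5))  # temporary push: switches to the basin at x = +1
```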

From 1 to which of the 2s? [slide diagram: an initial state 1 evolving towards several possible future states 2a, 2b, 2c]
JPIs address societal challenges and deal with complex systems. According to the OECD, social indicators cannot be used to evaluate the effectiveness of a particular social programme. JPIs are not, strictly speaking, social programmes… But how much can actions influence or control the evolution of the system? Contributions can come from different stakeholders!
Evaluate to understand, but identify what you control: how strongly is the action you adopt correlated with the result you expect? What would have happened if you had not adopted that action?
Any system is linear at short timescales… but how short? Warning: the evolution timescales of the system can be shorter than those of the actions! Are we driving the process?
Evaluation is closely linked to foresight, ex-ante assessment and governance!
Impact can be defined as the change in a variable A attributed to an input B (ΔA/ΔB): should we monitor its continuity? Is it better to focus on structural interventions (monotonic trends rather than targets)? Is it only funds that have an impact, and how much?
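The ΔA/ΔB definition and the counterfactual question ("what would have happened without the action?") can be made concrete with a small sketch. This is an illustration, not the method used by any JPI: the function name and all figures are hypothetical, and the counterfactual is approximated by a comparable "no action" case, in the style of a difference-in-differences comparison.

```python
# Impact as the change in outcome A attributable to an input B, per unit of B,
# with the "would have happened anyway" part removed via a counterfactual case.
def impact_per_unit_input(a_before, a_after, a_cf_before, a_cf_after, delta_b):
    """Return (change in A attributable to the action) / ΔB.

    a_before / a_after:       outcome where the action was taken
    a_cf_before / a_cf_after: outcome in a comparable 'no action' case (counterfactual)
    delta_b:                  size of the input B (e.g. funds committed)
    """
    observed_change = a_after - a_before
    counterfactual_change = a_cf_after - a_cf_before   # what would have happened anyway
    attributable_change = observed_change - counterfactual_change
    return attributable_change / delta_b

# Invented example: outcome rises from 100 to 130 with the action, from 100 to 110
# without it, for an input of 5 units -> attributable change of 20, i.e. 4 per unit.
print(impact_per_unit_input(100, 130, 100, 110, delta_b=5))
```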

Indicators linked to the objectives' hierarchy, with their timescales: project monitoring, process evaluation, impact assessment.
Set of process/impact indicators: context, inputs (including expenditure), outputs; leading, coincident, lagging (timescales).
But what constitutes success? If targets for indicators are not met, does that imply a lack of success, or can it depend on missing data collection or identification? Be careful in using, or working towards, numerical metrics: they can distract from the objective!
Criteria for process/impact indicators, from the JPND / ERAC-GPC voluntary guidelines: relevance, effectiveness, efficiency, utility, sustainability; moreover: simple to operate, cost-effective, credible, flexible… From the Evaluation Journal: validity, relevance, appropriateness, robustness, manageability.
Evaluating: why, what, how. Always together…
Learning process… statistics to evaluate the "best action": is it possible, and where? Is there a possibility to compare realisations/experiments?
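As a purely illustrative sketch (the field names and example indicators are mine, not from the slides), a small data structure can keep each indicator tied to its level in the objectives' hierarchy and its timing behaviour, and distinguish "no data collected" from "target missed" when judging success:

```python
# Each indicator records its level in the objectives' hierarchy, whether it is
# leading / coincident / lagging, and separates missing data from missed targets.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    name: str
    level: str        # "context", "input", "output", "result", "impact"
    timing: str       # "leading", "coincident", "lagging"
    target: float
    observed: Optional[float] = None   # None = data not (yet) collected

    def status(self) -> str:
        if self.observed is None:
            return "no data"           # not the same thing as "target missed"
        return "met" if self.observed >= self.target else "not met"

indicators = [
    Indicator("expenditure committed", "input", "coincident", target=1.0, observed=0.9),
    Indicator("joint calls launched", "output", "leading", target=3, observed=3),
    Indicator("societal impact index", "impact", "lagging", target=10),  # lagging: too early to observe
]
for i in indicators:
    print(f"{i.name} ({i.level}, {i.timing}): {i.status()}")
```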

Open issues from the ERAC-GPC "voluntary guidelines": risks and uncertainties of research, time lag, correlation/causality, information circulation, ex-ante assessment, costs/administrative burden…
But also: cross-border joint actions and variable geometry; flexible (case-by-case) and dynamic indicators… Basing them on past experience: is that enough?
New tools… dynamics of concept-based clustering.
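The slide only names concept-based clustering as a new tool; the following is my own minimal sketch of what such an approach might look like, assuming scikit-learn is available. The example abstracts are invented; re-running the clustering as new projects arrive would give a crude view of how the clusters ("concepts") evolve over time.

```python
# Group short project descriptions by shared concepts: TF-IDF vectors + k-means.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

abstracts = [
    "monitoring indicators for marine ecosystems",
    "governance of joint transnational research calls",
    "ecosystem modelling and environmental indicators",
    "evaluation of funding instruments and calls",
]

X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for text, label in zip(abstracts, labels):
    print(label, text)
```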

Evaluation of projects: a long-established experience in the principles of the evaluation procedure, criteria, scoring, reports, panels. See as an example the guide for evaluators as published on the web: /bonus_calls/bonus_call_2012_viable_ecosystem/
Open issues also in project evaluation? Interdisciplinary panels, ranking in the virtual common pot…
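As a hedged sketch of what "ranking in the virtual common pot" can involve (one common ranking list, but each national funder pays only for its own participants), assuming a simple greedy walk down the ranking; the project names, countries and amounts are invented:

```python
# Walk down the common ranking; fund a project only if every country involved
# in it still has enough national budget left for its own participants.
def select_projects(ranked_projects, national_budgets):
    """ranked_projects: list of (name, {country: requested_amount}) in ranking order."""
    budgets = dict(national_budgets)
    funded = []
    for name, requests in ranked_projects:
        if all(budgets.get(c, 0) >= amount for c, amount in requests.items()):
            for c, amount in requests.items():
                budgets[c] -= amount
            funded.append(name)
    return funded

ranking = [
    ("P1", {"IT": 200, "PL": 100}),
    ("P2", {"IT": 300}),
    ("P3", {"PL": 150, "NO": 100}),
]
print(select_projects(ranking, {"IT": 500, "PL": 200, "NO": 100}))
# -> ['P1', 'P2']  (P3 is skipped because PL's remaining budget is insufficient)
```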