IAOD Evaluation Seminar “Demystifying Evaluation in WIPO – Best Practices from Initial Evaluations”, Geneva, 8 November 2012. Evaluation of the Project “Developing Tools for Access to Patent Information” – Key Lessons

Purpose of this presentation
- Briefly present the project we have evaluated;
- Discuss the evaluation approach, the evaluation steps and the methodology used;
- Summarize the key conclusions and recommendations as far as they are of general relevance to WIPO and the seminar participants;
- Share my personal experience in conducting my first evaluation assignment for WIPO – what worked well and suggestions for improvements;
- Draw conclusions and make some suggestions for WIPO's future evaluation work.

Summary of the project that was evaluated
The evaluation of the DA project “Developing Tools for Access to Patent Information” was conducted by IAOD from July to September 2012 with my support as an independent consultant. The project, with a duration of 30 months and a budget of CHF 1,576,000, started in January 2010. It aimed at “enhancing access of developing countries to patent information” by publishing patent landscape reports, developing an e-tutorial and organizing regional conferences.

Evaluation steps used
- Analysis of the Terms of Reference;
- Desk study of documents and analysis of existing data;
- One-day briefing with the Director of IAOD, the Evaluation Section and key internal stakeholders of the project;
- Draft inception report (“procedures for the evaluation”);
- In-depth, face-to-face, semi-structured interviews with internal and external project stakeholders (four days);
- Discussion of preliminary findings, conclusions and recommendations in order to reach alignment with project stakeholders within WIPO;
- Draft report (one week); obtain written comments and amend the report;
- De-briefing and integration of all comments into the report (three days).

Evaluation Approach
- The emphasis of this particular evaluation was on organizational learning, while still serving the purpose of accountability.
- The evaluation approach was interactive and participatory (discussions based on a list of guiding questions rather than driven by what the evaluator feels is important).
- The process itself was designed to contribute to the continuous improvement of WIPO's services.
- The project team was invited to participate in all interviews (the prior consent of interview partners was obtained).
- There was no hidden agenda (e.g. instructions by IAOD on expected “findings” or recommendations).

Methodology
- Assessment of the project against standard evaluation criteria (relevance, efficiency, effectiveness and sustainability of results) in order to provide a well-founded opinion on whether the project provided the right type of support in the right way.
- Different evaluation tools were combined to ensure an evidence-based qualitative and quantitative assessment.
- Particular emphasis was given to cross-validation of data and an assessment of the plausibility of the results obtained.
- The methodological mix included desk studies, a literature review, individual interviews, focus group interviews and direct observation.

Main conclusions of the evaluation
Conclusion 1: The project was generally well prepared and managed, but there is room for further enhancing the existing tools for planning, monitoring and evaluating projects. WIPO does not have a system to track the users of its online services.
Conclusion 2: The project design was clearly overambitious, especially regarding the objectives set for the patent landscape reports. The project duration seems to have been driven by budgeting cycles rather than by realistic estimates of the time needed to achieve the objectives.
Conclusion 3: While the project overall provided the right type of support in the right way, not all of its expected outputs were delivered. It was not possible to assess outcomes, impact and potential sustainability, since most of the outputs had been completed only immediately prior to the evaluation.

Key recommendations of the evaluation
To Project Managers and the DA Coordination Division: Improve project management tools, drawing on internationally recognized best practices. Examples include the use of logical frameworks, the consistent application of results-based financial budgeting and reporting, and the inclusion of a brief assessment against the key evaluation criteria in the self-evaluation reports (rather than only conducting an intermediate assessment of results).
To WIPO Senior Management: Establish a system to collect data on who uses existing services, as a basis for providing tailored information to specific target groups and for actively collecting feedback from them for the continuous improvement of WIPO's services.
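To make this recommendation more concrete: a usage-tracking system of the kind proposed could be as simple as an event log that each online service appends to. The following is a minimal, hypothetical sketch (all service names, field names and file paths are invented for illustration; this is not WIPO's actual system):

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class UsageEvent:
    """One anonymized record of a user interacting with an online service."""
    service: str      # hypothetical service name, e.g. "patent-landscape-reports"
    user_group: str   # self-declared target group, e.g. "patent office", "university"
    country: str      # ISO country code, to track reach in developing countries
    action: str       # e.g. "download", "page-view", "search"
    timestamp: str    # ISO 8601 timestamp, UTC

def record_event(log_path: str, event: UsageEvent) -> None:
    """Append the event to a JSON-lines log that evaluators can aggregate later."""
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(asdict(event)) + "\n")

# Example: a patent-office staff member downloads a landscape report.
record_event("usage_events.log", UsageEvent(
    service="patent-landscape-reports",
    user_group="patent office",
    country="VN",
    action="download",
    timestamp=datetime.now(timezone.utc).isoformat(),
))
```

Aggregating such a log by service, user group and country would give evaluators exactly the kind of factual baseline that this evaluation found missing.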

Key recommendations of the evaluation
To the WIPO Global Infrastructure Sector, on formalizing coordination with other Sectors: Defining the specific responsibilities to be assumed by each programme, and requiring a formal sign-off by the programmes involved, would help to ensure that coordination is less dependent on informal cooperation.

What worked well? Experience as a WIPO Evaluation Consultant
- This was my first assignment for WIPO. As a “newcomer” to WIPO, I appreciated first of all the support I received in familiarizing myself with the organization.
- There was no attempt by IAOD or the Project to push for desired evaluation results or to tone down critical comments in the report.
- After some “warming up” during the briefing, all discussions with stakeholders were open and constructive. The persons interviewed openly shared information and freely exchanged views.
- WIPO staff members actively supported the evaluation process (in particular by arranging meetings with the right persons) and provided access to all relevant information.

What worked well? Experience as a WIPO Evaluation Consultant
- The Project prepared well; for example, it actively conducted user surveys and made them available – crucial for the evaluation!
- The Project used the existing planning and monitoring tools and, going beyond WIPO's minimum standards, even provided a results-based financial report (relating expenditures to outcomes and UN budget lines – best practice within the UN system).
- Both IAOD and the Project provided timely, detailed and meaningful feedback on the report.

What could be improved?
- Need to streamline ToRs: the ToRs ran to 47 pages and were partially self-contradictory, especially between the main text and the annexes. There is a risk that external evaluators with no prior evaluation experience will not understand the requirements of the job.
- Need to consistently align ToRs with WIPO's evaluation policy: my ToRs, for instance, did not require an assessment of efficiency (“value for money”), which should be an essential question. Standard ToRs would make all evaluations comparable and useful as an input to other evaluations (e.g. thematic evaluations).
- Need to clearly define the scope of work: all outputs required for an evaluation should be explicitly mentioned (including the meeting summaries). Otherwise, there is a risk that consultants will not deliver them unless they are paid additionally.

Conclusions
- Project-cycle management tools (project planning, financial and operational reports) are not only important for monitoring but are also key inputs to evaluations. Further improvement is still possible.
- Data collection by projects is crucial as a factual basis for evaluations, because data collected ex post is often not reliable. WIPO should in general gather more information about who uses which services and for what purpose.
- A well-formulated evaluation plan is crucial. WIPO rightly required a detailed inception report.
- There is room for enhancing the consistency, clarity and completeness of evaluation ToRs. Using a single approach and format (WIPO evaluation norms) for all evaluations would make them comparable and useful for larger evaluations (e.g. thematic evaluations).

Conclusions
- An evaluation should be a constructive, participatory process geared towards organizational learning (promoting self-learning), while still ensuring the accountability purpose of the evaluation. This requires a relationship of trust, with no hidden agendas or politically motivated interference.
- Whenever possible, evaluators should seek alignment on the key evaluation results; otherwise, recommendations are unlikely to be understood and implemented. For this purpose, a well-prepared face-to-face de-briefing is essential, if possible after comments on the draft report have been received.

Conclusions
To summarize, working with WIPO has been an extremely positive, pleasant and enriching experience. A shared understanding across WIPO of what evaluation is remains pivotal. Let me therefore warmly congratulate WIPO on organising this important seminar, and let me also thank the IAOD for the kind invitation to contribute as a speaker. I hope you found this brief presentation useful, and I am happy to answer any questions you might have. Thank you for your interest and attention!
Daniel Keller, Director, Swiss Consulting Co. Ltd. (Management and Development Consultants), Hanoi, Vietnam