Slide 1: In-depth Evaluation of R&D Programs – How Could It Be Accountable?
Seung Jun Yoo, Ph.D., R&D Evaluation Center, KISTEP, Korea
Symposium on International Comparison of the Budget Cycle in Research, Development and Innovation Policies
Madrid (Spain), 3-4 July 2008
OECD/GOV/PGC/SBO
Slide 2: Contents
1. Overview of Current Evaluation
2. Architecture of In-depth Evaluation
3. Procedure of In-depth Evaluation
4. Evaluation with Accountability
5. Challenges and Discussion
Slide 3: Overview of Current Evaluation (1)
- All 191 R&D programs are evaluated every year!
- Specific evaluation (mainly using checklists): 27 programs
- Self/meta evaluation: 164 programs
- In-depth evaluation (4 horizontal programs, pilot run):
  : climate-change-related R&D programs
  : university centers of excellence R&D programs
  : infrastructure (facilities/equipment) R&D programs
  : genome research R&D programs
Slide 4: Overview of Current Evaluation (2) – Efficiency & Effectiveness of Evaluation?
- Does it make sense to evaluate 191 programs every year?
- The efficiency and effectiveness of the evaluation itself is questionable, given the characteristics of R&D programs.
- The evaluation load on evaluators, program managers, researchers, etc. is too heavy.
- There is not enough time to prepare and perform evaluation for all R&D programs and to communicate with stakeholders (which might yield poor accountability).
Slide 5: Architecture of In-depth Evaluation (1) – Main Players
[Diagram of the main players:]
- NSTC (National Science & Technology Council): decision maker for R&D evaluation and budget allocation
- MOSF and the line ministries (MEST, MOEMIK, MIFAFF, MW, …): each ministry implements its own R&D programs through its agency
- KISTEP: evaluators, supported by evaluation supporting groups
Slide 6: Architecture of In-depth Evaluation (2) – Evaluation & Budget Allocation
[Flow diagram:]
R&D budget survey/analysis → programs/projects implemented → in-depth evaluation of programs (evaluation group formed) → feedback: used to (re)plan and/or improve the program, and as input for budget allocation
Slide 7: Architecture of In-depth Evaluation (3) – Budget Process
[Flow diagram:]
5-year plan → ministry budget ceiling → 1st budget review → program budget → 2nd budget review with evaluation results → Budget Committee of the National Assembly (Dec.)
(Players: NSTC, Ministry of Strategy & Finance (MOSF), National Assembly)
Slide 8: Procedure of In-depth Evaluation (1) – 7-Month Schedule (suggested!)
- Month 0: program(s) selected by the selection committee, based on special issues, etc.
- In-depth evaluation procedure for the selected program(s):
  : Month 1: form the evaluation group, gather program data, study the target R&D program(s), and identify the major evaluation points
  : Month 2: develop a logic model (with system dynamics, etc.)
  : Months 3-4: perform the in-depth analysis (relevance, efficiency, effectiveness, program design & delivery, etc.)
Slide 9: Procedure of In-depth Evaluation (2)
- Month 5: interviews (researchers, program managers, etc.)
- Month 6: report interim evaluation results (to MOSF and the department(s))
- Month 7: report the final evaluation results & recommendations
Slide 10: Evaluation with Accountability (1) – Responsibility 1
- A balance between quantitative and qualitative evaluation is important
  : a systematic approach to qualitative evaluation is challenging
  : program goals vs. projects implemented vs. outputs
- Enough time for evaluation is essential (hence the 7-month schedule)
  : to achieve the goal of evaluation with accountability
  : gives program managers enough time to cope with the evaluation process
- This approach is therefore suitable only for a limited number of programs
Slide 11: Evaluation with Accountability (2) – Responsibility 2
- Qualitative assessment is needed to achieve the purpose of evaluation
  : is a simple count of publications and patents enough?
  : publications: impact factor (1-2 yrs), citation index (more than 3 yrs)
  : patents: for commercial purposes, technology value evaluation
- Selected projects with excellent performance: consistent funding is required regardless of the program evaluation!
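To make the bibliometric indicators on this slide concrete: the standard 2-year journal impact factor is the number of citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A minimal sketch (the figures in the example are hypothetical, not from the presentation):

```python
def impact_factor(citations_to_prev_two_years: int, items_prev_two_years: int) -> float:
    """2-year impact factor: citations in year Y to articles published in
    years Y-1 and Y-2, divided by the number of citable articles published
    in Y-1 and Y-2."""
    if items_prev_two_years == 0:
        raise ValueError("no citable items in the two-year window")
    return citations_to_prev_two_years / items_prev_two_years

# Hypothetical example: 300 citations in 2008 to the 120 articles
# a journal published during 2006-2007.
print(impact_factor(300, 120))  # 2.5
```

The slide's caveat stands: such an indicator is only meaningful 1-2 years after publication, which is why qualitative assessment must complement it.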
Slide 12: Evaluation with Accountability (3) – Acceptability 1
- Understand the characteristics of the program well, and share that understanding with stakeholders
- Performance indicators are useful tools for getting stakeholders' agreement
  : researchers, program managers, MOSF, etc.
  : for setting up the evaluation strategy and evaluation points
  : especially important for acceptability and for improving program delivery
Slide 13: Evaluation with Accountability (4) – Acceptability 2 (Understand & Change!)
- Communication with stakeholders
  : interviews with stakeholders are important for increasing accountability
  : researchers, program managers, MOSF
  : the evaluation strategy is better shared at the beginning
- The number of interviews is also important
  : a lack of understanding of the evaluation is a key inhibitor of accountability!
  : interviews at major steps, such as setting the evaluation strategy, surveying the program's weak/strong points, reporting interim evaluation results, etc.
Slide 14: Challenges and Discussion (1) – Understand & Change & Improve!
- Stakeholders should understand their program(s)
  : otherwise they become rigid and overly defensive, resisting any change
- A systematic way to understand the diverse aspects of programs is needed
  : goals, contents, projects, design & delivery, etc.
- Program information must be shared to enable change
  : change and improvement (for all stakeholders)
Slide 15: Challenges and Discussion (2) – Scientific and Socio-economic Interest
- Technology impact evaluation supports socio-economic understanding
- Results of technology level evaluation are also useful
Slide 16: Challenges and Discussion (3) – From Communication to Consultation
- Communication among stakeholders (ministries/agencies, researchers, MOSF, KISTEP, etc.)
- For better evaluation practices, communication should be transformed into consultation
Slide 17: Muchas gracias! (Thank you very much!)