Evaluation from the External Evaluator’s Perspective


1 Evaluation from the External Evaluator’s Perspective
MINISTERUL FINANŢELOR PUBLICE (Ministry of Public Finance)
Autoritatea de Management pentru Cadrul de Sprijin Comunitar (Managing Authority for the Community Support Framework)
Unitatea Centrală de Evaluare (Central Evaluation Unit)
Evaluation Working Group – second training seminar for evaluation staff of Romanian NSRF and Operational Programmes
Dr Jim Fitzpatrick, Fitzpatrick Associates Economic Consultants, Ireland
May 18, 2006

2 Content
- Evaluation as a “process”
- Tasks in a typical evaluation process
- Methodologies in practice
- Common problems in practice
- Procurement of evaluation - typical stages
- Some “tips” for evaluation commissioners
- Pitfalls the evaluator faces
- Some tips for evaluators
- Wider issues for the future

3 Some overall considerations from the external consultant’s perspective…
- A wide variety of different contexts (e.g. doing v supervising, policy v service delivery, ex ante v ex post, technical v non-technical)
- “Planning” and “doing” closely related
- Experience across a wide range of organisations, topics, etc.
- Overlaps with planning other types of assignments
- An external consultancy perspective

4 Evaluation is a process, not just a technique!
Evaluation is a balancing act between:
- Client and user relations
- Research and analysis
- Managing the team
- Stakeholder involvement
- Time, resources and budget

5 Tasks in a typical evaluation process…
1. Establish/Understand Context
   - who is the “client”?
   - why is the evaluation being done?
   - is any specific use intended?
   - what kind of evaluation is needed?
2. Obtain/Prepare/Agree Brief (ToR)
   - is there one? is it clear? should you write one? is it agreed?

6 Evaluation tasks (continued)…
3. Prepare Work Plan (Proposal)
   - overall approach (i.e. how the brief is interpreted, how to go about it)
   - analytical framework (i.e. the overall logic)
   - methodology/techniques (e.g. CBA, CEA, MCA, benchmarking; a CBA sketch follows below)
   - work programme (i.e. the data and data collection, e.g. surveys, interviews*)
4. Evaluation Team/Resources (budget)
   - no. of people/person-days
   - types of people
   - necessary expertise (e.g. on technical aspects)
*data not just statistics; includes other information
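To make one of the named techniques concrete, here is a minimal sketch of the core cost-benefit analysis (CBA) calculation: discounting an intervention’s yearly net cash flows to a net present value. The figures and the 5% discount rate are hypothetical, invented for illustration, not taken from the presentation.

```python
# Minimal cost-benefit analysis (CBA) sketch: net present value (NPV) of an
# intervention's yearly net cash flows. All figures are hypothetical.

def npv(cash_flows, discount_rate):
    """Discount yearly net cash flows (year 0 first) to their present value."""
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Year 0: up-front cost; years 1-4: estimated net benefits.
flows = [-100_000, 30_000, 35_000, 35_000, 30_000]
print(f"NPV at 5%: {npv(flows, 0.05):,.0f}")  # a positive NPV favours the intervention
```

CEA (cost-effectiveness analysis) differs mainly in relating cost to a non-monetised outcome measure (e.g. cost per job created) rather than netting monetised benefits.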

7 Evaluation tasks continued…
5. Doing the Evaluation
   - implement the method/work programme
   - client relations
   - manage the team
   - deal with unexpected issues
6. The Output/Report/Schedule
   - meetings: how often, how many, when?
   - nature of the report, e.g. length? style? presentations?

8 Evaluation methodology in practice…
- trying to establish whether the intervention did (or will) make a difference
- so a “with-without” comparison (the scientific method at its core)
- formal quantitative techniques are very desirable, but very difficult in practice
- MCA/scoring, weighting and ranking are the most used (see the sketch below)
- others useful are: before v after (time-series); places that do and don’t have the intervention (“control group”); “expert” opinion; views of stakeholders
- you always need some framework for answering the evaluation questions (samples available)
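Since scoring, weighting and ranking is named as the most used technique, below is a minimal sketch of that step of a multi-criteria analysis (MCA). The criteria, weights and scores are hypothetical, chosen only for illustration; a real evaluation would agree them with the client and steering group.

```python
# Minimal multi-criteria analysis (MCA) sketch: score each option against
# weighted criteria and rank by weighted total. Criteria, weights and
# scores below are hypothetical.

weights = {"effectiveness": 0.4, "cost": 0.3, "sustainability": 0.3}

# Scores on a 1-5 scale for each option against each criterion.
scores = {
    "Option A": {"effectiveness": 4, "cost": 2, "sustainability": 3},
    "Option B": {"effectiveness": 3, "cost": 4, "sustainability": 4},
}

totals = {
    option: sum(weights[c] * s for c, s in criteria.items())
    for option, criteria in scores.items()
}

# Rank options from highest to lowest weighted total.
for option, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{option}: {total:.2f}")
```

The weights carry the value judgements, so re-running the ranking with different weights is a cheap sensitivity test against a result that only holds under one set of weights.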

9 Common problems in practice…
- poor initial project/programme design
- inability to control for external influences
- poor/unavailable indicators (too few, too many, not really capturing the essence of the intervention)
- lack of consensus about the purpose of the evaluation
- “scope creep”

10 Procurement of evaluation - typical stages…
- Policy issue or topic; regulatory requirement
- Terms of Reference, brief
- Invitations, tendering
- Selection, contracting
- Inception
- Managing, undertaking, analysing
- Reporting

11 Common Challenges, Good Practice
Stage 1. Policy issue, topic, need
- Challenges: lack of clarity; lack of client consensus
- Good practice: clarity and consensus; spell out objectives

Stage 2. Terms of Reference, Brief
- Challenges: the problems of stage 1 flow in here; over-specifying the methodology; the “piece of string” problem
- Good practice: don’t rush the ToR; focus on objectives

Stage 3. Procurement, Tendering
- Challenges: poor procedures; overly formalised competitive procedures which prevent dialogue

Stage 4. Selection, Contracting, Inception
- Challenges: disproportionate procedures; a rush to get started
- Good practice: proportionality; don’t rush the Inception

12 Common Needs, Challenges and Good Practice (Cont’d)
Stage 5. Managing, Undertaking, Analysing
- Challenges: unrealistic deadlines; absence of data; data collection v analysis; multiple stakeholders; “scope creep”
- Good practice: good project management; role of Steering Committees?; avoid surprises

Stage 6. Reporting
- Challenges: emphasis on written “tomes”; difficulty finalising reports; “reversing” compromises into reports
- Good practice: separation of the consultants’ recommendations from the policy decision

13 Some comments on the procurement process…
- You need to balance competition with the need for dialogue with evaluators
- Can you invite too many bidders? There is a need for guidance on scale
- Circulate all replies to bidders’ questions to all bidders!
- Ability and availability of the client’s representatives

14 Some practical tips for evaluation commissioners…
- ensure programme/project planning is good (monitoring and evaluation considered at the outset)
- make sure the Terms of Reference have clarity and focus, and indicate scale
- relationships: be open post-selection; avoid surprises; take time to get a shared understanding of what’s happening
- ensure there is some kind of method/framework being used
- performance indicators: use them “sensibly”, and note they are the fuel of monitoring/evaluation, not monitoring/evaluation itself

15 Pitfalls the evaluator faces…
- Misunderstanding the context
- Objectives unclear or not agreed
- Client unclear or not agreed
- Lack of balance; being one-dimensional
- Thinking you know the answer
- Work that’s not used in the end
- Having no analytical framework
- Being over-ambitious
- Not having the right expertise
- Failing to consult stakeholders
- Not allowing time for project/process management
- No “intellectual leadership”
- The report not doing the work justice

16 Some practical tips for the evaluator
- watch for “scope creep”
- keep re-reading the brief
- estimate the time needed and double it!
- avoid surprising the client
- don’t over-promise
- structure the report early on
- set internal deadlines
- SATISFACTION = PERCEPTIONS MINUS EXPECTATIONS (S = P - E)
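The closing mnemonic is worth writing as a formula. A worked illustration follows; the scores are hypothetical, chosen only to show the direction of the effect:

$$ S = P - E $$

If a client expects the work at level $E = 9$ on a 10-point scale and perceives the delivered work at $P = 7$, then $S = 7 - 9 = -2$: a dissatisfied client despite good work. Under-promising lowers $E$ and so raises $S$ for the same $P$, which is why “don’t over-promise” appears in the list above.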

17 Wider Issues for the Future
- the extent of an evidence-informed evaluation culture
- the need for research and evaluation to “speak” to policy makers
- the need for more basic, neutral data collection
- the balance between “independence” and “relevance”
- emphasis on the costs of research/evaluation v the costs of poor policy decisions
- more inter-disciplinary research and evaluation (e.g. “economic” v “social”)
- over-evaluation of some areas, under-evaluation of others

