Module 7 - Evaluation: Quality and Standards
Overview of the Module
– How the evaluation will be done
– Questions and criteria
– Methods and techniques
– Quality
Qualities of an Evaluation
For an evaluation to be useful, it must be:
– Credible (trustworthy), which means: reproducibility of findings (another evaluator would reach the same conclusions); a clear and total distinction between conclusions and recommendations; objectivity, impartiality and pluralism in the implementation; independence from project management or policy bodies; high-quality, transparent methods and competent evaluators
– Relevant, which means: the initial questions asked of the evaluators relate to the concerns of the sponsors, all services involved and all stakeholders in the project or the policy; the evaluation is carried out at the right time, in response to a demand from the people in charge of the project / policy, with the improvement of the design or the implementation of their programs as its goal
– Accessible, which means: results are communicated; the report is well organized, of appropriate size and written in a clear, concise style; all arguments are presented clearly, exhaustively and in detail, with a clear distinction between observations, hypotheses and opinions
Determinants of Quality
Quality is a process: it must be ensured all along the chain:
– Quality of the demand
– Quality of the terms of reference
– Quality of the evaluation questions
– Quality of the evaluators
– Quality of the preparation
– Quality of the execution
– Quality of the analysis
– Quality of the reporting
Quality of the Demand
– Justified by a will to improve
– Made in a climate of transparency and partnership
– Relevance of the questions
– Timing
– Coherence between the question and the means
– Quality of the terms of reference (which are an expression of the demand)
Quality of the TOR
Terms of reference (TOR) are a statement of expectations for the evaluation. They generally include the issues to be examined and details about the required methodology, scheduling, cost and evaluator qualifications.
Terms of Reference (TOR) Should Include…
– Scope / focus: the issues
– Stakeholders
– Requirements
– Cost
– Schedule
– Qualifications of evaluators
– Deliverables / products
– Who the clients are
Why Are TOR Necessary?
– To clarify the reasons for the evaluation
– To flag issues that have become apparent
– To indicate the general depth and scope required
– To indicate any imperatives
– To protect the evaluator
Quality of the Questions and "Evaluability" of the Program
Discussion: Qualities of the Evaluators
– What should be the profile of an evaluator?
– What skills, knowledge and attitudes should an evaluator have?
Qualities Required of an Evaluator
– Initiative and innovation
– Independence
– Knowledge of the field
– Analytical skills
– Interpersonal skills
– Project management skills
– Synthesis and writing skills
Quality of the Preparation
At the core of the preparation is the choice of the approach to be used (an approach that must be adapted to the program), but also the quality of the program of activities, well-prepared tools and instruments, etc.
A process is followed to prepare for an evaluation, and all of its stages must be respected and properly carried out (see Module 8 for a description of the various stages).
Quality of the Implementation
– What was planned was implemented without surprises or accidents (the final report will recount how the implementation unfolded)
– Quality of the actions taken
– Quality of the argumentation and the conclusions
Quality of the Analysis: How to Make Judgments
Criteria
– Agree on criteria and assess the case with respect to those criteria (see the sketch below)
– Criteria need to be known and seen
Norm
– Look at what other good organizations in the setting do and use that as a benchmark
Expert panel
– Eminent people look at the data and use their judgement, based on their experience
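As an illustration of the criteria approach, here is a minimal Python sketch. The indicator names, observed values and thresholds are all hypothetical, invented for the example; an actual evaluation would use the criteria agreed with stakeholders.

```python
# Hypothetical criteria: each has an agreed threshold and a direction
# ("lower" or "higher" is better). The case is assessed against each one.
criteria = {
    "cost_per_beneficiary": {"observed": 42.0, "threshold": 50.0, "better": "lower"},
    "completion_rate": {"observed": 0.78, "threshold": 0.75, "better": "higher"},
}

for name, c in criteria.items():
    if c["better"] == "lower":
        met = c["observed"] <= c["threshold"]
    else:
        met = c["observed"] >= c["threshold"]
    status = "met" if met else "not met"
    print(f"{name}: observed {c['observed']} vs threshold {c['threshold']} -> {status}")
```

The point of the sketch is that the thresholds are fixed and visible before the assessment is made, which is what makes the resulting judgment transparent.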
Multiple Lines of Evidence: Triangulation
– Interview data
– Focus group data
– Questionnaire data
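A minimal sketch of triangulation, using hypothetical themes: a theme is treated as a corroborated finding only when at least two independent lines of evidence support it (one common convention, not the only possible rule).

```python
# Hypothetical themes coded from each line of evidence.
interviews = {"staff turnover", "unclear mandate", "funding delays"}
focus_groups = {"staff turnover", "funding delays"}
questionnaires = {"funding delays", "training gaps"}

sources = [interviews, focus_groups, questionnaires]
all_themes = set().union(*sources)
# Keep only themes supported by at least two independent sources.
corroborated = {t for t in all_themes if sum(t in s for s in sources) >= 2}
print(sorted(corroborated))  # ['funding delays', 'staff turnover']
```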
The Data Analysis Must Be Carried Out with Care
Group discussion:
– Based on your experience, what factors can affect the analysis of the data collected? Give examples from your countries.
Issues in Data Analysis
Problems (see the screening sketch below):
– A lot of data but difficulty in making use of it
– Contradictory data
– Insufficient data
– Unreliable data
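A minimal sketch of how two of these problems can be flagged before analysis begins. The records and field names are hypothetical; real screening rules would come from the evaluation's own instruments.

```python
# Hypothetical survey records: one clean, one contradictory, one incomplete.
records = [
    {"id": 1, "attended_training": "yes", "days_of_training": 5},
    {"id": 2, "attended_training": "no", "days_of_training": 3},
    {"id": 3, "attended_training": "yes", "days_of_training": None},
]

for r in records:
    if r["days_of_training"] is None:
        print(f"record {r['id']}: insufficient - days_of_training is missing")
    elif r["attended_training"] == "no" and r["days_of_training"] > 0:
        print(f"record {r['id']}: contradictory - no training reported, but positive training days")
```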
Formulating Data
– Facts
– Findings
– Conclusions
– Recommendations
– Lessons learned
Fact Versus Finding
A fact is a piece of information that has been verified:
– There has been a 20% increase in program costs in the last 3 years
A finding is an analysis of related facts:
– Although the cost of the program has increased, there has been a 10% increase in productivity
(The sketch below works through these two figures.)
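One way to see the added value of the finding is to relate the two facts arithmetically. The sketch below assumes, purely for illustration, that the 10% productivity increase means 10% more output for the same mix of inputs; under that assumption, cost per unit of output still rose by roughly 9%.

```python
# Relating the slide's two facts (assumption: +10% productivity = +10% output).
cost_growth = 1.20    # program costs rose 20% over the last 3 years
output_growth = 1.10  # output rose 10% over the same period
cost_per_unit_output = cost_growth / output_growth
print(f"cost per unit of output changed by {(cost_per_unit_output - 1) * 100:.1f}%")
# -> cost per unit of output changed by 9.1%
```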
Conclusions
– A conclusion covers a major aspect of the evaluation and is generally based on a collection of findings
– Conclusions are often saved for the concluding chapter of an evaluation report
Recommendations
– Directed to a responsible person/body
– State clearly what is to be done
– State by when it is to be done
Lessons Learned
– A lesson is a hypothesis based on the findings of one or more evaluations
– A lesson is presumed to relate to a general principle that may be applied more widely
Lessons Learned
Example from an evaluation of a corporate training program:
– "The outcomes of training are more likely to be transferred to the job when the immediate supervisor supports the transfer process by meeting with the employee and developing a plan."
Quality of Reporting: When Do You Communicate?
Before the evaluation
– To ensure that people are informed of the purpose and objectives of the evaluation and of their role in it
During the evaluation
– To ensure that people are informed of progress
After the evaluation
– To disseminate results, decisions made and follow-up
Effective Reports
– Respond to the questions and issues defined in the TOR
– Are short, succinct and well organized
– Make judicious use of graphs, tables and charts
– Are developed with stakeholder participation
– Are delivered on time
Formal Report Outline
– Executive summary
– Introduction
– Program description
– Evaluation questions
– Methods
– Findings
– Conclusions and recommendations
– Appendix: instruments, TOR
Ways to Communicate in an Evaluation
Informal
– Phone calls
– E-mails
– Quick faxes
– Internal correspondence
Formal
– Briefings
– Presentations
– Written reports
Different Audiences Have Different Needs
– Internal staff might need a verbal report and a memo with key points
– Donors and external stakeholders might need a full report
– Ministries might need an abstract
– The public at large might need a précis of findings
Know your audience and match your reporting approach.
Effective Communication of Evaluation Results
– Captures the data in its conclusions
– Speaks the language of users
– Takes a detached, non-possessive stance
– Objective: speaks "truth" to power, but
– Pragmatic: goes only as far as the key stakeholders will accept