1 Designing and Conducting Useful Self-Evaluations at UNESCO
Hallie Preskill, Ph.D., University of New Mexico, USA, and Brad Cousins, Ph.D., University of Ottawa, Canada
June 2004
2 Workshop Objectives
As a result of having taken this workshop, participants will:
- Understand how this workshop fits in the broader scope of evaluation at UNESCO.
- Understand how self-evaluation in UNESCO can be useful and potentially contribute to individual, team, and organizational learning.
- Understand how to practically and realistically design, implement and use self-evaluation studies as a working tool in the current context of their projects or activities.
- Have developed a self-evaluation plan for a project or activity in which they are involved.
- Know how to integrate self-evaluation activities into existing work structures and processes.
3 Agenda
- Evaluation in UNESCO
- Components of a Self-Evaluation Plan
- Focusing Your Self-Evaluation
- Choosing Among Data Collection Methods
- Analysing Evaluation Data
- Communicating & Reporting Evaluation Processes & Findings
- Reflecting on the Context of Self-Evaluations
- Maximizing the Usefulness & Impact of Self-Evaluations
- Workshop evaluation and follow-up
4 Background for this Workshop: UNESCO Evaluation Strategy
- The workshops are part of a set of capacity-building activities in self-evaluation, implemented by the Internal Oversight Service (IOS) on a pilot basis, mainly in collaboration with the Education Sector.
- This initiative is an important part of the implementation of the "UNESCO Evaluation Strategy" developed by IOS and endorsed by the Executive Board.
- The Evaluation Strategy (as well as other recent audit and evaluation reports) calls for self-evaluation as a necessary complement to external independent evaluation.
5 Definition of Evaluation
A systematic assessment of a planned, ongoing or completed intervention to determine its relevance, efficiency, effectiveness, impact and sustainability. The intent is to incorporate lessons learnt into the decision-making process. (Source: adapted from OECD/DAC Glossary, 2002)
Evaluation is context-dependent and involves (albeit systematic, data-based) value judgement. It is thus different from traditional scientific research, which claims to be value-free and aims at generalizability of findings across times and locations. This differentiation becomes blurred, however, especially with applied social science research: such research is becoming more utility- and action-oriented (e.g., "action research"), and the complexity of the ever-changing applied setting makes all decisions and interpretations of the researcher value-dependent, and generalizability contestable, even in highly controlled studies.
6 Judgement
Judgement implies comparison of program performance data against some standard:
- Performance in the program at a prior point in time
- Performance of those receiving similar programs (comparative treatment)
- Performance of those receiving no program (control)
- An external standard
7 Evaluation is the use of systematic inquiry to make judgements about program merit, worth and significance and to support program decision making.
- Summative evaluation (judgement)
- Formative evaluation (improvement)
Who is the judge?
8 External Evaluation
OECD Glossary definition of external evaluation (2002, p. 23): The evaluation of a development intervention conducted by entities and/or individuals outside the donor and implementing organisations.
- Independent, systematic approach to answering evaluative questions
- Typically commissioned by senior management
- Written into the C/5 or conducted upon donor demand
- IOS facilitates the process and oversees the quality of the evaluations
- Conducted by evaluation experts external to UNESCO
- Selection of C/5 evaluations is presented to ExB
(The C/5 is the programme planning document that is developed each biennium; ExB = Executive Board.)
9 Self-Evaluation
OECD/DAC Glossary definition of self-evaluation (2002): An evaluation by those who are entrusted with the design and delivery of a development intervention.
In the context of the UNESCO Evaluation Strategy: Self-evaluations are small-scale evaluation projects carried out by staff and management as part of their everyday work activities, which help them collect and use monitoring and evaluation data to answer their own questions concerning the quality and direction of their work.
10 Purposes of Self-Evaluation
- Provides opportunities for continuous reflection and learning (individual, group, organization)
- Provides timely information for decision making and action at the day-to-day implementation level
- Draws on organization members' knowledge of the project and evaluation context
- Results in useful findings; recommendations meet specific information needs
- If done well, results come from systematic, valid, and purposeful processes; this minimizes perceptive fallacies
- Provides an opportunity to share achievements
- Documents what works, what does not, and possible reasons why
11 Benefits of Using a Collaborative Approach to Self-Evaluation
- Greater credibility to those involved
- Shared work saves resources and creates team spirit
- Increased learning through reflection and dialogue with others
- More informed interpretations of findings
- Greater breadth of recommendations
- Enhanced stakeholder evaluation capacity
12 A Systems Framework for Evaluation
- The Evaluation Process
- The Evaluation Environment
- The Organization's Environment
- External Requirements and Demands
It takes a lot to ensure that evaluation contributes to the learning of individuals, teams and the whole organization. These are some examples of factors that play a role:
- High prevalence of political agendas that predetermine the intended use of evaluation findings
- Leadership and/or staff not open to negative feedback for fear of sanctions
- No open communication culture, again out of fear of sanctions for saying something "wrong" or "politically incorrect"
- Strict hierarchies which do not encourage teamwork and prevent intellectual stimulation from exchanges with colleagues
- Low staff morale because of a restrictive work environment and a high level of bureaucracy
13 An Evaluation Use Conceptual Framework
[Diagram linking: Evaluation Resources and Context; Evaluation Practice; Evaluation Knowledge Production; Process Use; Use of Findings; Decision or Policy Setting]
14 Evaluation Practice
- Planning (divergent / convergent)
- Instrument development
- Data collection, processing
- Data analysis, interpretation
- Reporting and follow-up
15 Self-Evaluation Plan Components (Terms of Reference)
Identifying Self-Evaluation Team Members
Focusing the Self-Evaluation:
- Background information (and logic model)
- Purpose of the evaluation
- Evaluation stakeholders (intended users of results)
- Evaluation scope (key questions)
Designing and Implementing the Self-Evaluation:
- Data collection methods, instruments, sample
- Evaluation timeline with specified roles and responsibilities
- Communicating and reporting plan
- Budget
16 Self-Evaluation Stakeholders
Users of the evaluation findings:
Primary
- Yourself / your team
Secondary
- Implementers of projects/activities
- Colleagues doing similar work
- BSP (to feed into current reporting requirements)
- Immediate or intermediate managers
- Leadership of the organization
17 Evaluation Key Questions
Key questions:
- Are the broad, overarching questions that guide the evaluation
- Form the boundaries and scope of the evaluation
- Are typically written in an open-ended format
- Guide the choice of data collection methods
- Reflect the stakeholders' information needs
18 Sample Self-Evaluation Key Questions
- To what extent does the project bring about the intended changes in its target group?
- How can this project benefit from enhanced collaboration with partners?
- Why does this activity work well in one region, but not in another?
- For whom is this project working best? Why?
- What additional services, materials, and/or activities are needed to reach better outcomes?
- What are the unintended consequences of this activity?
19 Using a Program's Logic Model to Focus a Self-Evaluation
A logic model:
- Articulates a program's theory of action, i.e. how it is supposed to work.
- Is a systematic and visual way to represent a program's underlying theory.
- Helps focus an evaluation by making assumptions and expectations explicit.
- Increases stakeholders' understanding of a program and its evaluation.
20 Logic Model Template
- Assumptions: The underlying assumptions that influence the project's design, implementation or objectives
- Resources: Human, financial, organizational & community resources needed to achieve the project's objectives
- Activities: Things the project does with the resources to meet its objectives
- Outputs: Products of implementing the activities, which are necessary but not sufficient indications of achieving the project's objectives
- Short-term Outcomes: Short-term intended and unintended changes (e.g., in knowledge, attitudes, skills) as a result of the project
- Long-term Outcomes: Long-term intended and unintended changes (e.g., in behavior, status, systems) as a result of the project
For outputs: if you were to undertake capacity building in an Education Ministry, one of your activities would be to conduct a workshop, and one of the outputs would be "20 ministry staff attended" – which says nothing about whether they have learned anything and are in fact able to apply what they have learned for educational planning (those would be outcomes / changes). If someone asks why we are not using the UNESCO SISTER/RBM language, we can say that for the purpose of the self-evaluation projects we do not find it useful. Which example will we discuss?
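A minimal sketch (not part of the workshop materials) of how the template's columns could be captured in a structured form, written here in Python. The entries reuse the hypothetical Education Ministry example from the notes above and are otherwise invented.

```python
# Illustrative sketch only: the logic model columns for a hypothetical
# Education Ministry capacity-building project. All entries are examples,
# not data from an actual UNESCO activity.
logic_model = {
    "assumptions": ["Ministry staff can be released from their duties to attend"],
    "resources": ["Two facilitators", "Training budget", "Ministry meeting room"],
    "activities": ["Conduct an educational-planning workshop for ministry staff"],
    "outputs": ["20 ministry staff attended the workshop"],
    "short_term_outcomes": ["Staff gain knowledge and skills in educational planning"],
    "long_term_outcomes": ["Staff apply what they learned when preparing sector plans"],
}

# Listing the columns side by side makes the assumed causal chain explicit,
# which is what helps focus the evaluation questions.
for column, entries in logic_model.items():
    print(f"{column}: {'; '.join(entries)}")
```

Reading down the columns, the gap between "attended" and "able to apply" is exactly where the outputs-versus-outcomes distinction from the notes shows up.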
21 Developing a Logic Model for Your Self-Evaluation - Activity
Think of a project or work activity that you would like to self-evaluate. It should be an evaluation:
- That is narrow in scope
- That is doable
- Where there is an intended use of findings
- Where there are realistic opportunities for using the findings
You may work in groups of 1-3, depending on how your work is actually organized. Using the Logic Model Template worksheet, begin to develop a logic model for your program/activity. Try to make a few notes in each of the columns. We have 30 minutes for this activity. Refer to IOS "coaches" if present.
22 Focusing Your Self-Evaluation - Activity
Revisit the Logic Model you began to draft. Using the worksheet "Focusing Your Self-Evaluation":
- Make some notes regarding the background of the program/activity
- Write an evaluation purpose statement
- Develop 2-3 evaluation questions
- Identify potential self-evaluation stakeholders
- Describe the intended use of the self-evaluation's findings
23 Criteria for Choosing Among Data Collection Methods
- Evaluation questions
- Stakeholder preferences
- Respondent characteristics
- Respondent availability/accessibility
- Level of acceptable intrusiveness
- Validity (trustworthiness of data)
- Costs (time, materials, subject matter experts)
- Organization's experience
24 A Menu of Data Collection Methods
- Surveys (mail, online, phone; open-ended, closed questions)
- Interviews (individual, focus group; conversational, semi-structured, structured)
- Observations (quantitative-structured; qualitative-unstructured)
- Records and documents (e.g., meeting minutes, e-mails, technical reports, existing databases)
- Tests (paper, simulation, computer)
We should stress that good ongoing record keeping is a good way to amass evaluation data, and ask participants to think carefully about which existing sources of data they could use (so they do not need to be the ones actually collecting the data themselves).
25 Enhancing the Validity of Data
- Pilot testing: Try out the interview protocol, survey, or observation form with a sample similar to the respondent population, or have it critiqued by colleagues and/or experts.
- Triangulation: Multiple methods, data sources, evaluators, and/or theories
- Sampling: Random/probability (generalizable); nonrandom/nonprobability (not generalizable). See the sketch below.
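A minimal sketch, assuming a simple respondent list (all names are invented placeholders), of the difference between a random (probability) sample and a convenience (nonprobability) sample:

```python
# Illustrative sketch only; respondent names are invented placeholders.
import random

respondents = [f"Participant {i}" for i in range(1, 41)]  # e.g., 40 workshop participants

# Probability sample: every respondent has an equal chance of being selected,
# so findings can (with the usual caution) be generalised to the whole group.
random_sample = random.sample(respondents, k=10)

# Nonprobability sample: e.g., the first ten who happen to be easy to reach;
# quicker, but findings only describe those actually included.
convenience_sample = respondents[:10]

print("Random sample:", random_sample)
print("Convenience sample:", convenience_sample)
```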
26 Designing Your Self-Evaluation - Activity
- Transfer your evaluation questions to the worksheet (top row).
- Discuss and note which data collection methods might be most appropriate and feasible for your self-evaluation study.
- Discuss and note who the respondents might be and whether you will include the entire population or select a sample (indicate how many you would like to include in your sample).
We have 20 minutes allocated for this activity.
27 Considerations for Analyzing Data
- Evaluation key questions
- Stakeholders' understanding of, and experience with, data analysis methods
- Types of data (quantitative, qualitative)
- Levels of quantitative data (nominal, ordinal, interval)
- Choices for analyzing quantitative data
- Choices for analyzing qualitative data (both are illustrated in the brief sketch below)
- Evaluator skills and time - budget implications
Without being condescending, this might be where we in IOS can be most useful as resource persons.
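As a small illustration of the analysis choices listed above, here is a sketch using invented data: descriptive statistics for a closed (ordinal) survey item, and frequency counts for open-ended responses that the team has already coded into themes.

```python
# Illustrative sketch only; all responses below are invented examples.
from collections import Counter
from statistics import mean, median

# Closed question, e.g. "How useful was the training?" rated 1-5 (ordinal data).
ratings = [4, 5, 3, 4, 4, 2, 5, 4]
print("n =", len(ratings), "| mean =", round(mean(ratings), 2), "| median =", median(ratings))

# Open-ended responses coded into themes by the self-evaluation team.
coded_themes = ["relevance", "timing", "relevance", "materials", "timing", "relevance"]
print(Counter(coded_themes).most_common())
```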
28 Why Communicate and Report?
- To help organization members learn from one another and jointly improve their work
- To build internal capacities: to learn about UNESCO's substantive work and about evaluation practice
- To inform decision making by program staff and management about changes that will improve their own performance as well as overall organizational performance
29 Why Communicate and Report? (continued)
- To inform funders, community members, clients, customers, program staff, management, other parts of the organization, and other organizations
- To demonstrate results and accountability
- To build awareness and support within your unit, division, sector, or across sectors and other organizational entities
- To reflect jointly with others on findings and derive future actions
- To aid decision making about continued implementation and funding, as well as replication at other sites
30 Communicating and Reporting Strategies
Strategies that facilitate individual learning:
- Short communications: memos, e-mail, postcards
- Interim reports
- Final reports
- Executive summaries
- Newsletters, bulletins, briefs, brochures
- News media
- Website communications
Strategies that facilitate interactive (group) learning:
- Verbal presentations
- Videotape / computer-generated presentations
- Posters and poster sessions
- Working sessions
- Synchronous electronic communications
- Personal discussions
- Photography
- Cartoons
- Drama / performance
- Poetry
There is a handout to go with this slide.
31 Developing Your Communicating and Reporting Plan - Activity
- Using the Communicating and Reporting Plan worksheet, work on Steps 1-6.
- Steps 7-8 can be completed when more of your self-evaluation plan has been developed.
We have 15 minutes allocated for this activity.
32 How Can We Maximize the Usefulness and Impact of Our Self-Evaluations?
- Hold meetings with each other to discuss progress, ask questions, seek feedback
- Use the evaluation planning worksheets provided in this workshop
- Record questions and lessons learned throughout the process
- Make use of the IOS resource person specifically available to support self-evaluation projects
- Consider linkages between this self-evaluation work and RBM reporting requirements
- Participate in a poster session in mid-September to share findings from the planned self-evaluations
33 Use and Non-Use of Evaluation
[Diagram: typology of evaluation use; labels include legitimate use, misuse, non-use, ideal use, mistaken use, mischievous use, rational non-use, political non-use, and abuse]
34 Workshop Follow-up
- Current status of the "Learning from Evaluation" survey process (with Education Sector staff)
Follow-up to this workshop:
- Support for self-evaluation projects (IOS contact: Sandy Taut)
- Online support materials: slides, handouts, workshop audiotape transcription
- Ongoing assessment of self-evaluation processes based on observations and discussions
35 Additional Resources
- Canadian Evaluation Society
- American Evaluation Association
- Australasian Evaluation Society
- European Evaluation Society
- Société Française de l'Évaluation
See these organizations' websites for standards of professional practice, ethics, etc.
- Canadian Journal of Program Evaluation