Designing and Conducting Useful Self-Evaluations at UNESCO


1 Designing and Conducting Useful Self-Evaluations at UNESCO
Hallie Preskill, Ph.D., University of New Mexico, USA, and Brad Cousins, Ph.D., University of Ottawa, Canada. June 2004

2 Workshop Objectives
As a result of having taken this workshop, participants will:
- Understand how this workshop fits into the broader scope of evaluation at UNESCO.
- Understand how self-evaluation in UNESCO can be useful and potentially contribute to individual, team, and organizational learning.
- Understand how to practically and realistically design, implement, and use self-evaluation studies as a working tool in the current context of their projects or activities.
- Have developed a self-evaluation plan for a project or activity in which they are involved.
- Know how to integrate self-evaluation activities into existing work structures and processes.

3 Agenda
- Evaluation in UNESCO
- Components of a Self-Evaluation Plan
- Focusing Your Self-Evaluation
- Choosing Among Data Collection Methods
- Analysing Evaluation Data
- Communicating & Reporting Evaluation Processes & Findings
- Reflecting on the Context of Self-Evaluations
- Maximizing the Usefulness & Impact of Self-Evaluations
- Workshop evaluation and follow-up

4 Background for this Workshop: UNESCO Evaluation Strategy
The workshops are part of a set of capacity-building activities in self-evaluation, implemented by the Internal Oversight Service (IOS) on a pilot basis, mainly in collaboration with the Education Sector. This initiative is an important part of implementing the "UNESCO Evaluation Strategy" developed by IOS and endorsed by the Executive Board. The Evaluation Strategy (as well as other recent audit and evaluation reports) calls for self-evaluation as a necessary complement to external independent evaluation.

5 Definition of Evaluation
A systematic assessment of a planned, ongoing or completed intervention to determine its relevance, efficiency, effectiveness, impact and sustainability. The intent is to incorporate lessons learnt into the decision-making process. (Source: adapted from OECD/DAC Glossary, 2002)
Evaluation is context-dependent and involves value judgement, albeit systematic and data-based. It thus differs from traditional scientific research, which claims to be value-free and aims at generalizability of findings across times and locations. This differentiation becomes blurred, however, especially in applied social science research: such research is becoming more utility- and action-oriented (e.g., "action research"), and the complexity of the ever-changing applied setting makes the researcher's decisions and interpretations value-dependent, and generalizability contestable, even in highly controlled studies.

6 Judgement
Judgement implies comparison of program performance data against some standard:
- Performance in the program at a prior point in time
- Performance of those receiving similar programs (comparative treatment)
- Performance of those receiving no program (control)
- An external standard

7 Summative Evaluation (Judgement), Formative Evaluation (Improvement)
Evaluation is the use of systematic inquiry to make judgements about program merit, worth and significance, and to support program decision making.
- Summative evaluation (judgement)
- Formative evaluation (improvement)
Who is the judge?

8 External Evaluation
OECD Glossary definition of external evaluation (2002, p. 23): "The evaluation of a development intervention conducted by entities and/or individuals outside the donor and implementing organisations."
- Independent, systematic approach to answering evaluative questions
- Typically commissioned by senior management
- Written into the C/5 or conducted upon donor demand
- IOS facilitates the process and oversees the quality of the evaluations
- Conducted by evaluation experts external to UNESCO
- The selection of C/5 evaluations is presented to the ExB
(C/5 is the programme planning document developed each biennium; ExB = Executive Board.)

9 Self-Evaluation
OECD/DAC Glossary definition of self-evaluation (2002): "An evaluation by those who are entrusted with the design and delivery of a development intervention."
In the context of the UNESCO Evaluation Strategy: self-evaluations are small-scale evaluation projects carried out by staff and management as part of their everyday work, which help them collect and use monitoring and evaluation data to answer their own questions about the quality and direction of their work.

10 Purposes of Self-Evaluation
- Provides opportunities for continuous reflection and learning (individual, group, organization)
- Provides timely information for decision making and action at the day-to-day implementation level
- Draws on organization members' knowledge of the project and evaluation context
- Results in useful findings; recommendations meet specific information needs
- If done well, results come from systematic, valid, and purposeful processes, which minimizes perceptual biases
- Provides an opportunity to share achievements
- Documents what works, what does not, and possible reasons why

11 Benefits of Using a Collaborative Approach to Self-Evaluation
- Greater credibility with those involved
- Shared work saves resources and creates team spirit
- Increased learning through reflection and dialogue with others
- More informed interpretations of findings
- Greater breadth of recommendations
- Enhanced stakeholder evaluation capacity

12 A Systems Framework for Evaluation
Framework elements (diagram): the Evaluation Process, within the Evaluation Environment, within the Organization's Environment, within External Requirements and Demands.
It takes a lot to ensure that evaluation contributes to the learning of individuals, teams, and the whole organization. Some examples of factors that play a role:
- A high prevalence of political agendas that predetermine the intended use of evaluation findings
- Leadership and/or staff not open to negative feedback, for fear of sanctions
- No open communication culture, again out of fear of sanctions for saying something "wrong" or "politically incorrect"
- Strict hierarchies that do not encourage teamwork and prevent intellectual stimulation from exchanges with colleagues
- Low staff morale because of a restrictive work environment and a high level of bureaucracy

13 An evaluation use conceptual framework
Diagram elements: Evaluation Practice; Evaluation Knowledge Production; Use of Findings; Process Use; Evaluation Resources and Context; Decision or Policy Setting.

14 Evaluation Practice
- Planning (divergent / convergent)
- Instrument development
- Data collection and processing
- Data analysis and interpretation
- Reporting and follow-up

15 Self-Evaluation Plan Components (Terms of Reference)
- Identifying self-evaluation team members
- Focusing the self-evaluation:
  - Background information (and logic model)
  - Purpose of the evaluation
  - Evaluation stakeholders (intended users of results)
  - Evaluation scope (key questions)
- Designing and implementing the self-evaluation:
  - Data collection methods, instruments, sample
  - Evaluation timeline with specified roles and responsibilities
  - Communicating and reporting plan
  - Budget
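For teams that keep their working notes electronically, these components can double as a drafting checklist. A minimal sketch, assuming nothing beyond the slide itself; representing the plan as a Python dictionary, and all key names, are illustrative choices rather than part of the workshop materials:

```python
# Illustrative only: the Terms of Reference components above as a nested
# dictionary, usable as a checklist while drafting the self-evaluation plan.
self_evaluation_plan = {
    "team_members": [],  # who will carry out the self-evaluation
    "focusing": {
        "background_and_logic_model": "",
        "purpose": "",
        "stakeholders": [],   # intended users of the results
        "key_questions": [],  # define the evaluation's scope
    },
    "design_and_implementation": {
        "data_collection": {"methods": [], "instruments": [], "sample": ""},
        "timeline_roles_responsibilities": "",
        "communicating_and_reporting_plan": "",
        "budget": "",
    },
}

# Print the outline as a quick reminder of what still needs drafting.
for section, content in self_evaluation_plan.items():
    print(section, list(content) if isinstance(content, dict) else content)
```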

16 Self-Evaluation Stakeholders
Users of the evaluation findings:
- Primary: yourself / your team
- Secondary:
  - Implementers of projects/activities
  - Colleagues doing similar work
  - BSP (Bureau of Strategic Planning; to feed into current reporting requirements)
  - Immediate or intermediate managers
  - Leadership of the organization

17 Evaluation Key Questions
Key questions:
- Are the broad, overarching questions that guide the evaluation
- Form the boundaries and scope of the evaluation
- Are typically written in an open-ended format
- Guide the choice of data collection methods
- Reflect the stakeholders' information needs

18 Sample Self-Evaluation Key Questions
- To what extent does the project bring about the intended changes in its target group?
- How can this project benefit from enhanced collaboration with partners?
- Why does this activity work well in one region, but not in another?
- For whom is this project working best? Why?
- What additional services, materials, and/or activities are needed to achieve better outcomes?
- What are the unintended consequences of this activity?

19 Using a Program’s Logic Model to Focus a Self-Evaluation
A logic model:
- Articulates a program's theory of action: how it is supposed to work
- Is a systematic and visual way to represent a program's underlying theory
- Helps focus an evaluation by making assumptions and expectations explicit
- Increases stakeholders' understanding of a program and its evaluation

20 Logic Model Template
- Assumptions: the underlying assumptions that influence the project's design, implementation or objectives
- Resources: human, financial, organizational and community resources needed to achieve the project's objectives
- Activities: things the project does with the resources to meet its objectives
- Outputs: products of implementing the activities, which are necessary but not sufficient indications of achieving the project's objectives
- Short-term outcomes: short-term intended and unintended changes (e.g., in knowledge, attitudes, skills) as a result of the project
- Long-term outcomes: long-term intended and unintended changes (e.g., in behavior, status, systems) as a result of the project
Speaker notes: Regarding outputs, if you were to undertake capacity building in an Education Ministry, one of your activities would be to conduct a workshop, and one of the outputs would be "20 ministry staff attended", which says nothing about whether they have learned anything and are in fact able to apply what they have learned to educational planning (those would be outcomes / changes). If someone asks why we are not using the UNESCO SISTER/RBM language, we can say that for the purpose of the self-evaluation projects we do not find it useful. Which example will we discuss?
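For readers who find structure easier to see as code, here is a minimal sketch of the template as a small Python data structure. The field names mirror the columns above; the example values are loosely adapted from the ministry-workshop illustration in the speaker notes and are hypothetical, not part of the workshop materials:

```python
# Hypothetical sketch: the logic model template's columns as a dataclass.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    assumptions: list[str] = field(default_factory=list)         # beliefs shaping the design
    resources: list[str] = field(default_factory=list)           # human, financial, organizational
    activities: list[str] = field(default_factory=list)          # what the project does
    outputs: list[str] = field(default_factory=list)             # direct products of activities
    short_term_outcomes: list[str] = field(default_factory=list) # changes in knowledge, attitudes, skills
    long_term_outcomes: list[str] = field(default_factory=list)  # changes in behavior, status, systems

ministry_example = LogicModel(
    assumptions=["Ministry staff can apply new planning skills on the job"],
    resources=["Trainers", "Workshop budget"],
    activities=["Run a capacity-building workshop for Education Ministry staff"],
    outputs=["20 ministry staff attended the workshop"],
    short_term_outcomes=["Staff know how to apply the techniques to educational planning"],
    long_term_outcomes=["The ministry produces better-grounded education plans"],
)

print(ministry_example.outputs[0])  # an output, not yet evidence of an outcome
```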

21 Developing a Logic Model for Your Self-Evaluation - Activity
Think of a project or work activity that you would like to self-evaluate. It should be an evaluation:
- That is narrow in scope
- That is doable
- Where there is an intended use of the findings
- Where there are realistic opportunities for using the findings
You may work in groups of 1-3, depending on how your work is actually organized. Using the Logic Model Template worksheet, begin to develop a logic model for your program/activity. Try to make a few notes in each of the columns. We have 30 minutes for this activity. Refer to the IOS "coaches" if present.

22 Focusing Your Self-Evaluation Activity
Revisit the logic model you began to draft. Using the worksheet "Focusing Your Self-Evaluation":
- Make some notes on the background of the program/activity
- Write an evaluation purpose statement
- Develop 2-3 evaluation questions
- Identify potential self-evaluation stakeholders
- Describe the intended use of the self-evaluation's findings

23 Criteria for Choosing Among Data Collection Methods
- Evaluation questions
- Stakeholder preferences
- Respondent characteristics
- Respondent availability/accessibility
- Level of acceptable intrusiveness
- Validity (trustworthiness of data)
- Costs (time, materials, subject-matter experts)
- The organization's experience

24 A Menu of Data Collection Methods
- Surveys (mail, online, phone; open-ended and closed questions)
- Interviews (individual, focus group; conversational, semi-structured, structured)
- Observations (quantitative/structured; qualitative/unstructured)
- Records and documents (e.g., meeting minutes, e-mails, technical reports, existing databases)
- Tests (paper, simulation, computer)
Speaker notes: We should stress that good ongoing record keeping is a good way to amass evaluation data, and ask participants to think carefully about which existing sources of data they could use (so they do not need to be the ones actually collecting the data themselves).

25 Enhancing the Validity of Data
- Pilot testing: try out the interview protocol, survey, or observation form with a sample similar to the respondent population, or have it critiqued by colleagues and/or experts.
- Triangulation: multiple methods, data sources, evaluators, and/or theories.
- Sampling: random/probability samples support generalization; nonrandom/nonprobability samples do not.
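To make the sampling distinction concrete, here is a minimal sketch of a simple random (probability) sample in Python; the population, its size, and the sample size are invented for illustration:

```python
# Illustrative only: drawing a simple random sample of survey respondents.
import random

# Hypothetical population: 200 staff members eligible to receive the survey.
population = [f"staff_member_{i}" for i in range(1, 201)]

random.seed(42)  # fixed seed so the draw can be reproduced and audited
sample = random.sample(population, 30)  # every member has an equal chance of selection

print(len(sample), sample[:3])
```

It is this equal chance of selection, not the sample size alone, that supports generalizing from the sample to the population.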

26 Designing Your Self-Evaluation Activity
- Transfer your evaluation questions to the worksheet (top row).
- Discuss and note which data collection methods might be most appropriate and feasible for your self-evaluation study.
- Discuss and note who the respondents might be and whether you will include the entire population or select a sample (indicate how many respondents you would like to include in your sample).
We have 20 minutes allocated for this activity.

27 Considerations for Analyzing Data
- Evaluation key questions
- Stakeholders' understanding of, and experience with, data analysis methods
- Types of data (quantitative, qualitative)
- Levels of quantitative data (nominal, ordinal, interval)
- Choices for analyzing quantitative data
- Choices for analyzing qualitative data
- Evaluator skills and time (budget implications)
Speaker notes: Without being condescending, this might be where we in IOS can be most useful as resource persons.
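As a small illustration of how the level of measurement drives the analysis choice, this sketch (all data invented) pairs each level named above with a typical summary statistic:

```python
# Illustrative only: matching summary statistics to levels of measurement.
from collections import Counter
from statistics import mean, median

region = ["Africa", "Asia", "Africa", "Europe", "Asia"]  # nominal: categories only
satisfaction = [1, 3, 4, 4, 5]                           # ordinal: ranked ratings (1 = very low ... 5 = very high)
test_scores = [52.0, 61.5, 58.0, 70.0, 66.5]             # interval: distances between values are meaningful

print(Counter(region))       # nominal data: report frequencies per category
print(median(satisfaction))  # ordinal data: the median respects rank order
print(mean(test_scores))     # interval data: the mean is meaningful
```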

28 Why Communicate and Report?
- To help organization members learn from one another and jointly improve their work
- To build internal capacities: learning about UNESCO's substantive work and about evaluation practice
- To inform decision making by program staff and management about changes that will improve their own as well as overall organizational performance

29 Why Communicate and Report?
- To inform funders, community members, clients, customers, program staff, management, other parts of the organization, and other organizations
- To demonstrate results and accountability
- To build awareness and support within your unit, division, or sector, and across sectors and other organizational entities
- To reflect jointly with others on findings and derive future actions
- To aid decision making about continued implementation and funding, as well as replication at other sites

30 Communicating and Reporting Strategies
Strategies that facilitate individual learning:
- Short communications: memos, e-mail, postcards
- Interim reports
- Final reports
- Executive summaries
- Newsletters, bulletins, briefs, brochures
- News media
- Website communications
Strategies that facilitate interactive (group) learning:
- Verbal presentations
- Videotape or computer-generated presentations
- Posters and poster sessions
- Working sessions
- Synchronous electronic communications
- Personal discussions
- Photography
- Cartoons
- Drama/performance
- Poetry
Speaker notes: There is a handout to go with this slide.

31 Developing Your Communicating and Reporting Plan Activity
Using the Communicating and Reporting Plan worksheet, work on Steps 1-6. Steps 7-8 can be completed when more of your self-evaluation plan has been developed. We have 15 minutes allocated for this activity.

32 How Can We Maximize the Usefulness and Impact of Our Self-Evaluations?
- Hold meetings with each other to discuss progress, ask questions, and seek feedback
- Use the evaluation planning worksheets provided in this workshop
- Record questions and lessons learned throughout the process
- Make use of the IOS resource person specifically available to support self-evaluation projects
- Consider linkages between this self-evaluation work and RBM reporting requirements
- Participate in a poster session in mid-September to share findings from the planned self-evaluations

33 Types of Evaluation Use (diagram)
The slide maps use against non-use, and legitimate use against misuse. On the use side it distinguishes ideal use, mistaken use, and mischievous use; on the non-use side, rational non-use, political non-use, and abuse.

34 Workshop Follow-up
- Current status of the "Learning from Evaluation" survey process (with Education Sector staff)
- Follow-up to this workshop: support for self-evaluation projects (IOS contact: Sandy Taut)
- Online support materials: slides, handouts, workshop audiotape transcription
- Ongoing assessment of self-evaluation processes based on observations and discussions

35 Additional Resources
- Canadian Evaluation Society
- American Evaluation Association
- Australasian Evaluation Society
- European Evaluation Society
- Société Française de l'Évaluation
- Canadian Journal of Program Evaluation
See these societies' websites for standards of professional practice, ethics, etc.

