1 Evaluators’ Roles and Role Expansion
Canadian Evaluation Society
June 2, 2003
Vancouver, British Columbia

4 Overview
Understanding the Expanding Role of Evaluators
– Touchstones
– Dilemmas
Complexity and Role Expansion from a Foundation Perspective
– Context/Philosophy
– Challenges
Evolution and Role Expansion from a Consultant Perspective
– Strategy
– Reciprocity
Evaluation Standards as Guideposts
Dialogue and Discussion

5 “Traditional” Role of Evaluators
Evaluation is defined as a process to determine an evaluand’s merit or worth
With this definition and its conceptual emphasis on objectivity, the evaluator’s role is portrayed as that of a judge
In practice, evaluators tried to distance themselves as non-intrusive observers, quiet note-takers, and non-emotional (thus “objective”) presenters

6 This & That about Evaluation
Types of Evaluation
– Context Evaluation: What has been done before? What has happened? Who made it and who didn’t?
– Process Evaluation: What happened and how?
– Outcome and Impact Evaluation: What worked and didn’t work? Why and why not?
– Lessons Learned: unintended outcomes, social learning
Types of Data
– Qualitative Data: to answer the how, why, in what way, what, where, who type of evaluation questions
– Quantitative Data: to answer the how many, how much, to what extent type of questions
– Secondary “Census”/Sector Data

7 How Is the Evaluator’s Role Expanded?
Context Evaluation → the evaluator’s involvement in program planning as an information feeder; could also be viewed by some as an “expert” in the field, because information is POWER
Process Evaluation → the evaluator’s involvement in program implementation as messenger, technical assistance provider, and, in some cases, facilitator (of discussions)

8 How Is the Evaluator’s Role Expanded? (continued)
Outcome Evaluation → the evaluator’s involvement in determining the program’s “fate” as potential decision maker, as the program’s advocate, or, worse, as the program’s enemy
Lessons Learned → the evaluator’s involvement in disseminating or sharing program products and engaging in the learning process with program staff

9 More about Evaluators’ Role Expansion
Empowerment and participatory evaluation approaches are becoming more and more popular in practice
→ evaluation becomes part of the intervention
→ evaluators engage with program staff and participants in evaluation design, data collection, analysis, report writing, etc.
→ evaluators act as trainers, monitors, technical assistance providers, coaches, etc.

10 A Foundation Perspective
Funders seeking better outcome data
– Learning from / improving grant-making practices
– Accountability
“Strategic Philanthropy” movement
– More targeted outcomes
– Charity v. systems change
– At extreme: ROI

11 A Foundation Perspective (2)
Tracking / following clients across service delivery systems (e.g., courts / foster care; preschool / school) as an element of the change strategy
– Technical assistance on data collection for individual projects
– Cluster / Initiative evaluator is asked to fill this role

12 Strategic Philanthropy: Challenges for Evaluation
– Multiple stakeholders
– Evaluation as Intervention
– Evaluator as Interpreter
– Evaluating sustainability

13 Challenge: Multiple Stakeholders
Formative and summative initiative evaluation with different stakeholders
– Board: How do we know we invested wisely?
– Program staff: How can we do better?
– Grantees: How can we learn from each other? Are we doing OK?
Requires initiative evaluator to address multiple needs – more sophisticated evaluation designs and personal relationships

14 Challenge: Evaluation as Intervention
Part of change strategy may be collecting data across systems (e.g., foster care / adoption; preschool / public education)
Projects (sites) encouraged to use data for local decision making
Technical assistance on local evaluation may be key part of change strategy
Initiative evaluator in both evaluator and intervention roles
– Requires clarity on potential conflict of interest

15 Challenge: Evaluator as Interpreter
In TA role, may interpret funder’s intent to grantees
– What data are “good enough”?
– How can our community meet local needs and contribute data to the overall evaluation?
In evaluator role, need to interpret local outcomes in context of overall strategic intent
– Are local outcomes measuring the right things?
Requires different skills – maybe multiple evaluators?

16 Challenge: Evaluating Sustainability
Sustainability has many definitions
– Specific program
– The organization
– Networks / partnerships
WKKF emphasizes sustaining capability to address local concerns
– Creating “adaptive systems”
Need for better ways of assessing adaptability of community systems

17 Implications for Initiative Evaluations
Think of evaluation differently
– Evaluating the foundation, not (only) grantees: Was our systems change theory supported?
– More directive about project-level evaluation
Develop skills in systems analysis

18 Evolution and Role Expansion from a Consultant Perspective
Strategy/Design
Reciprocity
Epistemology
Connecting “what you do” with “what you get…” – seeing the “blind spots”

19 Blind Spot Demonstration

20 Logic Models Shape Strategy
Role as catalyst and learning coach
BUT logic models are very subjective
– Perception
– Persuasion
– Politics
Who develops them matters
Don’t assume they are “gospel” or the “truth”
Consider an external design review panel

21 Reciprocity Increases Risk
Organizational vs Foundation Effectiveness
Evaluation role influences both sides of philanthropy
The press for accountability may limit innovation and experimentation
Demand for outcomes without attention to quality and timing may lead to greater escalation and hyperbole

22 Ways of Knowing Shape Questions and Answers

23 Suggested Guidelines – Joint Committee Standards
Conflict of Interest
– Identify and clearly describe possible sources
– Agree in writing on procedures
– Seek advice
– Release evaluation procedures, data, reports publicly, when appropriate
– Obtain evaluation contract from funders, whenever possible
– Assess situations
– Make internal evaluators directly responsible to agency heads
– Metaevaluations

24 Suggested Guidelines – Joint Committee Standards
Metaevaluation
– Budget sufficient money and other resources
– Assign responsibility
– Have the chair nominated by a respected professional body
– Determine and record rules
– Discretion of the chairperson
– Final authority for editing the report
– Determine and record the report audience

25 Suggested Guidelines – Joint Committee Standards
Evaluator Credibility
– Stay abreast of social and political forces associated with the evaluation
– Ensure that both work plan and composition of evaluation team are responsive to key stakeholders’ concerns
– Consider having evaluation plan reviewed and evaluation work audited by another evaluator whose credentials are acceptable to the client
– Be clear in describing evaluation plan
– Determine key audience needs for information
– State evaluator’s qualifications relevant to program being evaluated

26 Suggested Guidelines – Joint Committee Standards
Political Viability
– Evaluation should be planned and conducted with anticipation of the different positions of various interest groups, so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted
Impartial Reporting
– Reporting procedures should guard against distortion caused by personal feelings and biases of any party to the evaluation, so that evaluation reports fairly reflect the evaluation findings

