
1 Guidelines for Evaluation Planning: Clarifying the Evaluation Request and Responsibilities
Dr. Suzan Ayers, Western Michigan University (courtesy of Dr. Mary Schutten)

2 Individuals who affect or are affected by an evaluation study
Sponsor: agency/individual who authorizes the evaluation and provides the necessary resources for its conduct
Client: agency/individual requesting the evaluation
Stakeholders: those who have a stake in the program or in the evaluation’s results
Audiences: individuals, groups, and agencies who have an interest in the evaluation and receive its results

3 Understanding reasons for initiating evaluation
Did a problem prompt the evaluation?
Did some stakeholder demand it?
Who has the need to know?
What does s/he want to know? Why?
How will s/he use the results?

4 It is not uncommon for clients to be uninformed about evaluation procedures and not to have thought deeply about the ramifications. Frequently, the purpose is not clear until the evaluator has carefully read the relevant materials, observed the evaluation object, and interviewed stakeholders.

5 Questions to begin
Why is this evaluation being requested? What questions will it answer?
To what use will the evaluation findings be put? By whom? Who else should receive the information?
What is to be evaluated? What does it include? Exclude? During what time period? In what settings? Who will participate?

6 What are the essential program activities? How do they link with the goals and objectives?
How much time and money are available for the evaluation? Who can help with it?
What is the political climate and context surrounding the evaluation? Will any political factors or forces interfere with gaining meaningful and fair information?

7 Informational uses of evaluation
Determine whether sufficient need exists to initiate a program, and describe the target audience (part of needs assessment)
Assist in program planning by identifying potential program models/activities to achieve certain goals (part of needs assessment)
Describe program implementation and whether changes from the initial model have occurred (monitoring or process study)
Examine whether certain goals are being achieved at desired levels (outcome study)
Judge the overall value of a program (cost-effectiveness study)

8 Noninformational uses
Decision postponement
Ducking responsibility (the decision is already known but needs to be made to look good)
Public relations (justify the program)
Fulfilling grant requirements
These are typically more common in federal or national evaluations

9 Conditions under which evaluation studies are inappropriate
Evaluation would produce trivial information
–One-time effort
–Low-impact program
Evaluation results will not be used
–D.A.R.E. programs, for example
–There needs to be a commitment to use the results
Evaluation cannot yield useful, valid information
–A bad evaluation is worse than no evaluation at all

10 Evaluation is too soon for the stage of the program
–Premature summative evaluations are among the most insidious misuses of evaluation (e.g., evaluating a fitness program in its first six weeks will not yield meaningful information)
Motives of the evaluation are improper
–Ethical considerations, “hatchet jobs”
–See attributes of an ethical evaluation ( http://www.wmich.edu/evalctr/jc/ )

11 Appropriateness: Major Steps
Use a tool called evaluability assessment
–Clarify the intended program model or theory
–Examine the program implementation to determine whether it matches the program model and could achieve the program goals
–Explore different evaluation approaches to match the needs of stakeholders
–Agree on evaluation priorities and intended uses of the study

12 Determining evaluability
Personal interviews with stakeholders
Review of existing program documentation
Site visits

13 Who will evaluate? Does the potential evaluator have the…
ability to use the methodologies and techniques needed in the study?
ability to help articulate the appropriate focus for the study?
management skills to carry out the study?
ability to communicate results usefully to audiences?
integrity to maintain proper ethical standards?

14 Internal evaluator: program knowledge, familiarity with stakeholders and program history, can continue in an advocacy role after the evaluation, quick start-up, a known quantity
External evaluator: impartial, credible, brings expertise and a fresh look, more likely to obtain sensitive inside information, more likely to present results realistically (particularly if unpopular) and to advocate change

15 Combination: the best of both worlds, when internal and external evaluation teams collaborate
–Internal evaluator provides contextual information, collects the majority of the data, and serves as advocate and support after the external evaluator is gone
–External evaluator designs the evaluation, selects/develops instruments, directs data collection, organizes the report, and ensures impartiality/credibility

16 Analyzing Resources: Personnel
Can the evaluator use staff on site?
–Program staff: collect data
–Secretaries: type, search records
–Grad students: internships, course-related work
–PTA: bodies, ideas, contacts, etc.
All of these can help with the evaluation at no added cost to the budget

17 Analyzing Resources: Other Resources and Constraints
The more information that must be generated by the evaluator, the costlier the evaluation
Are existing data, records, evaluations, and other documents available?
Are needed support materials (testing programs, computer services, etc.) already in existence, or must they become part of the budget?
Time: knowing when to be ready with results is part of good planning
–Limited time can lessen an evaluation’s impact as much as limited dollars

18 Phases of Identifying and Selecting Questions
Divergent phase = a comprehensive “laundry list” of potentially important questions and concerns [drawn from many sources; all questions are listed]
Convergent phase = evaluators select from the “laundry list” the most critical questions to be answered
Criteria are developed after the convergent phase

19 Divergent Phase Sources
Questions, concerns, and values of stakeholders
–Clients, sponsors, participants, affected audiences
–Policy makers, managers, primary consumers, secondary consumers
–What is their perception of the program? What are their questions/concerns? How well do they think it is doing?

20 Stakeholder Interview Questions
What is your general perception of the program? What do you think of it?
What do you perceive as its purposes?
What concerns do you have about the program? Its outcomes? Its operations?
What major questions would you like the evaluation to answer? Why?
How could you use the information provided by these questions?

21 Divergent Phase Sources
Use of evaluation models/approaches
–Consumer-oriented: checklists and sets of criteria to help determine what to study
–Expertise-oriented: standards and critiques that reflect the views of experts in the field
–Adversary-oriented: look for both strengths and weaknesses of the program

22 Professional standards, checklists, instruments, and criteria developed or used elsewhere
–Standards for practice exist at both the state and national levels in physical education
Views and knowledge of expert consultants
–Expertise in the content area may provide a neutral and broader view
–Consultants can be asked to generate a list of questions and can identify previous evaluations of similar programs

23 Matrix for Selecting Questions
Would the evaluation question…
–be of interest to key audiences?
–reduce present uncertainty?
–yield important information?
–be of continuing, not fleeting, interest?
–be critical to the study’s scope?
–have an impact on the course of events?
–be answerable within the available money, time, and technical resources?
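
Applied literally, the matrix is just a screening table: rows are candidate questions, columns are the criteria above, and the questions that survive feed the convergent phase. Below is a minimal sketch in Python; the sample questions, the yes/no judgments, and the keep-only-if-all-criteria-met rule are hypothetical illustrations, not part of the original slides.

```python
# Sketch of the selection matrix as a screening table.
# Questions, marks, and the decision rule are hypothetical.

CRITERIA = [
    "interests key audiences",
    "reduces present uncertainty",
    "yields important information",
    "of continuing interest",
    "critical to study scope",
    "impacts course of events",
    "answerable within money/time/tech",
]

# Each row: candidate question -> one yes(1)/no(0) judgment per
# criterion, in the same order as CRITERIA.
matrix = {
    "Is the program reaching its target audience?": [1, 1, 1, 1, 1, 1, 1],
    "Do staff prefer morning or afternoon sessions?": [0, 1, 0, 0, 0, 0, 1],
    "Are outcomes achieved at the desired levels?": [1, 1, 1, 1, 1, 1, 0],
}

for question, marks in matrix.items():
    keep = all(marks)  # one possible rule: keep questions meeting every criterion
    verdict = "KEEP" if keep else "DROP"
    print(f"{verdict} ({sum(marks)}/{len(CRITERIA)}) {question}")
```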

24 Convergent Phase
Sit down with the sponsor and/or client and review the laundry list and the items marked as “doable”
–Reduce the list via consensus
Provide the new list with a short explanation indicating why each question is important, and share it with stakeholders

25 Criteria and Standards
Developed to reflect the degree of difference that would be considered meaningful enough to adopt the new program
Absolute standard = a defined level is met or not met
–State departments of education typically require absolute standards
–Learn the range of expectations [from stakeholders] and determine standards from that
Relative standard = comparison to other groups or standards
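
To make the distinction concrete, here is a minimal sketch of the two kinds of standards; the 70-point passing score, the 80% threshold, and the sample data are hypothetical, not drawn from the slides.

```python
from statistics import mean

def meets_absolute(scores, passing_score=70, threshold=0.80):
    """Absolute standard: a defined level is met or not met."""
    share_passing = sum(s >= passing_score for s in scores) / len(scores)
    return share_passing >= threshold

def meets_relative(program_scores, comparison_scores):
    """Relative standard: comparison to another group."""
    return mean(program_scores) > mean(comparison_scores)

program = [72, 85, 68, 90, 77]
comparison = [70, 74, 66, 71, 69]
print(meets_absolute(program))              # True: 4 of 5 scores reach 70
print(meets_relative(program, comparison))  # True: program mean exceeds comparison mean
```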

26 Criteria and Standards
Be flexible
Allow new questions, criteria, and standards to emerge
Each question needs its own standards and criteria
Remember, the goal of this step is to lay the foundation for a meaningful and useful evaluation

