1 Evaluation Planning III: Identifying and Selecting the Evaluation Questions and Criteria Dr. Suzan Ayers, Western Michigan University (courtesy of Dr. Mary Schutten)
2 Evaluation Questions Evaluations are conducted to answer questions and to apply criteria to judge the value of something. Evaluation questions provide the direction and foundation for the evaluation; they articulate the focus of the study.
3 Criteria and Standards Criteria identify the characteristics of a successful program (what to measure). Standards designate the level of performance the program must achieve on those criteria to be deemed a success (how well it must perform). Without standards, the evaluator cannot judge the results; without criteria, the evaluator cannot judge the program itself.
4 Phases of Identifying and Selecting Questions Divergent phase: compile a comprehensive "laundry list" of potentially important questions and concerns (drawn from many sources; all questions are listed). Convergent phase: evaluators select from the "laundry list" the most critical questions to be answered. Criteria are developed after the convergent phase.
5 Divergent Phase Sources Questions, concerns, and values of stakeholders: policy makers (legislators, board members); administrators and managers (who direct the program); practitioners (who operate the program); primary consumers (clients, students, patients); and secondary consumers (affected audiences). What is their perception of the program? What questions or concerns do they have? How well do they think it is doing? What would they change if given the chance?
6 Stakeholder Interview Questions (Fig. 12.1) What is your general perception of the program? What do you think of it? What do you perceive as its purposes? What do you think the program theory is? What concerns do you have about the program, its outcomes, or its operations? What major questions would you like the evaluation to answer, and why? How could you use the information provided by these questions?
7 Use of evaluation models/approaches Objectives-oriented: are goals defined, and to what extent are they achieved? Management-oriented: questions about CIPP, i.e., context (need), input (design), process (implementation), and product (outcomes). Participant-oriented: consider all stakeholders and listen to what they have to say; the process of the program is critical. Consumer-oriented: checklists and sets of criteria help determine what to study and what standards to apply. Expertise-oriented: standards and critiques that reflect the views of experts in the field.
8 Findings and issues raised in the literature in the field of the program The evaluator should be conversant with salient issues in the program's area. Use existing literature to help develop causative models and questions to guide the evaluation. A literature search may be a useful start to the planning process.
9 Professional standards, checklists, instruments, and criteria developed or used elsewhere Standards for practice exist in many fields, including PE and athletics. Views and knowledge of expert consultants: if they have expertise in the content area, they may provide a neutral and broader view. They can be asked to generate a list of questions and can identify previous evaluations of similar programs.
10 Evaluator's own professional judgment (p. 244) Evaluators are trained to raise thoughtful questions: Is the program really serving an important purpose? Are goals and objectives consistent with documented needs? What critical elements and events should be studied and observed? The evaluator also summarizes suggestions from multiple sources.
11 Convergent Phase Three reasons to reduce the range of questions: there will always be a budget limit; if the study gets very complicated, it becomes harder and harder to manage; and audience attention span is limited. Who should be involved? The evaluator, stakeholders, the sponsor, and parties affected by the evaluation.
12 Determining Which Questions to Study (Cronbach, 1980) Who would use the information? Who wants to know? Who will be upset if this question is dropped? Would an answer to the question reduce uncertainty or give information not now available? Would the answer yield important information? Is this question merely of passing interest, or does it focus on critical issues of continued interest? Would the scope of the evaluation be seriously limited if this question were dropped? Is it feasible to answer this question given the available financial and human resources, time, methods, and technology?
13 Convergent Phase Sit down with the sponsor and/or client and review the laundry list and the items marked as "doable" (from the Fig. 12.2 matrix). Reduce the list via consensus; an advisory board is the typical format. Provide the new list with a short explanation indicating why each question is important, and share it with stakeholders.
14 Matrix for Selecting Questions (Fig. 12.2) Would the evaluation question… Be of interest to key audiences? Reduce present uncertainty? Yield important information? Be of continuing (not fleeting) interest? Be critical to the study's scope? Have an impact on the course of events? Be answerable in terms of money, time, and methods/technology?
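The Fig. 12.2 screening can be sketched as a simple yes/no scoring grid. This is a hypothetical illustration, not part of the source: the function and variable names are invented, and the seven criteria are paraphrased from the slide.

```python
# Hypothetical sketch of the Fig. 12.2 selection matrix:
# rate each candidate question yes/no on the seven criteria,
# then keep the questions that satisfy the most criteria.

CRITERIA = [
    "interest to key audiences",
    "reduces present uncertainty",
    "yields important information",
    "of continuing (not fleeting) interest",
    "critical to the study's scope",
    "impact on the course of events",
    "answerable with available money/time/methods",
]

def score(answers):
    """Count how many criteria a candidate question satisfies.

    answers: dict mapping criterion name -> True/False.
    Missing criteria count as not satisfied.
    """
    return sum(1 for c in CRITERIA if answers.get(c, False))

# Example: one candidate question judged against the matrix
candidate = {c: True for c in CRITERIA}
candidate["of continuing (not fleeting) interest"] = False
print(score(candidate))  # prints 6: six of seven criteria met
```

In practice the "score" would be debated in the advisory-board consensus meeting rather than computed mechanically; the grid just makes the trade-offs visible.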
15 Criteria and Standards Developed to reflect the degree of difference that would be considered meaningful enough to adopt the new program. Absolute standard: a defined level is met or not met; learn stakeholders' range of expectations and determine standards from that. Relative standard: comparison to other groups or standards; typically uses the statistical concepts of significance and effect size to determine whether the program is "that much better" than what is in place (see p. 253).
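The relative-standard idea of asking whether a new program is "that much better" than the current one is commonly operationalized with an effect size such as Cohen's d (the standardized mean difference). A minimal sketch, with hypothetical outcome scores and an assumed adoption threshold of d > 0.5 (the source does not specify a threshold):

```python
from statistics import mean, stdev

def cohens_d(new, old):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n1, n2 = len(new), len(old)
    s1, s2 = stdev(new), stdev(old)  # sample standard deviations
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(new) - mean(old)) / pooled

# Hypothetical outcome scores for the new program vs. the one in place
new_program = [78, 82, 85, 90, 88]
current_program = [70, 75, 72, 80, 74]

d = cohens_d(new_program, current_program)
# A relative standard might be: recommend adoption only if d exceeds 0.5
print(round(d, 2), d > 0.5)
```

Conventionally, d around 0.2 is read as a small effect, 0.5 medium, and 0.8 large, which is one way stakeholders can anchor the "meaningful enough to adopt" judgment this slide describes.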
16 Flexible (not indecisive) Allow new questions, criteria, and standards to emerge. Remember, the goal of this step is to lay the foundation for a meaningful and useful evaluation.