1 Review: Introduction
Define evaluation.
How do formal and informal evaluation differ?
What are two uses of evaluation in education?
What are the pros/cons of using an external evaluator?
2 Alternative Approaches to Evaluation
Dr. Suzan Ayers, Western Michigan University
(courtesy of Dr. Mary Schutten)
3 Alternative Approaches
Stakeholders: individuals and groups who have a direct interest in, and may be affected by, an evaluation; they should be involved early, actively, and continuously.
Program: activities provided on a continuing basis; typically what is evaluated.
There are many alternative, often conflicting, views of what evaluation is and how it should be carried out.
4 Why so many alternatives?
The way one views evaluation directly shapes the activities and methods used.
The alternative models stem from differences in:
Philosophical and ideological beliefs
Methodological preferences
Practical choices
5 Philosophical & Ideological Beliefs
Epistemologies (philosophies of knowing):
Objectivism: grounded in social-science empiricism; findings should be replicable.
Subjectivism: experientially based; relies on tacit knowledge.
Pros/cons of each?
Principles for assigning value (paralleling the objectivist/subjectivist split):
Utilitarian: focus on group gains (e.g., average scores); the greatest good for the greatest number.
Intuitionist-pluralist: value is individually determined.
Is there room for both, or are these dichotomous?
Philosophical purists are rare (impractical?).
Choose the methods right for THAT evaluation.
Understand the assumptions and limitations of different approaches.
6 Methodological Preferences
Quantitative (numerical) vs. qualitative (non-numerical).
Evaluation is a transdiscipline; it crosses paradigms.
Beware the "law of the instrument" fallacy: with a hammer and nails, everything appears to need hammering.
Identify what is useful in each evaluation approach, use it wisely, and avoid being distracted by approaches designed for different needs.
7 Practical Considerations
Evaluators disagree over whether the intent of evaluation is to render a value judgment.
Do decision makers or the evaluator render the judgment?
Evaluators differ in their views of evaluation's political role.
Authority? Responsibility? These dictate evaluation style.
Evaluators' prior experience influences their approach.
Who should conduct the evaluation, and what expertise is needed to do so?
Is having a wide variety of evaluation approaches desirable?
8 Classification Schema for Evaluation Approaches
These are conceptual approaches to evaluation, NOT techniques.
Objectives-oriented: focus on goals/objectives and the degree to which they are achieved.
Management-oriented: identifying and meeting the informational needs of decision makers.
Consumer-oriented: generate information to guide consumers' use of products and services.
Expertise-oriented: use professional expertise to judge the quality of the evaluation object.
Participant-oriented: stakeholders are centrally involved in the process.
See Figure 3.1 (p. 68).
9 Objectives-oriented Approach
The purposes of an activity are specified, and evaluation then focuses on the extent to which those purposes are achieved.
Ralph W. Tyler popularized this approach in education (criterion-referenced testing).
Tylerian models:
Metfessel & Michael's paradigm: an enlarged vision of alternative instruments for collecting evaluation data.
Provus's Discrepancy Evaluation Model: agree on standards, determine whether a discrepancy exists between performance and standard, and use the discrepancy information to decide whether to improve, maintain, or terminate the program.
Logic models: determine long-term outcomes and backtrack to today.
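The decision logic of Provus's Discrepancy Evaluation Model can be sketched in a few lines of code. This is an illustrative toy only: the standard, performance scores, and tolerance threshold below are invented for the example; in a real evaluation these would be negotiated with stakeholders.

```python
# Hypothetical sketch of Provus's Discrepancy Evaluation Model decision logic.
# All numbers (standard, scores, tolerance) are invented for illustration.

def discrepancy(performance: float, standard: float) -> float:
    """Discrepancy = agreed-upon standard minus observed performance."""
    return standard - performance

def decide(performance: float, standard: float, tolerance: float = 5.0) -> str:
    """Map the size of the discrepancy to a program decision."""
    gap = discrepancy(performance, standard)
    if gap <= 0:
        return "maintain"    # performance meets or exceeds the standard
    if gap <= tolerance:
        return "improve"     # small gap: revise the program
    return "terminate"       # large gap: consider ending the program

# Example: the agreed standard is 80% mastery on a criterion-referenced test
print(decide(performance=85, standard=80))  # maintain
print(decide(performance=77, standard=80))  # improve
print(decide(performance=50, standard=80))  # terminate
```

The point of the sketch is that the model's output is not a score but a decision, driven entirely by the gap between agreed standards and observed performance.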
10 Objectives-oriented Steps
Establish broad goals or objectives tied to the mission statement.
Classify the goals or objectives.
Define the objectives in behavioral terms.
Find situations in which achievement of the objectives can be shown.
Select or develop measurement techniques.
Collect performance data.
Compare the data with the behaviorally stated objectives.
11 Objectives-oriented Pros/Cons
Strengths: simplicity; easy to understand, follow, and implement; produces information relevant to the mission.
Weakness: can lead to tunnel vision.
Ignores outcomes not covered by the objectives.
Neglects the value of the objectives themselves.
Neglects the context in which the evaluation takes place.
12 Goal-Free Evaluation
The opposite of objectives-oriented evaluation, though the two supplement one another.
The evaluator purposefully avoids awareness of the program's goals; goals should not be taken as given but should themselves be evaluated.
Predetermined goals are not allowed to narrow the focus of the evaluation study.
Focus is on actual outcomes rather than intended ones.
The evaluator has limited contact with the program manager and staff.
Increases the likelihood of seeing unintended outcomes.
13 Management-oriented Approach
Geared to serve decision makers.
Identifies the decisions an administrator must make.
Collects data on the pros and cons of each decision alternative.
Success depends on teamwork between evaluators and decision makers.
A systems approach to education in which decisions are made about inputs, processes, and outputs.
The decision maker is always the audience to whom the evaluation is directed.
14 CIPP Evaluation Model (Stufflebeam)
Context evaluation: planning decisions. What needs should be addressed? What existing programs are there?
Input evaluation: structuring decisions. What resources are available? What alternative strategies exist?
Process evaluation: implementing decisions. How well is the plan being implemented? What barriers threaten success? Are revisions needed?
Product evaluation: recycling decisions. What are the results? Have needs been reduced? What should be done after the program has 'run its course'?
15 CIPP Steps
Focusing the evaluation
Collection of information
Organization of information
Analysis of information
Reporting of information
Administration of the evaluation (timeline, staffing, budget, etc.)
16 Context Evaluation (Table 5.1)
Objective: define the institutional context and target population, and assess their needs.
Method: systems analysis, surveys, hearings, interviews, diagnostic tests, the Delphi technique (experts).
Purpose: deciding on the setting to be served, the goals associated with meeting needs, and the objectives for solving problems.
17 Input Evaluation
Objective: identify and assess system capabilities, procedural designs for implementing the strategies, budgets, and schedules.
Method: inventory human and material resources; assess feasibility and economics via literature review; visit exemplary programs.
Purpose: selecting sources of support and solution strategies in order to structure change activities; providing a basis for judging implementation.
18 Process Evaluation
Objective: identify or predict defects in the process or procedural design; record and judge procedural events.
Method: monitoring potential procedural barriers; continual interaction with and observation of staff activities.
Purpose: implementing and refining the program design and procedures (a.k.a. process control).
19 Product Evaluation
Objective: collect descriptions and judgments of outcomes, relate them to context, input, and process, and interpret their worth and merit.
Methods: measuring outcomes, collecting stakeholder information, analyzing data.
Purpose: deciding whether to continue, terminate, modify, or refocus an activity, and documenting its effects (intended or unintended).
20 Uses of Management-oriented Approaches to Evaluation
CIPP has been used in school districts and in state and federal government agencies.
A useful guide for program improvement.
Supports accountability.
See Figure 5.1 (p. 94): the formative and summative aspects of CIPP.
21 Management-oriented Pros/Cons
Strengths: appeals to those who like rational, orderly approaches; gives focus to the evaluation; allows for both formative and summative evaluation.
Weaknesses: gives preference to top management; can be costly and complex; assumes important decisions can be identified in advance of the evaluation.
22 Review Questions
Why are there so many alternative approaches to evaluation?
What two conceptual approaches to evaluation did we discuss tonight? What are their pros and cons?
Which, if either, of these approaches do you think will work for your evaluation object?
Identify your most likely evaluation object.