
EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Kristin A. Hobson Fall 2011.


1 EVAL 6000: Foundations of Evaluation Dr. Chris L. S. Coryn Kristin A. Hobson Fall 2011

2 Agenda Question- and method-oriented approaches Questions and discussion

3 Evaluation Theory Tree

4 Question- and Method-Oriented Approaches Address specific questions (often employing a wide range of methods) Advocate the use of a particular method Whether the questions or methods are appropriate for assessing merit and worth is a secondary consideration Both are narrow in scope and often deliver less than a full assessment of merit and worth

5 Objectives-Based Studies Some statement of objectives serves as the advance organizer Typically, an internal study conducted in order to determine if the evaluand’s objectives have been achieved Operationalize objectives, then collect and analyze information to determine how well each objective was met

6 Objectives-based evaluation results from a national research center

7 Accountability, Particularly Payment by Results Narrows evaluation to questions about outcomes Stresses importance of obtaining external, impartial perspective Key components include pass-fail standards, payment for good results, and sanctions for unacceptable performance

8 Success Case Method Evaluator deliberately searches for and illuminates instances of success and contrasts them to what is not working Compares least successful instances to most successful instances Intended as a relatively quick and affordable means of gathering important information for use in improving an evaluand

9 Standard normal distribution and location of ‘success’ and ‘failure’ cases
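The case-screening step of the Success Case Method on slide 9 can be sketched in code. This is an illustrative reading, not Brinkerhoff's actual procedure: it flags as "success" and "failure" cases those outcome scores falling beyond one standard deviation of the mean, and both the cutoff and the data are invented.

```python
# Hypothetical sketch of Success Case Method screening: flag the most and
# least successful cases as scores beyond +/- z_cut standard deviations of
# the mean. The 1.0 SD threshold and the scores are illustrative only.

def split_cases(scores, z_cut=1.0):
    """Return (success, failure) lists of scores beyond z_cut SDs of the mean."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((x - mean) ** 2 for x in scores) / (n - 1)) ** 0.5  # sample SD
    success = [x for x in scores if (x - mean) / sd >= z_cut]
    failure = [x for x in scores if (x - mean) / sd <= -z_cut]
    return success, failure

scores = [52, 55, 48, 60, 95, 50, 47, 12, 53, 49]
success, failure = split_cases(scores)  # extreme high and low performers
```

The extreme cases identified this way would then be studied in depth and contrasted, per the method's intent.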

10 Objective Testing Programs Testing to assess the achievements of individual students and groups of students compared with norms, standards, or previous performance

11 Outcome Evaluation as Value-Added Assessment Recurrent outcome and value-added assessment coupled with hierarchical gain score analysis Emphasis on assessing trends and partialling out effects of the different components of an educational system, including groups of schools, individual schools, and individual teachers The intent is to determine what value each is adding to the achievement of students
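The core quantity in slide 11 is the gain score. A genuinely value-added analysis fits a hierarchical (multilevel) model; the sketch below shows only the simplest building block, a mean pre-to-post gain per teacher, with invented records.

```python
# Illustrative only -- not the hierarchical model the slide describes.
# Compute the mean pre/post gain score for each teacher from student records.

def mean_gain(records):
    """records: list of (teacher, pre_score, post_score). Returns {teacher: mean gain}."""
    gains = {}
    for teacher, pre, post in records:
        gains.setdefault(teacher, []).append(post - pre)
    return {t: sum(g) / len(g) for t, g in gains.items()}

records = [("A", 50, 58), ("A", 45, 51), ("B", 60, 62), ("B", 55, 59)]
per_teacher = mean_gain(records)
```

A real value-added model would also partial out school- and district-level effects rather than attribute the whole gain to the teacher.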

12 Performance Testing Devices that require students (or others) to demonstrate their achievements by producing authentic responses to select tasks, such as written or spoken answers, musical or psychomotor presentations, portfolios of work products, or group solutions to defined problems Performance assessments are usually life-skill and content-related performance tasks so that achievement can be demonstrated in practice

13 Experimental and Quasi-Experimental Design Studies Random assignment to either experimental or control conditions, then contrasting outcomes Required assumptions can rarely be met As a methodology, addresses only a narrow set of issues (i.e., cause-and-effect) Done correctly, produces unbiased estimates of effect sizes
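The "unbiased estimates of effect sizes" point on slide 13 can be made concrete: under random assignment, the difference in group means estimates the average treatment effect, and dividing by the pooled standard deviation gives a standardized effect size (Cohen's d). The data below are made up.

```python
# Sketch of a standardized effect size from a two-group randomized experiment.
# With random assignment, mean(treatment) - mean(control) is an unbiased
# estimate of the average treatment effect; Cohen's d standardizes it.

def cohens_d(treatment, control):
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):  # sample variance
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    n1, n2 = len(treatment), len(control)
    pooled_sd = (((n1 - 1) * var(treatment) + (n2 - 1) * var(control))
                 / (n1 + n2 - 2)) ** 0.5
    return (mean(treatment) - mean(control)) / pooled_sd

d = cohens_d([6, 8, 10], [4, 6, 8])  # illustrative outcome scores
```

The estimate is unbiased only because randomization balances the groups in expectation; the same arithmetic applied to self-selected groups would not carry that guarantee.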

14 Flow of units through a typical randomized experiment

15 Management Information Systems Supply the information needed to manage and report on an evaluand Typically organized around objectives, specified activities, projected milestones or events, and budget Examples include the Government Performance and Results Act (GPRA) of 1993 and the Performance Assessment Rating Tool (PART)

16 Cost Studies Largely quantitative procedures designed to understand the full costs of an evaluand and to determine and judge what the investment returned in objectives achieved and broader societal benefits Compares computed ratios to those of similar evaluands Can include cost-benefit, cost-effectiveness, cost-utility, return on investment, rate of economic return, etc.
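The ratio comparison on slide 16 is simple arithmetic. As one hedged example of the family of ratios listed, the sketch below computes a cost-effectiveness ratio (cost per unit of outcome) for two hypothetical evaluands; all figures are invented.

```python
# Illustrative cost-effectiveness comparison: cost per unit of outcome
# achieved, where a lower ratio is better. Program names and figures
# are hypothetical, not from the slides.

def cost_effectiveness(total_cost, outcome_units):
    """Cost per unit of outcome achieved."""
    return total_cost / outcome_units

programs = {"Program X": (120_000, 300), "Program Y": (90_000, 200)}
ratios = {name: cost_effectiveness(cost, units)
          for name, (cost, units) in programs.items()}
best = min(ratios, key=ratios.get)  # evaluand with the lowest cost per unit
```

Cost-benefit analysis works the same way but values the outcomes in dollars, so the ratio becomes dimensionless and ratios above 1 indicate net benefit.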

17 Judicial and Advocate-Adversary Evaluation Essentially puts an evaluand on trial Role-playing evaluators implement a prosecution and defense Judge hears arguments within the framework of a jury trial Intended to provide balanced evidence on an evaluand’s strengths and weaknesses

18 Case Studies Focused, in-depth description, analysis, and synthesis Examines evaluand in context (e.g., geographical, cultural, organizational, historical, political) Mainly concerned with describing and illuminating an evaluand, not determining merit and worth Stake’s approach differs dramatically from Yin’s

19 Case study designs

20 Theory-Driven/Theory-Based Program evaluations based on a program theory often begin with either (1) a well-developed and validated theory of how programs of a certain type within similar settings operate to produce outcomes or (2) an initial stage to approximate such a theory within the context of a particular program evaluation The theory can then help a program evaluator decide what questions, indicators (i.e., manifest variables), and linkages (assumed to be causal) between and among program elements should be used to evaluate a program

21 Theory-Driven/Theory-Based The point of a theory development or selection effort is to identify advance organizers to guide the evaluation (e.g., in the form of a measurement model) Essentially these are the mechanisms by which program activities are understood to produce or contribute to program outcomes, along with the appropriate description of context, specification of independent and dependent variables, and portrayal of key linkages The main purposes of theory-based evaluation are to determine the extent to which the program of interest is theoretically sound, understand why it is succeeding or failing, and provide direction for program improvement

22 Linear program theory model

23 Ecological program theory model

24 Mixed-Method Studies Combines quantitative and qualitative techniques Less concerned with assessing merit and worth, more concerned with “mixing” methodological approaches A key feature is triangulation Aimed at depth, scope, and dependability of findings

25 Basic mixed-method designs

26 Meta-Analysis and Research Reviews Premised on the assumption that individual studies provide only limited information about the effectiveness of programs, each contributing to a larger base of knowledge Concentrate almost exclusively on the desired effects or outcomes of programs The guiding principle is that an evaluation should not be viewed in isolation, but rather as one of a set of tests of a program or intervention’s results across variations in persons, treatments, outcomes, contexts, and other variables
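The pooling step behind a forest plot like the one on the next slide can be sketched briefly. This is a minimal fixed-effect meta-analysis using inverse-variance weights; the three (effect size, variance) pairs are invented for illustration.

```python
# Hedged sketch of fixed-effect meta-analysis: pool per-study effect sizes
# with inverse-variance weights, the summary estimate typically shown as
# the diamond on a forest plot. Study values below are invented.

def pooled_effect(studies):
    """studies: list of (effect_size, variance). Returns the inverse-variance
    weighted mean effect size."""
    weights = [1.0 / variance for _, variance in studies]
    weighted_sum = sum(w * effect for (effect, _), w in zip(studies, weights))
    return weighted_sum / sum(weights)

studies = [(0.30, 0.04), (0.50, 0.08), (0.10, 0.02)]
summary = pooled_effect(studies)
```

Precise studies (small variance) get more weight, which is exactly the "larger base of knowledge" logic the slide describes; a random-effects model would additionally allow the true effect to vary across studies.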

27 Meta-analysis forest plot
