Program Evaluation Basics
Dennis McBride, PhD
Analytical Techniques for Public Service, The Evergreen State College
Program Evaluation: What is it?
Program evaluation is the use of social research procedures to systematically investigate the effectiveness of social intervention programs.
Note: Program evaluation is not a method; it is a reason to conduct social research. It uses multiple methods and is usually applied, but can be pure or theoretical research.
Rossi et al.: Evaluation: A Systematic Approach, 2002.
Program Evaluation: What is it?
Evaluation research is more than the application of methods. It is also a political and managerial activity, an input into the complex mosaic from which policy decisions and allocations emerge for the planning, design, implementation, and continuation of programs to better the human condition... Evaluation research also needs to be seen as an integral part of the social policy and public administration movements.
Rossi and Freeman: Evaluation: A Systematic Approach, 1985, p. 27.
Very Brief History
Program evaluation can be traced back to at least the 1600s, with Thomas Hobbes and cronies (mortality & morbidity), and the 1700s scurvy studies aboard a British ship: half the sailors were fed limes, half were not (they were called "Limeys" since they had to eat limes).
1930s through the 1950s: large-scale evaluations were commonplace (delinquency prevention programs, felon recidivism programs, public housing programs, etc.). The Great Depression, the New Deal.
1960s & 70s: The War on Poverty and the Great Society during the Kennedy and Johnson eras all gave rise to program evaluation. Publications, books, and journals grew rapidly.
1980: "Evaluation has become the liveliest frontier of American social science" (Cronbach: Toward Reform in Program Evaluation).
Then came Reagan in 1980.... And again in 1984....
Rossi and Freeman: Evaluation: A Systematic Approach, 1985.
Key Questions
- What is the nature and scope of the problem requiring action?
- What interventions may be undertaken to ameliorate the problem?
- What is the appropriate target population?
- Is the intervention reaching the target population?
- Is the intervention being implemented in the ways envisioned?
- Is it effective?
- How much does it cost?
- What are the costs relative to the benefits?
Rossi and Freeman: Evaluation: A Systematic Approach, 1985.
Evaluation Topics
- Needs assessments: determine the existence and degree of a problem (usually done before evaluations are put in place).
- Cost-benefit: do the results of a program justify its costs? Ex: teens, psychotropic drugs.
- Monitoring studies: ongoing tracking of information of interest. Ex: pregnancy rates, STD rates, crime rates (performance indicators, benchmarks).
- Program evaluations (process and outcome).
Examples
Teen Birth Rates by Country, 2004 (per 1,000 girls 15-19)
* Most recent available data from 2003.
Source: Teen Birth Rates: How Does the United States Compare? The National Campaign to Prevent Teen Pregnancy. Retrieved 1/2/08 from http://www.teenpregnancy.org.
Consequences for Mother & Children
Teen mothers are:
- Less likely to complete high school
- Less likely to go to college
- More likely to end up on welfare
Children born to teenage mothers:
- Have lower birth weights
- Have lower cognitive development
- Have more behavior problems
- Are more likely to perform poorly in school
- Are at a greater risk of abuse/neglect
- Sons are 13% more likely to end up in prison
- Daughters are 22% more likely to become teen mothers themselves
Sources: General Facts and Stats. The National Campaign to Prevent Teen Pregnancy. Retrieved January 2, 2008, from http://www.teenpregnancy.org/resources/data/genlfact.asp
Kirby, D. (2007). Emerging Answers 2007: Research Findings on Programs to Reduce Teen Pregnancy and Sexually Transmitted Diseases. Washington, DC: National Campaign to Prevent Teen and Unplanned Pregnancy. Retrieved December 26, 2007, from http://www.thenationalcampaign.org/EA2007/
The Costs of Teen Childbearing
From 1991 to 2004, teen childbearing in the U.S. cost taxpayers $161 billion. In 2004 alone, it cost taxpayers $9.1 billion; however, if the teen birth rate had not declined by 1/3 since 1991, the cost would have been $15.8 billion. The cost per teen mother is $1,430 annually, but for young teens (17 and under) the cost jumps to $4,080 annually.
Source: The Costs of Teen Childbearing. The National Campaign to Prevent Teen Pregnancy. Retrieved 1/2/08 from http://www.teenpregnancy.org.
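The comparison above implies a saving attributable to the decline in the birth rate. A minimal back-of-envelope check, using only the figures quoted on this slide, makes that arithmetic explicit:

```python
# Back-of-envelope check of the costs quoted above (all figures from the slide).
actual_2004 = 9.1e9           # taxpayer cost of teen childbearing in 2004
counterfactual_2004 = 15.8e9  # estimated cost had the rate not fallen by 1/3

saving = counterfactual_2004 - actual_2004
print(f"Implied annual saving from the decline: ${saving / 1e9:.1f} billion")
# -> Implied annual saving from the decline: $6.7 billion
```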
Evaluation Considerations
Evaluations seldom either totally fail or totally succeed. Rather, they are more or less effective in:
- Showing what works, for whom, under what conditions; and
- Being used to improve program services to clients.
What to look for in an evaluation
- Good program-to-evaluation linkages
- Well-identified objectives
- An adequate process evaluation plan
- Good communication between evaluator and program staff
- Adequate design/method(s)
- Adequate analysis
- Well-stated results
Good Linkages
Good linkage/correspondence between what the program is trying to do and what is being evaluated:
- The intervention is well defined.
- The evaluation plan adequately addresses the intervention.
Well-identified program objectives
- The program objectives and evaluation questions coincide.
- The program objectives are well stated.
- The evaluation questions are well stated.
Think S.M.A.R.T.
S.M.A.R.T.
- Specific
- Measurable
- Achievable
- Related to the goal
- Time-limited
Stating an Objective
From the logic model: Increase intention to delay sexual activity....
An objective of this project intervention is to increase adolescent clients' intention to delay sexual activity within the next year.
Hyp_D1: There will be no difference in intention to delay sexual intercourse within the next year between adolescents receiving the intervention and those not receiving the intervention.
Hyp_D2: There will be no difference before and following the intervention in adolescent clients' intention to delay sexual intercourse within the next year.
(Both hypotheses are stated in null form; the evaluation tests whether they can be rejected.)
Survey item: How likely is it that you will have sexual intercourse within the next year?
a. I definitely will
b. I probably will
c. I don't know
d. I probably will not
e. I definitely will not
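A minimal sketch of how Hyp_D1 might be tested, assuming the item above is scored 1-5 (5 = "I definitely will" down to 1 = "I definitely will not"); the responses and group sizes are hypothetical, not study data:

```python
# Hypothetical post-test scores on the 1-5 intention item for each group.
from scipy import stats

intervention = [1, 2, 2, 1, 3, 2, 1, 2, 4, 1]
comparison = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]

# The item is ordinal, so a rank-based two-sample test is a reasonable default.
u_stat, p_value = stats.mannwhitneyu(intervention, comparison, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
# Reject Hyp_D1 (the null of no group difference) if p < .05.
```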
Process Evaluation
Is the program being implemented as planned? Is program fidelity being tested?
(AKA: formative evaluations, implementation evaluations, fidelity assessments.)
Who is being served & how are they being served
- What services are received
- Quantity of service
- Duration of service
- Quality of service
- Serving the intended population
- Dosage
Communication
Good communication between the evaluator, program staff, and other stakeholders:
- Program staff have sufficient information about the evaluation to use it in program planning and convey it to stakeholders.
- The evaluator should provide timely and useful feedback to program staff.
Outcome Evaluation
- Design
- Measurement
- Analysis
- Results
(AKA summative evaluations; impact evaluations.)
Note: There is no good reason to conduct an outcome evaluation before conducting a process evaluation.
Basic Research Designs
- Random assignment (a minimal sketch follows this list)
- Matched comparison groups
- Existing statistics
- Time series
- Pre-post only designs
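A minimal sketch of simple random assignment, the first design on the list; the participant IDs and group sizes are hypothetical:

```python
# Randomly split a hypothetical roster into treatment and comparison arms.
import random

participants = [f"P{i:03d}" for i in range(1, 41)]  # 40 hypothetical IDs
random.seed(42)                                     # fixed seed so the assignment is reproducible
random.shuffle(participants)

treatment, comparison = participants[:20], participants[20:]
print(len(treatment), len(comparison))  # -> 20 20
```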
Design Issues 1: Adequate Design
- Is the design sufficiently strong/appropriate to test the evaluation questions?
- Is there design fidelity? Is the design originally proposed being implemented?
- Technical aspects of the instruments used (validity and reliability) are well described (see the reliability sketch below).
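One common way to document instrument reliability is an internal-consistency coefficient such as Cronbach's alpha; this is one choice among several, and the respondents-by-items matrix below is hypothetical:

```python
# Cronbach's alpha for a multi-item scale (hypothetical data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scale scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

responses = np.array([[4, 5, 4], [2, 2, 3], [5, 4, 5], [3, 3, 2], [1, 2, 1]])
print(f"alpha = {cronbach_alpha(responses):.2f}")  # values near 1 indicate high consistency
```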
Design Issues 2: Good Data Quality
- Adequate sample sizes
- Adequate follow-up rates (see the audit sketch after this list)
- Missing/incomplete data are accounted for
- Systematic data collection; data collection points are well identified and sustained
- Data quality is being assessed
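A minimal sketch of a data-quality audit covering two of the items above (follow-up rates, missing data), assuming a hypothetical pandas DataFrame with one row per participant; the column names are illustrative:

```python
# Audit follow-up and missingness in a hypothetical evaluation dataset.
import pandas as pd

df = pd.DataFrame({
    "id": [1, 2, 3, 4, 5, 6],
    "pre": [3, 4, 2, 5, 3, 4],
    "post": [2.0, 3.0, None, 4.0, None, 3.0],  # None = lost to follow-up
})

followup_rate = df["post"].notna().mean()
print(f"Follow-up rate: {followup_rate:.0%}")  # flag if below the evaluation's target
print(df.isna().sum())                         # missing values per column
```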
Analysis Issues
- Appropriate statistical tests
- Accounting for group differences (see the ANCOVA sketch below)
- Accounting for selection bias
- Accounting for differences in attrition between groups
- A common weakness to watch for: program drop-outs are not followed
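A minimal sketch of one way to account for baseline group differences: an ANCOVA-style regression of the post-test on the pre-test plus a treatment indicator. The DataFrame, column names, and values are hypothetical, and statsmodels is assumed to be available:

```python
# Adjusted treatment effect via ANCOVA (hypothetical data).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "pre":   [3, 4, 2, 5, 3, 4, 2, 3, 4, 3],
    "post":  [2, 3, 2, 4, 2, 4, 3, 3, 4, 3],
    "group": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],  # 1 = treatment, 0 = comparison
})

model = smf.ols("post ~ pre + group", data=df).fit()
print(model.params["group"])  # treatment effect adjusted for the pre-test
```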
Results Issues
- Limitations of the evaluation and the potential effects of those limitations are identified.
- Results are not overstated given the weaknesses/limitations of the design.
[Figure: pre-test/post-test plots for Program A and Program B, each comparing a Treatment and a Comparison group; the y-axis is risk on the 1-5 intention scale, the x-axis is time (pre-test to post-test). Survey item: How likely is it that you will have sexual intercourse in the next year? 5 = I definitely will; 4 = I probably will; 3 = I don't know; 2 = I probably will not; 1 = I definitely will not.]
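A minimal sketch of the comparison such a figure depicts, computed as a difference-in-differences on hypothetical group means (1-5 intention scale); the numbers are illustrative only:

```python
# Difference-in-differences on hypothetical pre/post group means.
pre = {"treatment": 3.4, "comparison": 3.3}
post = {"treatment": 2.1, "comparison": 3.2}

change_treatment = post["treatment"] - pre["treatment"]     # -1.3
change_comparison = post["comparison"] - pre["comparison"]  # -0.1
print(f"Difference-in-differences: {change_treatment - change_comparison:+.1f} scale points")
# -> -1.2: the treatment group's score fell 1.2 points more than the comparison group's.
```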
Focus Group Uses
- Serve to augment quantitative survey data
- Help to confirm the rationale of the intervention (client-centered approach)
- Highlight the complexities of providing services to high-risk youth
- May help bring about a more complete understanding of program effects
Focus Group Uses (cont.)
Focus groups can tap into the implicit effects of individualized interventions, such as the effects of the relationship between clients and service providers, which may be subtle and vary greatly among participants and across communities.
What Focus Groups Are Not
- Focus groups should not be used as alternatives to quantitative designs when assessing program impacts.
- Generalizations are limited.
- They cannot be used to estimate population parameters (e.g., prevalence/incidence).
Coordinator Group A
"The really strict teacher had to leave, and the new teacher couldn't control the class."
"The coordinator seems shy and has trouble controlling the kids. I think she knows the information but can't get the information out."
Coordinator Group B
"She's cool, she's like a best friend" (...lots of chatter, agreement).
"Yeah, she's like a best friend who happens to know a lot more than we do."
"She's not like a teacher, who'll just blurt it out; she like details it and explains it more carefully." (Someone else adds: "To where we would understand it.")
"She puts it in our language."
Evidence-Based Practices
Generally means employing "...interventions that research has shown to be effective in... achieving goals."
Drake, R. E., Merrens, M. R., & Lynde, D. W. (2005). Evidence-Based Mental Health Practice. W. W. Norton & Co., p. 67.
Levels of Evidence
1. Emerging practices
2. Promising practices
3. Evidence-based practices (RCT, replication, and fidelity)
Identified by national consensus panel recommendations based on studies that are systematically reviewed.