Presentation on theme: "The Seven Deadly Sins of Program Evaluation" -- Presentation transcript:
1 The Seven Deadly Sins of Program Evaluation William Ashton, Ph.D.
2 This Talk is for ...
Everyone -- especially ATOD professionals
Some experience with program evaluation
Cookbook
Look Out For
Solutions
Hot Topic -- Outcomes
One Step Beyond
3 A brief quote ...
“Clearly, evaluation can evoke strong emotions, negative associations, and genuine fear.” -- Michael Q. Patton
4 Alternate Title
A Psych Geek Talks about Boring & Technical Research Methodology & Program Evaluation Stuff
5 Seven Deadly Sins of Program Evaluation Please See Insert 1
6 A History of Evaluation
Process Data -- documenting services delivered (e.g. clients seen, talks given, participants at talks)
Outcome Data -- documenting changes for populations receiving services (e.g. increase in family cohesion, increase in knowledge of drug refusal skills)
Effects -- documenting changes for populations receiving services that are due to the program -- and only the program
7 Counterfactuals
Did my program make a difference? Compared to what?
Counterfactual -- the “should have been”
Jim Fixx -- outcome: died while jogging at 51
Counterfactual: “should have” died when he was 40
What difference did jogging make? Jogging had a life-lengthening effect
8 Another Counterfactual Example
Program -- during the school year you implement an anti-smoking program for eighth-graders
Outcome -- the number of eighth-grade tobacco violations drops
Did your smoking program work ... or ...
Counterfactual -- the principal shifts the school’s enforcement focus away from tobacco to weapons & threats; violations would have dropped anyway!
9 The 7 Deadly Sins are ...
1. Using Bad Measures
2. Underestimating Regression
3. Underestimating Maturation
4. Underestimating Testing Effects
5. Underestimating Local History
6. Selected Groups
7. Using Bad Statistics
10 Deadly Sin #1 Bad Measures
Tests
Surveys -- questionnaires
Archival Data -- data you get from someone else (published survey data, school records)
11 BAD Tests
Look Out For -- homemade tests
Solution: Use published (standardized) tests
Look For -- internal consistency (reliability), a test’s ability to measure the trait and not error (Cronbach’s α > .72)
Look For -- test-retest reliability, a test’s ability to measure the same trait twice (r > .70)
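The internal-consistency coefficient named on this slide can be computed directly. A minimal sketch, assuming NumPy and an item-by-respondent score matrix (the data and function name here are illustrative, not from the talk):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# hypothetical 4 respondents x 3 items
scores = np.array([[3, 4, 3],
                   [2, 2, 3],
                   [5, 5, 4],
                   [1, 2, 1]])
print(cronbach_alpha(scores))
```

Highly consistent items push α toward 1; per the slide, a published test should report α above about .72 before you trust it.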
12 BAD Archival Data
Archival Data -- data you get from someone else
Examples:
number of eighth-grade ATOD violations
number of high school tobacco violations
juvenile court referrals
police reports on gang activity
office referrals at Ensley Avenue High School
13 BAD Archival Data
Problem: Are the same procedures being used -- year to year -- to record the data?
Examples:
principal shifts the school’s enforcement focus away from tobacco to weapons & threats
new school secretary records many tobacco violations as “other drug” violations
hard drive crashes -- all ATOD&V data from Ensley Avenue High School is lost for 1999
14 BAD Archival Data
Look Out For -- all archival data
Solution -- the Sherlock Holmes approach
Look For:
changes in policy
changes in personnel
read the fine print
15 Pair off and ...
Describe your program
Describe how you could use bad measures in your program
Have your partner do the same
Five minutes total
16 Seven Deadly Sins of Program Evaluation Please See Insert 2
17 Deadly Sin #2 Underestimating Regression
When measuring the same thing twice, extreme scores will become less extreme for no real reason
Look Out For:
Giving a person the same test twice
Forming groups based upon a pre-test score
20 Deadly Sin #2 Solution
Don’t form extreme groups
Form groups based upon random assignment -- flip a coin!
21 Deadly Sin #3 Underestimating Maturation
Participants “grow up” between pre-test and post-test
Example: the Behavioral and Emotional Rating Scale’s Interpersonal Strength subscale shows an increase between the pre-test (beginning of ninth grade) and post-test (end of ninth grade)
Effect of a program which targeted individual risk factors ... or ...
normal growth in Interpersonal Strength during the first year of high school?
22 Maturation
Look Out For:
Long test-retest intervals (some tests list acceptable intervals)
Test-retest intervals during “growth spurts”
Solution:
Avoid the above warning signs
Use a control group
23 Deadly Sin #4 Underestimating Testing Effects
The pre-test influences behavior and/or responses on the post-test
Example: IQ tests
Pre-test → Refusal Skill Training → Post-test
Is the positive outcome on the post-test caused by the training or by the pre-test?
24 Testing
Look Out For:
Obvious (transparent) tests
Highly inflammatory (reactive) tests
Solutions:
Avoid the warning signs
Use a control group
25 Deadly Sin #5 Underestimating “Local History”
Some other, non-treatment event influences the treatment group
Examples:
80% of FAST families evicted during the FAST program
School-wide anti-drug curriculum
Drug-related death at school
26 Local History
Look Out For -- single-group pre/post-test designs
Solution:
Sherlock Holmes approach
Use a control group
27 In groups of four ...
Find new people; form a group of four
Describe your program
Each person describes how a regression, maturation, testing, or local history sin could affect your program
Help out your partners!
Ten to fifteen minutes
28 Seven Deadly Sins of Program Evaluation Please See Insert 4
29 Control Groups
Participants are randomly assigned to either the control or treatment group
The control group is given the tests, but not the treatment
This creates a counterfactual
30 Control Group Design
Control Group: Random Assignment → Pre-test → 8 Weeks Nothing → Post-test
Treatment Group: Random Assignment → Pre-test → 8 Weeks FAST Curriculum → Post-test
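The coin-flip assignment behind this design can be sketched in a few lines. This is an illustrative helper (the function name and seed handling are assumptions, not from the talk); seeding makes the assignment reproducible for an evaluation report:

```python
import random

def assign_groups(participants, seed=None):
    """Randomly split participants into control and treatment groups
    by an independent coin flip for each person."""
    rng = random.Random(seed)
    control, treatment = [], []
    for p in participants:
        if rng.random() < 0.5:   # heads: treatment
            treatment.append(p)
        else:                    # tails: control
            control.append(p)
    return control, treatment

families = [f"family_{i}" for i in range(40)]  # hypothetical FAST roster
control, treatment = assign_groups(families, seed=42)
```

Because chance alone decides who gets the curriculum, both groups regress, mature, react to testing, and experience local history in the same way on average, which is why this design eliminates those four sins.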
31 Control Group Design
The random control/treatment design eliminates:
Regression
Maturation
Testing
Local History
32 Deadly Sin #6 Selected Groups
Instead of randomly forming groups ...
Participants get to choose which group to join, or
Groups are formed by a criterion
Example: FAST -- teachers identify the children most likely to benefit (invites regression)
33 Selected Groups
Look Out For:
Participants choosing
Participants being selected
Solution:
Random assignment to control and treatment groups
34 Deadly Sin #7 Bad Statistics
Conducting multiple statistical tests
Conducting statistical tests on small samples
35 Conducting Multiple Statistical Tests
... or begging for a Type I statistical error
Look Out For -- conducting several t-tests or chi-squared tests
Solution -- find a statistician
36 Finding a Statistician
Local college -- Psychology, Sociology, or Math department
Professor
Class project
Senior thesis
Remember the college time-line
37 Conducting Statistical Tests on Small Samples
... or begging for a Type II statistical error
Look Out For -- groups with fewer than 15 persons
Solutions:
Don’t do statistics
Find a statistician
Get more people
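The Type II risk can be shown by estimating statistical power via simulation. A sketch, assuming NumPy and SciPy; the effect size (half a standard deviation) and simulation counts are illustrative choices, not figures from the talk:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

def power(n_per_group, effect=0.5, sims=2000, alpha=0.05):
    """Estimate the chance a two-sample t-test detects a real
    `effect`-SD difference with n_per_group people in each group."""
    hits = 0
    for _ in range(sims):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(effect, 1.0, n_per_group)
        if ttest_ind(control, treated).pvalue < alpha:
            hits += 1
    return hits / sims

print(power(10))   # small groups: the real effect is usually missed
print(power(64))   # larger groups: detection becomes reliable
```

With only 10 per group the test misses a genuine medium-sized effect most of the time, a Type II error waiting to happen, which is why "get more people" (or "find a statistician" to do a power analysis) is the cure.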
38 Find a new partner and ...
Describe your program
Discuss how you would use a random control/treatment group design
What problems would you encounter trying to randomly assign participants to control versus treatment groups?
39 Inspirational Quote “Bad data is free. Good data costs money.” -- Bill Ashton
40 The Cost of Evaluation
Does your funder require an Effects Evaluation?
Yes -- then get evaluation money from the funder
No -- then ask yourself, “do I need to do this?”
Will evaluation increase your chances of getting new funding?
Yes -- then find funding for evaluation and accept the risk
41 Rights of Use of This Material
Some trainers are very protective of their materials -- they’re afraid that they’re giving away their business. I feel that freely distributing information like this is just good advertising for a trainer or consultant. So please use my material as you see fit, with the provision that you, in print, reference me. Please use the following information -- in full:
William Ashton, Ph.D.
The City University of New York, York College
Department of Political Science and Psychology