The Seven Deadly Sins of Program Evaluation -- William Ashton, Ph.D.


1 The Seven Deadly Sins of Program Evaluation William Ashton, Ph.D.

2 This Talk Is for...
• Everyone -- especially ATOD professionals
• Those with some experience with program evaluation
  – Cookbook format: "Look Out For" warnings and "Solutions" for each sin
• The hot topic -- outcomes
  – And one step beyond: effects

3 A brief quote... “Clearly, evaluation can evoke strong emotions, negative associations, and genuine fear.” -- Michael Q. Patton

4 Alternate Title A Psych Geek Talks about Boring & Technical Research Methodology & Program Evaluation Stuff

5 Seven Deadly Sins of Program Evaluation -- please see Insert 1

6 A History of Evaluation
• Process data -- documenting services delivered (e.g., clients seen, talks given, participants at talks)
• Outcome data -- documenting changes for populations receiving services (e.g., an increase in family cohesion, an increase in knowledge of drug refusal skills)
• Effects -- documenting changes for populations receiving services that are due to the program -- and only the program

7 Counterfactuals
• Did my program make a difference -- compared to what?
• Counterfactual -- the "should have been"
• Jim Fixx -- outcome: died while jogging at 52
  – Counterfactual: he "should have" died when he was 40
• What difference did jogging make?
  – Jogging had a life-lengthening effect

8 Another Counterfactual Example
• Program -- during the school year, you implement an anti-smoking program for eighth-graders
• Outcome -- the number of eighth-grade tobacco violations drops
  – Did your smoking program work... or would violations have dropped anyway?
• Counterfactual -- the principal shifts the school's enforcement focus away from tobacco to weapons & threats; violations would have dropped anyway!

9 The 7 Deadly Sins Are...
1. Using Bad Measures
2. Underestimating Regression
3. Underestimating Maturation
4. Underestimating Testing Effects
5. Underestimating Local History
6. Selected Groups
7. Using Bad Statistics

10 Deadly Sin #1
• Bad Measures
  – Tests: surveys, questionnaires
  – Archival data: data you get from someone else (published survey data, school records)

11 BAD Tests
• Look Out For
  – Homemade tests
• Solutions: use published (standardized) tests
  – Look for internal consistency (reliability), a test's ability to measure the trait and not error: Cronbach's α > .72
  – Look for test-retest reliability, a test's ability to measure the same trait twice: r > .70
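Both reliability figures named above are easy to check once a test has been piloted. The deck contains no code, so this is a minimal Python sketch using made-up pilot data:

```python
# Sketch: computing the two reliability figures named on the slide.
# `items` is a respondents-by-items array of pilot scores; `t1`/`t2` are
# total scores from two administrations. All data here are made up.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
items = rng.normal(size=(100, 10)) + rng.normal(size=(100, 1))  # shared trait + item noise
print("Cronbach's alpha:", round(cronbach_alpha(items), 2))     # want > .72

t1 = items.sum(axis=1)                     # first administration
t2 = t1 + rng.normal(scale=2.0, size=100)  # retest = same trait + new error
print("test-retest r:", round(float(np.corrcoef(t1, t2)[0, 1]), 2))  # want > .70
```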

12 BAD Archival Data
• Archival data -- data you get from someone else
  – Examples: number of eighth-grade ATOD violations; number of high school tobacco violations; juvenile court referrals; police reports on gang activity; office referrals at Ensley Avenue High School

13 BAD Archival Data
• Problem: are the same procedures being used -- year to year -- to record the data?
  – Examples: the principal shifts the school's enforcement focus away from tobacco to weapons & threats; a new school secretary records many tobacco violations as "other drug" violations; a hard drive crashes -- all ATOD&V data from Ensley Avenue High School is lost for 1999

14 BAD Archival Data
• Look Out For
  – All archival data
• Solutions
  – The Sherlock Holmes approach (a sketch follows below)
  – Look for changes in policy and changes in personnel; read the fine print
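What the Sherlock Holmes approach can look like in practice: scan an archival series for years where the counts jump or vanish, then investigate those years for policy, personnel, or data-loss changes. A sketch with hypothetical counts and an arbitrary 50% threshold:

```python
# Sketch: flag suspicious year-to-year changes in archival counts.
# The counts and the 50% threshold are hypothetical.
violations = {1995: 41, 1996: 38, 1997: 44, 1998: 12, 1999: 0}

prev = None
for year, count in sorted(violations.items()):
    if prev is not None and (count == 0 or abs(count - prev) / max(prev, 1) > 0.5):
        print(f"{year}: {count} -- check for policy, personnel, or data-loss changes")
    prev = count
```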

15 Pair off and...
• Describe your program
• Describe how you could use bad measures in your program
  – Have your partner do the same
• Five minutes total

16 Seven Deadly Sins of Program Evaluation -- please see Insert 2

17 Deadly Sin #2: Underestimating Regression
• When measuring the same thing twice
  – Extreme scores will become less extreme for no real reason
• Look Out For
  – Giving a person the same test twice
  – Forming groups based upon a pre-test score

18 Don’t Form Groups Based upon Pre-Test Scores

19 Don’t Form Extreme Groups Please See Insert 3

20 Deadly Sin #2 Solution
• Don't form extreme groups
• Form groups based upon random assignment
  – Flip a coin!
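"Flip a coin" can be taken literally. A minimal sketch of random assignment, using a hypothetical roster; a shuffle-and-split also keeps the group sizes balanced:

```python
# Sketch: random assignment to control vs. treatment groups.
# The roster names are hypothetical placeholders.
import random

participants = ["fam_01", "fam_02", "fam_03", "fam_04", "fam_05", "fam_06"]
random.shuffle(participants)  # the coin flip
half = len(participants) // 2
treatment, control = participants[:half], participants[half:]
print("treatment:", treatment)
print("control:  ", control)
```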

21 Deadly Sin #3: Underestimating Maturation
• Participants "grow up" between pre-test and post-test
  – Example: the Behavioral and Emotional Rating Scale's Interpersonal Strength subscale shows an increase between the pre-test (beginning of ninth grade) and the post-test (end of ninth grade)
  – An effect of the program which targeted individual risk factors... or... normal growth in Interpersonal Strength during the first year of high school?

22 Maturation
• Look Out For
  – Long test-retest intervals (some tests list acceptable intervals)
  – Test-retest intervals during "growth spurts"
• Solutions
  – Avoid the above warning signs
  – Use a control group

23 Deadly Sin #4: Underestimating Testing Effects
• The pre-test influences behavior and/or responses on the post-test
• Examples: IQ tests; Pre-test → Refusal Skill Training → Post-test
  – Is the positive outcome on the post-test caused by the training or by the pre-test?

24 Testing
• Look Out For
  – Obvious (transparent) tests
  – Highly inflammatory (reactive) tests
• Solutions
  – Avoid the warning signs
  – Use a control group

25 Deadly Sin #5: Underestimating "Local History"
• Some other, non-treatment event influences the treatment group
• Examples:
  – 80% of FAST families are evicted during the FAST program
  – A school-wide anti-drug curriculum
  – A drug-related death at school

26 Local History
• Look Out For
  – Single-group pre/post-test designs
• Solutions
  – The Sherlock Holmes approach
  – Use a control group

27 In groups of four...
• Find new people
• Form a group of four
• Describe your program
• Have each person describe how a regression, maturation, testing, or local history sin could affect your program
• Help out your partners!
• Ten to fifteen minutes

28 Seven Deadly Sins of Program Evaluation -- please see Insert 4

29 Control Groups
• Participants are randomly assigned to either the control or the treatment group
  – The control group is given the tests, but not the treatment
• This creates a counterfactual

30 Control Group Design
• Control group: random assignment → pre-test → 8 weeks of nothing → post-test
• Treatment group: random assignment → pre-test → 8-week FAST curriculum → post-test

31 Control Group Design Eliminates
• The random control/treatment design eliminates:
  – Regression
  – Maturation
  – Testing
  – Local History
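Why the design eliminates these threats: maturation, testing effects, and shared local history move both groups, so subtracting the control group's pre-to-post change leaves only the program effect. A simulation sketch with made-up effect sizes:

```python
# Sketch: how the random control/treatment design creates a counterfactual.
# Maturation hits both groups equally; differencing removes it.
import numpy as np

rng = np.random.default_rng(2)
n = 60                          # hypothetical participants per group
maturation, effect = 3.0, 5.0   # hypothetical sizes

pre  = rng.normal(50, 8, size=2 * n)
post = pre + maturation + rng.normal(0, 4, size=2 * n)
post[:n] += effect              # the first n got the 8-week program

gain_treat   = (post[:n] - pre[:n]).mean()
gain_control = (post[n:] - pre[n:]).mean()
print("treatment gain  :", gain_treat.round(1))    # ~ maturation + effect
print("control gain    :", gain_control.round(1))  # ~ maturation alone
print("estimated effect:", (gain_treat - gain_control).round(1))
```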

32 Deadly Sin #6: Selected Groups
• Instead of randomly forming groups...
  – Participants get to choose which group to join
  – Groups are formed by a criterion (FAST -- teachers identify the children most likely to benefit)
• This invites regression (Sin #2)

33 Selected Groups
• Look Out For
  – Participants choosing
  – Participants being selected
• Solution
  – Random assignment to control and treatment groups

34 Deadly Sin #7: Bad Statistics
• Conducting multiple statistical tests
• Conducting statistical tests on small samples

35 Conducting Multiple Statistical Tests... or Begging for a Type I Statistical Error
• Look Out For
  – Conducting several t-tests or chi-squared tests
• Solution
  – Find a statistician
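The size of the danger is easy to show by simulation: run ten t-tests on pure noise at alpha = .05 and roughly 40% of evaluations will report at least one false "significant" finding (1 - .95^10 ≈ .40). A sketch:

```python
# Sketch: many t-tests on data with no real differences anywhere.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
trials, tests = 1000, 10
false_alarms = 0
for _ in range(trials):
    hits = 0
    for _ in range(tests):
        a = rng.normal(size=20)  # pure noise
        b = rng.normal(size=20)  # pure noise
        if stats.ttest_ind(a, b).pvalue < 0.05:
            hits += 1
    false_alarms += hits > 0
print("evaluations with >= 1 false positive:", false_alarms / trials)  # ~ .40
```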

36 Finding a Statistician
• Local college
  – Psychology, sociology, or math department: a professor, a class project, or a senior thesis
  – Remember the college time-line

37 Conducting Statistical Tests on Small Samples... or Begging for a Type II Statistical Error
• Look Out For
  – Groups with fewer than 15 persons
• Solutions
  – Don't do statistics
  – Find a statistician
  – Get more people
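A simulation shows how often small groups miss a real effect. Assuming a medium effect (Cohen's d = 0.5), groups of 10 detect it less than one time in five, while the textbook 64 per group reaches the conventional .80 power:

```python
# Sketch: estimating power by simulation. A real effect (d = 0.5) is
# present in every trial; we count how often the t-test detects it.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

def power(n: int, d: float = 0.5, trials: int = 2000) -> float:
    hits = 0
    for _ in range(trials):
        control = rng.normal(0, 1, size=n)
        treat   = rng.normal(d, 1, size=n)  # the effect really is there
        hits += stats.ttest_ind(treat, control).pvalue < 0.05
    return hits / trials

print("power, n=10 per group:", power(10))  # roughly .18: the effect is usually missed
print("power, n=64 per group:", power(64))  # roughly .80: the conventional target
```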

38 Find a new partner and...
• Describe your program
• Discuss how you would use a random control/treatment group design
• What problems would you encounter trying to randomly assign participants to control versus treatment groups?

39 Inspirational Quote: "Bad data is free. Good data costs money." -- Bill Ashton

40 The Cost of Evaluation
• Does your funder require an effects evaluation?
  – Yes: get evaluation money from the funder
  – No: ask yourself, "Do I need to do this?"
• Will evaluation increase your chances of getting new funding?
  – Yes: find funding for evaluation and accept the risk

41 Rights of Use of This Material
• Some trainers are very protective of their materials -- they're afraid that they're giving away their business. I feel that freely distributing information like this is just good advertising for a trainer or consultant. So please use my material as you see fit, with the provision that you reference me in print. Please use the following information -- in full:
• William Ashton, Ph.D.
• The City University of New York, York College
• Department of Political Science and Psychology

