
1 An Introduction to Evaluation Research Duane Shell Research Associate Professor Nebraska Prevention Center for Alcohol and Drug Abuse Department of Educational Psychology

2 Why Evaluate? Because somebody says you have to. [usually a funder]

3 What is Evaluation? Something to be “feared”. A sneaky way for the “powers that be” to get you. How “the Man” stifles your creativity. How “they” can justify taking away your money.

4 What is Evaluation? Evaluation is a way for YOU to find out if things you are doing are working.

5 So Why Evaluate? You evaluate for yourself, not for others. Really!!

6 The difference between Evaluation and Research

7 Evaluation vs Research Evaluation and Research DO NOT differ in: Data collection tools Methodologies Analysis methods

8 The purpose of research is to: Test hypotheses Inform theory Or at least: Describe or draw conclusions About some phenomenon

9 The purpose of evaluation is to: Inform decision making about project success by determining if objectives are achieved and activities are done as planned

10 So What Is Evaluation? A systematic way to collect data to inform decision making

11 Inform Decision Making It is not possible to definitively answer the question: Was the project successful? The only thing that can be done is to collect evidence to allow a judgment

12 Collect Data The foundation of Evaluation is Data driven decision making. Data are typically collected from multiple sources Data are typically both quantitative and qualitative Multiple data are “triangulated” to make judgments and decisions

13 Systematic Data Collection Evaluation is based in a systematic approach to collecting data. Evaluation uses the same methodologies that are used in research. Standards of methodological rigor are the same as those used in research.

14 Systematic Data Collection Evaluation, though, must apply methods and rigor within the real-world context of the project being evaluated. Where the exacting standards of research design cannot be met, evaluation attempts to collect data in the most systematic and open way possible.

15 Systematic Data Collection Because evaluation usually cannot achieve the most exacting standards of research design, evaluation decisions are based on multiple, triangulated data sources.

16 Evaluation Basics

17 5 Basic Evaluation Questions 1) What will be assessed? 2) What measures/indicators will be used? 3) Who will be evaluated? 4) What data will be collected? 5) How will data be analyzed?

18 What Will Be Evaluated? Formative (aka Process) Evaluation: Done to help improve the project itself. Gathers information on how the project worked. Data are collected about activities: what was done. Summative (aka Outcome) Evaluation: Done to determine what results were achieved. Data are collected about outcomes (objectives, goals): what happened.

19 What Measures Will Be Used? Formative Evaluation: Completion of planned activities Adherence to proposed timelines Meeting budget Summative Evaluation: Reaching a criterion Change in knowledge, attitude, skill, behavior

20 Who will be evaluated? Formative Evaluation: Those responsible for doing activities/delivering services and those participating in activities. Faculty Agency personnel Students Summative Evaluation: Those who were expected to be impacted by activities. Students Clients

21 What data will be collected? Formative Evaluation: Program records Observations Activity logs Satisfaction surveys Summative Evaluation: Observations Interviews Tests Surveys/questionnaires

22 How will data be analyzed? 1) Qualitative analysis (more for formative): self-reports, documentation, description, case study. 2) Quantitative analysis (more for summative): group comparison, group change, individual change, comparison to population/reference, analysis of relationships.
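As an illustration, one of the quantitative analyses listed above (group change) can be sketched in plain Python. The scores below are hypothetical, and the sketch uses only the standard library rather than a statistics package:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Dependent (paired) t-test on pre/post scores: a simple form of
    the 'group change' analysis."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Hypothetical pre/post knowledge scores for five participants
pre = [4, 6, 3, 7, 5]
post = [18, 21, 17, 22, 20]
t_stat, df = paired_t(pre, post)
```

In practice the resulting t statistic would be compared against a t distribution with the returned degrees of freedom to judge statistical significance.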

23 An Example The Cosmic Ray Observatory Project (CROP) Goal: Establish a statewide collaborative network of expert teachers fully capable of continuing the project locally. Objectives: Teachers will acquire knowledge about cosmic ray physics and skill in high energy research methods. Teachers will exhibit increased self-efficacy for conducting CROP research and integrating CROP into their teaching. Activity: High school physics teachers and students will attend a 3-4 week hands-on summer research experience on cosmic ray physics at UNL

24 Formative Evaluation What activities were evaluated? The specific components of the Summer Research Experience What measures were used? Completion of activities Participant satisfaction Participant evaluation of goal attainment Participant evaluation of activity effectiveness Who was evaluated? Participants What data were collected? Interviews Rating scales How were data analyzed? Content analysis of interview responses Frequency and descriptive statistical analysis of rating scales.

25 Examples of Formative Measures Interview Questions
What was the most effective part of the workshop?
Hands-on work with detectors: 6
Information from classroom sessions: 4
Teacher comments (by teacher, with coded category(s) indicated):
"For me personal was the activities. The actual connecting and wiring and those things. I don't sit and take lectures very well. That's just me." [Hands-on work with the detectors]
"Um, I think it was the classroom work. There was a good review for those of us that have had physics and it was a good introduction for those that didn't." [Information from classroom sessions]

26 Examples of Formative Measures Rating Scales
1. How effective do you think the workshop was in meeting its goals?
1 = Not Effective, 2 = Neither Effective nor Ineffective, 3 = Somewhat Effective, 4 = Effective, 5 = Very Effective
4. Indicate how USEFUL you think each of the following workshop components was using the following scale.
1 = Very Unuseful, 2 = Unuseful, 3 = Somewhat Unuseful, 4 = Somewhat Useful, 5 = Useful, 6 = Very Useful
a. Classroom/lecture sessions on particle detectors and experimental techniques.
b. Lab work sessions refurbishing and preparing detectors.

27 Examples of Formative Measures Rating Scales
How effective do you think the workshop was in meeting its goals?
Very Effective (5): 1; Effective (4): 3; Somewhat Effective (3): 1; Neither Effective nor Ineffective (2): 0; Not Effective (1): 0. M = 4.00, SD = 0.71
Component usefulness, rated from Very Unuseful (1) to Very Useful (6):
Classroom/lecture sessions on particle detectors and experimental techniques: ratings of 5 (2) and 6 (3). M = 5.60, SD = 0.55
Lab work sessions refurbishing and preparing detectors: ratings of 4 (1), 5 (3), and 6 (1). M = 5.00, SD = 0.71
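The summary statistics on this slide can be reproduced from the underlying ratings. The list below is inferred from the slide's frequency counts for the effectiveness question (one 5, three 4s, one 3); the code is a minimal stdlib-Python sketch:

```python
from collections import Counter
from statistics import mean, stdev

# Ratings inferred from the slide's frequency counts: one 5, three 4s, one 3
ratings = [5, 4, 4, 4, 3]

freq = Counter(ratings)        # frequency table; freq[4] == 3
m = round(mean(ratings), 2)    # mean rating: 4.0
sd = round(stdev(ratings), 2)  # sample SD: 0.71, matching the slide
```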

28 Summative Evaluation What Outcomes were evaluated? Teachers' increase in knowledge about cosmic ray physics and skill in high energy research methods Teachers' change in self-efficacy for conducting CROP research and integrating CROP into their teaching What measures were used? Knowledge gain Achieving criterion level of knowledge/skill Increase in self-efficacy Who was evaluated? Teachers What data were collected? Pre- and post-workshop tests of cosmic ray physics and research Pre- and post-workshop self-efficacy ratings How were data analyzed? Dependent t-tests of pre-post scores Comparing skill scores to criteria

29 Summative Evaluation Knowledge Test Questions 1.The energy distribution of primary cosmic rays bombarding the earth has been measured by a number of experiments. In the space below, sketch a graph of the number of observed primary cosmic rays vs. cosmic ray energy, and describe the distribution in a sentence or two. 2.Explain how a scintillation counter works, i.e. write down the sequence of events from the passage of a charged particle through a scintillator to the generation of an electric signal in a photomultiplier tube. 3.Describe some characteristic differences between electromagnetic showers and hadronic showers created when particles impinge on a block of matter or a cosmic ray enters the atmosphere. Hint: think in terms of the type of particle which initiates the shower, the type of secondary particles in the shower, the shape of the shower, depth penetration of the shower particles, etc.

30 Summative Evaluation Data Analysis
Table 9. Participants' Pre- and Post-Test Mean Scores on Knowledge Tests
Teachers: df = 4; Pre-Test M = 5.00, SD = 2.69; Post-Test M = 19.60, SD = 1.71; t = 8.67*; ES = 6.64
Note. ES = effect size computed by Cohen's d in averaged pre- and post-test SD units. Teachers, n = 5. *p < .01.
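The effect size in Table 9 can be checked directly from the reported summary statistics. This is a minimal Python sketch of the Cohen's d variant described in the table note (mean gain divided by the averaged pre/post SDs):

```python
def cohens_d_averaged_sd(pre_m, pre_sd, post_m, post_sd):
    """Effect size as defined in the table note: the pre-to-post mean gain
    divided by the average of the pre- and post-test SDs."""
    return (post_m - pre_m) / ((pre_sd + post_sd) / 2)

# Knowledge-test row reported in Table 9
es = round(cohens_d_averaged_sd(5.00, 2.69, 19.60, 1.71), 2)  # 6.64
```

The same function reproduces the Table 11 effect sizes later in the deck, e.g. the Conducting CROP Activities row: `round(cohens_d_averaged_sd(41.00, 31.58, 77.80, 16.10), 2)` gives 1.54.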

31 Summative Evaluation Self-Efficacy Questions Please rate how confident you are about each of the following from 0 (completely unconfident) to 100 (completely confident). 1. Your ability to set-up and maintain the CROP research equipment at your school. 2. Your ability to conduct CROP research at your school. 3. Your ability to teach students at your school who haven't attended the Summer Workshop how to conduct CROP research at your school. 4. Your ability to design your own research projects for your students utilizing the CROP research equipment. 5. Your ability to incorporate lessons and activities in high-energy physics into your classes. 6. Your ability to create "hands-on" projects and activities for students in your classes using the CROP research equipment.

32 Summative Evaluation Data Analysis
Table 11. Participants' Pre- and Post-Test Mean Self-Efficacy Scores
Conducting CROP Activities: df = 4; Pre-Test M = 41.00, SD = 31.58; Post-Test M = 77.80, SD = 16.10; t = 3.06*; ES = 1.54
Integrating CROP Into Classes: df = 4; Pre-Test M = 45.00, SD = 31.37; Post-Test M = 79.25, SD = 17.31; t = 3.32*; ES = 1.41
Utilizing Distance Education: df = 4; Pre-Test M = 56.67, SD = 17.48; Post-Test M = 70.33, SD = 13.35; t = 4.08*; ES = 0.89
Note. ES = effect size computed by Cohen's d in averaged pre- and post-test SD units. Teachers, n = 5. *p < .01.

33 Formative Evaluation Example To obtain student reactions for the development of the campus-specific Web-based brief intervention versions, beta versions will be evaluated by recruiting a panel of students from each participating campus. These students will complete the intervention and provide verbal and written feedback on their reactions to the program and their suggestions for improvement. Adjustments to the program will be made based on this student feedback.

34 Summative Evaluation Example Students will complete the web-based brief alcohol intervention (pre-test). Approximately 6 weeks later, they will again complete the web-based brief alcohol intervention (post-test). Change will be determined by comparing post-test scores to pre-test scores using a Repeated Measures Analysis of Variance (ANOVA). Success will be determined by a statistically significant decrease in drinking and driving (Objective 1) and riding with a driver who has been drinking (Objective 2), with an effect size of at least a 10% pre- to post-test decrease for drunk driving and a 6% decrease for riding with a drinking driver.
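With only two time points, a repeated-measures ANOVA on pre/post scores is equivalent to a dependent t-test (F = t²). The sketch below uses hypothetical percentages for five campuses and treats the "10% decrease" criterion as a relative drop in the group mean, which is one possible reading of the slide:

```python
import math
from statistics import mean, stdev

# Hypothetical percentages of students reporting drinking and driving,
# pre- and post-intervention, for five campuses
pre = [30.0, 28.0, 35.0, 25.0, 32.0]
post = [24.0, 25.0, 30.0, 22.0, 27.0]

# With two time points, a repeated-measures ANOVA reduces to a
# dependent t-test; the F statistic is simply t squared.
diffs = [b - a for a, b in zip(pre, post)]
t = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
F = t ** 2

# One reading of the success criterion: at least a 10% relative
# pre-to-post decrease in the group mean
pct_decrease = 100 * (mean(pre) - mean(post)) / mean(pre)
meets_criterion = pct_decrease >= 10
```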

35 Planning Evaluation The Logic Model A systematic linkage of project goals, objectives, activities, and outcomes.

36 Steps in Creating a Logic Model 1)Clarify what the goals of the project/ program are. 2)Clarify what objectives the project should achieve. 3)Specify what program activities will occur.

37 Goal Clarification High school physics teachers and students will attend a 3-4 week hands-on summer research experience on cosmic ray physics at UNL. Is this a goal?

38 Goal Clarification Establish a statewide collaborative network of expert teachers fully capable of continuing the project locally.

39 Developing Objectives Goal: Establish a statewide collaborative network of expert teachers fully capable of continuing the project locally. Objectives 1.Teachers will acquire knowledge about cosmic ray physics and skill in high energy research methods. 2.Teachers will exhibit increased self-efficacy for conducting CROP research and integrating CROP into their teaching.

40 CROP Logic Model Goal: Establish a statewide collaborative network of expert teachers fully capable of continuing the project locally. Objectives: Teachers will acquire knowledge about cosmic ray physics and skill in high energy research methods. Teachers will exhibit increased self-efficacy for conducting CROP research and integrating CROP into their teaching. Activity: High school physics teachers and students will attend a 3-4 week hands-on summer research experience on cosmic ray physics at UNL

41 Evaluating the Logic Model Goal – Objective Correspondence Are objectives related to the overall goal? Goal – Activity Correspondence Do anticipated activities adequately implement the goals? Activity – Objective Correspondence Will program activities result in achieving objectives?

42 CROP Logic Model Goal: Establish a statewide collaborative network of expert teachers fully capable of continuing the project locally. Objectives: Teachers will acquire knowledge about cosmic ray physics and skill in high energy research methods. Teachers will exhibit increased self-efficacy for conducting CROP research and integrating CROP into their teaching. Activity: High school physics teachers and students will attend a 3-4 week hands-on summer research experience on cosmic ray physics at UNL

43 An Example GOAL 1:Increase the availability of attractive student centered social activities located both on and off the NU campus. Objective 1.1: Increase by 15% from baseline the number of students aware of campus and community entertainment options available to NU students. Activity: Develop and maintain an interactive web site describing social and entertainment options for students.

44 Another Example GOAL 7: Reduce high-risk alcohol marketing and promotion practices. Objective 7.3: Reduce by 25% from baseline, the volume of alcohol advertisements in the Daily Nebraskan, The Reader and Ground Zero that mention high-risk marketing and promotion practices. Activity: Work with the media to encourage at least 3 newspaper articles or television news stories in the Lincoln market each school year concerning high-risk marketing and promotion practices.

45 Logic Model Example

46 Logic Model Worksheet Columns: Goals | Activities | Objectives (Outcomes) | Indicators | Measures | Who Evaluated | Data Sources | Data Analysis
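One way to make the worksheet concrete is to represent each row as a small data structure whose fields mirror the column headers. The CROP entries below are a hypothetical filling-in based on the earlier slides:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LogicModelRow:
    """One row of the logic model worksheet; fields mirror its columns."""
    goal: str
    activities: List[str]
    objectives: List[str]          # outcomes
    indicators_measures: List[str]
    who_evaluated: List[str]
    data_sources: List[str]
    data_analysis: List[str]

# CROP example, hypothetically filled in from the earlier slides
crop = LogicModelRow(
    goal="Establish a statewide collaborative network of expert teachers "
         "fully capable of continuing the project locally.",
    activities=["3-4 week hands-on summer research experience at UNL"],
    objectives=["Knowledge of cosmic ray physics and research skill",
                "Increased self-efficacy for CROP research and teaching"],
    indicators_measures=["Knowledge gain", "Criterion skill level",
                         "Self-efficacy increase"],
    who_evaluated=["Teachers"],
    data_sources=["Pre/post knowledge tests", "Pre/post self-efficacy ratings"],
    data_analysis=["Dependent t-tests", "Comparison to criteria"],
)
```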

47 Final Thoughts Funders, including federal agencies, states, and foundations, increasingly want more Summative (Outcome) Evaluation

48 Especially Summative Evaluation of Educational Activities and Outreach Activities

49 To be competitive for funding, you need a strong evaluation plan that answers more than "Did activities get done, and did attendees like them?"

50 Where to Get Help The SSP Core (Well, duh. Why else would we be doing this Brown Bag?) The SSP can assist with: Developing an evaluation plan and logic model for funding proposals Conducting evaluations of ongoing projects

