Methods and Implications of Using Methods Ellen Taylor-Powell University of Wisconsin-Cooperative Extension.


1 Methods and Implications of Using Methods Ellen Taylor-Powell University of Wisconsin-Cooperative Extension

2 Our time today
- Overview: sources and methods
- Program examples
- Cultural considerations
- Attribution vs. contribution
- Application of evaluation standards as we think about methods and implications

3 Process
- Ask questions
- Interactivity
- Share examples

4 Methods = CHOICES So many choices, so many decisions

5 “Developing an evaluation is an exercise of dramatic imagination” (Cronbach, 1982: 239)

6 Let's get started by checking ourselves! (Answer each statement true or false.)
a. There is one best way to collect data.
b. Quantitative methods that collect numbers provide more useful information.
c. Evaluation data collection involves any established social science research method.
d. We often collect data from program participants.
e. We should always collect data from as many participants as possible.

7 Myths
- The choice of method is primarily a technical decision.
- There is one best method.
- There are established and known standards of what constitutes methodological quality and excellence.
- More data is always better.
- "Hard" data is better than "soft" data.

8 Where do methods fall in the process of planning an evaluation? http://www.uwex.edu/ces/pdande/evaluation/

9 Logic model: INPUTS (program investments) → OUTPUTS (activities, participation) → OUTCOMES (short-, medium-, long-term)
Evaluation questions: What questions do you want to answer?
Evaluation methods: How will you collect the information to answer your questions?
Match evaluation questions and methods to your PROGRAM.

10 Example: What do you (and others) want to know about the program?
INPUTS: staff, money, partners, research
OUTPUTS (activities): assess parent ed programs; design and deliver an evidence-based program of 8 sessions; facilitate support groups
OUTPUTS (participation): parents of 3-10 year olds attend
OUTCOMES (short-term): parents increase knowledge of child development; better understand their own parenting style; gain skills in new ways to parent; identify appropriate actions to take; gain confidence in their abilities
OUTCOMES (medium-term): parents use effective parenting practices
OUTCOMES (long-term): improved child-parent relations; reduced stress; strong families
(Also labeled: Inputs – Process – Outcomes – Impact)

11 Possible evaluation questions, mapped onto the same logic model:
- Inputs: What amount of money and time were invested?
- Activities: Were all sessions delivered? How well? Do support groups meet?
- Participation: Did all parents participate as intended? Who did and did not? Did they attend all sessions? Support groups? Level of satisfaction?
- Short-term outcomes: To what extent did knowledge and skills increase? For whom? Why? What else happened?
- Medium-term outcomes: To what extent did behaviors change? For whom? Why? What else happened?
- Long-term outcomes: To what extent is stress reduced? Are relations improved?

12 Sources of evaluation information
- People: youth participants, parents, teachers, volunteers, leaders, judges…
- Pictorial records and observations: before-after photos; observations at events; artwork…
- Existing information: record books, plans of work, logs, journals, meeting minutes…

13 Data collection methods
- Survey
- Interview
- Focus group
- Observation
- Expert or peer reviews
- Portfolio reviews
- Testimonials
- Tests
- Photographs, videotape, slides
- Diaries, journals, logs
- Document review and analysis

14 Polling slide… How many use or have used these methods?

15 Creative methods…
- Creative expression: drawing, drama, role-playing
- Photography, videotape, slides
- Diaries, journals, logs
- Personal stories
- Expert review
- Buzz session
- Affinity diagramming
- ???
"There can be no definitive list of creative evaluation approaches. Such a list would be a contradiction in terms." (Patton: 346)

16 Pros and cons of different methods (Insert slides or connect to a PDF? Example re: choices)

17 Quantitative information – Qualitative information
- Quantitative: numbers, breadth, generalizability
- Qualitative: words, depth, specificity
"Not everything that counts can be counted, and not everything that can be counted counts." (attributed to Albert Einstein)

18 Often, it is better to use more than one data collection method… TRIANGULATION. Why?

19 Examples How might you mix sources of information in your evaluation? How might you mix data collection methods to evaluate your program?

20 Polling or quiz

21 EXAMPLE DESIGN
1. Focus: Whole-farm phosphorus management – 11 Western counties

Question 1: What did the phosphorus management program actually consist of? Who did what?
- Indicators: number and type of activities implemented (course developed, workshops conducted, on-farm work); number, who, and role of partners
- Timing: at time of activity; at time of involvement
- Sources: staff; partners
- Methods: recording log/database; log; interview annually
- Sample: all activities; all partners
- Instruments: need form and system for ongoing recording; need recording form and interview questions

Question 2: Did the expected number of farmers attend the various activities? Who participated in what?
- Indicators: number and key characteristics of participating farmers per activity
- Timing: at time of activity (workshop, field day, on-farm visit)
- Sources: attendance logs
- Methods: record review
- Sample: all participants
- Instruments: need recording form and system for collecting data

Questions 3 and 4: What resulted? To what extent did participating farmers a) increase their knowledge? b) increase skills in tracking P levels? c) adopt recommendations? d) reduce P levels? e) save money? What else happened?
- Indicators: number and % of participants who a) report increased knowledge, b) demonstrate skill, c) report changes in feeding levels, d) record P reductions, e) report $ savings and amount of savings
- Timing: end of each workshop; ongoing; annually – 4th quarter
- Sources: participants; farmers; staff; partners and other stakeholders
- Methods: post-session survey; observations; record review; informal interviews; focus groups
- Sample: all participants; 5-7 selected in each grouping
- Instruments: questionnaire; TBD; recording logs and questions; TBD; focus group protocol for each group

Design considerations: Baseline? Comparison group? External contingencies? Other outcomes?
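A design matrix like the one above is essentially structured data, and writing it down that way makes it easy to check for gaps before data collection begins. The sketch below is purely illustrative (the `DesignRow` class and field names are not part of the original presentation); it records one row of the example design and flags any question that is missing a source or method.

```python
# Sketch: one row of an evaluation design matrix as structured data.
# The field names mirror the columns of the example design; the class
# itself is an illustration, not part of the original presentation.
from dataclasses import dataclass, field


@dataclass
class DesignRow:
    question: str
    indicators: list = field(default_factory=list)
    timing: list = field(default_factory=list)
    sources: list = field(default_factory=list)
    methods: list = field(default_factory=list)
    sample: list = field(default_factory=list)
    instruments: list = field(default_factory=list)


design = [
    DesignRow(
        question="Did the expected number of farmers attend the various activities?",
        indicators=["number and key characteristics of participating farmers per activity"],
        timing=["at time of activity (workshop, field day, on-farm visit)"],
        sources=["attendance logs"],
        methods=["record review"],
        sample=["all participants"],
        instruments=["recording form and system for collecting data"],
    ),
]

# Simple completeness check: every question should name at least one
# data source and one collection method before fieldwork starts.
incomplete = [row.question for row in design if not (row.sources and row.methods)]
print(incomplete)
```

An empty list printed at the end means every row is fully specified; the same check scales to a design with many questions.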

22 Contribution vs. attribution "We need to accept the fact that what we are doing is measuring with the aim of reducing the uncertainty about the contribution made, not proving the contribution made." (Mayne, 1999: 10)

23 Culturally appropriate evaluation methods
How appropriate is the method given the culture of the respondent and the setting?
Cultural differences: nationality, ethnicity, religion, region, gender, age, abilities, class, economic status, language, sexual orientation, physical characteristics, organizational affiliation

24 Is a written questionnaire culturally appropriate? Things to consider:
- Literacy level
- Tradition of reading and writing
- Setting
- Not the best choice for people with an oral tradition
- Translation (more than just literal translation)
- How cultural traits affect response – response sets
- How to sequence the questions
- A pretest questionnaire may be viewed as intrusive

25 Are interviews culturally appropriate? Things to consider:
- Preferred by people with an oral culture
- Language proficiency; verbal skill
- Politeness – responding to authority (thinking it's unacceptable to say "no"), nodding, smiling, agreeing
- Need to have someone else present
- Relationship/position of the interviewer
- May be seen as interrogation
- Direct questioning may be seen as impolite, threatening, or confrontational

26 Are focus groups culturally appropriate? Things to consider:
- Issues of gender, age, class, and clan differences
- Issues of pride, privacy, self-sufficiency, and traditions
- Relationship with the facilitator as a prerequisite to rapport
- Same considerations as for interviews

27 Is observation culturally appropriate? Things to consider:
- Discomfort or threat of being observed
- Issue of being an "outsider"
- Observer effect
- Possibilities for misinterpretation

28 CHALLENGES
- Hard-to-reach populations
- Young children
- When to do follow-up
- Sensitive subject matter
- Reactivity
- Evaluation as an add-on

29 Insert – polling, quiz…some type of interactivity

30 Apply the evaluation standards to your methods decisions
- Utility
- Feasibility
- Propriety
- Accuracy

31 UTILITY Will the data sources and collection methods serve the information needs of your primary users?

32 FEASIBILITY Are your sources and methods practical and efficient? Do you have the capacity, time, and resources? Are your methods non-intrusive and non-disruptive?

33 PROPRIETY Are your methods respectful, legal, ethical, and appropriate? Does your approach protect and respect the welfare of all those involved or affected?

34 ACCURACY Are your methods technically adequate to:
- answer your questions?
- measure what you intend to measure?
- reveal credible and trustworthy information?
- convey important information?

35 When choosing methods, consider…
- The purpose of your evaluation – what do you want to know?
- Your use and users – what kind of data will your stakeholders find most credible and useful? (percentages, comparisons, stories, statistical analysis)
- Your respondents – how can they best be reached, and how might they best respond?
- Your comfort level
- The level of burden on the program or participants
- The pros and cons of each method
- RESOURCES

36 http://www.uwex.edu/ces/pdande/evaluation/index.html

37 http://www.uwex.edu/ces/pdande/ http://www.uwex.edu/ces/lmcourse/

38 Resources
- Ohio State University
- Penn State



