0 Building Evaluation Capacity Presentation Slides for Participatory Evaluation Essentials: An Updated Guide for Non-Profit Organizations And Their Evaluation Partners 2010 Anita M. Baker, Ed.D. Bruner Foundation Rochester, New York

1 How to Use the Bruner Foundation Guide & PowerPoint Slides
Evaluation Essentials: A Guide for Nonprofit Organizations and Their Evaluation Partners (the Guide) and these slides are organized to help an evaluation trainee walk through the process of designing an evaluation and collecting and analyzing evaluation data. The Guide also provides information about writing an evaluation report. The slides allow for easy presentation of the content, and each section of the Guide includes activities that provide practice opportunities. The Guide has a detailed table of contents for each section and includes an evaluation bibliography. Also included are comprehensive appendices that can be pulled out and used for easy reference; they contain brief presentations of special topics not covered in the main sections, sample logic models, completed interviews that can be used for training activities, and a sample observation protocol.
For the Bruner Foundation-sponsored REP project, we worked through all the information up front in a series of comprehensive training sessions. Each session included a short presentation of information, hands-on activities about the session topic, opportunities for discussion and questions, and homework for trainees to try on their own. By the end of the training sessions, trainees had developed their own evaluation designs, which they later implemented as part of REP. We then provided an additional 10 months of evaluation coaching and review while trainees conducted the evaluations they had designed, and we worked through several of the additional training topics presented in the appendix. At the end of their REP experience, trainees from non-profit organizations summarized and presented the findings from the evaluations they had designed and conducted. The REP non-profit partners agreed that the up-front training helped prepare them to do solid evaluation work and provided opportunities for them to increase participation in evaluation within their organizations. The slides were first used in a similar training project sponsored by the Hartford Foundation for Public Giving.
We recommend the comprehensive approach for those who are interested in building evaluation capacity. Whether you are a trainee or a trainer, and whether you use the Guide to fully prepare for and conduct an evaluation or just to look up specific information about evaluation-related topics, we hope that the materials provided here will support your efforts. Bruner Foundation Rochester, New York

2 They may NOT be sold or redistributed in whole or part for a profit.
These materials are for the benefit of any 501(c)(3) organization. They MAY be used in whole or in part provided that credit is given to the Bruner Foundation. They may NOT be sold or redistributed, in whole or in part, for profit. Copyright © 2010 by the Bruner Foundation. * Please see the previous slide for further information about how to use the available materials. Bruner Foundation Rochester, New York

3 Important Terminology
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

4 Building Evaluation Capacity Session 1 Evaluation Basics
Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

5 Working Definition of Program Evaluation
The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programs, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

10 Working Definition of Participatory Evaluation
Participatory evaluation involves trained evaluation personnel and practice-based decision-makers working in partnership. P.E. brings together seasoned evaluators with seasoned program staff to: Address training needs Design, conduct and use results of program evaluation Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

11 Types of Evaluation Monitoring: Tracking progress through regular reporting. Usually focuses on activities and/or expenditures. Formative Evaluation: An evaluation that is carried out while a project is underway. Often focuses on process and implementation and/or on more immediate or intermediate outcomes. Summative Evaluation: An evaluation that assesses overall outcomes or impact of a project after it ends. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

12 Types and Focuses of Evaluation
Rows show types of evaluation; columns show focuses (Grantee, Donors, Program Area, Foundation).
Monitoring: Grantee, compliance with terms of grant; Donors, efficient fund administration; Program Area, compliance with due diligence policies and budget; Foundation, compliance with laws & policies that govern the foundation.
Formative: Grantee, implementation and short/mid-term outcomes; Donors, donor services & development activities; Program Area, program performance relative to strategy; Foundation, internal performance goals.
Summative: Grantee, long-term outcomes; Donors, donor value created; Program Area, program strategy & goals; Foundation, foundation strategy & goals.
Adapted from Kramer, 2004 Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

13 Types and Focuses of Evaluation (What’s in a Name?)
The same matrix as the previous slide, annotated with common names for these types of evaluation: Administrative Processes, Cluster Evaluation, Process Evaluation, Impact Evaluation, Donor Engagement (multiple sources), and Grantmaker Performance Evaluation. Bruner Foundation Rochester, New York

14 Evaluation Strategy Clarification
All Evaluations Are: Partly social Partly political Partly technical Both qualitative and quantitative data can be collected and used and both are valuable. There are multiple ways to address most evaluation needs. Different evaluation needs call for different designs, types of data and data collection strategies. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

15 Purposes of Evaluation
Evaluations are conducted to: Render judgment Facilitate improvements Generate knowledge Evaluation purpose must be specified at the earliest stages of evaluation planning and with input from multiple stakeholders. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

16

17 Who are Evaluation Stakeholders, and Why Do They Matter?
Decision-makers Information-seekers Those directly involved with the evaluation subject (evaluand) Most programs/strategies have multiple stakeholders Organization managers, clients and/or their caregivers, program staff, program funders, partner organizations Stakeholders have diverse, often competing interests related to programs and evaluation. Certain stakeholders are the primary intended users of evaluation. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

18 What is Needed to Conduct Evaluation?
Specify evaluation questions Develop an evaluation design Apply evaluation logic Collect and analyze data Summarize and share findings Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

19 What is an Evaluation Design?
An evaluation design communicates plans to evaluators, program officials and other stakeholders. Evaluation designs help evaluators and their partners think about and structure evaluations, and help them answer three critical questions: What? So What? Now What? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

20 Good Evaluation Designs Include the Following
Summary information about the program
The questions to be addressed by the evaluation
The data collection strategies that will be used
The individuals who will undertake the activities
When the activities will be conducted
The products of the evaluation (who will receive them and how they should be used)
Projected costs to do the evaluation
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

21 Evaluation Questions Get you Started
Focus and drive the evaluation. Should be carefully specified and agreed upon in advance of other evaluation work. Generally represent a critical subset of information that is desired. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

22 Evaluation Questions: Criteria
It is possible to obtain data to address the questions. There is more than one possible “answer” to the question. The information to address the questions is wanted and needed. It is known how resulting information will be used internally (and externally). The questions are aimed at changeable aspects of programmatic activity. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

23 Evaluation Questions: Advice
Limit the number of questions Between two and five is optimal Keep it manageable

24 Evaluation Questions: Advice
Limit the number of questions Between two and five is optimal Keep it manageable PAGE 12 What are staff and participant perceptions of the program? How and to what extent are participants progressing toward desired outcomes? Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

25 EVALUATION QUESTION EXAMPLES
What was the quality of the professional development provided to participating teachers and other school administrators and staff, specifically through the Summer Institutes, Quarterly Meetings, and Monthly TA Meetings? How and to what extent was this maintained or expanded as the project expanded in Year 2?
How and to what extent did the professional development inform teachers' content knowledge and influence change in teacher practice? To what extent did teachers embrace and use the Earth Force Process, and to what extent were they able to help students enhance their STEM knowledge through service learning?
What was the extent of implementation fidelity in the classroom with respect to following the service learning model presented to teachers through the summer institute and the ongoing technical assistance? How and to what extent was this sustained, enhanced or eroded during expansion of the project?
To what extent did students and parents/guardians value the use of the Earth Force Process/Service Learning as a STEM learning strategy?

26 What do you need to know about a program …
What do you need to know about a program before you design an evaluation? What is/are the purpose(s) of the program? What stage is the program in (new, developing, mature, phasing out)? Who are the program clients? Who are the key program staff (and, where applicable, in which department is the program)? What specific strategies are used to deliver program services? What outcomes are program participants expected to achieve? Are there any other evaluation studies currently being conducted regarding this program? Who are the funders of the program? What is the total program budget? Why was this program selected for evaluation? Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

27 What is Evaluative Thinking?
Evaluative Thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational decisions and other actions. MARCIE Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

28 What Are Key Components of Evaluative Thinking?
1. Asking questions of substance 2. Determining data needed to address questions 3. Gathering appropriate data in systematic ways 4. Analyzing data and sharing results 5. Developing strategies to act on findings Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

34 Building Evaluation Capacity Session 2 Evaluation Logic
Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

35 How is Evaluative Thinking Related to Organizational Effectiveness?
Organizational effectiveness = the ability of an organization to fulfill its mission through sound management, strong governance, and persistent rededication to achieving results (Grantmakers for Effective Organizations). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

36 What Are Key Components of Evaluative Thinking?
1. Asking questions of substance 2. Determining data needed to address questions 3. Gathering appropriate data in systematic ways 4. Analyzing data and sharing results 5. Developing strategies to act on findings Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

37 How is Evaluative Thinking Related to Organizational Effectiveness?
To determine effectiveness, an organization MUST have evaluative capacity: evaluation capacity/skills, and the ability to transfer those skills to organizational competencies (i.e., evaluative thinking in multiple areas). Organizational capacity areas where evaluative thinking is less evident are also the capacity areas that usually need to be strengthened: Mission, Strategic Planning, Leadership, Governance, Finance, Fund Raising/Fund Development, Business Venture Development, Technology Acquisition & Training, Client Interaction, Marketing & Communications, Program Development, Staff Development, Human Resources, Alliances & Collaboration. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

38 What is Needed to Conduct Evaluation?
Specify evaluation questions Develop an evaluation design Apply evaluation logic Collect and analyze data Summarize and share findings Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

39 Logic Model Overview Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

40 What is a Logic Model? A Logic Model is a simple description of how a program is understood to work to achieve outcomes for participants. It is a process that helps you to identify your vision, the rationale behind your program, and how your program will work. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

41 Summarizing Logic Models . . .
Can be useful for program planning, evaluation and fund development. Can be used to build consensus on the program’s design and operations. Can be done to show programs currently or optimally. Can help develop a realistic picture of what can be accomplished. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

42 What's the Difference Between a Logic Model and a Theory of Change?
A Logic Model is a widely used tool that graphically presents specific details of an individual program's inputs, activities and outcomes. A Theory of Change is a model designed to link outcomes and activities to explain how and why desired change is expected to come about. The terms are sometimes used interchangeably, but they are actually different tools. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

43 A Theory of Change . . . Is generally more useful for a whole organization or a collection of programs/strategies in a department (or initiative). Is a causal model that shows underlying assumptions and clarifies necessary preconditions that must be met before long-term outcomes can be achieved. Often includes components to describe internal and external context. Adapted from Steven LaFrance, 2009 Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

44 Use of Logic Models and Theory of Change
Use a Logic Model to: present a quick and simple representation of something; show basic inputs, activities and outcomes and guide basic evaluation; summarize a more complex undertaking into basic categories. Use a Theory of Change to: design or summarize a complex initiative; evaluate appropriate outcomes at the right time in the right sequence; explain more precisely why an initiative did or did not work. (Adapted from Clarke and Anderson, 2004) Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

45 To Construct a Logic Model You Must Describe:
Inputs: resources, money, staff/time, facilities, etc.
Activities: how a program uses inputs to fulfill its mission, i.e., the specific strategies and service delivery.
Outputs: tangible, direct products of program activities.
Outcomes: changes to individuals or populations during or after participation. It's easiest to embed targets here (on the simple form).
Indicators: specific characteristics or changes that represent achievement of an outcome. Targets can be embedded here.
Targets: specify the amount or level of outcome attainment that is expected, hoped for or required.
Inputs → Activities → Outputs → Outcomes (w/ targets) → Indicators (w/ targets)
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
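
For readers who find a worked example helpful, here is a minimal sketch (not from the Guide) of how these components might be captured in code; the program, entries, and target shown are hypothetical.

```python
# Hypothetical sketch: the logic model components for a fictional job-training
# program captured as a simple Python data structure.
logic_model = {
    "inputs":     ["2 FTE staff", "training space", "curriculum", "program budget"],
    "activities": ["6 weekly soft-skills classes", "job placement assistance"],
    "outputs":    ["number of classes delivered", "number of participants placed"],
    "outcomes": [{
        "outcome":    "Participants gain and keep employment",
        "indicators": ["length of time participants keep their jobs"],
        "target":     "70% of placed participants still employed after 6 months",
    }],
}

# Print each component so the model can be reviewed with stakeholders.
for component, entries in logic_model.items():
    print(component.upper(), entries, sep=": ")
```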

46 Logic Model Assessment Answer these questions to review your model
Does the model seem logical? Is it clear? Is it comprehensive? Are the outcomes appropriate, useful, and likely to be accepted? Are targets reasonable? Are activities sufficient in number, duration and intensity, and doable given project inputs? Are any activities unrelated or missing? Do inputs seem sufficient? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

47

48 Logic Models Can Incorporate Context and Assumptions
Ask yourself: do the outcomes seem reasonable given the program activities? Do the assumptions resonate with me and my experiences? Are there gaps in the strategy?
Contextual Analysis: identify the major conditions and reasons for why you are doing or could do this work.
Assumptions: Why are conditions like this? What can address these conditions and improve the situation?
Inputs: What resources do we need, can we dedicate, or do we currently use for this project?
Activities: What can or do we do with these inputs to fulfill the program mission?
Short-term Outcomes: What benefits can or do we expect for participants during and after the program? New knowledge? Increased skills? Changed attitudes? Modified behavior? Improved condition? Altered status?
Assumptions: How could or should these activities and short-term outcomes lead to or contribute to our desired long-term outcomes?
Longer-term Outcomes: What do we think happens ultimately? How does or can this contribute to organizational and community value?
Pathway Map adapted from OMG. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

49 Let’s analyze an example logic model
Ask yourself: do the outcomes seem reasonable given the program activities? Do the assumptions resonate with me and my experiences? Are there gaps in the strategy?
Contextual Analysis: People in my community have few job skills, are likely to have bad jobs or no jobs, and have limited job histories. They have few opportunities for job training, placement, or help to deal with issues that come up while on the job.
Assumptions: Jobs exist, we just have to help people find them. The absence of a job history perpetuates unemployment. Education can help people improve their skills. Being able to ask a mentor for advice is useful. Job seekers need help with soft skills and technical training. Personal, one-on-one attention and classes can inspire and support people in keeping jobs and establishing job histories. Getting solid hard and soft skills is the first step to keeping a job. If people feel supported, they will keep working.
Activities: Provide 6 weekly soft skills classes. Identify on-the-job training opportunities and assist participants with placement. Conduct 6 months of on-the-job supervised training and lunchtime mentoring sessions.
Short-term Outcomes: Participants learn specific marketable skills and strategies to help them get and keep jobs. Participants establish trusting relationships with mentors who can answer questions and support them while they are involved in on-the-job training.
Longer-term Outcomes: Participants maintain their employment and establish records that increase the likelihood for continuous work and better jobs.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

50 Important Things to Remember
There are several different approaches and formats for logic models. Not all programs lend themselves easily to summarization in a logic model format. The relationships between inputs, activities and outcomes are not one to one. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

51 Important Things to Remember
Logic models are best used in conjunction with other descriptive information or as part of a conversation. When used for program planning, it is advisable to start with outcomes and then determine what activities will be appropriate and what inputs are needed. It is advisable to have one or two key project officials summarize the logic model. It is advisable to have multiple stakeholders review the LM and agree upon what is included and how. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

52 Outcomes, Indicators and Targets
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

53 Logical Considerations - Planning
1. Think about the results you want. 2. Decide what strategies will help you achieve those results. 3. Think about what inputs you need to conduct the desired strategies. 4. Specify outcomes, identify indicators and targets** (DECIDE IN ADVANCE HOW GOOD IS GOOD ENOUGH). 5. Document how services are delivered. 6. Evaluate actual results. **use caution with these terms Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

54 Logical Considerations - Evaluation
4. Identify outcomes, indicators and targets** (DECIDE IN ADVANCE HOW GOOD IS GOOD ENOUGH). 5a. Collect descriptive information about participants. 5b. Document service delivery. 6. Evaluate actual results. **use caution with these terms Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

55 Outcomes Changes in attitudes, behavior, skills, knowledge, condition or status. Must be: Realistic and attainable Related to core business Within program’s sphere of influence Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

56 Outcomes: Reminders
Outcomes are time-sensitive: programs have more influence on more immediate outcomes. There is usually more than one way to get an outcome. Outcomes are closely related to program design; program changes usually = outcome changes. Positive outcomes are not always improvements (maintenance, prevention). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

57 Indicators Specific, measurable characteristics or changes that represent achievement of an outcome. Indicators are: Directly related to the outcome, help define it Specific, measurable, observable, seen, heard, or read Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

58 Indicator: Reminders
Most outcomes have more than one indicator. Identify the set of indicators that accurately signal achievement of an outcome (get stakeholder input). When measuring prevention, identify meaningful segments of time and check indicators during that time. Indicators should be Specific, Measurable, Achievable, Relevant, Timebound (SMART). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

59 Targets
Targets specify the amount or level of outcome attainment expected, hoped for or required. Targets can be set relative to external standards (when available), past performance/similar programs, or professional hunches. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

60 Target: Reminders Targets should be specified in advance, require buy in, and may be different for different subgroups. Carefully word targets so they are not over or under-ambitious, make sense, and are in sync with time frames. If target indicates change in magnitude – be sure to specify initial levels and what is positive. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

61 Let’s “break it down” Use the “I’ll know it when I see it” rule
The BIG question is: what evidence do we need to see to be convinced that things are changing or improving? The "I'll know it (outcome) when I see it (indicator)" rule in action, an example: I'll know that retention has increased among home health aides involved in a career ladder program when I see a reduction in the employee turnover rate among aides involved in the program, and when I see survey results that indicate that aides are experiencing increased job satisfaction. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

62

63 “I’ll know it when I see it”
I’ll know that economic stability has increased among the clients I place in permanent employment when I see an increase in the length of time that clients keep their jobs and when I see an increase in the number of clients who qualify for jobs with benefits I’ll know my clients are managing their nutrition and care more effectively when I see my clients consistently show up for scheduled medical appointments and when I see decreases in my clients’ BMIs Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

64 Examples? of Indicators
MENTORING PROGRAM. Outcomes: increase in educational, social, and occupational functioning; increase in positive/healthy behaviors; improved relationship quality with parents, peers, and other adults. Process Indicators: number of mentors recruited/matched; frequency and duration of meetings; types of activities; length of matches. Outcome Indicators: increased attendance at school; increased academic motivation; decreased at-risk behaviors; more positive interaction with parents, peers and adults; HS graduation/enrollment in post-secondary education.
SOCIAL WORK PROGRAM. Outcomes: improved communication skills; improved relationships; increased positive behaviors; improved life skills. Process Indicators: number of meetings (individual/family/school); duration of meetings; meeting attendance; quality of staff and materials. Outcome Indicators: effective expression of thoughts and feelings; more positive interaction with peers and adults; reduced/no incidents of illegal behavior.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

65 Examples of Indicators with Time References
Initial outcome: Teens are knowledgeable of prenatal nutrition and health guidelines. Indicator: program participants are able to identify food items that are good sources of major dietary requirements.
Intermediate outcome: Teens follow proper nutrition and health guidelines. Indicators: participants are within proper ranges for prenatal weight gain; participants abstain from smoking; participants take prenatal vitamins.
Longer-term outcome: Teens deliver healthy babies. Indicator: newborns weigh at least 5.5 pounds and score 7 or above on the APGAR scale.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

66 What if you saw results like these?
Ryan White Services, State of CT: RESULTS (desired outcome, with observed 2009 and 2010 values where shown)
* 65% of clients show slowed or prevented disease progression at 6 and 12 months: 83%, 87%
* 75% of clients are fully engaged in HIV primary medical care: 96%
* 80% of clients show progress in 2 or more areas of service plan: 90%, 94%
* 50% of clients with mental health issues show improvement in mental health function by 6 months: 97%
* 75% of clients enrolled in SA treatment decrease use of drugs/alcohol after accessing services: 93%, 92%
* 90% of clients show improved or maintained oral health at 6 and 12 months: (values not shown)

67 Outcome, Indicator, Target - EXAMPLE
Outcome (with target): 65% of clients show slowed or prevented disease progression at 6 and 12 months. Indicators: sustained CD4 counts within 50 cells; viral loads <5000.
Outcome (with target): 50% of clients with MH issues show improvement at 3 months, by 6 months or at program end. Indicator: maintaining or decreasing mental health distress symptoms from baseline to follow-up using the SDS.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

68 Outcome, Indicator, Target - EXAMPLE
Outcome: Participants will be actively involved in program activities. Indicators (with targets): at least 500 participants will be enrolled each month; participants will attend 70% or more of all available sessions; at least half of participants will participate in 100 or more hours per cycle. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

69

70 The Innovation Network Version
Bruner Foundation Rochester, New York Anita Baker, Beth Bruner

71 Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

72 Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

73 State level logic model: Reducing and preventing youth tobacco use
[State-level logic model graphic.] Inputs (funding; health department; coalition members; local, regional and state partners) support activities such as promoting community involvement in restricting tobacco access to youth, facilitating youth involvement in policy change, promoting school- and community-based prevention programs and policies, and promoting youth cessation services and policies. These reach communities, parents and caretakers, law enforcement, retailers, schools, youth-serving organizations, community organizations, businesses, policy makers, adults and youth. Outcomes progress from increased awareness, knowledge, skills and commitment, through increased compliance with and enforcement of laws and policies, adoption of policy changes that involve youth, adoption of effective prevention programs, and decreased access and supply of tobacco to minors, to social norms less supportive of youth tobacco use, delayed average age at first use, reduced initiation, and reduced morbidity and mortality. Reference: Wisconsin Tobacco Control Program Work Group, including Christine Dobbe and Jennifer Leahy of UW-Extension; based on CDC guidelines and best practices, CIA Sub-Group materials (technical assistance packet, reviewers, suggested goals, 2003), and the DPH 2004 Boundary Statement and template objectives. See also the Treating Tobacco Addiction Youth Logic Model. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

74 Anita M. Baker Evaluation Services
Building Evaluation Capacity Session 3 DOCUMENTING SERVICE DELIVERY DATA COLLECTION OVERVIEW SURVEY DEVELOPMENT Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

75 What is Needed to Conduct Evaluation?
Specify evaluation questions Develop an evaluation design Apply evaluation logic Collect and analyze data Summarize and share findings Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

76 Documenting Service Delivery (Implementation)
Implementation = following a design to deliver planned strategies You must be able to: Accurately describe what a program looks like in operation Determine if the description matches the intended program design. Implementation Outcomes Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

77 Implementation Assessment
Review documents (e.g., program descriptions, proposals, logic models). Conduct observations (to determine fidelity and quality). Conduct interviews (ask about context and critical features). Collect directly-reported data (e.g., surveys, activity logs). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

78 Documenting Implementation: Focus on the Following
Background and Contextual Information Origin of the program Nature of the program sites (demographics, breadth of participation) How need for the program was determined Historical background of the program Background, qualifications and activities of program personnel Administrative features (including finances) Critical Features Target group Activities, schedule, organization Frequency, duration ***Barriers or Problems*** Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

79

80 How are evaluation data collected?
Surveys, interviews, observations, and record reviews: all have limitations and benefits, all can be used to collect either quantitative or qualitative data, and all require preparation on the front end (instrument development and testing, administration plan development, analysis plan development). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

81 Evaluation Data Collection Options
Qualitative and quantitative data collection options:
Interviews: conducting guided conversations with key people knowledgeable about a subject.
Record Review: collecting and organizing data about a program or event and its participants from outside sources.
Surveys: administering a structured series of questions with discrete choices.
Focus Groups: facilitating a discussion about a particular issue/question among people who share common characteristics.
External Record Review: utilizing quantitative data that can be obtained from existing sources.
Observations: documenting visible manifestations of behavior or characteristics of settings.
We will discuss surveys, observations, focus groups, interviews, and record reviews in depth.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

82 Initial Thoughts about . . . . Data Collection
Who will you collect data about? Clients, caregivers, other service providers working with clients, staff, some other group? Who are considered participants of your program? Be sure to clearly specify your eval. target population. What instruments do you need? Surveys, interview guides, observation checklists and/or protocols, record extraction or record review protocols? Are there any pre-tested instruments (e.g., scales for measuring human conditions and attitudes)? If not, how will you confirm validity? Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

83 Increasing Rigor in Program Evaluation
Validity and Reliability (target diagram: reliable but not valid; valid but not reliable; neither valid nor reliable; valid and reliable). Rigor is increased through mixed methodologies, multiple perspectives/sources of data, and multiple points in time. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

84

85 Survey Result Example
Percent of Training Participants (N=93) who Think AAV Helped or Will Help Them (Some / A Lot / TOTAL):
Discuss issues of violence with clients: 45% / 55% / 100%
Provide positive interventions for clients: 32% / 65% / 97%
Understand the importance of self-care/stress reduction: 38% / 58% / 96%
Access additional strategies for self-care/stress reduction: 47% / 51% / 98%
Offer clients new ways to de-escalate situations: 31% / 67%
Offer clients new ways to manage anger: 54% / 43%
Offer clients new ways to do safety planning: 52% (other values not shown)
Offer clients new ways to conduct bystander interventions: 39% (other values not shown)
Target = 50% or more say "a lot" to each. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

86 Surveys Series of items with pre-determined response choices
Surveys can be completed by an administrator or by respondents, and can be conducted paper/pencil, by phone, over the internet (e-survey), or using alternative strategies. Instruments are called surveys, "evaluations," or questionnaires. USE SURVEYS TO: study attitudes and perceptions; collect self-reported assessments of changes in response to a program; collect program assessments; collect some behavioral reports; test knowledge; determine changes over time (PRE/POST). Use caution with grand claims. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

87 Surveys Are Most Productive When They Are:
Well targeted, with a narrow set of questions Used to obtain data that are otherwise hard to get. Used in conjunction with other strategies. Surveys are best used: with large numbers, for sensitive information, for groups that are hard to collect data from Most survey data are qualitative but simple quantitative analyses are often used to summarize responses. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

88 Surveys can be administered and analyzed quickly when . . .
pre-validated instruments are used; sampling is simple or not required; the topic is narrowly focused; the number of questions (and respondents*) is relatively small; and the need for disaggregation is limited. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

89 Benefits of Surveys Can be used for a variety of reasons such as exploring ideas or getting sensitive information. Can provide information about a large number and wide variety of participants. Analysis can be simple. Computers are not required. Results are compelling, have broad appeal and are easy to present. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

90 Drawbacks of Surveys
Designing surveys is complicated and time consuming, and the use of existing instruments is limited. The intervention effect can lead to false responses, or it can be overlooked. Broad questions and open-ended responses can be difficult to use. Analyses and presentations can require a great deal of work. You MUST be selective! Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

91 Developing Survey Instruments
Identify key issues or topics. Review available literature, other surveys. Convert key issues into questions, identify answer choices. Determine what other data are needed, add questions accordingly. Determine how questions will be ordered and formatted. ADD DIRECTIONS. Have survey instrument reviewed. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

92 For Survey Items, Remember:
State questions in specific terms, use appropriate language. Use multiple questions to sufficiently cover topics. Avoid “double-negatives.” Avoid asking multiple questions in one item (and). Be sure response categories match the question, are exhaustive and don’t overlap. Be sure to include directions, check numbering, formatting etc. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

93

94 Assessing Survey Instruments
Are questions comprehensive without duplication, exhaustive without being exhausting? Do answer choices match question stem, provide coverage, avoid overlap? Are other data needs (e.g., characteristics of respondent) addressed? Do order and formatting facilitate response? Are directions clear? Does the survey have face validity? Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

95

96 Survey Result Example How would you rate Session 2 overall?
BEC Session 2 - January 2012. How would you rate Session 2 overall? (Response percent, response count)
Not So Good: (not shown)
Okay: 15% (5)
Very Good: 71% (24)
Excellent: (not shown)
Answered question: 34. Skipped question: 1.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

97 SOAR Afterschool Program Teacher Survey Results
School 1 N= 24 School 2 N=14 School 3 Teacher Awareness: % who . . . Know their school has SOAR After School Prg 100% 93% Can ID SOAR Leaders 78% 83% Feel somewhat/very conf. describing activities 34% 85% 57% Teacher Involvement: % who . . . Have a role in SOAR 30% 67% 50% Have been involved with SOAR 35% 58% Progress and Impacts: % who . . . Describe progress as somewhat/very noticeable Feel SOAR efforts result in positive outcomes 36% 75% 43% Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

98 Survey Result Example % of 2005-06 Freshman who . . . Peer Study Group
Disaggregated Data: % of Freshmen who . . . (Peer Study Group Yes, n=232 / No, n=247 / Total, N=479)
Reported struggling to maintain grades: 36% / 58% / 47%
Are planning to enroll for the sophomore year at this school: 89% / 72% / 80%
Note: A total of 1000 Freshmen were enrolled, about half of whom were involved in Peer Study groups.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

99 Types of Surveys Mail Surveys (must have correct addresses and return instructions, must conduct tracking and follow-up). Response is typically low. Electronic Surveys (must be sure respondents have access to internet, must have a host site that is recognizable or used by respondents; must have current addresses). Response is often better. Web + (combining mail and e-surveys). Data input required, analysis is harder. The type of survey to administer depends on the type of information that is being gathered, how much access there is to respondents, and how much time and resources are available. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

100 Types of Surveys Phone Surveys (labor intensive and require trained survey administrators, access to phone numbers, usually CATI software). Response is generally better than mail, but must establish refusal rules. Staged Surveys (trained survey administrators required, caution must be used when collecting sensitive info). Can be administered orally, multiple response options possible, response rates very high. Intercept Surveys (require trained administrators). Refusal is high. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

101 Sampling Surveys are not always administered to every member of a group (population). Often, some members, a sample, are selected to respond. Convenience Samples. Provide useful information to estimate outcomes (e.g. 85% of respondents indicated the program had definitely helped them) Must be used cautiously, generalization limited. Random Samples. Everyone must have equal opportunity. Careful administration and aggressive follow-up needed. Generalization/prediction possible. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

102 How Many Surveys Do you Need to Administer?
The sample should be as large as probabilistically required (probability, not percentage). If a population is smaller than 100, include them all. When a sample is comparatively large, adding cases does not increase precision. When the population size is small, relatively large proportions are required, and vice versa. You must always draw a larger sample than needed to accommodate refusal: number to administer = desired sample size / (1 - refusal proportion). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

103 Survey Result Example % of 2005-06 Freshman who . . . Peer Study Group
Disaggregated Data: % of Freshmen who . . . (Peer Study Group Yes, n=232 / No, n=247 / Total, N=479)
Reported struggling to maintain grades: 36% / 58% / 47%
Are planning to enroll for the sophomore year at this school: 89% / 72% / 80%
Note: A total of 1000 Freshmen were enrolled, about half of whom were involved in Peer Study groups.
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

104 How Big Should Your Sample Be?
Identify the population size, desired confidence and sampling error thresholds. 95% confidence with 5% error is common. With the right sample size you can be 95% confident that the answer given by respondents is within 5 percentage points of the answer if all members of the population had responded. Use this formula: n=385/(1+(385/all possible respondents)). OR Consult a probability table (see manual). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
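
As a small illustration, the formula on this slide can be combined with the refusal adjustment from the earlier "How Many Surveys Do You Need to Administer?" slide; this is only a sketch, and the population size and refusal proportion used here are hypothetical.

```python
import math

def sample_size(population: int) -> int:
    """Approximate sample size for 95% confidence and +/-5% sampling error,
    using the slide's formula: n = 385 / (1 + 385 / population)."""
    if population < 100:
        return population      # slide guidance: include everyone in small populations
    return round(385 / (1 + 385 / population))

def surveys_to_administer(desired_n: int, refusal_proportion: float) -> int:
    """Inflate the sample for expected refusals: desired n / (1 - refusal proportion)."""
    return math.ceil(desired_n / (1 - refusal_proportion))

population = 1200                         # hypothetical number of possible respondents
n = sample_size(population)               # about 292
print(n, surveys_to_administer(n, 0.20))  # about 365 surveys if 20% are expected to refuse
```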

105 How Many Surveys Do you Need to Administer?
The sample should be as large as probabilistically required (probability, not percentage). If a population is smaller than 100, include them all. When a sample is comparatively large, adding cases does not increase precision. When the population size is small, relatively large proportions are required, and vice versa. You must always draw a larger sample than needed to accommodate refusal: number to administer = desired sample size / (1 - refusal proportion). Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

106 Increasing Response Rate
Write a good survey and tailor administration to respondents. Advertise survey purpose and administration details in advance. Carefully document who receives and completes surveys. Aggressively follow-up. Send reminders. Consider using incentives. Make response easy. Remember: Non-response bias can severely limit your ability to interpret and use survey data. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

107 Calculating Response Rates
Response rate is calculated by dividing the number of returned surveys by the total number of “viable” surveys administered. Desirable response rates should be determined in advance of analysis and efforts should be made to maximize response. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
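
A minimal illustration of the calculation described above; the counts used are hypothetical.

```python
def response_rate(returned: int, viable_administered: int) -> float:
    """Response rate = returned surveys / viable surveys administered."""
    return returned / viable_administered

# Hypothetical example: 140 of 200 viable surveys were returned, a 70% response rate.
print(f"{response_rate(140, 200):.0%}")
```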

108

109 Things to Think about Before Administering a Survey
Target group: who, where, sampling? Respondent assistance, A/P consent Type of survey, frequency of administration Anonymity vs. Confidentiality Specific fielding strategies, incentives? Time needed for response Tracking administration and response Data analysis plans Storing and maintaining confidentiality Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

110

111 After School Program Feedback
Table 4a: Percent of Respondents Who Thought Participation in Theatre Classes and the Spring Production Helped* Them in the Following Ways * Some or A lot 9th Grade n=71 10/11th Grade n=97 Work collaboratively with others 90% (41%) 95% (58%) Try new things 85% (37%) 96% (58%) Listen actively 84% (37%) 89% (55%) See a project through from beginning to end 79% (32%) 81% (39%) Learn to value others’ viewpoints 71% (33%) 78% (29%) Become more confident in front of others 68% (35%) 82% (46%) Use an expanded vocabulary 67% (21%) 72% (28%) With memorization 63% (29%) 78% (40%) Express yourself with words 63% (16%) 83% (35%) Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

112 Steps to Take When Analyzing Quantitative Data
Develop an Analysis Plan Code and Enter Data Verify Data Entry (randomly or x%) Prepare Data for Analysis Conduct Analyses According to the Plan Develop Tables, Figures and Narrative Summaries to Display Results of Analysis Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
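
As an illustration of several of these steps (entering and verifying data, preparing it, and analyzing according to a plan), here is a small pandas sketch; the file name, column name, and response categories are hypothetical.

```python
import pandas as pd

# Hypothetical survey data file with one row per respondent and an overall rating column.
df = pd.read_csv("session2_survey.csv")

# Verify data entry on a random subset (spot-check these rows against the paper surveys).
print(df.sample(frac=0.10, random_state=1))

# Prepare data for analysis: put the rating into a fixed category order.
rating_order = ["Not So Good", "Okay", "Very Good", "Excellent"]
df["overall_rating"] = pd.Categorical(df["overall_rating"], categories=rating_order)

# Conduct analyses according to the plan: frequencies and percentages.
counts = df["overall_rating"].value_counts(sort=False)
percents = (df["overall_rating"].value_counts(normalize=True, sort=False) * 100).round(1)

# Develop a simple table to display the results.
print(pd.DataFrame({"count": counts, "percent": percents}))
```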

113 FCPS Selected Findings
Participants reported that plans are useful and used. % Agreeing, 2003 (N=65):
Written plan offers options that can be implemented: 100%
Letter of intent will guide future caregivers: (not shown)
Intend to use the information to plan for loved one's needs: 98%
Would recommend FCPS to a friend: (not shown)

114 FCPS Selected Findings
Participants consistently reported that plans are useful and used. % Agreeing (2003 N=65, 2005 N=88, 2007 N=89, 2009 N=133, TOTAL N=988):
Written plan offers options that can be implemented: 100%
Letter of intent will guide future caregivers: 99%
Intend to use the information to plan for loved one's needs: 98%
Would recommend FCPS to a friend: 96%
Fewer than 3% of customers indicated they were unsatisfied with any of the above.

115 Anita M. Baker Evaluation Services
Building Evaluation Capacity Session 4 Surveys (e-Surveys), Record Reviews and Quantitative Analysis Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

116 Things to Think about Before Administering a Survey
Target group: who, where, sampling? Respondent assistance, A/P consent Type of survey, frequency of administration Anonymity vs. Confidentiality Specific fielding strategies, incentives? Time needed for response Tracking administration and response Data analysis plans Storing and maintaining confidentiality Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

117 E-Surveys – Primary Uses
Collecting survey data Alternative Administration Increases ease of access for some Generating hard copy surveys Entering and analyzing data Acct: BECAccount Password: Evalentine Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

118 E-Surveys – Key Decisions
What Question types do you need? How will they be displayed? Do you need an “other” field? Should they be “required?” How will you reach respondents? How will you conduct follow-up? Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

119

120 Record Reviews: Accessing existing internal information, or information collected for other purposes. Can be focused on: own records; records of other organizations; adding questions to existing documents. Instruments are called protocols. USE RECORD REVIEW TO: collect some behavioral reports; conduct tests or collect test results; verify self-reported data; determine changes over time. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

121 Collecting Record Review Data
Review existing data collection forms (suggest modifications or use of new forms if possible). Develop a code book or at least a data element list keyed to data collection forms. Develop a “database” for record review data. Develop an analysis plan with mock tables for record review data. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
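
For readers working in code, here is a minimal sketch of what a data element list (codebook) and a simple record review "database" might look like; the fields and codes shown are hypothetical, not taken from the Guide.

```python
import pandas as pd

# Hypothetical data element list (codebook) keyed to an intake form.
codebook = {
    "participant_id":     "Unique ID assigned during extraction (preserves anonymity)",
    "intake_date":        "Date on the intake form (YYYY-MM-DD)",
    "age_at_intake":      "Age in years; 999 = missing",
    "primary_disability": "1=Neurological 2=Developmental/Cognitive 3=Physical "
                          "4=Chronic Disease/Illness 5=Psychiatric 6=Sensory 7=Other",
}

# A simple 'database' for record review data, one row per extracted record.
records = pd.DataFrame(columns=list(codebook.keys()))
records.loc[0] = ["P001", "2010-03-15", 42, 1]   # one hypothetical extracted record
print(records)
```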

122 Record Review Analysis Example
CDR EF MHA MS CENTRAL TOTAL Number of Participants AGE at INTAKE (Convert to %s) 17 and Younger 18 – 21 22 – 34 35 – 49 50 – 64 65 and Older PRIMARY DISABILITY (%s) Neurological Developmental/Cognitive Physical Chronic Disease/Illness Psychiatric Sensory Other Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

123 Record Review Example: Descriptive
CDR EF MHA MS CENTRAL TOTAL Number of Participants 32 45 33 43 157 310 AGE at INTAKE 17 and Younger 3% 4% 10% 7% 18 – 21 13% 47% 20% 22 – 34 29% 19% 18% 17% 35 – 49 39% 27% 34% 40% 28% 30% 50 – 64 36% 22% 38% 23% 65 and Older 9% PRIMARY DISABILITY Neurological 60% 98% Developmental/Cognitive 31% 78% 43% Physical 6% 2% Chronic Disease/Illness 1% Psychiatric 97% 11% Sensory Other Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

124 Record Review Example: Evaluative
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

125 Sources of Record Review Data
Available Administrative Data:
Intake Forms
Attendance Rosters
Program Logs (e.g., daily activity descriptions)
Evaluation Forms (e.g., customer satisfaction surveys, session assessments)
Case Files or Case Management Data (these may include both internal data, such as progress toward internally established goals, and external data, such as reports about a participant's living arrangements, employment or childbearing status)
Exit or Follow-up Data
Assessments (these may also include both internal data, such as culminating knowledge measurements at the end of a cycle, and external data, such as test scores, report card grades, scale scores on a behavioral scale, or medical or substance use test results)
Other Extant Data:
Census Data, available on the internet, in libraries or by demand from marketing firms
Vital Statistics, also available on the internet, in libraries and from local health departments
Topical Outcome Data, e.g., crime statistics, birth outcomes, juvenile arrest data
KIDS COUNT child well-being indicators
National survey data, e.g., NELS, NLS, YRBS
Community Profile Data
UI (unemployment insurance) data
Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

126

127 What Happens After Data are Collected?
1. Data are analyzed and results are summarized. 2. Findings must be converted into a format that can be shared with others. 3. Action steps should be developed from findings. Step 3 moves evaluation from perfunctory compliance into the realm of usefulness: "Now that we know _____ we will do _____." Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

128 Important Data-Related Terms
Data can exist in a variety of forms Records: Numbers or text on pieces of paper Digital/computer: Bits and bytes stored electronically Memory: Perceptions, observations or facts stored in a person’s mind Qualitative, Quantitative Primary v. Secondary Data Variables (Items) Unit of Analysis Duplicated v. Unduplicated Unit Record (Client-level) v. Aggregated May reach logical conclusions rather than definitive answers Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

129 Analyzing Quantitative Data: A Few Important Terms*
Case: individual record (e.g., 1 participant, 1 day, 1 activity) Demographics: descriptive characteristics (e.g., gender) Disaggregate: to separate or group information (e.g., to look at data for males separately from females) – conducting crosstabs is a strategy for disaggregating data. Partition(v): another term that means disaggregate. Unit of Analysis: the major entity of the analysis – i.e., the what or the whom is being studied (e.g., participants, groups, activities) Variable: something that changes (e.g., number of hours of attendance) *common usage Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

130 Plan your Analysis in Advance!
What procedures will be conducted with each set of data, and who will do them? How will data be coded and recoded? How will data be disaggregated (i.e., "broken out," for example by participant characteristics or time)? How will missing data be handled? What analytical strategies or calculations will be performed (e.g., frequencies, cross-tabs)? How will comparisons be made? Is statistical testing needed, and if so, which tests? Analytical procedures include, for example, frequencies (the percent that experienced a certain factor or result) or comparisons among groups. Participant characteristics may include age, gender, and race/ethnicity; we will cover ways to break down your data in more detail when we get to the analyses later in this presentation. For example, Danielle may have a target for how many or what percentage of the moms will agree to breastfeed upon delivery. When analyzing the number of moms breastfeeding after delivery, she will want to compare it to that target. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
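One possible way (not prescribed by the Guide) to write these decisions down before any analysis begins is a simple plan structure like the sketch below; every field name, recode, and comparison listed here is hypothetical.

```python
# A minimal analysis-plan sketch; all entries are illustrative placeholders.
analysis_plan = {
    "data_set": "participant intake and exit records",
    "responsible": "program evaluator",
    "coding_recoding": ["group age into the intake age bands", "code missing values as 999"],
    "missing_data": "exclude 999 codes from valid percentages",
    "disaggregate_by": ["site", "gender", "age group"],
    "calculations": ["frequencies", "valid percentages", "cross-tabs by site"],
    "comparisons": ["results vs. program targets", "site vs. site"],
    "statistical_tests": "chi-square only if generalizing beyond the program population",
}

for decision, detail in analysis_plan.items():
    print(f"{decision}: {detail}")
```

Writing the plan in a structured form like this makes it easy to review with colleagues before any data are touched.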

131 Record Review Example: Descriptive
[Table: participant characteristics by site. Number of Participants: CDR 32, EF 45, MHA 33, MS 43, CENTRAL 157, TOTAL 310. The remaining rows show percentage distributions by site for AGE at INTAKE (17 and Younger, 18 – 21, 22 – 34, 35 – 49, 50 – 64, 65 and Older) and PRIMARY DISABILITY (Neurological, Developmental/Cognitive, Physical, Chronic Disease/Illness, Psychiatric, Sensory, Other).] Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

132 Quantitative Data Analysis: Basic Steps
Organize and arrange data (number cases as needed). Scan data visually. Code data per analysis plan. Enter and verify data. Determine basic descriptive statistics. Recode data as needed (including missing data). Develop created variables. Re-calculate basic descriptive statistics. Conduct other analyses per plan. Basic descriptive stats are frequencies (counts) and measures of central tendency (mean, mode, median, range), used to see if anything looks unusual. When visually scanning data, look for odd numbers or dates, or zeroes where they should not be. Record missing data in a standard format (use blank cells if a field currently has non-numeric data where only numeric data should be); 999 can be used as a missing-data code, just be sure to be consistent. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
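A minimal sketch of the "recode missing data, then re-calculate descriptive statistics" steps, assuming a hypothetical column of attendance hours in which 999 was used as the missing-data code:

```python
import numpy as np
import pandas as pd

# Hypothetical attendance hours, with 999 used as the missing-data code.
hours = pd.Series([12, 8, 999, 15, 10, 999, 9, 12])

# Recode the missing-data code before calculating statistics.
hours = hours.replace(999, np.nan)
valid = hours.dropna()

print("n (valid):", int(valid.count()))        # 6
print("mean:", valid.mean())                   # 11.0
print("median:", valid.median())               # 11.0
print("mode:", valid.mode().tolist())          # [12.0]
print("range:", valid.min(), "-", valid.max()) # 8.0 - 15.0
```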

133 Coding and Data Entry Create codebook(s) as needed (identify codes and affix them to instrument copies). Create an electronic database when possible (use Excel, SPSS, or SAS). Identify/create unique identifiers for cases and affix or enter them as needed. Enter or extract data as needed (do not recode as data are entered). Make (electronic or paper) copies of your data. Codebook: for a survey, for example, you will want to mark the codes right on a master copy of the survey and then write codes onto the completed surveys by response. Create unique IDs if preserving anonymity. Copies of data are for editing, cutting and pasting, etc.; store your master copy of the data away. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services

134 Strategies for Analyzing Quantitative Data
Important Things to Look at or Summarize. Frequencies: how often a response or status occurs. Total and Valid Percentages: (frequency/total) x 100. Measures of Central Tendency: mean, median, (mode). Distribution: minimum, maximum, groups (*iles). Cross-Tabulations: relationship between two or more variables (also called contingency analyses; can include significance tests such as chi-square analyses). Useful, 2nd-Level Procedures: means testing (ANOVA, t-tests), correlations, regression analyses. For example, we might say that the mean length of stay in a shelter is 11 days, yet the mode is 7 days (the most common length of stay) and the median is 9 days: out of 30 cases, 15 are below 9 days and 15 are above. Frequencies: of the 30 grandparents taking the survey, 20 said that the support they receive at the center helps them care for their grandchildren. Percentages: that means 67% (20/30 x 100) say that the center helps them care for their grandchildren. Ratios: 10 of the 20 women receiving prenatal counseling through the Maternity Care program chose to breastfeed after delivery. Cross-tabs: e.g., female participants were more likely to complete the program than male participants. If you are trying to say whether that relationship holds up in the larger population or is just due to chance, you need to move to inferential statistics and run what is called a chi-square test to see whether the relationship is significant; but if you are just describing the data you have (for example, data for the whole program population), you can use a cross-tab without a significance test. Bruner Foundation Rochester, New York Anita M. Baker, Evaluation Services
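The worked examples in the notes above can be reproduced in a few lines of Python. The survey counts and the gender-by-completion data below are made up solely to illustrate the calculations, and the chi-square call is shown only for the generalizing case described above.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Frequency and percentage (the grandparent survey example above).
yes_count, total = 20, 30
print(f"{yes_count} of {total} = {yes_count / total * 100:.0f}%")  # 67%

# Cross-tab of completion by gender (hypothetical counts).
completion = pd.DataFrame({
    "gender":    ["F"] * 10 + ["M"] * 10,
    "completed": ["Yes"] * 8 + ["No"] * 2 + ["Yes"] * 5 + ["No"] * 5,
})
table = pd.crosstab(completion["gender"], completion["completed"])
print(table)

# Run the significance test only when generalizing beyond the data in hand.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```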

135 Anita M. Baker Evaluation Services
Building Evaluation Capacity Session 5 Interviews, Observations, Analysis of Qualitative Data Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

136 How are Evaluation Data Collected?
Surveys Interviews Observations Record Reviews All have limitations and benefits All can be used to collect either quantitative or qualitative data Require preparation on the front end: Instrument Development and testing Administration plan development Analysis plan development Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

137 Evaluation Data Collection Options
Qualitative and quantitative data collection options. Interviews: conducting guided conversations with key people knowledgeable about a subject. Record Review: collecting and organizing data about a program or event and its participants from outside sources. Surveys: administering a structured series of questions with discrete choices. Focus Groups: facilitating a discussion about a particular issue/question among people who share common characteristics. External Record Review: utilizing quantitative data that can be obtained from existing sources. Observations: documenting visible manifestations of behavior or characteristics of settings. We will discuss surveys, observations, focus groups, interviews, and record reviews in depth. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

138 What are Qualitative Data?
Qualitative data come from surveys, interviews, observations, and sometimes record reviews. They consist of: descriptions of situations, events, people, interactions, and observed behaviors; direct quotations and ratings from people about their experiences, attitudes, beliefs, thoughts or assessments; and excerpts or entire passages from documents, correspondence, records, case histories, and field notes. Collecting and analyzing qualitative data permit study of selected issues in depth and detail and help to answer the "why" questions. Qualitative data are just as valid as quantitative data! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

139 How are evaluation data collected?
Surveys Interviews Observations Record Reviews All have limitations and benefits All can be used to collect either quantitative or qualitative data Require preparation on the front end: Instrument Development and testing Administration plan development Analysis plan development Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

140 Observations Observations are conducted to view
and hear actual program activities so that they can be described thoroughly and carefully. Observations can be focused on programs overall or participants in programs. Users of observation reports will know what has occurred and how it has occurred. Observation data are collected in the field, where the action is, as it happens. Instruments are called protocols, guides, sometimes checklists Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

141 Use Observations: To document implementation.
To witness levels of skill/ability, program practices, behaviors. To determine changes over time.

142 Trained Observers Can:
See things that may escape awareness of others Learn about things that others may be unwilling or unable to talk about Move beyond the selective perceptions of others Present multiple perspectives Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

143 Other Advantages The observer’s knowledge and direct experience can be used as resources to aid in assessment Feelings of the observer become part of the observation data OBSERVER’S REACTIONS are data, but they MUST BE KEPT SEPARATE Let’s try it . . . Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

144 Methodological Decisions: Observations
What should be observed and how will you structure your protocol? (individual, event, setting, practice) How will you choose what to see? Will you ask for a "performance" or just attend a regular session, or both? Strive for "typical-ness." Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

145 Methodological Decisions: Observations
Will your presence be known, or unannounced? Who should know? How much will you disclose about the purpose of your observation? How much detail will you seek? (checklist vs. comprehensive) How long and how often will the observations be? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

146 Conducting and Recording Observations: Before
Clarify the purpose for conducting the observation Specify the methodological decisions you have made Collect background information about the subject (if possible/necessary) Develop a specific protocol to guide your observation Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

147 Conducting and Recording Observations: During
Use the protocol to guide your observation and record observation data BE DESCRIPTIVE (keep observer impressions separate from descriptions of actual events) Inquire about the “typical-ness” of the session/event. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

148 Conducting and Recording Observations: After
Review observation notes and make clarifications where necessary. clarify abbreviations elaborate on details transcribe if feasible or appropriate Evaluate results of the observation. Record whether: the session went well, the focus was covered, there were any barriers to observation there is a need for follow-up Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

149 Observation Protocols
Comprehensive Setting Beginning, ending and chronology of events Interactions Decisions Nonverbal behaviors Program activities and participant behaviors, response of participants Checklist – “best” or expected practices Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

150 Analyzing Observation Data
Make summary statements about trends in your observations Every time we visited the program, the majority of the children were involved in a literacy development activity such as reading, illustrating a story they had read or written, practicing reading aloud. Include “snippets” or excerpts from field notes to illustrate summary points. Take a minute to read slide 14 in your notes! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

151

152 Interviews An interview is a one-sided conversation between an interviewer and a respondent. Questions are (mostly) pre-determined, but open-ended. Can be structured or semi-structured. Respondents are expected to answer using their own terms. Interviews can be conducted in person, via phone, one-on-one or in groups. Focus groups are specialized group interviews. Instruments are called protocols, interview schedules or guides Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

153 Use Interviews: To study attitudes and perceptions using respondent’s own language. To collect self-reported assessment of changes in response to program. To collect program assessments. To document program implementation. To determine changes over time. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

154 Methodological Decisions: Interviews
What type of interview should you conduct? Unstructured Semi-structured Structured Intercept What should you ask? How will you word and sequence the questions? What time frame will you use (past, present, future, mixed)? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

155 Interviews: More About Methodological Decisions
How much detail will you seek, and how long will the interviews be? Who are the respondents? (Is translation necessary?) How many interviews, on what schedule? Will the interviews be conducted in person or by phone, on- or off-site? Are group interviews possible/useful? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

156 Conducting and Recording Interviews: Before
Clarify purpose for the interview. Specify answers to the methodological decisions. Select potential respondents – sampling. Collect background information about respondents. Develop a specific protocol to guide your interview (develop an abbreviation strategy for recording answers). Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

157 Conducting and Recording Interviews: During
Use the protocol (device) to record responses. Use probes and follow-up questions as necessary for depth and detail. Ask singular questions. Ask clear and truly open-ended questions. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

158 Conducting and Recording Interviews: After
Review interview responses, clarify notes, decide about transcription. Record observations about the interview. Evaluate how it went and determine follow-up needs. Identify and summarize some key findings. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

159

160 Tips for Effective Interviewing
Communicate clearly about what information is desired, why it’s important, what will happen to it. Remember to ask single questions and use clear and appropriate language. Avoid leading questions. Check (or summarize) occasionally. Let the respondent know how the interview is going, how much longer, etc. Understand the difference between a depth interview and an interrogation. Observe while interviewing. Practice Interviewing – Develop Your Skills! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

161 More Tips Recognize when the respondent is not clearly answering and press for a full response. Maintain control of the interview and neutrality toward the content of response. Treat the respondent with respect. (Don’t share your opinions or knowledge. Don’t interrupt unless the interview is out of hand). Practice Interviewing – Develop Your Skills! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

162 Analyzing Interview Data
Read/review completed sets of interviews. Record general summaries Where appropriate, encode responses. Summarize coded data Pull quotes to illustrate findings. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

163 Analyze interviews pp. 116 - 119

164 Analysis of Qualitative Data Analytical Strategies Similar For Qualitative and Quantitative Data
Consider how you plan to use findings: who is the audience? What format works best? Plan your analysis in advance. How do the data fit within the overall evaluation plan and with other data? How will findings fit in the overall report plan? How will you code, display, and draw conclusions about data? How will you validate/verify and adjust your findings? Be careful interpreting data! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

165 There is no single process!
Analysis Plan Specifics, You Must Decide . . . What procedures will be conducted with each set of data and who will do them? How will data be partitioned? What types of codes will be applied to the data? How will comparisons be made? Data to other project data (within group) Data to expectations Data to data from other sources (across groups) There is no single process! Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

166 Steps to Take When Analyzing
Qualitative Data Segment or partition data (i.e., divide it into meaningful analytical units) Reduce data Code data Compare data Organize, summarize and display data Draw conclusions, verify/validate results Revise summaries and displays accordingly Process is Iterative Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

167 Coding Qualitative Data
A priori or deductive codes: predetermined categories based on accepted theory or program knowledge. Inductive codes: based on raw data (not predetermined). Hierarchical: larger categories with subcategories in each. You can combine inductive and deductive codes within a hierarchical coding scheme. For example, when analyzing a survey that asks about access to certain services, you may have a pre-determined code of "transportation barriers," and then come up with subcategories based purely on responses, such as the times of day the buses run or concerns that safety on the bus lines is questionable. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services
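Here is a sketch of how a hierarchical scheme might combine the a priori "transportation barriers" code from the example above with inductive subcodes drawn from responses; all code labels and the sample segment are illustrative, not part of the Guide.

```python
# A priori parent codes with inductive subcodes; labels are illustrative only.
codebook = {
    "transportation_barriers": {                                         # a priori code
        "bus_schedule": "buses do not run at the needed times of day",   # inductive
        "bus_safety": "safety on the bus lines is questionable",         # inductive
    },
    "cost_barriers": {                                                   # a priori code
        "fees": "program or service fees are unaffordable",              # inductive
    },
}

# More than one code can apply to the same segment (co-occurring codes).
segment = "I can't get there after work because the last bus leaves too early."
applied = ["transportation_barriers", "transportation_barriers/bus_schedule"]
print(segment)
print("codes:", applied)
```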

168 Coding Strategies and Reminders
Keep a master list of codes Distinguish a priori and inductive codes Re-apply codes to all segments Use multiple codes, but keep coding schemes as simple as possible Test out sample entries to identify potential problems before finalizing code selections Check for inter/intra coder reliability (consistency) Coding is not exact (expect differences) Co-occurring codes (more than one applies) Face-sheet codes (descriptors) Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

169 Enumeration A strategy for organizing, summarizing, and displaying qualitative data Quantify frequency of codes,* or types Use counts to define results (e.g., most responses were positive; all responses fell into 4 categories – the category most exemplified was __________). * e.g., none, some, a lot, as a percentage Bruner Foundation Rochester, New York Anita Baker, Evaluation Services
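A minimal enumeration sketch, assuming a handful of hypothetical coded interview responses: it simply counts how often each code was applied and expresses the counts as percentages of responses.

```python
from collections import Counter

# Hypothetical codes applied to five interview responses.
coded_responses = [
    ["positive", "staff_support"],
    ["positive"],
    ["negative", "transportation_barriers"],
    ["positive", "staff_support"],
    ["mixed"],
]

code_counts = Counter(code for codes in coded_responses for code in codes)
n_responses = len(coded_responses)

for code, n in code_counts.most_common():
    print(f"{code}: {n} of {n_responses} responses ({n / n_responses * 100:.0f}%)")
```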

170 Anita M. Baker Evaluation Services
Building Evaluation Capacity Session 6 Putting it All Together: Projecting Level of Effort and Cost, (Stakeholders, Reporting, Evaluative Thinking) Anita M. Baker Evaluation Services Bruner Foundation Rochester, New York

171 Good Evaluation Designs Include the Following
Summary information about the program Questions to be addressed by the evaluation Data collection strategies that will be used The individuals who will undertake the activities When the activities will be conducted Products of the evaluation (who will receive them and how they should be used) Projected costs to do the evaluation Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

172 Increasing Rigor in Program Evaluation
Mixed methodologies Multiple sources of data Multiple points in time Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

173 Thinking about . . . . Data Analysis
Putting it all together: Combining Data from Multiple Sources. Develop an overall analysis plan, guided by your evaluation questions. Clarify why you are using multiple methods. Sequence: are you using one form of data collection to inform the design of the next (e.g., informant interviews prior to record review)? To answer different questions: your different methods may be designed to answer different evaluation questions. For example, record review may be your source of information on case-worker compliance with protocols, while interviews may be your source on how they are using the new protocols. Triangulation: both sources may answer the same question. Interviews with caseworkers are one source of information on compliance with the new protocols; record reviews could be another source to substantiate the interview findings. Plan out how you will join your analysis across the methods and determine the overall findings. For example: "I will analyze the interview data to determine the extent to which caseworkers are using the new protocols and then check these results against the record review data. I will also examine the records for specific examples of the types of protocol use that caseworkers report in the interviews." Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

174 Projecting Level of Effort
LOE projections are often summarized in a table or spreadsheet. To estimate labor and time: List all evaluation tasks Determine who will conduct each task Estimate time required to complete each task in day or half-day increments (see page 77 in your manual). Bruner Foundation Rochester, New York Anita Baker, Evaluation Services
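For instance, once tasks are listed and estimated in day or half-day increments, totals by role follow directly. The tasks, roles, and day estimates in this sketch are illustrative and are not taken from the Beehives workplan shown on the following slides.

```python
# Task names, roles, and day estimates are made up for illustration.
loe = {
    "Draft survey instrument": {"Project Director": 1.0, "Project Staff": 0.5},
    "Pilot and revise survey": {"Project Director": 0.25, "Project Staff": 1.0},
    "Manage survey site":      {"Project Staff": 2.0, "Admin. Asst.": 0.5},
}

totals = {}
for task, assignments in loe.items():
    for role, days in assignments.items():
        totals[role] = totals.get(role, 0.0) + days

print("Days by role:", totals)
print("Total evaluation days:", sum(totals.values()))  # 5.25
```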

175 Projecting Level of Effort: List Tasks
Proposed Workplan for the Beehives Project, Phase I. STAFF ASSISTANCE. Submitted to: One Economy. Submitted by: Evaluation Inc. July - November, 2009. Columns: TIMELINE; Project Director; Project Staff; Admin. Asst.; Client Input. Design Draft Survey Instrument: develop draft with questions for Beehive, Money and Jobs users and draft analysis plan; review with One Economy mgmt and key staff; conference call regarding revisions/piloting. Address Incentives: discuss incentives during conference call re: revisions; devise incentives options plan, obtain incentives. Conduct Mock Survey Launch: convert draft paper survey to electronic format; review and annotate mock e-survey; launch mock survey and obtain client feedback. Launch Survey for 30 Days: make final revision to survey text, launch; develop analysis plan, obtain approvals; determine follow-up strategy; survey site management; conduct follow-up activities.

176 Projecting Level of Effort: Estimate Time
Proposed Workplan for the Beehives Project, Phase I. STAFF ASSISTANCE. Submitted to: One Economy. Submitted by: Evaluation Inc. July - November, 2009. Columns: TIMELINE; Project Director; Project Staff; Admin. Asst.; Client Input. Estimated days shown in parentheses. Design Draft Survey Instrument: develop draft with questions for Beehive, Money and Jobs users and draft analysis plan (1); review with One Economy mgmt and key staff (0.5); conference call regarding revisions/piloting (0.25). Address Incentives: discuss incentives during conference call re: revisions; devise incentives options plan, obtain incentives. Conduct Mock Survey Launch: convert draft paper survey to electronic format; review and annotate mock e-survey; launch mock survey and obtain client feedback. Launch Survey for 30 Days: make final revision to survey text, launch; develop analysis plan, obtain approvals; determine follow-up strategy; survey site management (2); conduct follow-up activities.

177 Projecting Timelines Timelines can be constructed separately or
embedded in an LOE chart (see example pp. 77- 78). To project timelines: Assign dates to your level of effort, working backward from overall timeline requirements. Be sure the number of days required for a task and when it must be completed are in sync and feasible. Check to make sure evaluation calendar is in alignment with program calendar. Don’t plan to do a lot of data collecting around program holidays Don’t expect to collect data only between 9 and 5 Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

178 Projecting Level of Effort: Identify Dates
Proposed Workplan for the Beehives Project, Phase I. STAFF ASSISTANCE. Submitted to: One Economy. Submitted by: Evaluation Inc. July - November, 2009. Columns: TIMELINE; Project Director; Project Staff; Admin. Asst.; Client Input. Estimated days and target dates shown in parentheses. Design Draft Survey Instrument: develop draft with questions for Beehive, Money and Jobs users and draft analysis plan (by 7/27; 1); review with One Economy mgmt and key staff (by 7/31; 0.5); conference call regarding revisions/piloting (by 8/5; 0.25). Address Incentives: discuss incentives during conference call re: revisions; devise incentives options plan, obtain incentives (by 8/10). Conduct Mock Survey Launch: convert draft paper survey to electronic format; review and annotate mock e-survey; launch mock survey and obtain client feedback (8/10 - 8/15). Launch Survey for 30 Days: make final revision to survey text, launch (9/1 - 9/30; by 8/26); develop analysis plan, obtain approvals; determine follow-up strategy; survey site management (as needed; 2); conduct follow-up activities (10/1 - 10/5).

179

180 Clearly Identify Audience Decide on Format
Who is your audience? Staff? Funders? Board? Participants? Multiple audiences? What presentation strategies work best? PowerPoint, newsletter, fact sheet, oral presentation, visual displays, video, storytelling, press releases. Report: full report, executive summary, stakeholder-specific report? Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

181 Think About Communication Strategies
Are there natural opportunities for sharing (preliminary) findings with stakeholders? At a special convening At regular or pre-planned meetings During regular work interactions (e.g., clinical supervision, staff meetings, board meetings) Via informal discussions Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

182 Additional Reporting Tips
Findings can be communicated in many forms: brief memos, powerpoint presentations, oral reports; a formal evaluation report is the most common. Think about internal and external reporting. Plan for multiple reports. Before you start writing, be sure to develop an outline and pass it by some stakeholders. If you're commissioning an evaluation report, ask to see a report outline in advance. If you are reviewing others' evaluation reports, don't assume they are valuable just because they are in a final form. Review carefully for the important components and meaningfulness. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

183 Components of A Strong Program Evaluation Report
Description of the subject program. Clear statement about the evaluation questions and the purpose of the evaluation. Description of actual data collection methods. Summary of key findings (including tables, graphs, vignettes, quotes, etc.). Discussion or explanation of the meaning and importance of key findings. Suggested Action Steps. Next Steps (for the program and the evaluation). Issues for Further Consideration (loose ends). These components correspond to the report sections: Introduction, Methods, Findings, Conclusions. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

184

185 Budgeting and Paying for Evaluation
Usually the cost to do good evaluation is equivalent to about 10 – 15% of the costs to operate the program effectively. Most of the funds for evaluation pay for the professional time of those who develop designs and tools, collect data, analyze data, summarize and present findings. Other expenses include overhead and direct costs associated with the evaluation (e.g., supplies, computer maintenance, communication, software) Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

186 Projecting Budgets Determine rates for all “staff” to the project.
Calculate total labor costs by multiplying LOE totals by “staff” rates. Estimate other direct costs (ODC) such as copying, mail/delivery, telephone use and facilities. Estimate any travel costs. Calculate the subtotal of direct costs including labor (fringe where appropriate), ODC and travel. Estimate additional indirect (overhead) costs, where appropriate, as a percentage applied to the direct costs. Apply any other fees where appropriate. Sum all project costs to determine total cost of project. Establish a payment schedule, billing system and deliverables. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services
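A worked version of these steps with made-up figures (rates, level-of-effort totals, other direct costs, and the overhead percentage are all hypothetical); it is a sketch of the arithmetic, not a template budget.

```python
# All rates, day totals, and cost figures below are made up for illustration.
loe_days   = {"Project Director": 6, "Project Staff": 14, "Admin. Asst.": 4}
daily_rate = {"Project Director": 800, "Project Staff": 500, "Admin. Asst.": 300}

labor = sum(loe_days[role] * daily_rate[role] for role in loe_days)  # 13,000
other_direct = 900   # copying, mail/delivery, telephone, supplies
travel = 600
direct_costs = labor + other_direct + travel                         # 14,500

indirect = 0.10 * direct_costs   # 10% overhead applied to direct costs
total = direct_costs + indirect                                      # 15,950

print(f"Labor: ${labor:,}  Direct costs: ${direct_costs:,}  Total: ${total:,.0f}")

# Check against the 10-15% guideline above: a $15,950 evaluation would fit a
# program costing roughly $106,000 to $160,000 to operate.
```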

187 Things to Avoid when Budgeting and Paying for Evaluation
It is bad practice to assume there is a standard, fixed evaluation cost regardless of program size or complexity. It is dangerous to fund an evaluation project that does not clarify how evaluation funds will be used. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

188 What Should Thoughtful Organizations Do to Obtain Funds for Evaluation?
Write evaluation costs into project development budgets. Use the money accordingly. Set aside funds for evaluation on a percentage basis into the organizational budget. Develop and follow a plan to use these funds. Obtain funds solely for the purpose of evaluation. Consider sharing and/or pooling resources. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

189

190 Who are Evaluation Stakeholders, and Why Do They Matter?
Decision-makers Information-seekers Those directly involved with the evaluation subject Most programs/strategies have multiple stakeholders Organization managers, clients and/or their caregivers, program staff, program funders, partner organizations Stakeholders have diverse, often competing interests related to programs and evaluation. Certain stakeholders are the primary intended users of evaluation. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

191

192 What is Evaluative Thinking?
Evaluative Thinking is a type of reflective practice that incorporates use of systematically collected data to inform organizational decisions and other actions. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

193 Organizations that Regularly use Evaluative Thinking Will Also . . .
Think carefully about developing and assessing programs and other actions. Incorporate program evaluation findings and other assessment findings into program and other planning. Involve significant others in planning and revising plans. Develop written, logical plans. Follow plans. Have strategies in place to modify plans Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

194 Organizations That Regularly Use Evaluative Thinking Will . . .
Regularly conduct evaluations that include attention to characteristics, activities and outcomes of selected programs. Involve program staff, org. leaders and clients (as appropriate) in developing/revising program evaluation plans; collecting and analyzing program evaluation data. Share results of program evaluations including findings about client outcomes, as appropriate, with leaders, staff, clients, board members and funders. Use results of program evaluation to drive continuous improvement of programs. Use results to modify policies and procedures. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

195 Organizations That Regularly Use Evaluative Thinking Will Also . . .
Ensure that there are key staff with evaluation expertise to address the organization's evaluation needs and that there are staff members whose jobs, or components of their jobs, are dedicated to evaluation. Hire evaluation consultants when needed. Provide or obtain training in evaluation for program staff members and make sure that the training is current, well delivered, and provided for enough staff members to ensure that evaluation use is a standard practice. Bruner Foundation Rochester, New York Anita Baker, Evaluation Services

