
Program Evaluation Webinar Series Part 2:




1 Program Evaluation Webinar Series Part 2:
“Getting Started and Engaging Your Stakeholders” Presented by: Leslie Fierro and Carlyn Orians I’d like now to introduce today’s speakers.

2 Getting Started and Engaging Your Stakeholders
Leslie A. Fierro, MPH TKCIS Contractor NCEH/CDC Carlyn Orians, MA, Battelle Centers for Public Health Research and Evaluation Presented November 20, 2008 First is Leslie Fierro. Leslie is a part-time epidemiologist contracted with the Air Pollution and Respiratory Health Branch at CDC where she leads efforts to develop evaluation indicators within the Community Interventions and Evaluation Team. She's worked extensively in the areas of environmental epidemiology and public health surveillance. Leslie holds a Master of Public Health degree in Epidemiology and Biostatistics and a Certificate of Advanced Study in Evaluation. She's currently pursuing a PhD in Evaluation and Applied Research Methods at Claremont Graduate University. Our second speaker today is Carlyn Orians. Carlyn is an evaluator with the Battelle Centers for Public Health Research and Evaluation in Seattle, Washington. She's worked with CDC and other agencies to design and conduct evaluations of many federal health programs. She's currently developing evaluation guidance materials to build evaluation capacity among state grantees that are part of the National Asthma Control Program. Carlyn is currently pursuing a Certificate of Advanced Study in Evaluation through Claremont Graduate University. Leslie, you want to start us off today? I sure do. Great. Well, hello and thank you so much for joining us today. I am Leslie Fierro and as Lani Wheeler noted, I'm joined by my colleague, Carlyn Orians, and we are just delighted to have been invited to present on a topic today that is so important to evaluation, engaging stakeholders. Can we go to the next slide, please?

3 Agenda Definition of evaluation.
Difference between evaluation & research. CDC’s Evaluation Framework. The who, why, when & how of stakeholders. Tangible examples of engaging stakeholders. Here’s a list of the topics that we will be covering during this Webinar. Specifically, the definition of evaluation, what makes evaluation different from research, and CDC’s evaluation framework. We will also discuss in detail what we are calling the who, why, when, and how of stakeholders. Additionally, we have drawn upon our work to identify examples that we think illustrate how stakeholders can be involved in each of the six steps of CDC’s evaluation framework, as well as how engaging stakeholders helps to facilitate designing and conducting quality evaluations that follow the evaluation standards of utility, feasibility, propriety and accuracy. Next slide.

4 What is the Purpose of Evaluation?
The purpose of evaluation can be: social betterment program planning and decision making empowerment of individuals Okay. To set the stage for our presentation today, let’s first revisit the definition of evaluation. Now actually there is no one definition of evaluation. And part of the reason for this is that evaluators differ in what they believe the ultimate purpose of evaluation is. Some examples include social betterment, use of evaluation information to inform program planning and decision making by program managers, and empowerment of individuals so that they can conduct their own evaluation. And this becomes important when we think of engaging stakeholders in the process of evaluation because different evaluators will involve stakeholders to a greater or a lesser degree based upon what they believe is the ultimate purpose of evaluation. So, here we selected three definitions of evaluation out of the many that exist.

5 Evaluation: Definition 1
Evaluation is the systematic investigation of the merit, worth, or significance of an object. -- Michael Scriven The first is derived from a definition that I believe is from Dr. Michael Scriven but quoted here from CDC’s Framework for Program Evaluation in Public Health. In this definition, evaluation is the systematic investigation of the merit, worth or significance of an object.

6 Evaluation: Definition 2
Evaluation is the systematic assessment of the operation and/or the outcomes of a program or policy, compared to a set of explicit or implicit standards as a means of contributing to the improvement of the program or policy. -- Carol Weiss The second definition is from Carol Weiss. And it states that evaluation is a systematic assessment of the operation and/or the outcomes of a program or policy compared to a set of explicit or implicit standards as a means of contributing to the improvement of the program or policy.

7 Evaluation: Definition 3
Evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming. -- Michael Patton And last but not least, we have a definition from Michael Patton who’s the thought leader in the type of evaluation known as Utilization Focused Evaluation. He defines evaluation as the systematic collection of information about the activities, characteristics and outcomes of the program to make judgments about the program, to improve program effectiveness and/or inform decisions about future programming.

8 The Common Element Note that all 3 of these statements share a common element… they all define evaluation as a systematic and formalized endeavor. So, looking across all of these different definitions, we can see one thing that they clearly have in common -- all define evaluation as a systematic endeavor. So, although we all tend to conduct evaluation activities informally every day (we go out to dinner and decide whether or not we liked our dinner and whether it was quite right for us), here we see that evaluation is a systematic and more formalized endeavor.

9 Summative Evaluations
Summative evaluations seek to judge a program by asking, “Should this project be…” cancelled? continued? expanded? What I find particularly interesting about these definitions, however, is how they differ. The first definition emphasizes the role of evaluation in judging the quality of something. When this definition came into being, in the 1950s and 1960s, evaluators were much more likely than they are today to have the task of evaluating a pilot or a demonstration project with the intention of determining whether it should be cancelled, continued or expanded. And that’s what we now call Summative Evaluation.

10 Formative Evaluations
Formative evaluations seek to use the evaluation findings. They ask the question: “Is the program being conducted as planned?” “Is the program doing well?” Formative evaluations seek to improve programs or policies. The last two definitions emphasize the importance of using evaluation findings. In these two definitions we can see that there is an interest in assessing whether or not a program is being conducted as planned and whether the program is doing well. However, the purpose of evaluation here is to improve a program or policy, something we call Formative Evaluation.

11 The CDC Definition Evaluation is the systematic collection of information about the activities, characteristics and outcomes of the program to make judgments about the program, improve program effectiveness and/or inform decisions about future program development. This emphasis on use for programmatic improvement is the definition that we adopt for our presentation today and is also the one that’s adopted by CDC in a self-study guide that accompanies the CDC framework. This definition - the definition that’s in the self-study guide - is that evaluation is the systematic collection of information about the activities, characteristics and outcomes of the program to make judgments about the program, improve program effectiveness and/or inform decisions about future program development. Change the slide, please.

12 Research vs. Evaluation
Research and evaluation share methodologies but ask different questions. Specific to Research Specific to Evaluation Commonalities When I think about evaluation, what it is, what it isn't and why it is conducted in the manner it is, I find it helpful to think through the differences and the similarities of research and evaluation. The important part here is that research and evaluation are not all about methods. Often, however, when we conduct research or evaluation, we are tempted to get to the bottom line. We have a question and we attempt to find the most appropriate method for answering it. However, it’s important to remember that both research and evaluation are enveloped in a broader context than just methodology. And it is that context in which these two types of investigation fit that is very important. Research and evaluation share the use of systematic methodology, and either can use quantitative, qualitative or mixed methods to generate answers to pressing questions. Research and evaluation, however, differ in their purpose.

13 The Research Context
Purpose To add to an existing body of theoretical knowledge. Typical Questions Does the theory hold in this situation? How can we improve the theory to make better predictions in every situation? Research is primarily conducted to add to an existing body of theoretical knowledge. Researchers often test theories in their fields to uncover situations where the theory does not hold so as to create a better theory. Or they see something that strikes them as odd -- apples falling to the ground from a tree, for example -- and set out to develop theories that explain this peculiar phenomenon.

14 Research Findings Research is conducted to: test and improve theories.
develop generalizable theories. theories apply across different settings, people, and times. So, over time, researchers in a particular discipline test and retest existing theories to build better ones. These theories are generalizable in nature. They are meant to apply across different settings, people and time.

15 The Evaluation Context
Purpose Judge the merit or worth of a program. Provide information for decision making. Typical Questions Is the program doing what was intended? What steps might be taken to improve it? The general purpose of evaluation as we saw from the previous definitions is to make a judgment of the merit or the worth of a program and to provide information for decision making. So we may ask questions such as: is the program doing what was intended? Is it producing unintended outcomes, whether positive or negative? And we might also ask what steps might be taken to improve this program or to improve upon what has already been accomplished.

16 Evaluation Findings Evaluation results: are not usually generalizable.
focus on a specific situation. evaluation of a single program in a particular context. Often, results of these evaluations are not generalizable. And I say often and not always on purpose because depending on how the evaluation is designed, one can produce generalizable information. But more often than not, evaluations focus on highly specific situations.

17 Research vs. Evaluation
Evaluation asks: “Is this program working?” Research asks: “Will this program work across multiple settings?” They’re conducted for a single program to see if it’s working in the context in which it resides. So the question then becomes, is this program working rather than will this program work across multiple settings.

18 The Role of Researchers
Play a single role as content experts. Researchers and evaluators also differ in the number and types of roles that they play. Researchers typically only play one role. So they’re experts in their particular content area.

19 The Role of Researchers
Play a single role as content experts. Identify gaps in current knowledge. They identify gaps or inconsistencies in current knowledge;

20 The Role of Researchers
Play a single role as content experts. Identify gaps in current knowledge. Derive their own questions. they derive questions whose answers will help to fill the gaps that they’ve seen;

21 The Role of Researchers
Play a single role as content experts. Identify gaps in current knowledge. Derive their own questions. Perform their own research. and then they carry out their research.

22 The Role of Evaluators Evaluators:
Play multiple roles—facilitator, educator, scientific expert, etc. Evaluators on the other hand often play multiple roles.

23 The Role of Evaluators Evaluators:
Play multiple roles—facilitator, educator, scientific expert, etc. Involve stakeholders. And this is largely because evaluation involves stakeholders in the process. So for example, the evaluator does not set out by themselves to identify and prioritize what questions will be answered in the evaluation.

24 The Role of Evaluators Evaluators:
Play multiple roles—facilitator, educator, scientific expert, etc. Involve stakeholders. Collaborate to identify and prioritize questions. As you will see in some of the upcoming examples, evaluators work with stakeholders to generate a list of possible evaluation questions and to prioritize these questions, often based upon what is feasible and useful for the program. Most evaluators play many roles, such as facilitator, educator, scientific expert, and, in some cases, even counselor. Change slide, please.

25 In a Nutshell… “Research seeks to prove, evaluation seeks to improve.”
M.Q. Patton So to boil this all down, we can say that research seeks to prove whereas evaluation seeks to improve. Next slide.

26 The Findings Must be Useful
To improve a program, the findings must be useful! How? The stakeholders must be involved. The questions must be relevant to the program. The findings must be credible to key stakeholders. And to improve a program, the findings from an evaluation must be regarded as useful. Engaging stakeholders increases the likelihood that the evaluation questions posed will be relevant to the program being evaluated and that the information generated from the evaluation will be credible to key stakeholders. So Carlyn will now talk about the who, when, why and how of engaging stakeholders so that we can get a better understanding of what is involved when we engage stakeholders in the evaluation process.

27 Getting Started and Engaging Your Stakeholders
The Who, When, Why, and How of Stakeholder Involvement in CDC’s Evaluation Framework Hello. This is Carlyn Orians and, again, I want to thank you all for being here and giving us this opportunity to talk to you about how we can engage stakeholders, and why we might want to, in the process of doing an evaluation. Next slide, please.

28 “Who” are the Stakeholders?
Stakeholders are: people and/or organizations that are interested in the program, are interested in the results of the evaluation and/or have a stake in what will be done with the results of the evaluation. I’d like to start by making sure that we have a common understanding of what we mean when we say stakeholder. One definition that we use is that stakeholders are people and/or organizations that are interested in the program, are interested in the results of the evaluation and/or have a stake in what will be done with the results of the evaluation.

29 Examples of Stakeholders
Persons involved in program operations Example: asthma program staff and partners Persons served or affected by the program Example: participants in asthma interventions, asthma patients, families, caregivers, etc. Intended users of evaluation findings Example: policy makers, managers, administrators, advocates, funders, and others So as you consider, for each of your programs and your evaluation activities, who the stakeholders are, you want to ask: who are the persons involved in the program operations? For example, this could be your asthma program staff or the partners that you work with to implement parts of the program. It can include persons served or affected by the program, so those who are involved in the asthma interventions, whether that’s people with asthma, their families, caregivers, whoever is involved in those asthma interventions. And then the intended users of evaluation findings, which can include program staff and partners, but might include another set of individuals as well: advocates, funders, policymakers and others.

30 Each Program is Different
Develop a list of stakeholders at the start of any evaluation activity. So for each program, that will differ a little bit, and what we encourage people to do is to develop a list, at the start of any evaluation activity, of who those stakeholders are for your program, because there is no one list that fits all; each program is different. Next slide, please.

31 Which Stakeholders Matter Most?
Review your list of stakeholders and think strategically about these questions: “Who do we need to…” For many programs, the list of stakeholders is potentially very long, and it might seem rather a daunting task to figure out how you might engage all of these stakeholders. It’s important to consider who they all are and their perspectives and interests, but what you need to do when you review the list is to think strategically about which stakeholders are the most important for the activity you’re undertaking. These four questions that are listed here are questions that can help you think about which stakeholders are the most important for your evaluation.

32 Which Stakeholders Matter Most?
Review your list of stakeholders and think strategically about these questions: “Who do we need to…” enhance credibility? Who - specifically who do you need to enhance credibility?

33 Which Stakeholders Matter Most?
Review your list of stakeholders and think strategically about these questions: “Who do we need to…” enhance credibility? implement program changes? Who do you need to help implement program changes that you want to make based on the evaluation results?

34 Which Stakeholders Matter Most?
Review your list of stakeholders and think strategically about these questions: “Who do we need to…” enhance credibility? implement program changes? advocate for changes? Who might be in a position to advocate for those changes?

35 Which Stakeholders Matter Most?
Review your list of stakeholders and think strategically about these questions: “Who do we need to…” enhance credibility? implement program changes? advocate for changes? fund, authorize, or expand the program? Who funds or authorizes the program? Who might be important if your goal is to expand the programs to different areas, different populations, or to make it permanent? So these are questions you need to ask about your longer list of stakeholders to consider who is the most important to involve. Next slide, please.
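This list-then-filter process lends itself to a simple checklist. The sketch below (not part of the webinar; every name and field is a hypothetical illustration) shows one way you might record your stakeholder list, using the three categories from the earlier slide and the four screening questions as flags, and then pull out the stakeholders who matter most:

```python
# Minimal sketch of a stakeholder prioritization checklist.
# Categories and questions come from the webinar; the specific
# stakeholders and field names here are made up for illustration.
from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    category: str  # "program operations", "served/affected", or "intended user"
    # The four prioritization questions, recorded as yes/no flags:
    enhances_credibility: bool = False
    implements_changes: bool = False
    advocates_for_changes: bool = False
    funds_or_authorizes: bool = False

    def matters_most(self) -> bool:
        """High priority if the answer to any of the four questions is yes."""
        return any([self.enhances_credibility, self.implements_changes,
                    self.advocates_for_changes, self.funds_or_authorizes])

stakeholders = [
    Stakeholder("asthma program staff", "program operations",
                implements_changes=True),
    Stakeholder("asthma patients and families", "served/affected"),
    Stakeholder("state health department", "intended user",
                enhances_credibility=True, funds_or_authorizes=True),
]

priority = [s.name for s in stakeholders if s.matters_most()]
```

The point of the sketch is simply that the four questions act as filters over the longer list: everyone is worth recording, but the stakeholders who pass at least one filter are the ones to engage most actively.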

36 “Why” Engage Stakeholders?
Engaging stakeholders is an important part of the CDC Evaluation Framework. Involving stakeholders may be a requirement of your program. So why would you want to engage stakeholders? What might be the benefits? Well first of all, it is an important part of the CDC evaluation framework, which, as Leslie described, has a focus on use. And, depending on the program that you’re with, involving stakeholders may also be an expectation about the way you conduct evaluations.

37 “Why” Engage Stakeholders?
Engaging stakeholders is an important part of the CDC Evaluation Framework. Stakeholders will add credibility. If you want the results to be acted upon, they must be credible. And of course, stakeholders can add credibility. If you want the results to carry weight and be acted on, you want to make sure that they will be viewed as credible, and stakeholders can help you with that.

38 “Why” Engage Stakeholders?
Engaging stakeholders is an important part of the CDC Evaluation Framework. Stakeholders will add credibility. Stakeholders may have resources to help. Stakeholders may be able to contribute data, skills, analytical skills, etc. It may also be the case that stakeholders have important resources that you can use, whether that’s data, skills, or analytic expertise. So think in terms of the resources your stakeholders may bring to your efforts.

39 “Why” Engage Stakeholders?
Engaging stakeholders is an important part of the CDC Evaluation Framework. Stakeholders will add credibility. Stakeholders may have resources to help. Stakeholders may be critical to implementing or advocating for action based on the results. They also may be critical to implementing or advocating for action based on the results.

40 “Why” Engage Stakeholders?
Engaging stakeholders is an important part of the CDC Evaluation Framework. Stakeholders will add credibility. Stakeholders may have resources to help. Stakeholders may be critical to implementing or advocating for action based on the results. You will build trust and understanding among program constituents. Involving stakeholders helps to reduce fear of the evaluation process. And finally, you may be able to build trust and understanding among the program constituents through involving them in the evaluation. I think many people have a great deal of fear about evaluation, and the fear tends to be greater to the extent that people don’t know what it is. And I’ve often found that when you engage them in the process, even if it’s just to keep them informed about what it’s going to cover and how it’s going to be used, it makes people more comfortable. It helps build understanding and build trust. Next slide, please.

41 “Why” Would Stakeholders Want to be Involved?
Get answers to their questions. You may wonder why stakeholders would want to be involved. And indeed not all stakeholders do want to be involved. An important first step is figuring out who is interested and what their interest might be. But one of the reasons stakeholders might be interested in participating in evaluation is that they have questions they’d like answers to.

42 “Why” Would Stakeholders Want to be Involved?
Get answers to their questions. Learn about evaluation. They may also be interested in learning about evaluation and building capacity for themselves or others in their organization.

43 “Why” Would Stakeholders Want to be Involved?
Get answers to their questions. Learn about evaluation. Influence the design and methods. They may also have strong feelings about the kinds of methods that would be appropriate or the design of the evaluation and they might want their voices heard on that.

44 “Why” Would Stakeholders Want to be Involved?
Get answers to their questions. Learn about evaluation. Influence the design and methods. Protect their constituents. They may also feel protective. This goes back to the question of fear and trust. If they’re protective and they’re worried that evaluation could bring harm to them, they may enter into the evaluation efforts simply to protect their interests and protect their constituents. And it’s important that you understand that.

45 “Why” Would Stakeholders Want to be Involved?
Get answers to their questions. Learn about evaluation. Influence the design and methods. Protect their constituents. Motivated to help program succeed. And many people are motivated to participate because they believe in the program. They want it to be successful. And they believe that helping with the evaluation will give them a role in seeing the success of the program. Next slide.

46 CDC’s Evaluation Framework
Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEPS Standards Utility Feasibility Propriety Accuracy So here’s the CDC evaluation framework, which is probably familiar to most of you. There are six steps in this framework. The first of those steps is engage stakeholders. The steps that follow from there are: describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned. So although the first step is engage stakeholders, it doesn’t mean that it’s something you do only at the beginning and then let go. And as we walk through the examples today, you’ll see how that first step feeds into the entire evaluation framework. Next, please.

47 “When” and “How” Can You Engage Stakeholders?
You can use the CDC Evaluation Framework to engage stakeholders in: identifying and prioritizing evaluation questions, selecting credible sources, and developing a plan for action based on evaluation results. You can use the CDC evaluation framework to engage stakeholders in identifying and prioritizing evaluation questions, in selecting credible sources, and in developing a plan for action based on those evaluation results.

48 “When” and “How” Can You Engage Stakeholders?
You can engage stakeholders in every step of the evaluation process. But… you don’t have to engage stakeholders in all these ways in every evaluation. It’s true. You actually can engage stakeholders in all of those ways. But, and this is an important caveat, it doesn’t mean that you have to engage them in all those ways in every evaluation. It depends on several factors. Next slide, please.

49 “How” Depends on… Evaluator preference.
How you engage stakeholders depends on evaluator preference…

50 “How” Depends on… Evaluator preference. Stakeholder preference.
but it also depends on stakeholder preference. Is this a way in which you like to operate, and a way in which stakeholders would like to be engaged?

51 “How” Depends on… Evaluator preference. Stakeholder preference.
Resources. It also may depend on resources. And that you can think about in two different ways: do you have the luxury of time for engaging them? Do you have the means to meet with them and get their opinions? But you also may need their resources. As we talked about before, the stakeholders may bring important resources to the table. So it may be important to engage them if there are resources they have that you need.

52 “How” Depends on… Evaluator preference. Stakeholder preference.
Resources. Degree of trust or threats to credibility. And it may also depend on the degree of trust or threats to credibility that you believe exist in a given situation.

53 “How” Depends on… Evaluator preference. Stakeholder preference.
Resources. Degree of trust or threats to credibility. If there is a high degree of mistrust, engage stakeholders early in the evaluation process. This helps ensure that the results are viewed as credible and are acted upon. If there’s a high degree of mistrust it may be especially important to engage stakeholders upfront in considering how you will proceed with the evaluation, giving people opportunities for input, opportunities to critique so that at the end of the day, you have evaluation results that people will find credible and that they will be willing to act on. Next slide, please.

54 CDC’s Evaluation Framework
Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEPS Standards Utility Feasibility Propriety Accuracy What I’d like to do is just very briefly walk through what the six steps are in the framework and how you might think about stakeholder involvement in each of those steps. And then we’ll move into some concrete examples so you can see what this means.

55 CDC’s Framework Step 1 Who should be involved?
Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 1 Standards Utility Feasibility Propriety Accuracy Engage stakeholders Who should be involved? Develop list of potential stakeholders. Decide which stakeholders are the most important to include. Step 1, engage stakeholders. We’ve already talked about who should be involved. Develop your list of potential stakeholders, use those four questions to help you decide who are the most important to include.

56 CDC’s Framework Step 2 Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 2 Standards Utility Feasibility Propriety Accuracy Do stakeholders share a vision of what the program does and its intended outcomes? Diverse views? Similar views? Engage them early in the process. Describe the program Step 2 is describe the program. Do stakeholders share a vision of what the program does and its intended outcomes? Do they have diverse views, or do they have similar views? It’s important to know this as you embark on a program evaluation, and you may want to make sure that you engage them early on so that you understand what vision stakeholders have in their heads for your program.

57 CDC’s Framework Step 3 Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 3 Standards Utility Feasibility Propriety Accuracy What are the most pressing and important evaluation questions for stakeholders? What questions do they need answered to be able to take action? What methods are available and preferred? What will be considered “credible evidence”? Focus the evaluation design Step 3 is focus the evaluation design. What are the most pressing and important evaluation questions for stakeholders? Leslie talked about this a little earlier. In research, an investigator often comes with their own set of questions. With evaluation, it’s much more common that the stakeholders are the source of the evaluation questions. If they’re going to take action on the results of the evaluation, what questions do they need answered to be able to take action? That’s a very critical role for stakeholders. And what methods can they contribute to, which ones do they prefer, and what will be considered credible evidence for them? Next slide.

58 CDC’s Framework Step 4 Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 4 Standards Utility Feasibility Propriety Accuracy Gather evidence stakeholders will find credible. Gather credible evidence So Step 4 is gather credible evidence. We just talked about that: the importance of knowing what your stakeholders will find credible.

59 CDC’s Framework Step 5 Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 5 Standards Utility Feasibility Propriety Accuracy How do diverse stakeholders interpret the findings? May engage stakeholders in the analysis. Perhaps solicit their interpretation before results are finalized. Justify conclusions Step 5 is justify conclusions. So how do diverse stakeholders interpret the findings? You may want to engage them in doing the analysis, or you may prefer to bring results to them before they are finalized to get their interpretations. Have they thought of a different take on the findings than you may have had?

60 CDC’s Framework Step 6 Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEP 6 Standards Utility Feasibility Propriety Accuracy Which stakeholders can play a role in disseminating results or acting on findings? This is a critical step for stakeholder involvement. Ensure use and share lessons learned And then the final one is ensure use and share lessons learned. And as we’ve talked about already, when you take an approach to evaluation that’s focused on program improvement and on using those results to take action, this is really a critical step in which stakeholders can play a role.

61 Getting Started and Engaging Your Stakeholders
Tangible Examples of Stakeholder Involvement in the Evaluation Process What I’d like to do is walk through an example.

62 Example #1 – Asthma and Home Environment in Low-Income Apts
Intervention to improve indoor environment for children with asthma via following activities: So here I am posing for you an intervention in the home environment in low-income apartments.

63 Example #1 – Asthma and Home Environment in Low-Income Apts
Intervention to improve indoor environment for children with asthma via following activities: Provide education and training for apartment owners, building inspectors, maintenance vendors, and tenants regarding asthma triggers and housing codes. This intervention is to improve indoor environment for children with asthma through several activities: provide education and training for apartment owners, building inspectors, maintenance vendors and tenants regarding asthma triggers and housing codes.

64 Example #1 – Asthma and Home Environment in Low-Income Apts
Intervention to improve indoor environment for children with asthma via following activities: Provide education and training for apartment owners, building inspectors, maintenance vendors, and tenants regarding asthma triggers and housing codes. Work with city officials to enhance existing housing code. Work with city officials to enhance existing housing code.

65 Example #1 – Asthma and Home Environment in Low-Income Apts
Intervention to improve indoor environment for children with asthma via following activities: Provide education and training for apartment owners, building inspectors, maintenance vendors, and tenants regarding asthma triggers and housing codes. Work with city officials to enhance existing housing code. Promote smoke-free housing. Promote smoke-free housing.

66 CDC’s Evaluation Framework
Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEPS Standards Utility Feasibility Propriety Accuracy So now we’re going to walk through the six steps, how you might engage stakeholders in an evaluation of an intervention that looks like this.    Next slide.

67 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? So who should you include, who are your stakeholders in this evaluation?

68 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families Possibilities include affected families which are those who are living in the housing that you’re trying to improve.

69 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families those who receive training (e.g., owners, inspectors, maintenance personnel, tenants) Also those who receive training, the owners, the inspectors, the maintenance personnel, and the tenants themselves, as well as your trainers.

70 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families those who receive training (e.g., owners, inspectors, maintenance personnel, tenants) trainers I know sometimes program staff do the training themselves. Sometimes you get an outside group or partner to be involved in that training. So they’re certainly one of your stakeholders, the ones that are actually providing the training.

71 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families those who receive training (e.g., owners, inspectors, maintenance personnel, tenants) trainers city officials responsible for the housing code And then since you’re trying to make changes in the indoor environment, city officials who have responsibilities for the housing code may be important stakeholders for you to consider involving.

72 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families those who receive training (e.g., owners, inspectors, maintenance personnel, tenants) trainers city officials responsible for the housing code smoke-free advocates There also may be important advocacy groups out there who are, for example, advocating for smoke-free environments. And this is not a complete list.

73 Step 1: Engage the Stakeholders
Low-Income Housing Scenario Who are the stakeholders? affected families those who receive training (e.g., owners, inspectors, maintenance personnel, tenants) trainers city officials responsible for the housing code smoke-free advocates Stakeholders may include advocates as well as those resistant to change. It’s important to include a mix of views so you don’t get surprised at the end. You will need to think about the setting in which you’re operating your program or intervention to see if there are other important stakeholders, whether those are groups that are likely to be resistant or groups that are likely to be advocates for the kinds of changes that you’re trying to make with your program. And it’s important to include a mix of views because you don’t want to be surprised at the end, having failed to consider something that is important to one of these audiences. Next slide.

74 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: So what are some options for including stakeholders in Step 2? And Step 2, again, is describing the program.

75 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: What is the current indoor environment? Well it might be very important to get their views on the current indoor environment, and the changes that they would like to see and that they think are realistic to achieve.

76 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: What is the current indoor environment? What changes are both desired and seen as realistic? I think stakeholders can be very important in assessing the realism of the changes that you’re trying to make.

77 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: What is the current indoor environment? What changes are both desired and seen as realistic? What outcomes do they think are most important? But it’s also important to understand what changes they want to make, what outcomes do they think are important, because this will be important for your evaluation design.

78 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: What is the current indoor environment? What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? And I mentioned this already, but a logic model is one way that you can look at how activities contribute to outcomes and engage your stakeholders in that discussion. And that will be covered in more detail in the next Webinar in this series. Next slide.

79 Step 2: Describe the Program
Low-Income Housing Scenario Obtain stakeholders’ views on: What is the current indoor environment? What changes are both desired and seen as realistic? Which outcomes do they think are most important? Which activities contribute to which outcomes? (You could use a logic model here… Stay tuned for the next Webinar in this series…) And that will be covered in more detail in the next Webinar series.    Next slide.

80 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Step 3 is focusing the evaluation design. How might you engage the stakeholders in this step, in this particular example? You might want to know what questions they think are the most important.

81 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Should the evaluation for example, focus more on changes to the housing code…

82 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Focus on trigger reduction? Which triggers? or are trigger reductions most important and which triggers?

83 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Focus on trigger reduction? Which triggers? What is the best design? And what is the best design?

84 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Focus on trigger reduction? Which triggers? What is the best design? Is a pre-post design best? Should the evaluation look at the conditions before and after?

85 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Focus on trigger reduction? Which triggers? What is the best design? Is a pre-post design best? Should there be a control group? Should they look at the changes that are made to a group of houses compared to apartment buildings in which the program has not worked, to see if there really is a difference between the group you’re working with and a control group?

86 Step 3: Focus the Evaluation Design
Low-Income Housing Scenario What questions do the stakeholders think are the most important to ask? Should the evaluation focus on changes to the housing code? Focus on trigger reduction? Which triggers? What is the best design? Is a pre-post design best? Should there be a control group? Are observational methods appropriate? Or are observational methods more appropriate? And your stakeholders can be really important sources of information for that focusing of your evaluation design. Next slide, please.

87 Step 4: Gather Credible Evidence
Low-Income Housing Scenario What role can stakeholders play? So Step 4 is gathering credible evidence. How might stakeholders be engaged in that in this particular example?

88 Step 4: Gather Credible Evidence
Low-Income Housing Scenario What role can stakeholders play? Are there stakeholders that can facilitate access to residences and/or translate if necessary? Well you might consider whether there are stakeholders that can provide access to the residents or provide translation services. You may find that they can play a very important role in giving the kind of access and entry into the population that you need to successfully conduct this evaluation.

89 Step 4: Gather Credible Evidence
Low-Income Housing Scenario What role can stakeholders play? Are there stakeholders that can facilitate access to residences and/or translate if necessary? Can the trainers help collect data before, during, or after each training? Other opportunities for data collection? The trainers may also be very important. Can they help collect data before, during or after each training? Are there other opportunities for data collection?

90 Step 4: Gather Credible Evidence
Low-Income Housing Scenario What role can stakeholders play? Are there stakeholders that can facilitate access to residences and/or translate if necessary? Can the trainers help collect data before, during, or after each training? Other opportunities for data collection? What information will the stakeholders need to act? For example, will policy makers be more influenced by information on environmental conditions, asthma symptoms, or medical care usage? And then it’s also important to think about your stakeholders’ needs for information. Will they be more influenced to act based on information on environmental conditions? Will they care more about what effect it's had on the use of medical care or on asthma symptoms?

91 Step 4: Gather Credible Evidence
Low-Income Housing Scenario What role can stakeholders play? Are there stakeholders that can facilitate access to residences and/or translate if necessary? Can the trainers help collect data before, during, or after each training? Other opportunities for data collection? What information will the stakeholders need to act? For example, will policy makers be more influenced by information on environmental conditions, asthma symptoms, or medical care usage? Data in what form? Pictures? Stories? Numerical data? How will they react to different kinds of information? Will they be more moved by pictures, by stories? Do they need numbers? Do they need cost data? What is the type of information that they will need to take the action that you’re hoping to see from your evaluation? Next slide, please.

92 Step 5: Justify Conclusions
Low-Income Housing Scenario How do stakeholders view success? What does “success” look like to them? Step 5 is justify conclusions. I think it’s very important to know and get input from the stakeholders on how they view success. I think very often, we carry in our own heads an idea of what success looks like. And if we don’t explicitly share that and talk about that, we may be surprised that other people view it differently. They may be very happy with an outcome that surprises you, or not be happy with one that you think is important, and it’s very important to get that kind of input.

93 Step 5: Justify Conclusions
Low-Income Housing Scenario How do stakeholders view success? What does “success” look like to them? Do stakeholders have alternative interpretations of the initial findings? It’s also important to know if they have different interpretations of the data. If you present them with one explanation for the changes that you’re seeing, will they agree with that? Do they have a different explanation about what’s going on?

94 Step 5: Justify Conclusions
Low-Income Housing Scenario How do stakeholders view success? What does “success” look like to them? Do stakeholders have alternative interpretations of the initial findings? Do they perceive barriers you might have missed? Or do they perceive barriers to making changes that you might not even have considered? So stakeholders can give you really important information on interpreting what you have found.  Next slide, please.

95 Step 6: Ensure Use and Share Lessons Learned
Low-Income Housing Scenario Can stakeholders help disseminate evaluation findings? Step 6 is ensure use and share lessons learned. And here again, think about the stakeholders that you have. Are they in a position to help you disseminate information?

96 Step 6: Ensure Use and Share Lessons Learned
Low-Income Housing Scenario Can stakeholders help disseminate evaluation findings? Is there an owner’s newsletter? A tenant meeting? A neighborhood association? Perhaps there’s an owner’s newsletter or a tenant meeting or a neighborhood association; these can be important ways that you can share information with the stakeholders in your program.

97 Step 6: Ensure Use and Share Lessons Learned
Low-Income Housing Scenario Can stakeholders help disseminate evaluation findings? Is there an owner’s newsletter? A tenant meeting? A neighborhood association? Is information needed in a different form to prompt action? A report? An executive brief? An oral presentation? It’s also important to consider whether these different stakeholders have different needs for information. Is a report the best way to reach some? Do some need an executive brief? Do some need an oral presentation? What is the best way to share that information?

98 Step 6: Ensure Use and Share Lessons Learned
Low-Income Housing Scenario Can stakeholders help disseminate evaluation findings? Is there an owner’s newsletter? A tenant meeting? A neighborhood association? Is information needed in a different form to prompt action? A report? An executive brief? An oral presentation? Is there interest beyond the community involved in the intervention? And another thing to consider, let’s say you’re doing your program in a particular place but maybe there would be interest beyond that location or beyond that particular audience, so when coming up with your plans for sharing lessons learned and making sure that the results are used, think a little bit more broadly. And your stakeholders can help you think about that. Who else? Who else might be interested in the information? What are other audiences you might want to consider?    Next slide, please.

99 What if you ignored stakeholders?
If you ignore the stakeholders, potential pitfalls include… So in this particular example, which is trying to make changes in home environment in low-income apartments, what would happen if you ignored stakeholders? Are there potential problems or pitfalls you would encounter if you ignored the stakeholders in doing this evaluation? Well here are some potential pitfalls that you might encounter.

100 What if you ignored stakeholders?
If you ignore the stakeholders, potential pitfalls include… Failure to ask important questions. For example, it may be important to know if tenants have access to quit-smoking services. First of all, you might fail to ask important questions. For example, maybe it’s important to know if tenants have access to quit-smoking services if you’re hoping that they will not smoke in their apartments anymore.

101 What if you ignored stakeholders?
If you ignore the stakeholders, potential pitfalls include… Difficulty gaining access to or trust of program constituents. For example, will you gain access to data or be received into residents’ homes? It also may be difficult to gain access to or the trust of people in the program. Can you get into their homes? Will they receive you? Are there data that you need from the housing office or from an owners' group? And do you need to have their involvement to get the information that you need to do your evaluation?

102 What if you ignored stakeholders?
If you ignore the stakeholders, potential pitfalls include… Failure to uncover unanticipated consequences. For example, if improvements are made, will rents increase? For example, maybe you’ll discover through talking with stakeholders that they’re concerned that if improvements are made, rents might increase. This may be something that you need to include in your evaluation and take a look at. There might be other surprises. So that engagement can give you information about things that are not something that you had as the focus of your evaluation, but that might be ultimately very important to know.

103 What if you ignored stakeholders?
If you ignore the stakeholders, potential pitfalls include… Failure to communicate results in ways that are understood and/or accessible. For example, are there literacy or language issues? Cultural preferences? You also might fail to communicate the results in ways that are understood or accessible to the audiences that you have in mind. Are there literacy issues in the audiences you need to reach? Or language barriers? Or are there cultural preferences for how they like to receive information? If you take the time upfront to learn about that, you’re much more likely to get information that is usable and well received into the hands of the people who need it. So the real take-home message is: your evaluation might not lead to action, or at least the action that you want, if you don’t at least consider the ways you might engage your stakeholders. Next slide, please.

104 What if you ignored stakeholders?
If you fail to involve the stakeholders… …your evaluation may not lead to action! An important point in all of this is that who your stakeholders are and the way you engage them are going to vary from project to project, from evaluation to evaluation. So how might you look at these questions a little bit differently in a different example? I’m not going to walk through all the steps in this next example, but I just want to give you a flavor for how you might think about it a little bit differently.

105 Example #2 – Care Coordination Across Health Systems
Intervention to provide and integrate care coordination and case management for high-risk children with asthma. In the second example, we’re looking at care coordination across health systems. This is an intervention to provide and integrate care coordination and case management for high-risk children with asthma.

106 Example #2 – Care Coordination Across Health Systems
Intervention to provide and integrate care coordination and case management for high-risk children with asthma. Involves standardizing protocols across care systems, including: Medicaid HMOs Home nursing agencies Health departments The intervention involves standardizing protocols across care systems including Medicaid, HMOs, home nursing agencies, and health departments.

107 Example #2 – Care Coordination Across Health Systems
Intervention to provide and integrate care coordination and case management for high-risk children with asthma. Involves standardizing protocols across care systems, including: Medicaid HMOs Home nursing agencies Health departments The goal: to prove success and thus convince insurers to continue reimbursing case management services. The goal is to prove success and thus convince insurers to continue reimbursing case management services. So how might stakeholder engagement be different in this example?

108 Engage the Stakeholders
Housing Scenario Healthcare Scenario affected families those who receive training trainers city officials smoke-free advocates Who are the stakeholders in this scenario? First of all, we would have a different set of stakeholders.

109 Engage the Stakeholders
Housing Scenario Healthcare Scenario affected families those who receive training trainers city officials smoke-free advocates insurers These stakeholders might include insurers…

110 Engage the Stakeholders
Housing Scenario Healthcare Scenario affected families those who receive training trainers city officials smoke-free advocates insurers case managers case managers,

111 Engage the Stakeholders
Housing Scenario Healthcare Scenario affected families those who receive training trainers city officials smoke-free advocates insurers case managers families families,

112 Engage the Stakeholders
Housing Scenario Healthcare Scenario affected families those who receive training trainers city officials smoke-free advocates insurers case managers families health system administrators health system administrators.

113 Focus the Evaluation Design
Housing Scenario Healthcare Scenario What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? What outcome questions? They might also be interested in a very different set of outcome questions. For example, if they did a pilot, they might be really interested in whether this is an intervention that they can sustain, what the cost is of doing this kind of case management, and what kinds of outcomes they need to show to answer the question of whether they could sustain this. They might prefer particular kinds of data to help give them that proof, to help give them that information to make those decisions.

114 Focus the Evaluation Design
Housing Scenario Healthcare Scenario What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? Is this intervention sustainable? For example, they might be really interested in whether this is an intervention they can sustain if they did a pilot to see what the cost is… They might prefer particular kinds of data to help give them that proof, to help give them that information to make those decisions.

115 Focus the Evaluation Design
Housing Scenario Healthcare Scenario What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? Is this intervention sustainable? What is the cost of doing this kind of case management? What is the cost of doing this kind of case management, and what kinds of outcomes do they need to show to answer the question of whether they could sustain this?

116 Gather Credible Evidence
Housing Scenario Healthcare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? What type of data do they need to make a decision? They might prefer particular kinds of data to help give them that proof, to help give that information to make those decisions.

117 Gather Credible Evidence
Housing Scenario Healthcare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? Is quantitative data more highly valued than qualitative data? And some of these stakeholders may value quantitative data more highly than other types of data.

118 Gather Credible Evidence
Housing Scenario Healthcare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? Is quantitative data more highly valued than qualitative data? Do the data need to show cost savings? And they may need to see evidence of cost savings to be willing to act.

119 Gather Credible Evidence
Housing Scenario Healthcare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? Be sensitive to the stakeholders’ interests and time constraints. And you might find that some of these stakeholders have very different interests in or ability to participate in evaluation. They may be very busy, they may have a lot of time constraints, and you may need to be very sensitive to the ways in which they’d like to be involved and can be involved, so that you facilitate that interaction in a way that they’re comfortable with, that you’re comfortable with, and that makes things work.

120 Gather Credible Evidence
Housing Scenario Healthcare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? Be sensitive to the stakeholders’ interests and time constraints. Be aware of and consider the political environment. And of course every program, every evaluation has its own political environment. You need to think through what that is so that you're hearing the right voices and presenting information that will lead to the action in the political context that you want to see.

121 Example #3 – Daycare Education
Intervention to train childcare providers to identify triggers and to manage children with asthma. Now I'll introduce a third example which is daycare education.

122 Example #3 – Daycare Education
Intervention to train childcare providers to identify triggers and to manage children with asthma. Intervention reaches out to: Large daycare centers Licensed home daycare programs Participation is encouraged through continuing education credits In the third example, this is an intervention to train child care providers to identify triggers and to manage children with asthma. This intervention reaches out to large daycare centers and licensed home daycare programs. Participation in the program is encouraged through continuing education credits.

123 Example #3 – Daycare Education
Intervention to train childcare providers to identify triggers and to manage children with asthma. Intervention reaches out to: Large daycare centers Licensed home daycare programs Participation is encouraged through continuing education credits Long-term goal: to expand beyond pilot providers to larger community. The long-term goal is to expand this beyond a pilot to the larger community. How might stakeholder engagement be different in this example?

124 Engage the Stakeholders
Housing Scenario Daycare Scenario affected families those who receive training trainers city officials smoke-free advocates Who would be the stakeholders in this scenario? Once again, we have a different set of stakeholders.

125 Engage the Stakeholders
Housing Scenario Daycare Scenario affected families those who receive training trainers city officials smoke-free advocates daycare providers licensing boards families These stakeholders might include trainers, daycare providers, licensing boards, and families.

126 Focus the Evaluation Design
Housing Scenario Daycare Scenario What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? What outcome questions? And the outcomes are likely to be quite different than the examples we looked at before.

127 Focus the Evaluation Design
Housing Scenario Daycare Scenario What changes are both desired and seen as realistic? What outcomes do they think are most important? Which activities contribute to which outcomes? behavior change environmental outcomes evidence to support decision to expand beyond pilot Here, we might be very concerned with behavior change, with environmental outcomes, and with the kind of evidence that would be used to support a decision to expand beyond the pilot center.

128 Focus the Evaluation Design
Housing Scenario Daycare Scenario Focus on changes to the housing code? Trigger reduction? Should there be a control group? Observational methods? What design mix? The evaluation design mix might be different.

129 Focus the Evaluation Design
Housing Scenario Daycare Scenario Focus on changes to the housing code? Trigger reduction? Should there be a control group? Observational methods? Consider pre-post design. Consider control group design. You might consider a pre-post design or a control group design…

130 Gather Credible Evidence
Housing Scenario Daycare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? What data collection methods? And we might think about different data collection methods.

131 Gather Credible Evidence
Housing Scenario Daycare Scenario Data in what form? Pictures? Stories? Numerical data? Stakeholders that can facilitate access ? Trainers help collect data ? Success stories may be important. It may be that particular success stories are important with this audience to convey the evaluation results. And again, the audience for your evaluation results will be different in this example.

132 Getting Started and Engaging Your Stakeholders
Tangible Examples of Stakeholder Involvement in Applying the Evaluation Standards Okay, so we’ve heard from Carlyn about the ways that stakeholders can be engaged in each of the steps in the CDC framework that you see around the outside of the circle.

133 Evaluation Standards Apply to Every Step
Ensure use and share lessons learned Gather credible evidence Engage stakeholders Describe the program Focus the evaluation design Justify conclusions STEPS Standards Utility Feasibility Propriety Accuracy So what I'm going to do now is provide you with a brief example from some of our recent work in the Air Pollution and Respiratory Health Branch at the CDC to demonstrate how engaging stakeholders helps in carrying out the four evaluation standards that reside at the center of the CDC framework.

134 The Evaluation Standards
Utility Feasibility Propriety Accuracy These four standards -- utility, feasibility, propriety, and accuracy -- were developed in the 1970s by the Joint Committee on Standards for Educational Evaluation. Now, although these were developed for educational evaluation, they are really every bit as applicable to other disciplines that conduct evaluation, so that includes public health.

135 The Evaluation Standards
There are actually 30 evaluation standards grouped into four categories. A complete list of the standards is published in CDC’s Framework for Program Evaluation in Public Health. (See link under “Learning Aids”.) And it probably appears from this presentation and the circular figure that you just saw depicting the CDC evaluation framework that there are only four standards. But, in fact, there are a total of 30 standards that sit under these four headings. A more detailed list is available in CDC’s framework.

136 Why Use Standards? Standards provide a way to: Make difficult decisions when designing and implementing an evaluation. Judge the quality of an evaluation. Determine where an evaluation can be better balanced. Well, as you all know, often in public health and in evaluation we're dealing with a limited budget. And this means that we can't always develop an evaluation design in a way that we feel is ideal. So in many cases, it seems almost inevitable that trade-offs will occur. And what the standards do is provide evaluators with guideposts that can help them and their stakeholders make difficult decisions when they're designing and implementing an evaluation. Additionally, they're a good way to judge the quality of an evaluation and to see where an evaluation can be better balanced.

137 The “Utility” Standard
Who needs the evaluation information and what information do they need? Identify affected stakeholders. Collect data that addresses important questions. Ensure the data is responsive to the needs of the stakeholders. Well, when we consider the utility standard, we think through who needs the information from the evaluation as well as the types of information these individuals need. And Carlyn spoke about that and I spoke about that a little bit earlier, too. The standards residing under this heading include items such as identifying stakeholders who are involved in or affected by the evaluation. And other standards under this heading include collecting data that addresses important questions about the program being evaluated and making sure that this data is responsive to the needs of stakeholders.

138 The “Feasibility” Standard
How much money, time, and effort can we put into this? Keep the burden of data collection as low as possible. Consider the political interests of the stakeholders. Ensure that the evaluation is cost effective. The standards associated with feasibility help evaluators to ensure that the evaluation will be viable and pragmatic. Using the standards under this heading, we aim to keep the burden associated with data collection as low as possible. We acknowledge the political interests of individuals who are involved in or interested in the evaluation findings. And we make every effort to assure the evaluation is cost effective.

139 The “Propriety” Standard
What steps need to be taken for the evaluation to be ethical? Ensure the ethical treatment of evaluation participants. Ensure the ethical treatment of those who commissioned the evaluation. Ensure the ethical treatment of those affected by the findings of the evaluation. The propriety standards lay out items that are very important to consider with respect to ethics. This includes ethical treatment of the individuals from whom data will be collected as well as the individuals who may have commissioned the evaluation or could be affected by the findings of the evaluation.

140 The “Accuracy” Standard
What design will lead to accurate information? Clearly and accurately document the program. Gather data in a way that produces reliable information. Analyze and report the data in a systematic and unbiased way. Assure that the source of the information is defensible. And for accuracy, a set of standards exists that help the evaluator to design and implement the evaluation so that the results will be technically adequate to inform decision making. Now there are 12 standards under this heading. And some include being sure that the program that is being evaluated is documented clearly and accurately. And so we talked about that earlier when we were talking about the importance of logic modeling. Some other standards, for example, include gathering data in a way that will produce valid and reliable information, as well as analyzing and reporting the data in a systematic and unbiased way and then also assuring that the source of the information is defensible. So in other words, how adequate is the data for answering the evaluation questions at hand? So let’s see these standards in action. Can we change the slide please?

141 CDC’s Asthma Control Program
CDC’s Asthma Control Program funds 35 states and territories to implement statewide asthma control programs using a public health perspective. Their approach involves: Engaging, enhancing, and maintaining relationships with partners. Developing, improving, and conducting asthma surveillance activities. Designing and implementing interventions with partners. So in 2006, the Air Pollution and Respiratory Health Branch initiated a collaborative process to develop a program monitoring system for the asthma control program. This program currently funds 35 states and territories to implement statewide asthma control programs using a public health perspective. This approach involves engaging, enhancing, and maintaining relationships with partners across the state or territory; developing, improving, and conducting asthma surveillance activities; and designing and implementing interventions with partners across the state.

142 Example #4 - Asthma Program Monitoring System
This example involves collecting information for the purpose of evaluation across multiple sites. So as Carlyn mentioned, this example is a little bit different than the types that she provided. Here we’re looking at an example that includes collecting information for the purpose of evaluation across multiple sites rather than designing an evaluation that’s specific to one site. Change the slide, please.

143 Example #4 - Asthma Program Monitoring System
Purpose: Develop a strategic, systematic approach to collecting information about our program activities, progress, and accomplishments across all funded states. The purpose of this endeavor was to develop a strategic, systematic approach to collecting information about our program activities, our progress, and accomplishments across all of the funded states.

144 Employing the Framework Model
Step 1: Involve the stakeholders CDC Air Pollution and Respiratory Health Branch. Representatives of funded State Asthma Programs. Step 2: Describe the Program Develop logic models. Step 3: Focus the evaluation design Identify and prioritize evaluation questions. Step 4: Gather credible evidence Create data collection instrument. And to accomplish this purpose, we engaged stakeholders. And these stakeholders included the staff within the CDC’s Air Pollution and Respiratory Health Branch and representatives from funded state asthma programs. To develop the content of this monitoring system, we engaged the stakeholders in a process that was driven by the steps outlined in the CDC framework that Carlyn has given us some examples of. We developed a series of logic models describing the program. We identified and prioritized key evaluation questions that we wanted to be able to answer at a national program level. And then we embarked upon creating a data collection instrument that would help us in gathering data on a regular basis across all of the funded programs.

145 Developing the Data Collection Instrument
In this example, developing the data collection instrument was a highly collaborative endeavor aided by the program evaluation standards. So I'm going to focus here on the creation of this data collection instrument and how our collaborative approach was aided by the evaluation standards.

146 The Iterative Process CDC internal workgroup identifies core information needs and drafts survey instrument. State workgroup members review instrument, pilot test, provide pilot data and comments. Create mock report to illustrate type of information obtained from instrument. CDC internal workgroup discusses comments and mock report. Modify instrument as necessary. So we used an iterative approach in designing this data collection instrument. First, a workgroup consisting of staff from the Air Pollution and Respiratory Health Branch worked together to identify core information needs and to create a draft of the core data collection instrument, which is essentially a survey instrument. This draft instrument was then shared with the state representatives on the workgroup, who reviewed the instrument, pilot tested it, and provided extensive comments about how it could be improved. A subset of states, represented on the workgroup, filled out the instrument and provided me with some data so that I could create a mock report -- something that would demonstrate the type of information that might be generated through such a program monitoring system. The mock reports and state comments were then brought back to the CDC workgroup to discuss the comments and modify the data collection instrument accordingly.

147 Applying the Standards
How were the evaluation standards employed in this example? How was “engaging the stakeholders” important in applying each of the standards? So you might be asking yourself, “Well, how were the standards employed in this example, and specifically, how was engaging stakeholders important to fulfilling each of these evaluation standards?” Can you change the slide, please?

148 The “Utility” Standard
Who are the intended users of the information? Why do they need the information? CDC Air Pollution and Respiratory Health Branch State partners Examine progress. Identify promising patterns. Share successful practices Identify areas for improvement. Compare and contrast program designs and performance. So here are some questions that we might pose when we’re thinking about the standard of utility to ensure that the information needs of intended users are met. In this case, the primary intended users of the information coming from the monitoring system were the staff at the Air Pollution and Respiratory Health Branch. It was anticipated that project officers, evaluators, epidemiologists, and managers would use the information from this system to examine the progress that was being made across all the states in aggregate form, identify patterns that occurred in practice that were promising with the intention of sharing these practices across the states, and also identify areas for improvement, both within the states and at the national office. The states were considered to be secondary intended users of this information. They might, for example, query the monitoring system to see how states similar to them were doing in particular areas or to see how similar states were designing and carrying out their own program. Change slide.

149 The Utility Standard in Action
Determined what information the CDC needed to do their job and how they will use the information. Engaged states in discussions about how to make the information useful for them. Developed mock reports to facilitate conversations with stakeholders about how the data could be used. To carry out the utility standard, we wanted to know what type of information CDC needed to do their job and ways that they might use the information. We also engaged states in similar discussions inquiring as to whether they felt the information we were collecting might be useful to them and how we could make the information more useful to them. As I mentioned earlier, mock reports were developed to facilitate conversations with individuals at CDC about how the data might be used because sometimes it’s just easier to know how you would use the information if you see it right in front of you. Change slide.

150 The “Feasibility” Standard
How much money, time, and effort can we put into this? Remained conscious of how much time this data collection will require of the states. Identified areas where processes could be streamlined. Recognized that there will be less time for other programmatic activities. The feasibility standard was also very important to this work. States are asked for a variety of information already, as many of you know, through semi-annual reports and end-of-year reports, and so we wanted to be very conscious of how much more we asked and in what instances we could streamline data collection processes. Additionally, it was recognized that time used to provide the answers to the questions that are posed in this data collection instrument was time that would be taken away from other programmatic activities. Change slide.

151 The Feasibility Standard in Action
Carefully selected “need-to-know” questions. Asked states: What information do you already have? How feasible is it for you to obtain? How long does it take to locate information? How long does it take to fill out the data collection instrument? Always kept the burden of data collection as low as possible. Throughout our process, we attempted to be mindful of asking questions that would provide CDC with need-to-know information, not just things that would be nice to know. Furthermore, during the pilot test with states, we asked specific questions about feasibility. For example, we wanted to know if the information we were requesting was already collected by the program, or if it was easy to obtain or estimate. Furthermore, it was important for us to consider how long it would take to obtain the information and fill out the instrument. As is outlined in the feasibility standard, we wanted to keep the burden that was associated with this data collection as low as possible.

152 The Feasibility Standard in Action
The content of the data collection instrument was a careful balance between feasibility and utility. High priority data requiring higher burden was balanced by eliminating lower priority questions. Ultimately deciding on the content of this instrument was a careful balance between feasibility and utility. If a piece of information was high up on the list of the need-to-know items, then we might unfortunately need states to take on some additional data collection and reporting burden while, of course, finding other questions that we could eliminate to help balance out the burden. Change the slide.

153 The “Propriety” Standard
What steps need to be taken for the evaluation to be ethical? Program monitoring systems can be used to generate performance measures. The data collected could have significant implications. Understand that the information collected can affect the stakeholders’ programs. Program monitoring systems are tricky in the sense that the data collected through these systems ultimately can be used to generate performance measures or program indicators. So these are not just numbers. They imply meanings. And in particular, they carry with them a connotation of whether a state or a national program is doing well or doing poorly. Thus the information produced using the data collected through the instrument could have implications. As we discussed earlier, this consideration lies squarely under the standard of propriety. So we needed to be very aware that the information could affect programs associated with this monitoring system at both the state and the national level. Change the slide.

154 The Propriety Standard in Action
Used a highly collaborative process. Fostered open, honest dialogue about expectations and concerns. Remained highly responsive to stakeholder concerns. Modified instrument to address stakeholder concerns. As already described, we truly embarked on and carried out a highly collaborative process in which states and CDC staff were highly engaged. We attempted to create an atmosphere where individuals participating would feel free to share their expectations and their concerns. And most importantly, we tried to be highly responsive to any of the concerns that were expressed by the stakeholders and change the instrument content when it was needed. Change the slide.

155 The “Accuracy” Standard
What design will lead to accurate information? Be certain that the information collected is valid and reliable. If the information is inconsistent or irrelevant to the concept, reconsider whether to collect it. In the previous slide on propriety, I talked about how the data collected could have implications for state and national programs. Therefore, it was also important to give detailed consideration to how accurate the information collected through the instrument would be. We really needed to feel comfortable that the information produced would be valid and reliable. And if we discovered that a particular piece of information might not be provided consistently over time, or might not adequately represent the concept of interest, we really needed to consider whether it was worth collecting the information at all.

156 Balancing the Standards
When balancing feasibility vs. accuracy consider the effect on propriety. Here you can see the balance that occurs between the standards. So even if the information is feasible to obtain, it may not be accurate, and limited accuracy could have serious implications for programs, underscoring the importance of that propriety standard. Change the slide, please.

157 The Accuracy Standard in Action
Involved states in: Pilot tests and reviews. Discussions about “guessing”. Analyzing the quality of the information provided. Revising questions or collecting information that would help to improve accuracy. Decisions to discard some questions. So to help us examine the accuracy of the data collected, we asked states who pilot tested the instrument a number of questions, some of which you see on this slide. And we wanted to know where individuals were guessing. And we had discussions about where we felt a best guess was acceptable and where it absolutely was not. We also asked about the quality of information and ways that we could revise the instrument to collect information that would help us to improve the accuracy of the data provided. Of course, it isn’t always the case that an instrument can be adjusted in a manner that improves the accuracy of information, so we also had to consider just throwing some questions out. Change the slide.

158 What We Have Reviewed Today
The goal of evaluation and role of stakeholders. Who to engage as stakeholders. Why it is important to engage stakeholders. When and how to engage stakeholders. Stakeholder involvement across all steps of CDC framework. Stakeholder involvement in fulfilling the evaluation standards. Examples: Single and multiple sites Okay. So that’s the end of that example for the monitoring system. And here’s just a brief recap of what we have reviewed today. We talked about the goal of evaluation and the role of stakeholders, who to engage, why it’s important to engage stakeholders, when and how to engage them, how stakeholders can be involved across all of the steps in the CDC framework, as well as how they can be involved in fulfilling the standards. And then we provided some examples for both single and multiple sites. Next slide.

159 Thank You for Joining Us!
Leslie Fierro: Carlyn Orians: So thank you so much for joining us. We really hope that the information that we provided will be helpful to you in your future evaluation endeavors. And then we thought we would provide you with our contact information in case you have any additional questions that we may not have time to answer now.

