1 Introduction to Logic Modeling Workshop
National Environmental Partnership Summit, May 8, 2006
Presented by: Yvonne M. Watson, Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, U.S. Environmental Protection Agency

2 Presentation Goals
Enable participants to:
- Understand key logic modeling, performance measurement, and program evaluation terminology.
- Learn how to develop a logic model for their programs in preparation for developing and refining meaningful performance measures and conducting program evaluations.

3 Session Agenda
- Module 1: Developing a Logic Model
- Module 2: Building on Your Logic Models - The Bridge to Performance Measurement and Program Evaluation

4 Module 1: Developing a Logic Model

5 (image-only slide; no transcript text)

6 What is a Logic Model?
A logic model is a diagram and text that describes and illustrates the logical (causal) relationships among program elements and the problem to be solved, thus defining measurements of success.
We use these resources… for these activities… to produce these outputs… so that these customers can change their ways… which leads to these outcomes… leading to these results!
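
The chain above maps naturally onto a small data structure, which can help a team capture a draft model in text form. A minimal Python sketch, illustrative only: the field names follow the elements on slide 8, and the sample entries are abbreviated from the ESD training example that appears later in this deck.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One list per program element, read left to right (HOW) or right to left (WHY)."""
    resources: list[str] = field(default_factory=list)     # inputs that support the program
    activities: list[str] = field(default_factory=list)    # things the program does
    outputs: list[str] = field(default_factory=list)       # products/services produced
    customers: list[str] = field(default_factory=list)     # target audience reached
    short_term: list[str] = field(default_factory=list)    # changes in knowledge/attitude
    intermediate: list[str] = field(default_factory=list)  # changes in behavior/practice
    long_term: list[str] = field(default_factory=list)     # changes in condition
    external_influences: list[str] = field(default_factory=list)  # factors outside your control (+/-)

# Abbreviated from the ESD training example shown later in this deck:
esd = LogicModel(
    resources=["5 ESD staff", "$65K extramural funds"],
    activities=["Develop training materials", "Deliver training"],
    outputs=["PE training materials", "EPA managers and staff complete training"],
    customers=["HQ/Regional managers and staff", "States"],
    short_term=["Knowledge of program evaluation increased"],
    intermediate=["Program evaluation skills used in the work environment"],
    long_term=["Evaluation culture established at EPA"],
    external_influences=["Reduction in budget available for training"],
)
```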

7 Logic Model (diagram)
Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM)
Reading left to right answers HOW; reading right to left answers WHY. The elements through customers constitute the PROGRAM; the outcomes are RESULTS FROM the PROGRAM. EXTERNAL CONDITIONS INFLUENCING PERFORMANCE (+/−) act on the whole chain.

8 Elements of the Logic Model
- Resources/Inputs: Programmatic investments available to support the program.
- Activities: Things you do; activities you plan to conduct in your program.
- Outputs: Product or service delivery/implementation targets you aim to produce.
- Customers: Users of the products/services; the target audience the program is designed to reach.
- Outcomes:
  - Short-term (attitudes): Changes in learning, knowledge, attitude, skills, understanding.
  - Intermediate (behavior): Changes in behavior, practice, or decisions.
  - Long-term (condition): Change in condition.
- External influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.
(Resources through customers describe HOW the program works; outcomes are the RESULTS FROM the PROGRAM and answer WHY.)

9 Types of Program Elements (exercise)
Identify the type of program element for each example:
1. 5 Evaluation Support Division staff: _____________
2. Develop and deliver training material: _____________
3. Program evaluation training materials: _____________
4. EPA managers and staff complete training: _____________
5. EPA HQ and Regional staff, States: _____________

10 Types of Program Elements (exercise, continued)
6. Knowledge of program evaluation increased: __________________
7. Customers equipped with skills to manage and conduct an evaluation: __________________
8. Number of evaluations managed and conducted increased: __________________
9. Program evaluation skills are used by customers in the workplace: __________________
10. Quality of evaluations managed and conducted is improved: __________________
11. Evaluation culture is established at EPA: __________________

11 ESD Training Logic Model (diagram)
Goal: To provide training to enable our EPA partners to more effectively conduct and manage program evaluations and analyses and develop performance measures that can be used to improve their programs and demonstrate environmental results.
- Resources: ESD staff (Y. Watson, M. Mandolia, J. Heffelfinger, D. Bend, C. Kakoyannis); access to John McLaughlin; Strategic Plan.
- Activities: Develop and design PE, PM, IA, and Logic Model curriculum and exercises; deliver PE, PM, IA, and Logic Model training; provide technical assistance for workshop/training attendees; provide guidance for Environmental Results Grants Training; facilitate train-the-trainer sessions for PE, PM, and logic modeling.
- Outputs: PE, PM, IA, Logic Model, and Environmental Results Grants Training materials; customers and partners complete training; technical assistance delivered.
- Customers: NCEI staff, IAC staff, PEN, PEC winners, HQ/Regional managers and staff, SIG recipients, States/Tribes, SBAP, CARE; partners (OCFO, OW, OSWER, ORD, OARM); EPA Project Officers and Grant Managers.
- Short-term outcomes: Knowledge of PE, PM, and logic modeling increased/improved; customers equipped with skills to manage and conduct evaluations, develop measures, and develop logic models of their programs; customers' understanding of their programs is improved.
- Intermediate outcomes: PE and PM skills are used by customers in the work environment; number of evaluations conducted and managed increased; number of staff developing measures increased; customers use logic models to help conduct evaluations and develop measures; partners deliver PE, PM, and logic model training to their clients/customers; EPA POs and GMs recognize outputs/outcomes in grant proposals.
- Long-term outcomes: Quality of evaluations managed and conducted is improved; quality of measures developed and reported is improved; customers use program evaluation regularly and systematically to improve environmental programs in terms of environmental and health outcomes, reduced costs, cost effectiveness, EJ benefits, public involvement, and efficiency; environmental programs more effectively achieve their strategic goals.

12 What are Logic Models Used For?
Staff and managers can use logic models to:
- Develop program/project design
- Identify and develop performance measures for their program/project
- Support strategic planning
- Communicate the priorities of the program/project
- Focus on key evaluation questions

13 Benefits of Logic Modeling
- Illustrates the logic or theory of the program or project.
- Focuses attention on the most important connections between actions and results.
- Builds a common understanding among staff and with stakeholders.
- Helps staff “manage for results” and informs program design.
- Finds “gaps” in the logic of a program so staff can work to resolve them.

14 Steps in the Logic Model Process
1. Establish a team or work group and collect documents.
2. Define the problem and context for the program or project and determine what aspect of your program/project you will model.
3. Define the elements of the program in a table.
4. Verify the logic table with stakeholders.
5. Develop a diagram and text describing logical relationships.
6. Verify the logic model with stakeholders. Then use the logic model to identify and confirm performance measures and to support planning and evaluation.

15 Step 1: Establish a team/workgroup and collect documents and information
- Convene/consult a team or workgroup:
  - Provides different perspectives and knowledge.
  - Helps reach agreement on program performance expectations.
- Review sources of program or project documentation: strategic and operational plans, budget requests, current metrics, past evaluations.
- Conduct interviews of appropriate staff.

16 Step 2: Define the problem the program addresses and the context the program operates in
- Problem or issue statement: What problem(s) is the program/project attempting to solve, or what issue(s) will it address?
- Community needs: What specific need of the community is the program attempting to address?
- Program niche: What assets or activities make your organization uniquely qualified to address the problem?
- Context/external factors: Are there factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project? What are the drivers of and constraints on success?
- Assumptions: State the assumptions behind how and why you believe your approach to the problem will work.

17 Step 2 (ESD example)
- Problem: Poor quality of performance measures developed and insufficient number and quality of evaluations conducted at EPA, resulting in program inefficiencies, ineffectiveness, and inability to achieve strategic goals.
- Community need: Funding and skills to develop measures and conduct evaluations.
- Program niche: The Evaluation Support Division provides logic modeling, measurement, and evaluation training, technical assistance, and train-the-trainer sessions.
- External factors: Drivers of success: GPRA, PART, Grants Order 5700.7. Constraints on success: FTE and funding.
- Assumptions: The target audience uses measurement and evaluation skills.
- Outcomes: Quality of evaluations managed and conducted is improved; quality of measures developed and reported is improved; customers use evaluation and measurement regularly and systematically to improve environmental programs; environmental programs more effectively achieve their strategic goals.

18 Step 3: Define the elements of the program or project in a table
Lay out one column per element, left to right: Resources/Inputs | Activities | Outputs | Customers reached | Outcomes: short-term (change in attitude), intermediate (change in behavior), long-term (change in condition).
Also record external influences and how program outcomes relate to those factors. The left-hand columns capture HOW and WHO; the outcome columns capture WHAT and WHY. (A sketch for rendering this table follows this slide.)
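
Since the step 3 table has one column per logic model element, it can be rendered directly from a structure like the hypothetical LogicModel sketch shown after slide 6. An illustrative helper, assuming that class:

```python
# Column order and headers for the step 3 table (headers from this slide).
COLUMNS = [
    ("resources", "Resources/Inputs"),
    ("activities", "Activities"),
    ("outputs", "Outputs"),
    ("customers", "Customers reached"),
    ("short_term", "Short-term outcome (change in attitude)"),
    ("intermediate", "Intermediate outcome (change in behavior)"),
    ("long_term", "Long-term outcome (change in condition)"),
]

def print_logic_table(model: LogicModel) -> None:
    """Print each element category as a labeled column, plus external influences."""
    for attr, header in COLUMNS:
        print(header)
        for entry in getattr(model, attr) or ["(to be filled in)"]:
            print(f"  - {entry}")
    print("External influences:")
    for entry in model.external_influences or ["(to be filled in)"]:
        print(f"  - {entry}")

print_logic_table(esd)
```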

19 Step 4: Verify the logic table with stakeholders (ESD example)
- Resources/Inputs: 5 ESD staff; $65K extramural funds; outside consultant.
- Activities: Develop training materials; deliver training.
- Outputs: PE training materials; innovation training materials; EPA managers and staff complete training.
- Customers reached: NCEI staff, IAC staff, PEN, PEC '04, OSWER, OW, States, HQ/Regional managers and staff.
- Short-term outcomes (change in attitude): Knowledge of program evaluation increased; customers equipped with skills to manage and conduct evaluations.
- Intermediate outcomes (change in behavior): Number of evaluations conducted and managed increased; program evaluation skills are used by customers in the work environment.
- Long-term outcomes (change in condition): Evaluation culture established; quality of evaluations managed and conducted is improved.
- External influences: Reduction in budget available to provide training to customers.

20 Step 5: Develop a diagram and text describing logical relationships
- Draw arrows to link the causal relationships between the logic model elements.
- Limit the number of arrows; show only the most critical feedback loops.
- Work from both directions (right-to-left and left-to-right); a sketch of this questioning follows this slide:
  - Ask “How-Why” questions: start with outcomes and ask “How?”; start at activities and ask “Why?”
  - Ask “If-Then” questions: start at activities and move toward outcomes asking, “If this, then that?”
Remember: We use these resources… for these activities… to produce these outputs… so that these customers can change their ways… which leads to these outcomes… leading to these results!
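
The “How-Why” and “If-Then” questioning is systematic enough to generate review prompts by walking adjacent element categories. A rough sketch, again assuming the hypothetical LogicModel structure from earlier:

```python
# Element categories in causal order, matching the LogicModel fields above.
CHAIN = ["resources", "activities", "outputs", "customers",
         "short_term", "intermediate", "long_term"]

def link_questions(model: LogicModel) -> list[str]:
    """For each adjacent pair of categories, generate the left-to-right
    'If-Then' prompt and the right-to-left 'How?' prompt for review."""
    prompts = []
    for left, right in zip(CHAIN, CHAIN[1:]):
        for x in getattr(model, left):
            for y in getattr(model, right):
                prompts.append(f"If '{x}', then '{y}'?")
                prompts.append(f"How is '{y}' achieved? (via '{x}'?)")
    return prompts

for prompt in link_questions(esd)[:4]:
    print(prompt)
```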

21 Step 6: Verify the logic with stakeholders
- Seek review from the same, or an even broader, group of stakeholders.
- Compare the model to what units in the organization actually do and define their contributions to the outcomes.
- Check the logic against reality.

22 Questions to Verify Your Logic Model
- Is the program's outcome structure described, and is it logical?
- Are the outcomes in the right order, making clear the cause-effect relationships between outcomes?
- If the short-term (first-order) outcomes are achieved, will they lead, at least in part, to achieving the intermediate (second-order) outcomes?
  - What else, if anything, has to happen to enable the full realization of the intermediate outcomes?
- If the intermediate outcomes are achieved, will they result in the predicted changes in the long-term (third-order) outcomes?
  - What else, if anything, has to happen to enable the full realization of the longer-term outcomes?

23 Questions to Verify Your Logic Model (continued)
- Are the program's customers described, and are they the right customers, given the outcomes?
- Are there other customers that need to be reached if the outcomes (short-term, intermediate, or longer-term) are to be achieved?
- Are the program's major resources, processes, and outputs described, and are they logically consistent and sufficient to achieve the outcomes?
- Could activities be consolidated into strategies to make the presentation clearer?
- Are the program's partners and external factors described?

24 Courtney and Bailey Peter’s Model: A Safe Place to Play

25 Lead a Great Life

26 ORD Is Developing Outcome-Oriented Performance Indicators (diagram)
Programs are implemented and managed from LEFT to RIGHT: Resources → Research Activities → Research Outputs → Effective Transfer → Specific Clients → Short-Term Outcomes → Intermediate Outcomes → Long-Term Outcomes → Strategic Objectives → Strategic Goals, with Annual Performance Measures tracking research program results for clients.
1. We use resources ($, FTE, and infrastructure), partnerships, and guidance from stakeholders…
2. …to coordinate and conduct the research, development, and administrative activities that will be needed to answer key research questions, and…
3. …to create the research knowledge, tools, processes, technologies, etc., and the key outputs that must be developed over time…
4. …for intended clients (such as senior scientists or decision makers in government, the regulated community, or the public)…
5. …and to prepare for the effective transfer of key research outputs that will lead to…
6. …client reactions and changes in knowledge, attitudes, skills, or aspirations…
7. …resulting in changes in client decisions or actions…
8. …which contribute to measurable changes in environmental contaminants, stressors, or exposures…
9. …to demonstrate measurable long-term improvements in human or ecosystem health.
*Performance Goals include key research outputs, a synthesis product, and plans for effective transfer to intended clients to bring about short-term outcomes. The synthesis product addresses and serves to answer a key research question linked to outcomes, and compares the accomplishment represented by the Performance Goal to baseline conditions and to related goals in future years needed to achieve outcomes.

27 Logic Model: Laboratory-Strengthening Component of EPA's Safe Drinking Water Program in Central America
Mission: Improve drinking water quality by strengthening the capacity of institutions responsible for providing safe drinking water to produce data of known quality in targeted rural and key urban/peri-urban areas of El Salvador, Nicaragua, and Honduras.
- Resources: EPA (OIA, ORD, OW, Regions 2, 9, and 10); partners (in-country water utilities, Pan American Health Organizations, Ministries of Health (MOH), USAID).
- Activities: Assess laboratory capabilities and training and equipment needs; develop and deliver comprehensive laboratory training courses and reference materials; procure and deliver laboratory equipment to MOH and water labs.
- Outputs: Reports on training and equipment needs; training workshops conducted; training and reference materials in Spanish; laboratory equipment delivered.
- Customers: Laboratory water professionals and personnel; water quality decision-makers (such as in MOH).
- Short-term outcomes: Water professionals have increased knowledge of standard operating procedures for data handling, analysis, and equipment use; water quality decision-makers view analytical lab data more favorably.
- Intermediate outcomes: Reliable analytical data is produced by laboratories; an increased number of laboratories achieve accreditation; quality analytical data is used to make decisions and enforce environmental laws.
- Long-term outcome: Improved drinking water quality.
- External influences affecting program effectiveness: absence of a strong drinking water regulatory framework; strict two-year timeline; occurrence of natural disasters.

28 RCRA Recycling Logic Model
- Resources: Staff, $, Regions, States, trade associations, individual companies, OGC/OECA.
- Activities: Promulgate rulemaking; policy interpretation; outreach.
- Outputs: Rules; guidance letters; DSW network calls and website.
- Customers: Generators; recyclers; States; States/Regions.
- Outcomes (short-term through long-term): awareness of rule (states and industry); exclusion is available; relevant states pick up rule; firm(s) lobby state to pick up rule; Regions authorize states for rule; awareness by industry; favorable change in recycling costs; change in stigma; increased recycling (desire to recycle and capacity); less hazardous waste; conserved resources; better environment and healthy public.
- External influences: market forces (T&D costs), number of recyclers, markets for recycled products, technology, bureaucratic process, politics, state budgets, PR, public goodwill.

29 ESD Training Logic Model (repeat of slide 11)

30 “Z” Logic (diagram)
- Program A: Resources → Action A → Outputs
- Program B: Resources → Action B → Outputs
- Program C: Resources → Action C → Outcomes
Together, the linked programs produce strategic program results.

31 Energy RD&D Program Using “Z” Logic (diagram)
(Source: McLaughlin and Jordan, 1999. Columns: Resources | Activities | Outputs | For Customers | Short-term outcomes | Intermediate outcomes | Long-term outcomes.)
- Perform research: program $ and staff; ideas for technology change; potential for technology change documented; for industry researchers; leads to applications in energy technologies.
- Develop technology: added resources; lab prototype report; technology available for commercialization; leads to commercial prototype.
- Deploy technology: added resources; policies, incentives, information; early adopters express desire to buy; leads to knowledge and less perceived risk.
- Produce technology and educate market: commercial $ and staff; manufacture the technology in market; for buyers of that technology; leads to technology accepted and purchased.
- Consequences of use (for users and manufacturers): lower energy costs and emissions; competitive economy, cleaner environment (shared responsibility).
External influences: price of oil and electricity, economic growth in industry and in general, perception of risk of global climate change and need for national energy security, market and technology assumptions.

32 Exercise 1: Logic Modeling
Brief application of logic modeling.

33 Exercise 2: Logic Modeling
Developing your own logic model.

34 Module 2: Building on Your Logic Model – The Bridge to Performance Measurement and Program Evaluation

35 Drivers for Performance Measurement and Program Evaluation
- Good program management.
- Government Performance and Results Act (GPRA) of 1993: Requires EPA to report schedules for and summaries of program evaluations that have been or will be conducted, and to identify those that influence development of the Agency's Strategic Plan.
- OMB's Program Assessment Rating Tool (PART): A tool designed to assess and evaluate programs across the government (40 EPA programs are scheduled for PART assessment through 2008).
- Environmental Results Order 5700.7: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA's strategic plan.

36 Definitions
Performance measurement: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
- Performance measure: a metric used to gauge program or project performance.
- Indicators: measures, usually quantitative, that provide information on program performance and evidence of a change in the “state or condition” of the system.

37 Definitions
Program evaluation: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes, and why.
Program evaluation consists of various activities:
- Needs assessment
- Design assessment
- Process/implementation evaluation
- Evaluability assessment
- Outcome and impact evaluation

38 Orientations/Approaches to Measurement and Evaluation
- Program evaluation orientation: accountability (audit) or learning and program improvement.
  - What outcomes have been achieved, and why?
  - What aspects of my program led to these outcomes?
  - What role did context play in my outcomes?
- Performance measurement orientation: accountability.

39 Differences Between Measurement and Evaluation
Performance measurement:
- Ongoing monitoring and reporting of accomplishments.
- Examines achievement of program objectives.
- Describes program achievements in terms of outputs and outcomes in a given time period against a pre-established goal.
- Provides early warning to management.
Program evaluation:
- In-depth, systematic study conducted periodically or on an ad hoc basis.
- Examines a broader range of information on program performance than is feasible to monitor on an ongoing basis.
- Explains why the results occurred.
- Provides a longer-term review of effectiveness.

40 Relationship Between Measurement and Evaluation
- Performance measurement data provide the information needed to conduct an evaluation and assess program performance.
- A lack of performance measurement data is a major obstacle to conducting an evaluation.

41 What Can Measurement and Evaluation Do for You?
- Ensure program goals and objectives are being met.
- Determine whether allocated resources are yielding the greatest environmental benefit.
- Identify what works well, what does not, and why.
- Identify program areas that need improvement.

42 Logic Model (diagram repeated from slide 7)

43 Types of Measures
- Resources/Inputs. Definition: resources consumed by the organization. Examples: amount of funds; # of FTE; materials, equipment, supplies, etc.
- Activities. Definition: the work performed that directly produces the core products and services. Examples: # of training classes offered as designed; hours of technical assistance training for staff.
- Outputs. Definition: products and services provided as a direct result of program activities. Examples: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
- Customers reached. Definition: measure of the target population receiving outputs. Examples: % of target population trained; # of target population receiving technical assistance.
- Customer satisfaction. Definition: measure of satisfaction with outputs. Examples: % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.
- Outcomes. Definition: accomplishment of program goals and objectives (short-term and intermediate outcomes, long-term outcomes/impacts). Examples: % increase in industry's understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.

44 Work Quality Measures
- Efficiency. Definition: measure that relates outputs to costs. Examples: cost per workbook produced; cost per inspection conducted.
- Productivity. Definition: measure of the rate of production per specific unit of resource (e.g., staff or employee); the focus is on labor productivity. Example: number of enforcement cases investigated per inspector.
- Cost effectiveness. Definition: measure that relates outcomes to costs. Examples: cost per pound of pollutants reduced; cost per mile of beach cleaned.
- Service quality. Definition: measure of the quality of products and services produced. Example: percent of technical assistance requests responded to within one week.
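
Each work quality measure is a ratio, so it can be computed directly once the relevant counts and costs are tracked. A minimal sketch mirroring the examples in the list above; every number here is invented for illustration:

```python
# Hypothetical figures for one reporting period (all invented for illustration).
total_cost = 50_000.00              # dollars spent by the program
workbooks_produced = 200            # an output count
inspectors = 4                      # the resource unit for productivity
cases_investigated = 48             # output attributed to those inspectors
pounds_pollutant_reduced = 10_000   # an outcome count
requests_total = 40                 # technical assistance requests received
requests_within_week = 34           # requests answered within one week

efficiency = total_cost / workbooks_produced                # output relative to cost
productivity = cases_investigated / inspectors              # output per unit of labor
cost_effectiveness = total_cost / pounds_pollutant_reduced  # outcome relative to cost
service_quality = requests_within_week / requests_total     # quality of service delivered

print(f"Efficiency:         ${efficiency:.2f} per workbook produced")
print(f"Productivity:       {productivity:.1f} cases investigated per inspector")
print(f"Cost effectiveness: ${cost_effectiveness:.2f} per pound of pollutant reduced")
print(f"Service quality:    {service_quality:.0%} of requests answered within one week")
```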

45 ESD Training Performance Measures
Logic model elements paired with example measures:
- Resources (ESD staff: Y. Watson, M. Mandolia, J. Heffelfinger, D. Bend, C. Kakoyannis; access to consultant John McLaughlin; $65,000) → resources expended per year.
- Activities (develop and design PE, PM, IA, and Logic Model curriculum and exercises; deliver PE, PM, IA, and Logic Model training; provide technical assistance for workshop/training attendees) → # of training courses designed; # of trainings delivered per category.
- Outputs (PE, PM, IA, and logic modeling materials; customers complete training; technical assistance delivered) → # of customers completing training.
- Customers reached (NCEI staff, IAC staff, PEN, PEC winners, HQ/Regional staff, SIG recipients, States, Tribes, SBAP, CARE) → #/% trained by category; level of satisfaction with training.
- Short-term outcomes (knowledge of LM, PE, PM, and IA increased/improved; customers equipped with skills to manage and conduct evaluations and develop performance measures; customers' understanding of their program improves) → #/% of EPA staff/partners reporting an increase in knowledge about LM, PE, and PM.
- Intermediate outcomes (PE, PM, and LM skills are used by customers in the work environment; evaluations conducted and managed increased; staff developing measures increased) → # of customers that have developed a PE or PM plan, logic model, or measures, or conducted an evaluation.
- Long-term outcomes (quality of evaluations managed and conducted is improved; quality of measures developed and reported improved; environmental programs more effectively achieve their strategic goals) → increase in number of peer-reviewed evaluations conducted Agency-wide.
Additional measures. Efficiency: cost per workshop/training delivered; average prep time per workshop. Productivity: hours of training per full-time equivalent.

46 Types of Evaluation Questions as They Fit into the Logic Model Process (diagram)
Evaluation questions map onto the chain: NEEDS questions precede Resources/Inputs; questions about Resources/Inputs → Activities → Outputs → Customers run along the HOW side; Outcomes and Impact questions cover the short-term, intermediate, and longer-term outcomes (STRATEGIC AIM) on the WHY side.
Source: Evaluation Dialogue Between OMB and Federal Evaluation Leaders: Digging a Bit Deeper into Evaluation Science, April 2005.

47 Common Evaluation Questions Asked at Different Stages of Program Development
Program design:
1. Needs assessment: What are the dimensions of the problem and the resources available to address it?
2. Design assessment: Is the design of the program well formulated, feasible, and likely to achieve the intended goals?
Early stage of program or new initiative within a program:
1. Process evaluation or implementation assessment: Is the program being delivered as intended to the targeted recipients? Is the program well managed? What progress has been made in implementing new provisions?
2. Outcome monitoring or evaluation: Are the early program outcomes observed desirable?
Mature, stable program with a well-defined program model:
1. Evaluability assessment: Is the program ready for an outcome or impact evaluation?
2. Process evaluation: Why is a program no longer obtaining desired outcomes?
3. Outcome monitoring or evaluation: Are desired program outcomes obtained? Did the program produce unintended side effects?
4. Net impact evaluation: Did the program cause the desired impact? Is one approach more effective than another in obtaining the desired outcomes?
Adapted from Evaluation Dialogue Between OMB and Federal Evaluation Leaders: Digging a Bit Deeper into Evaluation Science, April 2005.

48 Evaluation Questions Across the Performance Spectrum
- Resources/Inputs (we use these): Do we have enough, the right kind, the necessary level, the consistency? Why or why not?
- Activities/Outputs (to do these things): Are we doing things the right way, the way we say we should? Are we producing products and services at the levels anticipated, and according to anticipated quality indicators? Why or why not?
- Target customers (for these people): Are we reaching the customers targeted? Are we reaching the anticipated numbers? Are they satisfied? Why or why not?
- Short-term outcomes: Did the customers' understanding, knowledge, skills, or attitudes change? What evidence do we have that the program caused the changes?
- Intermediate outcomes: Are customers using the information, knowledge, skills, or attitude change as expected? With what results? Are the customers served changing behaviors/practices in the expected direction and at the expected level? If so, what did we do to cause the behavior change?
- Long-term outcomes: What changes in condition (environment) have occurred? Why?
- External conditions: What factors might influence my program's success?

49 Contacts
Yvonne M. Watson
Evaluation Support Division
National Center for Environmental Innovation
U.S. Environmental Protection Agency
(202) 566-2339
watson.yvonne@epa.gov

