
1 Integrating Evaluation into the Design of Your Innovative Program. Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, US Environmental Protection Agency. Innovation Symposium, Chapel Hill, NC, Thursday, January 10, 2008

2 2 Workshop Outline  1. Introductions 2. Activity – Evaluation in Our Lives 3. Evaluation and its Evolution at EPA 4. Case Study – Product Stewardship in MN 5. Exercise – Integrating Evaluation in MN 6. Opportunities to Integrate Evaluation

3 3 Introductions  This will be an interactive workshop… so let’s interact! Get to know someone at your table. Tell us who they are, who they work with, and their New Year’s resolution.

4 4 Purpose of the Workshop  Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.

5 5 Evaluation In Our Lives  Activity: Name something in your life that you or someone else decided was worth measuring and evaluating. What was the context? Was there a target or goal… what was it? Who was the audience? How did you measure progress or success? How did you use what you learned?

6 6 Evaluation In Our Programs  What can we take from evaluation in our lives and apply to addressing environmental challenges? Measure what matters Evaluate for others and for ourselves  Integrating evaluation into program design Equal parts art and skill Performance management and quality evaluation are inseparable

7 7 Evaluation at EPA  Evaluation Support Division  ESD’s Mission: evaluate innovations; build EPA’s capacity to evaluate  Performance Management: an approach to accomplishing EPA goals and ESD’s mission

8 8 Performance Management  Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement and program evaluation. Logic Model: a tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes. Performance Measurement: helps you understand what level of performance is achieved by the program/project. Program Evaluation: helps you understand and explain why you’re seeing the program/project results.

9 9 Steps to Completing an Evaluation  I. Selecting a Program for Evaluation II. Identify Team/Develop Evaluation Plan III. Describe the Program IV. Develop Evaluation Questions V. Identify/Develop Measures VI. Design the Evaluation VII. Collect Information VIII. Analyze and Interpret Information IX. Develop the Report


11 Logic Model [diagram]: Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM). The HOW side of the model describes the program; the WHY side describes the results from the program. External conditions influence performance (+/-). Example line of logic from the slide: Me, Regimen, Juggling, Snodgrass, Training, Commitment, Victory.
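As an optional aside (not part of the original slides), a “line of logic” like the one in the diagram can be sketched as a simple ordered record whose fields mirror the model’s columns. This is a minimal illustration in Python; the class name, the field names, and the mapping of the slide’s example terms to particular columns are assumptions made for this sketch.

```python
from dataclasses import dataclass, asdict

@dataclass
class LogicModelLine:
    """One line of logic, read left to right: HOW (the program) to WHY (the results)."""
    resources: str     # resources/inputs
    activities: str
    outputs: str
    customers: str
    short_term: str    # short-term outcome
    intermediate: str  # intermediate outcome
    long_term: str     # longer-term outcome (strategic aim)

# The slide's illustrative example chain (the column assignment here is a guess)
example = LogicModelLine(
    resources="Me",
    activities="Regimen",
    outputs="Juggling",
    customers="Snodgrass",
    short_term="Training",
    intermediate="Commitment",
    long_term="Victory",
)

# Walk the line from HOW to WHY and print each component
for component, value in asdict(example).items():
    print(f"{component}: {value}")
```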

12 12 Performance Measurement  Definition: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures  Measures are designed to check the assumptions illustrated in the logic model

13 13 Measures Across the Logic Model Spectrum (element: definition; example measures)
Resources/Inputs: Measure of resources consumed by the organization. Examples: amount of funds, # of FTE, materials, equipment, supplies, etc.
Activities: Measure of work performed that directly produces the core products and services. Examples: # of training classes offered as designed; hours of technical assistance training for staff.
Outputs: Measure of products and services provided as a direct result of program activities. Examples: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
Customers Reached: Measure of the target population receiving outputs. Examples: % of target population trained; # of target population receiving technical assistance.
Customer Satisfaction: Measure of satisfaction with outputs. Examples: % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.
Outcomes: Accomplishment of program goals and objectives (short-term and intermediate outcomes, long-term outcomes/impacts). Examples: % increase in industry’s understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.
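Not from the original deck: the output-, customer-, and satisfaction-level measures in the table above are mostly simple counts and percentages. The sketch below shows, with entirely hypothetical numbers, how a program team might compute a few of them.

```python
# Hypothetical tallies only; not data from the MN demonstration program.

def percent(part: float, whole: float) -> float:
    """Return part/whole as a percentage, guarding against a zero denominator."""
    return 0.0 if whole == 0 else 100.0 * part / whole

target_population = 400       # e.g., number of retailers the program intends to reach (assumed)
trained = 250                 # customers reached: number of the target population trained
assistance_requests = 120     # technical assistance requests received
requests_responded_to = 110   # output: requests responded to
surveyed_customers = 100      # customers who answered a satisfaction survey
very_satisfied = 85           # customers reporting "very satisfied"

measures = {
    "% of target population trained": percent(trained, target_population),
    "% of assistance requests responded to": percent(requests_responded_to, assistance_requests),
    '% of customers "very satisfied" with assistance': percent(very_satisfied, surveyed_customers),
}

for name, value in measures.items():
    print(f"{name}: {value:.1f}%")
```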

14 14 Program Evaluation  Definition: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes, and why.  Orientations/Approaches to Evaluation: Accountability (external audience); Learning & Program Improvement (internal/external audiences)

15 15 Types of Evaluation [diagram]: Design Evaluation, Process Evaluation, Outcome Evaluation and Impact Evaluation, mapped across the logic model spectrum from Resources/Inputs, Activities, Outputs and Customers (HOW) to the short-term, intermediate and longer-term outcomes (WHY; STRATEGIC AIM).

16 16 Questions, Comments and Clarifications  Are there any questions or comments about what we have covered so far?

17 17 Environmental Evaluation: Evolving Theory and Practice  ESD is witnessing the shift from awareness to action  We are adapting to the increasing sophistication of our clients and to demands from stakeholders: capacity building, evaluations  Managing performance requires integrating evaluation into program design

18 18 Our Case Study  Our case study is representative of a trend toward more sophisticated evaluations of environmental programs  ESD is applying learning and adding to it as we take on more sophisticated projects  From here on, you are receiving information necessary to complete the exercises You are responsible for integrating evaluation into the program Ask questions and take notes!

19 19 Case Study: Paint Product Stewardship Initiative  Background on…  Current Status and Goals of PPSI  Minnesota Demonstration Program

20 20 Evaluating the Demonstration Program  What Will We Evaluate? Paint; Management Systems; Education; Markets; Cooperation?; Financing system?

21 21 Regional Draft Infrastructure  Why Are We Evaluating? Leadership; Legislation; Learning; Transfer

22 22 Evaluating the Demonstration Program  What will we evaluate? Paint, Management Systems, Education, Markets  Why are we evaluating the program? Leadership, Legislation, Learning, Transfer  Can we integrate evaluation into this project? We need a framework to follow…and we are building it as we go Initially, integrating evaluation into your program is a design and planning activity

23 Integrating Evaluation into Program Design

24 24 Questions, Comments and Clarifications  Take a few minutes to familiarize yourself with the mission, goals and objectives of the MN demonstration program

25 25 Exercise: Integrating Evaluation  Minnesota Demonstration Project and Performance Management We will introduce a process for integrating evaluation into the MN program We will use the process to integrate evaluation, step by step, into the design of the MN program  Logistics Your table is your group for the rest of the workshop After brief instruction, each team will complete each step of the process and report the results

26 Integrating Evaluation into Program Design [framework diagram]: Program (1. Team, 2. Mission, 3. Goals & Objectives, 4. Logic Model); Questions (1. Context, 2. Audience, 3. Communication, 4. Use); Measures (1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management); Documentation (1. Performance Management Policy, 2. Evaluation Methodology).
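An optional illustration, not part of the original slides: the four-phase framework above can be kept as a simple nested checklist that a team walks through in order. The Python sketch below only restates the slide’s structure; the variable name and printed layout are choices made for this example.

```python
# The "Integrating Evaluation into Program Design" framework, restated as a checklist.
framework = {
    "Program": ["Team", "Mission", "Goals & Objectives", "Logic Model"],
    "Questions": ["Context", "Audience", "Communication", "Use"],
    "Measures": [
        "Data Sources",
        "Collection Methods & Strategy",
        "Analysis Tools",
        "Data Collection",
        "Data Management",
    ],
    "Documentation": ["Performance Management Policy", "Evaluation Methodology"],
}

# Print the framework as a numbered checklist, phase by phase.
for phase, elements in framework.items():
    print(phase)
    for i, element in enumerate(elements, start=1):
        print(f"  {i}. {element}")
```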

27 Select and Describe the Program  The MN demonstration program is our program. Your table is the team that will build evaluation into the MN program.  Describing the MN program: Mission; Goals and objectives; Logic model (we are going to make one!)

28 Describe the Program: Logic Model  Instructions: Each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g., activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic. Template columns: Resources, Activities, Outputs, Customers, Short Term, Intermediate, Long Term Outcomes.

29 Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Use an outcome from your logic model to create your evaluation question

30 Evaluation Questions (1. Context, 2. Audience, 3. Communication, 4. Use)  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?

31 31 Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Use an outcome from your logic model to create your evaluation question.  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?

32 Performance Measures (1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management)  What can we measure to answer each question?  Where can we find the information for each measure?  How can we collect the information?  Given our questions and the information to be collected, what will be an effective collection strategy?

33 Performance Measures  What analytical tools will give us the most useful information?  How will we implement the collection strategy?  How will we manage the data?

34 34 Performance Measures  What can we measure to answer each question?  What methods are best suited for each measure?  What analytical tools will give us the most useful information?  Given our questions and information to be collected, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?

35 Documentation: Methodology & Policy  Evaluation Methodology: the process of integrating evaluation generates a framework for a methodology and an evaluability assessment  Performance Management Policy: applies across office programs and projects and guides strategy and planning

36 36 Check the Logic  Revisit the process and the decisions made  Look for the flow in the process and identify potential breaks  Identify potential obstacles to our approach to managing the performance of the MN demonstration program  The 1st cycle is integrating evaluation; the next cycle begins implementation

37 37 What is happening today with the PPSI?  MOU  Workgroups/committees  Minnesota demonstration project planning  Integrating evaluation into project design

38 38 Recap and Next Steps  Practice : Theory, an inconsistent ratio  Movement in the environmental community toward: Evidence, Effectiveness, Evaluation  Opportunities to merge theory and practice: Policy, Leadership, New programs, Capacity building efforts like this one

39 39 Thank You! Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation U.S. Environmental Protection Agency Matt Keene (202) 566-2240 Keene.Matt@epa.gov www.epa.gov/evaluate


43 43 Adaptive Management Cycle

44 44 Evaluation…In the Life of a Program  When to do it?  What are the obstacles?  Are there solutions?  Are there opportunities to improve evaluations in your shop?


46 46 Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Link your questions to a component in your line of the logic model  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?

47 47 Document Evaluation Policy and Methodology  Evaluation Policy  Evaluation Methodology

48 48 Performance Measures  What can we measure to answer each question?  What methods are best suited for each measure?  What analytical techniques could we use to maximize the rigor of our analysis?  Given the level of rigor desired, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?

49 49 Materials  Presentation  Flip charts  Markers  Projector  Laptop  Tape for flipchart paper  Post-its

50 50 Supporting documents from PPSI, etc.  MN MOU  MN Goals and Objectives and Tasks  Workplan  Logic Model

51 51 Performance Management Cycle [diagram elements]: Program Mission; Planning; Logic Model (conceptual framework); Performance Measurement (helps you understand what); Program Evaluation (helps you understand and explain why); Aggregate/Analysis; Adapt/Learn/Transfer. Note: the cycle needs adaptive management components like “implement.”

52 52 Steps to Integrating Evaluation into Program Design [diagram]: Select a Program; Identify a Team; Describe the Program (Needs, Mission, Goals & Objectives, Logic Model); Develop Questions (Context, Audiences, Communication, Use); Identify Measures (Methods, Analysis, Collection Strategy, Collection, Data Management); Document (Policy, Methodology).

53 Integrating Evaluation into Program Design [framework diagram]: Program (Team, Needs & Mission, Goals & Objectives, Logic Model); Questions (Context, Audience, Communication, Use); Measures (Methods, Analysis, Strategy, Collection, Data Management); Documentation (Performance Management Policy, Evaluation Methodology).

54 54 Program Management Cycle

55 55 Needs, Mission and Goals and Objectives  Mission  What drives the need for performance management?  Goals and Objectives

56 56 Logic Model  Each table gets a logic model template  Goals from the MN project represent long-term outcomes  Each table fills in the other components of the logic model  We’ll put the lines of logic together to form a complete-ish model

57 Integrating Evaluation into Program Design [framework overview]: Program, Questions, Measures, Documentation

58 58 Integrating Evaluation into Program Design [framework diagram]: Program (Team, Needs & Mission, Goals & Objectives, Logic Model); Questions (Context, Audience, Communication, Use); Measures (Data Sources, Methods & Strategy, Analysis Techniques, Collection, Data Management); Documentation (Performance Management Policy, Evaluation Methodology).

59 59 [framework diagram, repeated]: Program (1. Team, 2. Mission, 3. Goals & Objectives, 4. Logic Model); Questions (1. Audience, 2. Context, 3. Communication, 4. Use); Measures (1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management); Documentation (1. Performance Management Policy, 2. Evaluation Methodology).


