Integrating Evaluation into the Design of Your Innovative Program. Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, U.S. Environmental Protection Agency

Presentation transcript:

Integrating Evaluation into the Design of Your Innovative Program Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation US Environmental Protection Agency Innovation Symposium Chapel Hill, NC Thursday, January 10, 2008

2 Workshop Outline 1. Introductions 2. Activity – Evaluation in Our Lives 3. Evaluation and its Evolution at EPA 4. Case Study – Product Stewardship in MN 5. Exercise – Integrating Evaluation in MN 6. Opportunities to Integrate Evaluation

3 Introductions  This will be an interactive workshop… so let’s interact! Get to know someone at your table. Tell us: who they are, who they work with, and their New Year’s resolution.

4 Purpose of the Workshop  Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.

5 Evaluation In Our Lives  Activity Name something in your life that you or someone else decided was worth measuring and evaluating. What was the context? Was there a target or goal…what was it? Who was the audience? How did you measure progress or success? How did you use what you learned?

6 Evaluation In Our Programs  What can we take from evaluation in our lives and apply to addressing environmental challenges? Measure what matters Evaluate for others and for ourselves  Integrating evaluation into program design Equal parts art and skill Performance management and quality evaluation are inseparable

7 Evaluation In The EPA  Evaluation Support Division  ESD’s Mission Evaluate innovations Build EPA’s capacity to evaluate  Performance Management An approach to accomplishing EPA goals and ESD’s mission

8 Performance Management
Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement and program evaluation.
Logic Model: Tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.
Performance Measurement: Helps you understand what level of performance is achieved by the program/project.
Program Evaluation: Helps you understand and explain why you’re seeing the program/project results.

9 Steps to Completing an Evaluation
I. Select a Program for Evaluation
II. Identify the Team / Develop the Evaluation Plan
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify/Develop Measures
VI. Design the Evaluation
VII. Collect Information
VIII. Analyze and Interpret Information
IX. Develop the Report
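The nine steps form an ordered checklist. As a rough illustration only (not an EPA tool), here is a short Python sketch of tracking progress through them, with step names paraphrased from the slide:

```python
from __future__ import annotations
from enum import Enum

class EvaluationStep(Enum):
    """The nine steps from the slide, in order."""
    SELECT_PROGRAM = 1        # I. Select a program for evaluation
    IDENTIFY_TEAM = 2         # II. Identify team / develop evaluation plan
    DESCRIBE_PROGRAM = 3      # III. Describe the program
    DEVELOP_QUESTIONS = 4     # IV. Develop evaluation questions
    IDENTIFY_MEASURES = 5     # V. Identify/develop measures
    DESIGN_EVALUATION = 6     # VI. Design the evaluation
    COLLECT_INFORMATION = 7   # VII. Collect information
    ANALYZE_INTERPRET = 8     # VIII. Analyze and interpret information
    DEVELOP_REPORT = 9        # IX. Develop the report

def next_step(completed: set[EvaluationStep]) -> EvaluationStep | None:
    """Return the earliest step not yet completed, or None when all are done."""
    for step in EvaluationStep:  # Enum members iterate in definition order
        if step not in completed:
            return step
    return None

if __name__ == "__main__":
    done = {EvaluationStep.SELECT_PROGRAM, EvaluationStep.IDENTIFY_TEAM}
    print(next_step(done))  # EvaluationStep.DESCRIBE_PROGRAM
```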

Logic Model
[Diagram: Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM), connected by HOW and WHY arrows marking results from the program; external conditions influence performance (+/-). Example row from the slide: Me, Regimen, Juggling, Snodgrass, Training, Commitment, Victory.]
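To make the chain concrete, a line of logic can be written down as a simple record. This is a minimal sketch, assuming Python; it reuses the juggling example from the slide, and the field-to-item mapping is a best guess at the flattened diagram, not an official template:

```python
from dataclasses import dataclass, asdict

@dataclass
class LineOfLogic:
    """One row of a logic model, read across the HOW/WHY chain."""
    resources: list[str]     # inputs consumed by the program
    activities: list[str]    # work performed with those resources
    outputs: list[str]       # products and services produced
    customers: list[str]     # who receives the outputs
    short_term: list[str]    # short-term outcomes
    intermediate: list[str]  # intermediate outcomes
    long_term: list[str]     # longer-term outcome (strategic aim)

# The personal example from the slide, reconstructed as best as possible.
juggling = LineOfLogic(
    resources=["Me"],
    activities=["Regimen"],
    outputs=["Juggling"],
    customers=["Snodgrass"],
    short_term=["Training"],
    intermediate=["Commitment"],
    long_term=["Victory"],
)

print(asdict(juggling))
```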

12 Performance Measurement  Definition: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures  Measures are designed to check the assumptions illustrated in the logic model

13 Measures Across the Logic Model Spectrum (Element / Definition / Example Measure)
Resources/Inputs: Measure of resources consumed by the organization. Example: amount of funds, # of FTE, materials, equipment, supplies, etc.
Activities: Measure of work performed that directly produces the core products and services. Example: # of training classes offered as designed; hours of technical assistance training for staff.
Outputs: Measure of products and services provided as a direct result of program activities. Example: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
Customers Reached: Measure of the target population receiving outputs. Example: % of target population trained; # of target population receiving technical assistance.
Customer Satisfaction: Measure of satisfaction with outputs. Example: % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.
Outcomes: Accomplishment of program goals and objectives (short-term, intermediate, and long-term outcomes/impacts). Example: % increase in industry’s understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.
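Several of the example measures in the table are simple ratios. The following is a minimal sketch of computing two of them, using invented counts purely for illustration:

```python
def percent(part: float, whole: float) -> float:
    """Return part/whole as a percentage, guarding against a zero denominator."""
    return 0.0 if whole == 0 else 100.0 * part / whole

# Hypothetical counts for a training program (not real program data).
target_population = 400
trained = 260
survey_responses = 85
very_satisfied = 51

print(f"% of target population trained: {percent(trained, target_population):.1f}%")
print(f"% of customers 'very satisfied': {percent(very_satisfied, survey_responses):.1f}%")
```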

14 Program Evaluation  Definition: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes, and why.  Orientations/Approaches to Evaluation: Accountability (external audience); Learning & Program Improvement (internal/external audiences)

15 Types of Evaluation
[Diagram mapping evaluation types onto the logic model: Design Evaluation, Process Evaluation, Outcome Evaluation, and Impact Evaluation span the components Resources/Inputs, Activities, Outputs, Customers, Short-term, Intermediate, and Longer-term outcome (STRATEGIC AIM), with the HOW/WHY reading of the chain.]

16 Questions, Comments and Clarifications  Are there any questions or comments about what we have covered so far?

17 Environmental Evaluation: Evolving Theory and Practice  ESD is witnessing the shift from awareness to action  We are adapting to the increasing sophistication of our clients and demands from stakeholders: capacity building; evaluations  Managing performance requires integrating evaluation into program design

18 Our Case Study  Our case study is representative of a trend toward more sophisticated evaluations of environmental programs  ESD is applying learning and adding to it as we take on more sophisticated projects  From here on, you are receiving information necessary to complete the exercises: you are responsible for integrating evaluation into the program. Ask questions and take notes!

19 Case Study: Paint Product Stewardship Initiative  Background on…  Current Status and Goals of PPSI  Minnesota Demonstration Program

20 Evaluating the Demonstration Program  What Will We Evaluate? Paint; Management Systems; Education; Markets; Cooperation?; Financing system?

21 Regional Draft Infrastructure  Why Are We Evaluating? Leadership; Legislation; Learning; Transfer

22 Evaluating the Demonstration Program  What will we evaluate? Paint, Management Systems, Education, Markets  Why are we evaluating the program? Leadership, Legislation, Learning, Transfer  Can we integrate evaluation into this project? We need a framework to follow…and we are building it as we go. Initially, integrating evaluation into your program is a design and planning activity.

Integrating Evaluation into Program Design

24 Questions, Comments and Clarifications  Take a few minutes to familiarize yourself with the mission, goals and objectives of the MN demonstration program

25 Exercise: Integrating Evaluation  Minnesota Demonstration Project and Performance Management We will introduce a process for integrating evaluation into the MN program We will use the process to integrate evaluation, step by step, into the design of the MN program  Logistics Your table is your group for the rest of the workshop After brief instruction, each team will complete each step of the process and report the results

Integrating Evaluation into Program Design
Program: 1. Team 2. Mission 3. Goals & Objectives 4. Logic Model
Questions: 1. Context 2. Audience 3. Communication 4. Use
Measures: 1. Data Sources 2. Collection Methods & Strategy 3. Analysis Tools 4. Data Collection 5. Data Management
Documentation: 1. Performance Management Policy 2. Evaluation Methodology
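Read as a planning checklist, the framework has four stages with ordered sub-steps. A minimal sketch of that structure as plain Python data, which a team could use to see which design decisions remain open; the stage and step names are copied from the diagram, everything else is illustrative:

```python
from __future__ import annotations

FRAMEWORK = {
    "Program": ["Team", "Mission", "Goals & Objectives", "Logic Model"],
    "Questions": ["Context", "Audience", "Communication", "Use"],
    "Measures": ["Data Sources", "Collection Methods & Strategy",
                 "Analysis Tools", "Data Collection", "Data Management"],
    "Documentation": ["Performance Management Policy", "Evaluation Methodology"],
}

def outstanding(completed: dict[str, set[str]]) -> list[str]:
    """List 'Stage: Step' entries not yet marked complete."""
    return [
        f"{stage}: {step}"
        for stage, steps in FRAMEWORK.items()
        for step in steps
        if step not in completed.get(stage, set())
    ]

# Example: the team has only settled on who is on the team so far.
print(outstanding({"Program": {"Team"}}))
```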

Select and Describe the Program  The MN demonstration project is our program  Your table is the team that will build evaluation into the MN program  Describing the MN program: Mission; Goals and objectives; Logic model (we are going to make one!)

Describe the Program: Logic Model
Instructions: Each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g. activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic.
[Template columns: Resources | Activities | Outputs | Customers | Short Term | Intermediate | Long Term Outcomes. Example row: Me, Regimen, Juggling, Snodgrass, Training, Commitment, Victory.]

Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Use an outcome from your logic model to create your evaluation question

Evaluation Questions (Context, Audience, Communication, Use)  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?

31 Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Use an outcome from your logic model to create your evaluation question.  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?
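Each evaluation question carries its own context, audiences, communication channel, and intended use. A minimal sketch of recording that alongside the question; the sample question and values below are hypothetical, not taken from the MN project plan:

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestion:
    """An evaluation question tied to a logic-model outcome."""
    question: str
    outcome: str                                         # the outcome it tests
    context: list[str] = field(default_factory=list)     # factors that may shape the answer
    audiences: list[str] = field(default_factory=list)
    communication: str = ""                              # best way to reach those audiences
    use: str = ""                                        # how the answer will be used

# Hypothetical example for illustration only.
q = EvaluationQuestion(
    question="Did collection sites increase the volume of leftover paint recovered?",
    outcome="Short-term: increased paint recovery",
    context=["number of participating retailers", "local disposal alternatives"],
    audiences=["state legislators", "PPSI workgroup"],
    communication="briefing memo and workgroup presentation",
    use="decide whether to expand the collection network",
)
print(q.question, "->", q.outcome)
```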

Performance Measures (Data Sources, Collection Methods & Strategy, Analysis Tools, Data Collection, Data Management)  What can we measure to answer each question?  Where can we find the information for each measure?  How can we collect the information?  Given our questions and information to be collected, what will be an effective collection strategy?

Performance Measures  What analytical tools will give us the most useful information?  How will we implement the collection strategy?  How will we manage the data?

34 Performance Measures  What can we measure to answer each question?  What methods are best suited for each measure?  What analytical tools will give us the most useful information?  Given our questions and information to be collected, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?
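Answering these questions measure by measure amounts to a small data-collection plan. A minimal sketch of one way to record it; the example values are illustrative, not actual MN project decisions:

```python
from dataclasses import dataclass

@dataclass
class MeasurePlan:
    """Plan for one performance measure: what, where, how, and who manages it."""
    measure: str
    data_source: str
    collection_method: str
    analysis_tool: str
    data_manager: str

# Hypothetical plan entry for illustration only.
plans = [
    MeasurePlan(
        measure="% of target population reached by collection-site outreach",
        data_source="site sign-in sheets and outreach mailing lists",
        collection_method="quarterly tally submitted by site coordinators",
        analysis_tool="spreadsheet summary with year-over-year comparison",
        data_manager="demonstration project coordinator",
    ),
]

for p in plans:
    print(f"{p.measure}: source={p.data_source}; method={p.collection_method}")
```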

Documentation: Methodology & Policy  Evaluation Methodology  The process of integrating evaluation generates a framework for a methodology and an evaluability assessment  Performance Management Policy  Across office programs and projects  Guides strategy and planning 1. Evaluation Methodology 2. Performance Management Policy Measures Documentation

36 Check the Logic  Revisit the process and the decisions made  Look for the flow in the process and identify potential breaks  Identify potential obstacles to our approach to managing the performance of the MN demonstration program  1st cycle is integrating – next cycle begins implementation

37 What is happening today with the PPSI?  MOU  Workgroups/committees  Minnesota demonstration project planning  Integrating evaluation into project design

38 Recap and Next Steps  Practice : Theory (an inconsistent ratio)  Movement in the environmental community toward: Evidence; Effectiveness; Evaluation  Opportunities to merge theory and practice: Policy; Leadership; New programs; Capacity building efforts like this one

39 Thank You! Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation U.S. Environmental Protection Agency Matt Keene (202)

43 Adaptive Management Cycle

44 Evaluation…In the Life of a Program  When to do it?  What are the obstacles?  Are there solutions?  Are there opportunities to improve evaluations in your shop?

46 Evaluation Questions  What are the critical questions for understanding the success of the MN program?  Link your questions to a component in your line of the logic model  What contextual factors may influence the answers to each question?  Who are the audiences for each question? What’s the best way to communicate with each audience? How might each audience use the answer to each question?

47 Document Evaluation Policy and Methodology  Evaluation Policy  Evaluation Methodology

48 Performance Measures  What can we measure to answer each question?  What methods are best suited for each measure?  What analytical techniques could we use to maximize the rigor of our analysis?  Given the level of rigor desired, what will be our collection strategy? How will we implement the collection strategy? How will we manage the data?

49 Materials  Presentation  Flip charts  Markers  Projector  Laptop  Tape for flipchart paper  Post-its

50 Supporting documents from PPSI, etc.  MN MOU  MN Goals and Objectives and Tasks  Workplan  Logic Model

51 [Diagram: Performance Management Cycle. Components: Program Mission; Planning; Aggregate/Analysis; Adapt/Learn/Transfer. Tools: Logic Model (conceptual framework); Performance Measurement (helps you understand what); Program Evaluation (helps you understand and explain why). Note: the cycle needs adaptive management components like “implement”.]

52 Steps to Integrating Evaluation into Program Design
[Diagram: Select a Program; Identify a Team; Describe Program (Needs, Mission, Goals & Objectives, Logic Model); Develop Questions (Context, Audiences, Communication, Use); Identify Measures (Methods, Analysis, Collection Strategy, Collection, Data Management); Document (Policy, Methodology).]

Integrating Evaluation into Program Design
[Diagram: Program (Team; Needs & Mission; Goals & Objectives; Logic Model); Questions (Context; Audience; Communication; Use); Measures (Methods; Analysis; Strategy; Collection; Data Management); Documentation (Performance Management Policy; Evaluation Methodology).]

54 Program Management Cycle

55 Needs, Mission and Goals and Objectives  Mission  What drives the need for performance management?  Goals and Objectives

56 Logic Model  Each table gets a logic model template  Goals from the MN project represent long-term outcomes  Each table fills in the other components of the logic model  We’ll put the lines of logic together to form a complete-ish model

Integrating Evaluation into Program Design: Program; Questions; Measures; Documentation

58 Integrating Evaluation into Program Design
[Diagram: Program (Team; Needs & Mission; Goals & Objectives; Logic Model); Questions (Context; Audience; Communication; Use); Measures (Data Sources; Methods & Strategy; Analysis Techniques; Collection; Data Management); Documentation (Performance Management Policy; Evaluation Methodology).]

59 [Framework recap] Program: 1. Team 2. Mission 3. Goals & Objectives 4. Logic Model. Questions: 1. Audience 2. Context 3. Communication 4. Use. Measures: 1. Data Sources 2. Collection Methods & Strategy 3. Analysis Tools 4. Data Collection 5. Data Management. Documentation: 1. Performance Management Policy 2. Evaluation Methodology.
