1 Introduction to Evaluating the Minnesota Demonstration Program Paint Product Stewardship Initiative September 19, 2007 Seattle, WA Matt Keene, Evaluation Support Division, U.S. Environmental Protection Agency.


1 Introduction to Evaluating the Minnesota Demonstration Program Paint Product Stewardship Initiative September 19, 2007 Seattle, WA Matt Keene, Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation U.S. Environmental Protection Agency

2 Presentation Objective Introduce the Paint Product Stewardship Initiative (PPSI) to the key steps in designing the demonstration program evaluation.

3 Session Agenda
Program Evaluation: Definition, Uses, Types
- What is Program Evaluation?
- Why Should We Evaluate?
Steps in the Evaluation Process
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

4 What is Program Evaluation? Program Evaluation: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why. Performance Measurement: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.

5 Why Evaluate?
Good program management:
 Ensure program goals and objectives are being met.
 Help prioritize resources by identifying the program services yielding the greatest environmental benefit.
 Learn what works well, what does not, and why.
 Learn how the program could be improved.
Provide information for accountability purposes:
 Government Performance and Results Act of 1993: Requires EPA to report schedules for and summaries of evaluations that have been or will be conducted, and to identify those that influence development of the Agency's Strategic Plan.
 Environmental Results Order: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA's Strategic Plan.

6 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

7 Assessing Whether to Evaluate Your Program (Evaluability Assessment)
1. Is the program significant enough to merit evaluation? Consider: program size, number of people served, transferability of the pilot, undergoing PART.
2. Is there sufficient consensus among stakeholders on the program's goals and objectives?
3. Are staff and managers willing to make decisions about or change the program based on evaluation results?
4. Are there sufficient resources (time, money) to conduct an evaluation?
5. Is relevant information on program performance available, or can it be obtained?
6. Is an evaluation likely to provide dependable information?
7. Is there a legal requirement to evaluate?
(Adapted from Worthen et al.)
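As a minimal illustration (not part of the PPSI materials), the seven evaluability questions above can be treated as a simple yes/no checklist; the function name and scoring rule here are assumptions chosen for the example.

```python
# Hypothetical sketch: the seven evaluability-assessment questions as a
# yes/no checklist. More "yes" answers suggest the program is more evaluable.

EVALUABILITY_QUESTIONS = [
    "Is the program significant enough to merit evaluation?",
    "Is there sufficient consensus among stakeholders on goals and objectives?",
    "Are staff and managers willing to act on evaluation results?",
    "Are there sufficient resources (time, money) to evaluate?",
    "Is relevant performance information available or obtainable?",
    "Is an evaluation likely to provide dependable information?",
    "Is there a legal requirement to evaluate?",
]

def count_criteria_met(answers):
    """Count 'yes' answers, one True/False per checklist question."""
    if len(answers) != len(EVALUABILITY_QUESTIONS):
        raise ValueError("Provide one True/False answer per question.")
    return sum(bool(a) for a in answers)

score = count_criteria_met([True, True, False, True, True, True, False])
print(f"{score} of {len(EVALUABILITY_QUESTIONS)} evaluability criteria met")
```

A team might agree in advance on a threshold (say, five of seven) below which the evaluation is deferred; that threshold is a policy choice, not something the slides prescribe.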

8 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

9 Identify Evaluation Team Members
Select diverse team members:
 Individuals responsible for designing, collecting, and reporting information used in the evaluation
 Individuals with knowledge of the program
 Individuals with a vested interest in the conduct/impact of the program
 Individuals with knowledge of evaluation
Identify a skeptic!

10 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

11 Describe the Program
 Describe the program using a logic model.
 Use the logic model to:
- Check assumptions about how the program is supposed to work
- Brainstorm evaluation questions

12 (Figure not captured in the transcript.)

13 Elements of the Logic Model
Resources/Inputs: Programmatic investments available to support the program.
Activities: Things you do, the activities you plan to conduct in your program.
Outputs: Product or service delivery/implementation targets you aim to produce.
Customers: Users of the products/services; the target audience the program is designed to reach.
Outcomes:
- Short-term (attitudes): Changes in learning, knowledge, attitudes, skills, understanding.
- Intermediate (behavior): Changes in behavior, practice or decisions.
- Long-term (condition): Change in condition.
External Influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.
(The model reads from PROGRAM to RESULTS FROM PROGRAM: left to right answers HOW, right to left answers WHY.)
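The elements above can be expressed as a small data structure so a team can check that a draft model fills in the whole chain from inputs through long-term outcomes. This is an illustrative sketch only; the class and field names are assumptions for the example, not EPA terminology.

```python
# Illustrative sketch: the logic-model elements as a simple data structure,
# with a completeness check over the inputs-to-outcomes chain.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    resources: list                # inputs: programmatic investments
    activities: list               # things the program does
    outputs: list                  # products/services the program delivers
    customers: list                # target audience the program reaches
    short_term_outcomes: list      # changes in awareness, knowledge, attitudes
    intermediate_outcomes: list    # changes in behavior, practice, decisions
    long_term_outcomes: list       # changes in condition
    external_influences: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """True when every link in the chain has at least one entry."""
        chain = [self.resources, self.activities, self.outputs, self.customers,
                 self.short_term_outcomes, self.intermediate_outcomes,
                 self.long_term_outcomes]
        return all(len(step) > 0 for step in chain)
```

For example, a draft that lists activities and outputs but no long-term outcomes would report `is_complete()` as False, flagging a gap to fill before evaluation questions are brainstormed.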

14 PPSI Demonstration Program Logic Model (September 13, 2007)
Program Goal: Design, implement and evaluate a fully-funded statewide paint product stewardship program that is cost-effective and environmentally beneficial.
Activities:
- Outreach/Education: establish relationships/partnerships; implement education/outreach and social marketing projects/campaign
- Measurement: collect baseline data; ongoing data collection; interim analysis
Outputs: baseline information; program database; education materials; workshops; media; tools for consumers and retailers; interim reports and presentations
Customers: consumers, retailers, manufacturers, agencies, environmental groups, recyclers
Outcomes:
- Shorter-term (awareness): awareness of recycled paint and the waste hierarchy improves
- Intermediate (behavior): decisions based on the waste hierarchy
- Longer-term (condition): less waste paint
Project stages: Planning and Needs Assessment, Implementation, Use and Transfer
Definitions: Management systems = collection, processing, transportation, recycling, disposal. Waste hierarchy = reduce, reuse, recycle, resource recovery.

15 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

16 What are Evaluation Questions?
 Questions (at any point on the performance spectrum/logic model) that the evaluation is designed to answer.
 They should reflect stakeholders' needs.
 Evaluation questions are KEY because they:
- Frame the scope of the evaluation
- Drive the evaluation design, data collection, and reporting

17 Types of Evaluations and Common Evaluation Questions
Design assessment: Is the design of the program well formulated, feasible, and likely to achieve the intended goals?
Process evaluation (implementation assessment): Is the program being delivered as intended to the targeted recipients? Is the program well managed?
Outcome evaluation: Are desired program outcomes obtained? Did the program produce unintended outcomes?
Net impact evaluation: Did the program cause the desired impact? Is one approach more effective than another in obtaining the desired outcomes?
Cost evaluation: What are the specific costs of implementing and operating the program? Is the program cost efficient? Cost effective?
Adapted from Evaluation Dialogue Between OMB and Federal Evaluation Leaders: Digging a Bit Deeper into Evaluation Science, April 2005.

18 The Evaluation Plan
What: A brief document describing the evaluation's purpose, audience, scope, design, and methods.
Why: To clearly articulate and communicate expectations for the evaluation.
Who: Developed by one or more team members based on the team's common understanding.
When: Can be developed at any point from initial selection of the program through development of the research design. Plans are living documents and need to be revised to account for changes in evaluation objectives or methods.

19 Components of an Evaluation Plan
 Purpose of the evaluation / evaluation questions
 Primary audience
 Context (organizational, management, political)
 Data collection methods and analysis
 Evaluation design
 How evaluation findings will be reported (consider different formats for different target audiences)
 Expectations for roles and communication among evaluators, program staff/managers, and key stakeholders
 Resources available for evaluation (staff, budget)
 Timeline for evaluation
Note: Save sufficient time to develop evaluation questions and analyze data thoroughly.
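One way to keep a draft plan honest is to track the components above as a template and flag the sections still left blank. This is a hedged sketch; the component keys are paraphrases chosen for the example, not an official checklist format.

```python
# Hypothetical sketch: the evaluation-plan components as a template, with a
# helper that reports which sections of a draft plan are empty or absent.

PLAN_COMPONENTS = [
    "purpose_and_evaluation_questions",
    "primary_audience",
    "context",
    "data_collection_and_analysis",
    "evaluation_design",
    "reporting_approach",
    "roles_and_communication",
    "resources",
    "timeline",
]

def missing_sections(plan: dict) -> list:
    """Return the plan components that are empty or absent from a draft."""
    return [c for c in PLAN_COMPONENTS if not plan.get(c)]

draft = {
    "purpose_and_evaluation_questions": "Did awareness of recycled paint improve?",
    "primary_audience": "PPSI stakeholders",
}
todo = missing_sections(draft)
print(f"{len(todo)} of {len(PLAN_COMPONENTS)} plan sections still to draft")
```

Because plans are living documents, rerunning a check like this after each revision gives a quick view of what still needs the team's attention.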

20 Steps for Designing an Evaluation
I. Select Program to Evaluate
II. Identify Evaluation Team
III. Describe the Program
IV. Develop Evaluation Questions
V. Identify Existing and Needed Data
VI. Select Data Collection Methods
VII. Select Evaluation Design
VIII. Develop Evaluation Plan

21 Contact Matt Keene (202) Evaluation Support Division National Center for Environmental Innovation Office of Policy, Economics and Innovation U.S. Environmental Protection Agency
