
1 Survival Skills: Learn How to Measure What Matters
January 7, 2008
Presented by: Yvonne M. Watson and Britta Johnson
Evaluation Support Division, National Center for Environmental Innovation
Office of Policy, Economics and Innovation, U.S. Environmental Protection Agency

2 Presentation Goals
 Enable participants to:
Use a logic model to understand how program resources and activities produce specific outputs, and how those outputs connect to outcomes that achieve program/project goals and objectives
Understand the different types of performance measures (resource, output, outcome, efficiency, productivity, etc.)
Use a step-by-step approach to develop, assess, and choose the appropriate measures for their organization, program, or project

3 Steps to Developing, Implementing and Reporting Performance Measurement Information
I. Identify Team/Develop Performance Measurement Plan
II. Describe the Program
III. Develop Performance Measurement Questions
IV. Develop Measures
V. Collect Information
VI. Analyze and Interpret Information
VII. Develop the Report

4 Performance Measurement: Session Agenda
 Module 1: Planning for Performance Measurement
 Module 2: Identifying and Developing Performance Measures

5 PERFORMANCE MANAGEMENT
Performance management includes activities to ensure that goals are consistently met in an effective and efficient manner. Performance management tools include logic models, performance measurement, and program evaluation.
PERFORMANCE MANAGEMENT TOOLS
Logic Model: A tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.
Performance Measurement: Helps you understand what level of performance the program/project achieves.
Program Evaluation: Helps you understand and explain why you are seeing the program/project results.

6 Drivers for Performance Measurement
 Good Program Management
 Government Performance and Results Act (GPRA) of 1993: Requires EPA to report schedules for and summaries of program evaluations that have been or will be conducted, and to identify those that influence development of the Agency's Strategic Plan.
 OMB's Program Assessment Rating Tool (PART): A tool designed to assess and evaluate programs across the federal government.
 Environmental Results Order 5700.7: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA's Strategic Plan.
 Regional Priorities

7 Orientation Exercise
 In small groups, discuss and complete the following sentences:
A program is …
Performance Measurement is …
Performance Measurement leads to …
How do we use measurement in our everyday lives?
One thing we want to learn from this workshop is …
 After completing the sentences, select a reporter.

8 Module 1: Planning for Performance Measurement

9 Steps to Developing, Implementing and Reporting Performance Measurement Information
I. Identify Team/Develop Performance Measurement Plan
II. Describe the Program
III. Develop Performance Measurement Questions
IV. Develop Measures
V. Collect Information
VI. Analyze and Interpret Information
VII. Develop the Report

10 Part A. Identify Team Members
 Individuals responsible for the design, collection, and reporting of performance information
 Individuals with intimate knowledge of the program/project
 Individuals who have a vested interest in the conduct/impact of the program
 Individuals with some knowledge of the measurement planning process
 Identify a skeptic!

11 Part B. Develop the Measurement Plan: Things to Consider
 Purpose of the performance measurement system
 Project/program mission
 Primary audience
 Scope (including program description/logic model)
 Context (organizational, management, political)
 Roles and expectations for program staff, participants, and key stakeholders

12 Part B. Measurement Plan Outline
 Performance measurement questions
 Data collection/analysis
 Reporting (including feedback loop)
 Resources (staff and budget)
 Timeline
 Communication
 Steps to monitor implementation of the plan

13 Steps to Developing, Implementing and Reporting Performance Measurement Information
I. Identify Team/Develop Performance Measurement Plan
II. Describe the Program
III. Develop Performance Measurement Questions
IV. Develop Measures
V. Collect Information
VI. Analyze and Interpret Information
VII. Develop the Report

14 Elements of the Logic Model
HOW (the program):
Resources/Inputs: Programmatic investments available to support the program.
Activities: Things you do; the activities you plan to conduct in your program.
Outputs: Product or service delivery/implementation targets you aim to produce.
Customers: Users of the products/services; the target audience the program is designed to reach.
WHY (results from the program):
Short-term outcomes: Changes in learning, knowledge, attitudes, skills, understanding.
Intermediate outcomes: Changes in behavior, practice, or decisions.
Long-term outcomes: Changes in condition.
External Influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.
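These elements translate directly into a simple data structure, which can make a logic model easier to draft and review. A minimal Python sketch; the class and field names are hypothetical illustrations, not from the presentation:

```python
# A minimal sketch of a logic model as a data structure.
# Class and field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    resources: list[str] = field(default_factory=list)     # inputs that support the program (HOW)
    activities: list[str] = field(default_factory=list)    # things the program does (HOW)
    outputs: list[str] = field(default_factory=list)       # products/services delivered (HOW)
    customers: list[str] = field(default_factory=list)     # target audience reached
    short_term: list[str] = field(default_factory=list)    # changes in knowledge, attitudes, skills (WHY)
    intermediate: list[str] = field(default_factory=list)  # changes in behavior, practice, decisions (WHY)
    long_term: list[str] = field(default_factory=list)     # changes in condition (WHY)
    external_influences: list[str] = field(default_factory=list)  # factors outside program control
```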

15 Example: Personal Fitness Logic Model
Resources: Britta, yoga mat, cardio equipment, weights
Activities: Fitness plan; strength training plan; practice yoga; cardio workout
Outputs: Muscles built; muscles stretched; calories burned; heart rate elevated
Customer: Britta
Short-term outcome: Increased drive to work out
Intermediate outcome: Work out on a regular basis
Long-term outcome: Physically fit (increased flexibility, increased cardiac capacity, increased strength)
External influences: Injuries, broken equipment
Source: Britta Johnson, 2007

16 ESD TRAINING LOGIC MODEL
ESD Training Goal: To provide training to enable our EPA partners to more effectively conduct and manage program evaluations and analyses and develop performance measures that can be used to improve their programs and demonstrate environmental results.
Resources: ESD staff (Y. Watson, M. Mandolia, J. Heffelfinger, D. Bend, C. Kakoyannis); access to John McLaughlin; Strategic Plan; partners (OCFO, OW, OSWER, ORD, OARM)
Activities: Develop and design PE, PM, IA, and Logic Model curriculum and exercises; deliver PE, PM, IA, and Logic Model training; provide technical assistance for workshop/training attendees; provide guidance for Environmental Results Grants Training; facilitate train-the-trainer sessions for PE, PM, and Logic Modeling
Outputs: PE, PM, IA, and Logic Model training materials; customers complete training; technical assistance delivered; Environmental Results Grants Training materials; partners complete training
Customers: NCEI staff; IAC staff; PEN; PEC winners; HQ/Regional managers and staff; SIG recipients; States/Tribes; SBAP; CARE; EPA Project Officers and Grant Managers; partners (OCFO, OW, OSWER, ORD, OARM)
Short-term outcomes: Knowledge of PE, PM, and logic modeling increased/improved; customers equipped with skills to manage and conduct evaluations, develop measures, and develop logic models of their programs
Intermediate outcomes: PE and PM skills are used by customers in the work environment; customers' understanding of their programs is improved; # of evaluations conducted and managed increases; # of staff developing measures increases; quality of evaluations managed and conducted is improved; quality of measures developed and reported is improved; customers use logic models to help conduct evaluations and develop measures; partners deliver PE, PM, and logic model training to their clients/customers; EPA POs and GMs recognize outputs/outcomes in grant proposals
Long-term outcomes: Customers use program evaluation regularly and systematically to improve environmental programs in terms of environmental and health outcomes, reduced costs, cost-effectiveness, EJ benefits, public involvement, and efficiency; environmental programs more effectively achieve their strategic goals

17 Two Important Rules to Follow
 For every action identified in the logic model, there must be an output that connects to an outcome through a specific customer, OR
 An action must produce an output that becomes a key input to another activity.
THINK CONNECTIONS!
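The two rules amount to a connectivity check that can be automated once a logic model is written down. A minimal sketch, assuming a hypothetical dictionary layout for the model (none of these names come from the presentation):

```python
# Hypothetical check of the "think connections" rules: each activity must
# either connect to an outcome through an output and a customer, or
# produce an output that feeds another activity as an input.
def find_disconnected(model: dict) -> list[str]:
    orphans = []
    for activity, links in model.items():
        via_customer = all(links.get(k) for k in ("output", "customer", "outcome"))
        feeds_activity = bool(links.get("output")) and bool(links.get("feeds"))
        if not (via_customer or feeds_activity):
            orphans.append(activity)
    return orphans

model = {
    "Conduct workshops": {"output": "Workshops held", "customer": "Marina owners",
                          "outcome": "Knowledge increased", "feeds": None},
    "Print brochures": {"output": "Brochures", "customer": None,
                        "outcome": None, "feeds": None},  # violates both rules
}
print(find_disconnected(model))  # ['Print brochures']
```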

18 Exercise 1: Logic Modeling
Developing a logic model


20 Module 2: Identifying and Developing Performance Measures

21 Definitions
Performance Measurement: The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
Performance Measure: A metric used to gauge program or project performance.
Indicators: Measures, usually quantitative, that provide information on program performance and evidence of a change in the state or condition of the system.

22 Differences Between an Indicator and a Performance Measure
 Both terms describe data associated with outcomes
 The difference between the terms is the degree of control we have over them
 Performance measures help assess the effect of the program
 Indicators measure the "state of" something, typically in the natural environment

23 Uses of Performance Measurement
 Monitoring and reporting
 Strategic planning
 Budgeting and financial management
 Program management
 Process improvement
 Contract management
 Communication with the public
Source: Chapel, T., Centers for Disease Control and Prevention, PowerPoint presentation, "Program Alignment, Performance Measurement, and Program Improvement"

24 Limitations and Pitfalls of Performance Measures
 They provide descriptive data, not rigorous evaluation.
 They can encourage undesirable behavior.
 They may require too much time and effort.
 They can be ignored; they are not automatically used.
 Performance measurement is time-bound and context-bound.

25 Steps to Developing, Implementing and Reporting Performance Measurement Information
I. Identify Team/Develop Performance Measurement Plan
II. Describe the Program
III. Develop Performance Measurement Questions
IV. Develop Measures
V. Collect Information
VI. Analyze and Interpret Information
VII. Develop the Report

26 Performance Questions Across the Performance Spectrum
Resources (We use these): Do we have enough? The right ones? The necessary level? The consistency?
Activities/Outputs (To do these things): Are we doing things the way we say we should? Are we producing products and services at the levels anticipated? According to anticipated quality indicators/measures?
Target Customer (For these people): Are we reaching the customers targeted? Are we reaching the anticipated numbers? Are they satisfied?
Short-term Outcome (To change them in these ways): Did the customers' attitudes, knowledge, skills, or understanding change?
Intermediate Outcome (So they can do these things): Are customers using the change as expected? With what results? Are customers served changing in the expected direction and level? If so, what did we (or others) do to cause the change?
Long-term Outcome (Which leads to these outcomes): What changes in condition have occurred? Did the program achieve its goals and objectives?
External Influences: What factors might influence my program's success?

27 Measures Across the Logic Model Spectrum
Resources/Inputs: Measure of resources consumed by the organization. Examples: amount of funds, # of FTE, materials, equipment, supplies, etc.
Activities: Measure of work performed that directly produces the core products and services. Examples: # of training classes offered as designed; hours of technical assistance training for staff.
Outputs: Measure of products and services provided as a direct result of program activities. Examples: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
Customers Reached: Measure of the target population receiving outputs. Examples: % of target population trained; # of target population receiving technical assistance.
Customer Satisfaction: Measure of satisfaction with outputs. Examples: % of customers dissatisfied with training; % of customers "very satisfied" with assistance received.
Outcomes: Accomplishment of program goals and objectives (short-term, intermediate, and long-term outcomes/impacts). Examples: % increase in industry's understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.

28 Work Quality Measures
Efficiency: Measure that relates outputs to costs. Examples: cost per workbook produced; cost per inspection conducted.
Productivity: Measure of the rate of production per specific unit of resource (e.g., staff or employee); the focus is on labor productivity. Example: number of enforcement cases investigated per inspector.
Cost-Effectiveness: Measure that relates outcomes to costs. Examples: cost per pound of pollutants reduced; cost per mile of beach cleaned.
Service Quality: Measure of the quality of products and services produced. Example: percent of technical assistance requests responded to within one week.
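Each category reduces to simple arithmetic once the counts and costs are known. A worked sketch with invented figures (none of these numbers come from the presentation):

```python
# Hypothetical figures illustrating the four work-quality measures.
outputs_produced = 500        # workbooks produced
total_cost = 25_000.0         # dollars spent
cases_investigated = 120      # enforcement cases
inspectors = 4                # FTE inspectors
pounds_reduced = 10_000       # pollutant reduction (an outcome)
requests_within_week = 45     # assistance requests answered on time
total_requests = 50

efficiency = total_cost / outputs_produced          # cost per workbook (output/cost)
productivity = cases_investigated / inspectors      # cases per inspector (output/FTE)
cost_effectiveness = total_cost / pounds_reduced    # cost per pound reduced (outcome/cost)
service_quality = 100 * requests_within_week / total_requests  # % answered within a week

print(f"Efficiency: ${efficiency:.2f} per workbook")
print(f"Productivity: {productivity:.1f} cases per inspector")
print(f"Cost-effectiveness: ${cost_effectiveness:.2f} per pound reduced")
print(f"Service quality: {service_quality:.0f}% answered within one week")
```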

29 Types of Performance Measures
Example → Type of Measure
1. Number of technical assistance requests responded to → Output
2. Number of workshops conducted → Activity
3. Gallons of waste gasoline emitted into marinas → Long-term outcome
4. Number and percent of marina owners & boaters reporting increased knowledge of regulations and best practices → Short-term outcome
5. Number and percent of marinas contacted → Customer reached
6. Percent increase in targeted marina owners/boaters adopting BMPs → Intermediate outcome
7. Total combined in-kind hours contributed by partner organizations → Resource

30 Steps to Developing, Implementing and Reporting Performance Measurement Information
I. Identify Team/Develop Performance Measurement Plan
II. Describe the Program
III. Develop Performance Measurement Questions
IV. Develop Measures
V. Collect Information
VI. Analyze and Interpret Information
VII. Develop the Report

31 Steps for Developing Measures
 Step 1: Identify Potential Measures
 Step 2: Assess Each Measure
 Step 3: Choose the Best Measures
 Step 4: Identify Baseline, Target, Timeline, and Reporting Schedule

32 Key Steps in Identifying Potential Measures
 Identify the information needed and the audience
 Identify measures in existing documents
 Review the logic model and select the appropriate logic model element
 Express the logic model element as a performance measure
 Determine if the measure clearly relates to the program/project goal or objective

33 Identify the Information Needed and the Audience
 Review the performance measurement questions developed earlier
 Consider what information is needed to assess whether your program/project is meeting its goals and objectives. Ask yourself: Who needs to know what about the program, why, and in what format?

34 Identify Measures in Existing Documents
 Review measures specified in:
Program/project mission, goals, objectives, service standards
Legislation, strategic plans (GPRA), court orders, PART, regional plans, National Program Management Guidance, Regional Priority Commitments
Previous evaluations and research reports
 Consider other sources

35 Review the Logic Model
 Review the logic model:
Identify the aspects of performance that are most important to measure (resources, activities, outputs, outcomes)
Identify contextual factors that could influence the program either positively or negatively, and generate measures for them as appropriate


37 Express the Logic Model Element as a Performance Measure
 Consider how to express the measure in terms of:
Data:
Raw numbers (number of marinas in compliance)
Averages (mean number of compliance issues)
Percentages (% of marinas in compliance)
Ratios (cost per marina in compliance)
Rates (number of compliance violations per 100 marinas)
Unit of measure:
Is it appropriate to the measure?
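A worked sketch of the five data forms, using invented marina figures for illustration (not data from the presentation):

```python
# Hypothetical figures illustrating the five ways to express a measure.
marinas_total = 250
marinas_in_compliance = 180
violations = 95
compliance_issues = [2, 0, 5, 1, 3]   # issues found at five sampled marinas
program_cost = 90_000.0               # dollars spent

raw_number = marinas_in_compliance                         # raw number in compliance
average = sum(compliance_issues) / len(compliance_issues)  # mean issues per sampled marina
percentage = 100 * marinas_in_compliance / marinas_total   # % of marinas in compliance
ratio = program_cost / marinas_in_compliance               # cost per compliant marina
rate = 100 * violations / marinas_total                    # violations per 100 marinas

print(raw_number, average, percentage, ratio, rate)
```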

38 New England Marina Initiative Measures
Activities: Establish goals, identify resources and gaps; facilitate Regional Marina Workgroup; develop measurement process using statistical principles and random sampling techniques; conduct environmental workshops
Outputs: Inventory of marina owners; strategic plans; network of partners; baseline data; technical assistance provided; brochures and fact sheets
Customers reached: State environmental agencies (Regional Marina Workgroup); state and regional marine trade associations; NGOs; marina owners and staff; boaters
Short-term outcomes: Region 1 and partners' awareness and understanding of marinas' environmental impact improves; Region 1 and partners agree to address issues; marina owners and boaters have increased knowledge of regulations and best practices; emerging issues are identified
Intermediate outcomes: Marina owners and boaters take corrective action to comply with regulations; marina owners and boaters adopt marina best practices; trade associations are more proactive in addressing marina environmental issues
Long-term outcome: Healthier marina communities
Example measures: # of measurement processes developed; # of workshops conducted; # of guidance documents and fact sheets distributed; number of technical assistance requests responded to; #/% of marinas contacted; level of satisfaction with technical assistance; #/% of marina owners and boaters reporting increased knowledge of regulations and best practices; % increase in targeted marina owners/boaters adopting best practices; gallons of waste gasoline emitted into marinas
Efficiency: Cost per training workshop
Productivity: Hours per technical assistance visit per FTE

39 Determine Whether the Measures Clearly Relate to the Mission/Goal
 Review the program/project mission and/or goal:
What key activities, outputs, or outcomes are specified in the mission or goal?
 Review the list of potential measures developed:
Will the data collected from the measures clearly demonstrate that the mission and/or goal was accomplished?

40 Determine Whether the Measures Clearly Relate to the Mission/Goal
New England Marina Initiative Purpose: The New England Marinas Initiative is a regionally coordinated assistance initiative to improve marina environmental performance by implementing an effective regional education and outreach campaign that includes: improving awareness and compliance, increasing implementation of best management practices, and enhancing the current assistance provider network to help achieve sustained industry environmental support.
Performance Measures:
 # of measurement processes developed
 # of workshops conducted
 # of guidance documents and fact sheets distributed
 Number of technical assistance requests responded to
 #/% of marinas contacted
 Level of satisfaction with technical assistance
 #/% of marina owners & boaters reporting increased knowledge of regulations and best practices
 % of targeted marina owners/boaters adopting best practices
 Gallons of waste gasoline emitted into marinas

41 Exercise 2: Performance Measures
Developing your own measures

42 Step 2: Assess Each Measure
Assess the feasibility of each measure for:
 Data collection
 Data quality
 Analysis
 Reporting

43 Assess Each Measure for Data Collection
 Availability: existing data vs. new data
 Frequency: one-time collection vs. continued data collection
 Utility: data available for use; supports an acceptable baseline
 Cost: overall implementation cost

44 Assess Each Measure for Data Quality
 Reliability: provides consistent readings
 Validity: measures what it is supposed to measure
 Objectivity: free from bias and represents reality

45 Assess Each Measure for Analysis
 Statistical reliability: population data; sample data
 Type of analysis: trends over time; actual performance against targets or standards; variation across units (internal benchmarking); performance against benchmarks (external benchmarking)
 External factors: direct or near-direct control over the measure; account for the impact of external factors on the measure
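Two of these analysis types, trend over time and actual performance against a target, reduce to straightforward comparisons. A minimal sketch with invented figures:

```python
# Hypothetical compliance percentages by year and a hypothetical target.
yearly_compliance_pct = {2004: 58, 2005: 63, 2006: 67, 2007: 72}
target_pct = 70

years = sorted(yearly_compliance_pct)
trend = yearly_compliance_pct[years[-1]] - yearly_compliance_pct[years[0]]  # change over period
latest = yearly_compliance_pct[years[-1]]

print(f"Change over period: {trend:+d} percentage points")
print(f"{years[-1]} vs. target: {latest - target_pct:+d} points "
      f"({'met' if latest >= target_pct else 'missed'})")
```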

46 Step 3: Choose the Best Measures
 Assess the value of each measure in relation to the goals and objectives of the program: is it required, important, or merely interesting?
 Select a final list of measures; you won't be able to collect data for all measures.
 Check in with managers and stakeholders.
 Identify a priority list of measures.
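One way to operationalize the required/important/interesting triage is a simple sort. A hypothetical sketch; the measure names and tier assignments are invented for illustration:

```python
# Rank candidate measures by value tier, then shortlist the top tiers.
PRIORITY = {"required": 0, "important": 1, "interesting": 2}

candidates = [
    ("% of marinas in compliance", "required"),
    ("# of workshops conducted", "important"),
    ("Level of satisfaction with assistance", "interesting"),
    ("Gallons of waste gasoline emitted", "required"),
]

ranked = sorted(candidates, key=lambda m: PRIORITY[m[1]])
shortlist = [name for name, tier in ranked if tier != "interesting"]
print(shortlist)
```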

47 Step 4: Identify a Standard
For each performance measure, develop a:
1. Baseline: the current state
2. Target: the desired level of performance
3. Timeline: the date by which the target level of performance will be achieved
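Baseline, target, and timeline fit naturally into one record per measure. A minimal sketch, assuming hypothetical field names and example values:

```python
# Hypothetical record tying a measure to its baseline, target, and timeline.
from dataclasses import dataclass
from datetime import date

@dataclass
class MeasureStandard:
    measure: str
    baseline: float      # current state
    target: float        # desired level of performance
    timeline: date       # date by which the target should be achieved

standard = MeasureStandard(
    measure="% of marinas in compliance",
    baseline=58.0,
    target=75.0,
    timeline=date(2010, 12, 31),
)
print(standard)
```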

48 Criteria for Useful Performance Measures
Objective-linked: Directly related to clearly stated objectives for your program.
Responsibility-linked: Matched to specific organizational units and people that are responsible for AND capable of taking action to improve performance.
Organizationally acceptable: Valued by all levels in the organization, used as a management tool, and viewed as being "owned" by those accountable for performance.
Comprehensive: Inclusive of all relevant aspects of program performance, e.g., measuring both quality and quantity.
Credible: Based on accurate and reliable data sources and methods and, to the extent possible, not open to manipulation or distortion.
Cost-effective: Acceptable in terms of data collection, processing, and reporting.
Compatible: Integrated with existing information systems.
Comparable with other data: Useful in making comparisons, e.g., performance can be compared from period to period, with peers, or to other programs.
Easy to interpret and report: Presented graphically and accompanied by commentary!
Adapted from Price Waterhouse, Center for Performance Measurement

49 Tips for Choosing the Best Measures
For each measure, ask:
 Does the measure clearly relate to the project goal and objective?
 Is the measure important to management and stakeholders?
 Is it possible to collect accurate and reliable data for the measure?
 Taken together, do the measures accurately reflect the key results of the program, activity, or service?
 Is there more than one measure for each goal or objective?
 Are your measures primarily outcome, efficiency, or quality measures?

50 Muddy Waters
 Describe three different types of performance measures (slides 36-37).
 Why is it important to consider your audience when developing performance measures (slide 41)?
 How can a logic model help you develop performance measures (slide 45)?

51 Contacts:
Yvonne M. Watson, watson.yvonne@epa.gov, (202) 566-2239
Britta Johnson, johnson.britta@epa.gov, (202) 566-1465

