
1 John A. McLaughlin MACGROUPX@AOL.COM
Logic Modeling: A Tool to Support the Development & Evaluation of State Unit of Aging Programs & Projects John A. McLaughlin

2 My Aim Today Orient you to a different way to think about conceptualizing and telling the performance story of your State Unit on Aging (SUA) programs and projects Provide a simple tool for creating a functional picture of how your SUA works to achieve its aims Offer some helpful hints for framing a useful performance measurement and evaluation strategy for your SUA.

3 Beliefs Social Advocacy Client/customer focus
The right to be part of a well run program Program Staff Advocacy Managing for Results Nobody gets it right the first time out!

4 Themes You’ll Hear Today
GOOD MANAGEMENT Relevance Quality Performance Connections Evidence

5 More Words Goals -- Impacts Objectives Outcome -- changes
Short-term (proximal) Intermediate (distal) Supporting Resources Activities Outputs: productivity and reach

6 PERFORMANCE MANAGEMENT TOOLS
Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement and program evaluation. NOTE: Performance management can focus on performance of the organization, a department, processes to build a product or service, employees, etc. Logic Model Tool/framework that helps identify the program/project resources, activities, outputs customers, and outcomes. Performance Measurement Helps you understand what level of performance is achieved by the program/project. Program Evaluation Helps you understand and explain why you’re seeing the program/project results.

7 Logic Models as Recipes
Recipes have 3 essential components! A good cook follows the recipe – program staff would do well to create & follow their recipe for success!

8 Logic Models as Maps If you were going on a trip, what would be the first question you need to answer? Then, what tool would you need?

9 Recipes & Maps are used for:
Planning Communicating Performance Measurement and Evaluation

10 The Logic Model SCRIPT:
This slide illustrates the plight of many programs. Most people know what resources are used to support a program and they see the outcomes/results of a program, but they don’t have a clear understanding of how or why the results occurred. The piece in the middle is a mystery – this is the miracle that occurs. In the cartoon, the logic model is a tool that will help you demystify the piece in the middle. It provides a structure, framework & process for helping the user understand what happens in the middle. Government programs historically have focused attention on activities and outputs, and assumed that these would somehow translate (the “Miracle”) into achievement of their long-term strategic goals. In a logic model, unlike the cartoon, we want to adequately specify the program theory so we all know what has to be done (i.e. fill in the “then a miracle occurs”). You could introduce the logic model as a recipe or a map – in both cases you start with the end in mind and then look for a tool (recipe or map) that will help chart the course to success – getting to the end in mind!

11 What you do to achieve your long-term aims!
Level I Logic Model RESOURCES / INPUTS The ingredients you need to implement your program! YOUR PROGRAM What you do to achieve your long-term aims! RESULTS / IMPACT Why you are in Business!

12 Level II Logic Model HOW WHY Contextual Influences
1st Order Outcome 2nd Order Outcome Resources Activities Outputs Customers Impact 1 2 3 4 5 6 7 Program’s Sphere of Influence HOW WHY

13 Understanding the Sphere of Influence
Ask your team to estimate their level of confidence that their program will lead to each outcome in the logic model: the strategic impact, the intermediate outcomes, and the short-term outcomes. Identify Performance Partners!

14 Complex Effects Chain Partners Transparency Shared Common Outcomes

15 Elements of Logic Models
Resources / Inputs: Programmatic investments available to support the program. Objectives / Activities: Things you do– activities you plan to conduct in your program. Outputs: Product or service delivery/implementation targets you aim to produce. Customer: User of the products/services. Target audience the program is designed to reach. Outcomes: Changes or benefits resulting from activities and outputs. Outcome Structure Short-term (K, S, A) – Changes in learning, knowledge, attitude, skills, understanding Intermediate (Behavior) – Changes in behavior, practice or decisions Long-term (Condition) – Changes in condition External Influences: Factors that will influence change in the affected community.
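The element definitions above map naturally onto a simple data structure. The following is a hypothetical Python sketch (the field names and the home-delivered-meals example are illustrative, not taken from the presentation):

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model elements (illustrative field names)."""
    resources: list[str]           # inputs: funds, staff, materials
    activities: list[str]          # things the program does
    outputs: list[str]             # product/service delivery targets
    customers: list[str]           # target audience reached
    short_term: list[str]          # changes in knowledge, skills, attitudes
    intermediate: list[str]        # changes in behavior or practice
    long_term: list[str]           # changes in condition
    external_influences: list[str] = field(default_factory=list)

# Hypothetical example: a home-delivered meals program
meals = LogicModel(
    resources=["SUA funds", "volunteer drivers"],
    activities=["prepare meals", "deliver meals"],
    outputs=["meals delivered"],
    customers=["homebound older adults"],
    short_term=["awareness of sound nutrition choices"],
    intermediate=["improved eating habits"],
    long_term=["improved nutrient intake"],
)
```

Writing the model down this way makes the later steps (checking connections, attaching measures to each element) mechanical rather than ad hoc.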

16 Outputs & Outcomes An annual conference disseminates the latest forage research. Low-income families are better able to manage their resources. Program staff teach financial management skills to low-income families. Community volunteers have knowledge and skill to work effectively with at-risk youth. The camp experience provides leadership development opportunities for 4-H youth. Forage producers in Pasture County know current research information and use it to make informed decisions. The program trains and empowers community volunteers. Campers, aged years of age, learn new leadership and communication skills while at camp.

17 Outputs & Outcomes OUTPUT OUTCOME
OUTPUTS: An annual conference disseminates the latest forage research. Program staff teach financial management skills to low-income families. The camp experience provides leadership development opportunities for 4-H youth. The program trains and empowers community volunteers.
OUTCOMES: Low-income families are better able to manage their resources. Community volunteers have knowledge and skill to work effectively with at-risk youth. Forage producers in Pasture County know current research information and use it to make informed decisions. Campers, aged years of age, learn new leadership and communication skills while at camp.

18 Volunteers If the program is addressing a situation of low volunteer involvement in community affairs and the purpose of the program is to increase volunteering among community residents as a part of a larger community development initiative, then increased numbers of residents volunteering in community life would be an outcome. The outcome is expressed as a behavioral change.

19 Number or type of participants who attend; number of clients served.
If the purpose of the program is to increase use of a service by an underserved group, then numbers using the service would be an outcome. The outcome is not numbers attending or served; the outcome is expressed as use that indicates behavioral change.

20 Participant Satisfaction.
For our purposes in education and outreach programming, client satisfaction may be necessary but is not sufficient. A participant may be satisfied with various aspects of the program (professionalism of staff, location, facility, timeliness, responsiveness of service, etc) but this does not mean that the person learned, benefited or his/her condition improved.

21 Training, Research, Producing
These are Outputs. They may be essential aspects that are necessary and make it possible for a group or community to change. But, they do not represent benefits or changes in participants and so are not outcomes. They lead to, result in outcomes, but in and of themselves, they are outputs.

22 Steps in the Logic Model Process
Establish a stakeholder work group and collect documents. Define the problem and context for the program or project. Define the elements of the program in a table. Develop a diagram and text describing logical relationships. Verify the Logic Model with INTERNAL / EXTERNAL stakeholders. Then use the Logic Model to identify and confirm performance measures, and in planning, conducting and reporting performance measurement and evaluation.

23 Step 1: Establish work group & collect documents & information.
Convene / consult a work group: it provides different perspectives and knowledge and works toward agreement on program performance expectations. Review sources of program or project documentation: strategic and operational plans, budget requests, current metrics, past evaluations. Conduct interviews of appropriate staff. There are two approaches to developing the logic model. Option 1: Lone Wolf. One individual reviews the relevant materials, develops a straw logic model, and then presents it to the work group for comment. Option 2: Workgroup Approach. The stakeholder workgroup is intimately involved in brainstorming the elements of the logic model from start to finish.

24 Step 2: Define problem program addresses & context.
The Context Drivers of Success Constraints on Success Factors leading to the Problem 1 2 3* your niche The Problem Program Addresses The Program

25 Step 3: Define elements of program or project in a table.
External Influences
Resources/Inputs | Activities | Outputs | Customers Reached | Outcomes: Short-term (Change in Attitude), Intermediate (Change in Behavior), Long-term (Change in Condition)
HOW | WHO | WHAT & WHY

26 Step 4: Develop a diagram & text describing logical relationships.
Draw arrows to indicate/link causal relationships between logic model elements. We use these resources… 1.0 For these activities… 2.0 To produce these outputs 3.0 So that the customers can change these ways. 4.0 Which leads to these outcomes. 5.0 Leading to these results! 6.0 Work from both directions (right-to-left and left-to-right)

27 Two Important Rules to Follow
For every action identified in the Logic Model, there must be an output that connects to an outcome through a specific customer. OR An action must produce an output that becomes a key input to another activity. THINK CONNECTIONS!
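Once the model is written down as links between elements, the two connection rules can be checked mechanically. A minimal Python sketch, assuming a hypothetical dictionary representation of the model (the example links are invented for illustration):

```python
# Each element maps to the elements it connects to (illustrative data).
# An element with no outgoing links is treated as a terminal outcome.
links = {
    "train volunteers": ["volunteers trained"],
    "volunteers trained": ["community volunteers"],
    "community volunteers": ["youth mentored effectively"],
    "develop materials": ["materials ready"],
    "materials ready": ["train volunteers"],  # output feeding another activity
}
activities = {"train volunteers", "develop materials"}

def connects(element, seen=()):
    """True if the element eventually reaches an outcome (a terminal node)."""
    targets = links.get(element, [])
    if not targets:
        # A dead-end activity violates the rules; a dead-end outcome is fine.
        return element not in activities
    return any(connects(t, seen + (element,)) for t in targets if t not in seen)

# Rule check: every activity must chain through to an outcome.
assert all(connects(a) for a in activities)
```

A check like this will not catch a wrong theory of change, but it does flag activities that never connect to any result, which is exactly what the two rules forbid.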

28 Logic Modeling Exercise 1
Brief application of logic modeling using a United Way example

29 Logic Modeling Exercise
GOAL: Provide an opportunity for participants to apply the principles and practices of logic modeling in an interactive setting. INSTRUCTIONS: The group will be given a set of index cards that contain words or statements that answer the list of questions on the next slide. As a group, review the questions and use the index cards to map out the logic of our case study program on your flip chart paper. When the cards are placed/glued on the paper in the correct order, draw lines connecting the cards to show the logic relationships. When you have completed your logic model, the cards will be ordered so that they describe the program logic and its underlying assumptions (boxes and connecting arrows). Check your logic using if, then and how, why statements. When you have completed this exercise be prepared to report out to the larger group. REMEMBER THE RULES!

30 Questions to Guide Modeling
What are the essential resources we need to implement program? What programs / activities do we have to implement with these people to achieve our results? What are the outputs of our programs? Who / what do we need to reach to achieve these results? What are the short-term and intermediate changes that will enable us to realize our strategic results? What are the strategic results / long-term environmental outcomes we are aiming for? What external influences to the program context do we have to be aware of?

31

32 Worksheet Simple Logic Model Diagram
(representative) Outputs Short-term Outcomes Intermediate Outcomes Target Audience Resources Activities Long-Term Outcomes EXTERNAL INFLUENCES

33

34 “Z” Logic Supplier-Customer Relationship
Unpacking supports more focused Performance Measurement & thus more useful evaluation, as well as better understanding & communication about how the “Program” is supposed to work!

35 “Unpacked Logic Models”

36 Improving Water Quality Training Program
LEVEL I LOGIC MODEL: Resources (EPA, State, Local, Private) -> Improving Water Quality Training Program -> Clean / Safe / Swimmable / Fishable Water.
LEVEL II LOGIC MODEL: Resources: EPA, State, Local, Private. Activities: Materials Development, Recruitment, Training, Technical Assistance, Website. Customers: Developers / Builders. Short-term outcomes: increased awareness of harmful effects; increased awareness of new technologies and incentives. Intermediate outcome: developers & builders acquire new technologies & change practice. Long-term outcome: reduction in NPS pollutants in waterways. Impact: healthier wetlands, more fish, clean beaches, clean / safe / swimmable / fishable water.

37 LEVEL III LOGIC MODEL
Resources: EPA, State, Local, Private. Activities -> Outputs: Materials Development -> Materials Ready; Recruitment -> Trainees Ready; Training -> Developers / Builders Trained; Technical Assistance -> Trainees Receive TA; Website -> Trainees / Others aware of / using new information. Customers: Developers / Builders. Short-term outcomes: increased awareness of harmful effects; increased awareness of new technologies and incentives. Intermediate outcome: developers and builders acquire new technologies and change practice. Long-term outcome: reduction in NPS pollutants in waterways. Impact: healthier wetlands, more fish, clean beaches, clean / safe / swimmable / fishable water.

38

39 Testing The Logic of Strategic Plans

40 Strategic Plan Check: Goals
Is the goal statement outcome oriented? Does it specify the expected strategic change / impact for a specific target group (older persons & disabled)? What evidence is available that this impact / change is important (relevance)? Are there existing needs data? What specific roles, if any, do partners (internal & external) play in the success of this impact? Are there missing Goals to enable the mission / vision to be realized? What concerns you most about this goal, right now?

41 Strategic Plan Check: Objectives
Is the objective outcome oriented? Does it clearly specify the anticipated change for a specific target group & why they need to be changed? Does the change relate to the goal? Will success with this objective lead to success with the goal? (QUALITY) What evidence is available that this change is important? Are there existing needs data? What specific roles, if any, do partners (internal & external) play in the success of this objective? Are there missing Objectives to enable the goal to be realized? What concerns you most about this objective, right now?

42 Strategic Plan Check: Strategies
Is there a reasonable degree of confidence that the strategy will result in achievement of a specific outcome for a specific group? What evidence is available that this strategy is the right strategy – in comparison to others – to achieve the outcome that is specified? What specific roles, if any, do partners (internal & external) play in the success of this strategy? Considering the strategy you’ve adopted, do you have sufficient resources on hand or available to actualize the strategy? Are there missing strategies to enable the objectives to be realized? What concerns you most about this strategy, right now?

43 Scenario Checking What if’s!
Select several external forces & imagine related changes which might influence the SUA, e.g., change in regulations, demographic changes, etc. Scanning the environment for key characteristics often suggests potential changes that might affect the alliance, as does sharing the plan with stakeholders! For each change in a force, discuss 3 different future SUA scenarios (including best case, worst case, & reasonable case) which might arise with the SUA as a result of each change. Reviewing the worst-case scenario often provokes strong motivation to change the SUA – forming partnerships, changing strategy. Conduct a likelihood / impact assessment on each external influence.

44 Scenario Checking What if’s!
Select the external changes most likely to affect the SUA, e.g., over the next 3-5 years, and identify the most reasonable strategies the SUA can undertake to respond to change. Suggest what the SUA might do, or potential strategies, in each of the 3 scenarios to respond to each change. This process should be repeated for each element of the Logic Model: Program structure – Resources, Activities, Outputs; Outcome structure – Short-term, Intermediate, Strategic. REMEMBER – “NOBODY GETS IT RIGHT THE FIRST TIME OUT!”

45 Logic Modeling Exercise 2
Brief application of logic modeling focusing on a typical SUA program

46 Logic Modeling Exercise
GOAL: Provide an opportunity for participants to apply the principles & practices of logic modeling in an interactive setting. INSTRUCTIONS: Participants will identify 1 SUA program (e.g., community awareness, home delivered or congregate meals, education) to Logic Model as it operates currently. The group will construct a Level I & Level II Logic Model. After constructing the Models and checking them using if, then and how, why questions, the participants should discuss how they might tweak the Model to address Choice. Participants will be prepared to present their Models to the whole group.

47 Benefits of Logic Modeling
Communicates the performance story of the program or project. Focuses attention on the most important connections between actions and results. Builds a common understanding among staff and with stakeholders. Helps staff “manage for results” and informs program design. Finds “gaps” in the logic of a program and works to resolve them.

48 Logic Modeling Benefits
Kellogg, 1998

49 The real value -- Most of the value in a logic model is in the process of creating, validating, and modifying the model … The clarity in thinking that occurs from building the model and the depth and breadth of those involved are critical to the overall success of the process as well as the program. Adapted from W.K. Kellogg Foundation Handbook, 1998

50 Social Mechanism

51 Logic Modeling, Performance Measurement, & Evaluation

52 Orientations for Performance Measurement & Evaluation
Accountability, description What objectives/outcomes have been accomplished at what levels? PROGRAM EVALUATION Learning, Program Improvement, Defense What factors, internally and/or externally, influenced my performance? (Retrospective) What effect will this level of performance have on future performance if I don’t do something? (Prospective) What roles (+/-) did context play in my performance? Move to after slide #12 (possibly as a summary slide) and make slide #12 the speaker notes that accompany this slide. Possibly include a copy of the memo about performance measures and evaluation from the managers? As you can see from what we’ve been discussing, there are two approaches to evaluation. We feel that the second approach is more desirable. An example of the second type of evaluation is a coach: the coach should not be keeping an eye on the score, but on the players and on performance data gathering. This leads to the learning and program improvement orientation. Neither orientation is a bad approach. We have to be accountable, but the second orientation provides more information and is more well-rounded. GAO, etc. focus on the first approach. Innovation, etc. is related to the second approach. Be sensitive to the audience response to the different orientations. ********** Orientation of the evaluation refers to the ultimate reason for undertaking the evaluation: accountability, program improvement, program clarification, program development.

53 Key Questions Grantees Need to Answer About Their Programs
What am I doing, with whom, to whom/what? (effort) How well am I doing it? (quality) Customer Feedback Peer Review for Technical Quality User Review for Social Validity Is anybody (anything) better off? (effect) Short-term Long-term What role, if any, did my program play in the results? What role, if any, did the context play? Were there any unintended outcomes? What will happen if I don’t do something? Performance Measurement Program Evaluation

54 Performance Measurement Hierarchy
Matching levels of performance information to program logic elements:
Hierarchy of Performance Measurement Data | Program Logic Hierarchy
7. Measures of impact on overall problem, ultimate goals, side effects, social and economic consequences | 7. End results
6. Measures of adoption of new practices and behavior over time | 6. Practice and behavior change
5. Measures of individual and group changes in knowledge, attitude, and skills | 5. Knowledge, attitude, and skill changes
4. What participants and clients say about the program; satisfaction; interest; strengths; weaknesses | 4. Reactions
3. The characteristics of program participants and clients; numbers, nature of involvement; background | 3. Participation
2. Implementation data on what the program actually offers or does | 2. Activities
1. Resources expended; number and types of staff involved; time expended | 1. Resources

55 Two Questions What is the right Outcome? Short-term Intermediate
Strategic Am I getting at the right Outcome, the right way? Efficiency Effectiveness

56 Types of Measures (Category | Definition | Examples)
Resources/Inputs | Resources consumed by the organization. | Amount of funds; # of FTE; materials, equipment, supplies.
Activities | Work performed that directly produces core products & services. | # of training classes offered as designed; hours of technical assistance training for staff.
Outputs | Products & services provided as a direct result of program activities. | # of technical assistance requests responded to; # of compliance workbooks developed/delivered.
Customers Reached | Measure of target population receiving outputs. | % of target population trained; # of target population receiving technical assistance.
Customer Satisfaction | Measure of satisfaction with outputs. | % of customers dissatisfied with training; % of customers “very satisfied” with assistance received.
Outcomes | Accomplishment of program goals and objectives (short-term and intermediate outcomes; long-term outcomes, or impacts). | % increase in citizen understanding of sound nutrition choices; # of communities adopting research-based practices; % increase in # of community-based choices.
As a result of OMB’s PART, a number of agencies are being required to develop efficiency measures. Resource/input measures represent investments made at the front end rather than something produced by the program; these measures are often used in computing other performance measures, for example cost-effectiveness and productivity measures. Output measures capture the immediate products and services produced by the agency or organization. Outcome measures are the most important because they represent the degree to which a program is achieving its intended results. Efficiency measures look at the ratio of outputs to the dollar cost of the collective resources consumed in producing them. Productivity measures most often measure the rate of production per some specific unit of resource, usually staff or employees; to be meaningful, they must be defined in terms of some unit of time. Cost-effectiveness measures relate cost to outcome measures. Service quality measures pertain to service delivery and measuring the quality of the outputs. Customer satisfaction measures gauge the level of satisfaction with the overall program.

57 Work Quality Measures (Category | Definition | Examples)
Efficiency | The ratio of the amount of input to the amount of output; the focus is on operating efficiency, relating output to some specific resource in terms of cost or time. | Cost per workbook produced; cost per meal delivered.
Productivity | Measure of the rate of production per some specific unit of resource (e.g., staff or employee); the focus is labor productivity. | Number of meals delivered per volunteer.
Cost Effectiveness | Measure that relates outcomes to costs. | Cost per elderly person improving nutrient intake.
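The three measures above reduce to simple ratios. A small illustrative Python sketch (the numbers are invented, not from the presentation):

```python
# Hypothetical program data for a home-delivered meals program.
total_cost = 50_000.0      # program cost in dollars (resource/input)
meals_delivered = 10_000   # output
volunteers = 40            # unit of resource (labor)
clients_improved = 250     # outcome: persons with improved nutrient intake

efficiency = total_cost / meals_delivered           # cost per meal delivered
productivity = meals_delivered / volunteers         # meals per volunteer
cost_effectiveness = total_cost / clients_improved  # cost per person improved

print(f"Cost per meal: ${efficiency:.2f}")                     # $5.00
print(f"Meals per volunteer: {productivity:.0f}")              # 250
print(f"Cost per person improved: ${cost_effectiveness:.2f}")  # $200.00
```

Note the difference in denominators: efficiency and productivity divide by an output or a resource unit, while cost-effectiveness divides by an outcome, which is why it says the most about whether the spending mattered.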

58 Defending your Impact Claim
Did we observe a change in the anticipated outcome(s) as seen in your performance measures? Can we connect any element of our program (what we did) to that change using your performance measures? Are there any rival explanations (usually in the context)?

59 Definitions: Performance Measurement:
The ongoing monitoring & reporting of program progress & accomplishments, using pre-selected performance measures. Performance measure – a metric used to gauge program or project performance. Indicators – measures, usually quantitative, that provide information on program performance and evidence of a change in the “state or condition” in the system. SCRIPT: Definitions: Performance Measurement is a compilation of measures and involves a system for monitoring, tracking and reporting on these measures. A Performance measure is a statistic or value that helps gauge program/project performance. Measures assess the effect of your program. Indicators measure the state or condition of the environment as a result of the program. The distinction is the degree of control. An example of a measure: The number or % of Tribal residents served by a water sewer system. An example of an indicator: Increase in Tribal residents with access to clean drinking water. Program evaluation is systematic in that it uses a methodology. It also uses performance measurement data to answer how well a program is working and helps explain why.

60 Definitions: Program Evaluation:
A systematic study that uses measurement & analysis to answer specific questions about how well a program is working to achieve its outcomes & why.

STANDARD / DESIRED LEVEL OF PROGRAM PERFORMANCE vs. ACTUAL LEVEL OF PROGRAM PERFORMANCE yields a + or - DISCREPANCY

Example Standard: 95% of targeted community-based treatment facilities will adopt BMPs by June 2006. Performance: 65% of targeted community-based treatment facilities adopt BMPs by June 2006. Managers’ Question: Should we act and if so, what should we do? Prospective Evaluation Question: What effect will this observed level of performance have on predicted longer-term impacts? Retrospective Evaluation Question: What programmatic or contextual factors influenced the observed level of performance? SCRIPT: Bullet 1: Evaluation Client The individual who commissions the evaluation. They typically provide the resources that enable the evaluation to be undertaken. They also ask questions that will be the focus of the evaluation and will use the results. Bullet 2: Stakeholders Individuals with a vested interest in the evaluation. It is important to involve stakeholders early in the evaluation process to ensure that evaluation questions that are of importance to them are considered. Bullet 3: Evaluator Carries out the evaluation. The evaluator can be either independent (audience is the company employing you, i.e., National Science Foundation), internal, or external (where the audience is the project staff). Note that for the Program Assessment Rating Tool, external evaluators are required. Bullet 4: Evaluation Consultants Consultants are professionals that have specific expertise in a subject area.
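The discrepancy logic in this example is simple arithmetic; a short Python sketch of the managers' check (values taken from the example above):

```python
# Compare actual performance against the pre-set standard; a negative
# discrepancy signals a shortfall that should trigger evaluation questions.
standard = 0.95  # 95% of targeted facilities expected to adopt BMPs
actual = 0.65    # observed adoption rate

discrepancy = actual - standard  # negative = performance shortfall
print(f"Discrepancy: {discrepancy:+.0%}")  # -30%
if discrepancy < 0:
    print("Shortfall: ask the retrospective question (what influenced this?)"
          " and the prospective question (what happens if we do nothing?).")
```

The point of the diagram on the previous slide is that the discrepancy itself is only a trigger; performance measurement detects the gap, and program evaluation explains it.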

63 The Logic Model & Evaluation
Longer term outcome (STRATEGIC AIM) Intermediate Short term Customers Outputs Activities Resources/ Inputs WHY HOW PROGRAM RESULTS FROM EXTERNAL CONDITIONS INFLUENCING PERFORMANCE (+/-)

64 Assessing Strength of Evaluation Design for Impact
Is the population representing the counterfactual equivalent in all pertinent respects to the program population before that population is exposed to the intervention? (selection bias) Is the intervention the only force that could cause systematic differences between the 2 populations once exposure begins? (independence) Is the full force of the intervention applied to the program population, and is none applied to the counterfactual? (implementation evaluation)

65 In the end Logic Models:
Enable planners to: Develop a more convincing, plausible argument RE how their program is supposed to work to achieve their outcomes & communicate this to funding agencies & other stakeholders. Focus their PM/PE on the right elements of performance to enable program improvement & the estimation of causal relationships between & among elements. Be better positioned to present & defend their claims about their program performance to external stakeholders.

