Logic Model of Program Performance


1 Logic Model of Program Performance
Ensuring that our actions (INPUTS) result in the desired (OUTCOMES)

2 “The Big Five”
- View evaluation as learning, integrated into the way we work
- Build evaluation in up front
- Ask ‘tough questions’
- Make measurement meaningful
- Be accountable to the highest professional standards

3 Why the Logic Model?
- Shows the difference between what we do and the impact we are having
- Provides a common vocabulary
- Focuses on quality and continuous improvement

4 Why Measure? What gets measured gets done.
- If you don’t measure the results, you can’t tell success from failure.
- If you can’t see success, you can’t reward it.
- If you can’t reward success, you’re probably rewarding failure.
- If you can’t see success, you can’t learn from it.
- If you can’t recognize failure, you can’t correct it.
- If you can demonstrate results, you can win public support.

5 The logic model contains six components with Inputs-Outputs-Outcomes being central to the model:
- Situation: the context and need that gives rise to a program or initiative; logic models are built to respond to existing and projected situations.
- INPUTS: the resources, contributions, and investments made in response to the situation.
- OUTPUTS: the activities, products, methods, and services that are implemented.
- OUTCOMES: the targeted and measurable results and benefits for individuals and/or the school community.
- Environment: where the program exists and what influences the implementation and success of the initiative, including political, economic, and human-resource factors.
- Assumptions: the beliefs we have about the program, the participants, and the way we expect the program to operate; the principles that guide our work. Assumptions have a significant and direct impact on expected outcomes.
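The six components can also be captured in a simple data structure. Below is a minimal Python sketch (not part of the original presentation); the class, field names, and the example program are illustrative assumptions, loosely based on the after-school example used later in the deck.

```python
# Minimal sketch only: one possible way to record a logic model's six components.
# All names and example values are illustrative, not prescribed by the presentation.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    situation: str                                         # context and need the program responds to
    inputs: List[str] = field(default_factory=list)        # resources, contributions, investments
    outputs: List[str] = field(default_factory=list)       # activities, products, methods, services
    outcomes: List[str] = field(default_factory=list)      # targeted, measurable results and benefits
    environment: List[str] = field(default_factory=list)   # influential external factors
    assumptions: List[str] = field(default_factory=list)   # beliefs and guiding principles

recovery_program = LogicModel(
    situation="Students are falling behind academically and need structured support",
    inputs=["Staff time", "District funding", "Designed curriculum", "Materials"],
    outputs=["After-school study sessions", "Small-group tutoring", "Student participation"],
    outcomes=["Students adopt new study practices", "Improved academic achievement"],
    environment=["District budget cycle", "Family schedules", "Competing activities"],
    assumptions=["Targeted students will attend regularly"],
)
print(recovery_program.outcomes)
```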

6 Logic Model (diagram): SITUATION → INPUTS → OUTPUTS → OUTCOMES

7 Everyday Logic Model (diagram): HUNGER → Get Food → Eat Food → Feel Better

8 Logic Model: What Is It?
- A graphic representation of the program “theory” or “action”
- The relationships between inputs, outputs, and outcomes
- A logical chain of if-then relationships
- This is at the core of effective program development

9 Logical Linkages: A Series of If-Then Relationships
IF the program invests time & money (INPUTS), THEN a resource inventory can be developed (OUTPUT);
IF a resource inventory is developed, THEN students will know what is available;
IF students know about and access the available services, THEN students will have their needs met (OUTCOMES).
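As a further illustration (not from the original slides), the same chain can be written out so that each adjacent pair of steps reads as an if-then link; the step text below is taken from the slide, while the code itself is only a sketch.

```python
# Sketch only: the slide's if-then chain as an ordered list, where each adjacent
# pair of steps reads as "IF <earlier step>, THEN <later step>".
chain = [
    "Program invests time & money",            # INPUTS
    "Resource inventory can be developed",     # OUTPUT
    "Students will know what is available",    # OUTCOME (short term)
    "Students know/access services available", # OUTCOME (medium term)
    "Students will have needs met",            # OUTCOME (long term)
]

for earlier, later in zip(chain, chain[1:]):
    print(f"IF {earlier}, THEN {later}")
```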

10 Logic Model: Program Performance Framework
INPUTS (what we invest): Staff, District support, Students, Parents, Volunteers, Money, Materials, Equipment, Technology
OUTPUTS:
- Activities (what we do): Workshops, Meetings, Counseling, Facilitation, Assessments, Product development, Media work, Teaching
- Participation (who we reach): Students, Parents, Staff, Community
OUTCOMES:
- Short term (what the short-term results are, learning): Awareness, Knowledge, Attitudes, Skills, Opinions, Aspirations, Motivations
- Medium term (what the medium-term results are, action): Behavior, Practice, Decisions, Policies, Social action
- Long term (what the impacts are, conditions): Social, Economic, Civic
SITUATION and ENVIRONMENT: influential factors surrounding the program

11 A K-12 Education Example: After School Academic Recovery Program
The school/district invests time and resources → a variety of educational activities are provided for students → students who participate gain knowledge and change personal learning practices, resulting in improved academic achievement.

12 LOGIC MODEL: Program Performance
INPUTS (program investments: what we invest) → OUTPUTS: Activities (what we do) and Participation (who we reach) → OUTCOMES: Short, Medium, and Long Term (what results)

13 INPUTS What we’re putting in . . .
Staff, Money, Time, Volunteers, Partners, Equipment, Technology

14 OUTPUTS What we’re producing . . .
ACTIVITIES (what we do): Workshops, Counseling, Educational research, Facilitation, Program development, Curriculum design, Training, Conferences, Media
PARTICIPATION (who we reach): Students, Parents, Staff, Community

15 OUTCOMES What results for students?
SHORT TERM (learning): Awareness, Knowledge, Attitudes, Skills, Opinions, Aspirations, Motivation
MEDIUM TERM (action): Behavior, Practice, Decisions, Policies
LONG TERM (conditions): Academic, Social, Emotional, Civic

16 Example Logic Model
INPUTS: Designed curriculum, Staff, Money, Materials
OUTPUTS: Provide the program; the targeted audience attends
OUTCOMES: Students increase knowledge of study skills → students learn new ways to allocate study time → students use new learning practices → improved student achievement

17 ASSUMPTIONS
Beliefs about:
- the program
- the participants
- the way the program will operate
- how resources and staff will be engaged
- the theory of action
Assumptions are often the reason for poor results. Check and test assumptions: identify potential barriers for each ‘if-then’ sequence.

18 OUTCOMES vs. ACTIVITIES
BE OUTCOME DRIVEN, NOT ACTIVITY DRIVEN

19 WHAT ARE OUTCOMES? Outcomes are the benefits or results of a program. They are changes or improvements for students, staff, parents, or the organization that occur during or after the program. Outcomes represent the difference between the original condition and the condition that exists after implementation. Be outcome driven, not activity driven.
ACTIVITY-DRIVEN examples:
- To provide after-school study sessions
- To teach algebra concepts to students
- To engage students in best-practice study methods
OUTCOME-DRIVEN examples:
- Students will have increased capacity to successfully complete homework
- Student algebra test scores will improve
- Students will demonstrate the ability to match study skills with learning tasks

20 INTERMEDIATE OUTCOMES: THE CHAIN OF OUTCOMES
Outcomes often fall along a continuum from shorter to longer-term results over time:
IMMEDIATE OUTCOMES (short-term learning) → INTERMEDIATE OUTCOMES (medium-term action) → FINAL OUTCOMES (long-term impacts)
EXAMPLES (immediate → intermediate → final):
- Students know how to develop a learning plan → Students use a learning plan → Students access needed class offerings
- Students increase their knowledge of and ability to complete algebraic equations → Student algebra grades improve → Student enrollment in advanced math courses increases
- Students are more aware of school services → Students access appropriate school services → Student services performance improves
- Students learn to develop a college-prep plan → Students use and follow a college-prep plan → Admissions to desired colleges increase

21 HOW FAR OUT THE OUTCOME CHAIN DO WE GO?
What is logical? What is realistic? What is meaningful?

22 WHICH OUTCOME?
The question often arises: which outcome in the chain will be the point where the value of the program is assessed and judged? What is ‘good enough’?
- The school will develop and facilitate an after-school academic recovery program.
- Small (3-5 student) learning groups will be established and will function with the direction and instruction of a teacher.
- Homework completion will increase through the use of the student learning groups.
- Student achievement will improve with the assistance of the student learning groups facilitated by the teacher.

23 When Selecting Outcomes think about:
- Importance: Which outcomes are the most important?
- Meaningfulness: Which outcome or benefit is meaningful for the participants and stakeholders?
- Realistic: What is realistic given the nature of the problem and what we can expect to influence?
- Reasonable: Which outcomes are reasonable considering our investment, what we did, and whom we reached?
Outcomes must realistically reflect the changes, benefits, and improvements that the program can influence.

24 Issues in Defining Outcomes
- There is no right number of outcomes. The number of outcomes selected by your program will depend upon the nature and purpose of the program, its resources, and the size and number of constituencies represented (e.g., students, staff, parents).
- There may be more than one “outcome track.” The program may have several chains of events, usually linked to different target groups or programming components.
- In some cases, immediate outcomes may seem like outputs. The confusion occurs because initial outcomes may not represent major change. Critically ask yourself, “Is this an outcome or an output?” This is similar to the distinction between instructional and learning objectives.
- The more immediate the outcome, the more influence the program has over its achievement.

25 Issues in Defining Outcomes (continued)
- Conversely, the longer-term the outcome, the less direct influence the program has over its achievement and the more likely other, extraneous forces are to intervene.
- The timing of the measurement strategy is a critical design decision; it influences the reliability and validity of the measurement strategy and the resulting data.
- The fact that other forces affect an outcome doesn’t mean it shouldn’t be included. Despite the influence of other factors, you will want to measure and track these outcomes in order to understand what effect the program has and what might be done to achieve the desired results.
- Long-term outcomes, however, should not go beyond the program’s purpose or target audience. Think about what the program is designed to do and where its influence is likely to be felt, and focus the outcome measurement at that level. Likewise, keep the outcome measures focused on the targeted audience.

26 What does a logic model look like?
- Often it is a graphic display of boxes or columns arranged horizontally or vertically.
- Arrows may be used to depict causal relationships and the sequence of events.
- The level of detail depends upon your purpose and the scope of the project: SIMPLE logic models are useful when communicating with external audiences; more DETAILED logic models are useful when developing consensus internally among design-team members.

27-29 SAMPLE LOGIC MODELS (diagrams): INPUTS → OUTPUTS → OUTCOMES

30 LIMITATIONS OF LOGIC MODEL
- A logic model only represents reality; it is not reality.
- Programs are not linear; they are dynamic sets of interrelationships that rarely follow a sequential order.
- A logic model focuses on expected outcomes; you also need to pay attention to unintended or unexpected outcomes: positive, negative, or neutral.
- The program is likely to be just one of many factors influencing outcomes; consider other factors that may be affecting the observed outcomes.

31 BENEFITS
- Brings detail to broad goals
- Shows the chain of events that links inputs to outcomes
- Builds understanding and consensus
- Identifies gaps in logic and uncertain assumptions
- Signals what to evaluate and when
- Summarizes a complex program for communication with external audiences

32 BUILDING A LOGIC MODEL
- New program or existing program
- Team and collaboration; involvement of others
- Keep it dynamic

33 CREATING A LOGIC MODEL Where to Start?
The process of developing a logic model should bring all key stakeholders to a shared understanding of what the program is and what it will do.
AN EXISTING PROGRAM: You might start by asking, “What is it that we do?” “What are we hoping to accomplish?” “What does our program consist of?” “Who are we reaching?” “What results are we seeking?”
A NEW PROGRAM: If you are in the planning stage of a new program, you might start with the long-term expected end result (the impact) and work backwards: “What is our long-term desired result?” “What will be different as a result of this program?” “What must happen in each preceding step to get us there?”
A logic model is dynamic. It will change as the program changes.

34 LOGIC MODEL: WORKSHEET
Program: __________________________________  Goal: __________________________________
Columns: INPUTS | OUTPUTS (Activities, Participation) | OUTCOMES-IMPACT (Short, Medium, Long-Term)

35 CHECK YOUR LOGIC MODEL
- Are the outcomes really outcomes?
- Is the longest-term outcome meaningful? Logical? Realistic?
- Does it represent the program’s purpose and its response to the situation?

36 HOW GOOD IS YOUR LOGIC MODEL?
Ask yourself:
- Is each listed outcome truly an ‘outcome’? Does the logic model clearly separate outcomes from outputs, or are the distinctions blurred?
- Does the highest-level outcome represent a meaningful benefit or value to the target audience? Can it be associated with the program?
- Is the model truly logical? Do the relationships among the program elements make sense? Are the causal relationships supported? (1) Start at the inputs and ask “why?” at each level: Why do we need these inputs? Why do we need to conduct these activities? (2) Start at the output level and ask how you will produce these outcomes by looking at the items immediately preceding them.

37 HOW GOOD IS YOUR LOGIC MODEL – Continued . . .
Ask yourself:
- Are the resources realistic? Is what you intend to do even possible given your resources?
- How valid are the assumptions? Are they based on experience and research, or are they your best guesses?
- Does the logic model reflect the opinions and support of key stakeholders? Were any stakeholders left out?

38 EVALUATION PLAN
Columns: What do you want to know? | Indicators (how will you know it?) | Source of information | Method to collect the information | Schedule (when/where)

39 EVALUATION QUESTIONS- What do you want to know?
QUESTION → TYPE OF EVALUATION
- Need? → Needs assessment
- Process or implementation? → Process evaluation
- Outcomes or impact? → Performance analysis
- Costs and efficiencies? → Cost-benefit analysis; ROI (return on investment)

40 INDICATORS- HOW WILL YOU KNOW IT?
The evidence or measures that indicate what you wish to know or see:
- Often multiple indicators are necessary
- May be quantitative or qualitative

41 INDICATORS (Data)
An indicator is the evidence or information that represents the phenomenon you are asking about. For example:
- Indicator of academic achievement = improved CST scores
- Indicator of improved performance = improved student projects
Indicators help you know something. They are measurable or observable: they can be seen (e.g., observed behavior), heard (e.g., participant responses), or read (e.g., student work). For each aspect you want to measure, ask yourself: What would it look like? If I were a visitor to the program, what would I see? Invite others to give their perspectives and check your ideas with them.

42 INDICATORS (continued): Indicators should be . . .
- Sensible: Indicators need to make sense in relation to what you are asking about.
- Direct: An indicator should measure as directly as possible what it is intended to measure. For example, if the outcome being measured is improved attendance, then the best indicator is the school’s attendance records. The number and percent of students who receive attendance-improvement education does not directly measure the desired result.
- Proxy: Often, however, we do not have direct measures, or we are constrained by time and resources. Then we use proxy measures as our best guesses. For example, a proxy measure of student satisfaction with educational programs might include enrollment patterns and changes in classroom performance.

43 INDICATORS (continued)
- Specific: Clearly define the indicator so that anyone would understand it in the same way and would collect the same data. Example: “Number and percent of students who adopted a best learning practice in the past year.” Which learning practices? Which students? What is the time period? What constitutes adoption?
- Useful: Indicators need to help us understand what it is we are measuring.
- Practical: Indicators need to be practical; that is, we need to be able to collect the data in a timely manner at reasonable cost. Is it reasonable to collect the data given our existing or potential resources?
- Adequate: There is no ‘correct’ number or type of indicators. The number of indicators you choose depends upon the result being measured, the level of information you feel you need, and the resources available. Often more than one indicator is necessary to capture a concept, but too many is also problematic: a large number of indicators may mean that the result is too complex or not understood well enough.
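To show why that specificity matters in practice, here is a small Python sketch (not from the original presentation) that computes the “number and percent of students who adopted a best learning practice” once the practices, students, and time period have been pinned down. All records, names, and dates are hypothetical.

```python
# Sketch with hypothetical data: the indicator only becomes computable once
# "which practices", "which students", and "what time period" are defined.
from datetime import date

# (student_id, practice_adopted, date_adopted); None means no adoption recorded
records = [
    ("s001", "spaced review", date(2024, 10, 3)),
    ("s002", "note summarizing", date(2024, 11, 18)),
    ("s003", None, None),
]

target_practices = {"spaced review", "note summarizing"}        # which learning practices
period_start, period_end = date(2024, 8, 1), date(2025, 6, 30)  # what time period

adopters = [
    sid for sid, practice, adopted_on in records
    if practice in target_practices
    and adopted_on is not None
    and period_start <= adopted_on <= period_end
]
pct = 100 * len(adopters) / len(records)
print(f"{len(adopters)} of {len(records)} students ({pct:.0f}%) adopted a targeted practice")
```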

44 INDICATORS (continued)
- Quantitative or qualitative: Indicators are often expressed as a number or percent that shows attainment: number of…, percent of…, ratio of…, incidence of…, proportion of… Indicators do not always have to be a number: qualitative indicators may be important, such as proficient and/or mastery-level performances. “Not everything that counts can be counted.”
- Comprehensive: Include indicators that express all relevant aspects of what you want to measure, including possible negative or detrimental aspects as well as the positive. Ask what the negative effects or spin-offs of the program might be, and include indicators for these.

45 INDICATORS: Worksheet
Columns: OUTCOME | INDICATORS

46 INDICATOR REVIEW
Check your indicators according to the criteria below:
- Do the indicators make sense?
- Are the indicators directly related to the outcome? Do they provide evidence of what you want to know?
- Are the indicators specific, and do they clearly define what information will be used?
- Do the indicators provide useful information?
- Is it practical to think that the data can be collected in a timely fashion with the resources available?
- Do the indicators adequately measure the concept?
- Are the indicators comprehensive, and do they consider possible negative or detrimental effects?

47 INDICATORS: CONSIDERATIONS
- Direct (or proxy)
- Sensible, understandable
- Reliable, trustworthy
- Available
- Useful, credible

48 SOURCE AND METHOD OF DATA COLLECTION
Sources of information: Participants/Students, Parents, Staff/District, Community
Methods of collecting information: Survey, Interview, Observation, End-of-program questionnaire, Focus group, Specific school data
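One way to keep sources, methods, and schedule tied to each outcome is to store the evaluation-plan rows together, as in this minimal Python sketch (not from the original presentation); every field value below is illustrative.

```python
# Sketch only: each row pairs an outcome with its indicator, source, method, and
# schedule, mirroring the evaluation-plan columns. Values are illustrative.
collection_plan = [
    {
        "outcome": "Students adopt new study practices",
        "indicator": "Percent of participants reporting use of a learning plan",
        "source": "Participants/Students",
        "method": "End-of-program questionnaire",
        "schedule": "Last week of each semester",
    },
    {
        "outcome": "Improved student achievement",
        "indicator": "Change in course grades versus the prior term",
        "source": "Staff/District",
        "method": "Specific school data",
        "schedule": "End of the school year",
    },
]

for row in collection_plan:
    print(f"{row['outcome']}: {row['indicator']} "
          f"[{row['source']} / {row['method']} / {row['schedule']}]")
```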

49 EVALUATION PLAN
FOCUSING THE EVALUATION:
- What do we want to know? (the evaluation questions)
- How will we know it? (indicators and evidence)
COLLECTING THE INFORMATION:
- Sources: Who will have this information?
- Methods: How will we get the information?
- Schedule: When will the information be collected?

50 EVALUATION PLAN (continued)
- How will the data be analyzed and interpreted? By whom and when?
- How will the results be communicated? By whom and when? Who is the receiving audience?

51 “The Big Five”
- View evaluation as learning, integrated into the way we work
- Build evaluation in up front
- Ask ‘tough questions’
- Make measurement meaningful
- Be accountable to the highest professional standards

