Slide 1: Performance Measurement and Evaluation Basics (2014 AmeriCorps External Reviewer Training)

Slide 2: Topics
- Theory of Change
- Logic Models
- Performance Measurement and Evaluation Basics

Slide 3: Theory of Change Elements
- Community Problem/Need
- Specific Intervention
- Intended Outcome

Slide 4: Theory of Change
- A cause-and-effect relationship between the intervention and the outcome
- If...then... logic

Slide 5: Theory of Change Elements, with Evidence
- Community Problem/Need: supported by statistics documenting the need
- Specific Intervention
- Intended Outcome
- Evidence guides the choice of intervention and supports the cause-effect relationship

Slide 6: Theory of Change: Evidence Demonstrates...
- A likelihood that the proposed intervention (design/dosage) will lead to the outcome
- The stronger the evidence, the stronger the relationship between intervention and outcome

Slide 7: Theory of Change Example: Riverton Literacy Corps
- Community Problem/Need: Children reading below grade level in 3rd grade
- Evidence: Statistics on the number of students reading below grade level in the program's service area; research on why reading proficiency by 3rd grade is important

Slide 8: Theory of Change Example: Riverton Literacy Corps (continued)
- Community Problem/Need: Children reading below grade level in 3rd grade
- Evidence: Statistics on the number of students reading below grade level in the program's service area; research on why reading proficiency by 3rd grade is important
- Intended Outcome: Students are able to read at 3rd grade level (as measured by the 3rd grade reading exam)

Slide 9: Theory of Change Example: Riverton Literacy Corps (continued)
- Community Problem/Need: Children reading below grade level in 3rd grade
- Evidence: Statistics on the number of students reading below grade level in the program's service area; research on why reading proficiency by 3rd grade is important
- Intended Outcome: Students are able to read at 3rd grade level (as measured by the 3rd grade reading exam)
- Specific Intervention: Individualized tutoring 3 times/week for 20 minutes on five "building block" literacy skills through reading, writing, and verbal communication activities
- Evidence: Research on building-block skills leading to reading proficiency; research on the design, frequency, and duration of tutoring sessions

Slide 10: Logic Models
- A visual summary or snapshot of your program that communicates how your program works, the resources you have to operate it, the activities you plan, and the outcomes you hope to achieve
- A well-specified conceptual framework that identifies key components of the proposed process, product, strategy, or practice and describes the relationships among the key components and outcomes, theoretically and operationally (NOFO p. 18)
- The purpose of a logic model is to describe how a program will create change (theory of change)

Slide 11: How a Logic Model Works (Figure 1: How to Read a Logic Model)
Your Planned Work -> Your Intended Results
- Resources/Inputs: Certain resources are needed to operate your program
- Activities: If you have access to those resources, then you can use them to accomplish your planned activities
- Outputs: If you accomplish your planned activities, then you will hopefully deliver the amount of product and/or service that you intended
- Outcomes/Impacts: If you accomplish your planned activities to the extent you intended, then your participants will benefit in certain ways
- Long-Term Goal: If these benefits to participants are achieved, then certain changes in organizations, communities, or systems might be expected to occur
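
The if-then chain above can be captured in a tiny data structure. The Python sketch below is purely illustrative (the LogicModel class and the Riverton-style example values are hypothetical, not part of the CNCS template); it only shows how planned work flows into intended results.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class LogicModel:
    """Minimal sketch of a logic model: planned work feeds intended results."""
    inputs: List[str] = field(default_factory=list)      # resources needed to operate the program
    activities: List[str] = field(default_factory=list)  # interventions delivered with those resources
    outputs: List[str] = field(default_factory=list)     # amount of product/service delivered
    outcomes: List[str] = field(default_factory=list)    # benefits to participants
    long_term_goal: str = ""                              # changes in organizations, communities, or systems

    def chain(self) -> str:
        """Render the if-then logic as a single readable string."""
        return " -> ".join([
            "Inputs: " + "; ".join(self.inputs),
            "Activities: " + "; ".join(self.activities),
            "Outputs: " + "; ".join(self.outputs),
            "Outcomes: " + "; ".join(self.outcomes),
            "Long-term goal: " + self.long_term_goal,
        ])


# Example values loosely based on the Riverton Literacy Corps example (illustrative only).
model = LogicModel(
    inputs=["AmeriCorps members", "tutoring curriculum"],
    activities=["Individualized tutoring 3 times/week for 20 minutes"],
    outputs=["Number of students tutored"],
    outcomes=["Students read at 3rd grade level"],
    long_term_goal="Improved literacy in the service area",
)
print(model.chain())
```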

Slide 12: AmeriCorps Logic Model
- Applicants are required to use the logic model template developed by CNCS
- Criteria for assessing the completeness and quality of the logic model will be addressed in the next training module

Slide 13: CNCS Template (logic model template graphic; no additional transcript text)

Slide 14: Applicant Logic Models
- Inputs: Resources, including AmeriCorps members
- Activities (Interventions): Services provided by members and/or volunteers
- Outputs: Units of service delivered (number of beneficiaries served, etc.)
- Outcomes:
  - Short-term (knowledge, skills, attitudes, opinions)
  - Medium-term (behavior)
  - Long-term (condition)

Slide 15: What is Performance Measurement?
- Performance measurement is the ongoing and systematic monitoring and reporting of program accomplishments, particularly progress toward pre-established targets
- It monitors the vital signs of a program
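
As a purely illustrative sketch (the measures and target numbers below are invented, not drawn from the training), performance measurement can be as simple as regularly comparing actual counts against pre-established targets:

```python
# Hypothetical example: track progress toward pre-established performance targets.
targets = {"students tutored": 200, "tutoring sessions delivered": 6000}
actuals = {"students tutored": 150, "tutoring sessions delivered": 4200}

for measure, target in targets.items():
    actual = actuals.get(measure, 0)
    pct = 100 * actual / target
    print(f"{measure}: {actual}/{target} ({pct:.0f}% of target)")
```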

Slide 16: What is Evaluation?
- Evaluation is the use of social science research methods to assess program design, implementation, and effectiveness

Slide 17: Comparing Performance Measurement & Evaluation
- Performance measurement data can show that a change occurred, but not why
- Evaluation results can show whether the change that occurred can be attributed to the intervention
- The more scientifically rigorous the evaluation design, the greater the certainty about causality

Slide 18: Comparing Performance Measurement & Evaluation
What is it?
- Performance measurement: A system of tracking progress in accomplishing specific pre-set targets (activities, outputs, and/or outcomes)
- Evaluation: A formal scientific process for collecting, analyzing, and interpreting data about how well a program was implemented (process evaluation) or how effectively the program accomplished desired outcomes/impacts (outcome/impact evaluation)
Why is it typically used?
- Performance measurement: To gauge program delivery, quality, participant satisfaction, and engagement; to improve products, services, and efficiency; to inform and enhance decision making; and to support planning and program development
- Evaluation: To assess program effectiveness and determine whether the program is responsible for the changes found
How does it work?
- Performance measurement: Monitors a few vital signs related to program performance objectives, outputs, and/or outcomes
- Evaluation: Comprehensively examines programs using systematic, objective, and unbiased procedures in accordance with social science research methods and research designs
Who typically does it?
- Performance measurement: Program staff
- Evaluation: An experienced researcher (often external to the program) who has formal training in evaluation
When is it done?
- Performance measurement: On an ongoing basis
- Evaluation: Periodically

Slide 19: Performance Measurement: Outputs
- The amount of service provided (people served, products created, or programs developed)

Slide 20: Performance Measurement: Outcomes
- Reflect the changes or benefits that occur
- Can reflect changes in individuals, organizations, communities, or the environment
- Address changes in attitudes/beliefs, knowledge/skills, behavior, or conditions

Slide 21: Types of Outcomes
- Attitude/Belief: Thought, feeling
- Knowledge/Skill: Understanding, know-how
- Behavior: Action
- Condition: Situation, circumstance

Slide 22: Different Theories of Change Have Different Outcomes
- Attitude/Belief: Increased interest in school
- Knowledge/Skill: Improved math ability
- Behavior: Increased school attendance
- Condition: Successful completion of high school

Slide 23: Practice: Choosing an Outcome
NEED: 35% of young veterans (18-24 year olds) are unemployed (Department of Veterans Affairs, 2011). Economists cite a lack of marketable civilian skills and the need for education degrees, vocational certifications...
INTERVENTION: National service participants support veterans in completing training programs by assisting in locating appropriate programs, securing financial aid, and providing tutoring resources and internship placements.
Which type of outcome is each of the following?
1. Veterans report increased confidence about finding employment. (attitude)
2. Veterans demonstrate new technical skills. (knowledge/skills)
3. Veterans are placed in jobs. (condition)

Slide 24: Following the Logic
- If AmeriCorps members support veterans to enroll in and complete training programs, then veterans will gain technical skills.
- If veterans gain technical skills, then they will be more likely to find jobs.

Slide 25: Evaluation and Theory of Change
- The theory of change (as expressed in a logic model) is key to evaluation
- Based on a program's theory of change, evaluation tests assumptions/hypotheses about how a program works and whether or not it is achieving what it intended to accomplish

Slide 26: Levels/Types of Evaluation
Two levels/types of evaluation:
1. Implementation (process): Is the program operating as intended?
2. Outcomes (short-, intermediate-, and long-term): What difference did the program make?

Slide 27: Implementation Evaluation: Types of Questions
- Is the program operating as designed, and, if not, how and why?
- Do program staff clearly understand the program's goals and objectives?
- Do staff need additional training to implement the program correctly?
- How consistent is the program's implementation across sites?
- Is the program reaching the intended target population?
- Who are the individuals receiving/participating in the program?
- Is the target population receiving the appropriate services? Why or why not?
- What is the quality of services provided by the program?
- How satisfied are participants with their program experiences?
- Do staff believe that the program is effective? Why or why not?
- Which program elements are or are not working well? Why?

Slide 28: Outcomes/Impact Evaluation: Types of Questions
- What did the program accomplish?
- What impact did the program have on its participants?
- Was the benefit greater with this program as compared with another program?
- Did all types of students or clients benefit from the program, or only specific subgroups?
- Did the program increase participants' awareness, knowledge, and skills?

Slide 29: Internal Validity
- The degree to which measured effects can be attributed to the program rather than to factors other than the program
- High internal validity allows for more plausible causal attribution statements
- Attaining moderate or strong evidence requires that a study have high internal validity

Slide 30: Evaluation Study Designs & Causal Impact
Designs compared by the ability to make statements about causal attribution (strongest to weakest):
- Experimental design studies: randomly assigned groups
- Quasi-experimental design studies: statistically matched groups
- Non-experimental design studies: groups that are not statistically matched, or a group compared to itself

Slide 31: Practice Question 1
Children receiving nutrition, fitness, and counseling services through an after-school program delivered by AmeriCorps members are compared to children attending an after-school program that does not include the AmeriCorps nutrition, fitness, and counseling program. Children in the two groups are matched based on demographic characteristics and baseline measures of nutrition, fitness, and obesity.
What kind of evaluation design is it?

Slide 32: Answer
This is a quasi-experimental design because it compares beneficiaries to a statistically matched group of individuals not receiving the AmeriCorps intervention.
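
A rough sketch of this design, using fabricated data: each child in the AmeriCorps after-school program is matched to the comparison child with the closest baseline score, and follow-up scores are then compared across matched pairs. Real quasi-experimental studies usually match on several covariates (for example, with propensity scores); this only illustrates the idea.

```python
import random

random.seed(0)

# Fictional baseline fitness scores for the program group and a larger comparison pool.
program_baseline = [random.gauss(50, 10) for _ in range(30)]
comparison_baseline = [random.gauss(50, 10) for _ in range(100)]

# Fictional follow-up scores; the program group gets a small boost purely for illustration.
program = [(b, b + random.gauss(5, 3)) for b in program_baseline]
comparison = [(b, b + random.gauss(0, 3)) for b in comparison_baseline]

# Match each program child to the unmatched comparison child with the closest baseline score.
pool = list(comparison)
pairs = []
for baseline, follow_up in program:
    partner = min(pool, key=lambda c: abs(c[0] - baseline))
    pool.remove(partner)  # match without replacement
    pairs.append(((baseline, follow_up), partner))

# Average difference in follow-up scores across the matched pairs.
diff = sum(p[1] - c[1] for p, c in pairs) / len(pairs)
print(f"Average matched-pair difference in follow-up scores: {diff:.1f}")
```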

Slide 33: Practice Question 2
High school students enrolled in a career preparation program delivered by AmeriCorps members are compared to high school students who wanted to join the program but were not selected. 100 students wanted to join the program, but there were only 50 spots. 50 students were randomly selected for the program; the other 50 were placed on a waiting list to join the program.
What kind of evaluation design is it?

Slide 34: Answer
This is an experimental design because individuals were assigned to the beneficiary group or the control group through random assignment.
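
A minimal sketch of the lottery described in the question, with made-up outcome scores: 100 applicants, 50 randomly assigned to the program, and the two groups compared on average outcomes afterward.

```python
import random

random.seed(1)

applicants = list(range(100))      # 100 students wanted to join the program
random.shuffle(applicants)
treatment = set(applicants[:50])   # 50 students randomly selected for the program
control = set(applicants[50:])     # 50 students placed on the waiting list


# Fabricated outcome: a career-readiness score, with a small effect added for the program group.
def outcome(student_id: int) -> float:
    return random.gauss(60, 10) + (8 if student_id in treatment else 0)


scores = {s: outcome(s) for s in applicants}
treatment_mean = sum(scores[s] for s in treatment) / len(treatment)
control_mean = sum(scores[s] for s in control) / len(control)
print(f"Program group mean: {treatment_mean:.1f}")
print(f"Waiting-list group mean: {control_mean:.1f}")
print(f"Estimated effect (difference in means): {treatment_mean - control_mean:.1f}")
```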

Slide 35: Practice Question 3
Students receiving academic mentoring services from AmeriCorps members take a survey that asks their views about the importance of education before participating in the AmeriCorps mentoring program and again after participating in the program.
What kind of evaluation design is it?

Slide 36: Answer
This is a non-experimental design because it looks at changes in the beneficiaries' attitudes over time. There is no comparison group or control group.
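
A sketch of the pre/post design in this question, using fabricated survey scores: each student's attitude score after the program is compared with that same student's score before it. Because there is no comparison or control group, the observed change cannot be attributed to the program alone.

```python
import random

random.seed(2)

# Fabricated 1-5 attitude scores before and after the mentoring program.
pre = [random.randint(2, 4) for _ in range(40)]
post = [min(5, score + random.choice([0, 0, 1, 1, 2])) for score in pre]

changes = [after - before for before, after in zip(pre, post)]
average_change = sum(changes) / len(changes)
improved = sum(1 for c in changes if c > 0)
print(f"Average change in attitude score: {average_change:.2f}")
print(f"Students reporting a more positive attitude: {improved}/{len(changes)}")
```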

Slide 37: Next Steps
To check for understanding and verify that you have completed this orientation session, please complete the assessment at the following link: https://www.surveymonkey.com/s/perfmeasure

