
1 Process Evaluation: Considerations and Strategies CHSC 433 Module 4/Chapter 8 L. Michele Issel UIC School of Public Health

2 Making Widgets
Count not only the widgets, but who gets the widgets and what goes into making the widgets.

3 Definitions
- Systematic examination of coverage and delivery
- Measuring inputs into the program
- Finding out whether the program has all its parts, and whether those parts are functional and operational

4 Program Components
Discrete interventions, or groups of interventions, within the overall program that are designed to affect recipients independently or synergistically. Objectives are written per component.

5 Types of Implementation Evaluations
- Effort: quantity of input
- Monitoring: use of MIS (management information system) information
- Process: internal dynamics, strengths, and weaknesses
- Component: assess distinct program parts
- Treatment: what was supposed to have an effect

6 Purpose of Process Evaluation
- Assurance and accountability
- Understanding of outcomes
- Mid-course corrections
- Program replicability

7 From the Perspective of:
- The evaluator
- Funders (accountability)
- Program management

8 Stakeholders and Expectations
- Focus on the explicit:
  - Objectives
  - Program descriptions
- Uncover the implicit:
  - Review program theory
  - Review objectives
  - Role play possible outcomes

9 Program Theory Components

10 Organizational Plan
How to garner, configure, and deploy resources and organize program activities so that the intended service is developed and maintained.

11 Service Utilization Plan
How the intended target population receives the intended amount of the intended intervention through interaction with the program’s service delivery system.

12 Rigor in Process Evaluation
- Appropriateness of the method
- Sampling strategy
- Validity and reliability
- Timing of data collection

13 Key Decision 1
How much effort to expend, i.e., what data are needed to accurately describe the program. Choose based on:
- Expected across-site variation
- Making the report credible
- The literature about the intervention

14 Key Decision 2
What to look for, i.e., which program features are most critical and valuable to describe. Choose based on:
- What is most often cited in the program proposal
- The budget
- What may be related to program failure

15 Go Back to Objectives
- Process objectives per program component, each specifying:
  - How much
  - Of what
  - Will be done
  - By whom
  - By when
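One way to keep these prompts concrete is to record each process objective as a small data structure. A minimal sketch follows; the class and field names are illustrative assumptions that mirror the slide's prompts, not anything from the slides themselves.

```python
# A minimal sketch of one process objective per component.
# Class and field names are hypothetical, mirroring the slide's prompts.
from dataclasses import dataclass
from datetime import date

@dataclass
class ProcessObjective:
    component: str  # which program component this objective belongs to
    how_much: int   # target quantity
    of_what: str    # the service or output to be delivered
    by_whom: str    # responsible role or staff
    by_when: date   # deadline

objective = ProcessObjective(
    component="Outreach",
    how_much=200,
    of_what="home visits completed",
    by_whom="community health workers",
    by_when=date(2026, 6, 30),
)
```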

16 Components and Objectives

17 Possible Foci of Process Evaluation
- Place: site, program
- People:
  - Practitioner/provider
  - Recipient/participant
- Processes:
  - Activities
  - Processes
  - Structure
- Policy

18 Levels of Analysis
- Individuals:
  - Program participants
  - Program providers
- Programs:
  - As a whole
- Geographic locations:
  - Regions and states

19 Types of Questions
- What was done, and by whom?
- How well was it done, and how much was done?
- What contributed to success or failure?
- How much of what resources were used?
- Is there program drift?

20 Sources of Program Variability
- Staff preferences and interests
- Materials availability and appropriateness
- Participants’ expectations, receptivity, etc.
- Site physical environment and organizational support

21 Roots of Program Failure

22 Causes of Program Failure
- Non-program:
  - No participants
  - No program delivered
- Wrong intervention:
  - Not appropriate for the problem
- Unstandardized intervention:
  - Across-site, within-program variations

23 Program Failure (cont.)
- Mismanagement of program operations
- Wrong recipients
- Barriers to the program
- Program components unevenly delivered or monitored

24 Data Sources from the Program
- Resources used
- Participant-provided data:
  - Quality
  - Match with the process evaluation
- Existing records:
  - Sampling of records
  - Validity and reliability issues

25 Data Sources from the Evaluator
- Surveys and interviews of participants
- Observation of interactions
- Surveys and interviews of staff

26 Evaluating Structural Inputs
- Organizational structure:
  - Supervision of staff
  - Place in the organizational hierarchy
- Facilities, equipment
- Human resources:
  - Leadership
  - Training

27 Measures of Delivery
- Measures of program delivery
- Measures of coverage
- Measures of effectiveness

28 Measures of Implementation
- Measures of volume (outputs): number of services provided
- Measures of workflow:
  - Client time
  - Staff work time
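Both kinds of measures are simple aggregations over service records. A hypothetical sketch, assuming per-service records with time fields (the field names are assumptions, not from the slides):

```python
# Hypothetical service records; field names are illustrative.
records = [
    {"client_minutes": 30, "staff_minutes": 45},
    {"client_minutes": 20, "staff_minutes": 35},
    {"client_minutes": 50, "staff_minutes": 60},
]

volume = len(records)  # output measure: number of services provided
avg_client_time = sum(r["client_minutes"] for r in records) / volume  # workflow: client time
total_staff_time = sum(r["staff_minutes"] for r in records)           # workflow: staff work time
print(volume, avg_client_time, total_staff_time)  # 3 33.33... 140
```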

29 Targets, Recipients, and Coverage

30 Measures of Coverage
- Undercoverage = (# recipients in need) / (# in need)
- Overcoverage = (# recipients not in need) / (# recipients)
- Coverage efficiency = (undercoverage - overcoverage) × 100
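A minimal sketch of these ratios in Python, using hypothetical counts; the function and argument names are illustrative:

```python
def coverage_measures(recipients_in_need, total_in_need,
                      recipients_not_in_need, total_recipients):
    """Coverage ratios as defined on the slide above."""
    undercoverage = recipients_in_need / total_in_need        # share of those in need who are reached
    overcoverage = recipients_not_in_need / total_recipients  # share of recipients not in need
    efficiency = (undercoverage - overcoverage) * 100         # 100 = full coverage, no overcoverage
    return undercoverage, overcoverage, efficiency

# Example: 300 of 500 people in need are served, and 50 of the
# 350 total recipients turn out not to be in need.
print(coverage_measures(300, 500, 50, 350))  # (0.6, 0.1428..., 45.71...)
```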

31 Measures of Effectiveness
- Effectiveness index = % of the program standard reached, per program component
- Program effectiveness index = sum of the component effectiveness indexes / # of program components
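A short sketch of both indexes, assuming each component reports a (reached, standard) pair; the numbers and names are hypothetical:

```python
def effectiveness_index(reached, standard):
    """Percent of the program standard achieved for one component."""
    return 100 * reached / standard

def program_effectiveness_index(components):
    """Mean of the component effectiveness indexes."""
    indexes = [effectiveness_index(reached, standard) for reached, standard in components]
    return sum(indexes) / len(indexes)

# Example: three components as (reached, standard) pairs.
components = [(80, 100), (45, 60), (200, 250)]  # component indexes: 80, 75, 80
print(program_effectiveness_index(components))  # 78.33...
```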

32 Bias in Participation
- Due to self-selection
- Results in under- or overcoverage
- May be related to recruitment
- Can be identified with good data collection (monitoring)

33 Measures of Efficiency
- Ratio of input per output
- Productivity per staff member, per cost, per hour
- Cost per participant, per intervention
- Etc.
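These are straightforward ratios once the program totals are known; a hypothetical sketch:

```python
# Illustrative program totals; all numbers are hypothetical.
total_cost = 120_000.0     # payments for the period
participants_served = 400
staff_hours = 2_500.0
services_delivered = 1_800

cost_per_participant = total_cost / participants_served      # 300.0
services_per_staff_hour = services_delivered / staff_hours   # 0.72
print(cost_per_participant, services_per_staff_hour)
```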

34 Evaluating Costs
- Payments by the agency
- Payments by secondary funders
- Payments by participants (versus charges!)

35 Monitoring and CQI
- Similar types of data presentation:
  - Control charts
  - Fishbone diagrams
  - Flow charts
  - Gantt charts
  - Etc.
- Overlapping purposes
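Of these displays, the control chart is the most computational. A minimal sketch using the conventional 3-sigma Shewhart limits; the data and the choice of limits are assumptions, not from the slides:

```python
from statistics import mean, stdev

weekly_services = [42, 45, 39, 47, 44, 41, 43, 46, 40, 44]  # hypothetical monitoring data

center = mean(weekly_services)
sigma = stdev(weekly_services)
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

# Points outside the limits would signal the process may be out of control.
flags = [w for w in weekly_services if not lcl <= w <= ucl]
print(f"center={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}, flags={flags}")
```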

36 Reaching Conclusions
- Compare data to objectives
- Compare data to needs-assessment data
- Compare data to other sites or other programs

37 Worksheet Exercise
For each program objective:
- What are the focus and level of the process evaluation?
- What data sources are needed?
- Who collects the data?

38 References
- Rossi, Freeman, & Lipsey (1999). Evaluation: A Systematic Approach. Sage Publications.
- Patton (1997). Utilization-Focused Evaluation. Sage Publications.
- King, Morris, & Fitz-Gibbon (1987). How to Assess Program Implementation. Sage Publications.
- Weiss (1972). Evaluation Research. Prentice Hall.

