Program Evaluation Principles and Applications PAS 2010

1 Program Evaluation Principles and Applications PAS 2010

2 Session Outline
- Overview of Program Evaluation (Noelle Huntington, Ph.D.)
- Survey Development (Sonja Ziniel, Ph.D.)
- Qualitative Methods (Rani Gereige, MD)
- Q&A session

3 Overview of Program Evaluation
Noelle Huntington, Ph.D., Children's Hospital Boston

4 Noelle Huntington, Ph.D. has no relevant financial relationships to disclose and no conflicts of interest to resolve.

5 Outline
- What is program evaluation?
- Key terms and concepts
- Steps in conducting an evaluation
- Case exercise

6 Learning Objectives
At the end of the session, participants are expected to:
- Understand the key principles of program evaluation
- Appreciate the importance of developing a clear evaluation plan from the start
- Begin to master the process of creating an evaluation plan

7 What is Program Evaluation?
The diligent investigation of a program's characteristics and merits.

8 Examples of Evaluation Questions
- Did the program result in the expected changes in knowledge, attitudes, behaviors, or skills?
- How did the program impact health care practice or patient health status?
- What were participants' reactions to the program?
- Was the program implemented as planned?

9 Process Evaluation
- Assesses the extent to which the program was implemented as intended
- Reported at the end, it supports program credibility
- Reported in real time, it supports program improvement
- Also called formative or implementation evaluation

10 Outcomes Evaluation
- Assesses the extent to which the program met its goals and objectives: the program's effects and benefits
- Also called summative or impact evaluation

11 What are Goals and Objectives?
- Goals: relatively general and long term
- Objectives: the specific ways your program will address the goals; what you intend to achieve

12 Outcomes Evaluation
Take into consideration:
- Who the stakeholders are and what their expectations are (their goals and objectives)
- A 360° perspective

13 Who are Stakeholders?
Any individual or group with an interest in the evaluation of their own performance or the performance of the program:
- Practitioners
- Students
- Patients
- Program developers
- Decision makers
- Providers of resources

14 What is a 360° Perspective?
- Looking for evidence of effectiveness from multiple angles
- When the results from several sources of "weak" data tell a consistent story, the strength of the evidence increases

15 Program Evaluation is Research
You need to:
1. Pose your evaluation questions
2. Set standards of effectiveness
3. Design the evaluation
4. Select participants
5. Collect data
6. Analyze data
7. Report and disseminate results

17 Pose your Evaluation Questions
Take into consideration:
- The specific objectives of your program
- The 360° perspective
- Stakeholders' perspectives
- Process and outcomes
- Limitations in time, money, and measurement tools
- The different "levels of impact"

18 Kirkpatrick's Hierarchy
Kirkpatrick's four levels of training evaluation, from lowest to highest level of impact:
1. Reaction
2. Learning
3. Behavior
4. Results

19 Evaluation questions must be…
- Relevant
- Specific
- Measurable

20 Adding Routine Developmental Screening to Primary Care
- Trained residents and faculty on the importance of standardized developmental screening
- Trained residents and faculty on the use and scoring of the PEDS (Parents' Evaluation of Developmental Status)
- Worked with administrative staff on the implementation process and flow of activity

21 PEDS Evaluation Questions
- Did providers learn to use the PEDS correctly?
- Did parents receive and complete the PEDS?
- What were providers' reactions to the training?
- What were providers' reactions to using the PEDS in clinic?
- What was the impact of adding the PEDS on the overall length of the well-child visit?
- What was the impact of adding the PEDS on identification of developmental concerns?
- What was the impact on parent perceptions of the well-child visit?

22 Set Standards of Effectiveness
- What do you consider to be evidence of success?
- Standards must be meaningful, realistic, and measurable
- Set standards a priori

23 How are Standards Set?
- Accreditation or national organization standards
- Population or community statistics
- Other programs or clinics
- Stakeholders' goals or desires
- Your baseline status

24 PEDS Project: Standards
- Did providers learn to use the PEDS correctly?
  Standard: >90% of completed PEDS will be scored correctly
- Did parents receive and complete the PEDS?
  Standard: >65% of well-child visits will have a completed PEDS
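
As a minimal sketch of how the first standard might be checked, the snippet below compares hypothetical chart-review counts against the >90% threshold using an exact binomial test in Python (SciPy). The counts and variable names are illustrative assumptions, not figures from the PEDS project.

```python
# Check the a priori standard ">90% of completed PEDS scored correctly"
# against hypothetical chart-review counts (illustrative numbers only).
from scipy.stats import binomtest

n_reviewed = 200   # completed PEDS forms pulled in the chart review (assumed)
n_correct = 188    # forms scored correctly per the scoring guidelines (assumed)

print(f"Observed accuracy: {n_correct / n_reviewed:.1%} (standard: >90%)")

# Exact one-sided binomial test: is the true accuracy rate above 0.90?
result = binomtest(n_correct, n_reviewed, p=0.90, alternative="greater")
print(f"One-sided p-value against the 90% standard: {result.pvalue:.3f}")
```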

25 PEDS Project: Standards
- What were providers' reactions to the training?
  Standard: >60% will rate the training positively on a number of dimensions
- What were providers' reactions to using the PEDS?
  Standard: >60% will find the PEDS to be a useful or beneficial addition to the visit

26 PEDS Project: Standards
- What was the impact of adding the PEDS on the overall length of the well-child visit?
  Standard: no significant change in average visit length
- What was the impact on identification of developmental concerns?
  Standard: a statistically significant increase in identification rates
- What was the impact on parent perceptions of the well-child visit?
  Standard: a statistically significant increase in the percentage of parents who report that their concerns were addressed
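
The identification-rate standard calls for a statistical test of two proportions from before/after chart reviews. A sketch using a chi-square test of independence; the visit counts are invented purely for illustration:

```python
# Compare the rate of visits with an identified developmental concern
# before vs. after adding the PEDS (hypothetical counts).
from scipy.stats import chi2_contingency

# Rows: before / after; columns: concern identified / no concern identified
table = [[18, 282],   # before: 18 of 300 visits (6%)
         [41, 259]]   # after:  41 of 300 visits (~14%)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Statistically significant change in identification rates")
```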

27 In posing your questions and setting standards of effectiveness…
You begin to determine:
- The sources of your data
- The variables you will measure

28 Design the Evaluation
- What will be measured?
- From whom will measurements be taken?
- When will measurements be taken?
- Will there be any control or comparison groups?

29 Design Considerations
- How much time needs to pass before program effects can be evident?
- What are your resources (time, money, staff)?
- Can the program be altered based on interim process or outcomes data?
- If you use comparison groups, will they be truly comparable?

30 Common Designs
- Concurrent control group: measured at one or multiple points
- Self-controls: the same participants measured at multiple points
- Historical controls
- Cross-sectional: measured at one point in time
- Cohort: the same people measured at multiple points in time
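
In a self-controls design the same participants are measured before and after the program, so a paired analysis is appropriate. A minimal sketch for pre-post provider knowledge scores, with invented numbers standing in for real survey data:

```python
# Pre-post knowledge scores for the same ten providers (self-controls design).
# A paired t-test respects the within-person pairing of the measurements.
from scipy.stats import ttest_rel

pre  = [62, 70, 55, 68, 74, 60, 66, 71, 58, 65]   # pre-training scores (assumed)
post = [78, 80, 70, 75, 85, 72, 77, 83, 69, 76]   # post-training scores (assumed)

t, p = ttest_rel(post, pre)
print(f"paired t = {t:.2f}, p = {p:.4f}")
```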

31 PEDS Designs
- Chart reviews after adding the PEDS to clinic flow: percentage of visits with completed forms; accuracy of use
- Pre-post knowledge and attitude surveys of providers
- Interviews with providers after adding the PEDS to clinic flow
- Timing visits before and after adding the PEDS
- Chart reviews before and after adding the PEDS to assess the number of visits with an identified concern
- Phone interviews with parents before and after adding the PEDS (separate groups) to assess their ratings of the visit
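
The timed visits before and after adding the PEDS are different visits, not repeated measurements, so an unpaired comparison applies. A sketch using Welch's t-test, which does not assume equal variances between the two groups; the visit lengths are invented for illustration:

```python
# Well-child visit lengths in minutes, timed before and after adding the PEDS.
# Independent samples (different visits), so use an unpaired Welch's t-test.
from scipy.stats import ttest_ind

before = [18.2, 21.5, 19.8, 22.0, 17.5, 20.3, 19.1, 23.4]  # assumed timings
after  = [19.0, 22.1, 20.5, 21.8, 18.9, 21.0, 20.2, 23.0]  # assumed timings

t, p = ttest_ind(after, before, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```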

32 Collect Data
- Surveys
- Interviews or focus groups
- Medical records review
- Achievement tests
- Direct observation
- Clinical scenarios
- Performance tests

33 Contact Information
Noelle Huntington: Noelle.huntington@childrens.harvard.edu
Sonja Ziniel: Sonja.ziniel@childrens.harvard.edu
Rani Gereige: Rani.gereige@mch.com

