Program Evaluation: A Pseudo-Case Study

1 Program Evaluation: A Pseudo-Case Study
Juliana M. Blome, Ph.D., MPH, Office of Program Analysis and Evaluation, National Institute of General Medical Sciences. MORE Program Directors Meeting, Colorado Springs, Colorado, June 12, 2009.

2 We’re looking for…

4 But we often get…

6 An evaluation plan should include…
Program description
Purpose & rationale for evaluation
Evaluation design
Data collection & analyses
Products of evaluation & their use
Project management
Budget estimate

7 Program Description
Include program goals and baseline data. Program goals are the intended effects of a program, and activities should be organized to achieve specific goals.
Types of goals: process, intermediate, long-term.
Long-term goal example:
Weak: To train a diverse biomedical workforce
Strong: To significantly increase the number of URMs graduating with baccalaureate STEM degrees and persisting through to graduate study
Program goals are the intended effects of a program, as noted in authorizing legislation or other documents written when the program was established. In some cases, additional program goals that are not listed in official documents may be included in the evaluation. For a program that is not yet established, the program goals should summarize the anticipated effects of the new program. There are three types of program goals:
Process goals – goals that describe how the program should operate and what levels of output should be expected.
Intermediate goals – goals that describe specific outcomes the program should achieve in the near term.
Long-term goals – goals that describe the ultimate outcomes the program is designed to achieve.

8 Purpose & Rationale of Evaluation
Type of evaluation: needs assessment, feasibility study, process evaluation, or outcome evaluation?
Timing: why is now the right time to conduct an evaluation?
Program maturity: is it reasonable to expect certain levels of output or measurable changes at this stage?

9 Evaluation Design
An evaluation design should include:
Study questions
Target population
Key variables
Conceptual framework, if applicable

10 Study Questions
What are the key questions the evaluation is designed to answer? Key questions link to the stated purpose of the evaluation and to program activities. Include any hypotheses that will be tested.
Examples:
How is the training program being implemented? (process)
What factors have inhibited the achievement of goals? (process)

11 Examples
What has been the impact of the training program on the participants? (outcome)
What are the quality and character of the mentorship being provided in the program? (outcome)
How and to what extent does the program increase student skills and knowledge about laboratory research? (outcome)

12 Institutional Impact Questions
How has the training program affected your institution? Institutions have structures defined by formal rules (laws, regulations, policies) and informal rules (culture, tradition, trust, implied codes of conduct) that shape people’s behavior.
Where might we see institutional change?
Curriculum development
Policies and practices
Services and support offered to students and faculty
Increased faculty awareness of and responsibility for diversity
Impact on students not supported by the training program
Reported effects on institutions – NSF Louis Stokes Alliances for Minority Participation (LSAMP) Program: LSAMP has affected institutions in multiple ways. Interviewees report that LSAMP has enhanced institutional capacity for student talent development and brought about changes in institutional culture as well as in institutional policies and practices. Through LSAMP services and support, institutions assist students in their efforts to continue through the STEM pipeline. All three case study Alliances, along with other Alliances, report increases in minority and nonminority STEM enrollment and STEM degree attainment.
In all three case studies, interviewees observed a change in institutional culture. For example, some COAMP interviewees spoke about greater faculty awareness, understanding, and responsibility for diversity. In the case of FGAMP, some credited the project with increasing dialogue among faculty about effective teaching and learning strategies and with opening research labs to undergraduates. Similarly, some NYC LSAMP interviewees spoke about how more professors now see research as an integral part of the undergraduate experience and how institutions are placing a greater focus on affirming the equal opportunity clause.
In addition, across the three case study sites, significant changes in practices and policies are attributed to LSAMP. For instance, projects such as the NYC LSAMP are heavily pursuing course restructuring; over 18,000 students are reported to have enrolled in LSAMP-restructured courses. Data drawn from the telephone interviews show that over half of the LSAMP projects are engaged in course reform efforts. The case study data reveal the varying nature of LSAMP-inspired changes taking place across partner sites, including a new emphasis on student participation in research grant proposals, the pursuit of research expositions by individual schools, development of a schoolwide research opportunity database, improvements in advisement procedures, creation of a standardized campus scholarship/funding procedure, and enhancement of community outreach and recruitment. Some participants noted that LSAMP serves as a “great recruitment tool” for schools and that the prestige and recognition it brings help participating institutions secure funding to bring other intervention programs to campus.

13 Target Population
What group or groups do you need information about in order to answer the study questions? People or institutions? Size, general characteristics, any subgroups, etc.?
Examples:
Trainees and students
Project managers
Academic coordinators
Faculty
High-ranking administrators

14 Key Variables
What specific information is needed to answer the study questions?
Examples:
Program resources – funding, staffing, infrastructure
Population characteristics – demographics
Program activities – operations, processes, other activities
Program goals, performance measures, and comparison measures, for example (see the sketch below):
Program goal: Provide training opportunities for participants
Performance measure: Minimum number of workshops held per year
Comparison measure: At least 4 workshops held per year (recognized standard of performance)
External factors – factors beyond the control of the program that may influence program success
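A minimal sketch, in Python, of how a program goal, its performance measure, and its comparison standard might be recorded and checked against observed output; the class, field names, and numbers below are illustrative assumptions, not part of any NIGMS program.

    from dataclasses import dataclass

    @dataclass
    class GoalMeasure:
        goal: str                 # intended effect of the program
        performance_measure: str  # what will be counted
        comparison_standard: int  # recognized standard of performance

    def meets_standard(measure: GoalMeasure, observed: int) -> bool:
        # True when the observed count meets or exceeds the comparison standard.
        return observed >= measure.comparison_standard

    workshops = GoalMeasure(
        goal="Provide training opportunities for participants",
        performance_measure="Number of workshops held per year",
        comparison_standard=4,
    )

    print(meets_standard(workshops, observed=5))  # True: 5 workshops meet the standard of 4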

15 Conceptual Framework
Consider developing a conceptual framework (logic model) to illustrate how the program is supposed to achieve its goals.
What is a logic model? A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.
– W.K. Kellogg Foundation Logic Model Development Guide

16 Model of a Training Program
Resources (Inputs) – What is invested? Faculty & staff; money; equipment & technology; research base.
Activities (Outputs) – What is done? Workshops & seminars; training in scientific methods; mentoring by a faculty member.
Impact (Outcomes) – What are the changes or benefits? Short term: knowledge, skills, attitudes. Intermediate: behaviors, practices. Long term: enter a PhD program.
(See the sketch below for the same model written out as a data structure.)
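A minimal sketch, using Python as the notation, of the same three-column logic model as a plain data structure; the stage names and example entries simply restate the diagram above and are not an official template.

    # Logic model for the training program: resources -> activities -> impact.
    logic_model = {
        "resources (inputs)": [            # what is invested
            "faculty & staff", "money", "equipment & technology", "research base",
        ],
        "activities (outputs)": [          # what is done
            "workshops & seminars", "training in scientific methods",
            "mentoring by a faculty member",
        ],
        "impact (outcomes)": {             # what changes or benefits result
            "short term": ["knowledge", "skills", "attitudes"],
            "intermediate": ["behaviors", "practices"],
            "long term": ["enter a PhD program"],
        },
    }

    # Reading the structure left to right mirrors how invested resources are
    # expected to turn into short-, intermediate-, and long-term outcomes.
    for stage, elements in logic_model.items():
        print(stage, "->", elements)

Laying the model out this way makes it easy to ask, for each planned activity, which outcome it is supposed to feed.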

17 Conceptual Framework: Why should we use one?
Increases understanding of the program
Provides a common language & framework
Links activities to results
Helps identify variables to measure
Reflects group process and shared understanding
Strengthens the case for program investment
“The bane of evaluation is a poorly designed program.” – Ricardo Miller, Director, Evaluation Unit, W.K. Kellogg Foundation

18 Data Collection and Analysis
Will you use new data or secondary data?
Will it be quantitative, qualitative, or mixed?
Are there appropriate comparison groups?
How will you collect the data?
Are there ethical or IRB considerations?
What are the limitations of the data?

19 Typical Data Collection Strategies
Bibliometric analysis – Pro: quantitative; useful in aggregate as a tool to assess the quality of medical research. Con: measures only quantity; can be artificially influenced.
Case studies – Pro: provide understanding of the interaction of various influences on the research process. Con: cases are not necessarily representative within or across programs.
Database extractions, document reviews – Pro: useful for analyzing archival data (databases, program records, literature reviews, etc.). Con: records may be incomplete.

20 Typical Data Collection Strategies (Cont.)
Expert panel – Pro: useful in research fields, especially when few quantifiable indicators exist. Con: difficult to obtain a systematic, objective assessment.
Focus groups – Pro: provide understanding of attitudes and thoughts on a subject; the group dynamic can help elicit honest responses. Con: results cannot be statistically generalized to larger populations; not quantifiable.
Interviews – Pro: offer insight from the perspective of specific program roles and expertise. Con: limited perspective; time-intensive.
Surveys – Pro: generate statistically reliable data (ratings of services, behavior, demographics, etc.). Con: require a statistically representative sample and an adequate response rate.

21 Project Management: Who Participates?
Program manager and staff – Contributions: program knowledge. Challenges: vested interest.
Evaluator – Contributions: evaluation expertise, independence. Challenges: limited program knowledge.
Evaluation advisory committee – Contributions: program familiarity, organizational context.
Senior leader/decision-maker – Contributions: resources.
The contributions and challenges listed are not exhaustive – just key highlights. Organizational context includes knowledge of budget constraints, the NIH and/or IC scientific portfolio, and external pressures (Congress, advocacy groups). For example, a program might be doing well, but if it is duplicative with another program, it may need to be modified or terminated. Organizational context is critical to using evaluation results effectively. For the evaluator, consider internal vs. external.

22 Budget Estimate
“Rule of thumb” – plan about 10% of the project’s total budget for evaluation (a quick illustration follows below).
Common pitfalls:
Failure to consider evaluation in up-front planning
Lack of resources – for analysis & interpretation
Lack of time – be realistic & consider the time needed for each step
Qualitative evaluation – more costly to implement
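A back-of-the-envelope illustration of the rule of thumb, written as a tiny Python snippet; the dollar figure is invented for the example and is not a recommended budget.

    # Hypothetical total project budget; the 10% rule of thumb sizes the evaluation.
    total_project_budget = 500_000
    evaluation_budget = 0.10 * total_project_budget
    print(f"Plan roughly ${evaluation_budget:,.0f} for evaluation")  # ~ $50,000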

23 Products of Evaluation
What reports or products are planned?
Executive summary & final report
Briefings – for students, faculty, & administrators
How will results be used?

24 The Evaluation Design Matrix: A Tool for Discussion
Key question(s) – What do you want to know? Clear and specific; measurable; doable; key terms defined; scope; timeframe; population.
Information required – What do you need to answer the question? Program goals; evidence; program criteria; participant rates; cost information; funding levels.
Information source(s) – Where are you going to get it? Program officials or participants; external stakeholders; documents; databases; journals.
Data collection methods – How are you going to get it? Structured interviews; focus groups; structured surveys; case studies; data extractions; document retrieval.
Data analysis methods – What will you do with it once you get it? Descriptive statistics; inferential statistics (t-test, regression); cost/benefit analysis; qualitative analysis.
Limitations – What can’t you do (caveats)? Data quality or reliability; access to records; staffing/funding constraints.
Conclusions – What can you say? Generalizations; unexpected findings; anecdotal information; precise statements about the sample; impact of program changes.
A single filled-in row might look like the sketch below.
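A minimal sketch, using Python as the notation, of one hypothetical matrix row; the question is taken from the study-question examples earlier, while the specific entries are illustrative assumptions rather than a worked evaluation plan.

    # One row of the evaluation design matrix, keyed by the matrix columns.
    matrix_row = {
        "key_question": "How is the training program being implemented?",
        "information_required": ["program goals", "participant rates", "funding levels"],
        "information_sources": ["program officials", "participants", "program records"],
        "data_collection_methods": ["structured interviews", "document retrieval"],
        "data_analysis_methods": ["descriptive statistics", "qualitative analysis"],
        "limitations": ["access to records", "data quality or reliability"],
        "conclusions": "precise statements about the interviewed sample only",
    }

    # A quick completeness check: every key question should have a planned
    # source, collection method, and analysis method before data collection starts.
    for column, entry in matrix_row.items():
        print(f"{column}: {entry}")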

25 Why do we ask for program evaluation? …Because we’re accountable
What gets measured gets done.
If you don’t measure success, you can’t reward it.
If you can’t reward success, you’re probably rewarding failure.
If you can’t see success, you can’t learn from it.
If you can’t recognize failure, you can’t correct it.
If you can demonstrate results, you can win public support.
– Osborne & Gaebler (1992), Reinventing Government, as summarized by Ellen Taylor-Powell, University of Wisconsin-Extension

