
1 Evaluation Theory and Practice Framework: UCSF EPI246 Translating Evidence Into Practice: Individual-Centered Implementation Strategies. March 10, 2011. Judith M. Ottoson, Ed.D., M.P.H.

2 Evaluation Practice. Evaluation questions: What is it? Of what use is this information? Is it good or bad? How do you know?

3 Public Health Core Functions and Essential Services

4 Evaluation is... "the systematic assessment of the operation and/or outcomes of a program or policy, compared to a set of explicit or implicit standards, as a means of contributing to the improvement of the program or policy." (Weiss, 1998, p. 4)

5 Sources of Evidence: Evaluation & Research (Source: Weiss, 1998)

6 Framework for Program Evaluation in Public Health (MMWR, 1999)
Steps: engage stakeholders; describe the program; focus the evaluation design; gather credible evidence; justify conclusions; ensure use and share lessons learned.
Standards: utility, feasibility, propriety, accuracy.

7 The Linchpin. "The fundamental purpose of evaluation theory is to specify feasible practices that evaluators can use to construct knowledge of the value of social programs that can be used to ameliorate the social problem to which programs are relevant." (Shadish, Cook, and Leviton, 1991, p. 36)

8 Theories of Program Evaluation (adapted from Shadish, Cook, & Leviton, Foundations of Program Evaluation, 1991)
Practice: engage the stakeholders; describe the program; ask questions; make values transparent; focus the design; gather credible evidence; justify conclusions; ensure use & lessons; manage the evaluation.
Program: need/problem; structure; context; change process.
Valuing: criteria of success; standards of success; who decides?
Knowledge: "real" knowledge; design (who? when?); methods; analysis.
Use: kinds of use; by when & who; reporting; dissemination.
Standards cutting across the framework: feasibility; utility & accountability; accuracy; propriety.

9 Theories of Program Evaluation (the framework slide from slide 8, repeated as a section divider)

10 Program as Evaluand
What is the problem or need? What is the "program"? (policy, program, project, component, element; before, during, after) What are its internal process and structure? Its external context? Its change process, i.e., the levers of change?

11 Types of Program Failure (Source: Weiss, 1972)
Successful program: the program set the causal process in motion, which led to the desired effect.
Program theory failure: the program set the causal process in motion, which did not lead to the desired effect.
Implementation failure: the program did not set the causal process in motion, which would have led to the desired effect.
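Weiss's distinction reduces to two yes/no checks, which a short sketch can make explicit. This is an illustration, not anything from the deck; the function name and boolean flags are hypothetical:

```python
def classify_program_result(process_set_in_motion: bool,
                            desired_effect_achieved: bool) -> str:
    """Classify an evaluation result using Weiss's (1972) failure types."""
    if process_set_in_motion and desired_effect_achieved:
        return "successful program"
    if process_set_in_motion:
        # The program ran as intended, but the causal theory behind it
        # did not produce the desired effect.
        return "program theory failure"
    # The causal process was never set in motion, so the program's
    # theory was never actually tested.
    return "implementation failure"
```

The practical payoff of the distinction: theory failure suggests redesigning the program's causal logic, while implementation failure suggests fixing delivery before judging the underlying theory.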

12 Basic Logic Model (adapted from W.K. Kellogg Foundation, Logic Model Development Guide: http://www.wkkf.org/Pubs/Tools/Evaluation/Pub3669.pdf)
Input: resources and/or barriers that enable or limit program effectiveness, e.g., funds, people, supplies, equipment.
Activities/process: activities, techniques, tools, events, actions, technology, e.g., products, services, infrastructure.
Output: size and scope of services; dose, e.g., # materials, participation rate, # hours.
Outcome (short-term): changes in attitudes, behavior, knowledge, skills, status, level of functioning.
Outcome (long-term): changes in organization, community, and/or system, e.g., improved condition, capacity, policy changes.
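The five-column layout is essentially a small data structure; a minimal sketch, assuming Python and illustrative field names (not from the Kellogg guide):

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Basic five-column logic model: inputs through long-term outcomes."""
    inputs: list[str] = field(default_factory=list)       # resources and barriers
    activities: list[str] = field(default_factory=list)   # what the program does
    outputs: list[str] = field(default_factory=list)      # size/scope/dose of services
    short_term_outcomes: list[str] = field(default_factory=list)  # changes in people
    long_term_outcomes: list[str] = field(default_factory=list)   # system/community change
```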

13 Development Strategies for Logic Models
"But how?" questions (reverse logic): start with distal effects; ask how to generate each effect; work backward through the proximal effects and activities until the chain connects to needs.
"But why?" questions (forward logic): start with needs; ask what happens next to create distal effects; work forward from activities and proximal effects toward distal effects.
"But how?" & "but why?" questions (middle-road logic): start with activities or outputs; ask what connects them back to needs and forward to distal effects.
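The reverse-logic strategy is an iterative loop: keep asking "but how?" until the chain reaches an identified need. A hypothetical sketch of that loop, where the ask_but_how callback stands in for a stakeholder conversation:

```python
from typing import Callable, Optional

def build_reverse_logic(distal_effect: str,
                        ask_but_how: Callable[[str], Optional[str]]) -> list[str]:
    """Build a logic chain backward from a distal effect.

    ask_but_how(step) returns the upstream step that would generate
    `step`, or None once the chain reaches an identified need.
    """
    chain = [distal_effect]
    step = ask_but_how(distal_effect)
    while step is not None:
        chain.insert(0, step)        # each answer sits one step upstream
        step = ask_but_how(step)
    return chain                     # ordered need -> ... -> distal effect

# Scripted answers standing in for a stakeholder session (illustrative).
answers = {"early return to work": "patient self-management skills",
           "patient self-management skills": "pre-surgery teaching package",
           "pre-surgery teaching package": None}
print(build_reverse_logic("early return to work", answers.get))
# ['pre-surgery teaching package', 'patient self-management skills', 'early return to work']
```

Forward logic is the same loop run the other way, repeatedly asking "but why?" (so what happens next?) from a need toward distal effects.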

14 Example: Rapid Access Disc Herniation (RADH) Program
Needs. Patient: a common medical problem with great impact, reducing quality of life (QOL) and causing loss of work. Provincial Health Authority (PHA): save $$, reduce length of stay (LOS), and shorten the waiting list.
Inputs/resources: OR suited for minimally invasive surgery (MIS); Day Surgery Unit; help line; surgeons (knowledge, training, expertise with MIS); administration (support/attitude, resources ($$, equipment), facility); referral from GPs/specialists.
Activities. Teaching: resources (CD, pamphlet, helpline, contact number); small group discussion; paramedical interview and teaching (nurses, PT). Practice: empowerment; opportunity with involvement; planning care and activities after surgery.
Outputs: # packages distributed; # procedures performed; # interviews with provider; # patients participated; # surveys; # follow-ups; # discharges.
Outcomes. Patient: ↑ QOL (pain reduction, improved activities); early return to work (RTW). PHA: ↓ LOS; ↓ waiting list; ↓ $$.
Impact: reallocation of hospital resources; Center of Excellence; raising the standard of disc treatment to the national level.
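Using the LogicModel sketch from slide 12 above, the RADH example can be captured in abbreviated form; this instantiation is illustrative and omits several cells of the original diagram:

```python
# Assumes the LogicModel dataclass sketched under slide 12.
radh = LogicModel(
    inputs=["OR suited for MIS", "Day Surgery Unit", "help line",
            "surgeon expertise with MIS", "administrative support ($$, equipment)"],
    activities=["teaching package (CD, pamphlet, helpline, contact number)",
                "small group discussion", "nurse/PT interview and teaching"],
    outputs=["# packages distributed", "# procedures performed",
             "# interviews with provider", "# patients participated"],
    short_term_outcomes=["increased QOL (pain reduction, improved activities)",
                         "early return to work"],
    long_term_outcomes=["reduced LOS", "shorter waiting list", "cost savings",
                        "raised national standard of disc treatment"],
)
```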

15 Theories of Program Evaluation (the framework slide from slide 8, repeated as a section divider)

16 Focusing the Evaluation by Asking Questions
Ask guiding (key) questions about the evaluand, based on the logic model and pitched at the program level, not the individual level. Questions are something multiple stakeholders can engage with, unlike developing "measures," writing objectives, or designing the study. Questions then become the guide to measures, analysis, and reporting; one question may cover multiple measures. Group evaluation results by the forest (questions), not the trees (measures), as sketched below.
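One way to keep the forest in front of the trees is to key every measure to the question it answers and report in that grouping; a hypothetical sketch (the questions and measures are illustrative):

```python
# Map each guiding question to the measures that inform it.
evaluation_plan = {
    "Did the program reach its intended participants?": [
        "# enrollees vs. target", "participant demographics vs. catchment area"],
    "Did participants' knowledge and behavior change?": [
        "pre/post knowledge scores", "self-reported behavior at follow-up"],
}

def report_by_question(plan: dict[str, list[str]], results: dict[str, str]) -> None:
    """Print results grouped under the question each measure informs."""
    for question, measures in plan.items():
        print(question)
        for measure in measures:
            print(f"  {measure}: {results.get(measure, 'not yet collected')}")
```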

17 Determining the Value of a Program or Policy
What is valued? Who decides? Are values prescribed or described? Making values transparent. Valuing logic: dimensions (criteria) of worth/merit; standards of worth/merit; performance.

18 The Logic of Valuing
Determine criteria of "success": the dimensions of the evaluand on which stakeholders have questions, identify as key, core, or essential, and are willing to hang evaluand success. Dimensions include input, process, output, and outcomes (process examples: diversity, enrollees, materials, activities; outcome examples: knowledge, behavior, jobs, scores, health status).
Set standards of "success": how good performance on each criterion must be, e.g., a count, percentage, increase or decrease, or spread.
Measure performance. The criteria are the bar; the standards say how high it is set. (A sketch of this comparison follows.)
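Made concrete, the valuing logic is a comparison of measured performance against an agreed bar on each criterion. A minimal sketch with illustrative criteria and standards; note it assumes "higher is better" and would need a direction flag for criteria where a decrease counts as success:

```python
def judge_success(standards: dict[str, float],
                  performance: dict[str, float]) -> dict[str, bool]:
    """Compare measured performance to the standard set for each criterion.

    standards:   criterion -> minimum acceptable level (how high the bar is)
    performance: criterion -> measured level
    """
    return {criterion: performance.get(criterion, 0.0) >= bar
            for criterion, bar in standards.items()}

# Illustrative: stakeholders set the bars; the evaluation measures against them.
standards = {"participation rate (%)": 75.0, "mean knowledge gain (points)": 10.0}
performance = {"participation rate (%)": 82.0, "mean knowledge gain (points)": 7.5}
print(judge_success(standards, performance))
# {'participation rate (%)': True, 'mean knowledge gain (points)': False}
```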

19 Theories of Program Evaluation (the framework slide from slide 8, repeated as a section divider)

20 Knowledge Construction
Is evaluation knowledge special? What counts as "real" evaluation knowledge to you? To others? What are feasible, ethical, useful, and accurate ways to construct knowledge?

21 Evaluation Design Basics Who? (sample) When? (timing) What? (answer the questions) How? (data collection)
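The four basics translate directly into a design specification that stakeholders can review; a hypothetical sketch (the class and field names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class EvaluationDesign:
    """The four design basics: who, when, what, how."""
    sample: str            # who, e.g., "random sample of 200 enrollees"
    timing: str            # when, e.g., "baseline, 6 and 12 months"
    questions: list[str]   # what: the guiding questions to answer
    data_collection: str   # how, e.g., "survey plus chart review"
```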

22 Theories of Program Evaluation (the framework slide from slide 8, repeated as a section divider)

23 Evaluation Use
Kinds of use: instrumental, conceptual. Who uses, and when? Facilitators of and obstacles to use.

24 The Program Evaluation Standards: Key Features
The standards identify and define evaluation quality, and guide evaluators and evaluation users in its pursuit. They are voluntary, consensus-based guidelines rather than "laws." The 2011 revision adds clarifications and a fifth standard group, evaluation accountability. There are trade-offs among the standards.

25 Standards of Program Evaluation (number of standards in each group in parentheses)
Utility: the utility standards support high-quality evaluation use through attention to all aspects of an evaluation. (8)
Feasibility: the feasibility standards encourage evaluations to be effective and efficient. (4)
Propriety: the propriety standards are intended to ensure that an evaluation is proper, fair, legal, right, acceptable, and just. (7)
Accuracy: accuracy is the truthfulness of evaluation representations, propositions, and findings, achieved through sound theory, methods, designs, and reasoning. (8)
Evaluation accountability: documenting and improving evaluation accountability requires efforts similar to those required for program accountability, i.e., an evaluation of the evaluation (metaevaluation). (3)

26 Evaluation Standards for Describing the Program (The Program Evaluation Standards, 2011)
U2: attention to stakeholders. F2: practical procedures. F3: contextual viability. P1: responsive and inclusive orientation. P5: transparency and disclosure. P6: conflicts of interest. A2: valid information. A3: reliable information. A4: explicit program and context descriptions. A7: explicit evaluation reasoning. A8: communication and reporting. E1: evaluation documentation.

27 The Evaluation Debates

28 Evaluation Practice Debates
Whether to evaluate? Who wants the evaluation, and why? Do evaluators need to be content experts, evaluation experts, or both? What are the advantages and disadvantages of being an internal vs. external evaluator? Who counts as a stakeholder for you? Do you hold biases toward any stakeholders? How should evaluation design, data collection, analysis, and reporting be handled? Are evaluation and research the same?

29 Program Debates
What counts as the evaluand, e.g., process, structure, context, people, planning? How do programs relate to policy (landscape) and components (portrait)? What assumptions guide the program? How is it supposed to work? What is the relationship between the type of evaluation and the stage of program development/stability? What factors influence incremental vs. radical program change? What are the levers of change?

30 Valuing Debates
What kinds of values determine program value, e.g., instrumental, terminal? Where are the values embedded, e.g., needs, goals, problems? How might the use of prescriptive or descriptive values influence the evaluation? Who decides on evaluation criteria and standards, e.g., evaluator? Stakeholders? Should evaluators synthesize or describe findings?

31 Knowledge Construction Debates
What counts as "real" knowledge to you? To other stakeholders? What is acceptable evidence of program success? To whom? Is the quantitative/qualitative debate over for everybody? Is it possible to combine these different understandings of knowledge construction?

32 Evaluation Use Debates What counts as use of an evaluation: conceptual vs instrumental use? How much? By when? With what fidelity should evaluation findings be used? Who uses evaluation results? What are the evaluator’s responsibilities towards use?

33 Key Evaluation Resources
American Evaluation Association: www.eval.org (November 2-5, 2011, Anaheim).
San Francisco Bay Area Evaluators: www.sfbae.org.
The Evaluators Institute: http://tei.gwu.edu (April 4-16, 2011).

34 References
Centers for Disease Control and Prevention. (1999). Framework for program evaluation in public health. MMWR, 48(RR-11). http://www.cdc.gov/mmwr/preview/mmwrhtml/rr4811a1.htm
Ottoson, J.M., & Martinez, D. (2010). An ecological understanding of evaluation use: A case study of the Active for Life evaluation. The Robert Wood Johnson Foundation. www.rwjf.org/pr/product.jsp?id=71148
Ottoson, J.M., & Hawe, P. (Eds.). (Winter 2009). Knowledge utilization, diffusion, implementation, transfer, and translation: Implications for evaluation. New Directions for Evaluation, 124. San Francisco: Jossey-Bass.
Shadish, W.R., Cook, T.D., & Leviton, L.C. (1991). Good theory for social program evaluation. In Foundations of program evaluation. Newbury Park: Sage.
The Joint Committee on Standards for Educational Evaluation. (2011). The program evaluation standards (3rd ed.). Thousand Oaks: Sage.
Weiss, C.H. (1998). Evaluation (2nd ed.). New Jersey: Prentice-Hall.
W.K. Kellogg Foundation. (2004). Logic Model Development Guide. http://www.wkkf.org/knowledge-center/resources/2010/Logic-Model-Development-Guide.aspx

