1 Tools for Creating a Culture of Assessment
The CIPP Model and Utilization-Focused Evaluation
Yvonne Belanger, Duke University
Library Assessment Conference, September 25-27, 2006, Charlottesville, VA
2 Key Questions for Libraries
How can we build a culture of evaluation, so that many people contribute to evaluation?
How can we provide a context for evaluation strategies and results?
How can we conduct evaluation that helps with decision making?
Overview: culture, context, and conducting evaluation. The evaluation resources available differ across institutions, but the drive toward better decision making applies everywhere.
Today's presentation offers a framework and key issues to consider in evaluation planning, along with specific tools, templates, and strategies, focusing on the CIPP model and utilization-focused evaluation.
3 Culture of Assessment
"…organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders." (Lakos et al.)
4 Barriers to a Culture of Assessment
Lack of evaluative thinking (at all levels): intuition-based rather than data-driven decision making, and a lack of systems thinking (how does what I am actually doing connect back to my goals and forward to my intended outcomes?)
Lack of engagement in evaluation
Pseudoevaluations (Stufflebeam, 1999): studies that promote a positive or negative view of a program irrespective of its actual merit and worth, e.g. PR-inspired studies (withholding all negatives) and politically controlled studies
Other factors discussed by Jim Self, Steve Hiller, and others at this conference: the need for clear leadership and for a specific person or group tasked with assessment
5 Building Evaluative Thinking: The CIPP Model
Stufflebeam's CIPP Model: Context, Input, Process, and Product evaluation
Focus: decision making
Purpose: to facilitate rational and continuing decision making, particularly for programs and services with long-term goals
A comprehensive framework for guiding formative and summative evaluations
Based on the presumption that evaluation's most important purpose is not to prove but to improve programs
6 Details of the CIPP Model
Context: environment & needs
Input: strategies & resources
Process: monitoring implementation
Product: outcomes, in both their quality and their significance
The CIPP model was developed by D. Stufflebeam (see the annotated bibliography for references). It has evolved over 30 years while staying current with ideas from newer approaches, e.g. Patton's utilization-focused evaluation and Guba & Lincoln's stakeholder-focused evaluation. Because CIPP takes a systems approach, using logic modeling to get a systems view of projects and programs can be a useful first step.
7 The CIPP Approach Recognizes That "All Politics Are Local"
CIPP offers a tailored evaluation approach designed to answer locally interesting and useful questions; the emphasis is on credibility and usefulness rather than generalizability to other places, times, and audiences.
Tips taken from Stufflebeam's recent writings on using the CIPP approach (OPEN, 2003):
Use multiple observers and informants
Use multiple procedures for gathering data, and cross-check qualitative and quantitative data (often referred to as "triangulating")
Mine existing information
Seek independent review by stakeholders and outside groups
Gather feedback from stakeholders
8 CIPP View of Institutionalized Evaluation
CIPP provides a systematic way of thinking about how evaluation can contribute to short-term and long-term organizational planning.
CIPP for decision makers:
Context: define goals and priorities
Input: assess competing proposals in terms of feasibility and alignment with goals
Process: provide context for interpreting outcomes; plan for service improvement
Product: keep the organization focused on achieving important outcomes; gauge the success of efforts
This connects manager and decision-maker thinking with an evaluation structure that all staff can contribute to and see themselves as part of. It provides a framework for integrating evaluation as an activity central to achieving broader organizational goals, and it illustrates the model's focus on using evaluation information to shape goals, plans, and actions. Stufflebeam sees input evaluation as potentially the most neglected type (Stufflebeam, OPEN, 2003).
9 Advantages of the CIPP Model
Adapts well to carrying out evaluations on any scale (projects, programs, organizations)
An organizing framework, not a lockstep linear process
Sensitive to the needs of decision makers
A systems approach that encourages a systems view of projects and programs
10 Building Evaluative Thinking and Engagement: The Utilization-Focused Evaluation Approach
Taking a utilization-focused approach means asking:
Why is this evaluation being undertaken?
What decisions need to be made with the results?
Who will be most affected by those decisions?
How can we engage those people in the entire evaluation process?
All participants in an evaluation should be clear about why the evaluation is being conducted; whether the results (or all of the results) will be shared publicly, internally only, or only with key decision makers; and how the outcomes of the evaluation might affect them. Failing to do this will jeopardize your efforts to build a culture of assessment by destroying good will, contributing to a negative view of assessment, and increasing the paranoia of any staff who already feel threatened by these efforts.
11 Utilization-Focused Evaluation
Premise: by engaging stakeholders in the entire evaluation process, from design through implementation of recommendations:
The evaluation addresses the questions of greatest importance to those in a position to make direct use of its findings
Cultural barriers that can inhibit use of results are reduced, because transparency increases and stakeholders are empowered
12 Another Advantage of the Utilization-Focused Approach: "Process Use" Benefits
First described by Patton as the "ways in which being engaged in the processes of evaluation can be useful quite apart from the findings that may emerge from these processes"
Four types of process use:
1. Enhancing shared understandings, especially about results
2. Supporting and reinforcing the object of the evaluation through intervention-oriented evaluation
3. Increasing participants' engagement and sense of ownership
4. Organizational development
(Patton, 1997, Utilization-Focused Evaluation: The New Century Text, 3rd ed.)
13 Process Use & a Culture of Assessment
Process use builds the organization's capacity to make use of evaluation findings. Through their experiences interpreting evidence, drawing conclusions, and making judgments, staff become better evaluation users who can effectively "weigh evidence, consider contradictions and inconsistencies, articulate values, and examine assumptions."
(Patton, 2004, "On Evaluation Use: Evaluative Thinking and Process Use")
14 Example
Evaluation of the Duke iPod experiment & the Duke Digital Initiative…
15 Summary
Foster a culture of assessment by:
Adopting frameworks that support decision making
Engaging staff as stakeholders in the entire process of evaluation, from design through implementation of recommendations
Leveraging the opportunity of process use to develop staff into savvier evaluation consumers
16 Final Thoughts…
"…evaluation's most important purpose is not to prove, but to improve." — Daniel Stufflebeam (CIPP Model)
"Research is aimed at truth. Evaluation is aimed at action." — Michael Quinn Patton (Utilization-Focused Evaluation)
Michael Patton is a former president of AEA and a leader in evaluation. Research efforts often focus on a particular variable and are often narrowly framed to answer a single question.
17 Thank You!
Yvonne Belanger
Head, Program Evaluation
Academic Technology & Instructional Services, Perkins Library, Duke University
18 References
Stufflebeam, D. (1999). Foundational models for 21st century program evaluation.
Stufflebeam, D. (2003). The CIPP model for evaluation: An update, a review of the model's development, a checklist to guide implementation. Paper read at the Oregon Program Evaluators Network Conference, Portland, OR.
Patton, M. Q. (2004). "On evaluation use: Evaluative thinking and process use." The Evaluation Exchange IX(4).
Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, CA: Sage.