
1 Assessing for Program Improvement. Presented at the University of Arizona, February 11, 2009. Victor M. H. Borden, Associate Professor of Psychology (IUPUI); Associate Vice President for University Planning, Institutional Research, and Accountability (IU).

2 Overview What I think you think I might talk about What I think you need to think about

3 How to Assess Programs for Improvement. A range of methods from simple to complex: the core idea, simple models, quality improvement models, program review, and more complex models.

4 The Core Idea: The Planning-Evaluation-Improvement Cycle. Plan, Implement, Assess, Improve (cf. Plan, Do, Check, Act).

5 Toward a Spiral of Improvement (adapted from Norman Jackson): 1. Think about program issues. 2. Engage with the problem ("How do we improve the program?"). 3. Develop resources/strategies to improve. 4. Implement changes (experiment). 5. Evaluate impact on outcomes (did it work as intended? how did people respond? what were the results?). 6. Plan to improve, then either back to the drawing board or on to something else.

6 Core Evaluation Cycle Questions: What are you trying to achieve? (Needs assessment) What are you doing to achieve it? (Process assessment) How will you know when you get there? (Outcomes assessment) What can you do with the results? (Improvement)

7 Why the Fixation on Outcomes? We haven't paid sufficient systematic attention to this in the past. We look at inputs (resources) and processes (curricula and programs) fairly systematically, but we tend to look at outcomes one student at a time. There is also the link to accountability.

8 Simple Models of Assessment. Advantages: easy to communicate, use, and learn from; can be built into everyday work; help build and maintain a culture of evidence. Models: the evaluation cycle (or spiral); the assessment matrix template.

9 The Assessment Matrix

10 The Limits of Simple Models: Often overly simplistic relative to problems. Actual measures can be misguided. Implementation can be inconsistent across units. Not always easy to link outcome measures to responsible processes (doing the right thing vs. doing it right).

11 Quality Improvement Models. Advantages: focus on process provides the best chance of identifying points of improvement; collaborative teams empower staff and help improve communication across units; a formulaic method and external staff support help guide the work and keep it on track. Sample methods: Penn State's Fast Track; University of Wisconsin's Accelerated Improvement.

12 PSU Fast Track

13 Team # 526 -- Food Sciences Measures Task Force, College of Agricultural Sciences, December 2002. Objective: Develop and implement a centralized system for collection and reporting of key performance indicators and departmental reports. Action Plan: 1. Evaluate current processes and data sources for gathering data for performance indicators and department reports. 2. With information from #1, work with the Dept. Head to define "key performance indicators" from the Department's Strategic Plan. 3. Working with the Dept. Head, define "departmental reports and other measures". 4. Develop feasible solutions for a collection process for the data/information defined in #2 and #3. 5. Present solutions to the Sponsor with cost/benefit analyses. 6. Sponsor to disseminate solution and plan to faculty. 7. Assist in implementation of solutions, including initial collection of information (test cycle for new process), revising the process, flowcharting and writing procedures, and training stakeholders. Results Achieved to Date: The team was disbanded at the end of 2003 after achieving action plan steps 1-7. Most of the expected outcomes were achieved for a fundamental centralized data collection system. Given the current resource and budget constraints, the sponsor decided not to pursue further automation of the centralized data system at the time.

14 http://www.wisc.edu/improve/improvement/accel.html

15 UWisc Accelerated Improvement. Define: goals and measures of success; document process; understand customer needs; check/refine goals. Design: develop potential solutions; analyze solutions/options; finalize solution; develop implementation plan. Implement: inform affected people; conduct training, if needed; execute action plans with timeline. Follow-up: collect data to track improvement; review and refine process changes; issue final report with results.

16 Limits of QI Models: Academicians are wary of business models. Focus on process emphasizes doing it right over doing the right thing. Can be episodic rather than continuous.

17 Program Review: Program self-study and site visit by peers. A common method for academic programs, with increasing use for administrative programs. Fits well with the accreditation framework. Guidelines (content standards, review team composition) shape tone and tenor. Flexibility accommodates a range of inquiry orientations.

18 Limits of Program Review: Expensive and time-consuming. Can be done with little participation, or with a lot. Results are not always directly useful for change; a memorandum of understanding is helpful. Its episodic nature is not responsive to a changing environment.

19 More Complex Models. Advantages: handle true complexity; provide in-depth insight into context; academicians respect the scholarship (although not necessarily the particular approach). Examples (from WMU's Evaluation Center): CIPP; Constructivist Evaluation.

20 More Complex Models (The Evaluation Center, Western Michigan University, http://www.wmich.edu/evalctr/): CIPP Model; Constructivist Evaluation; Deliberative Democratic Evaluation; Key Evaluation Checklist; Qualitative Evaluation; Utilization-Focused Evaluation.

21 The CIPP Model: 1. Contractual Agreements; 2. Context Evaluation; 3. Input Evaluation; 4. Process Evaluation; 5. Impact Evaluation; 6. Effectiveness Evaluation; 7. Transportability Evaluation; 8. Sustainability Evaluation; 9. Metaevaluation; 10. The Final Synthesis Report.

22

23 Constructivist Evaluation (Guba & Lincoln, 2001): a two-stage process. Discovery: an effort to describe "what's going on here," the "here" being the evaluand and its context. Assimilation: an effort to incorporate new discoveries into the existing construction or constructions "…so that the new construction will fit, work, demonstrate relevance, and exhibit modifiability."

24 Limits of Complex Models: Too complex to be practical. Expensive. They require an "…evaluation unit as a staff operation at a high level of the organization in order to help insulate the unit from inappropriate internal influences and enhance its influence on decision making."

25 Take Home Points: There are many approaches to assessing for improvement; virtually any method of inquiry can be accommodated. The point of all of them is to determine how well you are doing things and how they might be done better, and then to try doing better and see whether that improves the outcomes. Each can be done well or poorly.

26 Doing Assessment Well: Being data- or evidence-driven is not, in and of itself, a good thing, e.g., selective use of evidence to support a foregone conclusion ("torture numbers long enough and they'll confess to anything"). Effective use of data requires sharing diverse and often divergent perspectives: it's not what the data say, it's what you say about the data. Some disagreement and dissent is important to learning and innovation.

27 Further Heresy: Building effective programs requires some level of irrationality and disorder. To learn from what we do requires that we unlearn some things that we often don't want to unlearn. As if doing this by ourselves were not difficult enough, we must do this together.

28 From Data- to Learning-Driven. Data-driven implies rational, systematic testing of ideas through inspection of facts; a sequential, often individual decision-making process. Learning-driven implies going beyond what we already know and can do to gain new competencies; deconstruction and reconstruction of ideas and beliefs; becoming irrational to become re-rational.

29 Single- and Double-Loop Learning: Learning is the detection and correction of error (unintended consequences). Governing variables are those things that we feel are important to keep within acceptable limits. Action strategy is what we do or plan to do to keep the governing variables within limits. Consequences are the intended and unintended outputs and outcomes: intended consequences confirm our theory-in-use; unintended consequences suggest error in our theory-in-use.

30 Single-Loop Learning: Governing variables are not called into question; at best, adjustments are made to action strategies. Defense mechanisms can readily arise to maintain single-loop learning. (Diagram: Governing Variables, Action Strategies, Consequences.)

31 Double-Loop Learning: Questioning the role of the framing and learning systems which underlie actual goals and strategies. Reflection is fundamental: basic assumptions are confronted, hypotheses are publicly tested, falsification is sought, and ego is laid aside. (Diagram: Governing Variables, Action Strategies, Consequences.)

32 Model I and II Org Learning: single- and double-loop learning at the organizational level. Model I: organizational members subscribe to a common theory-in-use; organizational policies and practices inhibit change. Model II: governing values, policies, and practices promote double-loop learning.

33 A Model I Learning Organization. Governing variables: toe the line; win at all costs; suppress negative feelings; emphasize rationality. Action strategies: control the environment and task unilaterally; protect self and others unilaterally; discourage inquiry. Consequences: defensive relationships; low freedom of choice; reduced production of valid information; little public testing of ideas.

34 A Model II Learning Organization. Governing variables: valid information is most important; free and informed choice; shared internal commitment. Action strategies: shared control; participation in design and implementation of action. Consequences: minimally defensive relationships; high freedom of choice; public testing of ideas.

35 Participatory Action Research/Inquiry: A systematic inquiry process that can use any of the aforementioned methods. Stakeholder empowerment through active and ongoing participation. Dialogue throughout the process promotes collaboration. Active learning and discovery are fostered by a critical reflection process. Action plans create shared responsibility for doing something with the results. Follow-up to action (checking results) maintains relationships and commitments.

36 Participatory Action Research/Inquiry. Quotes from the Handbook of Action Research by Peter Reason, http://www.bath.ac.uk/~mnspwr/Papers/HandbookIntroduction.htm: "The aim of participatory action research is to change practices, social structures, and social media which maintain irrationality, injustice, and unsatisfying forms of existence." (Robin McTaggart) "Participatory research is a process through which members of an oppressed group or community identify a problem, collect and analyse information, and act upon the problem in order to find solutions and to promote social and political transformation." (Daniel Selener) "We must keep on trying to understand better, change and reenchant our plural world." (Orlando Fals Borda)

37 Participatory Action Research. Who: does what? decides what actions are taken? is responsible for effective implementation? can devise appropriate evaluation protocols? has access to or can collect appropriate evidence? reviews the results and decides what to do? And what can be done to get these people to work together and in concert?

38 Example: Evaluation of New Student Orientation. Research question and evaluation focus: reassessment of goals; incoming students' needs; impacts on knowledge, attitudes, and behaviors. Data collection: focus groups and questionnaires; sought perspectives of all major stakeholders. Data reporting and feedback: meetings with orientation leaders and faculty stakeholders. Development of action plans: facilitation of dialogue and data-driven proposals. Action: implementation of proposed changes. Assessment: ongoing formative evaluation; re-administration of process and outcome instruments.

39 Example: Indiana Project on Academic Success (IPAS). Research-based inquiry for enhancing academic success. Four-stage method: Assessment, Organizing, Action Inquiry, Evaluation. Supported by use of state and institutional student tracking records.

40 Stage 1: Assessment. Compare campus assessment information to statewide assessment results; identify possible challenges. Collect additional information from campus sources, such as prior reports and studies and focus group interviews. Organize teams of administrators, faculty, professional staff, and students to identify critical challenges on the campus. Prioritize the challenges, identifying two or three that merit special attention at a campus level.

41 Stage 2: Organizing Coordinate the assessment and inquiry process with campus-level planning and budgeting; integrate the challenges with strategic plans; coordinate budgeting to provide necessary support. Appoint workgroups to address critical, campus-wide challenges; consider providing release time to team leaders to work on tasks for the campus. Coordinate the inquiry process (activities of the workgroups) with campus planning and budgeting.

42 Stage 3: Action Inquiry. Build an understanding of the challenge: What solutions have been tried in the past, and how well did they work? What aspects of the challenge have not been adequately addressed? What aspects of the challenge require more study? Develop hypotheses about the causes of the challenges, using data to test the hypotheses. Do the explanations hold up to the evidence? Look internally and externally for solutions: Talk with people on campus about how they have addressed related challenges. Consider best practices for retention and how they might be adapted to meet local needs. Visit other campuses that have tried out different approaches to the problem. How well would these alternatives address the challenge at your campus? Assess possible solutions: Consider alternatives in relation to the understanding of the problem developed in Stage 3, step 1. Will the solutions address the challenge at your campus? How can the solution be pilot tested? If you tried out the solution, how would you know if it worked? What information would you need to know how well it worked?

43 Stage 3: Action Inquiry (cont.). Develop action plans: Action plans should address the implementation of solutions that should be pilot tested. Consider solutions that can be implemented by current staff. If there are additional costs, develop budgets for consideration internally and externally. (Remember, seeking additional funds can slow down the change process.) Develop action plans with time frames for implementation and evaluation.

44 Stage 4: Evaluate. Implement pilot test and evaluate. Provide feedback to workgroups and the campus coordinating team. Use evaluation results to refine the solution. Evaluation can also be used as a basis for seeking additional funding from internal and external sources, if needed.

45 Building Trust – Lowering Resistance to Change. Do: evaluate program effectiveness; provide incentives for using information (regardless of results); raise expectations regarding the quality and use of evidence; be patient with the learning curve; raise expectations for learning (for students and colleagues). Don't: evaluate individual effectiveness; tie resource allocation directly to results; beat people over the head with findings; confuse anecdotes with evidence; keep changing direction based on initial findings; lower expectations for learning.

46 What's the Point? Assessment and evaluation are means, not ends. Other important ingredients include bringing the right people together, a climate of trust and experimentation, and incentives and support. It's not rocket science: an imprecise answer to the right question is much more useful than a precise answer to the wrong question.

