Aaron J. Scott, Research Associate. October 13th, 2009.


1 Aaron J. Scott, Research Associate. October 13th, 2009

2
 To briefly revisit the culture-of-evidence definition and model and its associated stages.
 To reconsider Stage 1 and expand our notion of planning.
 To discuss specific items to consider during the planning process, i.e., going beyond the objective statements.
 To provide examples from our own department that are in, or have recently gone through, this stage.

3  A culture of evidence is defined by Lakos & Phipps (2004) as: “An organizational environment in which decisions are based on facts, research, and analysis, and where services are planned and delivered in ways that maximize positive outcomes and impacts for customers and stakeholders” (p. 352).

4
 Transparent Decision-Making
   Charting progress toward goals
   Documenting implementation and improvement
   Data-informed decisions through assessment
 Organizational/Departmental Effectiveness
   Integrating goals and working together
   Reporting outcomes in a consistent format
 Accountability
   To the Department
   To the University
   To External Stakeholders
     The State
     Parents
     Funding Sources

5 [Diagram: the four-stage culture-of-evidence model for outcomes or processes, Stage 1 through Stage 4]

6 Three Steps in Affirming and Building Upon Our Purpose(s)
 Through development/review of Mission
 Through development of Goals
 Through development of Objectives
[Diagram: Mission → Goals → Objectives]

7
 Action statements
 Clear, specific, and concrete
 Measurable
 Time-limited
 Realistic
 Hierarchical (ultimate, intermediate, immediate)
 Build on strengths and reduce need
 Focus on outcome or process

8
 Securing the planning and budgeting at the beginning sets the foundation for a clear course of implementation and assessment.
 This first stage has only begun, however. You must determine how you will implement the program or services, how you will define and measure their degree of success, who will be involved, what resources it will take to work toward accomplishing your objectives, and how long the process will take, to name just a few considerations. This often requires much more work and time!

11
 The quality of a program or service, and of the decisions made on its behalf, depends in part on the quality and utility of the data informing it.
 But the quality of the data, and therefore of the evidence, is proportional to the planning and rigor invested in clarifying the purpose, design, and implementation of a program or service and its associated assessment.

12
 Example: A plethora of previous research shows that living on campus during a student's first year has a positive impact on his/her retention and overall academic success, but less certainty exists about how this takes place (Braxton, 2007).
 To what degree is this true on our campus, and how do we know? Is it true for all sub-groups? Are there ways we can maximize our impact? What data are there to support what we know? What data do we still need?

13 [Flowchart]
Current or New Program/Service? (Current Program/Service) →
Define Purpose of Program/Service →
Define Problem or Need →
Define Success or Progress →
Identify Type of Evidence Needed →
List and Consider Methods Available to Collect Evidence →
Determine Resources/Instruments Available to Collect Evidence →
Create a Timeline to Collect, Analyze, and Report Evidence

14
 Carefully choose your evidence and consider its appropriateness (and timeliness) for the program or service.
 Choose how to collect your evidence (interviews, focus groups, brief questionnaires, in-depth surveys, paper form, web-based, e-mail based).
 Choose how to frame your evidence by providing a context for the results.
 Determine to whom you will be providing this evidence.

15
 What is my potential available budget?
 What is my timeline? How realistic is it?
 What are my analysis capabilities?
 Who needs to see these data? How long will it take to get the data into a presentable format?
 How easily can I fit this method into my annual responsibilities?
 Who needs to make decisions with these data?
 Will this kind of evidence help me make the decisions I need to make? How?
 How will I document the evidence and the decisions made from that evidence?

16
 The purpose of the program/service and its assessment is stated clearly, concisely, and completely in terms of outcomes or processes. Oftentimes this is stated in terms of a problem or need.
 Members involved have been consulted early on and have agreed to participate.
 How the purpose will be carried out (i.e., implementation) is spelled out clearly.
 Expectations are clearly stated.
 A timeline has been through at least two drafts and has been peer reviewed.
 A method for measuring outcomes has been secured (pre-fab or home-made) and either has well-established reliability and validity or is planned to be piloted.
 The IRB process has been completed, if required.
 Anticipated costs have been outlined.

17
 Is it measurable?
 Is it meaningful?
 Is it manageable?
 Who is the target audience of my outcome?
 How will I know if it has been met?
 Will it provide me with evidence that will lead me to make a decision for continuous improvement?

18  A culture of evidence is a great motto, but what kind of evidence? How good is the evidence, and how do you know? Only through rigorous planning and implementation can we begin to answer these and other important assessment questions.

