Assessment 101 WARP Meeting Friday November 13, 2009


1 Assessment 101 WARP Meeting Friday November 13, 2009
Anne Marie Karlberg, Director of Institutional Research and Assessment, (360)
There are many ways to think about assessment and structure a presentation; this is just one way of structuring a framework.
Hand-outs: CIPP model summary sheet (one side); assessment products grid (other side)

2 Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

3 Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

4 What is assessment? “The systematic collection of information about student learning…to inform decisions about how to improve learning.” (Walvoord, 2004, p. 2) This will be a review for some of you. But first, we need to step back and make sure we’re all operating with a similar definition of assessment. This is the most succinct and clearest definition of assessment I know. Assessment is...

5 Purposes: improvement (formative); accountability (summative)
The 2 purposes of assessment are to (1) improve student learning and performance (internal improvement) and (2) demonstrate to external accreditation bodies that relationships exist between the college’s mission and learning outcomes.

6 Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

7 Context Inputs Processes Products
Educational Evaluation Model: CIPP, by Daniel Stufflebeam
Context, Inputs, Processes, Products
In 1965, an educator named Daniel Stufflebeam developed the context, input, processes, and products model – the CIPP model. The CIPP model provides a useful framework for evaluating educational initiatives by breaking them down into their context, inputs, processes, and products. [As the model evolved, he subdivided products into impact, effectiveness, sustainability, and generalizability.] Since the 1960s, this model has evolved and is used in the educational world to evaluate programs. It is not necessary to evaluate all 4 areas when evaluating a program, but it is important to at least consider each of these components.

8 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program Context Inputs Processes Products Stufflebeam’s CIPP model is useful in creating an effective assessment framework. So I am going to present the components of an effective assessment program within the CIPP framework.

9 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program: Context
Context: Location of college; Population we serve; Faculty/staff
Inputs
Processes
Products
The context of a college includes things like the location of the college, the population served, and the characteristics of the faculty/staff.

10 Context: What do we know about the context of our colleges?
At Whatcom Community College…
8.3% of our employees
18.9% of our degree- and certificate-seeking students
16.0% of Whatcom County residents
…indicate they are of color. So what are the implications of these data?

11 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program: Inputs
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes
Products
The inputs to the assessment program consist of the resources we invest and the plans and strategies we employ to carry it out.

12 Resources: human resources; financial support; technical support; administrator and faculty support
Human resources: Whatcom funds a full-time Director of Institutional Research and Assessment, a part-time Outcomes Assessment Coordinator (a faculty member), and faculty stipends to work on outcomes assessment-related projects.
Financial support: Whatcom provides funds for the administration of assessment-related tasks (e.g., for conducting surveys).
Technical support: Currently, this is one of our biggest challenges. Our databases are extremely cumbersome to use, so it is difficult to generate meaningful data in a reasonable time frame, but we are doing the best we can.
Administrators: Whatcom administrators have provided visible advocacy for assessment, including considerable financial support (necessary opportunities, incentives, material resources, and compensation to faculty and staff for assessment initiatives), and have appreciated and supported staff and faculty for their assessment efforts and achievements.
Faculty: Whatcom faculty members have remained open-minded, responded in respectful, cooperative, and collaborative ways, taken ownership of assessment, and embraced it as an intrinsically valuable process.

13 Plans and Strategies
revise mission statement and familiarize faculty / staff with mission
implement the strategic plan
implement the assessment plan

14 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program: Processes
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes: Implementation; Embedding assessment in college; Learning and teaching practices
Products
The processes include (1) how we go about implementing the assessment program, (2) the extent to which we embed assessment throughout the college, and (3) the learning and teaching practices we employ.

15 Implementation
Link the assessment program to the mission statement: redirect resources towards priorities; increase responsiveness to community needs; measure the extent to which we are successful.
Provide opportunities for meaningful, collaborative college-wide conversations by structuring assessment in an ongoing, simplified, participatory way that is relevant and meaningful.
The way in which the assessment program is carried out is important. It is essential that we…
1. Link our assessment program to the mission. If we do this, it redirects resources towards priorities, increases the college’s responsiveness to the needs of the community, and measures the extent to which we are successful.
2. Provide opportunities for meaningful, collaborative college-wide conversations. We do this by structuring the assessment program in an ongoing, simplified, participatory way that is relevant and meaningful to the college and its faculty and staff.

16 Embed assessment throughout college
strategic planning, the website, curriculum review, employee professional development and performance, budgeting, program review, student government, the college catalogue, program planning, college publications
The extent to which assessment is embedded throughout the college is an indication of the level of success of the assessment program.

17 Learning and teaching practices
Apply meaningful, relevant, and contextualized experiences for students, such as…
using self-reflection
applying concepts to a relevant context
teaching material to peers
discovering connections between subjects
The following teaching practices contribute to effective learning… for example, when faculty…

18 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program: Products
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes: Implementation; Embedding assessment in college; Learning and teaching practices
Products: Direct indicators; Indirect indicators; Institutional data
The products include direct indicators, indirect indicators, and institutional data. These provide the evidence that we are doing what we say we are doing.

19 Assessment data
Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations
1. Direct indicators require students to demonstrate their learning through, for example, essays, capstone projects, demonstrations, and presentations.

20 Assessment data
Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations
Indirect indicators (perceptions): e.g., surveys, focus groups, interviews
2. Indirect indicators ask students to reflect on their learning through, for example, surveys, focus groups, and interviews (these capture students’ perceptions of their learning).

21 Assessment data
Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations
Indirect indicators (perceptions): e.g., surveys, focus groups, interviews
Institutional data: e.g., retention, graduation, enrollment, transfer trends
3. Institutional data do not necessarily indicate student learning, but they do reflect the overall condition and effectiveness of the college (e.g., retention, graduation, enrollment, and transfer trends).

22 Assessment levels College level Program level Course level
Try to use a combination of the 3 types of data at the college, program, and course levels

23 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews)
3. Institutional data (rates and numbers)
As I just mentioned, there are 3 types of assessment data that we use to evaluate student learning: direct indicators, indirect indicators, and institutional data. We need to collect these data at the college, program, and course levels.

24 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews)
3. Institutional data (rates and numbers)
Sometimes people refer to “direct indicators” as “outcomes assessment”. Outcomes are things students are able to do at the end of their college experience, their program, or a course. There are 2 phases of outcomes assessment: (1) development of the outcomes process and (2) implementation of the outcomes process. Example of a CLA: “Communication”. Example of a program outcome from an AA in ECE: Students will be able to create and modify environments and experiences to meet the individual needs of all children, including children with disabilities, developmental delays, and special abilities. Example of a course outcome: ECE students will be able to recognize 6 strategies for dealing with children’s behavior that the students find challenging.

25 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution

26 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution
You can use this general framework for evaluating anything, not just student learning. For example, if you are evaluating an employee, you can determine the degree to which they are accomplishing the tasks for which they are responsible (direct); you can ask others for their perceptions (indirect); and you might have data, such as attendance (institutional).

27 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes: Implementation; Embedding assessment in college; Learning and teaching practices
Products: Direct indicators; Indirect indicators; Institutional data
The intention of the assessment program is to effect learning and a change in the college community and the community at large – the context. When assessment is done well, it can improve student learning and clarify and strengthen the mission of a college. We can use this CIPP model to evaluate any aspect of our college.

28 Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

29 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes: Implementation; Embedding assessment in college; Learning and teaching practices
Products: Direct indicators; Indirect indicators; Institutional data

30 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution
Within WCC’s Assessment Plan, we’ve created a subplan for each of the 9 quadrants of this table. I am going to provide example excerpts from the College column only.

31 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution

32 Revise CLAs and outcomes
Direct indicators: College outcomes plan (columns: Whatcom will…; Baseline (May ); Goal (# of CLAs))
(1) Development of the college outcomes process
a. educate faculty / staff / students about assessment (baseline: no formal initiatives; ongoing)
b. revise Core Learning Abilities (CLAs) and outcomes (baseline: identified 5 CLAs in 1998; revise CLAs and outcomes; refine; review)
c. develop assessment tools (e.g., scoring guides / rubrics) to measure the outcomes (5)
d. determine which courses will be used to introduce, reinforce, and/or assess outcomes at entry, midway, and exit (e.g., curriculum map)
e. include outcomes on syllabi (1; 2)
f. collect the instructional assignments, activities, projects, or experiences in required courses that will be used to teach outcomes at entry, midway, and exit
g. collect the activities, experiences, projects, essays, or assignments in required courses that will be used to assess outcomes at entry and exit
h. attach anchor papers (i.e., examples) for each level of the scoring guide / rubric scale
(2) Implementation of the college outcomes process
a. assess students at (entry and) exit for outcomes
b. analyze the (entry and) exit assessment data
c. present analysis to faculty and students and consult on the results
d. use the data to improve and revise curriculum
e. document the process; create an assessment report about how the data were used to improve learning
These are all excerpts from our assessment plan of what we hope to accomplish at the college level this year. You are not meant to be able to read this; the intention of showing it is to give you a sense of all the steps involved in developing and implementing outcomes at the college, program, and course levels. The first 4 tasks, highlighted in yellow, are what we hope to accomplish this year.

33 Revise CLAs and outcomes
Direct indicators: College outcomes plan (columns: Whatcom will…; Baseline (May ); Goal (# of CLAs))
(1) Development of the college outcomes process
a. educate faculty / staff about assessment (baseline: no formal initiatives; ongoing)
b. revise Core Learning Abilities (CLAs) and outcomes (baseline: identified 5 CLAs in ; revise CLAs and outcomes; refine; review)
c. develop assessment tools (e.g., scoring guides / rubrics, etc.) to measure the outcomes
d. determine which courses will be used to introduce, reinforce, and/or assess outcomes at entry, midway, and exit (e.g., curriculum map)

34 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution

35 College survey schedule
Indirect indicators: College survey schedule (columns: Survey; Baseline data (May ); Goal)
Student Opinion Survey (every 6 years, alternating with the CCSSE): conducted January 2008
Community College Survey of Student Engagement (every 6 years, alternating with the Student Opinion Survey)
Faculty Survey of Student Engagement (every 6 years)
Staff satisfaction survey (every 4 years): conducted in
Alumni Survey (ACT with Whatcom-specific questions) (every 10 years)
This is an excerpt from our assessment plan of some of the surveys we’ll be conducting. We are trying to be more systematic, coordinated, and intentional with our surveys so that we don’t over-survey our students and employees. In , we conducted the SOS, and we hope to alternate between the SOS and the CCSSE every 3 years. Assuming we have some funds, we hope to conduct the CCSSE this year. The CCSSE is founded on 5 research-based national benchmarks of effective educational practice for community colleges that are highly correlated with student learning and retention: active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners.

36 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution

37 Institutional data: College-level plan
Type of Data (tracked from a May 2008 baseline across 2008–2009, 2009–2010, 2010–2011, and 2011–2012): Enrollment numbers; Profile of all students; Grade distribution; Graduation rates / numbers; Retention rates; Success after transfer to WWU
This is a small excerpt from our assessment plan from the section dealing with institutional data at the college level. Hopefully, we’ll be generating a lot of data this year.

38 Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

39 Website: WCC Homepage, “About Whatcom”, “Assessment / Inst Research”
We have an assessment website. To access it from the internet, go to …. We are constantly adding new data and information to the website, and we hope people will use this information to make decisions. This and other presentations are posted on the website under “Assessment Resources”.

40 Comments and questions?

41 Overview
What is assessment?
Components of an effective assessment program
Phases of an outcomes process
Assessment plan
Website

42 Assessment Products
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution
At the 3 levels: college, program, and course.

43 Two phases of an outcomes process
(1) Development and (2) Implementation

44 (1) Development of an outcomes process
Educate faculty / staff / students about assessment
State learning outcomes
Develop assessment tools (e.g., rubrics) to measure outcomes
Determine courses that will introduce / reinforce / assess outcomes at entry / midway / exit (e.g., curriculum map)
Include outcomes on syllabi
Develop activities in required courses that will teach outcomes at entry, midway, and exit
Develop activities in required courses that will assess outcomes at (entry and) exit
Attach anchor papers (examples) for each level of the rubric scale
State the level of expected performance
Establish a schedule for assessment
Determine who will interpret results
This example is at the college level: educate faculty / staff / students about assessment; revise outcomes; develop assessment tools (e.g., rubrics) to measure outcomes; determine which courses or experiences will be used to introduce, reinforce, and/or assess outcomes at entry, midway, and exit (e.g., curriculum map); include outcomes on syllabi; collect instructional assignments, activities, projects, or experiences in required courses that will be used to teach outcomes at entry, midway, and exit; collect activities, experiences, projects, essays, or assignments in required courses that will be used to assess outcomes at entry and exit; attach anchor papers (i.e., examples) for each level of the scoring guide / rubric scale. This usually takes years to develop well.

45 (2) Implementation of an outcomes process
Assess students at (entry and) exit for outcomes
Analyze (entry and) exit assessment data
Present analysis to faculty / students and consult on the results
Use the data to improve and revise curriculum
Document the process
Create a report about how the data were used to improve learning

46 Hand-outs

47 Context Inputs Processes Products
Using CIPP to Create an Effective Assessment Program
Context: Location of college; Population we serve; Faculty/staff
Inputs: Resources; Plans and strategies
Processes: Implementation; Embedding assessment in college; Learning and teaching practices
Products: Direct indicators; Indirect indicators; Institutional data
The intention of the assessment program is to effect learning and a change in the college community and the community at large – the context. When assessment is done well, it can improve student learning and clarify and strengthen the mission of a college. We can use this CIPP model to evaluate any aspect of our college.

48 Assessment Products: Examples
Type of data | College | Program | Course
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes | Program outcomes | Course outcomes
2. Indirect indicators (surveys, interviews): Student opinion survey, CCSSE, Graduate survey, Alumni survey, Student evaluation of courses
3. Institutional data (rates and numbers): Graduation, Performance after transfer, Student enrollment, Retention, Transfer, Course completion, Grade distribution
Anne Marie Karlberg (Director of Institutional Research and Assessment), (360)

