Assessment 101 WARP Meeting Friday November 13, 2009

Presentation transcript:

Assessment 101, WARP Meeting, Friday, November 13, 2009
Anne Marie Karlberg, Director of Institutional Research and Assessment, (360) 383-3302, amkarlberg@whatcom.ctc.edu, http://faculty.whatcom.ctc.edu/InstResearch/index.htm
There are many ways to think about assessment and to structure a presentation; this is just one way of structuring a framework.
Hand-outs: CIPP model summary sheet (one side); assessment products grid (other side)

Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

What is assessment? Assessment is “the systematic collection of information about student learning…to inform decisions about how to improve learning” (Walvoord, 2004, p. 2). This will be a review for some of you, but first we need to step back and make sure we’re all operating with a similar definition of assessment. This is the most succinct and clearest definition of assessment I know.

Purposes: improvement (formative) and accountability (summative). The two purposes of assessment are (1) to improve student learning and performance (internal improvement), and (2) to demonstrate to external accreditation bodies that relationships exist between the college’s mission and learning outcomes.

Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

Educational Evaluation Model: CIPP (by Daniel Stufflebeam)
Context, Inputs, Processes, Products
In 1965, an educator named Daniel Stufflebeam developed the context, inputs, processes, and products model, or CIPP model. The CIPP model provides a useful framework for evaluating educational initiatives by breaking them down into their context, inputs, processes, and products. [As the model evolved, he subdivided products into impact, effectiveness, sustainability, and generalizability.] Since the 1960s, this model has evolved and is used in the educational world to evaluate programs. It is not necessary to evaluate all four areas when evaluating a program, but it is important to at least consider each of these components.

Using CIPP to Create an Effective Assessment Program
Context, Inputs, Processes, Products
Stufflebeam’s CIPP model is useful in creating an effective assessment framework, so I am going to present the components of an effective assessment program within the CIPP framework.

Using CIPP to Create an Effective Assessment Program: Context
Context: location of college, population we serve, faculty/staff
Inputs
Processes
Products
The context of a college includes things like the location of the college, the population served, and the characteristics of the faculty and staff.

Context: What do we know about the context of our colleges? At Whatcom Community College, 8.3% of our employees, 18.9% of our degree- and certificate-seeking students, and 16.0% of Whatcom County residents indicate they are people of color. So what are the implications of these data?

Using CIPP to Create an Effective Assessment Program: Inputs
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes
Products
The inputs to the assessment program consist of the resources we invest and the plans and strategies we employ to carry out the assessment program.

Resources: human resources, financial support, technical support, administrator and faculty support.
Human resources: Whatcom funds a full-time Director of Institutional Research and Assessment, a part-time Outcomes Assessment Coordinator (a faculty member), and faculty stipends to work on outcomes assessment-related projects.
Financial support: Whatcom provides funds for the administration of assessment-related tasks (e.g., for conducting surveys).
Technical support: Currently, this is one of our biggest challenges. Our databases are extremely cumbersome to use, so it is difficult to generate meaningful data in a reasonable time frame, but we are doing the best we can.
Administrators: Whatcom administrators have provided visible advocacy for assessment, including considerable financial support (necessary opportunities, incentives, material resources, and compensation to faculty and staff for assessment initiatives), and have appreciated and supported staff and faculty for their assessment efforts and achievements.
Faculty: Whatcom faculty members have remained open-minded, responded in respectful, cooperative, and collaborative ways, and taken ownership of assessment, embracing it as an intrinsically valuable process.

Plans and Strategies: revise the mission statement and familiarize faculty and staff with the mission; implement the strategic plan; implement the assessment plan.

Using CIPP to Create an Effective Assessment Program: Processes
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes: implementation, embedding assessment in the college, learning and teaching practices
Products
The processes include (1) how we go about implementing the assessment program, (2) the extent to which we embed assessment throughout the college, and (3) the learning and teaching practices we employ.

Implementation: link the assessment program to the mission statement (redirect resources towards priorities, increase responsiveness to community needs, measure the extent to which we are successful); provide opportunities for meaningful, collaborative college-wide conversations by structuring assessment in an ongoing, simplified, participatory way that is relevant and meaningful. The way in which the assessment program is carried out is important. It is essential that we (1) link our assessment program to the mission: doing so redirects resources towards priorities, increases the college’s responsiveness to the needs of the community, and measures the extent to which we are successful; and (2) provide opportunities for meaningful, collaborative college-wide conversations, which we do by structuring the assessment program in an ongoing, simplified, participatory way that is relevant and meaningful to the college and to faculty and staff.

Embed assessment throughout the college: strategic planning, the website, curriculum review, employee professional development and performance, budgeting, program review, student government, the college catalogue, program planning, college publications. The extent to which assessment is embedded throughout the college is an indication of the level of success of the assessment program.

Learning and teaching practices: apply meaningful, relevant, and contextualized experiences for students, such as using self-reflection, applying concepts to a relevant context, teaching material to peers, and discovering connections between subjects. Teaching practices such as these contribute to effective learning.

Using CIPP to Create an Effective Assessment Program: Products
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes: implementation, embedding assessment in the college, learning and teaching practices
Products: direct indicators, indirect indicators, institutional data
The products include direct indicators, indirect indicators, and institutional data. These provide the evidence that we are doing what we say we are doing.

Assessment data. Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations. Direct indicators require students to demonstrate their learning through, for example, essays, capstone projects, demonstrations, and presentations.

Assessment data. Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations. Indirect indicators (perceptions): e.g., surveys, focus groups, interviews. Indirect indicators ask students to reflect on their learning through, for example, surveys, focus groups, and interviews; they are students’ perceptions of their learning.

Assessment data. Direct indicators (outcomes): e.g., essays, capstone projects, demonstrations, presentations. Indirect indicators (perceptions): e.g., surveys, focus groups, interviews. Institutional data: e.g., retention, graduation, enrollment, transfer trends. Institutional data do not necessarily indicate student learning, but they do reflect the overall condition and effectiveness of the college (e.g., retention, graduation, enrollment, and transfer trends).

Assessment levels: college level, program level, course level. Try to use a combination of the three types of data at the college, program, and course levels.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews)
3. Institutional data (rates and numbers)
As I just mentioned, there are three types of assessment data that we use to evaluate student learning: direct indicators, indirect indicators, and institutional data. We need to collect these data at the college, program, and course levels.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews)
3. Institutional data (rates and numbers)
Sometimes people refer to “direct indicators” as “outcomes assessment.” Outcomes are things students are able to do at the end of their college experience, their program, or a course. There are two phases of outcomes assessment: (1) development of the outcomes process and (2) implementation of the outcomes process. Example of a CLA: “Communication.” Example of a program outcome from an AA in ECE: students will be able to create and modify environments and experiences to meet the individual needs of all children, including children with disabilities, developmental delays, and special abilities. Example of a course outcome (ECE 116): students will be able to recognize six strategies for dealing with children’s behavior that the students find challenging.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution
You can use this general framework for evaluating anything, not just student learning. For example, if you are evaluating an employee, you can determine the degree to which they are accomplishing the tasks for which they are responsible (direct), you can ask others for their perceptions (indirect), and you might have data such as attendance records.

Using CIPP to Create an Effective Assessment Program
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes: implementation, embedding assessment in the college, learning and teaching practices
Products: direct indicators, indirect indicators, institutional data
The intention of the assessment program is to effect learning and a change in the college community and the community at large (the context). When assessment is done well, it can improve student learning and clarify and strengthen the mission of a college. We can use this CIPP model to evaluate any aspect of our college.

Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

Using CIPP to Create an Effective Assessment Program
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes: implementation, embedding assessment in the college, learning and teaching practices
Products: direct indicators, indirect indicators, institutional data

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution
Within WCC’s Assessment Plan, we’ve created a subplan for each of the nine cells of this table. I am going to provide excerpts of examples under the college heading only.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution

Direct indicators: College outcomes plan (excerpt). Goals are expressed as the number of CLAs addressed, with a baseline of May 2008 and year-by-year targets from 2008–2009 through 2011–2012. Whatcom will:
(1) Development of the college outcomes process: (a) educate faculty, staff, and students about assessment (baseline: no formal initiatives; ongoing); (b) revise the Core Learning Abilities (CLAs) and outcomes (baseline: five CLAs identified in 1998; revise, then refine, then review); (c) develop assessment tools (e.g., scoring guides / rubrics) to measure the outcomes; (d) determine which courses will be used to introduce, reinforce, and/or assess outcomes at entry, midway, and exit (e.g., curriculum map); (e) include outcomes on syllabi; (f) collect the instructional assignments, activities, projects, or experiences in required courses that will be used to teach outcomes at entry, midway, and exit; (g) collect the activities, experiences, projects, essays, or assignments in required courses that will be used to assess outcomes at entry and exit; (h) attach anchor papers (i.e., examples) for each level of the scoring guide / rubric scale.
(2) Implementation of the college outcomes process: (a) assess students at (entry and) exit for outcomes; (b) analyze the (entry and) exit assessment data; (c) present the analysis to faculty and students and consult on the results; (d) use the data to improve and revise curriculum; (e) document the process and create an assessment report about how the data were used to improve learning.
These are excerpts from our assessment plan of what we hope to accomplish at the college level this year. You are not meant to be able to read this; the intention of showing it is to give you a sense of all the steps involved in developing and implementing outcomes at the college, program, and course levels. The first four tasks, highlighted in yellow, are what we hope to accomplish this year.

Direct indicators: College outcomes plan (excerpt; baseline May 2008, targets for 2008–2009 through 2011–2012). Whatcom will:
(1) Development of the college outcomes process: (a) educate faculty and staff about assessment (baseline: no formal initiatives; ongoing); (b) revise the Core Learning Abilities (CLAs) and outcomes (baseline: five CLAs identified in 1998; revise, then refine, then review); (c) develop assessment tools (e.g., scoring guides / rubrics) to measure the outcomes; (d) determine which courses will be used to introduce, reinforce, and/or assess outcomes at entry, midway, and exit (e.g., curriculum map).

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution

Indirect indicators: College survey schedule (excerpt; baseline data May 2008, schedule through 2012–2013)
Student Opinion Survey (every six years, alternating with the CCSSE): conducted January 2008
Community College Survey of Student Engagement (every six years, alternating with the Student Opinion Survey)
Faculty Survey of Student Engagement (every six years)
Staff satisfaction survey (every four years): conducted in 2007
Alumni Survey (ACT instrument with Whatcom-specific questions) (every ten years)
This is an excerpt from our assessment plan of some of the surveys we’ll be conducting. We are trying to be more systematic, coordinated, and intentional with our surveys so that we don’t over-survey our students and employees. In 2007–08 we conducted the SOS, and we hope to alternate between the SOS and the CCSSE every three years. Assuming we have some funds, we hope to conduct the CCSSE this year. The CCSSE is founded on five research-based national benchmarks of effective educational practice for community colleges that are highly correlated with student learning and retention: active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution

Institutional data: College-level plan (excerpt; baseline May 2008, planned through 2011–2012). Types of data: enrollment numbers, profile of all students, grade distribution, graduation rates/numbers, retention rates, success after transfer to WWU. This is a small excerpt from the section of our assessment plan dealing with institutional data at the college level. Hopefully, we’ll be generating a lot of data this year.

Overview
What is assessment?
Components of an effective assessment program
Assessment plan
Assessment website

Website: WCC homepage > “About Whatcom” > “Assessment / Inst Research”, or http://www.faculty.whatcom.ctc.edu/InstResearch/. We have an assessment website; to access it from the internet, use the path or URL above. We are constantly adding new data and information to the website, and we hope people will use this information to make decisions. This and other presentations are posted on the website under “Assessment Resources.”

Comments and questions?

Overview
What is assessment?
Components of an effective assessment program
Phases of an outcomes process
Assessment plan
Website

Assessment Products (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution
At the three levels: college, program, and course.

Two phases of an outcomes process: (1) development and (2) implementation.

(1) Development of an outcomes process
Educate faculty / staff / students about assessment
State learning outcomes
Develop assessment tools (e.g., rubrics) to measure outcomes
Determine which courses will introduce / reinforce / assess outcomes at entry / midway / exit (e.g., curriculum map)
Include outcomes on syllabi
Develop activities in required courses that will teach outcomes at entry, midway, and exit
Develop activities in required courses that will assess outcomes at (entry and) exit
Attach anchor papers (examples) for each level of the rubric scale
State the level of expected performance
Establish a schedule for assessment
Determine who will interpret results
This example is at the college level; the development phase usually takes years to do well.

(2) Implementation of an outcomes process
Assess students at (entry and) exit for outcomes
Analyze the (entry and) exit assessment data
Present the analysis to faculty and students and consult on the results
Use the data to improve and revise curriculum
Document the process
Create a report about how the data were used to improve learning

Hand-outs

Using CIPP to Create an Effective Assessment Program
Context: location of college, population we serve, faculty/staff
Inputs: resources, plans and strategies
Processes: implementation, embedding assessment in the college, learning and teaching practices
Products: direct indicators, indirect indicators, institutional data
The intention of the assessment program is to effect learning and a change in the college community and the community at large (the context). When assessment is done well, it can improve student learning and clarify and strengthen the mission of a college. We can use this CIPP model to evaluate any aspect of our college.

Assessment Products: Examples (type of data at the college, program, and course levels)
1. Direct indicators (outcomes assessment): Core Learning Abilities and outcomes (college); program outcomes; course outcomes
2. Indirect indicators (surveys, interviews): student opinion survey, CCSSE, graduate survey, alumni survey, student evaluation of courses
3. Institutional data (rates and numbers): graduation, performance after transfer, student enrollment, retention, transfer, course completion, grade distribution
Anne Marie Karlberg (Director of Institutional Research and Assessment), (360) 383-3302, amkarlberg@whatcom.ctc.edu, http://faculty.whatcom.ctc.edu/InstResearch/index.htm