Data Collection and Closing the Loop: Assessment’s Third and Fourth Steps
Office of Institutional Assessment & Effectiveness, SUNY Oneonta
Spring 2011


Important Assessment “Basics”
- Establishing congruence among institutional goals, programmatic and course objectives, learning opportunities, and assessments
- Linkages to disciplinary (and, as appropriate, accreditation/certification) standards
- Using a variety of measures, both quantitative and qualitative, in search of convergence
- Value of course-embedded assessment
- Course- vs. program-level assessment

Course- vs. Program-Level Assessment
- The focus of SUNY Oneonta assessment planning is programmatic student learning objectives
  - Not about assessment of individual students or faculty
  - Rather, the question is: To what extent are students achieving programmatic objectives?
- Data collection will still, for the most part, take place in the context of the classroom (i.e., course-embedded assessment)
- However, the program must have a process in place for compiling and aggregating data across courses and course sections, as appropriate

What You’ve Done So Far
1. Development of programmatic student learning objectives
   - Including discipline-appropriate as well as college-wide expectations for student learning
   - Covering cognitive, behavioral, and attitudinal characteristics as appropriate
2. Curriculum mapping
   - Determining the extent to which learning objectives correspond to curricular experiences
   - Reviewing rationale for program requirements and structure
   - Exploring potential for developing an “assessment database,” leading directly to Step 3

Sample Curriculum Map – Linking Step 2 to Step 3
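The map itself is an image in the original slides and is not reproduced here. Purely as a hypothetical illustration (the course numbers and markings below are invented), a curriculum map cross-tabulates programmatic learning objectives against required courses, noting where each objective is addressed, e.g., I = introduced, R = reinforced, M = mastered:

Objective                     CRS 101   CRS 250   CRS 380   Capstone
1. Disciplinary knowledge        I         R         R         M
2. Written communication         I                   R         M
3. Quantitative reasoning                  I         R         M

Empty cells show where an objective is not addressed; surfacing such gaps is what the mapping in Step 2 is for, and the filled cells indicate where Step 3 data might be collected.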

Collecting Assessment Data: Assessment’s Third Step
Finding Evidence that Students are Achieving Programmatic Goals

Important Preliminary Activities
- Reach consensus as a faculty on what constitutes good assessment practice
  - No point in collecting meaningless data!
- Develop strategies for assuring that measures to be used are of sufficient quality
  - Review by a person/group other than the faculty member who developed the measure
  - Use of a checklist that demonstrates how the measure meets good-practice criteria developed by program faculty
- Decide how the issue of “different sections” will be addressed
  - Will the same measures be used?
  - If not, how will comparability be assured?

Assuring Quality of Plan: Questions to Ask
- Are assessment measures direct?
  - Student perceptions of the program are valuable, but cannot be the only indicator of learning
- Is there logical correspondence between the measure(s) and the learning objective(s) being assessed?
- Is there a process for establishing reliable scoring of qualitative measures?
- Are data being collected from a range of courses across the program (i.e., are they representative)?
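The third question above, about reliable scoring of qualitative measures, is often addressed by having two raters independently score the same sample of student work and checking their agreement. The following is a minimal sketch in Python, offered only as an illustration; the scores, the 1–4 scale, and the interpretation threshold are hypothetical rather than part of the college guidelines.

from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(counts_a) | set(counts_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical rubric scores (1-4) assigned independently by two raters to ten essays
rater_1 = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_2 = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]

print(f"Cohen's kappa: {cohen_kappa(rater_1, rater_2):.2f}")  # 0.71 for these scores

Rules of thumb vary, but kappa values in roughly the 0.6–0.8 range are commonly read as substantial agreement; lower values suggest the rubric language or rater norming needs attention before program-level conclusions are drawn.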

Suggestions for Maximizing Value of Assessment Data
- Use a variety of assessment measures
  - Quantitative and qualitative
  - Course-embedded and “stand-alone” measures (e.g., ETS Major Field Tests, CLA results)
- Use benchmarking as appropriate and available
- Ultimately, convergence of assessment results is ideal (i.e., triangulation)
- Establish a reasonable schedule for collecting assessment data on an ongoing basis (i.e., approximately 1/3 of learning objectives per year)

Suggestions for Maximizing Value of Assessment Data (cont.)
- For each learning objective, collect assessment data from a variety of courses at different levels as much as possible
  - Helps assure results aren’t “idiosyncratic” to one course or faculty member
  - Can provide insight into the extent to which students are “developing” (cross-sectionally, anyway)

Also Consider the Following:
- The value of a capstone experience for collecting assessment data
- “Double dipping” (i.e., using the same evaluative strategies and criteria to assign grades and produce programmatic assessment data)
- Working closely with other faculty in developing measures, especially when teaching courses with multiple sections
  - Do measures have to be the same? No, but the more different they are, the harder it will be to compile data and reach meaningful conclusions

From Learning Objectives to Assessment Criteria
- Once measures are selected, establish clear and measurable a priori “success” indicators
  - For each measure, determine what constitutes meeting and not meeting standards
- While these definitions may vary across faculty, programs will need to use the same categories for results (e.g., exceeding, meeting, approaching, not meeting standards)
  - Otherwise, reaching conclusions about “program effectiveness” will be difficult
- Again, the more faculty collaborate with each other in establishing standards, the easier it will be to organize results and reach meaningful conclusions
  - Ultimately, it’s a programmatic decision
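As a concrete sketch of what a priori “success” indicators can look like, suppose a program agreed on shared numeric thresholds for a 100-point embedded measure and applied them uniformly when categorizing results. The cutoffs and scale below are invented for illustration, not prescribed by the guidelines.

# Hypothetical cutoffs a program's faculty might agree on in advance (illustrative only)
CUTOFFS = [
    (90, "exceeding standards"),
    (75, "meeting standards"),
    (60, "approaching standards"),
]

def categorize(score):
    """Map a numeric result to the program's shared reporting categories."""
    for minimum, label in CUTOFFS:
        if score >= minimum:
            return label
    return "not meeting standards"

print([categorize(s) for s in (95, 82, 71, 54)])
# ['exceeding standards', 'meeting standards', 'approaching standards', 'not meeting standards']

The particular numbers matter less than the fact that the thresholds are written down before the data come in and that every faculty member reports results using the same four categories.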

Post-Assessment Considerations
- Once data are collected, they must be organized and maintained in a single place
  - An Excel spreadsheet will work just fine
- They will also have to be compiled in some fashion, although the form this takes will depend on the program’s approach
  - One possibility: Examine for each learning objective the overall percentage of students who met or failed to meet standards (using averages)
  - Or: Break these percentages down by course level
- Ultimately, some systematic organization and categorization of assessment results is necessary in order to move on to Step 4
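A spreadsheet is sufficient, but the compilation described above can also be scripted so that it is easy to repeat each assessment round. The sketch below assumes a hypothetical file, assessment_results.csv, with one row per student per measure and columns named objective, course_level, and category (the shared categories from the previous slide); the file name and column names are illustrative, not prescribed.

import pandas as pd

# Hypothetical layout: one row per student per measure, with columns
# 'objective', 'course_level', and 'category'
results = pd.read_csv("assessment_results.csv")

# Flag students who met or exceeded standards
results["met"] = results["category"].isin(["meeting standards", "exceeding standards"])

# Overall percentage meeting or exceeding standards, for each learning objective
print(results.groupby("objective")["met"].mean().mul(100).round(1))

# The same percentages broken down by course level (e.g., 100-, 200-, 300-level)
print(
    results.groupby(["objective", "course_level"])["met"]
    .mean()
    .mul(100)
    .round(1)
    .unstack("course_level")
)

Either table, kept from round to round, provides the organized, categorized results that Step 4 requires.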

Closing the Loop: Assessment’s Fourth Step
Using Assessment Data to Improve Programs, Teaching, and Learning

Now That You’ve Gone to All This Trouble…
- The only good reason to do assessment is to use the results to inform practice
- This can and should happen at the individual faculty level, but in the context of program assessment, the following needs to happen:
  - Provision of compiled, aggregated data to faculty for review and consideration
  - Group discussion of those data
  - DOCUMENTATION of the assessment process and results, conclusions reached by faculty, and actions to be taken (more about this later)

What Should be the Focus of the Closing the Loop Process?
- Identification of “patterns of evidence” as revealed by the assessment data
  - How are data consistent?
    - Do students at different course levels perform similarly?
    - Eventually, it will be possible to look at this issue over time
  - How are they distinctive?
    - Do students perform better on some objectives than others?
- Comparison of expected to actual results
  - What expectations were confirmed? What came as a complete surprise?
    - What are possible explanations for the surprise?

What Should be the Focus of the Closing the Loop Process? (cont.)
- The decision as to whether assessment results are “acceptable” to faculty in the program
  - What strengths (and weaknesses) are revealed?
  - What explains the strengths and weaknesses?
    - Do they make sense, given results of the curriculum mapping process and other information (e.g., staffing patterns, course offerings)?
  - And, most important, what should (and can) the program do to improve areas of weakness?
- The process also provides an ideal opportunity to make changes in the assessment process itself, as well as in programmatic objectives, for the next assessment round

Some Possible Ways to Close the Assessment Loop
- Faculty, staff, and student development activities
- Program policies, practices, and procedures
- Curricular reform
- Learning opportunities

A Final Issue: The Importance of Documenting Assessment
- Increasing requirements related to record-keeping on assessment and on actions taken based on assessment results
  - Frequently, actions that are taken don’t “match” results
- Documentation need not be highly formal; in fact, it can be done effectively in tabular form for each objective, to include:
  - Summary of results
  - Brief description of strengths and weaknesses revealed by the data
  - Planned revisions to make improvements, as appropriate
  - Planned revisions to the assessment process itself
- Provides a record that can be referred to in later assessment rounds and a way of monitoring progress over time
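As a template (the wording here is illustrative, not prescribed), a documentation entry for a single objective might look like:

Objective [#]: [programmatic student learning objective]
  Summary of results:            [e.g., percentage of students meeting or exceeding standards, by course level]
  Strengths and weaknesses:      [brief description of the patterns revealed by the data]
  Planned program improvements:  [curricular, pedagogical, or policy actions, as appropriate]
  Planned assessment revisions:  [changes to measures, criteria, or data collection for the next round]

One such entry per objective, updated each round, is enough to show later reviewers what was found, what was decided, and whether the actions taken actually match the results.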

Developing an Assessment Plan: Some Important Dates
- May 3, 2010: Submission of Step 1 (Establishing Objectives) of college guidelines
- December 1, 2010: Submission of Step 2 (Activities & Strategies) of guidelines
- June 1, 2011: Submission of Steps 3 (Assessment) and 4 (Closing the Loop) [plans only]
- Academic year: First round of data collection

APAC Members
- Paul French
- Josh Hammonds
- Michael Koch
- Richard Lee
- Patrice Macaluso
- William Proulx
- Anuradhaa Shastri
- Bill Wilkerson (Chair)
- Patty Francis (ex officio)