Integrating Student Learning into Program Review
Barbara Wright, Associate Director, WASC
Retreat on Student Learning and Assessment, Irvine
February 1, 2008

Assessment & Program Review: related but different

Program review typically emphasizes inputs, e.g.:
- Mission statement, program goals
- Faculty, their qualifications
- Students, enrollment levels, qualifications
- Library, labs, technology, other resources
- Financial support

Assessment & Program Review: related but different, cont.

Program review typically emphasizes processes, e.g.:
- Faculty governance
- Curriculum review
- Planning
- Follow-up on graduates
- Budgeting
- And yes, assessment may be one of these

Assessment & Program Review: related but different, cont.

Program review typically emphasizes indirect indicators of student learning and academic quality, e.g.:
- Descriptive data
- Surveys of various constituencies
- Existence of relationships, e.g. with area businesses, the professional community

Program review has traditionally neglected actual student learning outcomes.

Assessment & Program Review: related but different, cont.

Program review is typically conceived as:
- Data-gathering
- Looking at the past 5-8 years
- Reporting after the fact where the program has been
- Using PR to garner resources, or at least to protect what the program has
- Projecting needs into the future
- Expressing "quality" & "improvement" in terms of a case for additional inputs

Capacity vs. Educational Effectiveness for Programs

Capacity questions: What does the program have in the way of inputs, processes, and evidence of outputs or outcomes? What does it need, and how will it get what it needs?

EE questions: How effectively do the inputs and processes contribute to desired outcomes? How good are the outputs? The student learning?

Assessment & Program Review: related but different

Assessment is all about:
- Student learning & improvement at the individual, program & institutional levels
- Articulation of specific learning goals (as opposed to program goals, e.g. "We will place 90% of graduates in their field.")
- Gathering of direct, authentic evidence of learning (as opposed to indirect evidence, descriptive data)

Assessment & Program Review: related but different, cont.

Assessment is all about:
- Interpretation & use of findings to improve learning & thus strengthen programs (as opposed to reporting of data to improve inputs)
- A future orientation: here's where we are, and here's where we want to go in student learning over the next 3-5 years
- Understanding the learning "problem" before reaching for a "solution"

Assessment & Program Review: related but different, cont.

Assessment of student learning and program review are not the same thing. However, there is a place for assessment as a necessary and significant input in program review. We should look for:
- A well-functioning process
- Key learning goals
- Standards for student performance
- A critical mass of faculty (and students) involved
- Verifiable results, and
- Institutional support

The Assessment Loop

1. Goals, questions
2. Gathering evidence
3. Interpretation
4. Use

The Assessment Loop – Capacity Questions

1. Does the program have student learning goals and questions?
2. Does it have methods and processes for gathering evidence? Does it have evidence?
3. Does it have a process for systematic, collective analysis and interpretation of evidence?
4. Is there a process for using findings for improvement? Is there administrative support, planning, budgeting? Are there rewards for faculty?

The Assessment Loop – Effectiveness Questions

1. How well does the program achieve its student learning goals and answer its questions?
2. How aligned are the methods? How effective are the processes? How complete is the evidence?
3. How well do the processes for systematic, collective analysis and interpretation of evidence work? What has been found?
4. What is the quality of follow-through on findings for improvement? Is there improvement? How adequate and effective are administrative support, planning, and budgeting? Are there rewards for faculty?

Don't confuse program-level assessment and program review

Program-level assessment means we look at learning at the program level (not just the individual student or course level) and ask what all the learning experiences of a program add up to, and at what standard of performance (results).

Program review looks for program-level assessment of student learning but goes beyond it, examining other components of the program (mission, faculty, facilities, demand, etc.).

What does WASC want? Both!

- Systematic, periodic program review, including a focus on student learning results as well as other areas (inputs, processes, products, relationships)
- An improvement-oriented student learning assessment process as a routine part of the program's functioning

Institutionalizing Assessment – 2 aspects:

- The PLAN for assessment (i.e. a shared definition of the process, purpose, values, vocabulary, communication, use of findings)
- The STRUCTURES and RESOURCES that make the plan doable

How to institutionalize

- Make assessment a freestanding function, or
- Attach it to an existing function, e.g.:
  - Accreditation
  - Academic program review
  - Annual reporting process
  - Center for Teaching Excellence
  - Institutional Research

Make assessment freestanding – Positives and Negatives

Positives:
- Maximum flexibility
- Minimum threat, upset
- A way to start

Negatives:
- Little impact
- Little sustainability
- Requires formalization eventually, e.g. as an Office of Assessment

Attach to Office of Institutional Research – Positives and Negatives

Positives:
- Strong data gathering and analysis capabilities
- Responds to external expectations
- Clear responsibility
- IR has resources
- Faculty not "burdened"

Negatives:
- Perception that assessment = data gathering
- Faculty see little or no responsibility
- Faculty uninterested in reports
- Little or no use of findings

Attach to Center for Teaching Excellence – Positives and Negatives

Positives:
- Strong impact possible
- Ongoing, supported
- Direct connection to faculty, classroom, learning
- Chance for maximum responsiveness to the "use" phase

Negatives:
- Impact depends on how broadly assessment is done
- No enforcement
- Little/no reporting, communicating
- Rewards, recognition vary, may be lip service

Attach to annual report – Positives and Negatives

Positives:
- Some impact (depending on stakes)
- Ongoing
- Some compliance
- Habit, expectation
- Closer connection to classroom, learning
- Cause/effect possible
- Allows flexibility

Negatives:
- Impact depends on how seriously, how well the annual report is done
- No resources
- Reporting, not improving, unless specified
- Chair writes; faculty involvement varies

Attach to accreditation – Positives and Negatives

Positives:
- Maximum motivation
- Likely compliance
- Resources available
- Staff, faculty assigned
- Clear cause/effect

Negatives:
- Resentment of external pressure
- Us/them dynamic
- Episodic, not ongoing
- Reporting, gaming, not improving
- Little faculty involvement
- Little connection to the classroom, learning
- Main focus: inputs, process

Attach to program review – Positives and Negatives

Positives:
- Some impact (depending on stakes)
- Some compliance
- Some resources available
- Staff, faculty assigned
- Cause/effect varies

Negatives:
- Impact depends on how seriously, how well PR is done
- Episodic, not ongoing
- Inputs, not outcomes
- Reporting, not improving
- Generally low faculty involvement
- Anxiety, risk-aversion
- Weak connection to the classroom, learning

How can we deal with the disadvantages?

- Strong message from administration: PR is serious and has consequences (bad and good)
- Provide attentive, supportive oversight
- Redesign PR to be continuous
- Increase the weighting of assessment in the overall PR process
- Involve more faculty; stay close to the classroom, the program
- Focus on outcomes, reflection, USE
- Focus on improvement (not just "good news") and REWARD IT

How can we increase the weighting of learning & assessment in PR? E.g., move:

From:
- Optional part
- One small part of the total PR process
- "Assessment" vague, left to the program
- Various PR elements of equal value (or no value indicated)
- Little faculty involvement

To:
- Required
- Core of the process (so defined in the instructions)
- Assessment expectations defined
- Points assigned to PR elements; student learning gets 50% or more
- Broad involvement

Assessment serves improvement and accountability

- A well-functioning assessment effort systematically improves curriculum, pedagogy, and student learning, and this effect is documented.
- At the same time:
  - The presence of an assessment effort is an important input & indicator of quality,
  - The report on the beneficial effects of assessment serves accountability, and
  - Assessment findings support budget requests.

New approaches to PR/assessment

- Create a program portfolio
- Keep program data continuously updated
- Do assessment on an annual cycle
- Enter assessment findings and uses each semester or annually
- For periodic PR, review the portfolio and write a reflective essay on student AND faculty learning