Assessing the Work of Higher Education: Institutional Effectiveness and Student Learning
Dr. Jo Allen, Senior Vice President & Provost, Widener University


Assessing the Work of Higher Education: Institutional Effectiveness and Student Learning
Dr. Jo Allen, Senior Vice President & Provost, Widener University
Middle States Commission on Higher Education, April 2008

Overview of Presentation
– Operational terms
– Drivers of assessment
– Assessment of institutional effectiveness
– Assessment of student learning outcomes
– Blending assessments
– Benefits and cautions
– Questions and concerns

Assessment: An Operational Definition
Assessment is the process of asking and answering questions that seek to align our stated intentions with documentable realities. As such, in higher education, it deals with courses, programs, policies, procedures, and operations.

Evaluation: An Operational Definition
Evaluation focuses on individual performance in the sense of job completion and quality, typically resulting in merit raises, plans for future improvement, or, in less satisfying cases, probation and possibly firing.

Assessment vs. Evaluation
Assessment focuses on the work to be done, the outcomes, and the impact on others, not on the individuals doing the work. Evaluation focuses on the work of the individuals: their contributions, effectiveness, creativity, responsibility, engagement, or whatever factors the organization deems most desirable.

Assessment of Institutional Effectiveness vs. Student Learning
Institutional effectiveness = the results of operational processes, policies, duties, and sites (and their success in working together) to support the management of the academy
Student learning = the results of curricular and co-curricular experiences designed to provide students with knowledge and skills

What or who is driving assessment? Accreditors…
– charged with distinguishing reputable from non-reputable institutions and programs
– charged with checking on practices that affect the viability and sustainability of the institution and its offerings
– represent disciplinary and institutional interests

Assessment drivers (cont'd.)
– The public: Ivory Tower, liberal bias, ratings/rankings
– Legislators: responsive to citizens' concerns about quality, costs, biases… or?
– Prospective faculty: quality and meaningful contributions to students' lives
– Prospective parents: real learning and preparation for careers
– Prospective students: How will I measure up? And what kind of job can I get when I graduate?
– Funding agencies/foundations: evidence of commitment to learning and knowledge, and evidence of [prior] success

Higher Education Realities
– Competitive nature of higher education: national rankings, institutional research and data, marketing, niche markets
– Tuition costs
– Consumer attitudes of students: learning outcomes and institutional effectiveness

Matters of Institutional Quality
– Can we justify the costs/prices of attendance?
– Can we verify the quality of our educational offerings in measurable terms?
– Can we verify the effectiveness of operational contributors to a sustainable educational experience?
– Can we use data and other findings to improve the quality of our educational and operational offerings?
– Can we use those findings to align resources (financial, staff, curricular, co-curricular) to enhance desired outcomes?

Sites of Institutional Effectiveness: Processes [existence and transparency]
– Enrollment: admissions, financial aid, registration
– Curricular: advising, progress toward degree completion
– Budgeting: operations/salaries; capital; bond ratings and ratios; endowment management; benefits; etc.
– Planning: strategic planning, compact planning, curricular planning, etc.
– Judicial: education/training, communication, sanctions, etc.
– Residence Life: housing selection, training for RAs, conflict resolution/mediation

Sites of Institutional Effectiveness: Units/Offices of Operations
– Advancement
– Admissions, Bursar, Registrar
– Center for Advising, Academic Support, etc.
– Campus Safety
– Maintenance
– IT
– Institutional Research
– Athletics

The Assessment Cycle: Key Questions for Institutional Effectiveness
– What services, programs, or benefits should our offices provide?
– For what purposes or with what intended results?
– What evidence do we have that they provide these outcomes?
– How can we use that information to improve or celebrate successes?
– Do the improvements we make work?

What are we looking for? EXAMPLES of evidence:
– Admissions of students for whom our institution is their first choice have risen 30%.
– 95% of students report satisfaction with the housing selection process.
– 5 faculty committees participated in the last planning cycle.
– Overall, faculty, staff, and students report feeling safe on campus following the new Campus Safety Improvement initiatives.

Where do we seek improvement [and what evidence will help us]?
– We need to raise the number of students who choose our institution as their first choice to 95% by …
– All faculty committees will be invited to participate in the next planning cycle.
– Students (39%) still report feeling unsafe in the mezzanine of the University Gallery. We will …

The Iterative Assessment Cycle for Institutional Effectiveness
Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs, services, or benefits; contribute to institutional experience; inform institutional decision-making, planning, budgeting, policy, and public accountability → (and back to Mission/Purposes)

What qualities point to institutional effectiveness?
– A well-articulated set of processes for critical functions
– A clear line of responsibility and accountability for critical functions
– An alignment between the importance of the function and the resources (staff, budget, training, etc.) allocated to support it
– Evidence of institution-wide knowledge of those critical functions, processes, and lines of responsibility

What kinds of evidence point to institutional effectiveness?
– Well-managed budgets
– Accreditation and governmental compliance
– Clearly defined and supported shared governance (board, president, administration, faculty, staff, and students)
– Communication pathways and strategies [transparency]
– Consensus on mission, strategic plan, goals, priorities, etc.
– Satisfaction among students and other constituencies

How do we measure institutional effectiveness? Tangible evidence:
– Audited budget statements, handbooks, enrollment data, institutional data
– Records/reports of activities and/or compliance
– Self-studies pointing to documented evidence
– Surveys of satisfaction, usage, attitudes, confidence, etc.
– Disciplinary accreditation reports

The Assessment Cycle: Key Questions for Student Learning
– What should our students know or be able to do by the time they graduate?
– What evidence do we have that they know and can do these things?
– How can we use that information to improve or celebrate successes?
– Do the improvements we make work?

The Iterative Assessment Cycle
Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, and public accountability → (and back to Mission/Purposes)

Student Learning Assessment: What should students know or be able to demonstrate by the time they graduate?
Civic engagement, diversity appreciation, communication skills, professional responsibility, ethics, critical thinking, collaborative learning, leadership, mathematical or quantitative competence, technological competence, scientific competence, research skills, cultural competence, interdisciplinary competence, civic responsibility, global competence, economic/financial competence, social justice

What might our sources of evidence be?
Essays/theses, portfolios (evaluated by faculty or external readers), quizzes, oral presentations, homework assignments, lab experiments, tests, journal entries, projects, demonstrations

What are we looking for? Evidence of students' skill level (basic competency to mastery)
– based on faculty-articulated standards of quality and judgments
– applied evenly to all students' work
– indicative of aggregate evaluations of performance or knowledge
– informative for course or program improvements

Can we use the same processes and strategies to assess both arenas?
– Measuring learning versus effectiveness, efficiency, and/or satisfaction: BEYOND ANECDOTAL INTO EVIDENCE
– Methods of testing, projects, and demonstrations versus surveys, records, and reports: QUALIFY OR QUANTIFY THE OUTCOMES
– Use of results (revisions versus training): MODIFY WHAT YOU DO TO AFFECT OUTCOMES

What is similar?
– A commitment to doing the very best job possible under whatever conditions exist
– A commitment to recognizing ways that altering those conditions can affect the outcomes
– A commitment to recognizing that altering the outcomes can affect the conditions

Ultimately…
We hold ourselves and our colleagues accountable for articulating the intentions of our work and then measuring the realities, which leads to designing and implementing strategies for improvement over time. How are we doing? How can we do better?

COMMITMENT CONSISTENCY CULTURE CARE

QUESTIONS? Comments?