Presentation transcript:
Timothy S. Brophy, Ph.D., Director of Institutional Assessment, University of Florida Office of the Provost, Gainesville, FL. THE BASICS OF STUDENT LEARNING OUTCOMES ASSESSMENT IN SENIOR INSTITUTIONS, SACSCOC SUMMER INSTITUTE, JULY 22, 2013

 Part 1: To introduce, describe, and explain the basic elements of student learning outcomes, their development, and measurement  Part 2: To share a structure for assessment planning and reporting for undergraduate, graduate, and professional programs  Part 3: To review five examples of undergraduate and graduate academic assessment plans, SLOs, measures, results, and use of results TODAY'S GOALS

Size and scope: multiple colleges, undergraduate programs, graduate programs, professional programs. Institutional consistency: outcomes, assessment reporting, cycles, management and tools. Honoring unit autonomy, disciplinary distinctions, and institutional requirements. Faculty comportment. COMMON CHALLENGES

 Research-intensive, AAU-member, comprehensive university  475 academic programs: undergraduate programs, 299 graduate and professional programs, and 54 certificate programs  16 colleges, 13 non-academic units, 4 Senior Vice Presidents  We track a total of 508 assessment units ASSESSMENT AT THE UNIVERSITY OF FLORIDA

 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas (institutional effectiveness):  educational programs, to include student learning outcomes. SACS STANDARD

PART 1: STUDENT LEARNING OUTCOMES

Student Learning Outcomes (SLOs) are defined generally as "what students are expected to know and be able to do by completion of their degree program." Define this for your faculty and ensure that the definition is consistent across campus and clearly posted. BASIC ELEMENT 1: DEFINE AND DISSEMINATE THE TERMS

Undergraduate: Content Knowledge, Critical Thinking, Communication. Graduate: Content Knowledge, Professional Behavior, Skills. BASIC ELEMENT 2: CONSIDER A CATEGORICAL ORGANIZING FRAMEWORK

Student Learning Outcomes reflect the curriculum, the discipline, and faculty expectations; as these elements evolve, learning outcomes change. Recency is the degree to which the outcome reflects current knowledge and practice in the discipline. Relevance is the degree to which the outcome relates logically and significantly to the discipline and the degree. Rigor is the degree of academic precision and thoroughness that the outcome requires to be met successfully. BASIC ELEMENT 3: RECENCY, RELEVANCE, AND RIGOR

 Outputs describe and count what we do and whom we reach, and represent products or services we produce. Processes deliver outputs; what is produced at the end of a process is an output.  An outcome is a level of performance or achievement. It may be associated with a process or its output. Outcomes imply measurement (quantification) of performance. BASIC ELEMENT 4: DISTINGUISH OUTPUTS FROM OUTCOMES

We seek to measure outcomes as well as their associated outputs; however, SLOs focus on outcomes. For example, while we produce a number of new graduates (the output), it is critical that we have a measure of the quality of the graduates as defined by the college or discipline (the outcome). Outcomes describe, in measurable terms, these quality characteristics by defining our expectations for students. OUTCOMES AND OUTPUTS: WHAT IS THE DIFFERENCE?

 Student Learning Outcomes (SLOs) describe what students should know and be able to do as a result of completing an academic program.  Program faculty set targets for their SLOs  Program Goals describe the unit’s expectations for programmatic elements, such as admission criteria, acceptance and graduation rates, etc. – see p. 4 of your handout BASIC ELEMENT 5: DISTINGUISH SLOS AND PROGRAM GOALS

EFFECTIVE SLOS:  Focus on what students will know and be able to do.  All disciplines have a body of core knowledge that students must learn to be successful, as well as a core set of applications of that knowledge in professional settings.  Describe observable and measurable actions or behaviors.  Effective SLOs present a core set of observable, measurable behaviors. Measurement tools vary from exams to complex tasks graded by rubrics.  The key to measurability: an active verb that describes an observable behavior, process, or product.  A framework for developing SLOs: Bloom's Taxonomy (pages 6-9 in your handout) BASIC ELEMENT 6: ENSURE THE OUTCOME IS MEASURABLE

 Understand  An internal process that is indicated by demonstrated behaviors – OK for learning goals but not recommended for program or course SLOs  Appreciate; value  Internal processes that are indicated by demonstrated behaviors closely tied to personal choice or preference; OK if the appreciation or valuing is supported by discipline-specific knowledge  Become familiar with  Focuses assessment on “becoming familiar,” not familiarity  Learn about, think about  Not observable; demonstrable through communication or other demonstration of learning  Become aware of, gain an awareness of  Focuses assessment on becoming and/or gaining – not actual awareness  Demonstrate the ability to  Focuses assessment on ability, not achievement or demonstration of a skill VERBS AND PHRASES THAT COMPLICATE MEASURABILITY
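The list above lends itself to a quick automated screen. Below is a minimal sketch in Python (all names hypothetical, offered as a reviewer's aid rather than any official UF tool) that flags draft SLO statements containing these hard-to-measure verbs and phrases:

```python
import re

# Verbs and phrases from the slide above that complicate measurability.
VAGUE_PHRASES = [
    "understand", "appreciate", "value", "become familiar with",
    "learn about", "think about", "become aware of",
    "gain an awareness of", "demonstrate the ability to",
]

def flag_vague_slos(slo_statements):
    """Return (statement, matched phrases) pairs for draft SLOs
    that contain hard-to-measure verbs or phrases."""
    flagged = []
    for slo in slo_statements:
        hits = [p for p in VAGUE_PHRASES
                if re.search(r"\b" + re.escape(p) + r"\b", slo.lower())]
        if hits:
            flagged.append((slo, hits))
    return flagged

drafts = [
    "Students will understand materials systems.",    # flagged
    "Students will design and conduct experiments.",  # passes
]
for slo, hits in flag_vague_slos(drafts):
    print(f"Revise: {slo!r} (uses: {', '.join(hits)})")
```

A flag is only a prompt to revise; as the slide notes, some of these verbs are acceptable in learning goals even when they are poor choices for program or course SLOs.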

This model connects course-level and program-level SLOs directly to the program learning goals. Course-level Student Learning Outcomes: determined by the faculty; they specify course-level, observable products or demonstrations. Program-level Student Learning Outcomes: describe what students will do to demonstrate they have met the learning goals. Program Learning Goal level: programs establish learning goals for the degree; these goals require multiple actions over time to measure. DEVELOPING MEASURABLE SLOS: A THREE-LEVEL MODEL (CARRIVEAU, 2010)
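To make the three levels concrete, the chain can be pictured as nested data: one learning goal, the program SLOs that evidence it, and the course-level SLOs beneath each. A minimal sketch, assuming a hypothetical dict encoding; the goal and program SLO wording is taken from the MSE slides that follow, while the course-level SLO is invented for illustration:

```python
# Hypothetical encoding of Carriveau's three levels; not UF's actual format.
program_map = {
    "goal": "Understand materials systems and their role in engineering",
    "program_slos": {
        # Program-level SLO (what students do to demonstrate the goal)
        "Design a materials science and engineering system, component or "
        "process to meet desired needs within realistic constraints": [
            # Course-level SLO (observable product; illustrative wording)
            "Produce a capstone design report justifying material selection "
            "against economic, safety, and sustainability constraints",
        ],
    },
}

def unmapped_program_slos(pmap):
    """Program-level SLOs with no course-level SLO beneath them."""
    return [slo for slo, course_slos in pmap["program_slos"].items()
            if not course_slos]

print(unmapped_program_slos(program_map))  # [] when fully mapped
```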

 Direct assessments of student learning are those that provide for direct examination or observation of student knowledge or skills against measurable performance indicators.  Indirect assessments are those that ascertain the opinion or self-report of the extent or value of learning experiences (Rogers, 2011). BASIC ELEMENT 7: BALANCE DIRECT AND INDIRECT ASSESSMENTS

A SAMPLE UNDERGRADUATE PROGRAM AT THE UNIVERSITY OF FLORIDA: MATERIALS SCIENCE AND ENGINEERING

Learning Goals – these are found in the description of the major, in the program mission, or on the program website. Example: Materials Science and Engineering. The major enables you to develop an understanding of materials systems and their role in engineering. Emphasis is placed on the ability to apply knowledge of mathematics, science and engineering principles to materials science and engineering; to design and conduct experiments, as well as to analyze and interpret data; and to design a system, component or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability. LEVEL 1: ESTABLISHING LEARNING GOALS FOR THE DEGREE Source: UF Undergraduate Catalog, engineering.aspx

Content Knowledge: Apply knowledge of mathematics, science and engineering principles to materials science and engineering. Design and conduct materials science and engineering experiments and analyze and interpret the data. Critical Thinking: Design a materials science and engineering system, component or process to meet desired needs within realistic economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability constraints. Communication: Communicate technical data and design information effectively in speech and in writing to other materials engineers. LEVEL 2: PROGRAM STUDENT LEARNING OUTCOMES FOR MSE

Learning Goals: Understand materials systems and their role in engineering. Design a system, component or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability. Student Learning Outcomes: Design a materials science and engineering system, component or process to meet desired needs within realistic economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability constraints. Communicate technical data and design information effectively in speech and in writing to other materials engineers. MSE: CONNECTING GOALS TO OUTCOMES

Learning Goals: Understand materials systems and their role in engineering. Apply knowledge of mathematics, science and engineering principles to materials science and engineering to design and conduct experiments, as well as to analyze and interpret data. Student Learning Outcomes: Apply knowledge of mathematics, science and engineering principles to materials science and engineering. Design and conduct materials science and engineering experiments and analyze and interpret the data. MSE: CONNECTING GOALS TO OUTCOMES

Program SLOs, the courses in which they are assessed, and additional assessments:
Content Knowledge (SLOs #1, #2): EMA3050, EMA3066, EMA4714, EMA3080C, EMA3513C; additional: IRA, senior exit survey.
Critical Thinking (SLO #3): EMA3066, EMA4223, EMA4714; additional: IRA, senior exit survey.
Communication (SLO #4): EMA3080C, EMA3013C, EMA3513C; additional: IRA, senior exit survey.
CONNECTING PROGRAM SLOS TO COURSES: MSE CURRICULUM MAP. Assessments in the boxes marked A in the original map are conducted using specific homework, exam, or assignment questions aligned with that SLO. Source: MSE Academic Assessment Plan
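A curriculum map like this is easy to audit mechanically. The sketch below transcribes the MSE map into a hypothetical dict encoding and checks that every program SLO is assessed in at least one course and by at least one additional assessment:

```python
# Course lists transcribed from the MSE curriculum map above;
# the dict layout itself is a hypothetical convenience.
curriculum_map = {
    "Content Knowledge (SLOs #1, #2)": {
        "courses": ["EMA3050", "EMA3066", "EMA4714", "EMA3080C", "EMA3513C"],
        "additional": ["IRA", "Senior exit survey"],
    },
    "Critical Thinking (SLO #3)": {
        "courses": ["EMA3066", "EMA4223", "EMA4714"],
        "additional": ["IRA", "Senior exit survey"],
    },
    "Communication (SLO #4)": {
        "courses": ["EMA3080C", "EMA3013C", "EMA3513C"],
        "additional": ["IRA", "Senior exit survey"],
    },
}

# Flag any SLO row lacking a course-based or an additional assessment.
for slo, row in curriculum_map.items():
    if not row["courses"] or not row["additional"]:
        print(f"Coverage gap: {slo}")
```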

 These are determined by the faculty who teach the course.  These should directly relate to the program SLOs. LEVEL 3: COURSE-LEVEL SLOS

PART 2: PLANNING AND REPORTING

Academic Assessment Plans provide a common framework for units to plan how they assess and measure student achievement of the SLOs. Plans also present the process for how the data from these assessments are used to enhance the quality of student learning. At UF, 475 programs engage in Academic Assessment planning. ACADEMIC ASSESSMENT PLANNING

Provides faculty a focal point for the discussion of the assessment of student learning in the degree programs. Planning discussions provide an opportunity to revisit the curriculum and its relationship to the SLOs. Provides a consistent reference resource when faculty and leadership change. WHY PLAN?

BASIC ELEMENT 8: DEVELOP A PLANNING TIMELINE/CYCLE. The University of Florida planning cycle: Assessment Plans reside in the units. Institutional Assessment office established; initial Undergraduate Assessment Plans submitted. Initial Graduate and Certificate Assessment Plans, and a second round of Undergraduate Plans, submitted. All Academic Assessment Plans updated annually.

BASIC ELEMENT 9: DEVELOP TEMPLATES AND RUBRICS

BASIC ELEMENT 10: DEVELOP AN APPROVAL AND MANAGEMENT PROCESS. The University of Florida SLO approval process (p. 18):
Program/Department: prepares the submission and submits the request to the approval system.
College: receives the program/department submission, reviews and takes action, and submits to Institutional Assessment.
Academic Assessment Committee: Institutional Assessment review and initial recommendation; Academic Assessment Committee review and recommendation.
University Curriculum Committee: chair review and initial recommendation; committee review and recommendation.
Student Academic Support System: screened for alignment with the catalog; entered into the catalog.

BASIC ELEMENT 11: DEVELOP A SYSTEM OR CYCLE OF ASSESSMENT AND REPORTING. May: Assessment Plans and Effectiveness Documentation Reports submitted for the next AY. October: assessment data, results, and use of results for the previous AY reported. Assessment and institutional effectiveness data reporting cycle: Establish Mission, Goals, and Outcomes → Assessment Planning → Implement the Plan and Gather Data → Interpret and Evaluate the Data → Modify and Improve.

Figure 1 – Assessment Plan Review Rubrics – pp.
Figure 2 – New SLO/Academic Assessment Plan Submission form – p. 16
Figure 3 – SLO/Academic Assessment Plan Revision form – p. 16
MANAGEMENT TOOLS

Assessment reporting: Outcome → Measure(s) → Data → Use of results → Program modifications. REPORTING ASSESSMENT RESULTS

BASIC ELEMENT 12: DEVELOP A QUALITY ASSURANCE PROCESS

Multi-step, institutional review and approval process. Templates and rubrics for guiding faculty through the process. Review and evaluate faculty submissions. Cross-reference plans with data reported annually. Develop and provide professional development. Model the process: modify and improve quality assurance processes based on the data you collect. ELEMENTS OF QUALITY ASSURANCE

In 2012, 110 of the undergraduate plans were reviewed using a rubric (see page 13 in your handout). Faculty were asked to revise particular sections of their plans. Revision citations by section: Mission Alignment, 77%; Student Learning Outcomes, 68%; Curriculum Map, 63%; Assessment Cycle, 69%; Methods and Procedures, 83%. SAMPLE OF ACADEMIC ASSESSMENT PLAN REVIEW
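The percentages translate back into rough plan counts. A back-of-envelope sketch, assuming all five sections were scored across the same 110 reviewed plans:

```python
# Revision-citation rates from the 2012 review of 110 undergraduate plans.
reviewed = 110
rates = {
    "Mission Alignment": 0.77,
    "Student Learning Outcomes": 0.68,
    "Curriculum Map": 0.63,
    "Assessment Cycle": 0.69,
    "Methods and Procedures": 0.83,
}
for section, rate in rates.items():
    # e.g., Mission Alignment: ~85 of 110 plans asked to revise
    print(f"{section}: ~{round(rate * reviewed)} of {reviewed} plans")
```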

We evaluated the undergraduate assessment reports (see page 11 in your handout). Question: Why these findings? SAMPLE OF ASSESSMENT REPORT REVIEW

SLOs: Cross-check to ensure these are consistent in the Academic Assessment Plan, the catalog, and the online reporting system. Update SLOs using the appropriate forms if needed.
Assessment Method: List the assignment, exam, project, etc. If this is a sample, describe the sampling procedure used.
Results: Enter the criterion for success, and if the criterion is less than 70%, provide a rationale. "X number of students passed the assessment out of a total of Y students, for a percentage of Z%. This meets/does not meet the criterion for success." Attach the data you shared with your faculty (student names redacted). NOTE: Please have raw data available in case it is requested.
Use of Results: State who reviewed the results. Refer to the results that were reviewed. State actions taken in past tense.
A SAMPLE REPORTING TEMPLATE (PAGE 12)
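The results sentence in this template is formulaic enough to generate automatically. A minimal sketch; the 70% figure is the rationale threshold from the slide, while the function name and default are otherwise hypothetical:

```python
def results_statement(passed, total, criterion=0.70):
    """Render the template sentence from the slide above. Per the
    slide, a criterion below 70% also requires a written rationale."""
    pct = 100 * passed / total
    verdict = "meets" if pct >= criterion * 100 else "does not meet"
    return (f"{passed} students passed the assessment out of a total of "
            f"{total} students, for a percentage of {pct:.0f}%. "
            f"This {verdict} the criterion for success.")

print(results_statement(42, 50))  # 84% meets the default 70% criterion
```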

1. Define the terms and disseminate them
2. Consider an institutional categorical organizing framework for SLOs
3. Recency, relevance, and rigor
4. Distinguish outputs from outcomes
5. Distinguish SLOs from program goals
6. Ensure the outcome is measurable
7. Balance direct and indirect assessments
8. Develop a planning timeline/cycle
9. Develop templates and rubrics
10. Develop an approval and management process
11. Develop a system or cycle of assessment and reporting
12. Develop a quality assurance process
BASIC ELEMENTS: A SUMMARY

Timothy S. Brophy, Director of Institutional Assessment, University of Florida Office of the Provost. QUESTIONS