
1 Timothy S. Brophy, Ph.D., Director of Institutional Assessment, University of Florida Office of the Provost, Gainesville, FL. THE BASICS OF STUDENT LEARNING OUTCOMES ASSESSMENT IN SENIOR INSTITUTIONS. SACSCOC SUMMER INSTITUTE, JULY 22, 2013

2 Part 1: To introduce, describe, and explain the basic elements of student learning outcomes, their development, and measurement
Part 2: To share a structure for assessment planning and reporting for undergraduate, graduate, and professional programs
Part 3: Review five examples of undergraduate and graduate academic assessment plans, SLOs, measures, results, and use of results
TODAY'S GOALS

3 Size and scope: multiple colleges, undergraduate programs, graduate programs, professional programs
Institutional consistency: outcomes, assessment reporting, cycles, management and tools
Honoring unit autonomy, disciplinary distinctions, and institutional requirements
Faculty comportment
COMMON CHALLENGES

4 Research intensive, AAU member, comprehensive university
475 academic programs: 122 undergraduate programs, 299 graduate and professional programs, 54 certificate programs
16 colleges, 13 non-academic units, 4 Senior Vice Presidents
We track a total of 508 assessment units
ASSESSMENT AT THE UNIVERSITY OF FLORIDA

5 3.3.1 - The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional effectiveness)
3.3.1.1 Educational programs, to include student learning outcomes
SACS STANDARD 3.3.1.1

6 PART 1: STUDENT LEARNING OUTCOMES

7 Student Learning Outcomes (SLOs) are defined generally as "what students are expected to know and be able to do by completion of their degree program."
Basic Element 1: Define this for your faculty and ensure that this definition is consistent across campus and clearly posted.
BASIC ELEMENT 1: DEFINE AND DISSEMINATE THE TERMS

8 Undergraduate: Content Knowledge, Critical Thinking, Communication
Graduate: Content Knowledge, Professional Behavior, Skills
BASIC ELEMENT 2: CONSIDER A CATEGORICAL ORGANIZING FRAMEWORK

9 Student Learning Outcomes reflect the curriculum, the discipline, and faculty expectations; as these elements evolve, learning outcomes change.
Recency is the degree to which the outcome reflects current knowledge and practice in the discipline.
Relevance is the degree to which the outcome relates logically and significantly to the discipline and the degree.
Rigor is the degree of academic precision and thoroughness that the outcome requires to be met successfully.
BASIC ELEMENT 3: RECENCY, RELEVANCE, AND RIGOR

10 Outputs describe and count what we do and whom we reach, and represent products or services we produce. Processes deliver outputs; what is produced at the end of a process is an output.
An outcome is a level of performance or achievement. It may be associated with a process or its output. Outcomes imply measurement (quantification) of performance.
BASIC ELEMENT 4: DISTINGUISH OUTPUTS FROM OUTCOMES

11 We seek to measure outcomes as well as their associated outputs; however, SLOs focus on outcomes. For example, while we produce a number of new graduates (the output), it is critical that we have a measure of the quality of the graduates as defined by the college or discipline (the outcome). Outcomes describe, in measurable terms, these quality characteristics by defining our expectations for students. OUTCOMES AND OUTPUTS: WHAT IS THE DIFFERENCE?

12 Student Learning Outcomes (SLOs) describe what students should know and be able to do as a result of completing an academic program.
Program faculty set targets for their SLOs.
Program Goals describe the unit's expectations for programmatic elements, such as admission criteria, acceptance and graduation rates, etc. (see p. 4 of your handout).
BASIC ELEMENT 5: DISTINGUISH SLOS AND PROGRAM GOALS

13 EFFECTIVE SLOS:
Focus on what students will know and be able to do. All disciplines have a body of core knowledge that students must learn to be successful, as well as a core set of applications of that knowledge in professional settings.
Describe observable and measurable actions or behaviors. Effective SLOs present a core set of observable, measurable behaviors. Measurement tools vary from exams to complex tasks graded by rubrics.
The key to measurability: an active verb that describes an observable behavior, process, or product.
A framework for developing SLOs: Bloom's Taxonomy (pages 6-9 in your handout).
BASIC ELEMENT 6: ENSURE THE OUTCOME IS MEASURABLE

14 Understand: an internal process that is indicated by demonstrated behaviors; OK for learning goals but not recommended for program or course SLOs.
Appreciate; value: internal processes that are indicated by demonstrated behaviors closely tied to personal choice or preference; OK if the appreciation or valuing is supported by discipline-specific knowledge.
Become familiar with: focuses assessment on "becoming familiar," not familiarity.
Learn about, think about: not observable; demonstrable through communication or other demonstration of learning.
Become aware of, gain an awareness of: focuses assessment on becoming and/or gaining, not actual awareness.
Demonstrate the ability to: focuses assessment on ability, not achievement or demonstration of a skill.
VERBS AND PHRASES THAT COMPLICATE MEASURABILITY

15 This model connects course-level and program-level SLOs directly to the program learning goals.
Course-level Student Learning Outcome: these are determined by the faculty and specify course-level, observable products or demonstrations.
Program-level Student Learning Outcome: these describe what students will do to demonstrate they have met the learning goals.
Program Learning Goal level: programs establish learning goals for the degree; these goals require multiple actions over time to measure.
DEVELOPING MEASURABLE SLOS: A THREE-LEVEL MODEL (CARRIVEAU, 2010)

16 Direct assessments of student learning are those that provide for direct examination or observation of student knowledge or skills against measurable performance indicators.
Indirect assessments are those that ascertain the opinion or self-report of the extent or value of learning experiences (Rogers, 2011).
BASIC ELEMENT 7: BALANCE DIRECT AND INDIRECT ASSESSMENTS

17 A SAMPLE UNDERGRADUATE PROGRAM AT THE UNIVERSITY OF FLORIDA: MATERIALS SCIENCE AND ENGINEERING

18 Learning Goals: these are found in the description of the major, in the program mission, or on the program website.
Example: Materials Science and Engineering. The major enables you to develop an understanding of materials systems and their role in engineering. Emphasis is placed on the ability to apply knowledge of mathematics, science and engineering principles to materials science and engineering; to design and conduct experiments, as well as to analyze and interpret data; and to design a system, component or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability.
LEVEL 1: ESTABLISHING LEARNING GOALS FOR THE DEGREE
Source: 2013-14 UF Undergraduate Catalog, https://catalog.ufl.edu/ugrad/current/engineering/alc/materials-science-and-engineering.aspx

19 Content Knowledge: Apply knowledge of mathematics, science and engineering principles to materials science and engineering. Design and conduct materials science and engineering experiments and analyze and interpret the data.
Critical Thinking: Design a materials science and engineering system, component or process to meet desired needs within realistic economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability constraints.
Communication: Communicate technical data and design information effectively in speech and in writing to other materials engineers.
LEVEL 2: PROGRAM STUDENT LEARNING OUTCOMES FOR MSE

20 Learning Goals:
Understand materials systems and their role in engineering
Design a system, component or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability
Student Learning Outcomes:
Design a materials science and engineering system, component or process to meet desired needs within realistic economic, environmental, social, political, ethical, health and safety, manufacturability and sustainability constraints
Communicate technical data and design information effectively in speech and in writing to other materials engineers
MSE: CONNECTING GOALS TO OUTCOMES

21 ALC Learning Goals:
Understand materials systems and their role in engineering
Apply knowledge of mathematics, science and engineering principles to materials science and engineering to design and conduct experiments, as well as to analyze and interpret data
Student Learning Outcomes:
Apply knowledge of mathematics, science and engineering principles to materials science and engineering
Design and conduct materials science and engineering experiments and analyze and interpret the data
MSE: CONNECTING GOALS TO OUTCOMES

22 Content Knowledge (SLOs #1 and #2): EMA3050, EMA3066, EMA4714, EMA3080C, EMA3513C, EMA4714; additional assessments: IRA, Senior exit survey
Critical Thinking (SLO #3): EMA3066, EMA4223, EMA4714; additional assessments: IRA, Senior exit survey
Communication (SLO #4): EMA3080C, EMA3013C, EMA3513C; additional assessments: IRA, Senior exit survey
Assessments in the boxes marked A on the map are conducted using specific homework, exam, or assignment questions aligned with that SLO.
Source: 2012-13 MSE Academic Assessment Plan
CONNECTING PROGRAM SLOS TO COURSES: MSE CURRICULUM MAP
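To make the structure of a curriculum map concrete, here is a minimal sketch of the mapping as a data structure with a simple coverage check. The representation and the check are illustrative assumptions, not part of the MSE plan; only the SLO categories, course numbers, and additional assessments are drawn from the map above.

```python
# Illustrative sketch: a curriculum map as SLO -> courses, with a coverage check.
# Structure and check are assumptions; labels are taken from the MSE map above.

curriculum_map = {
    "Content Knowledge (SLOs #1, #2)": ["EMA3050", "EMA3066", "EMA4714", "EMA3080C", "EMA3513C"],
    "Critical Thinking (SLO #3)": ["EMA3066", "EMA4223", "EMA4714"],
    "Communication (SLO #4)": ["EMA3080C", "EMA3013C", "EMA3513C"],
}
additional_assessments = ["IRA", "Senior exit survey"]  # applied to every SLO category

# Every SLO category should be assessed in at least one course.
for slo, courses in curriculum_map.items():
    assert courses, f"{slo} is not assessed in any course"
    print(f"{slo}: {', '.join(courses)} + {', '.join(additional_assessments)}")
```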

23 These are determined by the faculty who teach the course.
These should directly relate to the program SLOs.
LEVEL 3: COURSE-LEVEL SLOS

24 PART 2: PLANNING AND REPORTING

25 Academic Assessment Plans provide a common framework for units to plan how they assess and measure student achievement of the SLOs. Plans also present the process for how the data from these assessments are used to enhance the quality of student learning. At UF, 475 programs engage in Academic Assessment planning. ACADEMIC ASSESSMENT PLANNING

26 Provides faculty a focal point for the discussion of the assessment of student learning in the degree programs. Planning discussions provide an opportunity to revisit the curriculum and its relationship to the SLOs. Provides a consistent reference resource when faculty and leadership change. WHY PLAN?

27 BASIC ELEMENT 8: DEVELOP A PLANNING TIMELINE/CYCLE
The University of Florida Planning Cycle:
2003-2011: Assessment Plans reside in the units
2011-12: Institutional Assessment office established; initial Undergraduate Assessment Plans submitted
2012-13: Initial Graduate and Certificate Assessment Plans, and second round of Undergraduate Plans, submitted
2013-14: All Academic Assessment Plans updated annually

28 BASIC ELEMENT 9: DEVELOP TEMPLATES AND RUBRICS

29 BASIC ELEMENT 10: DEVELOP AN APPROVAL AND MANAGEMENT PROCESS
The University of Florida SLO Approval Process (p. 18):
Program/Department: prepares the submission; submits the request to the approval system
College: receives the program/department submission; reviews and takes action; submits to Institutional Assessment
Academic Assessment Committee: Institutional Assessment review and initial recommendation; Academic Assessment Committee review and recommendation
University Curriculum Committee: Chair review and initial recommendation; University Curriculum Committee review and recommendation
Student Academic Support System: screened for alignment with the catalog; entered into catalog

30 BASIC ELEMENT 11: DEVELOP A SYSTEM OR CYCLE OF ASSESSMENT AND REPORTING
May: Assessment Plans and Effectiveness Documentation Reports submitted for the next AY
October: Assessment data, results, and use of results for the previous AY reported
Assessment and Institutional Effectiveness Data Reporting cycle: Establish Mission, Goals, and Outcomes; Assessment Planning; Implement the Plan and Gather Data; Interpret and Evaluate the Data; Modify and Improve

31 Figure 1: Assessment Plan Review Rubrics (pp. 13-15)
Figure 2: New SLO/Academic Assessment Plan Submission form (p. 16)
Figure 3: SLO/Academic Assessment Plan Revision form (p. 16)
MANAGEMENT TOOLS

32 Assessment Reporting: Outcome; Measure(s); Data; Use of results; Program modifications
REPORTING ASSESSMENT RESULTS

33 BASIC ELEMENT 12: DEVELOP A QUALITY ASSURANCE PROCESS

34 Multi-step, institutional review and approval process
Templates and rubrics for guiding faculty through the process
Review and evaluate faculty submissions
Cross-reference plans with data reported annually
Develop and provide professional development
Model the process: modify and improve quality assurance processes based on the data you collect
ELEMENTS OF QUALITY ASSURANCE

35 In 2012, 110 of the 2011-12 undergraduate plans were reviewed using a rubric (see page 13 in your handout). Faculty were asked to revise particular sections of their plans.
Revision citations by section: Mission Alignment 77%; Student Learning Outcomes 68%; Curriculum Map 63%; Assessment Cycle 69%; Methods and Procedures 83%
SAMPLE OF ACADEMIC ASSESSMENT PLAN REVIEW

36 We evaluated the undergraduate 2011-12 assessment reports (see page 11 in your handout). Question: Why these findings? SAMPLE OF ASSESSMENT REPORT REVIEW

37 SLOs: Cross-check to ensure these are consistent in the Academic Assessment Plan, the catalog, and the online reporting system. Update SLOs using the appropriate forms if needed.
Assessment Method: List the assignment, exam, project, etc. If this is a sample, describe the sampling procedure used.
Results: Enter the criterion for success, and if the criterion is less than 70%, provide a rationale. "X number of students passed the assessment out of a total of Y students, for a percentage of Z%." This meets/does not meet the criterion for success. Attach the data you shared with your faculty (student names redacted). NOTE: Please have raw data available in case it is requested.
Use of Results: State who reviewed the results. Refer to the results that were reviewed. State actions taken in past tense.
A SAMPLE REPORTING TEMPLATE (PAGE 12)
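The Results arithmetic in this template (X of Y students passing, the percentage Z, and the comparison with the criterion for success) can be sketched as a small calculation. This is a minimal, hypothetical illustration and not part of the UF template or reporting system; the function and parameter names are assumptions.

```python
# Minimal sketch of the pass-rate arithmetic described in the Results section above.
# Hypothetical helper; names and the 70% default criterion are illustrative only.

def results_statement(passed: int, total: int, criterion_pct: float = 70.0) -> str:
    """Compose the 'X of Y students passed, for Z%' sentence and the criterion check."""
    if total <= 0:
        raise ValueError("total must be a positive number of assessed students")
    pct = 100.0 * passed / total
    verdict = "meets" if pct >= criterion_pct else "does not meet"
    return (
        f"{passed} students passed the assessment out of a total of {total} students, "
        f"for a percentage of {pct:.1f}%. This {verdict} the criterion for success "
        f"({criterion_pct:.0f}%)."
    )

print(results_statement(passed=42, total=50))  # 84.0%, meets a 70% criterion
```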

38 1. Define the terms and disseminate them
2. Consider an institutional categorical organizing framework for SLOs
3. Recency, Relevance, and Rigor
4. Distinguish Outputs from Outcomes
5. Distinguish SLOs from Program Goals
6. Ensure the outcome is measurable
7. Balance direct and indirect assessments
8. Planning Timeline/Cycle
9. Templates and Rubrics
10. Approval and Management Process
11. A system or cycle of assessment and reporting
12. Quality Assurance Process
BASIC ELEMENTS: A SUMMARY

39 Timothy S. Brophy, Director of Institutional Assessment, University of Florida Office of the Provost. tbrophy@aa.ufl.edu, 352-273-4476. QUESTIONS

