
1 Student Learning Outcome Assessment: A Program's Perspective
Ling Hwey Jeng, Director, School of Library and Information Studies
Ljeng@twu.edu
June 27, 2011

2 Operational Assumptions
Every program is unique, but it is still possible to have a common process for quality assessment.
Data used to inform decisions must be open, consistent, and continuous.
Evidence of student academic attainment must be explicit, clear, and understandable.
No assessment can be good without direct measures of student learning outcomes.

3 Accreditation
Paradigm shift – from "what faculty teach" to "what students learn"
Focus shift – from structure ("input, process, output") to outcomes

4 The Anatomy of Accreditation
Input (e.g., enrollment, faculty recruitment, facilities)
Process (e.g., curriculum, services, advising)
Output (e.g., grades, graduation, placement)
Outcome – goal-oriented planning, support, and teaching, as shown in evidence of learning

5 Example: Structure and Outcomes
Students – Input: backgrounds, enrollment; Process: student services and programs; Outputs: grades, graduation, placement; Outcomes: skills gained, attitude changes
Faculty – Input: backgrounds, recruitment; Process: teaching assignments, class size; Outputs: work units, publications, conference presentations; Outcomes: citations, impacts
Program – Input: history, budget, resources allocated; Process: policies, procedures, governance; Outputs: participation rates, resource utilization; Outcomes: program objectives achieved

6 Example: Output v. Outcome
Course grades (output) > skills learned (outcome)
Faculty publications (output) > citation impacts (outcome)
Enrollment growth (output) > objectives achieved (outcome)

7 Two Levels of Assessment
Program-level assessment
Course-level assessment

8 Implicit in the COA Standards is the expectation that assessment is:
Explicit and in writing
Integrated in the program's planning process
Done at both the program level and the course level
Accessible to those affected by the assessment
Implemented and used as feedback by the program

9 The Guiding Principles of Assessment
Backward planning – start with where we want to end
Triangulation – multiple measures, both direct and indirect
Gap analysis – inventory of what has been done

10 Steps for Assessment
Identify standards, sources of evidence, and constituent inputs
Define student learning objectives
Develop outcome measures – direct and indirect measures (see the sketch below)
Collect and analyze data – methods, frequency, patterns
Review and use data as feedback – impacts on decisions made
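One way to keep the "direct and indirect measures" step honest is to record both kinds of measure next to each objective so that triangulation gaps surface early. The following is a minimal Python sketch of that idea; the objective names and measures are hypothetical illustrations, not taken from the presentation.

```python
# Minimal sketch: store each student learning objective with its direct and
# indirect measures, then flag objectives that cannot yet be triangulated.
# Objective names and measures are hypothetical examples.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Objective:
    name: str
    direct_measures: List[str] = field(default_factory=list)    # e.g., portfolios, juried projects
    indirect_measures: List[str] = field(default_factory=list)  # e.g., surveys, placement data

plan = [
    Objective("Organize information resources",
              direct_measures=["capstone portfolio rubric"],
              indirect_measures=["alumni survey item 4"]),
    Objective("Evaluate reference services",
              direct_measures=["case study rubric"],
              indirect_measures=[]),  # gap: no indirect measure yet
]

for obj in plan:
    if not (obj.direct_measures and obj.indirect_measures):
        print(f"Triangulation gap: '{obj.name}' needs both direct and indirect measures")
```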

11 Faculty Expectations as the Basis
The really important things faculty think students should know, believe, or be able to do when they receive their degrees.
Approaches to establishing faculty expectations:
Top down – use external standards to define faculty expectations
Bottom up – identify recurring faculty expectations among courses, and use the list to develop overarching program-level expectations

12 Example: A Bottom-Up Approach
Take all course syllabi
Examine what expectations (i.e., course objectives) are included in individual courses
Make a list of recurring ones as the basis for program-level expectations
Ask what else needs to be addressed at the program level
State the expectations in program objectives
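To make the syllabus-mining step concrete, here is a small Python sketch: tally how often each stated course objective recurs across syllabi and surface the recurring ones as candidates for program-level expectations. The course numbers and objective wording are invented for illustration.

```python
# Minimal sketch of the bottom-up approach: count course objectives across
# syllabi and list recurring ones as candidate program-level expectations.
# Course numbers and objective statements are hypothetical.

from collections import Counter

# Assume each syllabus has already been reduced to its stated objectives.
syllabi = {
    "LS 5013": ["organize information", "evaluate sources", "serve diverse users"],
    "LS 5043": ["organize information", "manage collections"],
    "LS 5633": ["evaluate sources", "organize information", "apply technology"],
}

counts = Counter(obj for objectives in syllabi.values() for obj in objectives)

# Objectives that recur in two or more courses become candidates for
# program-level expectations; the rest stay course-specific.
candidates = [obj for obj, n in counts.most_common() if n >= 2]
print("Candidate program-level expectations:", candidates)
```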

13 Activities for Direct Measures (examples)
Written exams, oral exams, performance assessments, standardized testing, licensure exams, oral presentations, projects, demonstrations, case studies, simulations, portfolios, juried activities with an outside panel

14 Activities for Indirect Measures (examples only)
Questionnaire surveys, interviews, focus groups, employer satisfaction studies, advisory board input, job/placement data

15 Demonstration of Assessment
Program objectives aligned with mission and goals
Multiple inputs in developing program objectives (both constituents and disciplinary standards)
Program objectives stated in terms of student learning outcomes

16 Demonstration of Assessment (cont.)
Student learning outcome assessment addressed at both the course level and the program level
Triangulation with both direct and indirect measures
A formal, systematic process to integrate results of assessment into continuous planning

17 Words for Thought
Assessment of input and process (i.e., structure) only determines capacity; it does not determine what students learn.
Don't confuse "better teaching" with "better learning." One is the means and the other is the outcome.
Everything we do in the classroom is about something outside the classroom.
It's what the learners do that determines what and how much is learned.
If I taught something and no one learned it, does it count?

18 A Program Director's Perspective
Map the objectives to professional standards
Make visible the invisible expectations
Make sure what we measure is what we value
Begin with what could be agreed upon
Include both program measures and course-embedded measures
Make use of assessment in grading
Harness the accreditation process to make it happen

