
1 Institutional Effectiveness USF System Office of Decision Support
College of the Arts Assessment Feedback, Spring 2018
Christopher Combie, Ph.D. and Marvin Moore, Ph.D.

2 Introductions
Institutional Effectiveness, System Office of Decision Support
Christopher Combie, Ph.D., Assistant Director for Assessment, Institutional Effectiveness
Programs conduct internal assessments
Plans/reports are submitted via the SAM database
Reviewed, analyzed, and compiled by IE
Used in reports for Deans, the Provost, the BOG, and SACSCOC
CoTA Faculty Workshop, April 18, 2018

3 Compliance vs. Quality
Compliance: enforced by IE to ensure compliance with SACSCOC and BOG requirements
Quality: fostered by ATLE to make assessment more meaningful

4 Today's Outline
Academic assessment plans and reports
Components and requirements of an assessment plan
CoTA assessment feedback
Questions

5 Academic Assessment Plans & Reports
Annual report for each major and certificate program
Program goals and student learning outcomes
Plans for future program improvement
Formative assessment of the major or certificate program, not summative assessment of students
Non-comprehensive
Non-linear
Program-centric
Looking for opportunities for improvement
Continuous

6 Components of an Assessment Plan Schema
Mission
Program Goals
Student Learning Outcome
Student Learning Outcome Statement
Method of Assessment
Performance Targets
Assessment Results
Use of Assessment Results

7 Program Goals vs. Student Learning Outcomes
Program Goals
Not always measurable
Broad statements
Skills, content, knowledge, or tasks students will have
Example: "Development of critical thinking skills"
Undergraduate BOG requirements (ALC): content-specific knowledge, critical thinking skills, communication skills
Graduate degrees & certificates: no ALC requirement
Student Learning Outcomes
A demonstrable skill, a measurable change in behavior
Specific statements
What students will know and demonstrate after the course(s)
Example: "Majors in this program will be able to appropriately conduct quantitative methods of original research in this discipline."
Includes five detailed parts

8 3-Year Academic Program Assessment Rubric Overview
Year 1: Develop and pilot test the assessment plan. Can assessment results be meaningfully interpreted? If no, revise the assessment plan.
Year 2: Implement the assessment plan, interpret assessment results from years 1 and 2, and develop an action plan for improving curriculum or instruction.
Year 3: Document implementation of the curricular or pedagogical changes outlined in the action plan for improvement; begin development of the next 3-year plan.
SACSCOC requires that we provide evidence of seeking improvement based on analysis of assessment results.

9 Academic Assessment Reporting
Academic Reporting Cycle Dates (Note: for all colleges except Education and Business)

Plan Year | Plan Due Date | Plan Reviewed* | Final Report Date | Report Reviewed*
2018      | August 31     | September 14   | December 15       | January 9, 2019
2019      | January 31    | February 14    |                   | January 9, 2020
2020      |               |                |                   | January 9, 2021

*Only plans submitted prior to the due date

10 Overall Plan Rating Scale*
If a plan receives a yellow or red overall rating, we will mark it as "non-compliant"
Programs will be notified and asked to make changes to sections that still need clarification or modification
When these adjustments have been made, the "Plan complete?" button should be changed to "Yes" or "Submit Plan"
*Disclaimer: These changes are in beta testing and may differ in the live production environment.

11 We are fundamentally ill-prepared to lead (most of us were, and are, professors at heart). We know how to generate new knowledge and communicate it effectively to our students, and how to approach complex problem solving through our research, but…

12 Method of Assessment Section
Should be the most detailed section
Less flexible, with several requirements for what must be included
Examples of student work to assess:
Written student work
Student presentation/performance
Portfolio of student work
Part of a laboratory report
Embedded test questions
Part of a thesis or dissertation
Standardized tests
Internship/practicum evaluation

13 Method of Assessment: Requirements
A clear description of the assessment method
A statement detailing how the assessment specifically measures the task, information, or competency stated in the student learning outcome
Context of the assessment
How the assessment will be scored
Which students in the program will be assessed

14 Feedback: Method of Assessment
Provide rubric details:
How was it developed?
What does it measure?
What is the rating scale?
How will inter-rater reliability be addressed? (Typical methods include averaging scores, using a third rater, or forced agreement via discussion.)
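To make the inter-rater reliability item concrete, here is a minimal sketch, assuming two raters score each student on a 1-5 rubric: small gaps are averaged, while larger gaps are flagged for resolution by discussion or a third rater. The student IDs, scores, and threshold are hypothetical and not part of the original slides.

```python
# Hypothetical sketch: combining two raters' rubric scores (1-5 scale).
# Small gaps are averaged; larger gaps are flagged for discussion or a third rater.
rater_a = {"student01": 4, "student02": 3, "student03": 5}
rater_b = {"student01": 4, "student02": 5, "student03": 4}
DISAGREEMENT_THRESHOLD = 1  # maximum acceptable gap, in rubric points

for student in sorted(rater_a):
    a, b = rater_a[student], rater_b[student]
    if abs(a - b) > DISAGREEMENT_THRESHOLD:
        print(f"{student}: scores {a} and {b} disagree -> discuss or use a third rater")
    else:
        print(f"{student}: final score {(a + b) / 2}")
```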

15 Performance Targets: Target for measured program performance
A SHORT statement on what benchmark(s) the program wants to meet
This is not how we think our students will perform, but what we want our program to do
Should be worded in terms of performance on the rubric or rating instrument
"Performance target will be considered met if 75% of students achieve an overall score of 4 out of 5 or higher."
"Performance target will be considered met if 80% of students assessed receive a final score of 'Commendable' or higher."
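As a worked illustration of the first example target above, the short sketch below (hypothetical scores and helper name, assuming one overall rubric score per student) checks whether the 75%-at-4-out-of-5 benchmark is met.

```python
# Hypothetical sketch: checking the example target "met if 75% of students
# achieve an overall score of 4 out of 5 or higher". Scores are illustrative.
def target_met(scores, cutoff=4, required_share=0.75):
    """Return (met, observed_share) for a list of overall rubric scores."""
    share = sum(1 for s in scores if s >= cutoff) / len(scores)
    return share >= required_share, share

overall_scores = [5, 4, 4, 3, 5, 4, 2, 4, 5, 4]  # 8 of 10 students at 4 or higher
met, share = target_met(overall_scores)
print(f"{share:.0%} of students scored 4 or higher -> target {'met' if met else 'not met'}")
```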

16 Feedback: Performance Targets
Need the benchmark by which the department will measure success.
Instead of: "All students are required to meet this requirement."
Consider: "80% of the students will meet the target of 90% accuracy of lengths, quantities, and list order."

17 Assessment Results
A SHORT statement on the final results of the assessment(s)
Stated in terms of the rubric or assessment instrument
Should NOT include interpretation of the results
Do not be afraid to report that the target was not met (the focus is on finding strengths and weaknesses)

18 Feedback: Assessment Results
Needs to be stated in terms of the rubric. Example:
80% of students scored a 4.5 or higher (n=12)
13% of students scored a 3.5 to 4.4 (n=2)
7% of students scored a 2.5 to 3.4 (n=1)
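To show how a results statement like the example above can be produced directly from raw scores, here is a minimal sketch; the raw scores and band edges are hypothetical, chosen only so the output mirrors the example. It tallies scores into rubric bands and reports each band as a percentage of students with its count.

```python
# Hypothetical sketch: summarizing raw rubric scores into the bands used in the
# example above, reporting each band as a percentage of students with its count.
scores = [4.8, 4.6, 4.5, 4.9, 5.0, 4.7, 4.5, 4.6, 4.8, 4.9, 4.5, 4.7, 3.8, 4.0, 3.0]
bands = [("a 4.5 or higher", 4.5, 5.0), ("a 3.5 to 4.4", 3.5, 4.4), ("a 2.5 to 3.4", 2.5, 3.4)]

total = len(scores)
for label, low, high in bands:
    n = sum(1 for s in scores if low <= s <= high)
    print(f"{n / total:.0%} of students scored {label} (n={n})")
```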

19 Use of Assessment Results: "Dos"
Interpretation and analysis of results data
Should include specified, actionable "next steps" the program will take to develop or improve on a programmatic level, for example:
Curriculum mapping
Revisit or revise the assessment method/rubric
Revisions to plan of study/curricular offerings
Development of new modules/courses
Faculty development

20 Use of Assessment Results: "Don'ts"
This is not an assessment report of student achievement
Do not include: tutoring efforts, sending students to the writing center, counseling students, providing students feedback, or improved recruitment
Remember: the responsibility is on the program, not its students. The use of assessment results section should indicate programmatic improvements.

21 Concluding Points
The academic assessment is formative of the program, not summative of the students
It is not a comprehensive assessment of all program efforts; make it manageable
It is not linear; if one area has met the performance target and is doing well, move on to another area
Brainstorm possible area(s) of improvement to tackle over the next three years

22 Questions? PLEASE CONTACT US ALONG THE WAY! We are here to help!
IE 813/ ATLE 813/

