
1 Measuring Student Learning March 10, 2015 Cathy Sanders Director of Assessment

2 Workshop Learning Outcomes Upon conclusion of the workshop, participants will be able to: 1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report; 2) develop an analytic rubric; 3) link analytic rubrics and assessment data to Performance Outcomes; and 4) use assessment data to improve student learning.

3 Elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report: 1) Student Learning Outcome; 2) Effectiveness Measure with its scoring rubric; 3) Assessment Methodology; 4) Performance Outcome; 5) Assessment Data; 6) Continuous Improvement Plan.

4 Student Learning Outcomes (SLOs) Are: competencies (i.e., knowledge, skills, abilities, and behaviors) that students will be able to demonstrate as a result of an educational program. SLOs must be specific and measurable.

5 Sample SLO That is Not Measurable Because it is Stated Too Broadly: “Upon conclusion, workshop participants will be knowledgeable about student learning outcomes assessment.”

6 Sample Measurable SLOs Workshop participants will be able to: 1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report; 2) develop an analytic rubric; 3) link analytic rubrics and assessment data to Performance Outcomes; and 4) use assessment data to improve student learning.

7 An Effectiveness Measure is: a student artifact (e.g., exam, project, paper, or presentation) that will be used to gauge students’ acquisition of the student learning outcome.

8 Important Attributes of an Effectiveness Measure Validity: Does the Effectiveness Measure assess what it is supposed to (i.e., the knowledge, skill, ability, or behavior articulated in the SLO)? Example: An oral presentation would be a valid Effectiveness Measure for assessing students’ oral communication skills. Reliability: Does the Effectiveness Measure consistently assess the knowledge, skill, ability, or behavior of interest? Analytic rubrics help to facilitate inter-rater reliability.

9 Sample Effectiveness Measure “Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.”

10 Assessment Methodology Is: A description of the department’s process for assessing the student learning outcome and reviewing the resulting assessment data, including: 1) when and in what course the assessment will be administered; 2) how the student artifact will be evaluated and by whom; 3) how the department will collect, analyze, and disseminate assessment data to department faculty; and 4) how the department will annually review assessment data and decide what changes/improvements to make.

11 Sample Assessment Methodology “Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample rubric and explained the steps in constructing a rubric. Rubrics will be collected by the instructor and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff who will identify areas needing improvement and decide changes to make prior to the next training workshop.”

12 A Performance Outcome Is: the percentage of students who will demonstrate proficiency on the student learning outcome and the level of proficiency expected.

13 Sample Performance Outcome “80% of workshop participants will score ‘Acceptable (2)’ or higher on the Scoring Rubric for Analytic Rubric Activity.”

14 Scoring Rubric Defined A scoring tool that uses a set of evaluation criteria that are directly tied to the student learning outcome to assess student learning. When the content of the rubric is communicated prior to students completing the work, the grading process is very clear and transparent to everyone involved. When scoring rubrics are used, Performance Outcomes are to be tied to the rubric scale.

15 Scoring Rubric for Analytic Rubric Activity (each criterion rated 1 to 3)

Criterion: Ability to develop a rubric scale
  1 = Needs Improvement: Is unable to develop a minimum of a 3-point scale that progresses from least proficient to most proficient.
  2 = Acceptable: Is able to develop a minimum of a 3-point scale that progresses from least proficient to most proficient.
  3 = Accomplished: Is able to develop a scale greater than 3 points that progresses from least proficient to most proficient.

Criterion: Ability to develop evaluation criteria
  1 = Needs Improvement: Is unable to identify at least 3 relevant evaluation criteria.
  2 = Acceptable: Is able to identify at least 3 relevant evaluation criteria.
  3 = Accomplished: Is able to identify more than 3 relevant evaluation criteria.

Criterion: Ability to develop cell descriptors
  1 = Needs Improvement: Cell descriptors are missing for some evaluation criteria; many are not sufficiently detailed to differentiate between the levels of proficiency.
  2 = Acceptable: Cell descriptors are present for all evaluation criteria, but some do not clearly differentiate between the levels of proficiency.
  3 = Accomplished: Cell descriptors are present for all evaluation criteria and clearly differentiate between the levels of proficiency.

16 Steps in Constructing an Analytic Scoring Rubric
1. Determine what kind of rubric scale to use (e.g., 3-point, 5-point).
2. Label each level of proficiency across the top of the rubric (e.g., “(1) Needs Improvement,” “(2) Acceptable,” “(3) Accomplished”).
3. List evaluation criteria down the left side of the rubric template.
4. Write cell descriptors of what the highest level of proficiency, “(3) Accomplished,” looks like for each criterion.
5. Write cell descriptors of what the lowest level of proficiency, “(1) Needs Improvement,” looks like for each criterion.
6. Write cell descriptors of what the mid levels of proficiency look like for each criterion.
7. Test it: use the rubric to evaluate student artifacts. Make note of important evaluation criteria that were omitted, existing criteria to eliminate, and cell descriptors that need greater specificity.
8. Revise the rubric to reflect the desired changes.
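The structure the steps above produce can be sketched in code. This is a hypothetical illustration only (the workshop uses paper templates, not software); the data layout and function name are invented for the example, and the check mirrors the workshop's minimum requirements: at least a 3-point scale, at least 3 criteria, and a cell descriptor for every criterion at every level.

```python
# Hypothetical sketch: an analytic rubric as a data structure, checked
# against the workshop's minimum requirements. Names are invented.

def meets_minimum_requirements(rubric):
    scale = rubric["scale"]        # level number -> proficiency label
    criteria = rubric["criteria"]  # criterion -> {level: cell descriptor}
    # Minimum of a 3-point scale and 3 evaluation criteria.
    if len(scale) < 3 or len(criteria) < 3:
        return False
    # Every criterion needs a cell descriptor for every level of the scale.
    return all(set(cells) == set(scale) for cells in criteria.values())

oral_rubric = {
    "scale": {1: "Needs Improvement", 2: "Acceptable", 3: "Accomplished"},
    "criteria": {
        "Organization": {1: "...", 2: "...", 3: "..."},
        "Delivery":     {1: "...", 2: "...", 3: "..."},
        "Content":      {1: "...", 2: "...", 3: "..."},
    },
}
print(meets_minimum_requirements(oral_rubric))  # True
```

The descriptor strings are left as placeholders; in practice each cell would hold the observable behavior written in steps 4 through 6.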

17 Analytic Rubric Activity Using the blank rubric template provided, develop an Oral Presentation Rubric with a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

18 Tying it All Together Into a Student Learning Outcomes Assessment Plan

19 SLO Assessment Plan Student Learning Outcome: Workshop participants will be able to: 1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report; 2) develop an analytic rubric; 3) link analytic rubrics and assessment data to Performance Outcomes; and 4) use assessment data to improve student learning. Effectiveness Measure: Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

20 SLO Assessment Plan Assessment Methodology: Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample analytic rubric and explained the steps in constructing a rubric. Rubrics will be collected by the instructor and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff who will decide changes to make prior to the next training workshop. Performance Outcome: 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.

21 Reporting Assessment Data

22 Report assessment data in the same way that the Performance Outcome is stated.
Performance Outcome: 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.
Assessment Data: 89% of workshop participants scored “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.
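The arithmetic behind a report like this can be sketched as follows. This is a hypothetical illustration, not part of the workshop materials: the ratings are invented example values, one overall rubric score per participant, and the reported figure is simply the share at or above the target level.

```python
# Hypothetical sketch: stating assessment data in the same terms as the
# Performance Outcome ("X% scored at level 2 or higher").

def pct_at_or_above(ratings, target=2):
    """Percentage of participants scoring at or above the target level."""
    meeting = sum(1 for r in ratings if r >= target)
    return 100 * meeting / len(ratings)

# Invented example: 9 participants, one overall rubric rating each.
ratings = [3, 2, 1, 3, 2, 2, 3, 2, 2]
result = pct_at_or_above(ratings)
print(f"{result:.0f}% of participants scored 'Acceptable (2)' or higher")
# prints "89% of participants scored 'Acceptable (2)' or higher"
```

Because the Performance Outcome was stated as a percentage at a rubric level, the data point reads directly against the 80% target with no translation needed.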

23 Continuous Improvement Plan During the annual review of assessment data, department faculty will:
1. determine whether students are meeting the Performance Outcome(s) for each Student Learning Outcome;
2. identify changes that are needed to improve future student learning;
3. develop the department’s continuous improvement plan for the upcoming year; and
4. the following year, document in the Student Learning Outcomes Assessment Plan and Report whether the changes made improved student learning.

24 Changes to Consider When Students Are Not Meeting Performance Outcomes
1. Change the assessment instrument
– Revise the assessment instrument (i.e., test questions or project requirements)
– Change to an assessment instrument that measures deeper learning (e.g., from test questions to a written paper)
– Revise or add rubric evaluation criteria
2. Change the assessment methodology
– Change what you are assessing (i.e., revise the SLO statement)
– Change when SLOs are assessed (e.g., junior year vs. senior year)
– Change how the assessment is administered (e.g., a videotaped oral presentation vs. a live-only one, so that students can watch afterward to self-assess)

25 Changes to Consider When Students Are Not Meeting Performance Outcomes
3. Change the curriculum
– Revise courses to provide additional coverage in areas where students did not perform well
– Revise the curriculum to increase students’ exposure to SLOs by introducing, reinforcing, and emphasizing competencies throughout the curriculum
– Schedule an appointment with a CTL curriculum design specialist
4. Change the pedagogy
– Incorporate more active learning that provides students with opportunities to apply what they are learning
– Incorporate mini-assessments throughout the semester to gauge earlier whether students are grasping the material, and adjust accordingly
– Incorporate active learning opportunities for students to share/teach the material they are learning in your classroom

26 Questions/Discussion
