ASSESSMENT CERTIFICATE CULMINATING PROJECT: ASSESSING STUDENT LEARNING OUTCOMES Presented by: Shujaat Ahmed and Kaitlin Fitzsimons.


1 ASSESSMENT CERTIFICATE CULMINATING PROJECT: ASSESSING STUDENT LEARNING OUTCOMES Presented by: Shujaat Ahmed and Kaitlin Fitzsimons

2 Objective 1: Assess student learning outcomes for required L1 competence

3 OBJECTIVE 1: L1 COMPETENCE AND CRITERIA
L1 Competence statement: Can use independent learning skills and strategies to organize, initiate, and document prior, current, and future college-level learning.
Competence Criteria:
– Describe strategies for independent and experiential learning.
– Use strategies to surface prior experiential learning in personal, professional, and academic settings and integrate these experiences with new learning.
– Demonstrate skills in planning, organizing, assessing, and documenting competence-based learning.

4 Objective 2: Compare three course designs (1.0, 1.5, 2.0) of the Independent Learning Seminar

5 OBJECTIVE 2
– The Independent Learning Seminar was a new course created to help students structure their independent learning experiences (V. 1.0).
– After the initial offering of the course, one of the instructors adjusted the original course design (V. 1.5).
– This fall, a third course design was developed, replacing all of the 1.0 version offerings (V. 2.0).
– Our study focuses on whether the three course designs achieve the L1 competence criteria, and whether one design meets certain criteria better than the others.

6 STEPS TAKEN TO ASSESS L1 COMPETENCE (OBJ 1)
1. Developed a rubric to assess ILS projects.
2. Identified 33 randomly selected assignments from versions 1.0, 1.5, and 2.0 (N = 11 per section) for assessing L1.
3. Eight SNL TLA members rated the selected assignments.
4. Rubric data were analyzed by ILS section.

7 ILP RUBRIC (OBJ 1)
Each L1 competence criterion is rated on a four-point scale: Below Average (1), Satisfactory (2), Above Average (3), Excellent (4).

Describes strategies for independent and experiential learning.
– Below Average (1): Does not identify, or only minimally identifies, strategies for independent and experiential learning.
– Satisfactory (2): Superficially describes strategies for independent and experiential learning.
– Above Average (3): Clearly describes strategies for independent and experiential learning.
– Excellent (4): Clearly describes in detail an understanding of strategies for independent and experiential learning across multiple contexts.

Uses strategies to surface prior experiential learning in personal, professional, or academic settings.
– Below Average (1): Does not apply strategies to surface prior learning in personal, professional, or academic settings.
– Satisfactory (2): Minimally applies strategies to surface prior learning in personal, professional, or academic settings.
– Above Average (3): Clearly applies strategies to surface prior learning in personal, professional, or academic settings.
– Excellent (4): Reflects upon the value of strategies to surface prior learning in personal, professional, or academic settings.

Integrates learning experiences with new learning.
– Below Average (1): Does not connect or integrate learning experiences.
– Satisfactory (2): Minimally connects learning experiences with new learning.
– Above Average (3): Clearly connects learning experiences with new learning.
– Excellent (4): Draws multiple connections between learning experiences across contexts and time.

Demonstrates skills in documenting learning.
– Below Average (1): Documentation of learning is unclear or incomplete.
– Satisfactory (2): Documents learning but does not emphasize learning strategies or future learning.
– Above Average (3): Documents learning with an emphasis on learning strategies OR on future learning.
– Excellent (4): Documents learning effectively with an emphasis on learning strategies AND their application to future learning.
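The rubric above can be sketched as a simple lookup table, with a small scoring helper. This is an illustrative sketch only: the short dictionary keys, the abbreviated descriptors, and the averaging rule in `score_project` are our own shorthand, not the study's actual scoring procedure.

```python
# Minimal sketch of the L1 rubric as a criterion -> level-descriptor table.
# Keys and the averaging rule below are illustrative shorthand.
L1_RUBRIC = {
    "learning_strategies": {
        1: "Does not or only minimally identifies strategies",
        2: "Superficially describes strategies",
        3: "Clearly describes strategies",
        4: "Describes strategies in detail across multiple contexts",
    },
    "surfacing_prior_learning": {
        1: "Does not apply strategies to surface prior learning",
        2: "Minimally applies strategies",
        3: "Clearly applies strategies",
        4: "Reflects upon the value of the strategies",
    },
    "integrating_learning": {
        1: "Does not connect learning experiences",
        2: "Minimally connects experiences with new learning",
        3: "Clearly connects experiences with new learning",
        4: "Draws multiple connections across contexts and time",
    },
    "documenting_learning": {
        1: "Documentation is unclear or incomplete",
        2: "Documents learning without strategies or future learning",
        3: "Emphasizes learning strategies OR future learning",
        4: "Emphasizes learning strategies AND future application",
    },
}

def score_project(ratings: dict) -> float:
    """Average a rater's 1-4 ratings across the four criteria."""
    if set(ratings) != set(L1_RUBRIC):
        raise ValueError("ratings must cover every rubric criterion")
    if not all(r in (1, 2, 3, 4) for r in ratings.values()):
        raise ValueError("each rating must be on the 1-4 scale")
    return sum(ratings.values()) / len(ratings)

# One hypothetical rater's scores for a single assignment.
example = {
    "learning_strategies": 2,
    "surfacing_prior_learning": 3,
    "integrating_learning": 3,
    "documenting_learning": 2,
}
print(score_project(example))  # average rating for one assignment
```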

8

9 ANALYSIS OF RUBRIC DATA (OBJ 1)
Means and Standard Deviations for Versions 1.0, 1.5, and 2.0
Each cell shows Mean (SD), N = 11 per version.

L1 Criteria: Version 1.0 | Version 1.5 | Version 2.0
– Learning Strategies: 1.82 (1.07) | 2.36 (1.21) | 2.63 (0.90)
– Surfacing Prior Learning: 2.18 (0.75) | 3.09* (0.70) | 2.82* (0.60)
– Applying Learning to Future Learning: 2.55* (0.82) | 2.72 (1.19) | 2.55 (0.69)
– Documenting Learning: 2.36 (0.92) | 2.81 (0.87) | 2.36 (0.92)

Note. An asterisk (*) marks the highest mean within each version; bold red values in the original slide (not reproduced here) marked the highest means across versions.
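The Mean (SD) cells in a table like this come from a straightforward computation over the 1–4 rubric ratings. A short sketch, assuming made-up rating lists (the study's raw ratings are not in this deck); only the mean and sample standard deviation computation mirrors how such a table is built.

```python
from statistics import mean, stdev

# Hypothetical rater scores (1-4 rubric scale) for one criterion, grouped by
# course version -- illustrative stand-ins, NOT the study's actual ratings.
ratings = {
    "1.0": [2, 1, 3, 2, 2, 3, 2, 2, 3, 2, 2],
    "1.5": [3, 3, 4, 3, 2, 3, 4, 3, 3, 3, 3],
    "2.0": [3, 2, 3, 3, 2, 3, 3, 3, 2, 3, 3],
}

def summarize(scores):
    """Return (mean, sample standard deviation), rounded as reported."""
    return round(mean(scores), 2), round(stdev(scores), 2)

for version, scores in ratings.items():
    m, sd = summarize(scores)
    print(f"Version {version}: M = {m}, SD = {sd} (N = {len(scores)})")
```

Note that `stdev` is the sample standard deviation (n − 1 denominator), which is the usual choice when, as here, the rated assignments are a sample of all student work.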

10 ANALYSIS OF RUBRIC DATA (OBJ 1)
A 3 × 4 MANOVA (three sections by four L1 criteria) was conducted to determine whether the mean differences between the three sections were statistically significant. Results revealed a significant multivariate main effect for section, Wilks' λ = .540, F(8, 54) = 2.435, p < .10, η² = .27, with 27% of the multivariate variance of the L1 criteria associated with the grouping factor, section. Among the univariate between-subjects effects, only surfacing prior learning differed significantly, F(2, 30) = 5.064, p = .013. Post hoc tests using the Bonferroni procedure indicated that, for surfacing prior learning, only the comparison between version 1.0 (M = 2.18) and version 1.5 (M = 3.09) was statistically significant (p < .025).
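The Bonferroni procedure in the post hoc step is, at its core, an alpha split: with k pairwise comparisons, each test is judged against α/k rather than α. A minimal sketch with invented p-values (the slide reports only a significance threshold of p < .025, not the individual post hoc p-values):

```python
# Bonferroni correction sketch: with k pairwise comparisons, each comparison
# is evaluated against alpha / k. The p-values below are hypothetical.
alpha = 0.05
comparisons = {
    ("1.0", "1.5"): 0.004,  # hypothetical p-value
    ("1.0", "2.0"): 0.060,  # hypothetical p-value
    ("1.5", "2.0"): 0.310,  # hypothetical p-value
}

adjusted_alpha = alpha / len(comparisons)  # 0.05 / 3 comparisons
significant = {pair for pair, p in comparisons.items() if p < adjusted_alpha}

print(f"Bonferroni-adjusted alpha: {adjusted_alpha:.4f}")
print("Significant pairs:", significant)
```

With all three pairwise contrasts the adjusted threshold is 0.05/3 ≈ .0167; the slide's p < .025 threshold corresponds to splitting α over two comparisons.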

11 Applying Assessment Concepts
Assessment promotes continuous improvement
– The goal of the project is to determine whether the different course designs effectively assess the L1 competence and to inform future decisions on the selection or improvement of course designs.
Measuring student learning requires direct assessment
– We chose a random sample of student work and assessed each piece using a rubric.
A mixed-methods approach to assessment provides a full picture, bringing together the "What" and the "Why"
– We used quantitative methods to analyze the rubric ratings, running a MANOVA to see whether there were significant differences between the sections.
– We used qualitative methods to analyze the open-ended student comments from the online teaching evaluations for each section of the Independent Learning Seminar.

12 What We Learned
– In developing the L1 rubric, we used Bloom's Extended List of Cognitive Action Verbs to fill in the cells that describe below average, satisfactory, above average, and excellent demonstrations of the L1 criteria.
– As some L1 competence criteria were "double-barreled," we separated them so that each measured a single outcome. [Writing and Revising Learning Outcomes Workshop; Direct vs. Indirect Assessment Workshop]
– At this stage of the project, analysis of the L1 rubric ratings was exclusively quantitative. We used SPSS to compare the ILS sections by computing means and standard deviations and running a MANOVA. [Quantitative Workshop]

13 What We Learned
– In analyzing survey data from instructors who taught each version of the Independent Learning Seminar, we wanted to gauge faculty perceptions of their teaching experiences as well as of student learning. We recognize that faculty perceptions constitute indirect evidence, so the survey was only one piece of our mixed-methods approach, which also included direct evidence from the rubric data analysis. [Direct vs. Indirect Assessment Workshop]
– In analyzing the student online teaching evaluations for each section of the Independent Learning Seminar, we looked at the open-ended student comments and identified several themes based on the frequency of comments. Not surprisingly, students in the 1.0 version reported that the course focused on the development of an ILP, while students in versions 1.5 and 2.0 emphasized the class structure. [Qualitative Workshop]

14 RELEVANCE TO WORK
– The Assessment Center provides analysis and expertise so that SNL can make data-driven decisions.
– For last year's Annual Assessment Project, SNL completed a similar study of four different versions of the Advanced Project course.
– The findings from that report were presented to College leadership and were relied upon to make decisions about curricular offerings.
– SNL is increasing its reliance on rubrics as a direct assessment tool to objectively measure student outcomes.

15 REFLECTION AND MOVING FORWARD
– Working on the Annual Assessment Project has given us the opportunity to apply what we've learned through the ACP workshops to improve SNL curricular offerings.
– We will continue to incorporate assessment best practices for data collection, data analysis, and reporting findings in our positions at SNL.
– Employing methods of formal inquiry as a means of problem solving extends beyond assessment and can be used in various practice settings.

16 APPENDIX


