ELO-based supplemental course evaluations Teaching Academy Assessment Subcommittee David Baum & Janet Batzli.


1 ELO-based supplemental course evaluations Teaching Academy Assessment Subcommittee David Baum & Janet Batzli

2 The Assessment Grid
Rows (target of assessment): Program, Instructional material, Course, Instructor, Student
Columns (consumer of assessment): Student, Instructor, University, Beyond the university

3 The Assessment Grid

Target \ Consumer      | Student  | Instructor      | University      | Beyond the university
Program                | choice   | x               | quality control | accredit
Instructional material | x        | improve         | x               | adopt?
Course                 | choice   | improve         | quality control | x
Instructor             | choice   | improve         | promotion/merit | x
Student                | learning | improve/grading | x               | employment/grad school

4 The Assessment Grid

Target \ Implemented by | Student   | Instructor          | University
Program                 | x         | x                   | Academic analysis
Instructional material  | x         | Targeted assessment | x
Course/Instructor       | Not at UW | Survey?             | ?
Student                 | SALG      | Exams/assignments   | x

6 Goals
Campus-wide, quantitative course evaluations
– Not to replace departmental evaluations
Address Essential Learning Outcomes
– Track student learning
– Assess courses/programs
– Educate students about the ELOs
Post summary data on the web
– Incentivize participation
– Help students select courses
– Be open with stakeholders

7 Why should instructors care?
Know how we are doing
Be judged on what students gain from the course (not charisma)
Get students to reflect on learning rather than enjoyment

8 UW Essential Learning Outcomes

9 UW Essential Learning Outcomes

10 Approach
Initial questions based on ELO language
Survey Center conducted three rounds of focus groups to sequentially improve the questions
Also piloted data presentation

11 Initial question How much did this course enhance your knowledge of the human or natural world?

12 Final Recommendation
In general, how much did this course enhance your knowledge of the world, such as knowledge of human cultures, society, or science? [Not at all, A little, Somewhat, Quite a bit, A great deal]
– While students were not always sure what "…of the world" meant, with the defining words they got the intended idea

13 Initial question
How much did this course help you develop intellectual and practical skills, such as critical and creative thinking, written and oral communication, teamwork, and problem-solving?
Combining intellectual and practical/pre-professional skills was confusing to students
– Split into two questions

14 Final Recommendations How much did this course help you develop intellectual skills, such as critical or creative thinking, quantitative reasoning, and problem solving? How much did this course help you develop professional skills, such as written and oral communication, computer literacy, and working in teams? [Not at all, A little, Somewhat, Quite a bit, A great deal]

15 Initial question How much did this course affect your values and sense of personal and social responsibility, for example by increasing your knowledge of policy issues, engagement in community and civic affairs, intercultural knowledge, or ability to reason ethically?

16 Final Recommendations
How much did this course increase your sense of social responsibility, that is, increased your knowledge of cultures or provided you with opportunities for civic or community involvement? [Not at all, A little, Somewhat, Quite a bit, A great deal]
There was some confusion between feeling vs. acting responsibly
The question mixes an impact on the student with the contents of a course
We tried "reason ethically," but this confused students
– More editing might be needed

17 Initial question How much did this course advance your ability to integrate diverse areas of knowledge?

18 Final Recommendations
How much did this course improve your ability to combine knowledge or skills from different fields of study? [Not at all, A little, Somewhat, Quite a bit, A great deal]
– If we interpret the ELO to mean interdisciplinarity within a course, then the question is working

19 Initial question
How would you rate this course for its overall quality and educational impact?
– Quality and impact were found to be different

20 Final Recommendation
How would you rate the overall educational value of this course, that is, the extent to which the course improved your all-around education or prepared you for the future? [Very poor, Poor, Fair, Good, Very good]
– Students seem to understand the intent

21 Final Recommendation
How would you rate the overall quality of this course, that is, the extent to which it was structured and taught in order to maximize its educational value? [Very poor, Poor, Fair, Good, Very good]
– Unclear to some students whether they should assess the professor or the course structure
– We want the focus to be on the course rather than the instructor
– May need to narrow, split, or drop this question

22 Final Recommendation
Students felt it would be helpful to know whether a respondent was or was not a major
Question added (at the start of the survey): At the time you enrolled, did you primarily take this course to fulfill a requirement for your major? [Yes/No]
– Students seemed to understand this question as intended

23 General student reactions to the survey
The survey made them reflect on their courses in ways they hadn't before, and they thought the survey asked about aspects of courses where they did gain something
Students see that you can appreciate classes for different reasons
Some students had mentally compared the answers they gave from one course to another
ELOs became clearer
The overall reception of the survey's utility was very positive

24 Ideal implementation
After each semester, students receive a link to a personalized survey covering all the courses they took in the previous semester
Data are collated and made available to students and the public (CourseGuide?)

25 Sample Data Presentation
The students liked the idea of having such data available
– "It would definitely help because there isn't currently a way to know if a class is good or bad besides anecdotes"
– "Better than RateMyProfessor.com – those ratings don't necessarily tell you what the class is going to be like"
But they are realistic:
– "…A student shouldn't put all his trust into what the graph might say."

26 Sample data presentation – Needs work!
[Chart: mean ratings for majors vs. nonmajors across six courses – Imaginary Studies 101 (331 students; 105 majors), 102 (183; 80), 201 (113; 83), 202 (89; 33), 401 (23; 12), 402 (19; 17)]
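The collation behind a chart like this is straightforward to sketch. The snippet below is a hypothetical illustration only: the course names, ratings, and the mapping of the response scale to 1–5 ("Not at all" = 1, "A great deal" = 5) are assumptions, not part of the proposal.

```python
# Hypothetical sketch of collating survey responses into per-course
# means for majors vs. nonmajors, as in the sample data presentation.
# All data below are illustrative assumptions.
from collections import defaultdict

# Each response: (course, respondent is a major?, rating on an assumed
# 1-5 scale where 1 = "Not at all" and 5 = "A great deal")
responses = [
    ("Imaginary Studies 101", True, 4),
    ("Imaginary Studies 101", False, 3),
    ("Imaginary Studies 101", False, 5),
    ("Imaginary Studies 202", True, 2),
    ("Imaginary Studies 202", False, 4),
]

def summarize(responses):
    """Return {course: {"majors": mean, "nonmajors": mean, "n": count}}."""
    buckets = defaultdict(lambda: {"majors": [], "nonmajors": []})
    for course, is_major, rating in responses:
        buckets[course]["majors" if is_major else "nonmajors"].append(rating)
    summary = {}
    for course, groups in buckets.items():
        summary[course] = {
            group: (sum(vals) / len(vals) if vals else None)
            for group, vals in groups.items()
        }
        summary[course]["n"] = sum(len(v) for v in groups.values())
    return summary

print(summarize(responses))
```

Splitting by the added "did you take this course for your major?" question is what enables the majors/nonmajors comparison shown in the chart.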

27 Other things students want (but we don't expect to provide)
How much work the course requires
How easy it was to get an A
Correlation between the amount of work students put in and how much they feel they actually learned from the course
Data for specific professors or TAs

28 Next steps
Run a pilot survey on a large sample of students
Use focus groups to improve the data presentation style
Consult with central administration on whether funds would be available to establish ongoing surveys and posting of data for all courses
Seek faculty/staff buy-in (allowing instructors to opt out)

29 Thanks
Janet Batzli
TA Executive Committee and Assessment Subcommittee
UW Survey Center – John Stevenson, Jennifer Dykema, Jaime Faus, Tara Piche
Mo Bischoff
UW Assessment Council

