
1 Gauging the Impact of Student Characteristics on Faculty Course Evaluations
Laura Benson Marotta, University at Albany – SUNY

3 Student Characteristics
UAlbany course evaluations ask students to report:
1. level of study
2. whether the course is required or an elective
3. major / minor / other
4. GPA
5. expected grade
6. gender

4 Student Characteristics
40% of the University at Albany course evaluation instrument measures student characteristics.
Other institutions go further:
- Average hours per week studying
- Average hours per week seeking outside help
- Level of interest in the subject before taking the course

5 Student Characteristics
“Surveys are not ends unto themselves. They are tools that are used to help make decisions. You wouldn’t buy tools for a workbench unless you knew what you would use them for and unless you knew for sure that you were going to use them.” Linda A. Suskie (1992)

6 Problems
- Students may view questions about them as intrusive or irrelevant
- Gathering more data than we can analyze wastes good will and instructional time
- Contributes to survey fatigue

7 Companion Survey
- Departments do not assess student satisfaction by bubble sheets alone
- Open-ended departmental course surveys ask students to describe themselves all over again

8 Student Characteristics
The data warehouse has canned queries to report student demographics by course.

9 Opportunities
It is easier to establish a clear link between student characteristics and faculty evaluation outcomes if the characteristics are measured on the survey, rather than estimated from registration records after the fact.

10 Opportunities
Sampling bias is a fact of life.
- Compare demographic information between the survey respondents and the class population (see the sketch below)
- Are any subpopulations underrepresented?
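A minimal sketch of that respondent-versus-population comparison in Python, using SciPy's goodness-of-fit test. All counts and level labels below are hypothetical; substitute the registrar's counts and the evaluation response counts for a real course section.

```python
# Minimal sketch (hypothetical numbers): compare the class-level mix of survey
# respondents against the registrar's counts for the whole class, to see
# whether any subpopulation is underrepresented among respondents.
from scipy.stats import chisquare

# Hypothetical registrar counts for one course section, by student level
population = {"Freshman": 60, "Sophomore": 25, "Junior": 10, "Senior": 5}
# Hypothetical counts among the students who returned the evaluation
respondents = {"Freshman": 30, "Sophomore": 20, "Junior": 9, "Senior": 5}

levels = list(population)
observed = [respondents[lvl] for lvl in levels]
n_resp = sum(observed)
n_pop = sum(population.values())
# Expected respondent counts if every level responded at the same rate
expected = [population[lvl] / n_pop * n_resp for lvl in levels]

stat, p = chisquare(f_obs=observed, f_exp=expected)
for lvl, obs, exp in zip(levels, observed, expected):
    print(f"{lvl:<10} observed {obs:>3}   expected {exp:6.1f}")
print(f"goodness-of-fit chi-square = {stat:.2f}, p = {p:.3f}")
```

A large statistic (small p) would suggest that some levels responded at noticeably different rates than others.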

11 Student Characteristics
- Grade Inflation
- Course Assessment
- Non-response bias

12 Grade Inflation “Evaluations depend solely on students, and grade inflation reflects faculty worried about the impact students may have on their careers.” Virginia Myers Kelly (2005)

13 Grade Inflation on your campus
Does Expected Grade predict the response to "Instructor, Overall"?

14 Grade Inflation
Practice Data Set:
- Undergraduate courses
- Students taking the course for a grade (not pass/fail)
- Limited to students who are passing

15 Undergraduate Survey Responses for Instructor, Overall

18 Grade Inflation
- Karl Pearson published a model in 1900 that described experiments with mutually exclusive, categorical outcomes
- Row by Column test of independence (the statistic is shown below)
- SPSS output using this model is still labeled "Pearson's Chi Square"
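For reference, the statistic behind that SPSS label is the standard one for an R x C table of observed counts; this is textbook material, not something specific to the UAlbany data:

```latex
\[
\chi^2 \;=\; \sum_{i=1}^{R}\sum_{j=1}^{C}\frac{(O_{ij}-E_{ij})^{2}}{E_{ij}},
\qquad
E_{ij} \;=\; \frac{(\text{row } i \text{ total})(\text{column } j \text{ total})}{N},
\qquad
df \;=\; (R-1)(C-1)
\]
```

Under the null hypothesis of independence, the statistic approximately follows a chi-square distribution with (R-1)(C-1) degrees of freedom.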

19 Nonparametric Test
Assumptions for Chi-Square: “Nonparametric tests do not require assumptions about the shape of the underlying distribution…The expected frequencies for each category should be at least 1. No more than 20% of the categories should have expected frequencies of less than 5.”
SPSS Base User’s Guide 12.0, page 466; follows guidelines set by W. G. Cochran (1954).
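One way to check those guidelines before running the test is to compute the expected counts directly. The helper below is an illustrative sketch; the function name and the small example table are mine, not from the presentation.

```python
# Illustrative helper (my naming): check the expected-count guidelines quoted
# above for any R x C table of observed counts.
import numpy as np

def check_cochran(observed):
    """Return (minimum expected count, share of cells with expected count < 5)."""
    observed = np.asarray(observed, dtype=float)
    row_totals = observed.sum(axis=1, keepdims=True)
    col_totals = observed.sum(axis=0, keepdims=True)
    expected = row_totals @ col_totals / observed.sum()
    return expected.min(), (expected < 5).mean()

# Example with a small made-up 2 x 3 table
min_expected, share_below_5 = check_cochran([[5, 12, 30], [8, 20, 44]])
print(f"minimum expected count = {min_expected:.2f}")
print(f"share of cells with expected count < 5 = {share_below_5:.0%}")
```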

20 Row by Column Test of Independence

                          "Instructor, Overall"
Expected Grade    1 Poor   2 Fair   3 Average   4 Good   5 Excellent    Total
2 D                   32       33          59       88            79      291
3 C                  196      343         603      998           849     2989
4 B                  371      622        1436     3872          4765    11066
5 A                  197      305         781     2773          5774     9830
Total                796     1303        2879     7731         11467    24176

21 Grade Inflation
Null hypothesis: “Instructor, Overall” is independent of “Expected Grade”
Alternative hypothesis: “Instructor, Overall” and “Expected Grade” are dependent
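The presentation runs the test in SPSS. As a sketch of the same Pearson test outside SPSS, SciPy's chi2_contingency applied to the slide-20 counts should reproduce the output reported on the next slide:

```python
# Sketch: the same R x C test of independence run outside SPSS, using the
# observed counts from slide 20 (rows: expected grade D, C, B, A; columns:
# "Instructor, Overall" ratings Poor ... Excellent).
from scipy.stats import chi2_contingency

observed = [
    [ 32,  33,   59,   88,   79],   # expecting a D
    [196, 343,  603,  998,  849],   # expecting a C
    [371, 622, 1436, 3872, 4765],   # expecting a B
    [197, 305,  781, 2773, 5774],   # expecting an A
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"Pearson chi-square = {chi2:.3f}")                 # ~1488.89, as on the next slide
print(f"df = {dof}, p = {p:.3g}")                         # df = (4-1)*(5-1) = 12
print(f"minimum expected count = {expected.min():.2f}")   # ~9.58
```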

22 Grade Inflation – RESULTS
Chi-Square Tests
                        Value          df    Asymp. Sig. (2-sided)
Pearson Chi-Square      1488.890(a)    12    .000
N of Valid Cases        24176
(a) 0 cells (.0%) have expected count less than 5. The minimum expected count is 9.58.

23 Grade Inflation INTERPRETATION
Faculty ratings on the Likert scale vary with the student's expected grade.
Instructors have a reason to expect lower student satisfaction if they assign lower grades.

24 Grade Inflation
Clear progression in students rating instructors as "Poor":
- Expecting a D: 32/291 = 11%
- Expecting a C: 196/2989 = 6%
- Expecting a B: 371/11066 = 3%
- Expecting an A: 197/9830 = 2%

25 Grade Inflation
Instructors rated as "Excellent":
- Expecting a B: 4765/11066 = 43%
- Expecting an A: 5774/9830 = 59%

26 Policy Implications
- Faculty evaluations should be considered in conjunction with grade distributions
- If your institution wants to follow Harvard and fight grade inflation by setting a cap on "A" grades in undergraduate courses, expect lower student satisfaction ratings
- "Expected Grade" should be included during a survey redesign

27 Course Assessment
- A 1-credit, lower-division general education course in Information Science
- Gap between satisfaction with instructors and satisfaction with the course

28 Course Assessment The first step to solving the problem is to confirm that student satisfaction with the general education course in Information Science is different from the other lower-level undergraduate courses

31 Student Characteristics

32 Course Assessment

Student Level   Students rating course "Poor"   Total Students   % rating course "Poor"
Freshmen                                    2               25                     8.0%
Sophomore                                   9               85                    10.6%
Junior                                      3               62                     4.8%
Senior                                     10               68                    14.8%
Graduate                                    1                8                    12.5%
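As a small worked check, the last column can be recomputed from the first two. The counts are as read from the table above; percentages may differ from the slide by a rounding step.

```python
# Recompute the "% rating course 'Poor'" column from the slide-32 counts.
counts = {
    "Freshmen":  (2, 25),
    "Sophomore": (9, 85),
    "Junior":    (3, 62),
    "Senior":    (10, 68),
    "Graduate":  (1, 8),
}

for level, (poor, total) in counts.items():
    print(f"{level:<10} {poor:>3} / {total:<3} = {poor / total:5.1%}")
```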

33 Course Assessment Exploring these data did not solve the curriculum coordinator’s original problem, but it did help focus our questions in designing a follow-up study.

34 Course Assessment
The following semester the instructors handed out a two-question survey on the first day of class:
1. Why did you take this class?
2. Are you a freshman, sophomore, junior, senior, or other?

35 Course Assessment
Three readers scored the open-ended responses to check reliability (see the agreement sketch below)
Scored categories:
1. General Education
2. Need 1 Credit
3. Subject Matter
4. Other
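The presentation reports that three readers scored the responses for reliability but does not say how agreement was summarized. A minimal sketch of one common check, pairwise percent agreement across the four categories, is below; the reader codes are hypothetical.

```python
# Minimal sketch, hypothetical data: pairwise percent agreement among three
# readers who each coded the same open-ended responses into four categories.
# The presentation does not specify which reliability statistic was used.
from itertools import combinations

CATEGORIES = ["General Education", "Need 1 Credit", "Subject Matter", "Other"]

# Hypothetical category codes (indexes into CATEGORIES) for the same ten responses
reader_codes = {
    "Reader 1": [0, 1, 1, 2, 0, 3, 1, 1, 2, 0],
    "Reader 2": [0, 1, 1, 2, 0, 3, 1, 0, 2, 0],
    "Reader 3": [0, 1, 2, 2, 0, 3, 1, 1, 2, 0],
}

for (name_a, codes_a), (name_b, codes_b) in combinations(reader_codes.items(), 2):
    agreement = sum(x == y for x, y in zip(codes_a, codes_b)) / len(codes_a)
    print(f"{name_a} vs {name_b}: {agreement:.0%} agreement")
```

A chance-corrected statistic such as Cohen's kappa would be a natural follow-up if the category distribution is badly unbalanced.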

36 Course Evaluation Results
1. ~1/3 of Seniors were interested in the subject matter
2. ~4/5 of Seniors needed 1 credit
3. Graduate students were self-selecting for remediation

37 Course Evaluation Policy Implications
- Examine other opportunities for upperclassmen to earn 1 credit
- Make Seniors jump through hoops to get into this course

38 Student Characteristics Conclusion
Institutional researchers use student characteristics on faculty evaluations to:
- Track trends like grade inflation
- Conduct ad hoc analyses
- Estimate sample bias

39 Student Characteristics Conclusion: It is wise to gather only as much data as we will use.

40 Questions?

