1 Using NSSE to Inform Course-Evaluation Revision Edward Domber Christopher J. Van Wyk Drew University

2 NSSE Workshop, SCSU, October
Mise en scène
23 items, written in the mid-1970s
Scantron® with free response on reverse
Conducted near end of semester
Returned via dept chairs, 2–4 months later
–In a plain manila envelope
–With printout showing response distribution for section and means for section, department, division, and College

3 Spring 2000 Survey (part of Middle States self-study)
"Drew's current use of student course evaluations for assessing teaching and learning is adequate" (1 = strongly disagree; 6 = strongly agree)
Faculty response
–mean: 3.5
–s.d.: 1.5 (among the largest on the survey)

4 Audience Participation
What would faculty on your campus say about student course-evaluation forms?

5 Memorable Words
"When we surveyed several hundred faculty and administrators, we found a surprising lack of knowledge about the literature of student ratings and even about the basic statistical information necessary to interpret ratings reports accurately. That lack of knowledge correlated significantly with negative opinions about evaluation, student ratings, and the value of student feedback." (Theall and Franklin, p. 46)

6 Two Handy Starting Points
American Psychologist 52 (1997), Greenwald, ed.; incl. McKeachie
New Directions for Institutional Research, no. 109 (2001), Theall, Abrami, Mets, eds.; incl. Theall & Franklin; incl. Kulik

7 Re: Presentation of Results
"[T]he use of norms not only leads to comparisons that are invalid but also is damaging to the motivation of the 50% of faculty members who find that they are below average. Moreover, presentation of numerical means or medians (often to two decimal places) leads to making decisions based on small numerical differences." (McKeachie, p. 1223, emphasis added)

8 Re: Use of Results
"[E]valuation experts usually advise teachers with low ratings to concentrate on their greatest relative weakness. Fix it, the experts advise, and the whole profile of ratings may go up.... changes in profile elevation are commonplace with highly intercorrelated rating scales." (Kulik, p. 22)

9 Facts about Drew's Data
On each of the 11 items answered on a scale of 1–7:
–Modal response: 7 (the best possible)
–Mean response: 5.9
–s.d.: 1.3

10 Analysis of Drew's Data, I
Slight relationship (r < 0.2) between EXPECTED GRADE and other responses [N.B., absence of evidence is not evidence of absence (Rumsfeld)]
13 items load onto one factor that explains 38% of the variation in responses
Both results replicate findings in the literature
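The "one factor explains 38% of the variation" finding can be illustrated with a principal-components view of simulated ratings. This is a minimal sketch with made-up data, not Drew's: it assumes a single latent "overall quality" factor driving all 13 items, and measures the share of total variance carried by the first eigenvalue of the correlation matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ratings: one shared latent factor drives all 13 items,
# plus item-specific noise (all numbers here are illustrative).
n_students, n_items = 500, 13
latent = rng.normal(size=(n_students, 1))            # shared factor
loadings = rng.uniform(0.4, 0.8, size=(1, n_items))  # item loadings
ratings = latent @ loadings + rng.normal(scale=0.8, size=(n_students, n_items))

# Principal-components view: eigenvalues of the item correlation matrix.
corr = np.corrcoef(ratings, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

# Share of total variance explained by the first component —
# the analogue of the "one factor explains 38%" result.
share = eigenvalues[0] / n_items
print(f"first component explains {share:.0%} of the variance")
```

Because the correlation matrix has unit diagonal, its eigenvalues sum to the number of items, so dividing the largest eigenvalue by 13 gives the first component's variance share directly.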

11 Added to this mix: NSSE results

12 Analysis of Drew's Data, II
Specific request: what can current course evaluations tell us about engagement?
Regression analyses using as independent variables:
–Class size
–Level of study
–Curricular division
–Reason for taking
–Anticipated grade
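The regression setup on this slide can be sketched as ordinary least squares on section-level data. The data below are fabricated for illustration (only two of the five predictors are shown), so the fitted coefficients are not Drew's results; the point is the shape of the analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical section-level data: class size and anticipated grade
# as predictors of a mean engagement rating per section.
n = 200
class_size = rng.integers(5, 60, size=n).astype(float)
anticipated_grade = rng.uniform(2.0, 4.0, size=n)   # GPA scale
engagement = (6.0 - 0.02 * class_size + 0.3 * anticipated_grade
              + rng.normal(scale=0.4, size=n))

# Ordinary least squares: engagement ~ intercept + class size + grade.
X = np.column_stack([np.ones(n), class_size, anticipated_grade])
coef, *_ = np.linalg.lstsq(X, engagement, rcond=None)
intercept, b_size, b_grade = coef
print(f"class-size slope: {b_size:.3f}")  # negative: bigger class, less engagement
```

A small negative class-size slope here mirrors the "student effort and satisfaction decrease as class size increases" finding reported on the next slide.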

13 Class Size Matters
Two definitions of average:
–Mean class size: 18 (as seen by faculty)
–Mean class size weighted by enrollment: almost 25 (as seen by students)
–Why? There are more students in a large class than in a small class
Punch line: student effort, reported amount of work assigned, and satisfaction all decrease as class size increases.
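The arithmetic behind the two averages is easy to see with a toy example. The enrollments below are hypothetical, chosen so the plain mean matches Drew's 18; the enrollment-weighted mean counts each section once per enrolled student, so large sections dominate.

```python
# The "two definitions of average" on the slide, with hypothetical
# enrollments for three illustrative sections.
sizes = [8, 12, 34]

# Faculty view: average over sections.
mean_size = sum(sizes) / len(sizes)

# Student view: average over students — weighting each section
# by its own enrollment.
weighted_mean = sum(s * s for s in sizes) / sum(sizes)

print(mean_size)                  # → 18.0
print(round(weighted_mean, 2))    # → 25.26
```

The weighted mean always exceeds the plain mean whenever class sizes vary at all, which is why the student-eye view of "average class size" is larger than the faculty-eye view.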

14 Added Other Items
Some phenomenological (e.g., pacing; how often did the instructor cancel class?)
Some inspired by NSSE and the mission statement:
–Student inputs (e.g., how often missed class)
–Student outputs (e.g., how much the class contributed to ability to write clearly and effectively)

15 Sample Items Relating Mission to NSSE Items
Questions assess the extent to which this course contributes to various learning objectives. We understand that not all courses are intended to contribute to all of the objectives listed. With that in mind, please feel free to select "not at all" if that seems to be the most appropriate answer for this course.

16 "College challenges students…to develop their capacities for:"
Phrase from Mission: critical thought
NSSE-related item (11c): "This class contributed to my ability to think critically."

17 "College challenges students…to develop their capacities for:"
Phrase from Mission: effective communication
NSSE-related item (11d): "This class contributed to my ability to speak clearly and effectively."

18 "College challenges students…to develop their capacities for:"
Phrase from Mission: problem solving
NSSE-related item (11f): "This class contributed to my ability to analyze quantitative problems."

19 "College challenges students…to develop their capacities for:"
Phrase from Mission: living…in an increasingly diverse world
NSSE-related item (11l): "This class contributed to my ability to understand an increasingly diverse world."

20 "College challenges students…to develop their capacities for:"
Phrase from Mission: creativity
NSSE-related item (none?): "This class contributed to my ability to be creative."

21 Faculty Discussions
Draft circulated
Sticking points:
–How often was class cancelled?
–Length
–Order of items
–Unipolar v. bipolar Likert scales (!)

22 Vote in May 2005
Linked two items' fate:
–student missed class (min response: never)
–class cancelled (min response: never)
Provide cover sheet for instructors to explain unusual circumstances
Condone length for now; follow-up will identify redundant questions

23 Work in Fall 2005
On-line version developed
Pilot of on-line version to check technology:
–Small sample of classes
–Generate comments on each item
–Check web-based interface

24 Work in Spring 2006
Further refinement
Plan for major pilot-testing:
–1/3 of courses sampled
–Communication with faculty and students carefully designed
–Both paper (current) and on-line versions administered and linked
–Administration window planned
–Ways of increasing response rate considered, but not instituted

25 Work in Fall 2006
Pilot data: preliminary results
–Response rate: N = 2,397 (85% paper, 66% on-line; n = 730 able to be matched)
–Relationship of paper to new on-line form: comparable paper and on-line items; correlation of items measuring quality
–Relationship holds within different course levels
–Open-ended comments
–Length, redundant items, and response rate
–Considerable variability on NSSE-related items
–Factor structure

26 Reporting to Faculty and Faculty Evaluators
Should we provide several report formats?
Should we provide an overall rating of teaching quality?
What kinds of norms should we supply?
–Breakdown by class size?
Should constituents get reports on-line?
How do we protect student confidentiality?
How do we report qualitative data?

27 Should We…
Ask faculty to indicate their expectations for student responses to the mission-related and other items? Discrepancies may provide a benefit similar to NSSE–FSSE comparisons.
Develop an overall measure of quality and provide more norms?

28 Work Is Not Over
Ongoing assessment was promised
Eventually, hope to have a bank of optional items
Suggestions for use by instructor

29 Summary: NSSE's Influence
NSSE results engaged and enraged faculty
NSSE inspired the content of many new items
NSSE results suggested new kinds of questions to ask on course evaluations

30 Summary: NSSE's Influence
Embedding NSSE in course evaluations keeps NSSE prominent in faculty conversation
New course evaluations will provide information useful for course revision, not just instructor evaluation
Faculty able to derive useful information are less critical of the process

