Presentation on theme: "The Whistleblower Effect: A Quantitative Analysis Demonstrating that Reporting Students for Academic Dishonesty Negatively Impacts Faculty Evaluations."— Presentation transcript:

1 The Whistleblower Effect: A Quantitative Analysis Demonstrating that Reporting Students for Academic Dishonesty Negatively Impacts Faculty Evaluations
Mihran Aroian

2 Presentation Outline
- Student Judicial Services Overview
- Faculty-in-Residence Program
- Faculty Perceptions
- Why Conduct This Research
- Results
- Recommendations
- Questions

3 SJS Overview

4 Student Judicial Services
As part of The University of Texas at Austin's Office of the Dean of Students, Student Judicial Services (SJS) fosters moral development on campus through the resolution of academic integrity and conduct-related matters, as well as educational outreach to students, faculty, and staff. In addition to this educational component, SJS investigates alleged violations of the Institutional Rules (both academic and non-academic) and implements the disciplinary process with a focus on student learning and development.

5 Primary SJS Roles
- Outreach/education
- Faculty consultations
- Case management
  - 1,600-1,800 annual referrals (AI and conduct)
- Institutional Rules revision

6 Some Statistics
- 3,300 faculty
- 2,500 TAs
- 50,000 students
- 140 academic departments
- 2011-2013 academic years
- 320 faculty/TAs reported cases to SJS
- 1,155 AI cases over the two years
- 50% of all submissions came from (can you guess how many faculty?)
  - 15 out of 320

7 SJS Data
- Administrative disposition versus faculty disposition: 55/45
- Top 3 AI violations: cheating, plagiarism, collusion
- Classification: Freshman (10%); Sophomore (20%); Junior (20%); Senior (35%); Grad/Professional (15%)
- GPA: even distribution
- Gender (M/F): 65/35

8 Establishing a Faculty-in-Residence Program at UT Austin

9 Academic Integrity
- As Stephen Covey says, "seek first to understand, then to be understood"
- Improving AI is simple: "just as plain as the nose on your face"
- For example:
  - What is the shape and color of a stop sign?
  - What is the shape and color of a yield sign?

12 Benefits to SJS
- Dedicated outreach time not available to typical conduct staff
- Cultivation of relationships outside of the "crisis" events typical in conduct offices
- Increased exposure through meetings with stakeholders previously unidentified by SJS
- An ally outside of the office to speak with unaffiliated individuals
- Addition of a "faculty perspective" to conversations within the conduct office
- Discussions at the macro level (student conduct as a field) rather than at the micro level (a student's conduct)

13 Benefits for Me!
- Greater appreciation for SJS
- Understanding the complexity of student conduct
- Part of the team
- Ability to cross boundaries
- Support SJS with other stakeholders
- Educate campus stakeholders regarding SJS
- Involvement with student groups
- Being part of the solution
- Satisfying my own needs as an educator

14 Faculty Perceptions
- Lack of understanding among faculty as to what constitutes cheating
- Not all faculty are engaged in upholding high AI standards
- Faculty are unaware of common forms of cheating
- Faculty are unwilling to expend the effort to meet with students and file paperwork
- Faculty avoid student conflict/tension
- AI is an afterthought in the class

15 Faculty Perceptions
- Lack of a consistent message to students
- High level of indifference among many campus stakeholders
- Some faculty do not care because they are not rewarded for teaching
- Faculty need assistance identifying cheating and dealing with students who do cheat

16 The Whistleblower Effect: A Quantitative Analysis Demonstrating that Reporting Students for Academic Dishonesty Negatively Impacts Faculty Evaluations
Authors: Mihran Aroian and Raymond Brown

17 Scenario A
- Three students submit essentially the same writing assignment
- Student A provided the paper to Students B and C
- Students B and C take responsibility
- Student A believes he is innocent
- The students are upset at the instructor
- The students are found in violation
- The students are members of the same student organization, as are other students in the class

18 Scenario B
- A group presentation of six students includes significant plagiarized material
- Initially, nobody takes responsibility
- One-on-one conversations
- Students start throwing each other under the bus
- Case sent to SJS for investigation
- Investigation completed the following semester
- Two of the six found in violation

19 Question
- Will the end-of-semester evaluations for these instructors increase, decrease, or not be affected by this situation?
  - Scenario A?
  - Scenario B?

20 Why This Study?
- Interviewing stakeholders on campus as the faculty-in-residence, I found:
  - Faculty were hesitant to report students
  - Faculty want to avoid conflict/confrontations
  - Belief that reporting students would impede professional advancement
- As someone who always reports, I was shocked!
- Original hypothesis: the faculty perception was false, and the analysis would demonstrate no correlation

21 Introduction
- Examine the impact of sanctioning students for acts of academic misconduct on student evaluations of teaching (SETs)
- Student evaluations are used for faculty advancement and promotions
- Faculty may be conflicted between enforcing academic integrity and maximizing their professional advancement with high SET scores

22 Method
- A multi-level modeling design was employed to determine whether there was a whistleblower effect (a rough model sketch follows below)
- The data set comprised 8,940 end-of-semester student evaluations of 32 faculty members from 17 academic departments
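
The slides do not give the exact model specification, so the following is only a minimal sketch of a multi-level (random-intercept) analysis of this kind, with evaluations nested within instructors. The column names, the data file, and the single-predictor formula are assumptions for illustration, not the authors' actual code.

```python
# Minimal sketch of a multi-level (random-intercept) model of evaluations
# nested within instructors. Column names and the CSV file are hypothetical;
# the presentation does not show the authors' actual specification.
import pandas as pd
import statsmodels.formula.api as smf

# One row per end-of-semester evaluation:
#   overall_rating : overall instructor rating on the 5-point scale
#   reported       : 1 if cheating was reported to SJS that semester, else 0
#   instructor_id  : identifies the faculty member (grouping factor)
evals = pd.read_csv("evaluations.csv")

# Fixed effect for "reported", random intercept per instructor, so ratings
# are compared within instructors rather than across them.
model = smf.mixedlm("overall_rating ~ reported",
                    data=evals,
                    groups=evals["instructor_id"])
result = model.fit()

# The coefficient on "reported" is the estimated whistleblower effect.
print(result.summary())
```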

23 Literature Search
- Extensive qualitative articles
- Lack of quantitative analysis
- The only quantitative studies focused on grade inflation and higher SETs
- GPAs have been increasing while SAT scores have been decreasing

24 Genesis of Research Design
- Initial evaluation: compare faculty against peer instructors
- Compare faculty against all faculty within the department
- Compare faculty against all faculty within the college
- Final analysis: compare faculty against themselves

25 Creation of Sample
- Compare faculty against themselves (see the pairing sketch after this list):
  - Analyze evaluations for a course in a semester when they reported cheating
  - Analyze evaluations for the same course in a semester when they did not report cheating
- The evaluations must be for the same class
- Looked for the closest semester in temporal proximity; summers excluded
- Instructors must have reported 3 or more cases
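
As a rough illustration of the pairing rules above (not the authors' code), the sketch below matches each reported offering of a course with the closest non-reported offering of the same course by the same instructor. The data frame columns and the term_index ordering helper are assumptions.

```python
# Rough sketch of the within-instructor pairing described above. Column names
# (instructor_id, course_id, reported, term_index) are hypothetical; summers
# are assumed to be excluded upstream.
import pandas as pd

def pair_semesters(offerings: pd.DataFrame) -> pd.DataFrame:
    """One row per (instructor, course, semester) offering. For each offering
    in which cheating was reported, find the closest-in-time offering of the
    same course by the same instructor in which it was not reported."""
    pairs = []
    for (instr, course), grp in offerings.groupby(["instructor_id", "course_id"]):
        reported = grp[grp["reported"] == 1]
        clean = grp[grp["reported"] == 0]
        # Instructors who reported cheating every semester have no comparison
        # offering and drop out here (see the next slide).
        if reported.empty or clean.empty:
            continue
        for _, row in reported.iterrows():
            # Closest semester in temporal proximity.
            nearest = clean.iloc[(clean["term_index"] - row["term_index"]).abs().argmin()]
            pairs.append({"instructor_id": instr,
                          "course_id": course,
                          "reported_term": row["term_index"],
                          "comparison_term": nearest["term_index"]})
    return pd.DataFrame(pairs)
```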

26 Challenges to Methodology
- The highest-reporting instructors reported cheating every semester and therefore could not be included
- Cases had to be submitted to SJS more than 10 days before the end of the semester
- Course Instructor Surveys are given up to 10 days before the end of the semester

27 The Course Instructor Survey
- Five-point scale ranging from "strongly disagree" to "strongly agree"
- Questions asked on the survey:
  - The course was well organized
  - The instructor communicated information effectively
  - The instructor showed interest in the progress of the student
  - The tests and assignments were usually graded and returned promptly
  - The instructor made me feel free to ask questions, disagree, and express my ideas

28 Survey
- Overall ratings for the instructor and the course range from "very unsatisfactory" to "excellent" on a 5-point scale:
  - Overall, this instructor was...
  - Overall, this course was...
- The key criterion for performance evaluation is the overall instructor rating

29 Limitations of Analysis
- Results are based on data collected at one university
- Unable to link evaluations to individual students
- Unable to test the highest-reporting faculty

30 Results
- Statistically significant effect of reporting students (p < .001)
- Estimated mean rating for classes in which students were not reported: 3.95
- Estimated mean rating for classes in which students were reported: 3.75 (see the short note after this slide)
- Hypothesis: instructors who report students tend to be more demanding of their students in general than those who do not report
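
To tie these figures back to the model sketched under the Method slide: in a random-intercept model with a single reported indicator, the intercept roughly corresponds to the non-reported mean and the coefficient to the gap between the two. The snippet below only restates the slide's numbers; it is illustrative, not the authors' analysis.

```python
# Illustrative only: the whistleblower effect implied by the slide's estimated
# means (it corresponds to the "reported" coefficient in a model like the one
# sketched under "Method").
mean_not_reported = 3.95  # estimated mean rating, no cheating reported (from slide)
mean_reported = 3.75      # estimated mean rating, cheating reported (from slide)

effect = mean_reported - mean_not_reported
print(f"Estimated effect: {effect:+.2f} points on the 5-point overall rating")
# -> Estimated effect: -0.20 points on the 5-point overall rating
```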

31 Discussion
- Is it time for a policy review?
- Instructors are expected to enforce AI
- Colleges want to instill ethics that last beyond college
- By not enforcing, instructors are implicitly condoning cheating and undermining accurate student assessment
- Evaluations play a critical role in advancement for both tenure-track and non-tenure-track faculty
- Do you harm yourself when you enforce AI?

32 Student Responsibility
- UT participates in the National Assessment of Student Conduct Adjudication Process (NASCAP) Project
- 21 higher-education institutions participate
- Upon completion of adjudication, students complete a survey
- When asked "What was the outcome of your case?", 6.9% of UT students answered "my case was dismissed"
- In reality, only 0.15% of the cases were dismissed

33 Recommendations
- Encourage all faculty to communicate expectations of AI
- Provide regular communication to faculty regarding:
  - How students cheat
  - How to deal with cheaters
  - How to confront students
  - Best practices
  - How to reduce cheating
- Require students to complete an online AI tutorial each year

34 Recommendations
- Encourage faculty to increase the frequency of the AI message
- Each academic department should have an AI point person
- Faculty should regularly update assignments and exams

35 Recommendations
- Include a question about faculty AI expectations and practices on evaluations
- AI should be part of new faculty orientation
- Administrators should give consideration to faculty who report cheating in merit reviews and promotions
