1 IDEA Student Ratings of Instruction Update
Carrie Ahern and Lynette Molstad
Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center (www.idea.ksu.edu)

2 Presentation
- Process at DSU for online IDEA surveys
- Review of the IDEA Student Ratings of Instruction system
  - Forms
  - Reports
- Questions

3 Process for IDEA Surveys
- Faculty receive an e-mail for each course with a link to the FIF (new copy feature)
- Faculty receive a unique URL for each course and must provide it to students
- Faculty receive status updates on how many students have completed the survey
- Questions

4 IDEA as a Diagnostic to Guide Improvement and as a Tool to Evaluate Teaching Effectiveness

5 IDEA Student Ratings of Instruction: The Student Learning Model

6 Student Learning Model
- Types of learning must reflect the instructor's purpose
- Effectiveness is determined by student progress on the objectives stressed by the instructor

7 IDEA Student Ratings of Instruction: Overview
- Faculty Information Form
- Student Survey: Diagnostic Form

8 IDEA: FIF (Faculty Information Form)

9 Faculty Information Form
- Some thoughts on selecting objectives: http://www.theideacenter.org/SelectingObjectives
- Video for faculty on completing the FIF: http://www.theideacenter.org/FIFVideo

10 Faculty Information Form
- One FIF per class being evaluated
- Course information
  - IDEA department codes (extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html)
- 12 learning objectives
- Course description items (optional)
- Best answered toward the end of the semester

11 FIF: Selecting Objectives
- Select 3-5 objectives as “Essential” or “Important”:
  - Is it a significant part of the course?
  - Do you do something specific to help students accomplish the objective?
  - Does the student's progress on the objective influence his or her grade?
- In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3).

12 Best Practices
- Multi-section courses
- Curriculum committee review
- Prerequisite-subsequent courses
- Discuss the meaning of objectives with students
- Incorporate objectives into the course syllabus

13 New feature (as of 2/2010)
- Copy FIF objectives from one course to another
- Previous FIFs are available in a drop-down menu (linked by faculty e-mail address)

15 Student Survey: Diagnostic Form
http://theideacenter.org/sites/default/files/Student_Ratings_Diagnostic_Form.pdf

16 Student Survey: Diagnostic Form
- Teaching Methods: items 1-20
- Learning Objectives: items 21-32
- Student and Course
  - Student Characteristics: items 36-39 and 43
  - Course Management/Content: items 33-35
- Global Summary: items 40-42
- Experimental Items: items 44-47
- Extra Questions: items 48-67
- Comments
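For anyone scripting against exported Diagnostic Form responses, the item groupings above can be captured in a small lookup table. A minimal Python sketch; the section key names are informal shorthand, not official IDEA labels:

    # Item groupings on the IDEA Diagnostic Form, as listed on this slide.
    # Section names are informal shorthand, not official IDEA labels.
    DIAGNOSTIC_FORM_SECTIONS = {
        "teaching_methods": list(range(1, 21)),            # items 1-20
        "learning_objectives": list(range(21, 33)),        # items 21-32
        "course_management_content": list(range(33, 36)),  # items 33-35
        "student_characteristics": list(range(36, 40)) + [43],
        "global_summary": list(range(40, 43)),             # items 40-42
        "experimental_items": list(range(44, 48)),         # items 44-47
        "extra_questions": list(range(48, 68)),            # items 48-67
    }

    def section_of(item):
        """Return the section a Diagnostic Form item number belongs to."""
        for name, items in DIAGNOSTIC_FORM_SECTIONS.items():
            if item in items:
                return name
        raise ValueError("item %d is not on the Diagnostic Form" % item)

For example, section_of(22) returns "learning_objectives".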

17 False Assumptions
- Effective instructors effectively employ all 20 teaching methods.
- The 20 teaching-methods items are used to make an overall judgment about teaching effectiveness.
- Students should make significant progress on all 12 learning objectives.

18 Resources: Administering IDEA
www.idea.ksu.edu → Client Resources → IDEA Resources
- Best practices
- Directions to Faculty
- Using Additional Questions
- Some Thoughts on Selecting IDEA Objectives
- Disciplinary Selection of Learning Objectives
- Guide to Administering IDEA
- Team Teaching
All resources are on our website.

19 Report Background: Comparison Groups and Converted Scores

20 The Report: Comparative Information
- Comparison groups:
  - IDEA
  - Discipline
  - Institution

21 Comparison Groups (Norms): IDEA Comparisons
- Diagnostic Form
- Exclude first-time institutions
- Exclude classes with fewer than 10 students
- No one institution comprises more than 5% of the database
- 128 institutions; 44,455 classes
- Updated only periodically
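To make the exclusion rules concrete, here is a minimal sketch of how a norming pool like this could be assembled. It is illustrative only, not IDEA's actual pipeline, and the record field names (institution, n_students, first_time) are hypothetical:

    from collections import Counter

    def build_norm_pool(classes, min_class_size=10, max_institution_share=0.05):
        # Drop first-time institutions and classes with fewer than 10 students.
        pool = [c for c in classes
                if not c["first_time"] and c["n_students"] >= min_class_size]
        # Cap any single institution at 5% of the pool by keeping at most
        # `cap` of its classes (one simple way to enforce the 5% rule).
        cap = max(1, int(len(pool) * max_institution_share))
        kept, per_institution = [], Counter()
        for c in pool:
            if per_institution[c["institution"]] < cap:
                kept.append(c)
                per_institution[c["institution"]] += 1
        return kept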

22 Comparison Groups (Norms): Discipline Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (approximately July 1-June 30)
- Exclusions same as IDEA Comparisons; also exclude classes with no objectives selected
- Minimum of 400 classes

23 Comparison Groups (Norms): Institutional Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (approximately July 1-June 30)
- Includes Short and Diagnostic Forms
- Exclude classes with no objectives selected
- Minimum of 400 classes

24 Norms: Converted Averages
- A method of standardizing scores that have different averages and standard deviations, so scores can be compared on the same scale
- Uses T scores: average = 50, standard deviation = 10
- T scores are not percentiles
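The T-score conversion itself is the standard statistical formula T = 50 + 10 * (raw - mean) / SD, where the mean and SD come from the chosen comparison group. A minimal sketch:

    def t_score(raw, comparison_mean, comparison_sd):
        """Convert a raw average to a T score (mean 50, SD 10)."""
        return 50 + 10 * (raw - comparison_mean) / comparison_sd

    # Example: a raw rating of 4.2 against a comparison group with mean 4.0
    # and SD 0.4 converts to 55, i.e. half a standard deviation above average.
    print(t_score(4.2, 4.0, 0.4))  # 55.0

Note that a T score of 55 is not the 55th percentile; percentile equivalents depend on the shape of the underlying distribution.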

25 Report Background: Adjusted Scores

26 Adjusted Scores
- Control for factors beyond the instructor's control
- Computed with regression equations
- Video clip explaining adjusted scores: http://theideacenter.org/taxonomy/term/109

27 Adjusted Scores: Diagnostic Form
- Student work habits (item 43)
- Student motivation (item 39)
- Class size (enrollment, from the FIF)
- Student effort (multiple items)
- Course difficulty (multiple items)
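The slides do not publish IDEA's regression equations, but the general mechanism is standard: predict how far a class's ratings would deviate from average given the extraneous factors, then subtract that predicted deviation from the raw score. A sketch under that assumption, with coefficients and factor means invented purely for illustration:

    def adjusted_score(raw, work_habits, motivation, class_size, effort, difficulty):
        # Coefficients and factor means below are INVENTED for illustration;
        # IDEA's actual regression equations are not given in these slides.
        coefs = {"work_habits": 0.30, "motivation": 0.40, "class_size": -0.002,
                 "effort": 0.15, "difficulty": -0.10}
        means = {"work_habits": 3.5, "motivation": 3.5, "class_size": 25,
                 "effort": 3.5, "difficulty": 3.3}
        factors = {"work_habits": work_habits, "motivation": motivation,
                   "class_size": class_size, "effort": effort,
                   "difficulty": difficulty}
        # Predicted deviation from average attributable to extraneous factors.
        predicted = sum(coefs[k] * (factors[k] - means[k]) for k in coefs)
        return raw - predicted

Under this scheme, a class full of unusually motivated students has part of its raw rating attributed to motivation rather than to the instructor, so its adjusted score comes out lower than its raw score, and vice versa.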

28 IDEA... The Report

29 The IDEA Report: Diagnostic Form Report
- What were students' perceptions of the course and their learning?
- What might I do to improve my teaching?

30 Questions Addressed: Page 1
- What was the response rate, and how reliable is the information contained in the report?
- What overall estimates of my teaching effectiveness were made by students?
- What is the effect of “adjusting” these measures to take into consideration factors I can't control?
- How do my scores compare to the available comparison groups?

31 Summary Evaluation of Teaching Effectiveness

32 Questions Addressed: Page 2
- How much progress did students report on the learning objectives that I identified as “Essential”? How does this progress compare to the available comparison groups?
- How much progress did students report on the “Important” objectives? How does this progress compare to the available comparison groups?
- Do conclusions change if “adjusted” rather than “raw” ratings are used?

33 Progress on Specific Objectives (chart of sample progress ratings)

34 Questions Addressed: Page 3
- Which of the 20 teaching methods are most related to my learning objectives?
- How did students rate my use of these important methods?
- What changes should I consider in my teaching methods?
- Do these results suggest some general areas where improvement efforts should focus?

35 Improving Teaching Effectiveness

36 Improving Teaching Effectiveness
- IDEA website: http://theideacenter.org/
- IDEA Papers: http://www.theideacenter.org/category/helpful-resources/knowledge-base/idea-papers

37 Questions Addressed: Page 2
- How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
- How distinctive is this class with regard to student self-ratings?

38 Description of Course and Students

39 Questions Addressed: Page 4
- What was the average rating on each of the questions on the IDEA form?
- How much variation was there in these ratings?
- Are the distributions of responses relatively “normal” (bell-shaped), or is there evidence of distinctive subgroups of students?
- What are the results for the additional questions I used?

40 Statistical Detail

41 Statistical Detail

42 Questions & Discussion

