IDEA Student Ratings of Instruction Update
Carrie Ahern and Lynette Molstad
Selected slides reproduced with permission of Dr. Amy Gross from The IDEA Center (www.idea.ksu.edu)
Presentation (5/3/2015)
- Process at DSU for online IDEA surveys
- Review of the IDEA Student Ratings of Instruction system
- Forms
- Reports
- Questions
Process for IDEA Surveys
- Faculty receive an e-mail for each course with a link to the FIF (new copy feature)
- Faculty receive a unique URL for each course and must provide it to students
- Faculty receive status updates on how many students have completed the survey
- Questions
IDEA as a Diagnostic to Guide Improvement and as a Tool to Evaluate Teaching Effectiveness
IDEA Student Ratings of Instruction The Student Learning Model
Student Learning Model
- Types of learning must reflect the instructor's purpose
- Effectiveness is determined by student progress on the objectives stressed by the instructor
IDEA Student Ratings of Instruction: Overview
- Faculty Information Form
- Student Survey: Diagnostic Form
Faculty Information Form
- Some thoughts on selecting objectives: http://www.theideacenter.org/SelectingObjectives
- Video for faculty on completing the FIF: http://www.theideacenter.org/FIFVideo
Faculty Information Form
- One FIF per class being evaluated
- Course information: IDEA department codes (extended list: http://www.idea.ksu.edu/StudentRatings/deptcodes.html)
- 12 learning objectives
- Course description items (optional)
- Best answered toward the end of the semester
FIF: Selecting Objectives
- Select 3-5 objectives as "Essential" or "Important"
- Is it a significant part of the course?
- Do you do something specific to help students accomplish the objective?
- Does the student's progress on the objective influence his or her grade?
- In general, progress ratings are negatively related to the number of objectives chosen (Research Note 3)
Best Practices
- Multi-section courses
- Curriculum committee review
- Prerequisite and subsequent courses
- Discuss the meaning of objectives with students
- Incorporate objectives into the course syllabus
New Feature (as of 2/2010)
- Copy FIF objectives from one course to another
- Previous FIFs are available in a drop-down menu (linked by faculty e-mail address)
Student Survey: Diagnostic Form
http://theideacenter.org/sites/default/files/Student_Ratings_Diagnostic_Form.pdf
Student Survey: Diagnostic Form
- Teaching methods: items 1-20
- Learning objectives: items 21-32
- Student and course characteristics: items 36-39, 43
- Course management/content: items 33-35
- Global summary: items 40-42
- Experimental items: items 44-47
- Extra questions: items 48-67
- Comments
False Assumptions
- Effective instructors effectively employ all 20 teaching methods.
- The 20 teaching-method items are used to make an overall judgment about teaching effectiveness.
- Students should make significant progress on all 12 learning objectives.
Resources: Administering IDEA (www.idea.ksu.edu, under Client Resources / IDEA Resources)
- Best practices
- Directions to faculty
- Using additional questions
- Some thoughts on selecting IDEA objectives
- Disciplinary selection of learning objectives
- Guide to administering IDEA
- Team teaching
All resources are available on our website.
Report Background
- Comparison groups
- Converted scores
The Report: Comparative Information
Comparison groups:
- IDEA
- Discipline
- Institution
Comparison Groups (Norms): IDEA Comparisons
- Diagnostic Form only
- Excludes first-time institutions
- Excludes classes with fewer than 10 students
- No single institution comprises more than 5% of the database
- 128 institutions; 44,455 classes
- Updated only periodically
Comparison Groups (Norms): Discipline Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (approximately July 1-June 30)
- Same exclusions as the IDEA comparisons
- Also excludes classes with no objectives selected
- Minimum of 400 classes
Comparison Groups (Norms): Institutional Comparisons
- Updated annually (September 1)
- Most recent 5 years of data (approximately July 1-June 30)
- Includes both Short and Diagnostic Forms
- Excludes classes with no objectives selected
- Minimum of 400 classes
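The exclusion rules above amount to a simple filter over class records. A minimal sketch follows; the record structure and field names are hypothetical illustrations, not the IDEA Center's actual data format.

```python
# Sketch of the norm-group exclusion rules described above.
# The dict structure and field names ("n_students", "objectives_selected")
# are invented for illustration only.

def eligible_for_norms(cls):
    """Apply the stated exclusions: at least 10 students enrolled,
    and at least one learning objective selected on the FIF."""
    return cls["n_students"] >= 10 and len(cls["objectives_selected"]) > 0

classes = [
    {"n_students": 25, "objectives_selected": ["Obj 1", "Obj 4"]},
    {"n_students": 8,  "objectives_selected": ["Obj 2"]},   # excluded: fewer than 10 students
    {"n_students": 30, "objectives_selected": []},          # excluded: no objectives chosen
]

norm_pool = [c for c in classes if eligible_for_norms(c)]
print(len(norm_pool))  # 1
```

The remaining rule (no single institution above 5% of the database) would be applied at the institution level rather than per class, so it is omitted here.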
Norms: Converted Averages
- A method of standardizing scores that have different averages and standard deviations
- Allows scores to be compared on the same scale
- Uses T scores: average = 50, standard deviation = 10
- T scores are not percentiles
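The T-score conversion above is the standard linear transformation T = 50 + 10 × (raw − mean) / SD. A minimal sketch, with made-up norm values for illustration:

```python
# T-score conversion described above: rescale so the norm group has
# average 50 and standard deviation 10. The norm mean and SD below
# are hypothetical illustration values, not actual IDEA norms.

def t_score(raw, norm_mean, norm_sd):
    """Convert a raw average to a T score (mean 50, SD 10)."""
    return 50 + 10 * (raw - norm_mean) / norm_sd

# A class average of 4.2 against a hypothetical norm mean of 4.0 (SD 0.5)
# sits 0.4 standard deviations above the norm:
print(round(t_score(4.2, 4.0, 0.5), 1))  # 54.0
```

Because the scale is standardized, a T score of 54 means the same relative standing regardless of which comparison group (IDEA, discipline, or institution) supplied the mean and SD.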
The IDEA Report: Diagnostic Form Report
- What were students' perceptions of the course and their learning?
- What might I do to improve my teaching?
Questions Addressed: Page 1
- What was the response rate, and how reliable is the information contained in the report?
- What overall estimates of my teaching effectiveness were made by students?
- What is the effect of "adjusting" these measures to take into account factors I can't control?
- How do my scores compare to those of the comparison groups?
Summary Evaluation of Teaching Effectiveness
Questions Addressed: Page 2
- How much progress did students report on the learning objectives I identified as "Essential"? How does this progress compare to the available comparison groups?
- How much progress did students report on the "Important" objectives? How does this progress compare to the available comparison groups?
- Do conclusions change if "adjusted" rather than "raw" ratings are used?
Progress on Specific Objectives (chart of raw and adjusted progress ratings; chart values not recoverable from the transcript)
Questions Addressed: Page 3
- Which of the 20 teaching methods are most related to my learning objectives?
- How did students rate my use of these important methods?
- What changes should I consider in my teaching methods?
- Do these results suggest general areas where improvement efforts should focus?
Improving Teaching Effectiveness
- IDEA website: http://theideacenter.org/
- IDEA Papers: http://www.theideacenter.org/category/helpful-resources/knowledge-base/idea-papers
Questions Addressed: Page 2
- How distinctive is this class with regard to the amount of reading, the amount of other (non-reading) work, and the difficulty of the subject matter?
- How distinctive is this class with regard to student self-ratings?
Questions Addressed: Page 4
- What was the average rating on each of the questions on the IDEA form? How much variation was there in these ratings?
- Are the distributions of responses relatively "normal" (bell-shaped), or is there evidence of distinctive subgroups of students?
- What are the results for the additional questions I used?