Using Student Ratings to Improve Program Quality and Student Learning Dr. Randi Hagen, Flagler College Dr. Sarah Logan, Angelo State University Dr. William Pallett, The IDEA Center Dr. Barry Stein, Tennessee Tech University
Introduction: IDEA Student Ratings
Presenters and Content
– Bill Pallett: IDEA & Program Assessment
– Randi Hagen: Student Learning Outcomes
– Bill Pallett: QEP at Tennessee Tech
– Sarah Logan: Involving the Campus
Feel free to ask questions during the presentations. At the end, we want to hear your campus stories.
IDEA and Program Assessment: Aggregating Data
Assessment Questions: IDEA as Supporting Evidence
– Are course emphases consistent with stated curricular purposes?
– Do courses’ overall student-progress ratings compare favorably to courses at other institutions?
– When a learning objective is selected as Essential or Important, does students’ self-reported learning meet our expectations?
Assessment Questions
– What teaching methods might we employ more effectively to support student learning?
– How do students’ work habits, motivation, etc., compare to those of students at other institutions?
– How do students view course work demands?
– What factors do instructors report as having a positive or negative influence on student learning?
Longitudinal Questions
– Do results change over time in the desired direction?
– Does IDEA provide supporting evidence that innovations and interventions have been successful?
Student Learning Outcomes What should a student know and be able to do as a result of taking this course at Flagler College?
IDEA Summary of Your Teaching Effectiveness
Instructor’s Progress on Specific Objectives
Areas to Improve Your Teaching Effectiveness
Getting Started: Improving Your Teaching Effectiveness
– Design your college’s professional development program around the identified weaknesses of your faculty
– Individual use: POD-IDEA Center Notes
– Individual or group use: IDEA Papers; IDEA Seminars
Use of Data File
Columns: Instructor | Discipline | Class # | Score
Instructors by discipline: A, C, E, C, E, G (ACC); H, H, K (ANT); M, N, O (ART)
Progress on Relevant Objectives: progress on those objectives selected by the instructor as Important or Essential to this class.
– 1 = No apparent progress
– 2 = Slight progress
– 3 = Moderate progress
– 4 = Substantial progress
– 5 = Exceptional progress
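A data file with these columns can be aggregated by discipline to support the program-level questions above. The following minimal Python sketch assumes hypothetical rows and score values (the slide does not reproduce them, and IDEA’s actual file layout may differ); it computes the mean Progress on Relevant Objectives score per discipline:

```python
from collections import defaultdict

# Hypothetical rows mirroring the slide's columns:
# (instructor, discipline, class number, score on the 1-5 progress scale).
# All values below are illustrative, not real IDEA data.
rows = [
    ("A", "ACC", 101, 3.8),
    ("C", "ACC", 102, 4.1),
    ("E", "ACC", 201, 3.5),
    ("H", "ANT", 110, 4.4),
    ("K", "ANT", 210, 3.9),
    ("M", "ART", 120, 4.0),
]

# Group scores by discipline, then average each group.
by_discipline = defaultdict(list)
for instructor, discipline, class_no, score in rows:
    by_discipline[discipline].append(score)

means = {d: sum(scores) / len(scores) for d, scores in by_discipline.items()}
for discipline, mean in sorted(means.items()):
    print(f"{discipline}: {mean:.2f}")
```

The same grouping could be done per instructor or per course level; the point is that discipline- or program-level means, not individual course scores, drive the assessment questions.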
Quality Enhancement Plan (QEP) QEP and SACS Accreditation –IDEA –NSSE
NSSE and IDEA Relationship
NSSE Benchmarks: Level of Challenge; Active/Collaborative Learning; Student/Faculty Interaction
IDEA Objectives (use: Individual Report; Group Summary, pp. 5-6)
– 3. Apply course material
– 4. Professional point of view
– 11. Analysis/critical evaluation
– 5. Work with others as a team
IDEA Items (use: Individual Report; Group Summary, pp. 7-8)
– 8. Stimulate effort beyond most classes
– 15. Inspired to set own challenging goals
– 5. Teams/discussion groups
– 14. “Hands-on” projects
– 16. Share ideas with others
– 18. Help each other understand
Using IDEA Results for the QEP at Tennessee Tech University
– Measuring Progress
– Selecting a QEP Topic
– Assessment Plan for the QEP
– Identifying Problem Areas
IDEA Teaching Evaluation Instrument
– Frequency of Goals Selected
– Progress on Goals
Frequency of IDEA Objectives Selected
Progress on IDEA Teaching Objectives
Assessment Plan Options - IDEA
Sample | Compare
– University-Wide Use | Frequency & Student Progress over time
– Targeted Disciplines | Frequency & Student Progress over time
– Targeted Courses | Old vs. New Course (Frequency & Progress)
Cost
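The “over time” comparisons in the table above can be sketched concretely. This hedged Python example uses hypothetical records (field names, terms, and ratings are illustrative, not IDEA’s actual export format) to track, per term and discipline, how often an objective was selected and the mean student-progress rating:

```python
from collections import defaultdict

# Hypothetical records: (term, discipline, objective_selected, progress_rating).
# All values are made up for illustration.
records = [
    ("Fall 2004", "ACC", True, 3.6),
    ("Fall 2004", "ACC", False, 3.2),
    ("Fall 2005", "ACC", True, 4.0),
    ("Fall 2005", "ACC", True, 4.2),
]

# Accumulate counts and sums per (term, discipline) cell.
summary = defaultdict(lambda: {"n": 0, "selected": 0, "progress_sum": 0.0})
for term, discipline, selected, rating in records:
    cell = summary[(term, discipline)]
    cell["n"] += 1
    cell["selected"] += int(selected)
    cell["progress_sum"] += rating

for (term, disc), c in sorted(summary.items()):
    freq = c["selected"] / c["n"]           # how often the objective was chosen
    mean_prog = c["progress_sum"] / c["n"]  # mean student-progress rating
    print(f"{term} {disc}: selection frequency {freq:.0%}, mean progress {mean_prog:.2f}")
```

Running the same summary for successive terms is what the “Frequency & Student Progress over time” rows describe: a rising selection frequency and mean progress would be evidence in the desired direction.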
IDEA Works with Other Assessments Enrolled Student Surveys (NSSE) Alumni Surveys Employer Surveys Performance Measures*
Involving the Campus: Angelo State University
Diagnostic form features
– Improvement of teaching
– Programs add items of interest to satisfy accrediting agencies and answer internal questions
Group reports
– External comparisons at a point in time
– Internal comparisons across time
Using Group Reports Campus-wide
– Provost reviews the university and college reports and compares results from multiple years
– Deans review college and department results
University, college, and department results for “Excellent Teacher,” “Excellent Course,” and “Progress on Objectives” are kept on a network drive for everyone to use.
Academic Departments Meet to compare current department ratings to –External benchmark –Former department ratings –Current college ratings Questions –In what ways are comparisons meaningful? –With what level of ratings are we satisfied? –What are reasons for lower than anticipated ratings? –In what areas do we want to improve?
Academic Programs Faculty teaching the same course discuss choice of objectives (and list them on syllabi). Faculty with highly structured curricula outline objectives for each course level. Questions –What objectives are appropriate for certain courses? –In what ways do objectives differ for upper- vs. lower-level courses so that students receive a well-rounded educational experience?
Summary: IDEA Student Ratings
Assessment of
– Students’ perceived learning on course goals
– Efficacy of instructors’ teaching methods
Provide evidence for
– Review and improvement of the core curriculum and academic programs
– Accreditation and other reporting
– Faculty development