Presentation on theme: "Understanding & Using Your Online IDEA Results" - Presentation transcript:

1 UNDERSTANDING & USING YOUR ONLINE IDEA RESULTS. Developed by Mary Beth Furst, Associate Professor, BUCO Division, and Amy Chase Martin, Director of Faculty Development and Instructional Media.

2 Individual Development and Educational Assessment

3 STUDENT LEARNING MODEL. Specific teaching behaviors influence certain types of student progress under certain circumstances. (Diagram: Specific Teaching Behaviors -> Student Progress, under Certain Circumstances.)

4 HOW IS THE INFORMATION CAPTURED?
Faculty Information Form (FIF): relevance of the 12 learning objectives; describes course conditions.
Student Survey:
- Diagnostic Form: overall teaching effectiveness and course improvement strategies.
- Short Form: overall teaching effectiveness only.

5 13 Learning Objectives, grouped into 6 categories.
Importance rating, determined by the division: Essential (E), Important (I), or Minor/No importance (M).

6 HOW IS THE INFORMATION CAPTURED?
Faculty Information Form (FIF): relevance of the 13 learning objectives; describes course conditions.
Student Survey:
- Diagnostic Form: overall teaching effectiveness and course improvement strategies.
- Short Form: overall teaching effectiveness only.

7 Teaching methods/behaviors: used only for diagnostic purposes; instructors are not required to employ all methods; not used to make a summary evaluation of teaching effectiveness; omitted from the Short Form.
Student progress on learning objectives: identical to the objectives on the FIF; also included on the Short Form.
Course characteristics: useful for identifying characteristics of specific disciplines.

8 SUMMARY OF IDEA MEASURES
Specific Teaching Behaviors: Diagnostic Form, teaching methods (items 1-20).
Student Progress: Diagnostic Form, progress on 12 learning objectives (items 21-32); Faculty Information Form, rating of learning objectives (Minor weighted 0, Important weighted 1, Essential weighted 2).
Certain Circumstances: student motivation (DF items 36, 38, 39); student effort (DF item 37); course difficulty (DF items 33-35); student work habits (DF item 43); class size (FIF number enrolled).
Summary measures of effectiveness: DF items 40-42.
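
For working with these measures programmatically, the table can be captured as a small lookup structure. This is a minimal Python sketch; the key names and layout are my own, and only the item numbers and weights come from the slide:

# Mapping of the IDEA student-learning model to the survey items listed on this slide.
# The structure and key names are illustrative; they are not part of the IDEA forms.
IDEA_MEASURES = {
    "specific_teaching_behaviors": {
        "Diagnostic Form": "teaching methods (items 1-20)",
    },
    "student_progress": {
        "Diagnostic Form": "progress on learning objectives (items 21-32)",
        # FIF importance ratings become weights
        "Faculty Information Form": {"Minor": 0, "Important": 1, "Essential": 2},
    },
    "certain_circumstances": {
        "student_motivation": "DF items 36, 38, 39",
        "student_effort": "DF item 37",
        "course_difficulty": "DF items 33-35",
        "student_work_habits": "DF item 43",
        "class_size": "FIF number enrolled",
    },
    "summary_measures_of_effectiveness": "DF items 40-42",
}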

9 Summative Tab

10 Formative Tab

11 Quantitative Tab

12 Qualitative Tab. Comments: What did you like best about this course? What did you like least? What would you change?

13 Segment Comparison Tab (new). Available segments: No Segment, Howard Community College, Division, and All Sections in Course.

14 RAW AND ADJUSTED SCORES

15 Raw score: the actual rating. Adjusted score: the raw score adjusted for "extraneous factors."
Major factors: student motivation to take the course and student work habits.
Minor factors: student effort, difficulty of subject matter, and class size.
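
The slide does not show how the adjustment is computed; IDEA's actual adjustment is a regression model fitted to its national database. The Python sketch below only illustrates the idea, with made-up weights, assuming each factor is reported on the same 1-5 scale (class size, which is not on that scale, is left out):

# Illustrative sketch only: the real IDEA adjustment is a regression whose coefficients
# are not given on the slide. All weights below are hypothetical.
ASSUMED_WEIGHTS = {
    "student_motivation": 0.30,   # major factor (hypothetical weight)
    "student_work_habits": 0.20,  # major factor (hypothetical weight)
    "student_effort": 0.10,       # minor factor (hypothetical weight)
    "course_difficulty": 0.05,    # minor factor (hypothetical weight)
    # class size is also a minor factor, omitted here because it is not on the 1-5 scale
}

def adjusted_score(raw, class_factors, norm_factors, weights=ASSUMED_WEIGHTS):
    """Remove the estimated effect of factors outside the instructor's control,
    measured as each factor's deviation from the comparison-group norm."""
    correction = sum(weights[f] * (class_factors[f] - norm_factors[f]) for f in weights)
    return raw - correction

# A class with unusually motivated, hard-working students is adjusted downward slightly.
class_factors = {"student_motivation": 4.5, "student_work_habits": 4.0,
                 "student_effort": 3.8, "course_difficulty": 3.3}
norm_factors = {"student_motivation": 3.9, "student_work_habits": 3.7,
                "student_effort": 3.7, "course_difficulty": 3.3}
print(round(adjusted_score(4.2, class_factors, norm_factors), 2))  # roughly 3.95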

16 QUESTIONS ADDRESSED BY THE IDEA REPORT
Q1. Overall, how effectively was this class taught?
Q2. How does this compare with the ratings of other teachers?
Q3. Were you more successful in facilitating progress on some class objectives than on others?
Q4. How can instruction be made more effective?
Q5. Do some salient characteristics of this class and its students have implications for instruction?

17 QUESTION 1 Overall, how effectively was this class taught?

18 PRO (Progress on Relevant Objectives) is the overall index of teaching effectiveness and the single best estimate of it.
Two additional overall measures of teaching effectiveness are rated from 1 (definitely false) to 5 (definitely true):
- "Overall, I rate this instructor as an excellent teacher."
- "Overall, I rate this course as excellent."
The average of these two items is regarded as equal in value to PRO, so the summary evaluation is the average of that pair with PRO.

19 OBTAINING THE AVERAGE "PRO" SCORE
Essential objectives are weighted double; minor objectives are ignored.
Example: PRO = (4.1*2 + 4.0*2 + 3.8 + 3.9) / 6 ≈ 3.98
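
The same calculation in Python, using the per-objective averages and importance ratings from the slide's example (function and variable names are mine):

# PRO weighting per the slide: Essential objectives count twice, Important once, Minor not at all.
WEIGHTS = {"Essential": 2, "Important": 1, "Minor": 0}

def progress_on_relevant_objectives(objective_ratings):
    """objective_ratings: list of (average progress rating, importance) pairs."""
    weighted_sum = sum(rating * WEIGHTS[imp] for rating, imp in objective_ratings)
    total_weight = sum(WEIGHTS[imp] for _, imp in objective_ratings)
    return weighted_sum / total_weight

# The slide's example: two Essential objectives (4.1, 4.0) and two Important ones (3.8, 3.9).
pro = progress_on_relevant_objectives(
    [(4.1, "Essential"), (4.0, "Essential"), (3.8, "Important"), (3.9, "Important")])
print(round(pro, 2))  # (4.1*2 + 4.0*2 + 3.8 + 3.9) / 6, about 3.98

# Summary evaluation (previous slide): average PRO with the mean of the two overall items;
# the 4.2 and 4.0 ratings here are made-up example values.
summary = (pro + (4.2 + 4.0) / 2) / 2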

20 QUESTION 2 How do your ratings compare with the ratings of other teachers?

21 CONVERTED AVERAGES
Converted averages are preferred when making comparisons among faculty members or classes. They take into account the fact that average progress ratings are much higher for some objectives than for others, which assures faculty members that they will not be penalized for selecting objectives that are especially difficult.
Compare with: all classes in the IDEA database, all classes at your institution, or all classes in your course.
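
The slide does not spell out the conversion formula. The sketch below assumes a standardized conversion against the chosen comparison group's mean and standard deviation; the mean-50 / SD-10 scaling is an assumption for illustration, not taken from the slide:

def converted_average(raw_average, comparison_mean, comparison_sd):
    """Standardize a raw average against a comparison group (the IDEA database,
    your institution, or your course); the 50/10 rescaling is assumed here."""
    return 50 + 10 * (raw_average - comparison_mean) / comparison_sd

# A "hard" objective is not penalized: a raw 3.6 on an objective that averages 3.4
# in the comparison group converts higher than a raw 4.0 on one that averages 4.1.
print(round(converted_average(3.6, comparison_mean=3.4, comparison_sd=0.4), 1))  # 55.0
print(round(converted_average(4.0, comparison_mean=4.1, comparison_sd=0.4), 1))  # 47.5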

22 QUESTION 3 Were you more successful in facilitating progress on some class objectives than on others?

23

24 BI-MODAL RATINGS
Knowing the percentage of students rating in the two highest and the two lowest categories helps identify classes where student outcomes are "bi-modal": divided fairly evenly between students who profited greatly and those whose sense of progress was disappointing.
Reason 1: this often occurs when a substantial portion of the class lacks the background needed to profit from the course.
Reason 2: it may reflect differences in students' preferred learning styles.
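
A small sketch of that check in Python, counting the share of a class that rated its progress in the two highest (4-5) versus the two lowest (1-2) categories of the 5-point scale (the example ratings are invented):

from collections import Counter

def high_low_split(ratings):
    """Return the fractions of ratings in the top two (4-5) and bottom two (1-2) categories."""
    counts = Counter(ratings)
    n = len(ratings)
    return (counts[4] + counts[5]) / n, (counts[1] + counts[2]) / n

# Roughly even high and low shares with little in between suggest a bi-modal class.
high, low = high_low_split([5, 5, 4, 4, 5, 2, 1, 2, 1, 2])
print(f"high: {high:.0%}, low: {low:.0%}")  # high: 50%, low: 50%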

25 QUESTION 4 How can instruction be made more effective?

26 INSTRUCTIONAL IMPROVEMENT
Teaching methods are grouped into five teaching styles, each rated on a 5-point scale (1 = hardly ever, 5 = almost always):
A. Stimulating student interest
B. Fostering student collaboration
C. Establishing rapport
D. Encouraging student involvement
E. Structuring classroom experiences
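
The per-style scores are simply averages of the method items that belong to each style. The grouping below is a hypothetical placeholder, since the slide does not reproduce IDEA's actual assignment of items 1-20 to the five styles:

# Hypothetical item-to-style grouping for illustration only.
STYLE_ITEMS = {
    "Stimulating student interest": [4, 8, 13, 15],
    "Fostering student collaboration": [5, 16, 18],
    "Establishing rapport": [1, 2, 7, 20],
    "Encouraging student involvement": [9, 11, 14, 19],
    "Structuring classroom experiences": [3, 6, 10, 12, 17],
}

def style_averages(item_means):
    """item_means: dict mapping method item number (1-20) to its class-average rating (1-5)."""
    return {style: round(sum(item_means[i] for i in items) / len(items), 2)
            for style, items in STYLE_ITEMS.items()}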

27

28 QUESTION 5 Do some salient characteristics of this class and its students have implications for instruction?

29 COURSE CHARACTERISTICS
Students described the class by comparing it to other classes they have taken in terms of: (1) amount of reading, (2) amount of work in non-reading assignments, and (3) difficulty.
Average ratings are compared with "all classes" in the IDEA database. If sufficient data were available, comparisons are also made with classes in the broad discipline group in which this class was categorized and with all other classes at your institution.
Because relatively large disciplinary differences have been found on these three characteristics, the disciplinary comparison may be especially helpful.

30 USING STATISTICAL DETAIL
The statistical detail reports the distribution of responses, the average rating, and the standard deviation of the ratings.
Attention should be concentrated on "important" or "essential" objectives and on methods that are closely related to progress ratings on those objectives.

31 USING STATISTICAL DETAIL
Standard deviations of about 0.7 are typical. When these values exceed 1.2, the class exhibits unusual diversity. In such cases, the distribution of responses should be examined closely, primarily to detect tendencies toward a bimodal distribution (one in which class members are about equally divided between the "high" and "low" ends of the scale, with few in between). Bimodal distributions suggest that the class contains two types of students who are so distinctive that what "works" for one group will not work for the other.
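
A compact Python version of that rule of thumb (the 0.7 and 1.2 figures come from the slide; the ratings in the example are invented):

from statistics import mean, pstdev

def diversity_check(ratings, threshold=1.2):
    """Flag an item whose ratings are unusually spread out; such items deserve a close
    look at the full distribution for a bimodal split."""
    sd = pstdev(ratings)
    return {"mean": round(mean(ratings), 2), "sd": round(sd, 2), "unusual_diversity": sd > threshold}

print(diversity_check([5, 5, 5, 4, 1, 1, 2, 2, 5, 1]))
# {'mean': 3.1, 'sd': 1.76, 'unusual_diversity': True}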

32 CONCLUSION
Reliability of results is largely affected by class size.
Validity: results consistently show a positive correlation between student ratings of progress and faculty ratings of importance (among other tests).
Results are relevant for both summative and formative evaluation.
IDEA recommends a comprehensive evaluation process in which student ratings constitute no more than 30-50% of the final judgment.

33 WANT ADDITIONAL SUPPORT?
Watch for professional development offerings.
Contact IMFacultyLab@howardcc.edu to request individual support.

