Shelley A. Chapman, PhD, Texas A&M University, February 2013.

1 Shelley A. Chapman, PhD, Texas A&M University, February 2013

2 "Teaching Effectiveness": What it is; Uniqueness of IDEA; Conditions for the Good Use of IDEA; the 3-Phase Process for Faculty Evaluation; Using Reports to Improve Teaching

3 Teaching Effectiveness. Most surveys ask: How well do the instructor's methods resemble those of a "model" teacher? IDEA asks: How well do students rate their progress on the types of learning the instructor targeted?

4 1. Focus on Student Learning 2. Focus on Instructor’s Purpose 3. Adjustments for Extraneous Influences 4. Validity and Reliability 5. Comparison Data 6. Flexibility

5 The instrument o Focuses on learning o Provides suggested action steps

6 The Faculty o Trust the process o Value student feedback o Are motivated to make improvements

7 Campus Culture o Teaching excellence - high priority o Resources to improve - provided o Student ratings - appropriate weight

8 The Evaluation Process o 30-50% of evaluation of teaching o 6-8 classes, more if small (<10) o Not over-interpreted (3-5 performance categories)

9 Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.


11 1. How did students rate their learning experience? 2. What contextual factors impacted those ratings? 3. How do my scores compare to IDEA, discipline, and institution? 4. What might I do to facilitate better learning for my students next time?

12 What the Report Can Provide (the data, information, knowledge, wisdom hierarchy): Calculation of Scores; Context: Variables and Comparisons; Suggested Action Steps

13 As Part of a Faculty Evaluation Process

14 A Balanced Plan for Summative Evaluation combines three sources: Student Ratings, External Perspective, and Artifacts.

15 Artifacts Syllabi Graphic Organizers Assignments and project descriptions Rubrics Written Teaching Philosophy/Reflections Samples of Student Work CATs and results

16 External Perspective: Classroom Observation; Classroom Visitation; Invited Presentations; Alumni Surveys; Focus Groups of Graduating Students

17 Time | What Happened | What Was Said

18 Time | What Happened | What Was Said
8:05 | Instructor shut door. Students are shuffling papers, opening books. | Instructor (I): OK, Class. Let's begin. Make sure you turned in your homework as you came in. Today we will begin our discussion on the brain. Turn in your textbooks to chapter 5.
8:10 | Student comes in late. | I: Is your brain more like a computer or a jungle? Who would like to respond first?
8:15 | Several students raise hands. Female in first row is called on. | Student (S): My brain is a jungle! I am so unorganized! (class laughs)…

19 [Seating-chart diagram: instructor at the front, students marked M (male) or F (female)]

20 Student Ratings: Administer appropriately; collect 6-8 reports (more if class size is <10); 30-50% of overall evaluation
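The guidelines on this slide can be expressed as a quick sanity check. A minimal sketch in Python; the function name, the 8-report threshold for small classes, and the exact weight bounds are our reading of the slide, not part of the IDEA system:

```python
def plan_meets_guidelines(num_reports, smallest_class_size, ratings_weight):
    """Check a summative plan against the slide's guidelines (illustrative)."""
    # "Collect 6-8 reports, more if class size is <10": assume 8 as the
    # minimum when any class is small, 6 otherwise.
    min_reports = 8 if smallest_class_size < 10 else 6
    enough_reports = num_reports >= min_reports
    # Student ratings should carry 30-50% of the overall evaluation.
    reasonable_weight = 0.30 <= ratings_weight <= 0.50
    return enough_reports and reasonable_weight

# Example: 7 reports, smallest class of 25 students, ratings count for 40%
print(plan_meets_guidelines(7, 25, 0.40))  # True
```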

21 Student Ratings: Student comments are formative; be mindful of the standard error of measurement (±.3); use 3-5 performance categories
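With a standard error of measurement of about ±.3, two averages that differ by less than that may not reflect a real difference. A minimal sketch of that caution (the function name and the exact comparison rule are illustrative):

```python
SEM = 0.3  # approximate standard error of measurement on the 5-point scale

def meaningfully_different(score_a, score_b, sem=SEM):
    """Treat two averages within one SEM of each other as indistinguishable."""
    return abs(score_a - score_b) > sem

print(meaningfully_different(4.1, 4.3))  # False: within the +/-0.3 band
print(meaningfully_different(3.6, 4.3))  # True
```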

22 Set Expectations Collect Data Use Data

23 Set Expectations | Collect Data | Use Data. I. Set Expectations: What does this entail regarding IDEA?

24 Criterion o Use averages on 5-point scale o Recognize that some objectives are more difficult to achieve o “Authenticate” objectives Page 1

25 Use Converted Averages o IDEA o Discipline o Institution


27 Gray Band: 10% Much Lower, 20% Lower, 40% Similar, 20% Higher, 10% Much Higher


29 Five performance categories: Below Acceptable Standards; Marginal, Needs Improvement; Meets Expectations; Exceeds Expectations; Outstanding. Or three: Does Not Meet Expectations; Meets Expectations; Exceeds Expectations.

30 Criterion Average Rating | Effectiveness Category | Normative T-Score
Below 3.0 | Below acceptable standards | Below 38
3.0-3.4 | Marginal, improvement needed | 38-44
3.5-3.9 | Meets expectations | 45-54
4.0-4.4 | Exceeds expectations | 55-62
4.5 or higher | Outstanding | 63 or higher
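The criterion table above maps directly onto a lookup function. A sketch assuming the bands are read as contiguous on a continuous scale (so an average such as 3.45, which falls between the listed ranges, lands in the lower band):

```python
def effectiveness_category(avg):
    """Map a 5-point criterion average to the table's effectiveness category."""
    if avg < 3.0:
        return "Below acceptable standards"
    if avg < 3.5:
        return "Marginal, improvement needed"
    if avg < 4.0:
        return "Meets expectations"
    if avg < 4.5:
        return "Exceeds expectations"
    return "Outstanding"

print(effectiveness_category(4.3))  # Exceeds expectations
print(effectiveness_category(2.9))  # Below acceptable standards
```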

31 Set Expectations Collect Data Use Data II. Collect Data What do you look for regarding IDEA?

32 Prepare Students: Create value for student feedback; talk about it; put it in the syllabus; monitor and communicate through multiple modalities (Twitter, Facebook, other)

33 IDEA Center Learning Objective | Course Learning Outcomes
Objective 3: Learning to apply course material (to improve thinking, problem solving, and decisions) | Students will be able to apply the methods, processes, and principles of earth science to understanding natural phenomena; students will think more critically about the earth and environment
Objective 8: Developing skill in expressing myself orally or in writing | Students will be able to present scientific results in written and oral forms

34 Pages 1 and 2 What were students’ perceptions of the course and their learning?

35 Were the appropriate objectives selected? How many? Do they match the course? How might you “authenticate” the objectives selected?

36 Select 3-5 objectives as "Essential" or "Important": Is it a significant part of the course? Do you do something specific to help students accomplish the objective? Does the student's progress on the objective influence his or her grade? Be true to your course.

37 What were the students’ perceptions of their course and their learning?


39 Your Average (5-point Scale): Raw 4.1, Adj. 4.3
A. Progress on Relevant Objectives [1] (four objectives were selected as relevant, Important or Essential; see page 2)
[1] If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.

40 [Chart: Progress on Relevant Objectives. Individual objective ratings of 4.3, 4.1, 4.2, and 3.6 on the 5-point scale are averaged into the 4.1 raw score]

41 Report Page 1, Your Average Score (5-point scale): Raw | Adj.
A. Progress on Relevant Objectives (four objectives were selected as relevant, Important or Essential; see page 2) | 4.1 | 4.3
Overall Ratings:
B. Excellent Teacher | 4.7 | 4.9
C. Excellent Course | 4.1 | 4.4
D. Average of B & C | 4.4 | 4.7
Summary Evaluation (Average of A & D) | 4.3 | 4.5
(Weights: 50% on A; 25% each on B and C)
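The Summary Evaluation combines the rows above: D averages B and C, and the summary averages A and D, which effectively weights A at 50% and B and C at 25% each. Reproducing the raw-score column:

```python
A, B, C = 4.1, 4.7, 4.1   # raw: progress, excellent teacher, excellent course
D = (B + C) / 2           # overall ratings average
summary = (A + D) / 2     # equivalently 0.5*A + 0.25*B + 0.25*C

# D = 4.40, summary = 4.25 (shown on the report as 4.4 and 4.3 after rounding)
print(f"D = {D:.2f}, summary = {summary:.2f}")
```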


43 Gaining Factual Knowledge: Average Progress Ratings (Technical Report 12, page 40)
Student Motivation (Item 39) \ Work Habits (Item 43) | High | High Avg. | Avg. | Low Avg. | Low
High | 4.48 | 4.38 | 4.28 | 4.13 | 4.04
High Avg. | 4.38 | 4.29 | 4.14 | 3.96 | 3.76
Average | 4.28 | 4.14 | 4.01 | 3.83 | 3.64
Low Avg. | 4.15 | 4.05 | 3.88 | 3.70 | 3.51
Low | 4.11 | 3.96 | 3.78 | 3.58 | 3.38

44 [Slide 43's table repeated with selected cells highlighted, from 4.48 (high motivation, high work habits) through 4.01 (average, average) down to 3.38 (low, low)] (Technical Report 12, page 40)
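The adjustment matrix above is effectively a lookup table of expected progress ratings given class-level motivation and work habits. A sketch with the values from Technical Report 12; the data structure and function names are ours:

```python
LEVELS = ["High", "High Avg.", "Average", "Low Avg.", "Low"]

EXPECTED_PROGRESS = {  # rows: student motivation; columns: work habits
    "High":      [4.48, 4.38, 4.28, 4.13, 4.04],
    "High Avg.": [4.38, 4.29, 4.14, 3.96, 3.76],
    "Average":   [4.28, 4.14, 4.01, 3.83, 3.64],
    "Low Avg.":  [4.15, 4.05, 3.88, 3.70, 3.51],
    "Low":       [4.11, 3.96, 3.78, 3.58, 3.38],
}

def expected_progress(motivation, work_habits):
    """Expected average progress on Gaining Factual Knowledge for a class."""
    # "Average" is the row label but "Avg." the column label, as on the slide.
    col = "Average" if work_habits == "Avg." else work_habits
    return EXPECTED_PROGRESS[motivation][LEVELS.index(work_habits)]

print(expected_progress("Average", "High"))  # 4.28
print(expected_progress("Low", "Low"))       # 3.38
```

A class with average motivation but strong work habits is expected to report progress around 4.28; one low on both, around 3.38, which is why raw scores alone can mislead.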

45 Purpose | Raw or Adjusted?
How much did students learn? | Raw
What were the instructor's contributions to learning? | Adjusted
How do faculty compare? | Adjusted

46 When to Use Adjusted Scores. Do raw scores meet or exceed expectations?* If yes, use raw scores. If no, compare adjusted scores to raw: if adjusted scores are higher, use adjusted scores; if lower, use raw scores. (*Expectations defined by your unit.)
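One reading of this decision chart, expressed as a function. The expectation threshold is whatever your unit defines; the 3.5 default here is only an illustrative assumption:

```python
def score_to_report(raw, adjusted, expectation=3.5):
    """Choose raw vs. adjusted score per the decision chart (illustrative)."""
    if raw >= expectation:   # raw already meets or exceeds expectations
        return raw
    if adjusted > raw:       # adjusted is higher: class conditions were tough
        return adjusted
    return raw               # adjusted is lower: keep the raw score

print(score_to_report(3.2, 3.6))  # 3.6: raw falls short, adjusted is higher
print(score_to_report(3.8, 3.5))  # 3.8: raw already meets expectations
```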

47 Set Expectations Collect Data Use Data III. Use Data Which data will you use and how?

48 Keep track of reports; look for longitudinal trends; use for promotion and tenure. (Created by Pam Milloy, Grand View University. Available from The IDEA Center website.)

49 Summative (pp. 1-2): Criterion- or norm-referenced; adjusted or raw; categories of performance; 30-50% of teaching evaluation; 6-8 classes (more if small). Formative (p. 3): Identify areas to improve; access applicable resources from the IDEA website; read and have conversations; implement new ideas.

50 Collect Feedback (online, paper) → Interpret Results (what the reports say and what they mean) → Read & Learn (IDEA resources that are keyed to reports) → Reflect & Discuss (talk with colleagues) → Improve (try new ideas)

51 Relationship of Learning Objectives to Teaching Methods


