
1 An Evaluation of an Observation Rubric Used to Assess Teacher Performance Kent Sabo Kerry Lawton Hongxia Fu Arizona State University

2 Introduction
School reform in Arizona:
–TAP reform model
–Revamped teacher evaluation & compensation systems
–Teacher evaluation includes:
  Calculation of teacher "value-added" to student achievement
  Qualitative measures of teacher behavior and practice

3 TAP Rubric
Developed in 2001 by TAP (now NIET).
The teacher effectiveness framework includes four domains:
–Instruction (D1)
–Designing & Planning Instruction (D2)
–Learning Environment (D3)
–Teacher Responsibilities (D4)
The rubric includes 19 items divided across the first three domains.

4 D1: Instruction
1. Standards & Objectives
2. Motivating Students
3. Presenting Instructional Content
4. Lesson Structure & Pacing
5. Activities & Materials
6. Questioning
7. Academic Feedback
8. Grouping Students
9. Teacher Content Knowledge
10. Teacher Knowledge of Students
11. Thinking
12. Problem Solving

5 D2: Designing & Planning Instruction
1. Instructional Plans
2. Student Work
3. Assessment

6 D3: The Learning Environment
1. Expectations
2. Managing Student Behavior
3. Environment
4. Respectful Culture

7 Rubric Scoring
Four to six observations per school year.
Multiple observers:
–Administrators & experienced teachers serve as observers
–Trained & certified by NIET to score the rubric

8 Rubric Scoring
Items are scored 1-5:
–Behavioral definitions anchor scores of 1, 3, and 5 (behaviors, characteristics & artifacts of teacher performance)
–3 = "Proficient"
Scores are averaged & weighted by both domain & observer type:
–For a single observation, item scores are averaged into subscale scores.
–An overall score is also computed and entered into the equation for performance awards (see the sketch below).
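
For illustration, a minimal Python sketch of the averaging-then-weighting arithmetic described on this slide. The item values and domain weights are hypothetical placeholders, not TAP's actual weighting scheme.

```python
# Hypothetical sketch: average item scores into domain subscales, then
# combine the subscales into one observation score with domain weights.
# The weights below are placeholders, not TAP's actual values.
from statistics import mean

def observation_score(item_scores: dict[str, list[float]],
                      domain_weights: dict[str, float]) -> float:
    """Average each domain's items, then take the weighted sum."""
    subscales = {d: mean(scores) for d, scores in item_scores.items()}
    return sum(domain_weights[d] * s for d, s in subscales.items())

obs = {
    "instruction": [3, 4, 3, 3, 5, 4, 3, 3, 4, 3, 2, 3],  # 12 D1 items
    "planning": [4, 3, 3],                                # 3 D2 items
    "environment": [4, 4, 3, 5],                          # 4 D3 items
}
weights = {"instruction": 0.5, "planning": 0.25, "environment": 0.25}
print(round(observation_score(obs, weights), 2))  # -> 3.5
```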

9 Purpose of Current Study
Establishing validity is crucial when an instrument is used to make high-stakes decisions (Messick, 1995).
To date, most studies have focused on establishing evidence for test-retest reliability & the relationship between rubric scores and value-added achievement measures.
This study investigated the proposed latent variable structure of the rubric.

10 Methods
Step 1. Conduct CFA on a 2nd-order factor model
–Not explicitly defined by TAP, but implicit in the TAP scoring system (averaging)
Step 2. If the CFA suggests misspecification, conduct EFA on the 19 indicators
Sample: 1,497 teacher observation scores
–Across 53 public schools in Arizona

11 2nd-order Factor Model

12 Methods
The 2nd-order model is just-identified:
–Fit statistics cannot be interpreted (zero degrees of freedom)
–Must judge fit by analyzing nested models, both more and less restrictive (Rindskopf & Rose, 1988)
Additional models examined (a fitting sketch appears below):
–Bi-factor (least restrictive)
–Correlated group factor
–One-factor (most restrictive)
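
A minimal sketch of fitting two of these nested models, assuming the third-party semopy package; the item names i1-i12 (Instruction), p1-p3 (Planning), and e1-e4 (Environment), and the data file, are hypothetical stand-ins for the 19 rubric items.

```python
# Hypothetical sketch using the third-party semopy SEM package.
import pandas as pd
import semopy

df = pd.read_csv("observation_scores.csv")  # hypothetical item-level data

# Correlated group factor model: three first-order factors with
# explicitly free factor covariances.
group_desc = """
instruction =~ i1 + i2 + i3 + i4 + i5 + i6 + i7 + i8 + i9 + i10 + i11 + i12
planning =~ p1 + p2 + p3
environment =~ e1 + e2 + e3 + e4
instruction ~~ planning
instruction ~~ environment
planning ~~ environment
"""

# One-factor model (most restrictive): all 19 items load on one factor.
items = ([f"i{k}" for k in range(1, 13)] + [f"p{k}" for k in range(1, 4)]
         + [f"e{k}" for k in range(1, 5)])
one_desc = "general =~ " + " + ".join(items)

for name, desc in [("group factor", group_desc), ("one factor", one_desc)]:
    model = semopy.Model(desc)
    model.fit(df)
    stats = semopy.calc_stats(model)  # fit indices, incl. CFI and RMSEA
    print(name)
    print(stats[["CFI", "RMSEA"]])
```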

13 Bi-factor Model

14 Correlated Group Factor Model

15 One-factor Model

16 CFA Results
Results for all models suggest misspecification:
–Bi-factor: CFI = .96; RMSEA = .05; SRMR = .02; 8 of 12 negative factor loadings (Instruction); 18 significant modification indices
–Group factor: CFI = .93; RMSEA = .08; SRMR = .04; high factor correlations (.81-.95); 29 significant modification indices
–One factor: CFI = .90; RMSEA = .09; SRMR = .04; 79 significant modification indices

17 Step 2: Method
Exploratory factor analysis on the 19 indicators (a sketch appears below):
–Extraction: principal axis factoring
–Factor retention: parallel analysis
–Rotation: promax
Sample: 1,497 teachers (observation scores)
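
A minimal sketch of this Step-2 pipeline, assuming the third-party factor_analyzer package; the parallel analysis is hand-rolled with NumPy since it is not built into that package, and the data file name is a hypothetical placeholder. factor_analyzer's "principal" method is used here as the closest available analogue to principal axis factoring.

```python
# Hypothetical sketch of the EFA pipeline: parallel analysis for factor
# retention, then principal-factor extraction with a promax rotation.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

def parallel_analysis(data: pd.DataFrame, n_iter: int = 100, seed: int = 0) -> int:
    """Retain factors whose observed correlation-matrix eigenvalues
    exceed the mean eigenvalues of equally sized random-normal data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    observed = np.linalg.eigvalsh(data.corr().to_numpy())[::-1]  # descending
    random_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        sim = rng.standard_normal((n, p))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    return int(np.sum(observed > random_eigs.mean(axis=0)))

df = pd.read_csv("observation_scores.csv")  # hypothetical 19-item dataset
n_factors = parallel_analysis(df)           # the study retained 3 factors

fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="promax")
fa.fit(df)
pattern = pd.DataFrame(fa.loadings_, index=df.columns)  # rotated pattern matrix
print(pattern.round(3))
```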

18 Results of PAF Extraction (Factor Pattern Matrix, Rotated)
Item                             Factor 1  Factor 2  Factor 3
Instructional Plans                 .780     -.013      .034
Student Work                        .728     -.027      .132
Assessment                          .796     -.113      .076
Expectations                        .442      .499     -.068
Managing Student Behavior           .004      .869     -.055
Environment                        -.005      .706      .180
Respectful Culture                 -.059      .898      .043
Standards & Objectives              .835     -.008     -.041
Motivating Students                 .459      .278      .109
Presenting Instruct. Content        .772      .091     -.054
Lesson Structure & Pacing           .616      .264     -.095
Activities & Materials              .692      .018      .110
Questioning                         .511      .076      .222
Academic Feedback                   .507      .167      .111
Grouping Students                   .481      .284      .019
Teacher Content Knowledge           .714      .054      .030
Teacher Knowledge of Students       .500      .311     -.044
Thinking                            .060      .032      .830
Problem Solving                     .027      .012      .856

19 PAF Extraction (Rotated): Factor 1
Instructional Plans              .780
Student Work                     .728
Assessment                       .796
Expectations                     .442
Managing Student Behavior        .004
Environment                     -.005
Respectful Culture              -.059
Standards & Objectives           .835
Motivating Students              .459
Presenting Instruct. Content     .772
Lesson Structure & Pacing        .616
Activities & Materials           .692
Questioning                      .511
Academic Feedback                .507
Grouping Students                .481
Teacher Content Knowledge        .714
Teacher Knowledge of Students    .500
Thinking                         .060
Problem Solving                  .027

20 PAF Extraction (Rotated): Factor 2
Instructional Plans             -.013
Student Work                    -.027
Assessment                      -.113
Expectations                     .499
Managing Student Behavior        .869
Environment                      .706
Respectful Culture               .898
Standards & Objectives          -.008
Motivating Students              .278
Presenting Instruct. Content     .091
Lesson Structure & Pacing        .264
Activities & Materials           .018
Questioning                      .076
Academic Feedback                .167
Grouping Students                .284
Teacher Content Knowledge        .054
Teacher Knowledge of Students    .311
Thinking                         .032
Problem Solving                  .012

21 PAF Extraction (Rotated): Factor 3
Instructional Plans              .034
Student Work                     .132
Assessment                       .076
Expectations                    -.068
Managing Student Behavior       -.055
Environment                      .180
Respectful Culture               .043
Standards & Objectives          -.041
Motivating Students              .109
Presenting Instruct. Content    -.054
Lesson Structure & Pacing       -.095
Activities & Materials           .110
Questioning                      .222
Academic Feedback                .111
Grouping Students                .019
Teacher Content Knowledge        .030
Teacher Knowledge of Students   -.044
Thinking                         .830
Problem Solving                  .856

22 EFA Results
Currently Proposed                              Our Results
1. Instruction (12 items)                       1. Instructional Effectiveness (13 items)
2. Learning Environment (4 items)               2. Learning Environment (4 items)
3. Designing & Planning Instruction (3 items)   3. Thinking/Problem Solving (2 items)

Extracted 3 factors, the same number as proposed.
Identical item loadings on the "Learning Environment" factor.
Different item loadings for the other factors:
–Our "Instructional Effectiveness" factor combines the items from the proposed "Instruction" and "Designing & Planning Instruction" factors (except Thinking and Problem Solving).
–The two items in our "Thinking/Problem Solving" factor belong to the "Instruction" factor in the proposed model.

23 Discussion
This study did not add evidence that scores from the TAP rubric (as currently scored) can be used to make global inferences about teacher quality:
–The structure originally proposed by the rubric developers was not recovered.
–Differences in the number of items per factor are problematic when the composite is a simple average (e.g., the 13-item factor would dominate the composite).
The Thinking/Problem Solving factor may be problematic:
–Scoring these items requires observers to infer across multiple time points.
–Is it appropriate within a teacher evaluation assessment? Are these teacher-influenced or student-related items?
The rubric may still provide useful information for formative assessment and evaluation:
–Items have "face validity."
–Examine individual items rather than composites to focus improvement efforts.

24 Limitations and Cautions
Factor analysis is only one method for building validity evidence:
–Further research should attempt to correlate rubric scores with other positive academic outcomes (e.g., graduation; pro-academic student behaviors).
FA results are based on a specific dataset:
–Our results should not be extended beyond the data used and the population assessed.
Ideas for future research:
–Estimate CFA models from averaged & weighted scores.
–Examine measurement invariance across grade level, subject taught & teacher type.
–Examine latent growth patterns within and across several school years.

