
Presentation on theme: "The Thinking Behind PACT · Performance Assessment for California Teachers. Raymond Pecheone, Stanford University." Presentation transcript:

1 The Thinking Behind PACT · Performance Assessment for California Teachers. Raymond Pecheone, Stanford University. April 16, 2008

2 The PACT Assessment System
‣ A performance assessment for teacher candidates, created in response to SB 2042 along with new subject matter standards, new program standards, and new assessment standards
‣ Alternate assessments are permitted but must meet California quality standards for reliability and validity (i.e., the AERA/APA test standards)
‣ Aligned with the California Teaching Performance Expectations (standards) and the California Content Standards
‣ A high-stakes assessment designed for the initial licensure of beginning teachers

3 PACT Institutions
‣ UC Berkeley ‣ UC Davis ‣ UC Irvine ‣ UCLA ‣ UC Riverside ‣ UC San Diego ‣ UC Santa Barbara ‣ UC Santa Cruz
‣ Cal Poly SLO ‣ CSU Channel Islands ‣ CSU Chico ‣ CSU Dominguez Hills ‣ CSU Monterey Bay ‣ CSU Northridge ‣ Humboldt State ‣ Sacramento State ‣ San Diego State ‣ San Francisco State ‣ San Jose State ‣ Sonoma State
‣ Stanford ‣ Holy Names University ‣ Mills College ‣ Notre Dame de Namur University ‣ Pepperdine University ‣ St. Mary’s College of California ‣ University of the Pacific ‣ University of San Diego ‣ Antioch University ‣ USC ‣ San Diego Intern

4 The PACT Assessment System
Assessments Embedded in Local Programs (examples): Observation/Supervisory Evaluation & Feedback; Child Case Studies; Analyses of Student Learning; Curriculum/Teaching Analyses
The Capstone Teaching Event demonstrates: ‣ Planning ‣ Instruction ‣ Assessing ‣ Reflecting ‣ Academic Language

5 Teaching Event Records of Practice*
Instructional and Social Context · 3 to 5 Days
Planning: Lesson plans; Handouts, overheads, student work; Lesson commentary
Instruction: Video clip(s); Teaching commentary
Assessment: Analysis of whole-class assessment; Analysis of the learning of 2 students
Reflection: Daily reflections; Reflective commentary
Evidence of Academic Language
* 24 Teaching Events in 13 credential areas

6 Teaching Event Subject Areas
Multiple Subjects ‣ Literacy ‣ Mathematics
Single Subject ‣ Agriculture ‣ English language arts ‣ History/social science ‣ Mathematics ‣ Science ‣ Art ‣ Music ‣ Physical Education ‣ World languages

7 Guiding Questions and Analytic Rubrics
PLANNING ‣ Establishing a Balanced Instructional Focus ‣ Making Content Accessible ‣ Designing Assessments
INSTRUCTION ‣ Engaging Students in Learning ‣ Monitoring Student Learning During Instruction
ASSESSMENT ‣ Analyzing Student Work From an Assessment ‣ Using Assessment to Inform Teaching
REFLECTION ‣ Monitoring Student Progress ‣ Reflecting on Teaching
ACADEMIC LANGUAGE ‣ Understanding Language Demands ‣ Supporting Academic Language Development

8 PACT Rubrics (one example)
ELEMENTARY LITERACY TEACHING EVENT (2004-05 PILOT)
GUIDING QUESTION: How does the candidate use analysis of student learning to propose next steps in instruction?
Level 1: Next steps are vaguely related to or not aligned with the analysis of student misunderstandings and needs, OR are not described in sufficient detail to understand them, OR are based on inaccurate conclusions about student development from the assessment analysis.
Level 2: Next steps focus on improving student performance through support that addresses student misunderstandings or needs. Next steps are based on broad patterns of performance on the assessment.
Level 3: Next steps focus on improving student performance through targeted support to individuals and groups to address specific misunderstandings or needs. Next steps are based on analysis of whole-class patterns of performance, some patterns for individuals and/or subgroups, and general knowledge of individual students and/or subgroups.
Level 4: All components of Level 3, plus: next steps demonstrate a strong understanding of both the identified content and language standards and of individual students and/or subgroups.

9 2-Day Subject-Specific Scorer Training
DAY 1 ‣ Overview of the PACT Teaching Event and scoring process ‣ Discussion of bias ‣ Note taking and documentation ‣ Understanding Level “2”
DAY 2 ‣ Understanding Level “1” ‣ Understanding Level “3” ‣ Independently score a calibration Teaching Event & debrief

10 PACT Scores · Inter-rater Reliability
Level of Agreement / Percent:
‣ Exact match: 46%
‣ ±1 point: 34%
‣ ±2 points or greater: 10%
Sample size: 2,580
Spearman-Brown reliability estimate: 0.88
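As a rough illustration of where figures like these come from (a sketch only, not the PACT analysis itself), the snippet below computes exact-match and adjacent-agreement rates and a Spearman-Brown estimate from double-scored rubric data; the rater score arrays are hypothetical.

```python
# Sketch: agreement rates and Spearman-Brown estimate from double-scored data.
# The rater arrays are hypothetical, not PACT data.
import numpy as np

rater_a = np.array([2, 3, 2, 4, 1, 3, 2, 3])  # hypothetical scores from rater A
rater_b = np.array([2, 3, 3, 4, 2, 3, 1, 3])  # hypothetical scores from rater B

diff = np.abs(rater_a - rater_b)
exact = np.mean(diff == 0)      # proportion of exact matches
adjacent = np.mean(diff == 1)   # proportion differing by exactly 1 point
far = np.mean(diff >= 2)        # proportion differing by 2 or more points

# Spearman-Brown prophecy formula: reliability of the average of two raters,
# given the correlation r between single raters.
r = np.corrcoef(rater_a, rater_b)[0, 1]
spearman_brown = 2 * r / (1 + r)

print(f"exact: {exact:.0%}, ±1 point: {adjacent:.0%}, ±2 or more: {far:.0%}, SB: {spearman_brown:.2f}")
```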

11 PACT Validity Studies
Content validity ‣ Development teams, program directors, program faculty, & leadership team ‣ TPE alignment study
Concurrent validity ‣ Evaluation of score validity ‣ Decision consistency · Holistic vs. analytic ratings
Bias and fairness review
Construct validity ‣ Factor analysis (2002-03 pilot year): Reflection & Assessment; Instruction; Planning
Predictive validity (Carnegie/CT study)
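To make the construct-validity step concrete, here is a minimal sketch of how an exploratory factor analysis of rubric-level scores could be run. The score matrix is randomly generated and the three-factor choice simply mirrors the factors listed above; this is illustrative code, not the original 2002-03 pilot analysis.

```python
# Illustrative only: exploratory factor analysis of rubric-level scores.
# 'scores' is a hypothetical (candidates x rubrics) matrix.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
scores = rng.integers(1, 5, size=(200, 11)).astype(float)  # 11 rubric scores, levels 1-4

fa = FactorAnalysis(n_components=3)  # three factors, e.g., Planning, Instruction, Reflection & Assessment
fa.fit(scores)

# Each row of components_ is a factor; columns show how strongly each rubric loads on it.
print(fa.components_.round(2))
```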

12 What We Learn from the PACT Analyses
How our candidates do: ‣ On different aspects of teaching ‣ In different subject areas ‣ In comparison to other institutions ‣ Over time ‣ With different kinds of supports

13 Data Charts · 2003-04 Campus/Task Scores

14 Data Charts · 2003-04 Content Area/Task Scores

15 PACT Scores · Assessment of Student Learning (2003-2005)

16 Faculty Learning & Program Improvement
‣ Increased articulation across courses, structures, and roles
‣ Changes in the content of some courses
‣ Structural changes in the Teacher Education Program

17 PACT Teaching Event · DNA
‣ Documents teaching of a learning segment (3-5 lessons or hours of instruction)
‣ Subject specific
‣ Standardized tasks & core questions across programs
‣ Scored with common rubrics and a passing standard
‣ Completed during student teaching

18 For More Information... See the Teaching Event handbooks and rubrics at www.pacttpa.org.

19 Actions
‣ Scoring the TE
‣ Collaborative planning across university & K-12 schools
‣ Professional development
‣ PACT advisor
‣ Analysis of candidate work
‣ Program meetings


21 The Research Base for Teacher Licensing Tests
‣ Weak relationship between traditional licensing tests and teacher effectiveness (NRC, 2001)
‣ Strauss & Sawyer (1986) ‣ Ferguson (1991, 1998) ‣ Ferguson & Ladd (1996) ‣ Clotfelter, Ladd, & Vigdor (forthcoming) ‣ Goldhaber (2005, 2006)
‣ Effect sizes are quite small in recent value-added research (.01 to .06)

22 Educative Assessment
‣ Teachers Matter
‣ Subject Matter Matters
‣ Preparation (support) Matters
‣ Authenticity Matters
‣ Integration of Practice Matters


25 California Teaching Performance Expectations
‣ TPE 1 · Specific Pedagogical Skills for Subject Matter Instruction
‣ TPE 2 · Monitoring Student Learning During Instruction
‣ TPE 3 · Interpretation and Use of Assessments
‣ TPE 4 · Making Content Accessible
‣ TPE 5 · Student Engagement
‣ TPE 6 · Developmentally Appropriate Teaching Practices
‣ TPE 7 · Teaching English Learners
‣ TPE 8 · Learning about Students
‣ TPE 9 · Instructional Planning
‣ TPE 10 · Instructional Time
‣ TPE 11 · Social Environment
‣ TPE 12 · Professional, Legal, and Ethical Obligations
‣ TPE 13 · Professional Growth

26 What Is Subject Specific about the Teaching Event?
‣ Focus of the learning segment, aligned to California content standards
‣ Teaching/learning tasks on the video clip(s)
‣ Additional prompts in some content areas (e.g., misconceptions in science, dispositions in mathematics, description of text in English/language arts)
‣ Common and subject-specific rubrics
‣ Benchmarks within subject areas

27 Scoring
‣ Trained and calibrated subject-specific assessors
‣ Campus based, with central audits & regional scoring
‣ Rubric-based scoring in real time (web-based platforms)
‣ Organized around dimensions of teaching (PIARA) and guiding questions
‣ Sequentially scored by PIARA task
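To make the PIARA organization concrete, here is a minimal sketch of a Teaching Event score record keyed by PIARA task and guiding question. The rubric names are abbreviated from the slides above, and the passing rule at the end is invented for illustration; it is not PACT's actual passing standard.

```python
# Hypothetical score record for one Teaching Event, organized by the PIARA
# dimensions: Planning, Instruction, Assessment, Reflection, Academic Language.
# The pass rule below is illustrative only, not PACT's actual standard.
scores = {
    "Planning": {"Balanced Instructional Focus": 3, "Making Content Accessible": 2, "Designing Assessments": 3},
    "Instruction": {"Engaging Students in Learning": 3, "Monitoring Student Learning": 2},
    "Assessment": {"Analyzing Student Work": 3, "Using Assessment to Inform Teaching": 2},
    "Reflection": {"Monitoring Student Progress": 2, "Reflecting on Teaching": 3},
    "Academic Language": {"Understanding Language Demands": 2, "Supporting Academic Language": 3},
}

all_scores = [s for rubrics in scores.values() for s in rubrics.values()]
# Illustrative rule: no rubric below level 2 and an overall mean of at least 2.5.
passes = min(all_scores) >= 2 and sum(all_scores) / len(all_scores) >= 2.5
print("Pass" if passes else "Fail")
```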

28 PACT Scores · Assessment of Student Learning (2003-2005)

