Western Suffolk BOCES Boot Camp Emma Klimek Eastern Suffolk BOCES 2012.


1 Western Suffolk BOCES Boot Camp Emma Klimek eklimek33@gmail.com Eastern Suffolk BOCES 2012

2 Emma Klimek 2012

3 What do we mean by “Growth”?
 Growth is change from point A to point B
 Growth is an expectation of learning
 Growth is a relative measure compared to like students in like conditions

4 The What, Why, and How of Growth Models and Measures

5 By the End of This Section… You should be able to:
 Explain why the state is measuring student growth and not achievement
 Describe how the state is measuring growth compared to similar students
 Define a student growth percentile and mean growth percentile

6 Prior Year Performance for Students in Two Teachers’ Classrooms ─ Proficiency

7 Current Year Performance of Same Students ─ Proficiency

8 Prior and Current Year Performance for Ms. Smith’s Students

Ms. Smith’s Class   Prior Score   Current Score
Student A           450           510
Student B           470           500
Student C           480           525
Student D           500           550
Student E           600           650


10 Student A’s Current Year Performance Compared to “Similar” Students [Chart: ELA scale scores, 2011 vs. 2012; Student A’s prior score of 450 shown against peers with high and low SGPs] If we compare Student A’s current score to other students who had the same prior score (450), we can measure her growth relative to other students. We describe her growth as a “student growth percentile” (SGP). Student A’s SGP is 45, meaning she performed better in the current year than 45 percent of similar students.
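The SGP idea on this slide can be sketched as a simple percentile rank among peers who had the same prior score. This is an illustration only: the actual state model uses a statistical (quantile regression) method over multiple prior scores and student characteristics, not a raw count.

```python
def student_growth_percentile(current_score, peer_current_scores):
    """Illustrative SGP: the percent of 'similar' students (same prior
    score) whose current-year score this student exceeded."""
    outperformed = sum(1 for s in peer_current_scores if s < current_score)
    return round(100 * outperformed / len(peer_current_scores))
```

For example, among 100 similar students, a student who outperformed 45 of them would have an SGP of 45, matching Student A on the slide.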

11 Comparing Performance of “Similar” Students [Chart: prior year score vs. current year score] Given any prior score, we see a range of current year scores, which give us SGPs of 1 to 99.

12 SGPs for Ms. Smith’s Students

Ms. Smith’s Class   Prior Score   Current Score   SGP
Student A           450           510             45
Student B           470           500             40
Student C           480           525             70
Student D           500           550             60
Student E           600           650             40

13 Which Students Count in a Teacher or Principal’s MGP for 2011-12?
 Does the student have valid test scores for at least 2011-12 and 2010-11? If no, the student’s scores do not count for 2011-12.
 Does the student meet the continuous enrollment standard for 2011-12? If no, the student’s scores do not count for 2011-12.
 If yes to both, the student’s growth is attributed to the teacher and the school.
Expected for 2012-13: students weighted by duration of instructional linkage

14 From Student Growth to Teachers and Principals In order for an educator to receive a growth score, he or she must have a minimum sample size of 16 student scores in ELA or mathematics across all grades he or she teaches. Examples:
 A teacher has a self-contained classroom with 8 students who take the 4th grade ELA and math assessments; this teacher would then have 16 student scores contributing to his or her growth score.
 A teacher has a class of 12 students in varied grades (4th, 5th, 6th) who take the ELA and math assessments for their respective enrolled grade levels; this teacher would then have 24 student scores contributing to his or her growth score.
If an educator does not have 16 student scores, he or she will not receive a growth score from the State and will not receive information in the reporting system at the educator level.
 Educators likely to have fewer than 16 scores should use SLOs.
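The 16-score minimum can be expressed as a simple check. The function name and the dictionary input shape here are illustrative, not part of any state system:

```python
def receives_state_growth_score(scores_by_subject):
    """An educator needs at least 16 student scores in ELA or math,
    combined across all grades taught, to receive a state growth score."""
    return sum(len(scores) for scores in scores_by_subject.values()) >= 16

# Self-contained class of 8 students taking both ELA and math: 16 scores.
print(receives_state_growth_score({"ela": [0] * 8, "math": [0] * 8}))  # True
```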

15 From Student Growth to Teachers and Principals (continued)

Ms. Smith’s Class   SGP
Student A           45
Student B           40
Student C           70
Student D           60
Student E           40

To measure teacher performance, we find the mean growth percentile (MGP) for her students: take the average of the SGPs in the classroom. In this case:
Step 1: 45 + 40 + 70 + 60 + 40 = 255
Step 2: 255 / 5 = 51
Ms. Smith’s mean growth percentile (MGP) is 51, meaning on average her students performed better than 51 percent of similar students. A principal’s performance is measured by finding the mean growth percentile for all students in the school.
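The MGP arithmetic from the slide, as a minimal sketch:

```python
def mean_growth_percentile(sgps):
    """MGP: the plain average of an educator's student growth percentiles."""
    return sum(sgps) / len(sgps)

# Ms. Smith's class: (45 + 40 + 70 + 60 + 40) / 5
print(mean_growth_percentile([45, 40, 70, 60, 40]))  # 51.0
```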

16 Expanding the Definition of “Similar” Students So far we have been talking about “similar” students as those with the same prior year assessment score. We will now add two additional features to the conversation:
 Two additional years of prior assessment scores (remember: a student MUST have current year and prior year assessment scores to be included)
 Student-level factors: economic disadvantage, students with disabilities (SWDs), English language learners (ELLs)

17 Adjustments for Three Student-Level Factors in Measuring Student Growth [Diagram: student performance reflects teacher instruction plus other factors (2012-13): economic disadvantage, language proficiency, disability]

18 Going Back to Student A’s Current Year Performance Compared to “Similar” Students [Chart: ELA scale scores, 2011 vs. 2012] If we compare Student A’s current score to other students who had the same prior score (450), we can measure her growth relative to other students. We describe her growth as a “student growth percentile” (SGP). Student A’s SGP is 45, meaning she performed better in the current year than 45 percent of similar students.

19 Expanding the Definition of “Similar” Students to Include Economically Disadvantaged—An Example [Chart: ELA scale scores, 2011 vs. 2012] Now if Student A is economically disadvantaged, we compare her current score to other students who had the same prior score (450) AND who are also economically disadvantaged. In this new comparison group, we see that Student A now has an SGP of 48.

20 Further Information on Including Student Characteristics in the Growth Model The following slides were developed using sample data from 2010-2011.
 The “combined” MGPs on the charts have been calculated at the educator level (combining all grades and subjects).
 Not all districts provided data linked to teachers for grades 4-8 ELA/Math in 2010-11.

21 Teacher MGPs after Accounting for Economic Disadvantage Taking student-level characteristics into account helps ensure educators with many students with those characteristics have a fair chance to achieve high or low MGPs. For example, note that for teachers with any percent of economically disadvantaged students, teacher MGPs range from 1 to 99. NOTE: Beta results using available 2010-2011 data.

22 Teacher MGPs after Accounting for SWD NOTE: Beta results using available 2010-2011 data.

23 Teacher MGPs after Accounting for ELL [Chart: teacher MGPs by percent of ELL students in class] NOTE: Beta results using available 2010-2011 data.


25 “Similar” Students: A Summary [Table: unadjusted mean growth percentiles account for up to three years of prior achievement only; adjusted mean growth percentiles also account for English language learner (ELL) status, students with disabilities (SWD) status, and economic disadvantage. The table also marks which MGPs are reported to educators and which are used for evaluation.]

26 One Last Feature of the Growth Model… All tests contain measurement error, with greater uncertainty for the highest- and lowest-achieving students. The New York growth model accounts for measurement error in computing student growth percentiles.

27 State Growth Model Summary
Regulations allow:
 Prior years of student test results
 Three student-level variables: SWD, ELL, economic disadvantage
 Measurement error correction
Model includes:
 Up to three years, as available
 All three variables
 Measurement error correction
The growth model for 2011-12 applies only to grades 4-8 ELA/math for teachers and principals.

28 Using Growth Measures for Educator Evaluation

29 MGPs and Statistical Confidence [Chart: an MGP of 87 with its upper and lower limits marking the confidence range] NYSED will report a 95 percent confidence range, meaning we can be 95 percent confident that an educator’s “true” MGP lies within that range. Upper and lower limits of MGPs will also be reported. An educator’s confidence range depends on a number of factors, including the number of student scores included in the MGP and the variability of student performance in the classroom.
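One way to picture the reported range is a normal approximation around the MGP. NYSED reports the upper and lower limits directly, so the standard-error input here is hypothetical; this is a sketch of the idea, not the state's computation:

```python
def confidence_range(mgp, standard_error, z=1.96):
    """Approximate 95% confidence range around an MGP, assuming a normal
    sampling distribution (z = 1.96 for 95% confidence). Fewer student
    scores -> larger standard error -> wider range."""
    return (mgp - z * standard_error, mgp + z * standard_error)

# With an assumed standard error of 4.0, an MGP of 87 has a range
# of roughly 79.2 to 94.8.
lower, upper = confidence_range(mgp=87, standard_error=4.0)
```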

30 Illustrating Possible Growth Ratings [Scale: MGP 1 to MGP 99, centered on MGP 50, with bands labeled Well Below Average, Below Average, Average, and Well Above Average]

31 From MGPs to Growth Ratings: Teachers The rules on the last slide result in these HEDI criteria for 2011-12:
 Is your MGP ≥ 69 AND your lower limit > the state mean of 52? Highly Effective: results are well above the state average for similar students.
 Is your MGP ≤ 35 AND your upper limit < 44? Ineffective: results are well below the state average for similar students.
 Is your MGP 36-41 AND your upper limit < the state mean of 52? Developing: results are below the state average for similar students.
 Is your MGP 42-68 (any confidence range)? Effective: results equal the state average for similar students.
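The flowchart's decision rules can be sketched as straight conditionals. The state mean of 52 and the cut scores are taken from the flowchart as shown; the fallthrough behavior for MGPs whose confidence range crosses a cut (dropping to the adjacent category) follows the slide's yes/no arrows, so verify against current NYSED guidance before relying on it:

```python
def teacher_growth_rating(mgp, lower_limit, upper_limit, state_mean=52):
    """2011-12 HEDI growth rating for teachers, per the flowchart."""
    if mgp >= 69:
        # Highly Effective only if the whole confidence range sits
        # above the state mean; otherwise the rating is Effective.
        return "Highly Effective" if lower_limit > state_mean else "Effective"
    if mgp <= 35:
        # Ineffective only if the upper limit is below 44.
        return "Ineffective" if upper_limit < 44 else "Developing"
    if 42 <= mgp <= 68:
        return "Effective"  # any confidence range
    # MGP 36-41: Developing unless the range reaches the state mean.
    return "Developing" if upper_limit < state_mean else "Effective"
```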


33 Illustrating Possible Growth Ratings [Same scale, with the bands now labeled by HEDI category: Ineffective, Developing, Effective, and Highly Effective]

34 Illustrating Possible Growth Ratings [Same scale, highlighting the Developing and Effective bands]

35 Assignment of Points with HEDI Category [Tables for teachers and principals: each HEDI point value (0-20) with its minimum MGP, maximum MGP, and number of educators]
 Point value of 3 includes educators with MGPs in the Ineffective category but CRs above 44 (for teachers) and above 46 (for principals)
 Point value of 9 includes educators with MGPs in the Developing category but CRs above the state average
 Point value of 17 includes educators with MGPs in the Highly Effective category but CRs below the state average


37 Definitions
SGP (student growth percentile): measure of a student’s growth relative to similar students
Similar students: students with the same prior test scores, ELL, SWD, and economic disadvantage status
ELLs: English language learners
SWD: students with disabilities
Economic disadvantage: a student who participates in, or whose family participates in, economic assistance programs such as the Free or Reduced-Price Lunch program (FRPL), Supplemental Security Income (SSI), Food Stamps, Foster Care, and others
High-achieving, low-achieving: defined by the performance of students based on prior year State assessment scores (i.e., Level 1 = low-achieving, Level 4 = high-achieving)

38 Definitions
MGP (mean growth percentile): the average of the student growth percentiles attributed to a given educator
“Unadjusted” MGP: an MGP based on SGPs for which ELL, SWD, and economic disadvantage status have NOT been accounted
“Adjusted” MGP: an MGP based on SGPs for which ELL, SWD, and economic disadvantage status have been accounted
Growth rating: HEDI rating based on growth
Growth score: growth subcomponent points from 0-20

39 Definitions
Measurement error: uncertainty in test scores due to sampling of content and other factors
Standard error: a measure of the statistical uncertainty surrounding a score
Upper/lower limit: the highest and lowest possible MGP taking statistical confidence into account
Confidence range: the range of MGPs within which we have a given level of statistical confidence that the true MGP falls (a 95 percent confidence level is used for the state growth measure)

40 Break

41 STUDENT LEARNING OBJECTIVES:

42 The State Language “For teachers where there is no State-provided measure of student growth, ‘comparable measures’ are the state-determined District-wide growth goal-setting process. Student Learning Objectives (SLOs) are the State-determined process.”

43 Student Learning Objective
 Do a close reading of the state’s paragraph on Student Learning Objectives
 Highlight or underline 5 key words (only 5) with a shoulder partner
 Use all 5 of the words, if possible, in a “sound bite” or graphic

44 State Message Regarding Student Learning Objectives
 SLOs name what students need to know and be able to do at the end of the year.
 SLOs place student learning at the center of the conversation.
 SLOs are a critical part of every great educator’s practice.
 SLOs are an opportunity to document the impact educators make with students.

45 Key Messages for SLOs (continued)
 SLOs provide principals with critical information that can be used to manage performance, differentiate and target professional development, and focus support for teachers.
 The SLO process encourages collaboration within school buildings.
 School leaders are accountable for ensuring all teachers have SLOs that will support their District and school goals.

46 Who has SLOs and how will SLOs be set? WHO NEEDS AN SLO?



49 General Rules
 If 50% or more of a teacher’s students receive a student growth percentile (SGP), the teacher does not need an SLO
 Common Branch teachers, who do not get an SGP, must have 2 SLOs, one in ELA and one in math
 A Common Branch teacher will not have an SLO in other subjects
 If a teacher has 16 or more State test scores in grades 4-8 ELA or math, but that is less than 50% of his/her students, he/she must use this as the first SLO
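The 50% rule above can be sketched as a minimal check; the function and parameter names are illustrative:

```python
def slo_required(n_with_sgp, n_total):
    """A teacher with 50% or more of students covered by state-provided
    SGPs does not need an SLO; below 50%, an SLO is required."""
    return n_with_sgp / n_total < 0.5

# Only a third of this teacher's students have SGPs, so an SLO is needed.
print(slo_required(n_with_sgp=10, n_total=30))  # True
```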

50 IDENTIFYING WHO NEEDS AN SLO AND WHICH ONE

51 Determining SLOs
 Locate the handout
 With a partner or in a triad, complete the table
 Be prepared to share out one scenario

52 Lunch

53 STATE RULES REGARDING SLOS

54 Assessment Rules

55 Assessment Rules
 Teacher-developed assessments are not permitted
 The same assessment may be used for the 2nd 20%, but it must be used in a different way

56 Assessment Rules
 Use common assessments across grade levels or courses, if at all possible
 Use high quality assessments, if possible

57 Third Party Assessments There is no variance process in place to use a non-approved 3rd party assessment for the purposes of APPR.

58 Attendance Rule You may not exclude students due to attendance for the computation of the SLO

59 Format Rules Must include all the elements of the SLO as outlined by the state:
 Population
 Learning Content
 Instructional Interval
 Target
 Baseline
 Evidence
 HEDI
 Rationale

60 Other Criteria Rules
 The 50% rule for students is set on BEDS day
 Individual student growth must be determined, but the final outcome is based on the aggregate group
 No teacher may score an assessment in which he or she has a vested interest
 Lead evaluators must be trained

61 Target Setting

62 District determines:
 Who, specifically, in the district needs an SLO and in what academic area, based on state rules
 System for scoring and teacher ratings
 District-wide process for setting, reviewing, and assessing SLOs

63 Determine District-wide Priorities and Academic Needs
 Assess and identify District-wide priorities and academic needs.
 Start with commitments and focus areas in District strategic plans.
 Decide how prescriptive the District will be and where decisions will be made by principals, or by principals with teachers.

64 SLO EXCEPTIONAL CASES

65 Case 1: Less than 50% of students have an SGP measurement [HEDI point bands: 18-20, 9-17, 3-8, 0-2]

66 Case 2: Multiple Sections w/wo SGP

67 Case 3: ESL/Bilingual/SWD

68 Case 4: Co-Teachers

69 Case 5: Push-in, pull-out

70 Case 6: NYSESLAT Students

71 Case 7: NYSAA

72 Case 8: Special Cases


82 Individual Teacher’s Student Option For all teachers, an SLO can be based solely on the students they specifically teach
 For example: a second grade teacher’s classroom of students determines the “first 20%” of the teacher’s APPR

83 Team, School, District, or BOCES-wide Measures If a group measure is chosen, then the following apply:
 This option cannot be used for 6-8 science and social studies teachers or for any teacher whose SLO course ends in a Regents exam
 A state assessment must be used

84 An example:

85 An example:


99 REVIEWING EXAMPLES

100 Reviewing Examples
 Locate handout of examples
 Each team choose one example to review
 Identify which method was used:
 Target for mastery of standards
 Target for score gain
 Individual student growth gain

101
 Identify which assessment was chosen or required
 Does the assessment exist? If not, how will it be developed?
 Is the assessment measure an equal interval unit assessment?
 Will the metric work for all students in all conditions?
 Explain the scoring method

102
 Is the SLO as presented “executable”?
 Does the SLO meet state requirements?
 Is the SLO fair and reasonable?
 Rate it on the SLO rubric

103 Thank you eklimek@optonline.net

