Department of Elementary and Secondary Education July, 2011


1 Department of Elementary and Secondary Education July, 2011
MA's new Educator Evaluation Framework, Presentation #2: MCAS Growth Scores as a starting point for self-assessment and goal setting. This is the second presentation in a planned series on the Commonwealth's new educator evaluation framework, designed to support districts in their efforts to implement the regulations adopted by the Board of Elementary and Secondary Education in June 2011. It follows presentation #1, an overview of the key features of the new framework. While presentation #2 will make the most sense to those who have seen presentation #1 or have otherwise learned about the key features of the framework, this presentation can stand on its own.

2 Intended Outcomes Participants will:
Understand what MCAS Growth Scores are and what they can tell us about student learning gains. Learn how one district has used MCAS Growth Scores to assess practice and develop goals. Have a starting point for planning how to use MCAS Growth Scores in ways that will support effective self-assessment and goal setting. Have a basis to give ESE useful, informed feedback to guide our efforts to support districts in making effective use of self-assessment and goal setting in their evaluation systems. These are the intended outcomes of this presentation. Please take a moment to read through them. Please put a plus next to the outcomes that are most important to you, and a minus next to any that don't seem important to you at all. We'll return to these at the end to see how well the presentation worked.

3 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps We'll work our way through this agenda, starting with setting some context. Then we'll zero in on MCAS Growth Scores themselves: What are they? How are they derived? What can they tell us? What can't they tell us? From there we will delve into a case study of how one Massachusetts school district has used MCAS growth scores. Using that same district, we'll apply MCAS growth to the self-assessment step of the 5-step evaluation cycle and then to the goal-setting step. We'll then focus on goal setting more broadly, working with more than MCAS Growth scores. And we'll finish up with some time to work on next steps. There will be periodic breaks when you will have a chance to mull over what you're hearing with a partner or at your table and begin to apply it as well.

4 Context for MCAS within the Framework
"MCAS Growth" = "Student Growth Percentile Scores" We've had MCAS Growth Scores since 2008 All 4th, 5th, 6th, 7th, 8th and 10th grade students who have taken MCAS in the prior grade ELA Score separate from Math Score First, a few basic facts about MCAS Growth: 1. When we talk about "MCAS Growth," we're not talking about a scaled score of 240 or 250. We're talking about a score between 1 and 99 that is a "student growth percentile" score; that's the official name. 2. We've had MCAS scores since 1998, but we have had MCAS Growth scores only since 2008, when ESE staff, working with national experts, figured out a statistically sound way to compare one year's MCAS scores with the next. 3. Not all students have MCAS growth percentile scores: only those in these grades who took MCAS in at least the one prior year (for 10th graders, it's those who took it in at least 8th grade). 4. ELA growth percentile scores are calculated separately from math growth percentile scores. So this means that not all schools, and certainly not all educators, can have MCAS Growth Scores directly attributed to their work as individuals. An estimate by the Task Force that helped shape these regulations is that only 17% of teachers could have student growth percentile scores directly attributed to them as individuals.

5 A 5-Step Evaluation Cycle
35.06 More context: Every teacher and administrator participates in a 5-step evaluation process that begins with educator self-assessment and then goal setting – individually or in teams. MCAS Growth Scores can come into play at several of the steps.

6 Self-assessment 35.06(2) Individual and/or Group Based on:
Professional standards & indicators District & school priorities An analysis of multiple measures of learning and growth of my/our students in the past An analysis of the students I/we have now At the self-assessment stage, MCAS growth scores can be one of the multiple measures of student learning and growth that an educator – alone or with his/her team, grade level, or department – examines to sharpen practice.

7 Goal Setting 35.06(3)(a-c) Educator and Evaluator must consider team, grade, or department goals Educator proposes; supervisor determines At least: One goal for professional practice One goal for student learning, growth and achievement At the goal setting step in the cycle, educators establish at least one goal related to student learning, growth and achievement. That goal COULD involve MCAS Growth scores – again, individually or as a team.

8 “Multiple Measures of Student Learning” 35.07(1)(a)(1-4)
Measures of student progress on classroom assessments that are aligned with the Massachusetts Curriculum Frameworks, Vocational-Technical Curriculum Frameworks, or other relevant frameworks and are comparable within grades or subjects in a school; Measures of student progress on learning goals set between the educator and evaluator for the school year; State-wide growth measure(s) where available, including the MCAS Student Growth Percentile and the Massachusetts English Proficiency Assessment (MEPA); and District-determined measure(s) of student learning comparable across grade or subject district-wide. MCAS Growth scores are one of the multiple measures of student learning identified in the regulations. #3 here. Note #4, District determined measures. These are the “… measures of student learning, growth and achievement related to the Massachusetts Curriculum Frameworks, Vocational-Technical Education Frameworks or other relevant frameworks, that are comparable across grade or subject level district-wide. These measures may include, but shall not be limited to: portfolios, approved commercial assessments and district-developed pre and post unit and course assessments, and capstone projects.”

9 Rating Impact on Student Learning 35.09
Only a subset of “multiple measures” count in rating impact: MCAS Growth and MEPA gains, when available “District-determined measures” Examine “trends” and “patterns” Assign a rating: High impact Moderate impact (“one year’s growth”) Low impact In the new framework, every educator earns a rating related to their impact on student learning. Evaluators use only a subset of the “multiple measures” when deciding on the rating for impact: MCAS Growth if it’s available, and the other “district-determined measures” the district has identified. The evaluators examine trends and patterns in the results of two or more measures, and then assign one of three ratings: high, moderate, or low. So this is another way MCAS Growth comes into play in the new framework.

10 Linking Impact on Learning and Practice
Rating of Educator Practice Exemplary Proficient Needs Improvement Unsatisfactory Low Moderate High Impact on Student Learning (multiple measures of performance, including MCAS Student Growth Percentile scores) The framework links the two ratings this way: educators earn a rating on their practice: the vertical axis here showing exemplary, proficient, needs improvement, or unsatisfactory. They also earn a rating for their impact on student learning, the horizontal axis here showing low, moderate, and high. It’s worth repeating here that we all anticipate that a large majority of educators will be in the green area: rated proficient or exemplary on practice and rated as having a moderate or high impact on student learning. The intersection of those ratings determines the kind and duration of plan the educator is on.

11 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps Let’s move to a closer look at MCAS Student Growth Percentile.

12 MCAS Growth to Grade 7: Three Students
Gina [Chart: Gina's ELA scaled scores, 2006–2008, plotted against the distribution of her academic peers' grade 7 scores, banded by SGP range – 80 to 99, 60 to 79, 40 to 59, 20 to 39, 1 to 19 – spanning Advanced down to Warning/Failing; 65% of peers scored above Gina's 230, 35% below. SGPs between 40 and 59 are typical.] These next slides are from ESE's orientation materials on the MCAS Student Growth Percentile (SGP), available on the ESE website. 1. Each student's rate of change is compared to other students with a similar test score history ("academic peers"). 2. The rate of change is expressed as a percentile. How much did Gina improve in mathematics from 5th grade to 6th grade, relative to her academic peers? If Gina improved more than 35 percent of her academic peers, then her student growth percentile would be 35. 3. Gina scored 230 on the ELA MCAS in grade 5 in 2006, and 230 in grade 6 in 2007. We looked at all the students who scored similarly to her to see how they did in ELA in grade 7. 4. Some of those students' scores dropped substantially, to Warning/Failing. Some students with this test score history get all the way up to Advanced. The typical youngster (in the orange part of the triangle) improves a little. Note that this is real data, not projections – this is how students with this test score history actually performed in 2008. 5. So what happened to Gina? In grade 7, Gina again scores 230. However, look at the distribution of her academic peers. Most of her academic peers scored higher than she did. In fact, 65% of them scored higher than she did. She outscored 35% of her academic peers. Therefore, Gina has a student growth percentile of 35. 6. Without the growth model, it would seem as if a student scoring 230, 230, 230 is staying put, neither improving nor declining. However, we can now see from the growth model that this student improved less than 65% of her peers. 7. Note: The SGP groups on the right define the meaning of the bands in the triangles. The top band corresponds to the scaled scores for students who achieved 80th to 99th percentile growth; the next band corresponds to those with 60th to 79th percentile growth, and so on.
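To make the arithmetic on this slide concrete, here is a minimal sketch of the simplified reading above: the percent of academic peers the student outscored. This is only the intuition; the official SGP is estimated by ESE with quantile regression, not this direct calculation, and the peer scores below are hypothetical.

```python
from bisect import bisect_left

def simple_sgp(student_score, peer_scores):
    """Percent of academic peers the student outscored, clamped to 1-99.

    Illustrative only: the official MCAS SGP is derived with quantile
    regression over students with a similar prior-score history.
    """
    peers = sorted(peer_scores)
    outscored = bisect_left(peers, student_score)  # peers scoring strictly below
    pct = round(100 * outscored / len(peers))
    return min(max(pct, 1), 99)  # reported SGPs run from 1 to 99

# Hypothetical peer group: 7 of 20 academic peers scored below Gina's 230
peers = [218, 220, 222, 224, 226, 228, 229] + [232] * 13
print(simple_sgp(230, peers))  # 35, matching the slide's example
```

The clamp to 1–99 mirrors the reporting scale; a student above (or below) every peer still reports as 99 (or 1).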

13 Growth to Grade 7: Three Students
Harry [Chart: Harry's ELA scaled scores, 2006–2008, plotted against the distribution of his academic peers' grade 7 scores; 75% of peers scored above his 244, 25% below.] 1. Harry scored 248 in grade 5, and 248 in grade 6. He's a high scorer. 2. He's compared with all the students that demonstrated a similar pattern of scoring. 3. In grade 7, he scores 244, which doesn't seem like a big dip in achievement. However, look at the distribution of his academic peers. Most of his academic peers scored higher than he did. In fact, 75% of them scored higher than he did. He only outscored 25% of his academic peers. Therefore, Harry has a student growth percentile of 25. 4. SGP gives us a way to understand growth for high achieving students. High achieving students can grow at a low, moderate or high rate, just as low achieving students can.

14 Growth to Grade 7: Three Students
Ivy [Chart: Ivy's ELA scaled scores, 2006–2008, plotted against the distribution of her academic peers' grade 7 scores; only 8% of peers scored above her 226, 92% below.] 1. Here's a low achieving student. Ivy scored 214 in grade 5, and 214 in grade 6. She's compared with all the students that demonstrated a similar pattern of scoring. 2. In grade 7, she scores 226. While she's still in the Needs Improvement performance level, look at the distribution of her academic peers. She outscored 92% of her academic peers. Only 8% of her academic peers scored higher than she did. Therefore, Ivy has a student growth percentile of 92. 3. 214 to 226 seems like a modest improvement, but a growth percentile of 92 indicates that this student grew an unusually high amount between grades 6 and 7. Without the growth model, we would have known Ivy had made a big improvement, but we would not have been able to understand how really big it was.

15 Growth to Grade 7: Three Students
Gina, Harry, and Ivy [Chart: all three students plotted against their peer distributions – Harry in Advanced, Gina in Proficient, Ivy in Needs Improvement.] 1. Each student – high achieving Harry, moderate achieving Gina, and low achieving Ivy – has the exact same opportunity to grow anywhere from the 1st percentile to the 99th. Let me repeat that, because during the public comment period leading up to the Board's vote on the new regulations, we heard concern that schools with high achieving students would be at a disadvantage if we use MCAS Growth scores. They worried that high achieving students have little room for improvement: Harry can't improve as much as Ivy. In fact, when each is compared to his/her "academic peers" (students with the same scoring pattern in past years), each has an equal chance of winding up growing faster than those peers. Each peer group is going to have a student whose growth places him at the 99th percentile of the group, and each is going to have one who places in the bottom 1%, and everything in between. 2. Note: The trend in the range is pointing up slightly. That indicates that the median student in grade 6 ELA tends to score a bit better in grade 7 ELA, no matter if they're in the Warning, Needs Improvement, Proficient, or Advanced category in grade 6.

16 Growth to Grade 7: Three Students
English language arts

Student   Grade 5 (2006)   Grade 6 (2007)   Grade 7 (2008)   SGP (2008)
Gina      230              230              230              35
Harry     248              248              244              25
Ivy       214              214              226              92

1. This is a numerical representation of the graphical data we just showed. 2. The student with the lowest absolute score actually demonstrated the most growth (and vice versa). 3. Low-achieving Ivy is high-growing Ivy; high-achieving Harry is low-growing Harry.

17 Median Student Growth Percentile
Last name   SGP
Lennon      6
McCartney   12
Starr       21
Harrison    32
Jagger      34
Richards    47
Crosby      55
Stills      61
Nash        63
Young       74
Joplin      81
Hendrix     88
Jones       95

Imagine that the students listed to the left are all the students in your 6th grade class. Note that they are sorted from lowest to highest SGP. The point where 50% of students have a higher SGP and 50% have a lower SGP is the median: here, Crosby's 55 is the median SGP for the 6th grade class. Rules of thumb: 1. Typical student growth percentiles are between about 40 and 60 on most tests. 2. A student or group outside this range has higher or lower than typical growth. 3. Differences of fewer than 10 student growth percentile (SGP) points are likely not educationally meaningful. Reminder: to calculate the median, sort student SGPs from lowest to highest and find the middle student. Note: this "class" of students is composed of 1960s-era rockers; e.g., "Jones" is Davy Jones.
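The slide's reminder (sort the SGPs, take the middle student) can be checked in a couple of lines. A minimal sketch using the roster above:

```python
from statistics import median

# SGPs for the 13 students on the slide, already sorted lowest to highest
sgps = [6, 12, 21, 32, 34, 47, 55, 61, 63, 74, 81, 88, 95]

class_median = median(sgps)  # odd count, so this is the 7th student's SGP
print(class_median)  # 55 -- Crosby, inside the "typical" 40-60 band
```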

18 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps So let’s take a look at how Student Growth Percentile can be used. We’ll turn to Lowell.

19 Challenging a Level 4 School
Median Student Growth Percentile, English Language Arts [Chart: three-year median SGP trends for the Murkland, Lincoln, and Sokolovsky schools.] Lowell had one of its schools named a Level 4 school in spring 2010: the Murkland School. At first the staff objected. They knew the school's very low performance on MCAS – as measured by the composite proficiency index (CPI) – was among the lowest for an elementary school in the Commonwealth. But they attributed their students' weak performance on MCAS ELA to the large percentage of poor students they taught, many of whom were English Language Learners from the city's poorest neighborhood. CLICK But then MCAS Growth Scores caught up with them: here was the three-year trend. They looked 10 blocks away to another Lowell school – the Lincoln School – that taught just as many poor students and even more English language learners and students with disabilities. And what they found nearly instantly changed their frame of mind. CLICK They saw that the Lincoln's growth scores over those same years told a very different story of student achievement. The Murkland teachers got angry – not at ESE this time, but at themselves. They said: "We can do better! Look at what our colleagues at the Lincoln are doing!" And so they set about the hard work of figuring out what THEY had to do differently to get results that compare with the Lincoln's. They're now one of the most focused, most purposeful of the Commonwealth's Level 4 schools. And their preliminary results for 2011 tell a very different story of student growth (one we can't share in public until MCAS scores are official in September!). By the way, using a new ESE resource, DART (District Analysis and Review Tool), Lowell folks are now looking elsewhere in the state to find schools like their own that are outperforming Lowell's best. CLICK According to DART, here is the top performer among comparable schools when looking at Student Growth Percentile: it's the Sokolovsky in Chelsea. The URL for the DART is at the bottom of the slide.

20 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps Now let's look at how MCAS Growth Scores can support self-assessment, staying with Lowell.

21 A 5-Step Evaluation Cycle
Remember, the first step in the evaluation cycle – for an individual, a team or a school – is self-assessment. We want to share an example of how one district – Lowell again – took a hard look at MCAS Student Growth Data to help inform one school's self-assessment efforts – to understand better its strengths and weaknesses.

22 Lowell likes to use distribution charts that show the percentage of students in each quintile (20%) of the SGP distribution. By definition, 20% (or 1/5) of all the students statewide at each grade/subject fall in each quintile: very high, high, moderate, low and very low. Lowell wanted to investigate mathematics performance, so they first looked at differences in growth percentile distribution by grade. They were excited by what they saw at the 6th grade especially: statewide in 2009, 40% of 6th grade students demonstrated high or very high growth; in Lowell, 55% of 6th graders had high or very high growth in 2009. They wanted to unpack this data, so they looked at the data by school. Lowell Public Schools
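A quintile distribution chart like Lowell's can be tallied directly from raw SGPs. A sketch, assuming the five statewide bands from the earlier SGP legend (1-19, 20-39, 40-59, 60-79, 80-99) and reusing the sample rocker roster from the median-SGP slide rather than Lowell's actual data:

```python
def quintile_distribution(sgps):
    """Percent of students in each statewide SGP quintile band (rounded)."""
    bands = [("very low", 1, 19), ("low", 20, 39), ("moderate", 40, 59),
             ("high", 60, 79), ("very high", 80, 99)]
    n = len(sgps)
    return {name: round(100 * sum(lo <= s <= hi for s in sgps) / n)
            for name, lo, hi in bands}

# Sample roster, not Lowell data; statewide, each band would hold ~20%
sample = [6, 12, 21, 32, 34, 47, 55, 61, 63, 74, 81, 88, 95]
print(quintile_distribution(sample))
# {'very low': 15, 'low': 23, 'moderate': 15, 'high': 23, 'very high': 23}
```

Comparing a school's tally against the 20%-per-band statewide baseline is exactly the kind of reading Lowell did (e.g., 55% high or very high vs. 40% statewide).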

23 They found tremendous variation among schools in Grade 6 performance: at the Pyne, only 6% of all students had low or very low growth; at the Bartlett, only 24% of 6th graders grew at a high or very high rate. What especially caught their eye, though, was performance at the Stoklosa Middle School. While Pyne, Wang and Daley serve a heterogeneous mix of students, Stoklosa serves the poorest section of the city. Over 90% of its students are poor; 47% are English Language Learners. Yet 44% of 6th graders had grown at a very high rate and another 27% at a high rate, a total of 71% of students with SGPs over 60.

24 They wanted to know if this was a fluke. It wasn’t
They wanted to know if this was a fluke. It wasn't. The year before, 62% of Stoklosa's 6th graders had grown at a high or very high rate. Note, too, that the Bartlett's 6th graders the year before had grown at a faster rate than the state: 55% had grown at a high or very high rate.

25 Lowell officials wanted to dig deeper, so they disaggregated the data for Stoklosa's 6th graders. They discovered that their ELL students' MCAS growth was high. They dug deeper.

26 Disaggregating results for regular education and ELL students by grade led to new discoveries:
Very high growth for both regular education and ELL students was 2 to 3 times higher than the state at all grade levels except 7th grade. High and very high growth was exceptionally strong for both 5th and 6th grade students, and considerably lower for 7th grade students.

27 Other measures that can help Stoklosa staff assess impact on student learning and growth
Massachusetts English Proficiency Assessment (MEPA) District math benchmark assessments District-adopted curriculum-embedded performance assessments Lowell is seeking out other data for its self-assessment. They will look at MEPA results for sure. Two "district-determined measures" they are likely to use are the district math benchmark assessments and the curriculum-embedded performance assessments the district hopes to develop, so that assessment can be embedded in regular instruction and less teaching time is lost to testing.

28 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps So this takes us to goal setting. We’re going to first touch on MCAS Growth Scores as a tool, but to do so, we need to look at the attributes of good goals in general.

29 A 5-Step Evaluation Cycle
First, goal setting can't be done outside the context of analysis of data and practice and the development of a plan of action. That's why we lay out here key attributes for effective goal setting.

30 Attributes of a Useful Goal
Specific and Strategic Measurable and Monitored Action oriented and Agreed upon Realistic and Results oriented Timed and Tracked - Dr. Edward W. Costa II, Lenox Public Schools Table Talk: Think of the last time you set a goal for yourself: learning a new skill, losing weight, completing an important task. Circle the attributes here that you included in your goal; put an X through any you neglected to include. With a partner, share what you learned about your skills in goal setting from your circles and x’s.

31 Goal Setting 35.06(3)(a,c) Educator and Evaluator must consider team, grade, or department goals At least: One goal for professional practice One goal for student learning, growth and achievement Here is what the new framework calls for in goals: you must consider group goals; you must have at least one goal dealing with professional practice; you must have at least one dealing with student learning. Let’s look at how a team of principals might use MCAS Growth Scores as a tool in goal setting for their own evaluation by the superintendent.

32 SMART Goals Middle School Principal
Professional practice goal: I will complete 100% of goal setting conferences with my teaching teams by October 15th; seek anonymous feedback about staff perceptions of their usefulness; study effective goal setting with my colleagues; and identify steps to take in our mid-year formative assessment conferences with each team to improve the likelihood that their practice and student growth goals will be achieved. Student learning goal: The proportion of students with "high" or "very high" SGP will increase by 5 percentage points in both ELA and Math.
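The principal's student learning goal is measurable by design, and one could verify it at year's end with a simple tally. A sketch with hypothetical rosters: the 5-point threshold and the SGP >= 60 cutoff for "high"/"very high" come from the goal above, and nothing here is prescribed by the framework itself.

```python
def high_growth_share(sgps):
    """Fraction of students with "high" or "very high" growth (SGP >= 60)."""
    return sum(s >= 60 for s in sgps) / len(sgps)

def goal_met(baseline_sgps, current_sgps, min_gain=0.05):
    """True if the high/very-high share rose by at least 5 percentage points."""
    return high_growth_share(current_sgps) - high_growth_share(baseline_sgps) >= min_gain

# Hypothetical rosters: 20% high-growth last year, 30% this year
baseline = [50] * 8 + [70] * 2
current = [50] * 7 + [70] * 3
print(goal_met(baseline, current))  # True: a 10-point gain clears the 5-point bar
```

The same check would be run separately for ELA and Math, since the goal names both.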

33 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps Let’s move to goal setting more broadly, using other “district-determined measures”.

34 SMART Goals 8th Grade Social Studies Teacher
Professional practice goal: To improve expository writing, I will study the "workshop process" for writing with my fellow middle school social studies teachers, observe it in practice, and introduce it in at least two of my classes by the start of second term. Student learning goal: At the end of the third quarter unit on the Constitution, students will demonstrate proficiency by writing a pamphlet for new citizens about their constitutional rights. Using a department-developed rubric, a majority of my students will have moved up one level on the writing component of the rubric since the end of the first term.

35 SMART Goals Middle School Music Teacher
Professional Practice goal: I will collaborate with my colleagues in the music department to develop, pilot, analyze, revise and share 2 performance-based assessments. Student Learning goal: My students will be able to identify and apply music terms, symbols and definitions in the curriculum guide for 6th, 7th and 8th grade. Using a department-developed assessment, 75% of my students will score 85% or above on the third quarter assessment.

36 SMART Goals 10th Grade Geometry Teacher
Professional practice goal: To engage students more, starting second term, we will incorporate at least one real-world application of geometry into 2 of every 5 homework assignments. Student learning goal: 10% more of our students will score 80% or above on the district-developed third quarter exam.

37 SMART goals Stoklosa’s 7th grade team
Professional Practice goal: I/We will… Student Learning goal: My/Our students will… Let's return to the Stoklosa now. None of us knows anywhere near everything we would need to know to develop goals we could have real confidence will make a difference. But imagine you know enough. Table Talk: with a partner, draft a practice goal and a student learning goal for the Stoklosa 7th grade team that hold promise of moving the majority of their students' growth percentile scores into "high" or "very high," as their colleagues in 5th, 6th and 8th grade have done. Share out.

38 Agenda Context for MCAS within the new framework
MCAS Growth Scores: a primer MCAS Growth Scores: a case study MCAS Growth Scores: tool for self-assessment MCAS Growth Scores: tool for goal-setting Goal Setting Next Steps Let’s focus on next steps.

