Evaluation Results 2002-2008: The Missouri Reading Initiative (MRI)


Evaluation Results

MRI’s Evaluation Activities:
- Surveys
  - Teacher Beliefs and Practices (pre/post)
  - Annual Participant Questionnaire
- Data Collection
  - Test Scores: Standardized Tests (MAP); Classroom Assessments (DRA)
  - Demographics
  - Special Education Information
- MAP Analyses

MAP Analyses: MAP analyses compare schools that have completed the MRI program with a randomly chosen sample of non-MRI elementary schools. Results indicate that MRI schools generally outperform non-MRI schools. (This is suggestive, not proof of a causal relationship.)

Notes for MAP Analyses
In the following MAP Analyses charts, the exact numbers are less important than the comparative performance between MRI and non-MRI schools. This is because:
1. There is variation in the scores from year to year and from school to school.
2. The calculation of the baseline changes as more data become available. Longer baselines mean less variation, resulting in “flatter” or lower results.
- For 2002 schools, 1999 was the baseline
- For 2003 schools, the average of 1999-2000 was the baseline
- For 2004 schools, the average of 1999-2001 was the baseline
- For 2005 schools, the average of 2000-2002 was the baseline
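To make the baseline convention concrete, here is a minimal Python sketch, assuming a hypothetical layout of Communication Arts index scores keyed by year (the cohort-to-baseline mapping follows the list above; MRI’s actual computation may differ):

# Minimal sketch of the baseline convention above (hypothetical data layout).
BASELINE_YEARS = {  # MRI cohort year -> baseline years, per the list above
    2002: [1999],
    2003: [1999, 2000],
    2004: [1999, 2000, 2001],
    2005: [2000, 2001, 2002],
}

def baseline_index(scores_by_year, cohort_year):
    """Average a school's index score over its cohort's baseline years."""
    years = BASELINE_YEARS[cohort_year]
    return sum(scores_by_year[y] for y in years) / len(years)

# Example: a hypothetical 2005-cohort school.
scores = {2000: 188.0, 2001: 195.5, 2002: 201.0}
print(baseline_index(scores, 2005))  # about 194.8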

[Chart: Comparison of MRI and Random Samples – Average % Change in Communication Arts Index per School]

MAP Results
In 2006 the MAP Communication Arts test was changed in ways that make comparisons to previous years difficult:
- Achievement levels were reduced from five to four
- Scaled Score intervals for the categories were changed
- Questions were adjusted to apply to the multiple grade levels now tested (Grades 3-8 instead of only 3 and 7)

MAP Results
For 2006 and 2007, the comparison between MRI schools and the random sample of Missouri elementary schools was made in terms of the percentage change, between a 3-year baseline and the outcome year, in students who scored in the top two achievement levels (Proficient and Advanced):
- 2006: Baseline = …
- 2007: Baseline = …
In 2006 this was done for 1st- and 2nd-year K-3 MRI schools (n=20) because there was only one 3rd-year graduating school in 2006. In 2007 the analysis was done for 3rd-year schools only (n=17), for all grades 3-8.

MAP Analysis
In 2008 we now have three years of data since the MAP test was revised in 2006. In this analysis we compare the results of MRI, two other Missouri professional development programs (Programs I and II), and a Random Sample (RS) of Missouri elementary schools. The outcome measure is the same as that used by federal and state programs in determining Adequate Yearly Progress (AYP): the percentage of students scoring at or above Proficient.

Steps in the MAP Analysis
Step 1: Get the percentage of students Proficient or Advanced (Prof+) for each school. Source: AYP Reports at http://dese.mo.gov/schooldata/school_data.html
Step 2: Calculate a baseline as the average Prof+ of 2006 and 2007
Step 3: Calculate the change (∆) in Prof+ from the baseline to 2008 for each school
Step 4: Calculate the average and median ∆ for each group (MRI, I, II, and Random Sample)
Step 5: Calculate standard deviations, skew, and the pre-baseline average for each group
Step 6: Remove from each group all schools whose ∆ was more than 2 standard deviations from the group mean
Step 7: Repeat Steps 1-5
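A minimal Python sketch of Steps 2-7, assuming the 2006-2007 baseline reading above and a hypothetical mapping of schools to yearly Prof+ percentages (the authoritative analysis remains MRI Research and Assessment’s):

# Sketch of Steps 2-7 (hypothetical data layout; skew and the pre-baseline
# average from Step 5 are omitted for brevity).
from statistics import mean, median, stdev

def summarize(group):
    """group: dict of school -> {year: percent Prof+}."""
    # Steps 2-3: baseline = average Prof+ of 2006-2007; delta = 2008 - baseline.
    deltas = [yrs[2008] - mean([yrs[2006], yrs[2007]]) for yrs in group.values()]
    # Steps 4-5 (abridged): group average and spread.
    avg, sd = mean(deltas), stdev(deltas)
    # Step 6: drop schools whose delta lies more than 2 SD from the group mean.
    kept = [d for d in deltas if abs(d - avg) <= 2 * sd]
    # Step 7: recompute the summary statistics on the trimmed group.
    return mean(kept), median(kept)

# Example with three hypothetical schools:
mri = {
    "A": {2006: 38.0, 2007: 42.0, 2008: 49.5},
    "B": {2006: 30.5, 2007: 31.5, 2008: 36.0},
    "C": {2006: 44.0, 2007: 46.0, 2008: 51.0},
}
print(summarize(mri))  # average and median delta after trimming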

The data in this table support the statement that, between 2006 and 2008, MRI schools made larger average gains in the percentage of students scoring Proficient or better on the MAP Communication Arts test than did two other Missouri professional development programs and a random sample of Missouri elementary schools. The samples in the table are from three different professional development programs plus a random sample of Missouri schools, and have been adjusted by removing “outliers” beyond +/- 2 standard deviations. The complete analysis, including supporting data, is available from MRI Research and Assessment.

Adequate Yearly Progress
As mandated by federal law, Missouri schools must meet yearly progress goals in MAP scores. For Communication Arts those goals were defined as the percentage of students scoring at Proficient or better:
2003 = 19.4%; 2004 = 20.4%; 2005 = 26.6%; 2006 = 34.7%; 2007 = 42.9%; 2008 = 51.0%
The following table compares MRI schools with state-wide results.

Percentage of Schools Meeting AYP Levels (Proficient and Advanced)

Year | MRI (1) | State
2003 | 81.1% (60/74) | 50.9% (1,046/2,053)
2004 | 100% (50/50) | 77.27% (1,569/2,033)
2005 | 80.0% (28/35) | 64.7% (1,317/2,036)
2006 (2) | 81.5% (22/27) | 62.6% (1,291/2,061)
2007 | 81.0% (17/21) | 53.6% (1,125/2,100)
2008 | 68.3% (28/41) | +/-40% (3) (+/-881/2,203)

(1) Includes “Safe Harbor” and “Confidence Interval” results
(2) Beginning in 2006 AYP was calculated for grades 3-8 and 11
(3) In 2008 DESE reported the results for all schools as follows: “Only one-fourth of all school districts and about 40 percent of school buildings met this year’s proficiency targets for adequate yearly progress (AYP).” The Title I ratio was more specific: 44.8% met AYP in 2008 (See-
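Meeting the Communication Arts target is simple arithmetic against the annual goals listed above; here is a minimal Python sketch (the “Safe Harbor” and “Confidence Interval” adjustments noted in footnote 1 are omitted):

# Sketch: check a school's Prof+ percentage against the annual AYP targets.
AYP_TARGETS = {2003: 19.4, 2004: 20.4, 2005: 26.6, 2006: 34.7, 2007: 42.9, 2008: 51.0}

def meets_ayp(prof_plus_pct, year):
    """True if the percentage Proficient or better reaches the year's target."""
    return prof_plus_pct >= AYP_TARGETS[year]

print(meets_ayp(40.0, 2008))  # False: roughly the 2008 state-wide result
print(meets_ayp(40.0, 2005))  # True: the same score cleared the 2005 target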

Teaching and Learning Survey
In this survey classroom teachers were asked to identify instructional practices and their frequency of use (on a scale of 1 = Never to 5 = Almost Daily) for a number of critical elements related to the goals of MRI training. One way of looking at the data is to identify those practices that were not frequently utilized by “pre” respondents (mean less than 3) and ask whether any changes are reflected in the “post” responses.
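A minimal Python sketch of that selection rule, assuming a hypothetical layout of item responses on the 1-5 scale:

# Sketch: flag items whose "pre" mean falls below 3 on the 1-5 frequency
# scale, then pair each flagged item with its "post" mean.
from statistics import mean

def low_use_items(pre, post, cutoff=3.0):
    """pre/post: dict of item -> list of 1-5 ratings (hypothetical layout)."""
    flagged = {}
    for item, ratings in pre.items():
        pre_mean = mean(ratings)
        if pre_mean < cutoff:
            flagged[item] = (pre_mean, mean(post[item]))
    return flagged

# Example: item A8 ("Implements reading workshop"), hypothetical ratings.
pre = {"A8": [2, 3, 2, 1, 3]}
post = {"A8": [4, 4, 5, 3, 4]}
print(low_use_items(pre, post))  # pre mean 2.2, post mean 4.0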

Teaching and Learning Survey Items: K-3 “pre” (2005) Mean < 3
7: Assesses reading progress by use of informal assessments (running records, CAP, DRA, letter identification, etc.)
8: Implements reading workshop
11: Writes a text collaboratively with students, sharing the pen
15: Collects student writing samples to document writing progress over time
16: Uses scoring guides/rubrics to assess student writing
17: Implements writing workshop
20: Organizes literacy corners to provide independent practice for students
21: Provides opportunities for students to use computers to write, publish, and practice

K-3 Practice Changes: 3rd Year Respondents (n=170)
[Chart: “pre” and “post” means for items A7, A8, A11, A15, A16, A17, A20, and A21 – values not preserved in the transcript]
In most of these categories there has been significant change in self-reported implementation of critical practices. Unlike previous years, however, three of the practices (A11, A15, and A16) do not show the same kind of robust growth as in the past. An early analysis of the data suggests this result may reflect a relatively high number of kindergarten teachers in the sample, who might be less likely to use writing strategies than teachers in other grades.

Teaching and Learning Survey Items: Upper Grades “pre” (2005) Mean < 3
7: Assesses reading progress by use of informal assessments (running records, CAP, DRA, letter identification, etc.)
8: Implements reading workshop
12: Conferences with students individually to discuss their writing progress
13: Collects student writing samples to document writing progress over time
15: Implements writing workshop
18: Provides opportunities for students to use computers to write, publish, and practice

Upper Grade Practice Changes: Respondents (n=116)
[Chart: “pre” and “post” means for items A7, A8, A12, A13, A15, and A18 – values not preserved in the transcript]
The evidence presented here supports the statement that while there were practice changes, the strength of the variation is less than that observed for the K-3 cohort. Indeed, in one case (A13) no change in component usage was reported, which bears closer scrutiny. The difference in intensity between the K-3 and Upper Grade teaching cohorts is likely a result of the upper grades being more departmentalized, with more content-area teachers whose primary responsibilities are in subject areas other than literacy. In addition, as noted in previous reports, upper grade teachers are more likely to use technology as an instructional tool (A18).

2008 Participant Survey
Participants rate the program’s usefulness, component utilization, practice change, “buy-in”, attitudes toward the program and trainer, etc. Results drive program change (e.g., Program Orientation, the Upper Grade Program). Please see the “2007 Survey Results” PowerPoint presentation for more detailed results of the Participant Survey between 2002 and 2007.

Participant Survey
There are two positive trends reflected in the MRI End-of-the-Year Participant Questionnaire: (1) participants rate the program higher with the passage of time; and (2) each year the entry level of satisfaction rises for new cohorts. The following tables demonstrate these trends between 2002 and 2008.

Participant Survey: “Rate” by MRI Program Year
Question: “Reflecting on the effectiveness of the MRI program as a whole, how would you rate it?” (rated from Poor to Excellent)
[Table of ratings by MRI program year not preserved in the transcript]
We have found that ratings generally go up from year to year as participants become more familiar with the program and, more importantly, begin to see the tangible results of improved student reading in their classrooms.
*3rd Year schools were interviewed in 2002

Participant Survey: “Rate” by MRI Program Year
Beginning in 2005 MRI began expanding to higher grades, which have different dynamics and different scoring tendencies. Briefly, because the upper grades are increasingly departmentalized, content-area teachers are usually more resistant to literacy professional development than communication arts specialists are. Over time, however, upper grade scores improved to K-3 levels as MRI Trainers responded to participants’ concerns and adapted the program to upper grade teachers’ needs.
*In 2008, 1st year and 4-6 scores were depressed by an “outlier” district where four participating schools had unusually low scores. MRI staff will use this information to address whatever implementation issues there are and, as a consequence, we would expect to see the scores rebound in 2009.

DRA Results
The Developmental Reading Assessment (DRA) is a formalized classroom assessment that has proven to be an accurate indicator of a student’s actual reading level. It is a key element of the MRI program: “assessment drives instruction”, and the DRA allows teachers to be highly specific in responding to each individual student’s needs. The following slide presents changes in the percentage of students reading “At or Above” grade level at 2nd- and 3rd-year MRI schools for which DRA data had been reported and analyzed as of 9/30/2008. The results are organized by grade-level cohorts, that is, students who remain in the same class as they move up through the grades. ALL reporting cohorts show significant increases in students reading “At or Above” grade level as measured by the DRA.
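The cohort computation itself is straightforward; here is a minimal Python sketch, assuming hypothetical enrollment counts and reading the table’s Change column as percentage points:

# Sketch: percentage-point change in a cohort's students reading at or
# above grade level between the pre and post DRA administrations.
def cohort_change(pre_at_or_above, pre_total, post_at_or_above, post_total):
    pre_pct = 100.0 * pre_at_or_above / pre_total
    post_pct = 100.0 * post_at_or_above / post_total
    return round(post_pct - pre_pct, 1)

# Example with hypothetical counts: 12 of 40 students at or above grade
# level in the Fall, 27 of 40 the following Spring.
print(cohort_change(12, 40, 27, 40))  # 37.5 (percentage points)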

DRA: Change in Percentage of Students Reading “At or Above” Grade Level (“F” = Fall; “S” = Spring)
Columns: School | Grade Cohort | Pre-date | Post-date | Pre % | Post % | Change
[The transcript lists 17 school cohorts with pre dates between Fall 2005 and Spring 2007; the post dates, per-school values, and the “Average for All reported schools” row did not survive extraction]