Evaluation Results 2002-2007: The Missouri Reading Initiative (MRI)


Evaluation Results

MRI’s Evaluation Activities:
- Surveys: Teacher Beliefs and Practices (pre/post); Annual Participant Questionnaire
- Data Collection: Test Scores (Standardized Tests, Classroom Assessments (DRA), MAP); Demographics; Special Education Information
- MAP Analyses

MAP ANALYSES: MAP analyses compare schools that have completed the MRI program with a randomly chosen sample of non-MRI elementary schools. Results indicate that MRI schools generally outperform non-MRI schools, though this association is not proof of a causal relationship.

Notes for MAP Analyses: In the following MAP Analyses charts, the exact numbers matter less than the comparative performance between MRI and non-MRI schools. This is because:
1. Scores vary from year to year and from school to school.
2. The calculation of the baseline changes as more data become available. Longer baselines smooth out year-to-year variation, producing "flatter" or lower results.
- For 2002 schools, 1999 was the baseline.
- For 2003 schools, the average of 1999-2000 was the baseline.
- For 2004 schools, the average of 1999-2001 was the baseline.
- For 2005 schools, the average of 2000-2002 was the baseline.
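To make the baseline arithmetic concrete, here is a minimal Python sketch of the calculation described above. It is an illustration only, not MRI's actual procedure; the school and index values are hypothetical.

# Sketch of the baseline/percent-change calculation described above.
# All values are hypothetical.
def baseline_average(scores, baseline_years):
    # Average the index over whichever baseline years are available.
    values = [scores[year] for year in baseline_years if year in scores]
    return sum(values) / len(values)

def percent_change(scores, baseline_years, outcome_year):
    # Percentage change from the baseline average to the outcome year.
    base = baseline_average(scores, baseline_years)
    return (scores[outcome_year] - base) / base * 100

# Hypothetical 2004-cohort school (baseline = average of 1999-2001):
school = {1999: 180.0, 2000: 184.0, 2001: 182.0, 2004: 195.0}
print(round(percent_change(school, [1999, 2000, 2001], 2004), 1))  # 7.1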

(Chart: Comparison of MRI and Random Samples - Average % Change in Communication Arts Index per School)

MAP Results: In 2006 the MAP test was changed in ways that make comparisons to previous years difficult:
- Communication Arts only
- Achievement levels were reduced from five to four
- Scaled Score intervals for categories were changed
- Questions were adjusted to apply to the multiple grade levels tested (Grades 3-8 instead of only 3 and 7)

MAP Results: The comparison between MRI and the random sample of Missouri elementary schools was made in terms of the percentage change, from a 3-year baseline to the outcome year, in the share of students who scored in the top two achievement levels (Proficient and Advanced); each outcome year (2006 and 2007) was compared against its own 3-year baseline.
- In 2006 this was done for 1st- and 2nd-year K-3 MRI schools, because there was only one 3rd-year graduating school in 2006.
- In 2007 the analysis was done for 3rd-year schools only (n=17), for all grades 3-8.

Adequate Yearly Progress: As mandated by federal law, Missouri schools must meet yearly progress goals in MAP scores. For Communication Arts those goals were defined as the percentage of students scoring Proficient or better:
- 2003: 19.4%
- 2004: 20.4%
- 2005: 26.6%
- 2006: 34.7%
- 2007: 42.9%
The following table compares MRI schools with state-wide results.

Percentage of Schools Meeting AYP Levels (Proficient and Advanced)
AYP targets: 2003 = 19.4%; 2004 = 20.4%; 2005 = 26.6%; 2006 = 34.7%; 2007 = 42.9%

Year    MRI              State
2003    81.1% (60/74)    50.9% (1,046/2,053)
2004    100%  (50/50)    77.27% (1,569/2,033)
2005    80.0% (28/35)    64.7% (1,317/2,036)
2006*   78.5% (22/27)    62.6% (1,291/2,061)
2007    81.0% (17/21)    59.4% (77/131)**

* Beginning in 2006, AYP was calculated for grades 3-8.
** Results for a randomly selected sample of elementary schools; the state did not publish figures for the entire population.
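For readers who want to see how the table's percentages are formed, each cell is simply the share of schools whose percentage of students scoring Proficient or better met that year's AYP target. A minimal sketch, using hypothetical per-school values rather than the actual data:

# Sketch: share of schools meeting a year's AYP target.
# The per-school percentages are hypothetical.
def share_meeting_ayp(school_percentages, target):
    meeting = [p for p in school_percentages if p >= target]
    return len(meeting) / len(school_percentages)

mri_2003 = [25.0, 31.2, 18.0, 40.5, 22.7]  # hypothetical % Proficient+Advanced
print(f"{share_meeting_ayp(mri_2003, 19.4):.1%}")  # 80.0% with these toy values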

Teaching and Learning Survey: In this survey, classroom teachers were asked to identify instructional practices and their frequency of use (on a scale of 1 = Never to 5 = Almost Daily) for a number of critical elements related to the goals of MRI training. One way of looking at the data is to identify the practices that "pre" respondents did not use frequently (mean less than 3) and ask whether the "post" responses reflect any changes.
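A minimal sketch of that screening step, assuming responses are recorded per item on the 1-5 scale; the item labels and responses below are hypothetical, not actual survey data:

# Sketch: flag practices whose "pre" mean falls below 3 on the
# 1 (Never) to 5 (Almost Daily) scale, then compare "post" means.
# Item labels and responses are hypothetical.
def mean(responses):
    return sum(responses) / len(responses)

pre  = {"A8": [2, 3, 2, 1, 3], "A16": [2, 2, 3, 2, 2], "A5": [4, 5, 4, 4, 5]}
post = {"A8": [4, 4, 3, 4, 5], "A16": [3, 4, 4, 3, 4], "A5": [4, 5, 5, 4, 5]}

infrequent = [item for item in pre if mean(pre[item]) < 3]
for item in infrequent:
    print(item, f"pre={mean(pre[item]):.1f}", f"post={mean(post[item]):.1f}")
# A8 pre=2.2 post=4.0
# A16 pre=2.2 post=3.6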

Teaching and Learning Survey Items: K-3 "pre" (2004) Mean < 3
- 7: Assesses reading progress by use of informal assessments (running records, CAP, DRA, letter identification, etc.)
- 8: Implements reading workshop
- 11: Writes a text collaboratively with students sharing the pen
- 16: Uses scoring guides/rubrics to assess student writing
- 17: Implements writing workshop
- 21: Provides opportunities for students to use computers to write, publish, and practice

(Chart: pre/post mean responses for survey items A7, A8, A11, A15, A16, A17, A20, A21; 3rd Year Respondents, n=63)
K-3 Practice Changes: The only category without significant change was #21, "Provides opportunities for students to use computers to write, publish, and practice," suggesting that technology does not necessarily need to play a large role in literacy instruction.

Teaching and Learning Survey Items: 4-6 "pre" (2004) Mean < 3
- 6: Conferences with students individually to discuss reading and comprehension strategies
- 7: Assesses reading progress by use of informal assessments (running records, CAP, DRA, letter identification, etc.)
- 8: Implements reading workshop
- 12: Conferences with students individually to discuss their writing progress
- 13: Collects student writing samples to document writing progress over time
- 15: Implements writing workshop
- 18: Provides opportunities for students to use computers to write, publish, and practice

(Chart: pre/post mean responses for survey items A6, A7, A8, A12, A13, A15, A18; 4-6 Grade Respondents, n=142)
Practice Changes: The evidence presented here supports the statement that while practices did change, the strength of the change is less than what was observed for K-3. The difference in intensity between the K-3 and upper-grade teaching cohorts is likely a result of the upper grades being more departmentalized, with more content-area teachers whose primary responsibilities lie in subject areas other than literacy. Also, the upper-grade teachers were more likely than the K-3 respondents to have increased their use of computers during the three years of training.

Participant Survey: Participants rate the usefulness of program components, practice change, "buy-in," attitudes toward the program and trainer, etc. Results drive program change (e.g., the Upper Grade changes over time). Please see the "2007 Survey Results" PowerPoint presentation for more detailed results of the Participant Survey between 2002 and 2007.

Participant Survey: There are two positive trends reflected in the MRI End-of-the-Year Participant Questionnaire: (1) participants rate the program higher with the passage of time; and (2) each year, the entry level of satisfaction rises for new cohorts. The following table demonstrates these trends between 2002 and 2007.

Participant Survey: “Rate” by MRI Program Year
Question: “Reflecting on the effectiveness of the MRI program as a whole, how would you rate it?” (1 = Poor to 5 = Excellent)
Respondents per survey year: 2002 (N=733); 2003 (N=956); 2004 (N=770); 2005 (N=642); 2006 (N=617); 2007 (N=488)
- 1st Year cohorts: K-3 = 4.2, 4-6 = 3.6; K-3 = 4.2, 4-6 = 3.7; K-3 = 4.2, 4-6 = 4.6
- 2nd Year cohorts: K-3 = 4.1, 4-6 = 3.7; K-3 = 4.3, 4-6 = 4.2
- 3rd Year cohorts*: K-3 = 4.2, 4-6 = 3.9
* 3rd Year schools were interviewed in 2002.

Participant Survey: “Rate” by Participant Position. The following table presents the average responses to the “Rate” question over time, by respondent position. Note how the scores have improved for Grades 4-6. Analysis of the earlier, lower scores revealed issues related to the varying professional development needs of grades where teachers are more likely to teach specific content areas.

(Table: Average of “Rate” by Position, by survey year; rows for Position/Grade K, 1st, 2nd, 3rd, 4th, 5th, 6th (4th-6th marked “na” in some years), Reading, Title I, Special Ed, Administration, and Total)