1
1. The Process Rubrics (40 or 90) should be open soon.
2. The 2011-12 Data Profile and SI Plan are expected to open in December.
3. The complete CNA will be online this year. Three parts:
   * School Data Profile and Analysis
   * School Process Profile and Analysis
   * Summary Report (Goals Management)
4. The State will not be adjusting cut scores this year – schools are encouraged to compare their scores to the college-ready targets.
2
EdYes! is becoming MI-SAAS (Michigan School Accreditation and Accountability System).

Category 1: Student Proficiency and Improvement (Statewide Top to Bottom Ranking) on all tested content areas
* Proficiency – two-year average
* Improvement – two-year average increase or decrease in ELA and math, or four-year slope for writing, science, and social studies (elementary/middle) or all subjects (HS)
* Ranking less than 5% – unaccredited
* Ranking greater than 5% and less than 20% – interim
* Ranking 20% or greater – accredited

Category 2: Persistently Lowest Achieving (PLA) Schools list – If a school is on the PLA list, the initial accreditation status becomes "Unaccredited."

Category 3: Adequate Yearly Progress (AYP) status – If a school fails to make AYP AND the initial accreditation status is "Accredited," THEN the initial accreditation status is lowered to "Interim." AYP alone cannot drop you to "Unaccredited."
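One way to read how the three categories combine into an initial status is sketched below. This is a minimal illustration in Python, assuming the threshold and override rules exactly as stated on this slide; the function and argument names are hypothetical and this is not an official MDE calculation.

```python
def initial_accreditation_status(ranking_percentile, on_pla_list, made_ayp):
    """Illustrative sketch (assumed reading) of the MI-SAAS status logic above."""
    # Category 1: statewide Top to Bottom ranking thresholds
    if ranking_percentile < 5:
        status = "Unaccredited"
    elif ranking_percentile < 20:
        status = "Interim"
    else:
        status = "Accredited"

    # Category 2: a school on the PLA list starts out Unaccredited
    if on_pla_list:
        status = "Unaccredited"

    # Category 3: missing AYP lowers "Accredited" to "Interim", but AYP alone
    # cannot drop a school to "Unaccredited"
    if not made_ayp and status == "Accredited":
        status = "Interim"

    return status


# Example: ranked at the 25th percentile, not on the PLA list, missed AYP -> Interim
print(initial_accreditation_status(25, on_pla_list=False, made_ayp=False))
```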
3
Category Four
1. Do 100% of school staff hold Michigan certification (different from highly qualified)?
2. Is the school's annual School Improvement Plan published?
3. Are required curricula offered: Grade Level Content Expectations in grades K-8, MMC in grades 9-12?
4. Is a fully compliant Annual Report published?
5. Have the School Performance Indicators (40 or 90) been submitted?
6. Are literacy and math tested annually in grades 1-5?
7. Is the five-year high school graduation rate 80% or above, OR is the attendance rate 90% or above if the school does not have a graduation rate?
8. If the school was selected to participate in NAEP, did it do so?
9. Did the school test 95% of all students in every tested content area?

If the answer is "no" to any question in two consecutive years, the accreditation status is lowered one level, even if the "no" is for a different question each year. It is expected that these nine items will be assurances located in the School Improvement Plan.
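A rough sketch of the two-consecutive-year rule described above follows, in Python. It assumes a simple three-level status scale and treats "a 'no' in each of two consecutive years" as any of the nine answers being "no" in each year; names are hypothetical and this is not an official calculation.

```python
STATUS_LEVELS = ["Unaccredited", "Interim", "Accredited"]  # assumed three-level scale

def apply_category_four(status, answered_yes_last_year, answered_yes_this_year):
    """Sketch of the Category 4 rule: a 'no' in each of two consecutive years
    lowers the status one level, even if the 'no' is to a different question."""
    no_last_year = not all(answered_yes_last_year)  # any "no" among the nine last year
    no_this_year = not all(answered_yes_this_year)  # any "no" among the nine this year

    if no_last_year and no_this_year:
        level = STATUS_LEVELS.index(status)
        status = STATUS_LEVELS[max(level - 1, 0)]   # cannot drop below Unaccredited
    return status


# Example: one "no" in each of two consecutive years drops Accredited to Interim
last_year = [True] * 8 + [False]   # question 9 answered "no" last year
this_year = [False] + [True] * 8   # question 1 answered "no" this year
print(apply_category_four("Accredited", last_year, this_year))  # Interim
```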
4
Why should we monitor our school improvement efforts?
5
[Graphic: Leadership and Learning Center, 2010]
6
What should we be monitoring in our school improvement efforts?
How should we be monitoring our school improvement efforts?
What are you currently doing in your school to monitor your school improvement efforts?
7
MONITOR

Monitor implementation of the plan (formative):
1. How are these pillars being implemented with fidelity by all stakeholders?
2. How is student achievement being impacted?

Evaluate the plan (summative):
1. Did all stakeholders implement the plan with fidelity?
2. Did it impact student achievement in the way we thought it would?

* Are people completing their assigned tasks?
* Are we implementing the plan correctly and consistently?
* Are we regularly analyzing student data?
* Do we have the right tools to measure our measurable objectives, or do we need others?
* Are we giving the process enough time? Enough resources?
8
Monitor implementation of the plan (formative):
1. How are these pillars being implemented with fidelity by all stakeholders?
2. How is student achievement being impacted?

Evaluate the plan (summative):
1. Did all stakeholders implement the plan with fidelity?
2. Did the plan impact student achievement in the way we thought it would?

* Did people complete their assigned tasks?
* Did we implement the plan correctly and consistently?
* Did we regularly analyze student data?
* Did we have the right tools to measure our measurable objectives?
* Did we give the process enough time? Enough resources?
9
Demographic Data
* Enrollment
* Subgroups of students
* Staff
* Attendance (students and staff)
* Mobility
* Graduation and dropout
* Conference attendance
* Education status
* Student subgroups
* Parent involvement
* Teaching staff
* Course enrollment patterns
* Discipline referrals
* Suspension rates
* Alcohol-tobacco-drug violations
* Participation in extra-curriculars
* Physical, mental, social and health

Achievement/Outcome Data
* Local assessments: District Common Assessments, Classroom Assessments, Report Cards
* State assessments: MME, ACT, MEAP, MI-Access, MEAP-Access, ELPA
* National assessments: ACT Plan, ACT Explore, ACT WorkKeys, NWEA, ITBS, CAT, MET, NAEP, PSAT
* GPA
* Dropout rates
* College acceptance

Process Data
* Policies and procedures (e.g., grading, homework, attendance, discipline)
* Academic and behavior expectations
* Parent participation – PT conferences, PTO/PTA, volunteers
* Suspension data
* School Process Profile Rubrics (40 or 90) or SA/SAR (NCA)
* Event occurred: who, what, when, where, why, how
* What you did for whom: e.g., all 8th graders received violence prevention

Perception Data
* Survey data (student, parent, staff, community)
* Opinions
* Clarified what others think
* People act based on what they believe
* How do they see you/us?
10
Building Your Data Profile
1. What data was lacking in last year's data profile?
2. What needs to be done to fill the gap?
3. What could/should you be doing NOW to begin filling the gap?