Michigan School Testing Conference, Ann Arbor, Michigan, March 1, 2005. Michigan Department of Education, Office of School Improvement.

Presentation transcript:

1 Michigan School Testing Conference, Ann Arbor, Michigan, March 1, 2005. Michigan Department of Education, Office of School Improvement

2 Michigan School Testing Conference Education YES! A New School Improvement Framework + Revised School Performance Indicators = Changes in Education YES!

3 Michigan School Testing Conference
The participants will receive an overview of the:
- Draft School Improvement Framework for Michigan
- Development of revised school performance indicators
- Possible changes to Education YES!

4 Michigan School Testing Conference The participants will provide: Feedback throughout the presentation

5 A New School Improvement Framework

6 The Vision…
- A coherent, comprehensive research-based School Improvement Framework
- Serve as a foundation for: Professional Development; Technical Support; Grant Criteria; Assessment and Accountability; Accreditation – Performance Indicators
- A practitioners’ “collaborative”

7 Overview of Milestones
- Convened 60 educators (July ’04)
- Workgroup of ISD School Improvement Specialists drafted revisions (Aug–Dec)
- Field Services followed up on “discrepancy list” (SY ’04-’05)
- State Board Review (Jan ’05)
- Field Review/Feedback of SI Framework (Feb–Apr ’05)

8 Overview of Workgroup Process
- Reviewed “Kent Report” for recommendations
- Reviewed current Performance Indicators
- Reviewed the literature on school improvement
- Cross-referenced research – search for common elements
- Developed a “school improvement framework” – strands, standards, benchmarks, criteria, evidence
- OSI develops framework; OEAA develops measurements

9 Criteria for SI Framework
- Based on Something (External Validity)
- “Logical” – makes sense to various audiences (State Board, Legislature, Schools, Teachers…)
- Build on current Indicators (Internal Validity)
- Easy to Understand & User Friendly
- Measurable
- Self-sufficient/Stand Alone

10 Criteria for SI Framework
- Aligned – NCLB, Research, State/Federal Programs, PA 25, existing Performance Indicators
- Address triple purpose: Accreditation, School Improvement feedback and guidance, and Accountability
- Student achievement focus
- Strand/Standard/Benchmark/Criteria format
- District/School-based

11 SI Framework Structure
- Strand – General Area of Focus
- Standard – Category of Influence within the Strand
- Benchmark – Focus of Influence within a Standard
- Criteria – Process that drives the Benchmark
- Evidence – Hard and/or soft data that provides evidence of continuous assessment or progress in each identified expectation

12 SI Framework Structure
- 5 Strands
- 12 Standards
- 26 Benchmarks
- 87 Criteria
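For readers who find it easier to see the Strand/Standard/Benchmark/Criteria/Evidence hierarchy from slides 11 and 12 as a data model, the sketch below is one possible representation. It is illustrative only: the class names, fields, and example entries are assumptions made for this sketch and are not part of the MDE framework or any MDE tool.

```python
# Minimal sketch (assumed names, not an MDE artifact) of the nested
# Strand > Standard > Benchmark > Criterion structure described above.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Criterion:
    name: str                                           # process that drives the benchmark
    evidence: List[str] = field(default_factory=list)   # hard and/or soft data


@dataclass
class Benchmark:
    name: str                                           # focus of influence within a standard
    criteria: List[Criterion] = field(default_factory=list)


@dataclass
class Standard:
    name: str                                           # category of influence within the strand
    benchmarks: List[Benchmark] = field(default_factory=list)


@dataclass
class Strand:
    name: str                                           # general area of focus
    standards: List[Standard] = field(default_factory=list)


# Small example fragment drawn from Strand I on the later slides.
leadership = Strand(
    name="Leadership",
    standards=[Standard(
        name="Instructional Leadership",
        benchmarks=[Benchmark(
            name="Educational Program",
            criteria=[Criterion(name="Knowledge of Curriculum, Instruction, and Assessment")],
        )],
    )],
)

# Rolling up counts over a fully entered framework would reproduce the
# 5 / 12 / 26 / 87 totals cited on slide 12; this fragment yields 1.
total_criteria = sum(len(b.criteria) for s in leadership.standards for b in s.benchmarks)
print(total_criteria)
```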

13 The Strands
- Strand I – LEADERSHIP
- Strand II – TEACHING & LEARNING
- Strand III – PERSONNEL & PROFESSIONAL DEVELOPMENT
- Strand IV – SCHOOL & COMMUNITY RELATIONS
- Strand V – DATA & KNOWLEDGE MANAGEMENT

14 The Standards
- Strand I – LEADERSHIP: Instructional Leadership; Operational/Resource Mngt.; Distributed Leadership
- Strand II – TEACHING & LEARNING: Curriculum; Instruction; Assessment
- Strand III – PERSONNEL & PROF. DEVELOPMENT: Personnel Qualifications; Professional Development
- Strand IV – SCHOOL/COMMUNITY RELATIONS: Parent/Family Involvement; Community Involvement
- Strand V – DATA & KNOWLEDGE MANAGEMENT: Data Management; Knowledge Management

15 The Benchmarks
- Strand I – LEADERSHIP: Educational Program; Instructional Support; Resource Allocation; Operational Management; School Climate and Culture; Continuous Improvement
- Strand II – TEACHING & LEARNING: Curriculum – Written & Aligned; Curriculum – Communicated; Instructional Planning; Instructional Delivery; Assessment Aligned to Curriculum and Instruction; Reporting and Use of Data

16 The Benchmarks, continued…
- Strand III – PERSONNEL & PROFESSIONAL DEVELOPMENT: Requirements; Skills, Knowledge, Dispositions; Collaboration; Content & Pedagogy; Alignment
- Strand IV – SCHOOL/COMMUNITY RELATIONS: Communication with Families/Community; Authentic Engagement with Families/Community
- Strand V – DATA & KNOWLEDGE MANAGEMENT: Identification & Collection; Analysis; Accessibility; Reporting; Interpretation & Application

17 Questions for Consideration Does each benchmark carry the same weight in improving student achievement? What are the implications?

18 The Framework
Strand I – Leadership
Standard A: Instructional Leadership
1. Educational Program
- Knowledge of Curriculum, Instruction, and Assessment
- Knowledge and Use of Data
- Technology Knowledge
- Student Development/Learning
- Knowledge of Adult Learning
- Change Agent
- Focus on Student Results

19 The Framework…
Standard A: Instructional Leadership
2. Instructional Support
- Monitoring
- Coaching/Facilitating Staff
- Evaluation of Staff
- Clear Expectations
- Collaboration/Communication

20 The Framework…
Standard B: Operational/Resource Management
1. Resource Allocation
- Human Resources
- Fiscal
- Equipment and Materials
- Time
- Space

21 The Framework…
Standard B: Operational/Resource Management
2. Operational Management
- State and Federal
- District
- School

22 The Framework…
Standard C: Distributed Leadership
1. School Culture and Climate
- Safe and Orderly
- Learning Focused
- Inclusive/Equitable
- Collaborative Inquiry
- Data-Driven Culture
- Collaborative Decision-Making

23 The Framework…
Standard C: Distributed Leadership
2. Continuous Improvement
- Shared Vision/Mission
- Results-Focused Planning
- Planning Implemented
- Planning Monitored/Evaluated

24 The Framework, continued…
Strand II – Teaching and Learning
Standard A: Curriculum
1. Written and Aligned
- Curriculum Documents
- Curriculum Review
- Curriculum Alignment (MCF and GLCE)
- Articulated Design
- Inclusive

25 The Framework…
Standard A: Curriculum
2. Communicated
- Staff
- Students
- Parents

26 The Framework…
Standard B: Instruction
1. Planning
- Content
- Pedagogy Knowledge
- Developmental Appropriateness
2. Delivery
- Enacted Curriculum
- Research-based/Best Practices
- Focus on Student Engagement

27 The Framework…
Standard C: Assessment
1. Aligned to Curriculum and Instruction
- Alignment/Content Validity
- Consistency/Reliability
- Multiple Measures
2. Reporting and Use of Data
- Systemic Reporting
- Informs Curriculum and Instruction
- Meets Needs of Students

28 The Framework, continued…
Strand III – Personnel and Professional Development
Standard A: Personnel Qualifications
1. Requirements
- Certification/Requirements
- NCLB – Highly Qualified

29 The Framework…
Standard A: Personnel Qualifications
2. Skills, Knowledge, and Dispositions
- Content Knowledge and Pedagogy
- Communication
- School/Classroom Management
- Collaboration
- Student-Centered
- Instructional Technology

30 The Framework…
Standard B: Professional Development
1. Content and Pedagogy
- Use of Research-based/Best Practices
- Application to Curriculum Content
- Instructional Mentoring/Coaching
2. Collaboration
- Staff Participates in Learning Teams
- Collaborative Analysis of Student Work
3. Alignment
- Aligned
- Job-embedded
- Results-driven

31 The Framework, continued…
Strand IV – School and Community Relations
Standard A: Family Involvement
1. Communications
- Variety of Methods
- Regard for Diversity
2. Authentic Engagement in Life of School
- Volunteering
- Extended Learning Opportunities
- Decision-Making

32 The Framework…
Standard B: Community Involvement
1. Communication About/With School
- Variety of Methods
- Regard for Diversity
2. Authentic Engagement
- Businesses
- Educational
- Community-based
- Variety of Methods

33 The Framework, continued…
Strand V – Data & Knowledge Management
Standard A: Data Management
1. Data Identification and Collection
- Systematic and Applied
- Multiple Types
- Multiple Sources
- Technical Quality

34 The Framework…
Standard A: Data Management
2. Analysis
- Format Supports Analysis
- Format Supports Longitudinal Comparisons
3. Accessibility
- Retrievable
- Secure

35 The Framework…
Standard B: Knowledge Management
1. Reporting
- User-friendly
- Appropriate
2. Interpretation and Application
- Meaningful Dialogue
- Use in Decision-Making

36 Questions for Consideration Are there other important criteria? Which of the SI Framework elements are the “performance indicators” – the 12 standards, the 26 benchmarks, or the 87 criteria? Data-based evidence – should all evidence be quantifiable? How to measure?

37 Revised School Performance Indicators

38 Revised School Improvement Indicators – How?
- Teacher Survey: focus on instruction and collaboration
- School Leader Survey: focus on leadership
- School Report: focus on process

39 Revised School Improvement Indicators – How?
- May include externally scored “constructed response”
- Other potential tools: Parent Survey, Student Survey

40 Questions for Consideration Do we need a parent survey? Do we need a student survey? If so, how does it look different at each grade range? Are we overlooking groups whose perspective is important? When is the appropriate time to administer the data collection? - November-December?

41 Next Steps: Committee Work
SI Steering Committee – committees’ recommended work plan supported by OSI & OEAA.
Committees: Indicators, Measurement, Professional Development, Communications.
Work plan:
- Develop rubric, point distribution, collect feedback, revise the SI Framework
- Develop tools, data collection instruments, and methods
- Develop a marketing plan, common message about the framework, pilot, and where/how to roll it out
- Prepare materials and MDE staff to support the pilot & roll-out

42 Questions for Consideration How might the self-assessment be submitted? Transparency of self-assessment – should it be visible to the general public via the web through a link with EdYES!?

43 Questions for Consideration Monitoring – who should be involved? Dissemination – what is the best way to let districts/schools/ISDs know that the system is changing?

44 Next Steps: Process (2005)
- Development of rubric, point distribution (Jan–Feb)
- Measurement development (Jan–March)
- Pilot SI Framework/Self-Assessment (April–May ’05)
- Development of Self-Assessment Tool (March–July)
- Revise indicators and measures (June)

45 Next Steps, continued…
- State Board approves revisions (July)
- Launch Self-Assessment Tool (Sept)
- Schools self-assess (Oct–Nov)
- Data submitted and analyzed (Nov)
- Board reviews/approves results (Dec)
- Report cards released (Jan ’06)

46 Questions for Consideration What didn’t we ask? What issues remain?

47 PI Work Group Contact Information:
- Dr. Ed Roeber, Executive Director, Office of Educational Assessment and Accountability, Roebere@michigan.gov
- Dr. Yvonne Caamal Canul, Director, Office of School Improvement, Canuly@michigan.gov
- Linda Forward, Consultant, Office of School Improvement, ForwardL@michigan.gov

