Rubrics-Based Evaluation of a Statewide System of Support: A Tool to Enhance Statewide Systems of Support

Presentation transcript:


Purpose To present a framework for how a State Education Agency (SEA) can evaluate the capacity, operational efficiency, and effectiveness of its Statewide System of Support (SSOS); to guide an SEA's internal evaluation of its SSOS or its development of specifications for an external evaluation; and to establish ongoing monitoring, reporting, and formative evaluation processes for an SEA's SSOS.

Development of the SSOS Evaluation Rubrics Basis: A Framework for Effective Statewide Systems of Support, developed by Rhim, Hassel, and Redding; research on the roles of states in school improvement, including case studies of five State Education Agencies and surveys of all 50 states, Washington, DC, and Puerto Rico; and intensive work with a pacesetting group of nine states.

Conclusions of the Research Successful systemic reform requires incentives, capacity, and opportunities. Each SEA needs an organizational framework to document its strengths and weaknesses and to plan SSOS improvement. There is a need for a strong, continuous, state-designed and district-directed improvement process to assist schools at all levels of performance.

Components of the Rubric-Based Evaluation Part A: SSOS Plan and Design 1. Specified comprehensive plan for SSOS 2. Defined evidence-based programs/interventions for all students and subgroups 3. Plan for formative evaluation

Components of the Rubric-Based Evaluation Part B: Resources 4. Staff 5. Funding 6. Data Analysis and Storage 7. Distinguished educators, consultants, experts, etc. 8. External providers

Components of the Rubric-Based Evaluation Part C: Implementation 9. Removal of barriers 10. Incentives for change 11. Communications 12. Technical assistance 13. Dissemination of Knowledge 14. Formative evaluation and monitoring (audits)

Components of the Rubric-Based Evaluation Part D: Outcomes 15. Student achievement 16. Student attendance 17. Graduation rate

Essential Indicators Within these four Parts are 42 Essential Indicators that define the critical components of a State's SSOS. Four-point rubrics, with cells individualized to each of the 42 indicators, help explain and define the different stages a State will go through as it successfully meets each indicator.

Rubric Decisions Next to each indicator are four columns describing the continuum of progress: Little or No Development or Implementation; Limited Development or Partial Implementation; Mostly Functional Level of Development and Implementation; Full Level of Implementation and Evidence of Impact.

Sample Essential Indicator Coordination among state and federal programs. Little or No Development or Implementation: There is no apparent plan to efficiently coordinate programs with different funding sources that are aimed at improving schools receiving SSOS services. Limited Development or Partial Implementation: The state has a written plan and has made some preliminary attempts to integrate multiple state and federal programs aimed at school improvement. Mostly Functional Level of Development and Implementation: The state has begun to integrate multiple programs with common goals but different funding streams in areas such as planning, resource allocation, training, reporting, and compliance monitoring. Full Level of Implementation and Evidence of Impact: The state has fully implemented its program integration plan, and there is evidence of greater efficiency in planning, resource allocation, and compliance monitoring.

Cumulative Scoring To receive a rating of III, "Mostly functional level of development and implementation," the SSOS must also fulfill the requirements for a rating of II, "Limited development or partial implementation."
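The cumulative rule above can be sketched in code. This is a minimal illustration (not part of the Evaluation Rubric Report): a rating is the highest level whose requirements, and those of every lower level, are all fulfilled. The level names follow the rubric's four-column continuum; the function name and data shape are assumptions for this sketch.

```python
# Illustrative sketch of the cumulative scoring rule. Level I is the
# floor and carries no requirements; levels II-IV each have requirements
# that must be met, and scoring stops at the first unmet level.

RATING_LABELS = {
    "I": "Little or No Development or Implementation",
    "II": "Limited Development or Partial Implementation",
    "III": "Mostly Functional Level of Development and Implementation",
    "IV": "Full Level of Implementation and Evidence of Impact",
}

def cumulative_rating(met):
    """met maps "II", "III", "IV" to whether that level's requirements
    are fulfilled for this indicator. Returns the cumulative rating."""
    rating = "I"
    for level in ("II", "III", "IV"):
        if not met.get(level, False):
            break  # a gap at any level caps the rating below it
        rating = level
    return rating
```

Under this rule, an SSOS that satisfies the Level III cell but not the Level II cell would still rate I, because each rating presupposes the ones beneath it.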

Explanatory Materials Provided in the Evaluation Rubric Report Evaluation rubric with 42 Essential Indicators; sample ratings for each indicator, along with examples of evidence to help each SEA Team rate its own SSOS; examples from states that help explain the Indicator statements; a template for SEA Team self-scoring; essential components of an evaluation plan.

Determining the Rating Essential Indicator 7.2: Training for distinguished educators and support teams

What the SEA said it had accomplished As required by the state plan, all Distinguished Educators (DE) must participate in three levels of training/professional development: (a) a one-week summer session, (b) a two-day refresher in early fall, and (c) ongoing coaching and mentoring during the DE's first year. The "DE Academy," which delivers the training, conducts regular formative evaluations of the activities and services, using the data to make refinements as needed.

Determining the rating The reviewers rated the state as operating at Level IV on this indicator. The training process for DEs was formally defined, comprehensive, fully implemented, and subjected to continuing review, evaluation, and improvement.

State Examples Related to the Indicators* Indicator 2.2—Coordination of services across SEA departments. The example shows how Ohio worked across its Department of Education, its Regional Programs, and its internal offices to model how cooperation can be accomplished so that funds and requirements can be integrated. *See the Evaluation Rubric Report for state examples for each indicator.

Rubric-Based Evaluation Activities The rubrics illustrate the continuum that occurs with each Indicator as States develop their SSOS. Each State Team should use evidence to develop a profile of how its SSOS lines up with all 42 indicators, using the Rubric's template to note the present stage of development. Comments should note what needs to be done to improve on the initial self-rating. After this self-review process, each State Team should choose at least six indicators for immediate action.
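The self-rating profile described above can be pictured as a simple table of records, one per Essential Indicator, from which the lowest-rated indicators are drawn as candidates for immediate action. This is a hypothetical sketch; the record fields, indicator IDs, and the choice of the lowest-rated six are illustrative assumptions, not the Rubric template's actual layout.

```python
# Hypothetical model of the self-rating template: one record per
# Essential Indicator, with the team's current stage (1-4 on the
# rubric's continuum) and comments on what improvement requires.
from dataclasses import dataclass

@dataclass
class IndicatorRating:
    indicator_id: str   # e.g. "2.2" (illustrative)
    description: str
    stage: int          # 1-4, per the rubric's four-column continuum
    comments: str = ""

def pick_for_action(profile, n=6):
    """Return the n lowest-rated indicators as candidates for
    immediate action after the self-review."""
    return sorted(profile, key=lambda r: r.stage)[:n]
```

A team would fill in all 42 records from evidence, then use a selection like this as a starting point for discussion rather than a mechanical choice.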

Role of CII in this Process Each State Team should develop a plan of action, including tasks, timelines, and the responsibilities of each team member, as they begin to turn the indicator statements into objectives. Staff from CII will be available by webinar as well as for on-site work to assist State Teams as they use the Rubric's template to document the status of their SSOS.

Evaluation Each SEA Team should use the initial results from this rubric as baseline information. Periodically (and at least annually), each SEA Team should check for progress on the entire rubric, and specifically on those sections of the Rubric that generated recommendations. CII staff are available to assist in any of these evaluations of SEA progress.

The Evaluation Rubric & Indistar The Indistar system can be used to choose indicators and document planning. Using Indistar procedures, a team can select indicators through the needs assessment, create plans, assign tasks to team members and other staff, and monitor the progress of the work as a whole. To view the sample Indistar site, go to www.centerii.org and click the Indistar login in the bottom left corner of the page. Use the following login information: Login: ssos, Password: ssos. Each state that is interested in using the online version of this tool will be given its own unique login and password.

The Evaluation Rubric & Indistar (cont.) Before you will be given your unique login and password, we ask that you participate in an additional webinar, the "SSoS Online Tool Orientation Training." The webinar will be scheduled for May 13th at 1:00 pm CST. If you are interested in joining that webinar, please send an email and we will send you the registration link. An alternative date of May 27th at 1:00 pm CST will also be available if you cannot make the first webinar.

Assess…Plan…Monitor If your State team is interested in using the Indistar Tool and would like an individual state login/password, please contact Stephanie Benedict. For all CII support for SSoS, please contact Tom Kerins.