Presentation transcript:

We are using GoToWebinar for our Distance Learning sessions this year. Please be sure that you are using a headset with microphone and muting all other speakers, or you may call the conference number located on your message center screen. If you need any other technical assistance, please call Stephanie at ext. 32 or me.

Distance Learning – Cohort #2: Building a Statewide System of Support with Evaluation in Mind

Agenda: Greetings & who's here; special report from Tom Kerins, CII, and Steven Ross, Johns Hopkins University; questions, comments, what's next.

Who's Here Cohort 2 State Teams and RCC Liaisons: West Virginia, Montana, Vermont, Nevada, Wisconsin & BIE; Cohort 1 visitors. Presenting: Tom Kerins, CII, & Steven Ross, Johns Hopkins University. CII Staff: Stephanie Benedict, Marilyn Murphy & Tom Kerins. What is one thing you had hoped to learn or hear more about when you signed up for this session?

Topic Building a Statewide System of Support with Evaluation in Mind

Rubrics-Based Evaluation of a Statewide System of Support A Tool to Enhance Statewide Systems of Support

Purpose To present a framework for how a State Education Agency (SEA) can evaluate the capacity, operational efficiency, and effectiveness of its Statewide System of Support (SSOS); to guide an SEA's internal evaluation of its SSOS or its development of specifications for an external evaluation; and to establish ongoing monitoring, reporting, and formative evaluation processes for an SEA's SSOS.

Development of the SSOS Evaluation Rubrics Basis: A Framework for Effective Statewide Systems of Support, developed by Rhim, Hassel, and Redding. Research on the roles of states in school improvement, including case studies of five State Education Agencies and surveys of all 50 states, Washington, DC, and Puerto Rico. Intensive work with a pacesetting group of nine states.

Conclusions from the Research Successful systemic reform requires incentives, capacity, and opportunities. Each SEA needs an organizational framework to document its strengths and weaknesses and to plan SSOS improvement. There is a need for a strong, continuous, state-designed and district-directed improvement process to assist schools at all levels of performance.

Components of the Rubric-Based Evaluation Part A: SSOS Plan and Design 1. Specified comprehensive plan for SSOS 2. Defined evidence-based programs/interventions for all students and subgroups 3. Plan for formative evaluation

Components of the Rubric-Based Evaluation Part B: Resources 4. Staff 5. Funding 6. Data Analysis and Storage 7. Distinguished educators, consultants, experts, etc. 8. External providers

Components of the Rubric-Based Evaluation Part C: Implementation 9. Removal of barriers 10. Incentives for change 11. Communications 12. Technical assistance 13. Dissemination of Knowledge 14. Formative evaluation and monitoring (audits)

Components of the Rubric-Based Evaluation Part D: Outcomes 15. Student achievement 16. Student attendance 17. Graduation rate

Essential Indicators Within these 4 Parts are 42 Essential Indicators that define the critical components of a State's SSOS. Four-point rubrics with cells individualized to each of the 42 indicators help explain and define the different stages a State will go through as it successfully meets each indicator.

Rubric Decisions Next to each indicator there are 4 columns describing the possible continuum of progress: Little or No Development or Implementation; Limited Development or Partial Implementation; Mostly Functional Level of Development and Implementation; Full Level of Implementation and Evidence of Impact.

Sample Essential Indicator: Coordination among state and federal programs. Little or No Development or Implementation: There is no apparent plan to efficiently coordinate programs with different funding sources that are aimed at improving schools receiving SSOS services. Limited Development or Partial Implementation: The state has a written plan and has made some preliminary attempts to integrate multiple state and federal programs aimed at school improvement. Mostly Functional Level of Development and Implementation: The state has begun to integrate multiple programs with common goals but different funding streams in areas such as planning, resource allocation, training, reporting, and compliance monitoring. Full Level of Implementation and Evidence of Impact: The state has fully implemented its program integration plan, and there is evidence of greater efficiency in planning, resource allocation, and compliance monitoring.

Cumulative Scoring To receive a rating of III (Mostly Functional Level of Development and Implementation), the SSOS must also fulfill the requirements for a rating of II (Limited Development or Partial Implementation).
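
Read as a rule, cumulative scoring means a level counts only if every level below it is also fulfilled. The short Python sketch below is purely illustrative (a hypothetical helper, not part of the rubric materials or the Indistar tool); it assumes a reviewer has already judged, level by level, whether the evidence meets each level's own requirements.

```python
RATING_LABELS = {
    1: "I. Little or No Development or Implementation",
    2: "II. Limited Development or Partial Implementation",
    3: "III. Mostly Functional Level of Development and Implementation",
    4: "IV. Full Level of Implementation and Evidence of Impact",
}

def cumulative_rating(levels_met):
    """Return the highest rating whose requirements, and those of every
    lower level, are fulfilled. levels_met[i] is True when the evidence
    satisfies the requirements of level i+1 considered on its own."""
    rating = 1  # Level I is the floor: little or no development
    for level, met in enumerate(levels_met, start=1):
        if not met:
            break  # a gap at any level caps the cumulative rating
        rating = level
    return rating

# Evidence meets levels I-III on their own, but not IV: the rating is III.
print(RATING_LABELS[cumulative_rating([True, True, True, False])])
```

Under this rule, evidence that satisfies Level III's cell but not Level II's still yields a rating of I, which is exactly the constraint the slide describes.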

Explanatory Materials Provided in the Evaluation Rubric Report Evaluation rubric with 42 Essential Indicators; sample ratings for each indicator, along with examples of evidence, to help each SEA Team rate its own SSOS; examples from states that help explain the Indicator statements; a template for SEA Team self-scoring; essential components of an evaluation plan.

Determining the Rating Essential Indicator 7.2: Training for distinguished educators and support teams

What the SEA said it had accomplished As required by the state plan, all Distinguished Educators (DEs) must participate in three levels of training/professional development: (a) a one-week summer session, (b) a two-day refresher in early fall, and (c) ongoing coaching and mentoring during the DE's first year. The DE Academy, which delivers the training, conducts regular formative evaluations of the activities and services, using the data to make refinements as needed.

Determining the rating The reviewers rated the state as operating at Level IV on this indicator. The training process for DEs was formally defined, comprehensive, fully implemented, and subjected to continuing review, evaluation, and improvement.

State Examples Related to the Indicators* Indicator 2.2: Coordination of services across SEA departments. The example shows how Ohio worked with the Department of Education, its own Regional Programs, and internally to model how cooperation can be accomplished so funds and requirements can be integrated. *See the Evaluation Rubric Report for state examples for each indicator.

Rubric-Based Evaluation Activities The rubrics illustrate the continuum that occurs with each Indicator as States develop their SSOS. Each State Team (using evidence) should develop a profile of how its SSOS lines up with all 42 indicators by using the Rubrics template to note the present stage of development. Comments should be included to note what needs to be done to improve the initial results of the self-rating. Each State Team should choose at least six indicators for immediate action after this self-review process, as sketched below.
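
One lightweight way to picture the resulting profile is as a small table keyed by indicator, as in the hypothetical Python sketch below. The indicator IDs, ratings, and comments are invented for illustration; the actual Rubrics template is a self-scoring form, not code.

```python
# Each entry records an Essential Indicator's current rating (1-4) plus a
# comment on what improvement would require. Values here are invented.
profile = {
    "2.2": {"rating": 2, "comment": "Coordination plan written; integration just beginning"},
    "7.2": {"rating": 4, "comment": "DE training fully implemented and regularly evaluated"},
    # ... one entry for each of the 42 Essential Indicators
}

def indicators_for_action(profile, n=6):
    """Return the n lowest-rated indicators as candidates for immediate action."""
    return sorted(profile, key=lambda k: profile[k]["rating"])[:n]

print(indicators_for_action(profile))  # lowest-rated indicators first
```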

Role of CII in this process Each State Team should develop a plan of action, including tasks, timelines, and the responsibilities of each team member, as they begin to turn the indicator statements into objectives. Staff from CII will be available by webinar as well as for on-site work to assist State Teams as they use the Rubrics template to document the status of their SSOS.

Evaluation Each SEA Team should use the initial results from this rubric as baseline information. Periodically (and certainly annually), each SEA Team should check for progress on the entire rubric and specifically on those sections of the Rubric that generated recommendations. CII staff are available to assist in any of these evaluations of SEA progress.

The Evaluation Rubric & Indistar The Indistar system can be used to choose indicators and document planning. Using Indistar procedures, a team can begin the process of selecting indicators through the needs assessment, creating plans, and assigning tasks to certain team members and other staff, as well as monitoring the progress of the work as a whole. To view the sample Indistar site, go to www.centerii.org and click on the Indistar login in the bottom left corner of the page. Use the following login information: Login: ssos; Password: ssos. Each state that is interested in using the online version of this tool will be given its own unique login and password.

The Evaluation Rubric & Indistar (cont.) Before you are given your unique login and password, we ask that you participate in an additional webinar for the SSoS Online Tool Orientation Training. The webinar will be scheduled for May 13th at 1:00 pm CST. If you are interested in joining that webinar, please send an email and we will send you the registration link. An alternative date of May 27th at 1:00 pm CST will also be available if you cannot make the first webinar.

Assess…Plan…Monitor If your State team is interested in using the Indistar Tool and would like to get an individual state login/password, please contact Stephanie Benedict. For all CII support for SSoS, please contact Tom Kerins.

Questions, Comments…

Evaluating the Outcomes of SSOS: Qualities of an Effective Evaluation Steven M. Ross, Ph.D., Johns Hopkins University

Steven M. Ross, Ph.D. Steven M. Ross is a senior research scientist and professor at the Center for Research and Reform in Education at Johns Hopkins University. Dr. Ross's expertise is in educational research and evaluation, school reform and improvement, at-risk learners, and technology integration.

Questions to Ponder Is evaluation substantively and routinely embedded in your SSOS? Are we, as states, going beyond completing the rubrics and looking at root causes and data? Do evaluation results help to scale up successful activities and discontinue others (e.g., certain providers)? Are external evaluators being used for support? Should they be?

The Evaluation Rubric: A Quick Review Part A: SSOS Plan and Design 1. Specified comprehensive plan 2. Defined evidence-based programs 3. Plan for evaluation

The Evaluation Rubric: A Quick Review Part B: Resources 4. Staff 5. Funding 6. Data analysis and storage 7. Distinguished educators, consultants, & experts 8. External providers

The Evaluation Rubric: A Quick Review Part C: Implementation 9. Removal of barriers for change and innovation 10. Incentives for change 11. Communications 12. Technical assistance 13. Dissemination of knowledge 14. Monitoring and program audits

The Evaluation Rubric: A Quick Review Part D: Outcomes for Schools Served by SSOS 15. Student achievement 16. Student attendance 17. Graduation rate

42 Rubrics and Their Rating Scales I. Little or No Development or Implementation II. Limited Development or Partial Implementation III. Mostly Functional Level of Development and Implementation IV. Full Level of Implementation and Evidence of Impact

SSOS Evaluation Rubric 2.5: Defined evidence-based programs/interventions for all students & subgroups. From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 24, Table 2. Copyright 2009 by Academic Development Institute. Reprinted with permission.

SSOS Evaluation Rubric 15.1 From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 37, Table 15. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Why Evaluate SSOS: Isn't There Enough to Do Already? Being able to make reliable and valid judgments of the status of the services provided: How fully are services being implemented? To what extent are expected outcomes being achieved?

Why Evaluate SSOS: Isn't There Enough to Do Already? To provide accountability information for SDE and external organizations; to demonstrate accountability to consumers (districts, schools, educators, parents); to develop a process for continuous program improvement.

Properties of an Effective Evaluation Validity (rigorous, reliable) Evidence-Based – Documented plan – Meeting agenda – Survey responses – Performance standards – Outcome measures

Properties of an Effective Evaluation Strong Evidence: 80% of principals surveyed rated the Distinguished Educators' support as very helpful and provided a specific example of how the DE helped their school. Weak Evidence: The Governor touted the state's progress in assisting low-performing schools during his tenure.

Properties of an Effective Evaluation Evaluative rather than descriptive. From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 91, Table 1. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Evaluating Educational Outcomes Part D, Section 15 of Rubric. SSOS is ultimately about improving student achievement and educational success. These are distal or culminating outcomes that may not show immediate change. Nonetheless, it is educationally and politically important to monitor these indicators.

Evaluating Educational Outcomes: Recommendation 1 1. Treat the essential indicators (Part D, Section 15) as a starting point only. Given the 42 rubric indicators, which services appear to be the most essential to improve? Prioritize these improvement needs.

Evaluating Educational Outcomes: Recommendation 2 2. Supplement the essential indicators with follow-up analyses of root causes and data: potentially successful turnaround strategies that may be scalable; unsuccessful strategies that need replacement; explanation of the outcomes relative to the SSOS services provided. Example: Although we train distinguished educators and revise the training each year, what evidence is there that the training is effective?

Analyses of Root Causes Example A: Schools showing the strongest increases in mathematics are visited by SDE and found to be using highly interactive teaching strategies and expanded learning opportunities. Implication for SSOS?

Analyses of Root Causes Example B: School B increased its reading scores significantly over the past year. Follow-up study of student enrollment patterns reveals that student rezoning decreased the number of disadvantaged students by 50%. Implication for SSOS?

Analyses of Root Causes Example C: School C had several student subgroups fail to attain AYP in reading. Follow-up interviews with the principal and literacy coaches reveal that the new R/LA curriculum was poorly supported by the provider. Implication for SSOS?

Evaluating Educational Outcomes: Recommendation 3 3. Supplement the beginning evaluation (Recommendation 1) and follow-up analyses (Recommendation 2) with rigorous evaluations of selected interventions: RFPs for external studies; assistance to school districts interested in evaluation research; rigorous data analyses by SDE to study achievement patterns associated with SSOS interventions.
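
As a toy illustration of the kind of achievement-pattern analysis named in the last item, the sketch below compares mean score gains for SSOS-served schools against comparison schools. All numbers are invented, and a real SDE study would add matched comparison groups and significance testing.

```python
# Invented score gains for illustration only.
ssos_gains = [4.2, 3.8, 5.1, 2.9]        # schools receiving SSOS services
comparison_gains = [1.0, 2.2, 1.7, 0.8]  # similar schools without SSOS services

def mean(values):
    return sum(values) / len(values)

difference = mean(ssos_gains) - mean(comparison_gains)
print(f"Mean gain difference (SSOS - comparison): {difference:.2f} points")
```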

Accurate and Relevant Evidence Strong, Suggestive, or Weak? Teachers liked the professional development activities.

Accurate and Relevant Evidence Strong, Suggestive, or Weak? Systematic observation by independent observers shows significant increases in student-centered instruction.

Accurate and Relevant Evidence Strong, Suggestive, or Weak? Teachers indicate that they use more student-centered instruction than in the past.

Accurate and Relevant Evidence Strong, Suggestive, or Weak? Principals and grade-level leaders indicate observing more frequent cooperative learning than last year.

Accurate and Relevant Evidence Strong, Suggestive, or Weak? The providers of the professional development believed the offerings to be successful.

Accurate and Relevant Evidence Strong, Suggestive, or Weak? Reading scores increased by 15% for the schools receiving SSOS in literacy.

Working with External Evaluators Question: Is it more or less costly than using SDE staff? Answer: It depends on the expertise and availability of the latter.

Working with External Evaluators What types of evaluation tasks most need external evaluators? The Basic Rubric (Study I) and the essential indicators (Study II) might best be performed in-house. The external evaluator (at low cost) would be helpful to corroborate the Study I and II findings. Rigorous studies of specific interventions (Study III) are most appropriate for external evaluators.

Working with External Evaluators Advantages of External Evaluators Expertise in research design/data analysis School/district staff likely to be more disclosive Independence/credibility

Working with External Evaluators Key Steps Use systematic process to select the evaluator (careful review of prior work and client satisfaction) Establish clear plan of work and budget Clearly define evaluation/research questions Monitor the study via regular meetings/reports, etc. Work with the evaluator to disseminate results to improve policies and practices

Concluding Comments Unless SSOS is evaluated, it is unlikely to improve. The benefits of the evaluation depend on its rigor and quality. There is little to gain by painting rosy pictures of mediocre outcomes: the message is that all is well and should be left alone, whereas a truthful negative evaluation is a stimulus for change. There is much to gain by presenting objective results to sustain services that work and improve those that are ineffective.

Questions, Comments, What's Next

Center on Innovation & Improvement Staff Tom Kerins, Programs Director; Marilyn Murphy, Communication Director; Lisa Kinnaman, Director of Improvement Support to States; Stephanie Benedict, Client Relations Coordinator.