
1 We are using GoToWebinar for our Distance Learning sessions this year. Please be sure that you are using a headset with microphone and muting all other speakers, OR you may call the conference number located on your message center screen. If you need any other technical assistance, please call Stephanie Benedict at 217-732-6462 ext. 32 or email her at sbenedict@adi.org. Distance Learning – Cohort #2: Building a Statewide System of Support with Evaluation in Mind

2 Agenda Greetings & who's here Special report from Tom Kerins, CII, and Steven Ross, Johns Hopkins University Questions, comments, what's next

3 Who's Here Cohort 2 State Teams and RCC Liaisons: West Virginia, Montana, Vermont, Nevada, Wisconsin & BIE Cohort 1 visitors Presenting: Tom Kerins, CII & Steven Ross, Johns Hopkins University CII Staff: Stephanie Benedict, Marilyn Murphy & Tom Kerins What is one thing you had hoped to learn or hear more about when you signed up for this session?

4 Topic Building a Statewide System of Support with Evaluation in Mind

5 Rubrics-Based Evaluation of a Statewide System of Support: A Tool to Enhance Statewide Systems of Support

6 Purpose To present a framework for how a State Education Agency (SEA) can evaluate the capacity, operational efficiency, and effectiveness of its Statewide System of Support (SSOS) To guide an SEA's internal evaluation of its SSOS, or its development of specifications for an external evaluation To establish ongoing monitoring, reporting, and formative evaluation processes for an SEA's SSOS

7 Development of the SSOS Evaluation Rubrics Basis: A Framework for Effective Statewide Systems of Support, developed by Rhim, Hassel, and Redding Research on the roles of states in school improvement, including case studies of five State Education Agencies and surveys of all 50 states, Washington, DC, and Puerto Rico Intensive work with a pacesetting group of nine states

8 Conclusions from the Research Successful systemic reform requires incentives, capacity, and opportunities Each SEA needs an organizational framework to document its strengths and weaknesses and to plan SSOS improvement There is a need for a strong, continuous, state-designed and district-directed improvement process to assist schools at all levels of performance

9 Components of the Rubric-Based Evaluation Part A: SSOS Plan and Design 1. Specified comprehensive plan for SSOS 2. Defined evidence-based programs/interventions for all students and subgroups 3. Plan for formative evaluation

10 Components of the Rubric-Based Evaluation Part B: Resources 4. Staff 5. Funding 6. Data Analysis and Storage 7. Distinguished educators, consultants, experts, etc. 8. External providers

11 Components of the Rubric-Based Evaluation Part C: Implementation 9. Removal of barriers 10. Incentives for change 11. Communications 12. Technical assistance 13. Dissemination of Knowledge 14. Formative evaluation and monitoring (audits)

12 Components of the Rubric-Based Evaluation Part D: Outcomes 15. Student achievement 16. Student attendance 17. Graduation rate

13 Essential Indicators Within these four Parts are 42 Essential Indicators that define the critical components of a State's SSOS Four-point rubrics, with cells individualized to each of the 42 indicators, help explain and define the different stages a State will go through as it successfully meets each indicator

14 Rubric Decisions Next to each indicator there are four columns describing the possible continuum of progress: I. Little or No Development or Implementation II. Limited Development or Partial Implementation III. Mostly Functional Level of Development and Implementation IV. Full Level of Implementation and Evidence of Impact

15 Sample Essential Indicator 5.1: Coordination among state and federal programs Little or No Development or Implementation: There is no apparent plan to efficiently coordinate programs with different funding sources that are aimed at improving schools receiving SSOS services. Limited Development or Partial Implementation: The state has a written plan and has made some preliminary attempts to integrate multiple state and federal programs aimed at school improvement. Mostly Functional Level of Development and Implementation: The state has begun to integrate multiple programs with common goals but different funding streams in areas such as planning, resource allocation, training, reporting, and compliance monitoring. Full Level of Implementation and Evidence of Impact: The state has fully implemented its program integration plan, and there is evidence of greater efficiency in planning, resource allocation, and compliance monitoring.

16 Cumulative Scoring To receive a rating of III (Mostly Functional Level of Development and Implementation), the SSOS must also fulfill the requirements for a rating of II (Limited Development or Partial Implementation). A toy sketch of this rule follows.
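
To make the cumulative rule concrete, here is a minimal sketch in Python. The boolean list, function name, and example data are hypothetical illustrations, not part of the CII rubric or tool; the actual rubric cells are qualitative descriptions that a team judges against its evidence.

```python
# A minimal sketch of the cumulative scoring rule, using hypothetical
# data structures (the real rubric cells are qualitative, not booleans).

LEVELS = [
    "I. Little or No Development or Implementation",
    "II. Limited Development or Partial Implementation",
    "III. Mostly Functional Level of Development and Implementation",
    "IV. Full Level of Implementation and Evidence of Impact",
]

def cumulative_rating(requirements_met):
    """requirements_met[k] says whether the evidence fulfills the
    requirements of level k+1; Level I is the floor and needs none."""
    rating = 0
    for level in range(1, len(LEVELS)):
        if requirements_met[level]:
            rating = level  # climb only while every lower level also holds
        else:
            break
    return LEVELS[rating]

# Evidence meets the Level III requirements but not Level II, so the
# cumulative rule caps the rating at Level I.
print(cumulative_rating([True, False, True, False]))
```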

17 Explanatory Materials Provided in the Evaluation Rubric Report Evaluation rubric with 42 Essential Indicators Sample ratings for each indicator, along with examples of evidence to help each SEA Team rate its own SSOS Examples from states that help explain the Indicator statements A template for SEA Team self-scoring Essential components of an evaluation plan

18 Determining the Rating Essential Indicator 7.2: Training for distinguished educators and support teams

19 What the SEA said it had accomplished As required by the state plan, all Distinguished Educators (DEs) must participate in three levels of training/professional development: (a) a one-week summer session, (b) a two-day refresher in early fall, and (c) ongoing coaching and mentoring during the DE's first year. The DE Academy, which delivers the training, conducts regular formative evaluations of the activities and services, using the data to make refinements as needed.

20 Determining the rating The reviewers rated the state as operating at Level IV on this indicator. The training process for DEs was formally defined, comprehensive, fully implemented, and subjected to continuing review, evaluation and improvement.

21 State Examples Related to the Indicators* Indicator 2.2: Coordination of services across SEA departments The example shows how Ohio worked with the Department of Education, its own Regional Programs, and internal offices to model how cooperation can be accomplished so that funds and requirements can be integrated. * See the Evaluation Rubric Report for state examples for each indicator

22 Rubric-Based Evaluation Activities The rubrics illustrate the continuum that occurs with each Indicator as States develop their SSOS. Each State Team (using evidence) should develop a profile of how its SSOS lines up with all 42 indicators by using the Rubrics template to note the present stage of development. Comments should be included to note what needs to be done to improve the initial results of the self-rating. Each State Team should choose at least six indicators for immediate action after this self-review process (a toy selection sketch follows).
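
As a toy illustration of that selection step, the following Python sketch builds a profile and flags the lowest-rated indicators for immediate action. The indicator IDs, ratings, and comments are hypothetical, and the real work happens in the Rubrics template (or in Indistar), not in code.

```python
# Hypothetical self-rating profile: indicator ID -> (rating 1-4, comment
# on what is needed to improve). A real profile covers all 42 indicators.
profile = {
    "2.2": (1, "No cross-department coordination of services yet"),
    "5.1": (2, "Integrate state and federal funding streams in planning"),
    "7.2": (4, "Maintain DE Academy training and formative evaluation"),
    "11.3": (3, "Communications plan drafted, partially implemented"),
}

# Choose at least six indicators for immediate action; here, simply the
# lowest-rated ones (a team would also weigh evidence and context).
priorities = sorted(profile.items(), key=lambda item: item[1][0])[:6]

for indicator, (rating, comment) in priorities:
    print(f"Indicator {indicator} (Level {rating}): {comment}")
```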

23 Role of CII in this process Each State Team should develop a plan of action, including tasks, timelines, and the responsibilities of each team member, as they begin to turn the indicator statements into objectives. Staff from CII will be available by webinar as well as for on-site work to assist State Teams as they use the Rubrics template to document the status of their SSOS.

24 Evaluation Each SEA Team should use the initial results from this rubric as baseline information. Periodically (and at least annually) each SEA Team should check for progress on the entire rubric, and specifically on those sections of the Rubric that generated recommendations. CII staff are available to assist in any of these evaluations of SEA progress; a simple baseline-comparison sketch follows.
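
The annual progress check can likewise be pictured as a comparison against the baseline. Again a hypothetical sketch; the indicator IDs and ratings are invented for illustration.

```python
# Hypothetical baseline and annual re-rating (indicator ID -> level 1-4).
baseline = {"2.2": 1, "5.1": 2, "7.2": 4, "11.3": 3}
current = {"2.2": 1, "5.1": 3, "7.2": 4, "11.3": 2}

for indicator in sorted(baseline):
    change = current[indicator] - baseline[indicator]
    if change > 0:
        status = f"improved by {change} level(s)"
    elif change < 0:
        status = f"slipped by {-change} level(s); revisit recommendations"
    else:
        status = "no change"
    print(f"Indicator {indicator}: {status}")
```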

25 The Evaluation Rubric & Indistar The Indistar system can be used to choose indicators and document planning. Using Indistar procedures, a team can begin the process of selecting indicators through the needs assessment, creating plans and assigning tasks to certain team members and other staff, as well as monitoring the progress of the work as a whole. To view the sample Indistar site, go to www.centerii.org and click on the Indistar login in the bottom-left corner of the page. Use the following login information: Login: ssos, Password: ssos. Each state that is interested in using the online version of this tool will be given its own unique login and password.

26 The Evaluation Rubric & Indistar (cont.) Before you are given your unique login and password, we ask that you participate in an additional webinar, the SSoS Online Tool Orientation Training. The webinar will be scheduled for May 13th at 1:00 pm CST. If you are interested in joining that webinar, please send an email to pacesetters@adi.org and we will send you the registration link. An alternative date of May 27th at 1:00 pm CST will also be available if you cannot make the first webinar.

27 Assess…Plan…Monitor If your State Team is interested in using the Indistar Tool and would like to get an individual state login/password, please contact Stephanie Benedict, sbenedict@adi.org. For all CII support for SSoS, please contact Tom Kerins, tkerins@adi.org.

28 Questions? Comments?

29 Evaluating the Outcomes of SSOS: Qualities of an Effective Evaluation Steven M. Ross, Ph.D., Johns Hopkins University

30 Steven M. Ross, Ph.D. Steven M. Ross is a senior research scientist and professor at the Center for Research and Reform in Education at Johns Hopkins University. Dr. Ross's expertise is in educational research and evaluation, school reform and improvement, at-risk learners, and technology integration.

31 Questions to Ponder Is evaluation substantively and routinely embedded in your SSOS? Are we as states going beyond completing the rubrics and looking at root causes and data? Do evaluation results help to scale up successful activities and discontinue others (e.g., certain providers)? Are external evaluators being used for support? Should they be?

32 The Evaluation Rubric: A Quick Review Part A: SSOS Plan and Design 1. Specified comprehensive plan 2. Defined evidence-based programs 3. Plan for evaluation

33 The Evaluation Rubric: A Quick Review Part B: Resources 4. Staff 5. Funding 6. Data analysis and storage 7. Distinguished educators, consultants, & experts 8. External providers

34 The Evaluation Rubric: A Quick Review Part C: Implementation 9. Removal of barriers for change and innovation 10. Incentives for change 11. Communications 12. Technical Assistance 13. Dissemination of knowledge 14. Monitoring and program audits

35 The Evaluation Rubric: A Quick Review Part D: Outcomes for Schools Served by SSOS 15. Student achievement 16. Student attendance 17. Graduation rate

36 42 Rubrics and Their Rating Scales I. Little or No Development or Implementation II. Limited Development or Partial Implementation III. Mostly Functional Level of Development and Implementation IV. Full Level of Implementation and Evidence of Impact

37 SSOS Evaluation Rubric 2.5 (2. Defined evidence-based programs/interventions for all students & subgroups) From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 24, Table 2. Copyright 2009 by Academic Development Institute. Reprinted with permission.

38 SSOS Evaluation Rubric 15.1 From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 37, Table 15. Copyright 2009 by Academic Development Institute. Reprinted with permission.

39 Why Evaluate SSOS: Isn't There Enough to Do Already? Being able to make reliable and valid judgments of the status of the services provided How fully are services being implemented? To what extent are expected outcomes being achieved?

40 Why Evaluate SSOS: Isn't There Enough to Do Already? To provide accountability information for SDE and external organizations To demonstrate accountability to consumers (districts, schools, educators, parents) To develop a process for continuous program improvement

41 Properties of an Effective Evaluation Validity (rigorous, reliable) Evidence-Based – Documented plan – Meeting agenda – Survey responses – Performance standards – Outcome measures

42 Properties of an Effective Evaluation Strong Evidence: 80% of principals surveyed rated the Distinguished Educator's support as very helpful and provided a specific example of how the DE helped their school. Weak Evidence: The Governor touted the state's progress in assisting low-performing schools during his tenure.

43 Properties of an Effective Evaluation Evaluative rather than descriptive. From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 91, Table 1. Copyright 2009 by Academic Development Institute. Reprinted with permission.

44 Evaluating Educational Outcomes Part D, Section 15 of Rubric SSOS is ultimately about improving student achievement and educational success These are distal or culminating outcomes that may not show immediate change Nonetheless, it is educationally and politically important to monitor these indicators

45 Evaluating Educational Outcomes: Recommendation 1 1. Treat the essential indicators (Part D, Section 15) as a starting point only Given the 42 rubric indicators, which services appear to be the most essential to improve? Prioritize these improvement needs

46 Evaluating Educational Outcomes: Recommendation 2 2. Supplement the essential indicators with follow-up analyses of root causes and data Potentially successful turnaround strategies that may be scalable Unsuccessful strategies that need replacement Explanation of the outcomes relative to SSOS services provided Example: Although we train distinguished educators and revise the training each year, what evidence is there that the training is effective?

47 Analyses of Root Causes Example A: Schools showing the strongest increases in mathematics are visited by SDE and found to be using highly interactive teaching strategies and expanded learning opportunities Implication for SSOS?

48 Analyses of Root Causes Example B: School B increased its reading scores significantly over the past year. Follow-up study of student enrollment patterns reveals that student rezoning decreased the number of disadvantaged students by 50%. Implication for SSOS?

49 Analyses of Root Causes Example C: School C had several student subgroups fail to attain AYP in reading. Follow-up interviews with the principal and literacy coaches reveal that the new R/LA curriculum was poorly supported by the provider. Implication for SSOS?

50 Evaluating Educational Outcomes: Recommendation 3 3. Supplement the beginning evaluation (Recommendation 1) and follow-up analyses (Recommendation 2) with rigorous evaluations of selected interventions RFPs for external studies Assistance to school districts interested in evaluation research Rigorous data analyses by SDE to study achievement patterns associated with SSOS interventions

51 Accurate and Relevant Evidence Strong, Suggestive, or Weak? Teachers liked the professional development activities.

52 Accurate and Relevant Evidence Strong, Suggestive, or Weak? Systematic observation by independent observers shows significant increases in student-centered instruction.

53 Accurate and Relevant Evidence Strong, Suggestive, or Weak? Teachers indicate that they use more student-centered instruction than in the past.

54 Accurate and Relevant Evidence Strong, Suggestive, or Weak? Principals and grade-level leaders indicate observing more frequent cooperative learning than last year.

55 Accurate and Relevant Evidence Strong, Suggestive, or Weak? The providers of the professional development believed the offerings to be successful.

56 Accurate and Relevant Evidence Strong, Suggestive, or Weak? Reading scores increased by 15% for the schools receiving SSOS in literacy.

57 Working with External Evaluators Question: Is it more or less costly than using SDE staff? Answer: It depends on the expertise and availability of the latter.

58 Working with External Evaluators What types of evaluation tasks most need external evaluators? The Basic Rubric (Study I) and the essential indicators (Study II) might best be performed in-house The external evaluator (at low cost) would be helpful to corroborate the Study I and II findings Rigorous studies of specific interventions (Study III) are most appropriate for external evaluators

59 Working with External Evaluators Advantages of External Evaluators Expertise in research design/data analysis School/district staff likely to be more disclosive Independence/credibility

60 Working with External Evaluators Key Steps Use systematic process to select the evaluator (careful review of prior work and client satisfaction) Establish clear plan of work and budget Clearly define evaluation/research questions Monitor the study via regular meetings/reports, etc. Work with the evaluator to disseminate results to improve policies and practices

61 Concluding Comments Unless SSOS is evaluated, it is unlikely to improve The benefits of the evaluation depend on its rigor and quality There is little to gain by painting rosy pictures of mediocre outcomes – The message is that all is well and should be left alone – A truthful negative evaluation is a stimulus for change There is much to gain by presenting objective results to sustain services that work and improve those that are ineffective

62 Questions? Comments? What's Next?

63 Center on Innovation & Improvement Staff Tom Kerins, Programs Director, tkerins@adi.org Marilyn Murphy, Communication Director, mmurphy@centerii.org Lisa Kinnaman, Director of Improvement Support to States, lkinnaman@adi.org Stephanie Benedict, Client Relations Coordinator, sbenedict@adi.org

