
1 The New York State School Improvement Grant Initiative Scientific and Evidence Based Evaluation of SIG/SPDG Initiatives: One State’s Response Office of Professional Research & Development, Syracuse University, NY

2 REVISITING NYSIG AS DESIGNED: PROGRAM & EVALUATION

3 The NYS SIG Initiative is designed to:
• Reduce the achievement gap between special and general education students in high- and low-need schools.
• Reduce or eliminate the disproportionality of language and ethnic minority students in classification and placement practices.

4 [Diagram] State Improvement Grant structure: LEA; Regional School Support Centers (RSSCs); Special Education Resource and Training Centers (SETRCs); Institutions of Higher Education (IHEs); Special Education Quality Assurance (SEQA); Higher Education Support Center; VESID (resources, TA, oversight)

5 The Evaluation Logic Model, NYSED (VESID): Root Cause Analysis, Strategic Planning & Goal Setting → Implementation of Activities → Data Collection: Student Outcomes & Performance Indicators

6 Two ‘Strands’ SIG resources and partnerships applied to two strands:
• In-service teachers
• Pre-service teachers

7 Evaluation Goals
• Tracking implementation
• Identifying areas/strategies for program improvement
• Capturing outcomes (according to stage of development of the program)

8 Evaluation Goals
• Implementation to date
• Degree of match between overarching goals of SIG and program activities
• Degree of match between district-stated goals and activities
• Strengths of the SIG
• Challenges to the program and strategies to meet these challenges
• Lessons learned and emerging themes
• Intermediate outcomes & performance measure attainment

9 Targets for Evaluation
• District readiness
• Partnership development with IHEs
• Utility/effectiveness of technical assistance entities (VESID-funded, HESC in particular)
• Outcomes (particularly student improvement)

10 Multi-Method Evaluation Approach
• Formative assistance
• Surveys (IHE faculty, recent teacher preparation graduates, school administrators, training participants)
• Interviews and focus groups
• Document analysis
• Site studies (including interviews with SIG District grant recipients: administrators, staff, TA providers, and parents)
• Quantitative analysis of student performance data

11 Evolution of a Systems Change
• Sustained change
• Initial changes (perceptions/data)
• Implementation of planned activities to address goals
• Desire for change and setting of goals (buy-in)
• Relationship building
• Awareness

12 Reporting: Using the Logic Model Framework
• Faithfulness of implementation
• Effectiveness of activities
• Lessons learned/ingredients for success
• Challenges noted through qualitative data collection
• Outcome data (according to developmental stage of the Initiative)
• Emerging policy issues
• Recommendations

13 Participating IHEs, with assistance from the HESC, are expected to:
• Develop, implement, and/or sustain inclusive teacher preparation programs
• Link with identified districts in their area to provide professional development and research assistance
• Provide for professional development among other teacher educators at their university or college

14 Document Analysis
• Grant applications
• IHE agreements
• Initiative and School Grant reports
• Meeting minutes: management meetings, Statewide Task Force, IHE meetings, Regional Task Force meetings

15 Surveys, Focus Groups, and Interviews
WITH THE IHEs:
• Perceptions of status of partnerships, their role, and change in districts
• Changes in their teacher preparation programs
• Usefulness of assistance from the HESC
• Promising practices related to partnerships and teacher preparation programs
WITH RECENT GRADUATES:
• Perceptions of preparedness
• Identification of high-impact activities and strategies in teacher preparation programs

16 Participating school districts, with assistance from the SIG Teams, are expected to:
• Undertake a root cause analysis process supported by SETRC Professional Development Specialists/RSSC Special Education Specialists.
• Develop a plan for professional development and submit a SIG District grant application.
• Work with RSSC, SETRC, & SIG Teams to determine the most effective use of SIG Team resources.
SIG Teams then provide job-embedded (sequential, ongoing, and specifically job-relevant) professional development and tracking in the field.

17 Site Visit Component
• Annual selection of SIG Schools for site visits
• Based on a variety of variables:
 – Geographical location
 – Need
 – Urban, suburban, rural
 – Promising practices

18 Logic Model for Instrument Development and Data Collection
• Interview protocol for site visits developed using the logic model
• Items documenting:
 – Activities
 – Outputs of activities
 – Intermediate outcomes
 – End outcomes

19 Outputs of Activities
• Changes in teacher, parent, and administrator perceptions or beliefs
• Recognizing a need to be addressed
• Recognizing a new method to address the need
• Recognizing the need for new instructional practices

20 Intermediate Outcomes
• Changes in actual practices
 – e.g., instructional practices across buildings and at the individual classroom level
• Changes in stakeholders working together
 – e.g., increased parent involvement
• Student classroom outcomes
 – e.g., daily behavior, homework, attitude

21 End Outcomes
• Changes in classification, placement, and declassification practices
 – Compared with state averages
 – Disproportionality
• Changes in performance on State ELA and Math assessments
 – General education students
 – Students with special needs
 – Minority students
 – ESL students

22 Benefits
• Able to document patterns between districts and partners
• Able to clearly delineate improvement by professional development method
 – Focus efforts on one building or across the entire district
• Able to “tell the story” over a period of time and see changes

23 Just the Numbers… Scope of the effort:
• Total SIG school districts, Years One–Five: 51 (not counting the Big 5)
• Total schools in the ‘Big 5’: 19
• Total IHEs partnering with schools: 44
• Total IHEs participating in SIG: 65

24 RESPONSIVE EVALUATION DESIGN

25 Other Evaluation Methods
• Regional Think Tanks
• SIG Team interviews
• School outcome data tracking
• Participant training survey
• SIG Team document review
• Observation & participation

26 Revisiting SIG Evaluation Original Design: Review of Methods, Years One–Five

27 Responsive Evaluation Model
During the five years of the evaluation, the design and methodologies needed to respond to:
• External shifts/expectations/needs
• Concurrent internal programmatic changes/shifts
Some of these shifts were anticipated and were worked into the original design, and some were not…

28 The NY SIG Responsive Evaluation Model

29 Internal Shifts: Program Changes & Evaluation Responses
• Partnerships: challenges engaging parents → Regional Think Tanks
• Roles: changes to the way NY thought about roles and responsibilities of SIG Teams and RSSC/SETRC partners → SIG Team interviews, training surveys
• Practice: challenges connecting with schools across vast geographical areas → SIG Team interviews
• Programming: introduction of new program components → Observation

30 More Internal Shifts & Responses…
• Start-up: time needed for start-up led to school grants for 3+ years → Document review
• Roll-out: changes to NYCBOE structure and school functioning → Document review & observation
• Reporting: institution of school reporting mechanisms → Document review

31 A special note… Reality check: many of the internal shifts noted were linked to external shifts and the needs of ‘external’ stakeholders…

32 Stakeholder Needs: Understanding Impact & Outcomes
• Stakeholders: NYSED, USDOE, WESTAT
• Challenges:
 – Identifying what the treatment is
 – Identifying what people are doing and why
 – Identifying who is receiving services, who isn’t, and how much
 – Identifying the evidence base of said activities
 – Identifying impact on schools and students without use of experimental design
 – Identifying changes in practice tied to SIG initiatives
 – Availability (or lack) of data
 – Identifying responsiveness and alignment to State needs

33 Responses of Stakeholders to Challenges → External Shifts
• National Evaluation
• Encouraging alignment with the State Performance Plan
• Development of Federal Performance Measures
• A ‘collective call’ to utilize scientific and evidence-based practice

34 National Evaluation
• Requirement: understand outcomes/change in child performance, teacher or administrator behavior, systems functioning, and scaling up of successful practices.
• Evaluation responses:
 – Provision of summary and full reports
 – Phone conference interviews
 – Attendance at a 3-day multi-state workshop
 – Responding to written information & clarification requests
 – Move from sample data analysis to cohort data analysis

35 State Performance Plan
• Requirement: respond to indicators including graduation, drop-out, assessment participation, suspension/expulsion, LRE, disproportionality, parent involvement, etc.
• Evaluation responses:
 – ‘Retrofit’ of the evaluation plan to align with new and developing program components
 – Regional think tanks on parent involvement, disproportionality, and culturally responsive practice
 – Participation in the Disproportionality Learning Community
 – Tracking new initiatives in response to the SPP (i.e., Learning Communities) → observation & participation, document review
 – Informed student and school outcome data points
 – Move from sample data analysis to cohort data analysis

36 Performance Measures
• Requirement: collect evidence in the areas of scientific and evidence-based opportunities offered and personnel trained, sustainability of efforts, teacher retention, and alignment with the State Performance Plan.
• Evaluation responses:
 – SIG Team interviews to further understand implementation and outcomes
 – Development of the SIG Team Effort To Date Rubric
 – Regional Think Tanks on sustainability
 – Strengthening of IHE and school survey instruments in the areas of documenting change in pre-service and in-service teacher practice & retention

37 Scientific & Evidence-Based Practice
• Requirement: personnel trained under programs supported by the SPDG will have the knowledge and skills to deliver scientifically or evidence-based practices to children with disabilities.
• Evaluation responses:
 – Development of the SIG Team Effort To Date Rubric, which tracks session topics, evidence base of content, # of trainings, and # of personnel
 – Documentation of practice through SIG Coordinator and SIG Team interviews

38 Where did we meet stakeholder needs?
– Identifying what the treatment is
– Identifying what people are doing and why
– Identifying who is receiving services, who isn’t, and how much
– Identifying changes in practice tied to SIG initiatives
– Identifying responsiveness and alignment to State needs

39 Where do we still struggle?
– Identifying the evidence base of said activities
– Identifying impact on schools and students without use of experimental design
– Availability (or lack) of data, i.e., ‘mining’ data down to the individual level to respond to SPP indicators involving IEPs, student outcomes, transition, etc.

40 Beyond SIG
– Identifying the evidence base of said activities → incorporation of the requirement into grant applications and site selection processes
– Identifying impact on schools and students without use of experimental design → strengthening district reporting requirements
– Availability (or lack) of data → changes to cohort size; data collection directly from schools and/or regions

