
1 DEVELOPING AN EVALUATION SYSTEM BOB ALGOZZINE AND STEVE GOODMAN National PBIS Leadership Forum Hyatt Regency O’Hare Rosemont, Illinois October 14, 2010

2 Objectives  Describe core features of an effective evaluation system  Evidence to document program, initiative, or intervention  Evidence to improve and sustain implementation  Evidence to direct policies and practices  Share an ongoing and exemplary state-level evaluation  Provide an opportunity for question-answer collaboration

3 Core Features of an Effective Evaluation System An effective evaluation has a clearly defined purpose that tells a story  Evidence to Document Program, Initiative, or Intervention context, input, fidelity, and impact  Evidence to Improve and Sustain Implementation continuous improvement cycles/stages of innovation  Evidence to Direct Policies and Practices efficient and effective annual reports

4 Evidence to Document Program, Initiative, or Intervention A simple plan: Organize evaluation around what you need to know, using questions you can answer.  What circumstances, conditions, or events were the foundation for implementing the program?  What was the program that was implemented?  Was the program implemented at a level of fidelity sufficient to produce change?  What changes resulted from implementing the program? Improvements in school and classroom ecology? Improvements in academic and social behavior?  Did implementation improve the capacity of the district/state to sustain the program?

5 Evidence to Document Program, Initiative, or Intervention A Comprehensive Evaluation Model [diagram: a Plan → Perform → Measure → Compare cycle spanning the four evaluation components Context, Input, Fidelity, and Impact, with the guiding questions What? How? Who? Where? When? Why? and What difference did it make?]

6 Documenting Program Context and Input  Information about national, state, and local education agency leadership personnel and program providers  Information about program participants  Information about the program  Focus, critical features, and content  Type and amount of support  Perceptions and other indicators of appropriateness  Expectations for change

7 Documenting Program Context and Input [Context: Who, Where, When, Why] STATEWIDE LEADERSHIP AND COORDINATION One full-time Consultant and eight part-time Regional Coordinators support implementation of PBS in NC. The Consultant is in the Behavior Support and Special Programs Section of the Exceptional Children Division at the NC Department of Public Instruction. The Regional Coordinators are hosted by LEAs or another agency in their region and spend 1/3 of their time working with the PBS implementation in that school system or Charter School. The primary role of these professionals is to coordinate training, support trainers/coaches/coordinators in LEAs, and facilitate the evaluation of the statewide initiative. http://www.ncpublicschools.org/positivebehavior/coordinator/

8 Documenting Program Context and Input [Context: Who, Where, When, Why] Eighty-four of the 100 counties in the state have at least one school participating in the North Carolina Positive Behavior Support Initiative.

9 Documenting Program Context and Input [Context: Who, Where, When, Why] Steady growth has been evident in the number of schools that have implemented PBS, and current estimates suggest that about 85% are still implementing.

Type                   ECH/Preschool   Elem      Mid/Jr.   High    K (8-12)   Alt./Ctr.   NC Total
n                      3               488       175       65      12         34          777
Percent of NC Total    0.39%           62.81%    22.52%    8.37%   1.54%      4.38%       100.00%
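The percent-of-total row in a table like this is simple arithmetic. Below is a minimal Python sketch that reproduces it from the counts; the counts come from the table above, but the code itself is illustrative and not part of the original evaluation.

# Schools implementing PBS in NC, by building type (counts from the table above).
counts = {
    "ECH/Preschool": 3,
    "Elem": 488,
    "Mid/Jr.": 175,
    "High": 65,
    "K (8-12)": 12,
    "Alt./Ctr.": 34,
}

total = sum(counts.values())  # 777

for school_type, n in counts.items():
    # n / total formatted as a percentage with two decimals, e.g. 62.81%
    print(f"{school_type}: n = {n} ({n / total:.2%})")
print(f"NC Total: n = {total} (100.00%)")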

10 Documenting Program Context and Input [graphic slide: Context (Who, Where, When, Why)]

11 Documenting Program Fidelity [Input: What]

Universal
  Self-Assessment Measures: Self-Assessment Survey (SAS)
  Progress Monitoring Measures: Benchmarks of Quality (BoQ); Team Implementation Checklist (TIC)
  Research Measures: School-wide Evaluation Tool (SET)

Secondary and Tertiary
  Self-Assessment Measures: Benchmarks of Advanced Tiers (BAT)
  Research Measures: Individual Student School-wide Evaluation Tool (I-SSET)

Overall
  Self-Assessment Measures: EBS Survey
  Progress Monitoring Measures: Implementation Phases Inventory (IPI); Phases of Implementation (POI)

Forms on www.pbssurveys.org. New site Spring 2010: www.pbsassessment.org

12 Documenting Program Fidelity [Input: What] Schools participating in the program regularly assess the extent to which key features of PBIS are being implemented, and they use this information to develop action plans for refining and sustaining the effort.
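As a concrete illustration of that assess-then-plan cycle, here is a hypothetical Python sketch: it scores a handful of key features the way 0/1/2 checklist items are often scored, computes an overall fidelity percentage, and turns any feature that is not fully in place into an action-plan item. The feature names, scores, and scoring scale are invented for illustration and are not drawn from any of the instruments on the previous slide.

# Hypothetical self-assessment: each key feature scored
# 0 = not in place, 1 = partially in place, 2 = fully in place.
feature_scores = {
    "Expectations defined": 2,
    "Expectations taught": 2,
    "Acknowledgement system in place": 1,
    "Consequence system in place": 2,
    "Data used for decision making": 0,
}

MAX_SCORE = 2
fidelity = sum(feature_scores.values()) / (MAX_SCORE * len(feature_scores))
print(f"Overall fidelity: {fidelity:.0%}")  # 70%

# Any feature not fully in place becomes an action-plan item.
action_plan = [name for name, score in feature_scores.items() if score < MAX_SCORE]
print("Action plan items:", action_plan)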

13 Documenting Program Impact  Social Behavior Benefits  Classroom Climate  Attitudes  Attendance  Office Discipline Referrals (ODRs)  Individual Student Points/Behavior Records  Proportion of Time in Typical Educational Contexts  Referrals to Special Education  Academic Behavior Benefits  Instructional Climate  Attitudes  Universal Screening  Progress Monitoring (vocabulary, oral reading fluency)  Standardized Test Scores
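Raw counts of measures such as ODRs are hard to compare across schools of different sizes, so they are commonly normalized, for example as ODRs per 100 students per school day. A minimal Python sketch of that normalization follows; all of the numbers are made up for illustration.

# Hypothetical one-month snapshot for a single school.
odr_count = 120     # office discipline referrals recorded in the period
enrollment = 450    # students enrolled
school_days = 20    # instructional days in the period

# Normalize: referrals per 100 students per school day,
# so schools of different sizes can be compared.
rate = odr_count / (enrollment / 100) / school_days
print(f"{rate:.2f} ODRs per 100 students per day")  # 1.33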

14 Core Features of an Effective Evaluation System An effective evaluation has a clearly defined purpose that tells a story  Evidence to Document Program, Initiative, or Intervention context, input, fidelity, and impact  Evidence to Improve and Sustain Implementation continuous improvement cycles/stages of innovation  Evidence to Direct Policies and Practices efficient and effective annual reports

15 Evidence to Improve and Sustain Implementation A Continuous Improvement Model [diagram: the Plan → Perform → Measure → Compare cycle spanning Context, Input, Fidelity, and Impact, with the guiding questions What? How? Who? Where? When? Why? and What difference did it make?]

16 Evidence to Improve and Sustain Implementation Stages of Implementation (typically spanning 2–4 years):  Exploration  Installation  Initial Implementation  Full Implementation  Innovation  Sustainability (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005 [report])

17 Evidence to Improve and Sustain Implementation Sustainability: Continuous Revisioning  Building Capacity: State, Region, District, School  Using Coaches: Universal, Targeted, Intensive "On a smarter planet, analyze the data and you can predict what will happen faster." [IBM Smarter Planet Initiative]

18 Evidence to Improve and Sustain Implementation Build District Capacity for Sustained Effects  Policy Statement on Social Behavior  Board Outcome Indicators  Job Recruitment Content: "knowledge and experience implementing school-wide positive behavior support systems"  School Improvement Goals  Fall Orientation Content: Professional development for administrators, staff, and families  Annual Evaluations: Continuous demonstration of the effectiveness of implemented practices [sidebar: stages of implementation, as on slide 16]

19 Core Features of an Effective Evaluation System An effective evaluation has a clearly defined purpose that tells a story  Evidence to Document Program, Initiative, or Intervention context, input, fidelity, and impact  Evidence to Improve and Sustain Implementation continuous improvement cycles/stages of innovation  Evidence to Direct Policies and Practices efficient and effective annual reports o external support o www.pbssurveys.org o www.pbseval.org o www.pbsassessment.org

20 Evidence to Direct, Support, and Revise Policy Decisions Evaluation Blueprint The OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports has developed a document for individuals who are implementing School-wide Positive Behavior Intervention and Support (SWPBIS) in districts, regions, or states. The purpose of the "blueprint" is to provide a formal structure for evaluating whether implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes. (blueprint)

21 Evidence to Direct, Support, and Revise Policy Decisions Core Features of an Effective Evaluation Report  Purpose  Description of Program Core Features, Implementation Process  Context and Input Who, What, Where, When, Why  Fidelity How  Impact Is Program Resulting in Intended Benefits? Are Improvements Needed? Are Systems and Outcomes Sustainable?  Cost  Recommendations  Resources

22 Evidence to Direct, Support, and Revise Policy Decisions North Carolina PBIS 08-09 Evaluation Report The report highlights the continued growth of PBIS in North Carolina as well as indicators of fidelity of implementation and the positive impact PBIS is having on participating schools across the state. In addition, the report includes information about plans for sustainability through training, coaching, and partnerships with other initiatives, in particular Responsiveness to Instruction (RtI). (report) North Carolina Annual Performance Reports: http://www.dpi.state.nc.us/positivebehavior/data/evaluation/ North Carolina PBIS Web Site: http://www.dpi.state.nc.us/positivebehavior/ Illinois Evaluation Reports: http://pbisillinois.org/

23 Evidence from an Exemplary Evaluation

24 Presentation Questions and Answers

