Assessing Program Quality with the Autism Program Environment Rating Scale

Program Quality (overview diagram)
- Learner Goals and Present Levels (IEP)
- Learner Strengths, Interests, and History
- Team Member Experience and Knowledge
- Assessment
- Selection and Implementation of Evidence-Based Practices → Implementation
- Learner Progress → Outcomes

APERS in Vermont
NPDC Partner Sites:
- Weeks of September 12 and 26, 2011
- Spring 2012
Vermont-led Sites:
- Vermont staff trained on the tool in September 2011
- APERS completed after this training

Program Quality Indicators and Evidence-Based Practices (EBP)
Program Quality:
- Contextual features of the program that represent best practices
- Program quality as the house in which practices are employed

Program Quality Indicators and Evidence-Based Practices (EBP)
EBP:
- Evidence-based practices as specific tools for specific skills
- EBP as the furniture or appliances designed for specific functions

Autism Program Environment Rating Scale (APERS)
Designed to assess quality indicators of programs for children and youth with ASD.
Purposes of the APERS:
- Consultation
- Professional development
- Program evaluation

Consultation
Purpose: to provide focused, direct feedback.
Process example:
- Collaboratively develop goals from the report
- Define a plan for change/coaching
- Define a timeline for change/coaching

Professional Development
Purpose: to expand understanding of quality programs for students who have ASD.
Process example:
- Share APERS content
- Broad self-evaluation
- Incorporate peer/mentor coaching

Program Evaluation
Purpose: to evaluate the quality of a program for purposes other than professional development.
Not an aspect of the NPDC's use of the instrument:
- Not for comparison between programs
- Psychometric properties not yet established

Features of the APERS
- Two APERS formats: PE (Preschool/Elementary) and MHS (Middle/High School)
- Organized by domains and subdomains
- Applicable in self-contained and inclusive programs
- Scored on a five-point scale with behavioral anchors at three points
- Results can be summarized by scores or graphs

APERS Domain Overview (diagram): Learning Environment; Structure & Schedule; Positive Learning Climate; Curriculum & Instruction; Communication; Social Competence; Personal Independence; Functional Behavior; Assessment & IEP; Family Participation; Interdisciplinary Teaming; Program Ecology; Transition (MHS only). Together these domains make up Program Quality, which supports Learner Outcomes.

Consensus APERS Process (diagram): Self-Assessment, Interviews, Record Review, and Observations feed into Scoring, followed by the Debrief / Report.

APERS Components
- Preschool/Elementary: 11 domains, 64 items (Observation = 33, Interview = 31)
- Middle/High: 12 domains, 66 items (Observation = 33, Interview = 33)

P/E APERS Domains
1. Learning Environment
2. Structure/Schedule
3. Positive Learning Climate
4. Assessment and IEP Development
5. Curriculum and Instruction
6. Communication
7. Social Competence
8. Personal Independence and Competence
9. Functional Behavior
10. Family Involvement
11. Teaming

MHS APERS Domains
1. Learning Environment
2. Structure/Schedule
3. Positive Learning Climate
4. Assessment and IEP Development
5. Curriculum and Instruction
6. Communication
7. Social Competence
8. Personal Independence and Competence
9. Functional Behavior
10. Family Involvement
11. Teaming
12. Transition

How Do We Collect This Information?
- Observation in the program across the school day
- IEP review
- Teacher interview(s)
- Parent interview(s)
- Team member interview
- Self-assessment

Observations
- For inclusive programs with more than one class, observe two students for three hours each.
- For self-contained programs, observe for three hours.
- Focus on taking notes across all content areas rather than scoring during the observation.
- Negotiate where to position yourself.
- Find unobtrusive opportunities to scan the entire classroom.
- Limit interaction with classroom staff and students.

Interviews
Interview:
- A parent of a child with autism
- The teacher (usually special education)
- A team member (general education teacher, related service provider, or other knowledgeable team member)
Logistics:
- Approximately … min/interview
- Interview parties separately
- Work within the interview protocol to ask open-ended questions; avoid leading questions or forcing yes/no answers

Interview Protocol
1. Review the purpose of the APERS.
2. Discuss the importance of the interviewee's perspective and information for developing a deeper understanding of the program.
3. Use broad questions first.
4. Use follow-up probes to gather information that you do not get from the broad question.
5. Use the notes section to capture additional information that is useful but does not fit with a specific interview question.

Self-Assessment
Completing the self-assessment:
- Ask the practitioner to complete the self-assessment before you complete the full APERS.
- Explain that the self-assessment provides valuable information for reflection and the professional development process.
- The self-assessment results will be looked at later, alongside results from the full APERS.
- Completion takes 30 minutes to a few hours, depending on the practitioner's level of reflection.

Scoring Process
- Scores should be based on the entire observation, not isolated interactions and occurrences.
- During the scoring process, make note of overall themes and suggestions for the debrief/report.

Scoring System
- Some items contain specific instructions or are not applicable to students who receive services entirely in an inclusive setting.
- Where additional instructions are provided for an item, score it using the directions at the bottom of the page.
- Ratings are based on a scale of 1-5.

Score Sheet
- Can be used for both inclusive and self-contained classrooms.
- Score each observation-based item for all settings/classrooms observed.
- Score each interview item for the teacher, team member, and parent.
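One way to picture how a single score sheet serves both inclusive and self-contained programs is as a structure with separate slots for each observed setting and each interview respondent. A minimal sketch in Python; all names here (ItemScores, ScoreSheet, the item IDs and setting labels) are hypothetical illustrations, since the actual score sheet is a paper form:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class ItemScores:
    """Scores for one observation-based item across the observed settings."""
    item_id: str
    setting_scores: Dict[str, int] = field(default_factory=dict)  # setting -> 1-5 score
    consensus: Optional[int] = None  # final 1-5 score agreed with a co-observer

@dataclass
class ScoreSheet:
    """One program's score sheet: observation items plus interview items."""
    program: str
    observation_items: Dict[str, ItemScores] = field(default_factory=dict)
    # Interview items are scored once per respondent (teacher, team member, parent).
    interview_items: Dict[str, Dict[str, int]] = field(default_factory=dict)

# Usage sketch with made-up item IDs and settings:
sheet = ScoreSheet(program="Demo Elementary")
sheet.observation_items["LE-1"] = ItemScores(
    "LE-1", setting_scores={"classroom A": 4, "resource room": 3})
sheet.interview_items["parent"] = {"FI-2": 5}
sheet.interview_items["teacher"] = {"FI-2": 4}
```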

General Scoring Guidelines
- Scores for individual items should always be 1-5.
- For inclusive middle/high school programs, interview items, or when multiple settings are observed for an individual item, use your best judgment to derive a score that best represents the overall program.
- To derive a final score with multiple observers or in multiple settings, do not average individual item scores; instead, reach consensus with the other observer and complete a consensus score sheet.
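Because final scores come from discussion rather than arithmetic, any tooling around the score sheet should only flag disagreements, never resolve them. A minimal sketch of that design choice (the function name and data layout are hypothetical):

```python
def items_needing_discussion(observer_a: dict, observer_b: dict) -> list:
    """Return item IDs where two observers' 1-5 scores differ.

    Per the scoring guidelines, disagreements are resolved by reaching
    consensus in discussion -- never by averaging the individual scores.
    """
    return [item for item in observer_a
            if item in observer_b and observer_a[item] != observer_b[item]]

# Example with made-up item IDs:
print(items_needing_discussion({"LE-1": 4, "CI-3": 2}, {"LE-1": 4, "CI-3": 3}))
# -> ['CI-3']
```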

Assigning Ratings
- Rating of 1: any indicator under 1 is observed and checked.
- Rating of 2: none of the indicators under 1 are checked, and at least one indicator under 3 is checked.
- Rating of 3: none of the indicators under 1 are checked, and all of the indicators under 3 are checked.
- Rating of 4: all indicators under 3 are checked, and at least one indicator under 5 is checked.
- Rating of 5: all indicators under 3 and all indicators under 5 are checked.
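These rules form a small decision procedure. A minimal sketch, assuming each item's indicators are recorded as booleans grouped by anchor level (1, 3, 5); the slide does not define the case where nothing is checked under either 1 or 3, so that case raises an error here:

```python
def assign_rating(under_1, under_3, under_5):
    """Derive an APERS item rating (1-5) from checked indicators.

    under_1, under_3, under_5: lists of booleans, one per indicator
    listed beneath the 1, 3, and 5 anchors of an item (a hypothetical
    layout; the APERS protocol itself is a paper form).
    """
    if any(under_1):
        return 1                 # any indicator under 1 is checked
    if all(under_3):
        if all(under_5):
            return 5             # all indicators under 3 and 5 checked
        if any(under_5):
            return 4             # all under 3, at least one under 5
        return 3                 # all under 3, none under 5
    if any(under_3):
        return 2                 # none under 1, at least one under 3
    raise ValueError("combination not defined on the slide")

# Example: nothing checked under 1, all "3" indicators, one of two "5" indicators
print(assign_rating([False, False], [True, True, True], [True, False]))  # -> 4
```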

Example: Score of 1

Example: Score of 2

Example: Score of 3

Example: Score of 4

Example: Score of 5

Assigning Ratings: Inclusive Classrooms
- The score sheet helps you organize ratings across settings by giving you a separate space to score each setting observed.
- To obtain a single score for each item, use your best judgment and/or reach consensus with another observer.

Debrief / Report
- Prior to the meeting, prioritize areas of focus: program strengths and areas for growth.
- The areas of focus become the content of the report.
- Ensure that you have specific examples from the observation for each item to be discussed.
- Use visual graphs to support the discussion (see the sketch below):
  - APERS graph
  - Self-assessment graph
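The slides do not prescribe a tool for producing the APERS and self-assessment graphs; below is a minimal sketch of one way to chart the two side by side. The domain scores are made-up placeholders, not real results:

```python
import matplotlib.pyplot as plt
import numpy as np

# Made-up placeholder scores for a few domains -- not real APERS results.
domains = ["Learning Env.", "Curriculum", "Communication", "Social", "Family"]
apers = [3.8, 3.6, 2.9, 3.1, 3.7]             # consensus APERS domain means
self_assessment = [4.2, 4.1, 3.5, 3.8, 4.0]   # practitioner self-ratings

x = np.arange(len(domains))
width = 0.4
fig, ax = plt.subplots()
ax.bar(x - width / 2, apers, width, label="APERS (consensus)")
ax.bar(x + width / 2, self_assessment, width, label="Self-assessment")
ax.set_ylim(0, 5)                 # APERS items are rated on a 1-5 scale
ax.set_ylabel("Mean domain score")
ax.set_xticks(x)
ax.set_xticklabels(domains, rotation=30, ha="right")
ax.legend()
fig.tight_layout()
plt.show()
```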

Debrief / Report
- The report is a record of the debriefing conversation; do not add content that was not covered during the debrief.
- It is not necessary to have a complete report at the debrief.
- A debrief shortly after the observation is most important, with the report to follow.
- See examples.

Components of Debrief / Report
"What is the APERS and how does it work?"
- Not for comparison between programs
- Psychometric properties not yet established
- Effective as a tool for professional development and technical assistance
Program Strengths
- General strengths and specific examples
Areas for Growth
- Priority areas on which to focus change
- Choose only areas for which you have a constructive suggestion for change and growth

Key Terms
- Team Members: two or more professionals directly involved in planning and implementing a student's educational program
- All Team Members: all professionals directly involved in planning and implementing a student's educational program
- Key Team Member: one team member who is expected to take key responsibility for the implementation of a student's educational program, or a particular aspect of that program

Key Terms
- School Staff: individuals who come into contact with a student during the course of the observation period
- Peer: any other similarly-aged student, with or without disabilities. (A more specific descriptor, e.g., typically developing, is used if an anchor describes a specific type of peer.)
- Other terms (e.g., few, some, multiple, sufficient, regularly) are defined in the protocol under 'Key Terms'.