Saginaw KEYS Data Analysis Training for Continuous School Improvement
March 20 and 21, 2006
Jacques Nacson and Gary Obermeyer

Workshop Agenda
- Welcome and Introductions
- Review of Agenda and Objectives
- Your Expectations and KEYS Experiences
- KEYS and Continuous School Improvement: The Action Research Cycle
- KEYS Data Analysis and Interpretation
  - Indicators and Predictions
  - General Data-Related Considerations
  - Demo Report: Charts and Data Points
  - Tools and Data Analysis Worksheets
  - Navigating the Online Data Report and Online Resources
- The NEA KEYS 2.0 Toolkit and Use of the Action and Facilitation Guides

Workshop Objectives
- To examine KEYS data within the larger context of continuous school improvement processes
- To learn how to analyze and interpret KEYS data, and how results can be shared with school staff and other stakeholders
- To consider strategies for how KEYS data might be used as a stimulus to drive your targeted school improvement efforts
- To learn how to use KEYS support materials, including online resources, to help you take next steps beyond preliminary analysis of KEYS data

The Action Research Cycle: Problem or Opportunity Identified → Data Collection → Diagnosis and Refinement of the Problem → Action Planning → Action Taking → Implementation and Evaluation

Model for Understanding and Interpreting KEYS 2.0 Data (NEA 6 Keys to School Quality)
- School completes the KEYS 2.0 survey
- School receives scores on the 42 KEYS indicators of school quality
- Use other data, including student achievement, to confirm, validate, or justify
- Make inferences about the school's quality based on the KEYS results and assessments of other data
- Make judgments based on theories or hypotheses (if, then) to explain "why"
- School makes decisions about next steps and plans for improvement
- Implementation of the planned intervention
- Process and product evaluation, including effects on student achievement

General Questions and Principles to Consider Before Looking at Your KEYS 2.0 Data

General Questions and Principles to Consider Before Looking at Your KEYS 2.0 Data (continued)

Flow Chart of Suggested Activities/Processes for Analyzing and Interpreting Your Data
1. School community completes the KEYS survey.
2. Create a School Improvement Team (SIT).
3. SIT is trained in action research, use of quality tools, and analysis and interpretation of KEYS results.
4. SIT conducts a preliminary analysis and interpretation of the KEYS results and prepares for involvement of the whole school community.
5. SIT assists the whole school community to examine the results and make preliminary recommendations.
6. SIT considers the whole school's recommendations and begins a deliberative process.
7. SIT develops a preliminary school improvement plan, based on the KEYS data and other data, and presents it to the school community for feedback.
8. The school improvement plan is approved and ready for implementation.
9. The school improvement plan is implemented and evaluated.
(In the original flow chart, each step ends in a yes/no decision point: proceed on "yes," revisit the step on "no.")

Understanding the Graphs for Each Key
- Horizontal axis: the measure of quality, on a 5-point scale. Left side = Disagree (low value); right side = Agree (high value).
- Vertical axis: the indicators (groups of questions that measure the same concept).
- Example shown: Key 1, Shared Understanding and Commitment to High Goals; respondent group: those who provide direct instruction to students.

Understanding the Graphs for Each Key: Data Points
- School average (red)
- All-schools average, across the 38 pilot schools (purple)
- 90th percentile score of the pilot sample (yellow)
- Length of the horizontal purple bar: one standard deviation above and below the school average (a measure of agreement)

The goals for your school, in terms of continuous improvement on each indicator:
- Move the school average continuously toward the right side (Agree, the high value of quality for that indicator).
- At the same time, reduce the standard deviation (narrow the horizontal bar, meaning greater agreement among respondents).
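The chart data points described above (school average, standard-deviation bar, all-schools average, 90th percentile) can be computed directly from survey responses. A minimal Python sketch, using made-up 5-point-scale responses for illustration:

```python
import statistics

# Hypothetical 5-point-scale responses for one indicator at one school
school_responses = [4, 5, 3, 4, 4, 5, 2, 4]

# School average: the red data point on the chart
school_avg = statistics.mean(school_responses)

# Sample standard deviation: half the length of the horizontal bar,
# which is drawn one SD above and below the school average
school_sd = statistics.stdev(school_responses)

# Hypothetical per-school averages for a pilot sample
pilot_school_avgs = [3.2, 3.8, 4.1, 2.9, 3.5, 4.4, 3.0, 3.9, 4.2, 3.6]

# All-schools average (purple point)
all_schools_avg = statistics.mean(pilot_school_avgs)

# 90th percentile of the pilot sample (yellow point):
# quantiles(n=10) returns nine cut points; the last is the 90th percentile
p90 = statistics.quantiles(pilot_school_avgs, n=10)[-1]

print(f"school average: {school_avg:.2f} +/- {school_sd:.2f}")
print(f"all-schools average: {all_schools_avg:.2f}, 90th percentile: {p90:.2f}")
```

A school "improving" on this chart would show `school_avg` rising toward 5 while `school_sd` shrinks over successive survey administrations.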

District Reports

5.4 Safe Environment

Worksheet for Analyzing KEYS 2.0 Results (Analysis of Individual Indicators): Preliminary Analysis
NOTE: THIS WORKSHEET HAS BEEN REDESIGNED

Key: _________________  Constituency Group: _________________  Date of Survey: _________________

For each indicator, record:

Measure of quality (average score):
- Very High: between 4.1 and 5.0
- High: between 3.4 and 4.0
- Low: between 2.1 and 2.6
- Very Low: between 1 and 2

Degree of consensus (standard-deviation bar):
- High: narrow bar (< 1.3 pt.)
- Low: wide bar (> 1.3 pt.)
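The worksheet's score bands can be expressed as a small classification routine. A sketch in Python, using only the bands given on the slide; note the transcript defines no band between 2.6 and 3.4 (or between 2.0 and 2.1), so scores in those gaps are returned as "unclassified" rather than guessed:

```python
def quality_band(avg):
    """Classify an indicator's average score using the worksheet bands.

    Bands taken from the worksheet slide; gaps in the original
    (2.0-2.1 and 2.6-3.4) deliberately return "unclassified".
    """
    if 4.1 <= avg <= 5.0:
        return "Very High"
    if 3.4 <= avg <= 4.0:
        return "High"
    if 2.1 <= avg <= 2.6:
        return "Low"
    if 1.0 <= avg <= 2.0:
        return "Very Low"
    return "unclassified"


def consensus(sd_bar_length):
    """Degree of consensus from the SD bar length (1.3-point threshold)."""
    return "High (narrow)" if sd_bar_length < 1.3 else "Low (wide)"


# Example: a strong indicator with broad agreement,
# and a weak indicator with little agreement
print(quality_band(4.3), consensus(0.8))
print(quality_band(2.4), consensus(1.9))
```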

Worksheet for Analyzing KEYS 2.0 Results (Analysis of Cross-Key Indicators): Secondary Analysis

Key: _____________________  Constituency Group: ____________________  Date of Survey: ________________________

Columns: Indicators | Score | Related Indicators

Action Guide

Facilitation Guide

Tool Box containing a CD with all materials

Online report navigation (demo): Introduction | General | Demographics | Learn More | Print results | En Español
Key 1 | Key 2 | Key 3 | Key 4 | Key 5 | Key 6
Select group: Eastern Middle School - DEMO
Return to the KEYS Homepage

Steps a School Might Take Once KEYS Preliminary Analyses Are Completed
1. GAPS: Decide on one indicator, or a group of indicators, where gaps exist.
2. RELEVANCE: Reflect with the "team" on the relevance, importance, and priority of the selection.
3. DATA COLLECTION - VALIDATION: Consider the need to collect additional data to validate the KEYS findings.
4. DATA COLLECTION - DIAGNOSIS AND REFINEMENT: Examine the "root" cause of the problem.

Steps a School Might Take Once KEYS Preliminary Analyses Are Completed (continued)
5. THEORY OF ACTION: Identify and select the most appropriate solution for your context.
6. ACTION PLANNING: Set achievable goals and develop specific action/project plans.
7. IMPLEMENTATION: Action plans must be implemented for improvement to occur.
8. DATA COLLECTION - EVALUATION: Both process and product evaluations are necessary for learning to happen.
9. Return to Step 1.

Objectives for Advanced Training in KEYS Data Analysis
- Understand the value of data for making good decisions
- Distinguish between symptoms and root causes for continuous improvement
- Learn how to analyze and use data through the continuous improvement cycle (PDCA)
- Use quality tools to improve process quality and student outcomes
- Understand the concept of variation, and use control charts to reduce variation and improve outcomes

Agenda for Advanced Training in KEYS Data Analysis
- Data and general data principles
- W. Edwards Deming: data and variation
- The continuous improvement cycle (PDCA)
- Quality tools
- Statistical process control and quality control charts
- Getting to the root cause
- Collecting, organizing, and analyzing your data
- Selecting the right intervention based on theory and hypothesis testing
- Planning and carrying out the intervention
- Evaluating the success of the intervention
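The control-chart idea in the advanced agenda can be sketched briefly. This is a simplified illustration, not the workshop's own materials: it uses made-up weekly counts and estimates sigma from the sample standard deviation, whereas a formal individuals (XmR) chart would estimate it from the average moving range. A point outside the 3-sigma limits signals special-cause variation worth diagnosing before planning an intervention:

```python
import statistics

# Hypothetical weekly counts (e.g., office discipline referrals)
weekly_counts = [12, 15, 11, 14, 13, 16, 12, 30, 13, 14, 12, 15]

# Center line and 3-sigma control limits.
# Simplification: a formal XmR chart estimates sigma from the
# average moving range; this sketch uses the sample standard deviation.
mean = statistics.mean(weekly_counts)
sigma = statistics.stdev(weekly_counts)
ucl = mean + 3 * sigma               # upper control limit
lcl = max(0.0, mean - 3 * sigma)     # lower limit (counts cannot be negative)

# Points outside the limits indicate special-cause variation
special_causes = [
    (week, count)
    for week, count in enumerate(weekly_counts, start=1)
    if count > ucl or count < lcl
]

print(f"center line: {mean:.1f}, limits: [{lcl:.1f}, {ucl:.1f}]")
print("special-cause points:", special_causes)
```

Distinguishing these out-of-limit points (symptoms of a specific, findable cause) from ordinary week-to-week variation is exactly the symptoms-versus-root-causes skill the objectives above describe.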