Leading Indicators: Evaluation for Site-Level Improvement and System-Level Planning. Samantha Sugar, Research Associate/Analyst, David P. Weikart Center for Youth Program Quality.

Presentation transcript:

Leading Indicators: Evaluation for Site-Level Improvement and System-Level Planning
Samantha Sugar, Research Associate/Analyst, David P. Weikart Center for Youth Program Quality
Charles Smith, Vice President for Research, Forum for Youth Investment; Executive Director, David P. Weikart Center for Youth Program Quality
April 18, 2012 #readyby21

Agenda
Welcome
Opening Activity
Important Questions – The Why – The What – The How
Site-Level Improvement
System-Level Planning – The Where
Looking Forward/Next Steps
Questions

Objectives
Learn how collecting program data from staff, youth, and parents helps to tell the overall story about quality for a single site.
See how these data can be used to identify low-capacity programs and support decisions about where to target resources.

Opening Activity
Optimal Characteristics of Performance Data
Effective performance data describes behaviors and conditions in a way that is:
a. Timely – Data is available in real time as events occur or just after completion.
b. Objective – Data is focused on behaviors and conditions that can be identified through observation and easily named in relation to practice.
c. Reliable – Data is seen as precise and factual by all due to standardization of measures/methods.
d. Sensitive – Data describes behaviors and conditions that are likely to change in response to intervention, and change can be captured on the measures.
e. Valid – Data describes behaviors and conditions thought to be a link in a causal chain of events desired by the actors involved.
f. Feasible – The minimum data necessary are collected using typical community resources.
g. Multi-Purpose – As they occur, BOTH data collection and data interpretation processes promote learning and coordination among actors in the organization.
h. Multi-Level – Data designed for use by individual units (staff/sites) can be aggregated across individual units to assess collective performance.

Why Were the Leading Indicators Developed?
Quality of Instruction (Point of Service Setting)
Youth Voice and Program Governance Structures
ENGAGEMENT – Higher-order engagement through choice, planning, and reflection.
INTERACTION – Peer interaction through grouping and cooperative learning.
SUPPORTIVE ENVIRONMENT – Supportive environment through welcoming, conflict resolution, active learning, and skill building.
SAFE ENVIRONMENT – Physical and emotional safety is provided.

Why Were the Leading Indicators Developed?
Continuous Improvement Practices by Site Teams (Organizational Setting):
Standardized Assessment of Instruction
Team-Based Planning with Data
Coaching and Performance Feedback
Training for Instructional Skills

Why Were the Leading Indicators Developed?
Lower-Stakes Accountabilities
Interpretive Community: team self-assessment; review of external scores
Team Planning and Implementing: improvement planning; performance coaching
Higher-Stakes Accountabilities

Theory of Change: Multiple Levels of Setting
Policy Setting – Low-Stakes Accountability and Supports
Organizational Setting – Continuous Improvement Practices for Site Teams
Point of Service Setting – Quality Instruction & Proximal Child Outcomes

Why Were the Leading Indicators Developed?
Optimal Characteristics of Performance Data (a–h, as defined in the Opening Activity): Timely, Objective, Reliable, Sensitive, Valid, Feasible, Multi-Purpose, and Multi-Level.

Why Were the Leading Indicators Developed?
The Leading Indicators map onto the theory of change: the Policy Setting (Low-Stakes Accountability and Supports), the Organizational Setting (Continuous Improvement Practices for Site Teams), and the Point of Service Setting (Quality Instruction & Proximal Child Outcomes).
Leading Indicator 1.1 – Staffing Model
Leading Indicator 1.2 – Continuous Improvement
Leading Indicator 1.3 – Youth Governance
Leading Indicator 1.4 – Enrollment Policy
Leading Indicator 2.1 – Academic Press
Leading Indicator 2.2 – Engaging Instruction
Leading Indicator 3.1 – System Norms
Leading Indicator 3.2 – Family Engagement
Leading Indicator 3.3 – School Alignment
Leading Indicator 3.4 – Community Resources
Leading Indicator 4.1 – Socioemotional Development
Leading Indicator 4.2 – Academic Efficacy
Leading Indicator 5.1 – Family Satisfaction

Isn’t this just more data?

What Are the Leading Indicators?
13 composite measures categorized into five different contexts:
- Organizational Context
- Instructional Context
- External Relationships
- Youth Characteristics
- Family Satisfaction
Where did they come from?
- Grantee Director/Site Coordinator Surveys
- Afterschool Teacher/Youth Worker Surveys
- Youth Surveys (grades 4-12)
- Parent Surveys
- PPICS data
How do we measure them?
- Youth Program Quality Intervention (YPQI)
- California Outcomes Measures (Vandell)
- PPICS data

What Are the Leading Indicators? – Sample Report

Items are simply the questions that we ask on the surveys, demographic and enrollment data, or Youth PQA scores. In Figure 1 below, the Items correspond with letter A. Scales are made up of groupings of different Items that go together well; a Scale is designated by letter B in Figure 1. Leading Indicators are made up of groupings of different Scales, much like the Scales themselves are made up of Items. In the example below, the "Accountability" Scale (along with "Collaboration") makes up Leading Indicator 3.1 – System Norms, which is represented by letter C in Figure 2. Finally, all of the Leading Indicators are grouped into five overarching Domains based on the context that they represent. These Domains are color-coded for easy distinction: Organizational Context (red), Instructional Context (green), External Relationships (blue), Youth Characteristics (purple), and Parent Satisfaction (brown).
[Figure 1 and Figure 2: sample report excerpts annotated with letters A (Items), B (Scale), and C (Leading Indicator).]
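To make the roll-up concrete, here is a minimal sketch of how Item, Scale, and Leading Indicator scores could be combined, assuming numeric item responses and simple mean aggregation at each level. The item names and scoring rules are illustrative placeholders, not the report's actual computation.

```python
# Minimal sketch of the Items -> Scales -> Leading Indicators roll-up described above.
# Assumes numeric item responses and simple mean aggregation at each level; item names
# and scoring rules are illustrative placeholders, not the report's actual computation.
from statistics import mean

# Hypothetical item responses for one site (e.g., survey questions scored 1-5).
item_scores = {
    "accountability_q1": 3.5,
    "accountability_q2": 4.0,
    "collaboration_q1": 2.5,
    "collaboration_q2": 3.0,
}

# Scales group Items; Leading Indicators group Scales.
scales = {
    "Accountability": ["accountability_q1", "accountability_q2"],
    "Collaboration": ["collaboration_q1", "collaboration_q2"],
}
leading_indicators = {
    "3.1 System Norms": ["Accountability", "Collaboration"],
}

scale_scores = {name: mean(item_scores[i] for i in items) for name, items in scales.items()}
indicator_scores = {
    name: mean(scale_scores[s] for s in scale_names)
    for name, scale_names in leading_indicators.items()
}

print(scale_scores)      # {'Accountability': 3.75, 'Collaboration': 2.75}
print(indicator_scores)  # {'3.1 System Norms': 3.25}
```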

What Are the Leading Indicators?
Optimal Characteristics of Performance Data (a–h, as defined in the Opening Activity): Timely, Objective, Reliable, Sensitive, Valid, Feasible, Multi-Purpose, and Multi-Level.

How Have the Leading Indicators Been Used?
Oklahoma Exemplar – System Characteristics:
History of the system
Integration of QIS and required evaluation efforts
75 grantees in the first year, 77 this year
Timeline, data collection methods, and outputs

Site-Level Improvement

Site-Level Improvement
The report is…
– A tool to help you identify the strengths of your program
– A tool to help you identify the weaknesses of your program
The report is not…
– A mechanism to induce evaluative comparisons or competitions across grants
– Something to be scared of

Site-Level Improvement: How to Read and Interpret Your Report
Get a feel for the layout of the report.
Study the graphs – In what areas are you doing comparatively well? In what areas does it look like your site could improve?
Celebrate your strengths.
What could you work on? Do some thinking.
Prepare to make a plan!

Site-Level Improvement: Priority Assessment Form – Leading Indicators
1. Create the story of your data (column one)
– What is the message or story of your data? What do the numbers tell you?
– What's missing from the data? What important things about program quality do not come through?
– Where are the gaps between what you want to provide and what the data says you're providing?
2. Brainstorm ideas for improvement (column two)

System-Level Planning
The profiles (clusters) in Figure A-8 may be interpreted as follows:
Cluster 3: "High quality." Thirty-three percent of grantees fall into Cluster 3, where programs show high quality in all areas.
Cluster 2: "High, with low growth/mastery and family communication." This cluster represents 24% of grantees. These programs show relatively high quality in most areas, but low school alignment and parent communication.
Cluster 1: "Medium." Cluster 1 represents 21% of grantees. These programs have medium levels of supervision quality, high academic press and school alignment, low program quality (growth and mastery), and low family communication.
Cluster 4: "Low, with high school alignment and family communication." Four percent of grantees fall into Cluster 4, where programs show low quality in supervision, growth and mastery, and academic press, but high school alignment and family communication.
Cluster 5: "Low quality." Eleven percent of grantees fall into Cluster 5, where programs appear to demonstrate low quality in all areas.
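The deck does not say how the Figure A-8 profiles were derived. A common approach for this kind of site-level profiling is a cluster analysis (for example, k-means) of standardized indicator scores; the sketch below illustrates that approach with simulated data. The variables, cluster count, and method are assumptions, not the evaluation's documented procedure.

```python
# Illustrative sketch of deriving site-level quality profiles like those in Figure A-8.
# Assumption: profiles come from a k-means-style cluster analysis of standardized
# leading-indicator scores; the deck does not specify the actual method or variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical grantee-by-indicator matrix (rows = sites, columns = indicator scores
# such as supervision quality, growth & mastery, academic press, school alignment,
# and family communication), simulated here for illustration.
scores = rng.uniform(1, 5, size=(75, 5))

# Standardize so each indicator contributes comparably, then fit five clusters.
z = StandardScaler().fit_transform(scores)
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(z)

# Each cluster's mean profile can then be inspected and labeled (e.g., "high quality").
for k in range(5):
    profile = scores[kmeans.labels_ == k].mean(axis=0)
    share = (kmeans.labels_ == k).mean()
    print(f"Cluster {k}: {share:.0%} of grantees, mean indicator scores {profile.round(2)}")
```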

System-Level Planning: Intermediate & Academic Outcomes
Table A-8 provides means of the satisfaction variables across the cluster groups. The omnibus ANOVA indicates that the clusters produce significantly different scores in the Intermediate Outcomes set (staff job satisfaction, parent satisfaction, and parent reports of the program supporting academics). The highest-quality group of grantees/sites (Cluster 3) produces the highest staff satisfaction, youth engagement, and academic efficacy, whereas the lowest-quality group of grantees/sites (Cluster 5) exhibits the lowest or nearly the lowest score in each area.
Table A-8. Mean Scores for Intermediate Outcomes by Level of Quality. Columns: Staff job satisfaction (S), Parent satisfaction (P), Youth engagement (Y), Homework completion (Y), Program supports academics (P). Rows: Cluster 3 (High quality), Cluster 2 (High with low), Cluster 1 (Medium quality), Cluster 4 (Low with high), Cluster 5 (Low quality); omnibus difference across clusters (ANOVA F): 8.3***, 2.6*, ***.
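For readers who want to see what the omnibus test in Table A-8 looks like in practice, here is a minimal one-way ANOVA sketch using simulated staff-satisfaction scores. The group means, spread, and sample sizes are hypothetical, not the values behind the reported F statistics.

```python
# Minimal sketch of the omnibus test reported in Table A-8: a one-way ANOVA
# comparing an outcome (e.g., staff job satisfaction) across the five clusters.
# The data here are simulated; the actual survey values are not reproduced in the deck.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Hypothetical staff-satisfaction means by cluster, highest for the "high quality" group.
cluster_means = {3: 4.2, 2: 3.9, 1: 3.6, 4: 3.4, 5: 3.1}
groups = [rng.normal(loc=m, scale=0.5, size=15) for m in cluster_means.values()]

f_stat, p_value = f_oneway(*groups)
print(f"ANOVA F = {f_stat:.1f}, p = {p_value:.4f}")  # significant if p < .05
```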

Where Have the Leading Indicators Been Used?

Looking Forward/Next Steps
Further Validation Work
– Exploration of the Leading Indicators Framework
– Theoretical and Statistical Expansion to New Networks
Integration of Quality Improvement Systems

Questions?

Thank You!