Washington 21st CCLC Evaluation, February 2016. Copyright © 2016 American Institutes for Research. All rights reserved. 2016 Data Collection Activities.

Presentation transcript:

Washington 21st CCLC Evaluation, February 2016
2016 Data Collection Activities
Samantha Sniegowski, Researcher
Copyright © 2016 American Institutes for Research. All rights reserved.

Agenda (Slide 2)
• Statewide Evaluation Report Findings
• Leading Indicator Report Updates
• Data Dashboard Updates
• Case Studies
• APR Data Submission Updates
• Other Spring Data Collection Activities

Statewide Evaluation Report Findings (Slide 3)

Report Background (Slide 4)
• Covers two program years: 2012–13 and 2013–14
• Grantee and center characteristics covered in both years
• Program quality covered in both years
• Focus: impact analysis on youth outcomes
• Focus: Youth Motivation, Engagement, and Beliefs Survey

Evaluation Questions (Slide 5)
1. What were the primary characteristics associated with the grants and centers funded by 21st CCLC and the student population served by the program?
2. To what extent was there evidence that centers funded by 21st CCLC implemented research-supported practices related to quality afterschool programming?
3. To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on youth outcomes as compared with similar students not participating in the program? ( )
4. What does youth completion of the Youth Motivation, Engagement, and Beliefs Survey indicate both about youth experiences in programming and youth functioning on social and emotional learning and noncognitive areas? ( )


Key Findings – Evaluation Q1 (Slide 7)
Table 2. Grants by Maturity, 2012–13: number and percentage of Washington grants and of all grants nationwide, by grant maturity (New, Mature, Sustaining) and in total. Source: PPICS.
Table 3. Grants by Maturity, 2013–14: number and percentage of Washington grants and of all grants nationwide, by grant maturity (New, Mature, Sustaining) and in total. Source: PPICS.

Key Findings – Evaluation Q1 (Slide 8)
Figure 8. Percentage of Centers per Grade-Level Cluster per Year, 2008–2014

Key Findings – Evaluation Q1 (Slide 9)
Figure 9. Attendees and Regular Attendees in Washington State by APR Year, 2006–2014

Key Findings – Evaluation Q1 (Slide 10)
• More than 90 percent of centers were school based in both programming periods.
• On average, 21st CCLC regular participants attended 61 days of programming during 2012–13 and 63 days during 2013–14.
• Overall, centers had approximately 73 regular attendees and 123 total attendees during the 2012–13 programming period, while centers had approximately 70 regular attendees and 114 total attendees during 2013–14.


Key Findings – Evaluation Q2 (Slide 12)
• Organizational Practices
Strengths:
– Staff reported supportive, collaborative program climates (Staff Survey)
– Consistent meetings to discuss program improvement efforts (higher frequency reported among staff than among site coordinators)
Areas for improvement:
– Opportunities for staff to observe peers delivering programming and provide feedback on practice (Staff Survey)
– Use of data to set program improvement goals with other staff (Staff Survey)

Key Findings – Evaluation Q2 (Slide 13)
• Instructional Practices
Strengths:
– Site coordinators and staff report frequent delivery of practices associated with program design.
– Most programs were considered high functioning as defined by the PQA Form A.
Areas for improvement:
– Staff report struggling to find adequate time to plan activity lessons and offerings.
– Most programs operate at the moderate level as defined by the PQA Form B.

Key Findings – Evaluation Q2 (Slide 14)
• Partnership Practices
Strengths:
– Programs typically communicate with families once or twice a semester.
o Most common strategies: communicating about program events, collaborating to enhance student success, and providing family literacy or social events
– Programs adopt strategies to establish meaningful linkages to the school day.
o Most common strategy: hiring regular school-day teachers
Areas for improvement:
– Programs typically communicate with families once or twice a semester.
o Least common strategies: sending information home about student progress; asking for input from family members about what activities are provided and how
– Programs adopt strategies to establish meaningful linkages to the school day.
o Least common strategy: ensuring activities are aligned with schoolwide improvement targets related to student performance


Key Findings – Evaluation Q3 (Slide 16)
Table 12. Impact of 21st CCLC on Achievement Pooled Across Grades, 2011–12 and 2012–13 Program Years. For each outcome (reading,a mathematics,b cumulative GPA,c and percentage of credits earnedc) and each attendance-based treatment group (including 30+ days), the table reports the effect size, the standard error of the effect size, and the p value for both program years. Note. SE, standard error. a Includes Grades 4–8 and 10. b Includes Grades 4–8. c Includes Grades 9–12.

Key Findings – Evaluation Q3 (Slide 17)
Table 13. Impact of 21st CCLC on Number of Unexcused Absences and Number of Disciplinary Incidents Pooled Across Grades, 2011–12 and 2012–13 Program Years. For each outcome (number of unexcused absencesa and number of disciplinary incidentsb) and each attendance-based treatment group, the table reports the effect, standard error, p value, and weighted mean ratio (treatment/comparison) for both program years. Note. NA, not applicable; SE, standard error. a Includes Grades 6–12. b Includes Grades 3–12.
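The impact estimates in Tables 12 and 13 are reported as effect sizes with standard errors and p values. As a point of reference only (the slides do not reproduce the report's exact estimator), a generic standardized mean difference between participants and matched comparison students takes the form:

```latex
% Illustrative only: a generic standardized mean difference (Cohen's d),
% not necessarily the exact effect-size estimator used in the Washington evaluation.
\[
d \;=\; \frac{\bar{Y}_{T} - \bar{Y}_{C}}{SD_{\text{pooled}}},
\qquad
SD_{\text{pooled}} \;=\; \sqrt{\frac{(n_{T}-1)\,s_{T}^{2} + (n_{C}-1)\,s_{C}^{2}}{n_{T}+n_{C}-2}}
\]
```

where the T and C subscripts denote the treatment and comparison groups, with their respective means, sample sizes, and standard deviations. Under this convention, a negative effect on an outcome such as unexcused absences indicates fewer absences among participants.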


Key Findings – Evaluation Q4 (Slide 19)
• Pilot year of the Motivation, Engagement, and Beliefs Survey (in 38 centers)
• Total of 1,199 surveys completed; average of 32 surveys per center
• Grades 4–12
• Students who were likely to meet the definition of a regular attendee

Key Findings – Evaluation Q4 (Slide 20)
• Major scales:
– Sense of Belonging and Engagement in the Program
– Program Impact on Student Social and Emotional Development
• The majority of youth fell within the positive end of the response scale.
• Relationship with school-related outcomes

Key Findings – Evaluation Q4 (Slide 21)
Table 19. Summary of HLM Results by Survey Subscale and School Outcome, 2014 (coefficient signs and significance levels shown; coefficient and standard error estimates appear in the full report).
• Academic Identity: reading assessment (+)**, mathematics assessment (+)***, unexcused absences (−)*, disciplinary incidents (−)**, intervention days (−)***; reading and mathematics growth percentiles not significant.
• Mindset: mathematics assessment (+)*, unexcused absences (−)*; reading assessment, reading and math growth percentiles, disciplinary incidents, and intervention days not significant.
Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis. ***p < .001, **p < .01, *p < .05, +p < .10.

Key Findings – Evaluation Q4 (Slide 22)
Table 19. Summary of HLM Results by Survey Subscale and School Outcome, 2014, continued.
• Self-Management: disciplinary incidents (−)*, intervention days (−)*; reading and mathematics assessments, reading and mathematics growth percentiles, and unexcused absences not significant.
• Interpersonal Skills: reading assessment (+)*, disciplinary incidents (−)*, intervention days (−)**; reading and mathematics growth percentiles, mathematics assessment, and unexcused absences not significant.
Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis. ***p < .001, **p < .01, *p < .05, +p < .10.
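To make the "HLM" label in Table 19 concrete, the sketch below shows one common way a two-level model of this kind (youth nested in centers) can be fit in Python. The data file, column names, and covariates are hypothetical; this illustrates the general approach, not the evaluation team's actual specification.

```python
# A minimal sketch of a two-level model (youth nested in centers), in the
# spirit of the HLM analyses summarized in Table 19. The file and column
# names (reading_score, academic_identity, grade, center_id) and the
# covariate set are hypothetical, not the evaluation's actual specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_with_outcomes.csv")  # hypothetical analysis file

# Random intercept for each center; fixed effects for the survey subscale
# score and grade level.
model = smf.mixedlm(
    "reading_score ~ academic_identity + C(grade)",
    data=df,
    groups=df["center_id"],
)
result = model.fit()
print(result.summary())  # coefficients, standard errors, and p values
```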

Full Report Available At (Slide 23)
• Washington 21st Century Community Learning Centers Program Evaluation: 2012–13 and 2013–14 (StatewideEvaluationReport.pdf)
• 21st CCLC Evaluation and Accountability webpage on the OSPI website

Leading Indicator Report Updates (Slide 24)

Population of Outcome Data (Slide 25)
• State assessment data
• School-day absences and disciplinary incidents
• Timeline (end of February)

Data Dashboard Updates (Slide 26)

Data Dashboard History (Slide 27)
• Purpose
• Specifications
• Impact of new federal reporting

Data Dashboard – Where we are now (Slide 28)
• Purpose
• Specifications
• Intended use

Case Studies (Slide 29)

Case Study Overview (Slide 30)
• Purpose
• Who is involved?
• What activities will take place?
• Timeline (February to April 2016)

APR Data Submission Updates (Slide 31)

APR Data Submission Timeline (Slide 32)
• Now through Feb. 11: Enter your Activity and Staffing information for Spring 2015
• Feb. 12 through Feb. 19: Make any adjustments to Summer 2014 data
• Feb. 20 through Feb. 26: Make any adjustments to Fall 2014 data
• Feb. 27 through March 4: Enter Participation and Outcome data for Spring 2015; make any adjustments to Activity and Staffing information for Spring 2015

Spring 2015 Participation & Outcomes (Slide 33)
• AIR will provide support for entering these data. Each grantee will receive center-level reports of attendance broken down by the required categories:
– Participation
o Grade level
o Regular attendee status
o Race/ethnicity
o Gender
o Population specifics
– Outcomes
o State assessment data
• Expect to receive these reports by February 25th.
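As an illustration of what a center-level breakdown of this kind involves, the hypothetical sketch below tabulates attendees per center by grade level, regular-attendee status, and gender. The input file and column names are assumptions for the example and do not describe AIR's actual report format.

```python
# Hypothetical sketch: tabulate attendance counts per center by the kinds of
# categories listed above (grade level, regular-attendee status, gender).
# The input file and column names are assumptions for illustration only.
import pandas as pd

attendance = pd.read_csv("spring_2015_attendance.csv")

# Attendees per center by grade level and regular-attendee status.
by_grade = (
    attendance.groupby(["center_id", "grade_level", "regular_attendee"])
    .size()
    .rename("n_students")
    .reset_index()
)

# Attendees per center by gender.
by_gender = (
    attendance.groupby(["center_id", "gender"])
    .size()
    .rename("n_students")
    .reset_index()
)

print(by_grade.head())
print(by_gender.head())
```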

Further Guidance (Slide 34)
• What do I do if I have already entered Participation and/or Outcome data in the system for Spring 2015?
• What do I do if the numbers I have entered do not match the reports that AIR has given me?

Other Spring Data Collection Activities (Slide 35)

Leading Indicator Surveys (Slide 36)
• Used to populate the leading indicator reports
– Data obtained from federal reporting systems
– Surveys
– PQA data
– Youth outcome data
• Purpose is to inform quality improvement efforts
• Leading indicator surveys collect data from two groups:
– Site coordinators
– Staff working directly with youth in the delivery of programming

Leading Indicator Survey Timeline (Slide 37)
• Survey administration: A recorded webinar from AIR will be posted to assist project directors in navigating the online survey system – available February 12th
• Emails to project directors containing a link to the survey management system and a username and password will be sent during the week of February 15th
• Completed surveys from site coordinators and afterschool program staff are due March 31st

Student Survey – Theory of Change (Slide 38)

Student Survey – 2015 Preliminary Results (Slide 39)
• Administered the Student Engagement, Motivation, & Beliefs Survey in all centers during Spring 2015
• Intended for grades 4–12 AND for students who were, or were likely to be, regular attendees by the end of the programming period
• Surveyed as many students as possible
• A total of 4,952 surveys were collected from 21st CCLC participants

Student Survey – 2015 Preliminary Results (Slide 40)
• The current version is composed of the following scales (47 items):
– Academic identity
– Mindsets
– Self-management
– School belonging
– Interpersonal skills
– Retrospective program impact on academic behaviors
– Retrospective program impact on self-management
– Program belonging and engagement

Student Survey – 2015 Preliminary Results (Slide 41)
Percentage of respondents in each response category (Not at all true / Somewhat true / Mostly true / Completely true):
Youth Skills and Beliefs
• Academic Identity: 1% / 9% / 37% / 53%
• Positive Mindsets: 1% / 12% / 46% / 41%
• Self-Management: 3% / 18% / 47% / 32%
• School Belonging (dropped for ): NA
• Interpersonal Skills: 1% / 11% / 50% / 38%
Program Experiences and Impact
• Program Belonging and Engagement: 3% / 13% / 30% / 54%
• Academic Behaviors (retrospective): 4% / 16% / 37% / 43%
• Self-Management (retrospective): 8% / 16% / 39% / 37%
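Distributions like those above are typically produced by averaging each youth's items within a scale and banding the average into the four response categories. The sketch below illustrates the idea for one scale; the item names and cut points are hypothetical, not the survey's actual scoring rules.

```python
# Hypothetical sketch of computing a response distribution for one scale:
# average the scale's items (scored 1-4), band the average into the four
# response categories, and tabulate the shares. Item names and cut points
# are assumptions for illustration, not the survey's actual scoring rules.
import pandas as pd

surveys = pd.read_csv("student_survey_2015.csv")
academic_identity_items = ["ai_1", "ai_2", "ai_3", "ai_4", "ai_5"]

scale_score = surveys[academic_identity_items].mean(axis=1)

bands = pd.cut(
    scale_score,
    bins=[0.99, 1.75, 2.50, 3.25, 4.00],
    labels=["Not at all true", "Somewhat true", "Mostly true", "Completely true"],
)
print(bands.value_counts(normalize=True).round(2))  # share of youth per band
```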

Student Survey – 2016 Administration (Slide 42)
• All centers serving youth in grades 4–12 will be asked to collect youth survey data in the spring of 2016
• The survey is intended for students who are regular attendees, or are likely to be regular attendees by the end of the program year
• Survey as many students as possible – at least 25 if possible
• Preference is to collect data online; let us know if this is going to be an issue by sending us an email

Student Survey – 2016 Administration (Slide 43)
• The statewide student identifier will be collected in relation to youth taking the survey, or some similar method will be employed to connect survey data with the state data warehouses
• Grantees will have access to a webinar on how to administer the survey on April 1
• Online collection of youth survey data will begin on April 4 and continue through May 29
• Results from the youth survey will be made available in the leading indicator reports, and grantees will have access to raw de-identified data
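The linkage described in the first bullet above amounts to joining survey records to state data warehouse records on the statewide student identifier. A rough sketch follows; the file and column names are hypothetical, and the actual match is performed by the evaluation team rather than by grantees.

```python
# Rough sketch of the linkage described above: join survey records to state
# outcome records on the statewide student identifier (SSID). File and
# column names are hypothetical; the real linkage is done by the evaluators.
import pandas as pd

survey = pd.read_csv("student_survey_2016.csv")      # includes an "ssid" column
outcomes = pd.read_csv("state_outcomes_2016.csv")    # assessments, absences, discipline

linked = survey.merge(outcomes, on="ssid", how="inner", validate="one_to_one")
print(f"Linked {len(linked)} of {len(survey)} survey records to state outcomes")
```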

Timeline Recap (Slide 44)
February 5 (Friday), 12:00–1:30 p.m. – State Evaluation Update; Spring Data Collection Process
February 11 (Thursday) – Spring 2014 APR Federal Data Due
February 12–19 – Update Summer 2014 APR Data
February 12 (Friday) – LI Support Webinar Recording Available
February 15 – Leading Indicator Surveys Launched
February 20–26 – Update Fall 2014 APR Data
February 27–March 4 – Update Spring 2015 APR Data
February 28 – Outcomes Populated in LI Reports
February–April – Case Studies
March 31 – AIR Leading Indicator Staff and Site Coordinator Surveys Due
April 1 – Student Survey Administration Webinar
April 4 (Monday) – Student Survey Opens
May 29 – Student Survey Closes
*These are just the items we have covered today. A full revised calendar of events will be provided by OSPI.

Samantha Sniegowski
S. River Plaza, Suite 600
Chicago, IL
General Information: