Copyright © 2010 American Institutes for Research. All rights reserved.
Washington 21st Century Community Learning Centers: From State to Local Program Evaluation
June 2011

2 The Evaluation Team: American Institutes for Research
Recent merger with Learning Point Associates
Responsible for the development and maintenance of the Profile and Performance Information Collection System (PPICS)
Demonstrated 21st CCLC and afterschool content knowledge
Other statewide evaluations of 21st CCLC in New Jersey, Texas, and Oregon

3 The Evaluation Team: David P. Weikart Center for Youth Program Quality
Developers of the Youth and School Age PQAs
Working to build program quality systems in 20 states, including 12 statewide 21st CCLC implementations (Washington among them)
Rigorously tested intervention strategy for improving the quality of youth-serving programs

4 Evaluation Objectives
Provide an assessment of the current state of 21st CCLC program impact
Support the PPICS reporting process and the collection of student-level data
Document the extent to which 21st CCLC programs are meeting local, state, and federal targets and goals
Identify characteristics associated with high-performing programs
Increase the capacity of grantees to meet their program improvement and evaluation obligations

5 Overview
Provide an overview of the 21st CCLC program at the state level
Provide an overview of the 21st CCLC program at the local level
Recommendations for future directions

6 Assessment of the Current State of 21st CCLC
Define the characteristics of the current 21st CCLC programs in Washington
Assess program outcomes with participating students
Examine local evaluation reports and alignment with state-level data
Examine the quality of local evaluation reports to support state guidance

7 Data Sources
21st CCLC Profile and Performance Information Collection System (PPICS): grantee- and center-level data from the Annual Performance Report (APR)
Local evaluation reports prepared for the reporting period

8 Characteristics of 21st CCLC Programs in Washington
46 active 21st CCLC grantees (21 new, 23 mature, 2 sustaining)
A total of 172 centers
Median first-year award: $333,
percent of grants have a school district as a fiscal agent
96 percent of centers located in schools
34 percent offer summer programming

9 Characteristics of 21st CCLC Centers in Washington
During the academic year:
Average of 9.9 hours of programming after school each week
Average of 4.4 days per week over 32 weeks
During the summer:
Average of 20 hours of programming per week
4.4 weeks of programming

10 Characteristics of 21st CCLC Centers in Washington
68 percent of students were regular attendees
Multiple grade levels served; 38 percent of centers served elementary students only
Of the staff, 25 percent were paid school teachers and 35 percent were volunteers
Multiple types of activities; 39 percent of centers were mostly enrichment, 19 percent were mostly homework help

11 Characteristics of 21st CCLC Centers in Washington

12 Characteristics of 21st CCLC Centers in Washington

13 Characteristics of 21st CCLC Centers in Washington

14 21st CCLC Program Student Outcomes
1. Are higher levels of attendance in 21st CCLC programming related to the desired academic and behavioral outcomes?
2. Are particular center and student characteristics associated with student academic and behavioral improvement?

15 State Assessment Outcomes

16 Attendance & Program Outcomes Significant positive relationship between # of days in the program and improved behavior (based on teacher surveys). Higher level of program attendance was not significantly related to increased performance in state assessments in reading and mathematics in (among students who scored below proficiency in ).

17 Program Characteristics and Outcomes
School-based centers:
More likely to be associated with teacher reports of higher improvement rates
More likely to demonstrate significant improvement in elementary students' mathematics proficiency level
Students who scored in the lowest proficiency category in the previous year (e.g., Well Below the Standard) were more likely to demonstrate improvement than higher-performing students.

18 Program Characteristics and Outcomes
In centers classified as mostly tutoring, elementary students' mathematics assessment scores were more likely to improve.
In centers classified as mostly recreation, teachers reported lower rates of improvement in student behaviors.
In centers staffed mostly by teachers, teachers reported higher levels of improvement in motivation, attentiveness, and homework completion and quality.

19 Local Evaluation Reports
1. What types of student outcomes do the programs target, and how do they measure these program outcomes?
2. What evidence can be obtained from the local evaluation reports regarding how programs may impact student outcomes?

20 Student Attendance
Grantee goals: All grantees took attendance, but only 20 identified increasing attendance and retention as a program objective.
Attendance was highest in programs serving elementary school students, averaging 80 percent or higher.
Many programs seeking to achieve a high percentage of regular attendees were not able to achieve their objectives.

21 Academic Performance
Grantee goals: All grantees aimed to increase academic performance.
The PPICS teacher survey and change in state assessment scores were the most commonly used measures of academic performance.
Many reports stated objectives to achieve a specific percentage increase in student achievement, but only a few reported the percent change in achievement.

22 Academic Performance
Almost all reports shared findings descriptively.
Findings were more positive for teacher reports of improvement than for reports of change in state assessment scores.
A large number of reports shared only the percentages of students who improved, while ignoring those whose behaviors declined or those who did not need to change (see the tabulation sketch below).
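To make the reporting gap concrete, here is a minimal sketch, assuming fall and spring teacher ratings on a 5-point scale, of how a report could tabulate improvement alongside declines and students who were already at the top of the scale. The column names, the change_category rule, and the toy data are illustrative assumptions, not the method used in the local reports.

```python
# Minimal sketch: summarizing change in a teacher-rated item so that declines
# and "did not need to improve" cases are reported alongside improvement.
import pandas as pd

df = pd.DataFrame({
    "fall_rating":   [1, 2, 3, 5, 4, 2, 5, 3],   # 1 = low, 5 = high
    "spring_rating": [2, 2, 2, 5, 5, 4, 5, 1],
})

def change_category(row, top_of_scale=5):
    if row["fall_rating"] == top_of_scale and row["spring_rating"] == top_of_scale:
        return "did not need to improve"
    if row["spring_rating"] > row["fall_rating"]:
        return "improved"
    if row["spring_rating"] < row["fall_rating"]:
        return "declined"
    return "no change"

# Percentage of students in each change category, not just percent improved.
summary = df.apply(change_category, axis=1).value_counts(normalize=True) * 100
print(summary.round(1))
```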

23 Student Behavior and Attitudes
Grantee goals:
Increase positive attitudes toward and sense of connection to school (most common)
Decrease referrals and negative behavior
Increase student skills
Increase student exposure to enrichment and community involvement activities
Increase exposure to career and postsecondary education opportunities

24 Student Behavior and Attitudes
Data were collected from student surveys, teacher surveys, student focus groups, and parent surveys.
Teacher reports suggested that the majority of students improved, although significant variations were commonly reported.
Evaluations that targeted more specific skills and behaviors reported positive change (e.g., reduced discipline referrals, fewer missed days, increased knowledge of careers).

25 Parental Involvement
Grantee goals:
Increase parental involvement in children's education
Increase parent attendance in program activities
Increase parent skills and knowledge in English, literacy, and community resources

26 Parental Involvement
Data were collected through parent surveys, interviews, or informal feedback, but consisted primarily of reports of parent attendance in program activities.
Only a small number of programs achieved their goals on parental involvement.
Reports focused mostly on attendance at social events.
Few reported increased parent involvement with students' education.
Few reported increased levels of skills and knowledge in parenting, ESL, or GED preparation.

27 Analysis of Local Evaluation Reports
Largely descriptive analysis to assess program impact
Assessment of outcomes based on PPICS teacher surveys and district-level assessments
Lack of meaningful assessment of student behavioral growth was common
Assessment of parental involvement based on attendance
Results not rigorously compiled

28 Quality of Local Evaluation Reports
Reports were rated on a 4-point scale: 4 = Above Standards, 3 = Acceptable, 2 = Marginal Quality, 1 = Below Standards.
Average ratings by dimension, detailed on the following slides (see also the compilation sketch below):
Clarity of Presentation: 3.0
Comprehensiveness of Content: 2.8
Relevance of Content to Goals and Objectives: 3.0
Rigor of Evidence: 2.4
Report Requirements: 3.0
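As an illustration of how dimension averages like those above could be compiled, here is a minimal sketch assuming each local evaluation report received a 1 to 4 rating on each dimension; the ratings shown are placeholders, not the actual scores behind the reported averages.

```python
# Minimal sketch: averaging rubric ratings (4 = Above Standards ... 1 = Below
# Standards) across local evaluation reports for each quality dimension.
import pandas as pd

ratings = pd.DataFrame(
    [
        # One row per report: clarity, comprehensiveness, relevance, rigor, requirements
        [4, 3, 3, 2, 3],
        [3, 3, 4, 3, 3],
        [3, 2, 2, 2, 3],
    ],
    columns=[
        "Clarity of Presentation",
        "Comprehensiveness of Content",
        "Relevance of Content to Goals and Objectives",
        "Rigor of Evidence",
        "Report Requirements",
    ],
)

print(ratings.mean().round(1))  # average rating per dimension across reports
```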

29 Quality of Presentation
Average rating: 3.0
A large number of reports were rated Above Standards:
Clearly organized
Materials presented in an order that described the program goals and objectives
Activities described in a clear manner to provide an overview of the program

30 Comprehensiveness of Content
Average rating: 2.8
A large number of reports were rated Acceptable or Marginal Quality:
Varying degrees of detail
Inconsistent reporting of data collection tools

31 Relevance of Content to Goals and Objectives
Average rating: 3.0
A large number of reports were rated Acceptable or Marginal Quality:
Did not reflect the program's theory of action
Objectives were not presented using SMART criteria
Assessments were not aligned with program goals and intended outcomes

32 Rigor of Evidence
Average rating: 2.4
More than half of the reports were rated Marginal Quality:
Did not triangulate findings from different sources to support evidence
Descriptive reporting provided little evidence that outcomes were due to program participation
Limited information on the number of respondents (e.g., survey response rates, focus group participants)
Little insight into program implementation and quality

33 Alignment with Report Requirements
Average rating: 3.0
In general, reports followed state requirements and guidelines.
The weakest aspect was meaningful discussion of findings and recommendations for future decision making.

34 Recommendations on Evaluation Design
Identify evaluation questions
Develop a logic model to guide program evaluation
Consider the most rigorous evaluation design that resources allow
Align data collection tools with goals and objectives rather than with what is available
Use measures that better assess student behavioral outcomes

35 Recommendations on Program Implementation and Quality
Incorporate program monitoring into the annual evaluation
Adopt a self-assessment tool on program supports and quality of activities
Develop an evaluation team

36 Recommendations on Reporting
Use a template to report findings, developed and disseminated by OSPI
Require more insightful recommendations in evaluation reports that will guide action plans

37 State-Level Recommendations
Consider the adoption of one or more measures to assess social-emotional functioning and other behaviors related to academic functioning (e.g., task persistence, organizational skills)
Leverage local assessment data to both (a) inform the design and delivery of programming and (b) assess student growth and development
Consider adopting indicators that ask programs to measure within-year student growth on formative assessments employed by the districts they work with (see the sketch below)
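As a minimal sketch of what a within-year growth indicator on a district formative assessment might look like, the snippet below computes fall-to-spring score change per participant and the mean growth; the assessment, scale scores, and column names are hypothetical.

```python
# Minimal sketch: within-year growth on a district formative assessment
# (fall-to-spring change) for 21st CCLC participants. Scores are placeholders.
import pandas as pd

scores = pd.DataFrame({
    "student_id":   [101, 102, 103, 104],
    "fall_score":   [182, 195, 170, 205],
    "spring_score": [198, 201, 169, 220],
})

# Per-student growth and the program-level average.
scores["growth"] = scores["spring_score"] - scores["fall_score"]
print(scores[["student_id", "growth"]])
print("Mean within-year growth:", scores["growth"].mean())
```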

38 State-Level Recommendations
Further examine the relationship between student recruitment and enrollment policies and the achievement of desired outcomes
Find ways to connect the leading indicators currently being developed with local evaluation efforts

39 Activity
Pair or group with the participants sitting next to you.
Reflect on the findings that were presented.
Share with the group:
1. How can the local evaluations supplement state data rather than duplicate the findings?
2. What are the most feasible short-term and long-term recommendations for improving local evaluations?
3. How can the state provide guidance to implement the short-term and long-term recommendations?

40
Manolya Tanyu, Ph.D., Researcher
American Institutes for Research
P:
North Wacker Dr., Suite 1231
Chicago, IL
General Information:
Website:

Neil Naftzger, Principal Researcher
American Institutes for Research
P:
East Diehl Road, Suite 200
Naperville, IL 60563