
1 Ohio 21st CCLC Evaluation. Presentation to Ohio Grantees, September 10, 2015. Copyright © 20XX American Institutes for Research. All rights reserved. Matthew Vinson, Neil Naftzger

2 Agenda
 Who is AIR?
 Evaluation Overview
 Work to Date
– Project Director Survey: Some Results
– Evaluation Advisory Group
 Upcoming Individual Student Data Collection
 Questions?

3 Introductions
AIR:
 Matt Vinson
 Samantha Sniegowski
 Neil Naftzger
 Nicole Adams

4 American Institutes for Research
 Undertaken a number of state and local research and evaluation projects, including:
– Statewide evaluations of 21st CCLC and related afterschool programs in New Jersey, Oregon, Rhode Island, South Carolina, Texas, Washington, and Wisconsin
– Local evaluations of 21st CCLC and related afterschool programs, most recently in Chicago, Nashville, and Palm Beach
– Research grants funded by both the William T. Grant and Charles Stewart Mott Foundations
 Held a number of federal contracts related to the 21st CCLC program
 Known for Beyond the Bell, our comprehensive toolkit for starting and running high-quality afterschool programs

5 Conceptual Framework

6 Evaluation Overview

7 Evaluation Overview
 Will cover Year 1 and Year 2 21st CCLC grantees following Paths A, B, and C
 Will last approximately three years (through June 2018)
 Particular focus on literacy, career readiness, and school attachment
 Comprises implementation and outcome components

8 Evaluation Overview: Implementation Research Questions
 RQ1. How does implementation of the 21st CCLC program vary across the three primary programming paths associated with the 2015 and 2016 RFPs, particularly in relation to supporting literacy, career readiness, and school attachment?
 RQ2. In what ways are 21st CCLC programs taking steps to ensure the quality of the programming they are delivering?
 RQ3. What kinds of experiences are youth having in programming? How do these experiences vary across the three primary programming paths associated with the 2015 and 2016 RFPs?

9 Evaluation Overview: Outcomes Research Questions
 RQ4. To what extent do youth who participate in 21st CCLC-funded programming more frequently (high-attending youth) demonstrate greater annual growth on short-term measures of targeted skills, beliefs, and knowledge compared to youth participating in the program less frequently (low-attending youth)?
 RQ5. To what extent do youth who participate in 21st CCLC-funded programming more frequently (high-attending youth) for multiple years perform better on school-related outcomes compared to similar youth attending the same schools who do not participate in programming?

10 Evaluation Overview: Data Collection
 Individual student data collection
 21st CCLC data (federal, once established)
 Case study visits (to collect best practices, etc.)
 Surveys (project director, teacher, staff, youth)
 Literacy activity information
 Local evaluation reports
 ODE youth demographic data
 ODE youth performance data (school-related outcomes, assessments, etc.)

11 Evaluation Overview: Methods
 Basic descriptive analysis (sums, averages, etc.)
 Case study write-ups
 Hierarchical linear modeling (correlational analysis)
– Links program characteristics with survey responses, etc.
 Propensity score matching (PSM)
– Enables creation of a comparison group by matching the participant population of interest with other youth on key variables (e.g., demographics, prior-year assessment scores, etc.)
– Next best to random assignment
– Has shown interesting results in other state assessments
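The PSM step described above can be sketched in a few lines. The following is a minimal, hypothetical illustration using scikit-learn (logistic-regression propensity scores followed by 1-nearest-neighbor matching on simulated data), not AIR's actual implementation; the variable names and simulated covariates are assumptions for demonstration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Simulated covariates: a prior-year assessment score and a demographic flag.
n = 200
X = np.column_stack([rng.normal(50, 10, n), rng.integers(0, 2, n)])
treated = rng.integers(0, 2, n).astype(bool)  # hypothetical participation flag

# Step 1: estimate each youth's propensity (probability) of participating
# in programming from the observed covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: pair each participant with the non-participant whose propensity
# score is closest (1-nearest-neighbor matching).
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
comparison = np.flatnonzero(~treated)[idx.ravel()]

# `comparison` now holds, for each participant, the index of the most
# similar non-participant; outcomes can then be compared across the
# matched groups, approximating the contrast in RQ5.
```

In practice the matching would use real demographic and assessment variables, often with a caliper on score distance and balance diagnostics after matching.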

12 Evaluation Overview: Near-Term Research Efforts
 Project Director Survey (finished, ~80% return rate)
 Establish an Evaluation Advisory Group
 Individual student data collection (open today)
 Arrange initial site visits
 Review local evaluation reports
 Surveys
– Expect more information soon!

13 Work to Date: Project Director Surveys
 Administered during June and July
 About 80% return rate (58 programs responded and completed the survey)
 Covering Paths A, B, and C

14 Work to Date: Project Director Surveys
To what extent do you see it as a goal of the program to impact youth in the following ways related to the development of literacy skills?

| Goal | Not a goal | Minor goal | Moderate goal | Major goal | Not sure |
| Development and practice of basic literacy skills | 0% | 7% | 3% | 88% | 0% |
| Development of literacy-related learning strategies | 0% | 9% | 10% | 79% | 0% |
| Enhance student confidence as readers | 0% | 3% | 10% | 84% | 0% |
| Cultivation of a positive mindset related to reading (e.g., growth mindset: if I put forth the effort, I can succeed) | 0% | 2% | 19% | 78% | 0% |
| Cultivation of interest in reading | 0% | 5% | 7% | 86% | 0% |

15 Work to Date: Project Director Surveys
Please select the various types of staff that provide literacy activities within your program. (Select all that apply.)

16 Work to Date: Project Director Surveys
For activities that are especially meant to support student growth and development in literacy, what is the typical staff-to-student ratio?

17 Work to Date: Project Director Surveys
Are you using any published or externally developed curriculum selected specifically to support literacy activities delivered in the afterschool program?

18 Work to Date: Project Director Surveys
In the typical week, how many hours, if any, are dedicated to providing direct instruction in LITERACY to at least some participating youth? (Please enter a value greater than or equal to zero.)

19 Work to Date: Project Director Surveys
Approximately what percentage of youth served by your program are participating in direct instruction-related LITERACY activities each week? (Please enter a value greater than zero.)

20 Work to Date: Project Director Surveys
For those students participating in direct instruction-related LITERACY activities, approximately what percentage of time do they spend in direct instruction activities as a percentage of their total weekly participation in the program?

21 Work to Date: Project Director Surveys
Does your program include activities that are meant to get parents and other adult family members more involved in supporting the literacy development of students enrolled in the program?

22 Work to Date: Evaluation Advisory Group (EAG)
 About 12 non-AIR, non-ODE advisors
 Grant directors and local evaluators
 Conference call on September 2 (introduction and individual student data collection)
 Conference call on September 7 (important concerns of local evaluators)
 We will continue to meet concerning evaluation tasks, data-collection instruments, grantee concerns, and so on.

23 Upcoming: Individual Student Data Collection
 Purpose: collects data not otherwise available:
– 21st CCLC participant names
– Participant attendance (number of days for summer 2014, fall 2014, and spring 2015)
– Estimated hours of literacy instruction per week
– SSID (state student identifier)
 Provides AIR with youth participation data
 Enables linkage to ODE state data

24 Upcoming: Individual Student Data Collection
 What about student privacy?
– All data collected in the system, along with all other data collected as part of the evaluation, are strictly for use by AIR to assess Ohio 21st CCLC program impact. The data will be put to no other use.
– AIR has strict, industry-standard security measures in place to ensure all data are kept absolutely private.
– At no time will any youth-identifying information be released in any report (or otherwise).
– AIR is working with ODE to ensure all data are kept secure and all legal aspects are covered.

25–38 Upcoming: Individual Student Data Collection
[Slides 25 through 38, all titled "Upcoming: Individual Student Data Collection," contain no transcript text.]
39 Upcoming: Individual Student Data Collection
 The Individual Student Data Collection System is launching in two stages.
 The "delegation" component (for district staff) is forthcoming:
– We want to make sure it works properly.
– We want to incorporate feedback we received to make sure the process flows as smoothly as possible.
 You will receive an email from AIR when the delegation function is ready.

40 Upcoming: Individual Student Data Collection
 Next steps:
– You will receive an email with the URL for the individual student data collection system
– The email will have login instructions
– Begin entering: SUMMER 2014 and SCHOOL YEAR 2014-15
– Send questions to nadams@air.org
– Send feedback to mvinson@air.org

41 Questions?
Matthew Vinson, mvinson@air.org
Neil Naftzger, nnaftzger@air.org

