
1 Copyright © 2011 American Institutes for Research. All rights reserved.
Oregon 21st Century Community Learning Centers Program Evaluation
Neil Naftzger & Deborah Moroney
April 16, 2012

2 Agenda
Part 1: Evaluation and Leading Indicators, 1:30-2:15
Welcome and Introductions
Sharing Roles
Purpose of the Meeting
Quality Framework Overview: discussion of the quality framework in relation to the evaluation, defining what quality programming is, and the desired outcomes of the 21st Century program
Leading Indicators (LI)

3 Key Questions
How do LI relate to program quality?
In our various roles supporting the academic and social success of children and their families, how can we best work together and use the information we have here?
What should be our next steps in thinking about this work?

4 The Evaluation Team
American Institutes for Research
Gibson Consulting Group, Inc.
Oregon Department of Education and the Leading Indicator Advisory Group (LIAG)

5 Quality Framework

6 Children and youth bring their own contribution to the afterschool setting and to their own success in that setting.

7 Youth Characteristics
1. Academic performance and skills
2. Demographics
3. Communication, relationship and collaboration, critical thinking and decision making, and initiative and self-direction
4. Quality of learning experience at center feeder
5. Access to other key external developmental assets (e.g., family support and involvement, caring neighborhood)

8 Quality Framework
The resources and characteristics of the local and school community support the development of program goals and program design, and allow for meaningful partnerships and program guidance.

9 Community Context
1. Locale (urban, suburban, rural)
2. School-based or center-based
3. Financial resources
4. Administrator support/role of influential stakeholders and decision makers
5. School status relative to AYP/status relative to other accountability measures
6. Program maturity
7. Grade level of youth
8. Community stability/safety

10 Quality Framework
Program quality is based both on observable dimensions of quality and processes that are foundational to program quality, including Organizational Processes, Quality at the Point of Service, and Opportunities for Engagement.

11 Program Quality
Organizational Processes
1. Definition of service population/enrollment
2. Recruitment approaches
3. Staffing (hiring, orientation, development, and evaluation)
4. Access to and use of youth data
5. Establishing linkages to the school day
6. Selection of key partners and partner engagement
7. Program improvement/evaluation processes
8. Approaches to parent/family engagement
9. Provision of developmentally appropriate opportunities for youth choice, voice, ownership, and program leadership
10. Alignment of youth needs, program objectives, and programming approach/theory of change (in relation to both academic and non-academic outcomes)
11. Selection and utilization of quality frameworks
Instructional/Point of Service Quality
1. Safe, supportive, interactive, and engaging settings
2. Activities are sequenced, active, and focused
3. Intentional activity and session design/embedding content
4. Evidence of emotional support, activity organization, and instructional support

12 Quality Framework
Children and youth are more likely to experience benefits from afterschool program participation if they attend consistently, over time, and in a variety of types of activities.

13 Program Participation
1. Duration of participation
2. Intensity of participation
3. Breadth of participation
4. Degree of interaction with a consistent set of staff

14 Quality Framework
Afterschool program participants are most likely to reap positive youth outcomes if we take into account:
1. what participants bring to the program (Youth Characteristics);
2. how well the program reflects and involves the resources in the community (Community Context);
3. the extent to which they participate in the program (Participation); and
4. the quality of the program (Program Quality).

15 Positive Youth Outcomes
1. Improved communication, relationship and collaboration, critical thinking and decision making, and initiative and self-direction skills
2. Enhanced bonding to school
3. Decrease in problematic, at-risk behaviors/disciplinary incidents
4. Improved school day attendance
5. Improved reading and mathematics achievement
6. Improved grade promotion
7. Improved college and career readiness/ACT-SAT scores

16 Goals of the Leading Indicator System
• Provide information about how well an individual center and the state as a whole are doing in implementing programming that is likely to achieve the goals and objectives specified for the program
• Help establish a standard of quality that grantees should be striving toward in the implementation of their program
• Influence grantee behavior by detailing service delivery expectations and their performance relative to these expectations
• Help inform state staff about what steps need to be taken in training, technical assistance, and policy development to support grantees in the achievement of program improvement goals

17 Leading Indicators: Collaboration & Partnership
LI: Partners associated with the center are actively involved in planning, decision making, evaluating, and supporting the operations of the afterschool program.
LI: Staff from partner organizations are meaningfully involved in the provision of activities at the center.
LI: Staff at the center will be engaged in intentional efforts to collaborate and communicate frequently about ways to improve program quality.
LI: Steps are taken by the center to establish linkages to the school day and use data on student academic achievement to inform programming.

18 Leading Indicators: Staff
LI: Staff at the center are provided with training and/or professional development.
LI: Staff at the center complete one or more self-assessments during the programming period.
LI: Staff at the center are periodically evaluated/assessed during the program period.

19 Leading Indicators: Intentional Activities (Students)
LI: There is evidence of alignment between (a) program objectives relative to supporting youth development, (b) student needs, and (c) program philosophy/model AND the frequency/extent to which key opportunities and supports are provided to youth.
LI: There is evidence of alignment between (a) program objectives relative to the academic development of students, (b) student needs, and (c) program philosophy/model AND activities being provided at the center.
LI: Intentionality in activity and session design among staff responsible for the delivery of activities meant to support student growth and development in mathematics and reading/language arts.

20 Leading Indicators: Intentional Activities (Families)
LI: Steps are taken by the center to reach out and communicate with parents and adult family members of participating students.
LI: There is evidence of alignment between (a) program objectives relative to supporting family literacy and related development, (b) family needs, and (c) program philosophy/model AND activities being provided at the center.

21 Leading Indicator Reports
• Goal is to embed leading indicator reports into PPICS in the interest of supporting program improvement efforts
• Provide a snapshot of center status
• Needs to be understandable and interpretable
• Needs to convey meaningful information
• Needs to support discussions and conversations with 21st CCLC staff
• Facilitate an advisory group to guide and support the leading indicator development process

22 Part 2: Discussion of Methods, Data Collection, and Analysis, 2:30-4:00
Presentation of Evaluation Design
Research Questions
Primary Evaluation Design and Deliverables
Role of the Leading Indicator Advisory Group
Preliminary Findings
Proposed Method(s) of Analysis
Feedback on the Evaluation

23 Key Questions
Are there particular ways we should look at the data we have presently?
How should this work look in relation to evaluations in other states and in relation to other evaluative efforts to understand quality and impact?

24 Implementation Questions
What is the spectrum of program quality across the programs under consideration?
– Organizational Processes
– Instructional/Point of Service Quality
What organizational processes are found to be drivers of instructional/point of service quality at high performing centers?
What instructional approaches are associated with high levels of student engagement at the point of service?
What is the relationship between (1) the characteristics of individual youth, (2) program context, and (3) center quality and levels of student participation in 21st CCLC programming?

25 Evaluation Questions: Program Outcomes
To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest as compared with similar students not participating in the program?
To what extent is there evidence that students participating more frequently in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest?
To what extent is there evidence of a relationship between center and student characteristics and the likelihood that students demonstrated better performance on desired program outcomes?

26 New Data Collection Activities: Youth Outcomes
Modified PPICS to allow for the collection of student-identifiable information
Data will be used to run queries against the state assessment data warehouse to obtain reading and mathematics scores and other relevant outcome data for 21st CCLC participants and non-participating students attending the same schools
Data will be used to support impact analyses predicated on comparing 21st CCLC program participants with non-participants
Method of analysis allows us to sort out preexisting differences between students who attend and those who do not (see the sketch below)
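The slide does not name the specific analytic method used to adjust for preexisting differences; propensity score matching is one common way to build a comparison group of non-participants who resemble participants on pre-program characteristics. The sketch below is a minimal, hypothetical illustration of that general idea only: the column names (attended_21stcclc, prior_reading_score, prior_math_score, grade_level, frl_eligible, reading_gain) are placeholders, not actual PPICS or state data warehouse fields.

# Minimal propensity-score-matching sketch for a participant vs. non-participant
# comparison. All column names are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_effect(df: pd.DataFrame) -> float:
    """Estimate the participant/non-participant difference in an outcome
    after matching on observed pre-program characteristics."""
    covariates = ["prior_reading_score", "prior_math_score", "grade_level", "frl_eligible"]
    treated = df[df["attended_21stcclc"] == 1]
    control = df[df["attended_21stcclc"] == 0]

    # 1. Model the probability of participation from pre-program covariates.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df["attended_21stcclc"])
    p_treated = model.predict_proba(treated[covariates])[:, 1]
    p_control = model.predict_proba(control[covariates])[:, 1]

    # 2. Match each participant to the non-participant with the closest
    #    propensity score (1-to-1 nearest neighbor, with replacement).
    nn = NearestNeighbors(n_neighbors=1).fit(p_control.reshape(-1, 1))
    _, idx = nn.kneighbors(p_treated.reshape(-1, 1))
    matched_control = control.iloc[idx.ravel()]

    # 3. Compare mean outcomes across the matched groups.
    return float(treated["reading_gain"].mean() - matched_control["reading_gain"].mean())

Matching with replacement keeps every participant in the analysis; calipers or post-matching covariate adjustment are common refinements, and the evaluation team's actual specification may differ.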

27 Data Collection Activities: Program Quality
Site Coordinator Survey
Focus on practices, policies, and procedures adopted by 21st CCLC-funded programs:
• Collaboration & Partnership
• Intentionality in activity and session design
• Linkages to the school day
• Data on student academic achievement to inform programming
• Practices supportive of positive youth development
• Practices supportive of family engagement
Site Visits
Highlight promising activity delivery practices:
• Visit a small number of especially promising programs that have reported adopting high-quality practices
• Conduct program observations employing the CLASS observation tool

28 Report Functionality
• Goal is to ensure reports can support meaningful comparisons (see the sketch after this list):
– Against statewide averages
– Over time
– By key center characteristics: grade level, recruitment and retention policies, staffing model, activity model, maturity
• May attempt to include recommendations and action planning tools as well
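As a hypothetical illustration of the comparisons listed above, the sketch below computes one center's indicator value against the statewide average over time and breaks the indicator out by a center characteristic. The DataFrame columns (center_id, year, staffing_model, indicator_value) are invented for illustration and are not actual PPICS report fields; it assumes one indicator record per center per year.

# Hypothetical sketch of report-style comparisons: a center's leading indicator
# value vs. the statewide average, and the indicator by center characteristic.
import pandas as pd

def center_vs_state(df: pd.DataFrame, center_id: str) -> pd.DataFrame:
    """Return one center's yearly indicator value alongside the statewide mean."""
    state = df.groupby("year")["indicator_value"].mean().rename("state_avg")
    center = (
        df[df["center_id"] == center_id]
        .set_index("year")["indicator_value"]
        .rename("center_value")
    )
    # Align the two series on year so each row shows center vs. state.
    return pd.concat([center, state], axis=1)

def by_characteristic(df: pd.DataFrame, characteristic: str = "staffing_model") -> pd.Series:
    """Average indicator value broken out by a center characteristic."""
    return df.groupby(characteristic)["indicator_value"].mean()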

29 Notable Implementation Findings – Other States
• Typically a fair degree of variation is found across both programs and staff within programs in the adoption of practices and approaches associated with quality implementation
• Documentation of a relationship between instructional practices theoretically associated with supporting youth engagement and student reports of engagement
• Importance of intentionality and youth ownership in activity session design and delivery
• Positive program climate and engaging settings tended to be predicated on relationships defined by knowledge of the students' needs, interests, and personal lives
• Ongoing challenges demonstrated by programs in using student data to inform and drive the design of programming, even in programs where the data were accessible to program staff

30 Contact
Oregon Evaluation general email: OR21stcclc@air.org
Neil Naftzger, P: 640-649-6616, nnaftzger@air.org
Deborah Moroney, P: 312-288-7609, dmoroney@air.org
American Institutes for Research General Information: 800-356-2735, www.air.org

