Copyright © 2011 American Institutes for Research. All rights reserved. Oregon 21st Century Community Learning Centers Program Evaluation. Neil Naftzger.

Presentation transcript:

Copyright © 2011 American Institutes for Research. All rights reserved.
Oregon 21st Century Community Learning Centers Program Evaluation
Neil Naftzger & Deborah Moroney
October 21, 2011

2 Topics
Provide a summary of our proposed approach to undertaking the evaluation:
- Evaluation Questions
- New Data Collection Efforts
- Leading Indicator Development
- Timeline

3 The Evaluation Team
American Institutes for Research
- Recent merger with Learning Point Associates
- Demonstrated 21st CCLC and afterschool content knowledge:
  - Statewide 21st CCLC and afterschool evaluation and research studies in New Jersey, South Carolina, Texas, Washington, and Wisconsin
  - Responsible for the development and maintenance of the Profile and Performance Information Collection System (PPICS)
  - Support the U.S. Department of Education in monitoring state delivery of 21st CCLC and in the creation of practitioner guides
  - Provider of afterschool training and technical assistance based on our Beyond the Bell toolkit; currently serve as the statewide training and technical assistance provider for 21st CCLC in Illinois

4 The Evaluation Team
Gibson Consulting Group, Inc.
- Texas-based research and evaluation firm
- Working with AIR to complete the statewide evaluation of the 21st CCLC program in Texas
- Recently completed 40 two-day site visits of 21st CCLC programs in Texas, which included:
  - Site coordinator interviews
  - Staff focus groups
  - Activity observations
  - Administration of student surveys

5 The Evaluation Team
Leading Indicators Advisory Group (LIAG)
A representative group of local 21st CCLC leaders from Oregon.
LIAG Charge:
- Provide information about how well an individual center and the state as a whole are doing in implementing programming that is likely to achieve the goals and objectives specified for the program
- Inform efforts to establish targets that centers should be striving toward in the implementation of their programs
- Help inform state staff on what steps need to be taken on the training, technical assistance, and policy development fronts to support grantees in achieving program improvement goals

6 Evaluation Questions: Program Outcomes
- To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest as compared with similar students not participating in the program?
- To what extent is there evidence that students participating more frequently in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest?
- To what extent is there evidence of a relationship between center and student characteristics and the likelihood that students demonstrated better performance on desired program outcomes?

7 New Data Collection Activities: PPICS Module
- Modified PPICS to allow for the collection of student-identifiable information
- Student-level data will be collected in relation to Regular Attendee participation during the summer of 2010 and school year
- Integrated into the existing Regular Attendee module of PPICS
- We advise selecting the Regular Attendee reporting option in PPICS if you have not already done so
- All PPICS passwords were reset to conform to stricter security parameters; re-entry of username and password information is required when providing student-identifiable data
Important Points:
1. Username and Password form
2. PPICS breakout session

8 New Data Collection Activities: Youth Outcomes
- Data will be used to run queries against the state assessment data warehouse to obtain reading and mathematics scores and other relevant outcome data for 21st CCLC participants and for non-participating students attending the same schools
- Data will be used to support impact analyses predicated on comparing 21st CCLC program participants with non-participants
- The method of analysis allows us to sort out preexisting differences between students who attend and those who do not
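The slide does not name the specific analytic method, but one common way to "sort out preexisting differences" in this kind of participant/non-participant comparison is matching each participant to a non-participant with a similar pretest score before comparing outcomes. A minimal sketch of that idea, with entirely hypothetical variable names and data:

```python
import numpy as np

# Nearest-neighbor matching on a pretest score: each participant is
# paired with the non-participant whose prior achievement is closest,
# and the outcome gap is averaged over the matched pairs.
def matched_difference(prior, outcome, participant):
    """Mean outcome gap between participants and their matched controls."""
    part = np.flatnonzero(participant)
    ctrl = np.flatnonzero(~participant)
    gaps = []
    for i in part:
        j = ctrl[np.argmin(np.abs(prior[ctrl] - prior[i]))]  # closest pretest
        gaps.append(outcome[i] - outcome[j])
    return float(np.mean(gaps))

# Hypothetical data: participants gain 5 points over their pretest.
prior = np.array([50, 60, 55, 52, 61, 54])
participant = np.array([True, True, False, False, False, False])
outcome = prior + np.where(participant, 5, 0)
effect = matched_difference(prior, outcome, participant)  # → 3.5
```

This is only a toy illustration of the comparison logic; a real impact analysis would match on many covariates (e.g., via propensity scores) rather than a single pretest.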

9 New Data Collection Activities: Program Quality
Site Coordinator Survey
Focus on practices, policies, and procedures adopted by 21st CCLC-funded programs:
- Collaboration and partnership
- Intentionality in activity and session design
- Linkages to the school day
- Use of data on student academic achievement to inform programming
- Practices supportive of positive youth development
- Practices supportive of family engagement
Site Visits
Highlight promising activity delivery practices:
- Visit a small number of especially high-performing programs
- Conduct program observations employing the CLASS observation tool

10 Interrelated Factors of Program Quality

11 Goals of the Leading Indicator System
- Provide information about how well an individual center and the state as a whole are doing in implementing programming that is likely to achieve the goals and objectives specified for the program
- Help establish a standard of quality that grantees should be striving toward in the implementation of their programs
- Influence grantee behavior by detailing service delivery expectations and grantees' performance relative to those expectations
- Help inform state staff on what steps need to be taken on the training, technical assistance, and policy development fronts to support grantees in achieving program improvement goals

12 We need your help answering these Key Questions
- Is this indicator understandable and interpretable?
- Does this indicator convey meaningful information?
- Will this indicator support discussions and conversations with 21st CCLC staff?
Other ways you can give us feedback:
- Attend the afternoon Leading Indicator breakout session
- Email us your comments

13 Leading Indicators: Collaboration & Partnership
- LI: Partners associated with the center are actively involved in planning, decision making, evaluating, and supporting the operations of the afterschool program.
- LI: Staff from partner organizations are meaningfully involved in the provision of activities at the center.
- LI: Staff at the center are engaged in intentional efforts to collaborate and communicate frequently about ways to improve program quality.
- LI: Steps are taken by the center to establish linkages to the school day and use data on student academic achievement to inform programming.

14 Leading Indicators: Staff
- LI: Staff at the center are provided with training and/or professional development.
- LI: Staff at the center complete one or more self-assessments during the programming period.
- LI: Staff at the center are periodically evaluated/assessed during the programming period.

15 Leading Indicators: Intentional Activities (Students)
- LI: There is evidence of alignment between (a) program objectives relative to supporting youth development, (b) student needs, and (c) program philosophy/model AND the frequency/extent to which key opportunities and supports are provided to youth.
- LI: There is evidence of alignment between (a) program objectives relative to the academic development of students, (b) student needs, and (c) program philosophy/model AND the activities being provided at the center.
- LI: Intentionality in activity and session design among staff responsible for the delivery of activities meant to support student growth and development in mathematics and reading/language arts.

16 Leading Indicators: Intentional Activities (Families)
- LI: Steps are taken by the center to reach out and communicate with parents and adult family members of participating students.
- LI: There is evidence of alignment between (a) program objectives relative to supporting family literacy and related development, (b) family needs, and (c) program philosophy/model AND the activities being provided at the center.

17 Leading Indicator Reports
- Goal is to embed leading indicator reports into PPICS to support program improvement efforts and mid-year corrections before the programming period ends
- Provide a snapshot of center status; each report:
  - Needs to be understandable and interpretable
  - Needs to convey meaningful information
  - Needs to support discussions and conversations with 21st CCLC staff
- Facilitate an advisory group to guide and support the leading indicator development process

18 We need your help answering these Key Questions
- Is this indicator understandable and interpretable?
- Does this indicator convey meaningful information?
- Will this indicator support discussions and conversations with 21st CCLC staff?
Other ways you can give us feedback:
- Attend the afternoon Leading Indicator breakout session
- Email us your comments

19 Report Functionality
- Goal is to ensure reports can support meaningful comparisons:
  - Against statewide averages
  - Over time
  - By key center characteristics: grade level, recruitment and retention policies, staffing model, activity model, maturity
- May attempt to include recommendations and action planning tools as well
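The slides do not specify how the "against statewide averages" comparison would be computed, but the underlying logic is simple: for each leading indicator, report the gap between a center's value and the statewide mean. A toy sketch, with hypothetical indicator names and values:

```python
# Compare a center's leading-indicator values to statewide averages.
# Indicator names and numbers below are illustrative, not from the deck.
def compare_to_state(center, statewide):
    """Return each indicator's signed difference from the statewide average."""
    return {name: round(center[name] - statewide[name], 2)
            for name in center}

center = {"staff_trained_pct": 0.80, "partner_involvement": 3.1}
statewide = {"staff_trained_pct": 0.72, "partner_involvement": 3.4}
gaps = compare_to_state(center, statewide)
# positive gap = above the statewide average, negative = below
```

A real report would add the over-time and by-characteristic breakouts the slide lists, but the center-versus-state gap is the basic unit of comparison.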

20 Data Collection Timeline

Task                       Start Date    End Date
PPICS Related Activities:
  APR Data                 Open          11/15/2011
  Student ID Data          10/18/2011    12/15/2011
Site Coordinator Survey    11/21/2011    12/15/2011
Site Visits                2/15/2012     3/15/2012

21 Contact
Oregon Evaluation general:
Neil Naftzger  P:
Deborah Moroney  P:
American Institutes for Research
General Information: