
Partnerships, Alliances, and Coordination Techniques: Building Capacity to Evaluate Partnership Initiatives. February 2008. Facilitated by the National Child Care Information and Technical Assistance Center (NCCIC). NCCIC is a service of the Child Care Bureau. Presented by the National Child Care Information Center

Presenter

Today's Agenda

Evaluation of Partnership Initiatives: It is messy, but doable…and could be fun! This module will show you how to keep your head without losing your shirt!

Session Objectives Participants will be able to: 1. Understand the basics of evaluation: approaches, data collection, and analysis. 2. Conduct an assessment of current capacity for evaluating partnership initiatives. 3. Determine the purpose and scope of their evaluation. 4. Understand the role of partners in making meaning of and communicating evaluation results.

PACT PACT is an initiative of NCCIC, a service of the Child Care Bureau, U.S. Department of Health and Human Services. PACT gives State, Territory, and Tribal policymakers, particularly Child Care and Development Fund Administrators and their partners, the resources they need to build more comprehensive and collaborative early care and school-age programs for serving children and families.

PACT Materials PACT Collaborative Leadership Strategies: A Guide for Child Care Administrators and Their Partners. This Web-based guide contains an introduction and six training modules: –Fundamentals of Collaborative Leadership –Creating, Implementing, and Sustaining Partnerships –Communication Strategies –Management Strategies for Successful Partnerships –Financing –Building Capacity to Evaluate Partnership Initiatives

Objective 1: The Basics of Evaluation …Getting Your Feet Wet!

Goals of Evaluation Evaluation is a strategy to identify, monitor, and track progress of the implementation and expected outcomes of a collaborative project. The evaluation plan serves as a guide for partners, staff, and others in both day-to-day activities and long-range planning. It is critical to be clear on the purpose of the evaluation and to match approaches and measures to that purpose!

Benefits of Evaluation +++ On the plus side +++ A Good Evaluation… –Sets clear targets and goals –Provides objective information –Assists in project management –Builds public awareness and support –Improves performance –Impacts outcomes –Increases funding Source: Child Care Partnership Project. (2000). Using results to improve the lives of children and families: A guide for public-private partnerships. Washington, DC: Child Care Bureau, Administration for Children and Families, U.S. Department of Health and Human Services.

Considerations and Cautions --- On the downside --- An Ineffective Evaluation… –Sets demands for significant results too quickly –Makes unrealistic assumptions about what caused change –Makes it difficult to collect appropriate data, given the current state of early childhood measurement tools –Causes unintended harm to children or families if results are used inappropriately –Results in a redirection, realignment, or removal of program activities Source: Child Care Partnership Project. (2000). Using results to improve the lives of children and families: A guide for public-private partnerships. Washington, DC: Child Care Bureau, Administration for Children and Families, U.S. Department of Health and Human Services.

The Language of Evaluation The ABCs of Evaluation: Accountability, Assessment, Aggregate, Beta Level, Control Group… What Terms Confuse You? Source: Child Care & Early Education Research Connections. (n.d.). Research glossary. Retrieved March 25, 2008.

Why Work with Partners to Build Capacity for Evaluation? Accountability is being required in many sectors: in Head Start, in Child Care, in Prekindergarten/Education, and in Early Intervention. –Multiple partners are increasingly working together to align initiatives and programs to increase access to, and the effectiveness of, early care and education services. –A number of States and communities are designing early childhood systems initiatives or developing cross-sector initiatives to meet the multiple needs of families and children and provide more comprehensive services.

Objective 2: Building Capacity for Evaluation ……Is the Water Warm Enough?

Why is Evaluation Important to You/Your Collaborative Project? What specific needs do you have that you would like the evaluation to address? –What are your goals? –What are each partner's goals? What do you think are the benefits? –To your organization? To children/families/practitioners? What do you think are the challenges? –Are costs, capacity, and resources available? What are your fears about evaluation?

Considerations in Assessing Your Project's Capacity for Evaluation What progress do you expect? What information will help you document gains? What data are already available, and what data are needed? What capabilities do you have now? What do you need? How much time will it take to get the system working well? How much $$$$ will it require?

Six Key Strategies to Build Capacity for Evaluation Establish a culture of accountability Develop a long-range strategic plan Partner with researchers and experts Ensure data quality Engage families & business/legislators Communicate results simply and often

Assessing Your Project's Capacity for Evaluation From your small group discussion on building capacity: –What surprised you? –Which elements are your strengths? –Which elements need to be addressed? –What next steps have you identified? Common Issues to Address in Building Capacity –Evaluation Expertise –Costs

Ensuring You Have Evaluation Expertise… Key partners or an executive committee provide oversight to the evaluation team. Options for the Evaluator's Role: An outside evaluator (which may be an individual, research institute, or consulting firm) who serves as the team leader and is supported by in-house staff. An in-house evaluator who serves as the team leader and is supported by program staff and an outside consultant. An in-house evaluator who serves as the team leader and is supported by program staff. Source: Office of Planning, Research & Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. (2006). Chapter 2: What will evaluation cost? In The program manager's guide to evaluation. Retrieved March 25, 2008.

Evaluation Cost Considerations Evaluation costs are driven by: –Evaluation design –The number of participants assessed –Standardized measures (number used, assessor training & reliability practices, frequency of assessment) –Data availability & quality (including automation of data entry & analyses) –Methods of reporting & communicating results –Infrastructure for data collection, level of analyses, printing, etc. Source: Golin, S., Mitchell, A., & Gault, B. (2004). The price of school readiness: A tool for estimating the cost of universal preschool in the states. Retrieved February 24, 2008.
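To make these drivers concrete, here is a minimal back-of-the-envelope cost sketch in Python. Every figure and variable name is hypothetical, invented purely for illustration; this is not a PACT costing tool.

```python
# Hypothetical annual evaluation cost sketch reflecting the drivers above.
# All dollar amounts and counts are invented for illustration only.

n_participants = 200            # number of participants assessed
cost_per_assessment = 35.00     # standardized measure, per child per administration
assessments_per_year = 2        # frequency of assessment (e.g., fall and spring)
assessor_training = 5000.00     # assessor training & reliability practices
data_infrastructure = 8000.00   # data collection, entry, and basic analyses
reporting = 3000.00             # reporting & communicating results, printing

total = (n_participants * cost_per_assessment * assessments_per_year
         + assessor_training + data_infrastructure + reporting)
print(f"Estimated annual evaluation cost: ${total:,.2f}")  # $30,000.00
```

Changing the evaluation design (more measures, more assessment points, a comparison group) multiplies the per-participant line, which is why design decisions tend to dominate cost.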

From Assessing Capacity to Strategic Action You have conducted a baseline assessment of the current capacity for evaluation… You have considered costs and expertise needed… Now you are ready to: Develop a strategic plan for building capacity for evaluation

Objective 3: Choosing an Evaluation Approach …Wallowing in the Mud!

Considerations for Determining the Scope of Your Evaluation What mandates or expectations for evaluation does your partnership project have? What is the current status of your evaluation capacity, including resources for funding the evaluation? What lessons learned/strengths of the partnership can be used in developing an evaluation approach? What are the challenges or sticky issues that may impact the success of the evaluation? What data do you have for establishing a baseline and tracking outcomes over time and across agencies?

Stages of Evaluation Approaches
Implementation Evaluation: Did you implement the program as planned? If not, why not? What changes were made?
Outcome Evaluation: Does the program achieve intended outcomes? For whom? Did organizational or system structure impact policy, resources, and outcomes?
Research: Do participants do better than non-participants? Is one programmatic approach more effective than another?
Source: Oregon State University Family Policy Program & Oregon Child Care Research Partnership Project. (2000). Results accountability guidebook. Retrieved February 24, 2008.

Match Goal and Purpose to Evaluation Approach The fundamental principle: evaluation approaches should match the purpose and goals of the partners and the initiative. They can be as simple or as complex as needed. The following examples show the range of complexity and rigor that exists in the field of early care and education. What best meets your needs is up to you!

State Approaches to Evaluation Leading the Way to Quality Early Care and Education CD-ROM Literacy and Early Learning/Assessment and Evaluation: Florida discusses evaluation of school readiness initiatives. Ohio discusses the use of a Logic Model approach in evaluating an infant-toddler initiative. California discusses their Desired Results Accountability System for child care and early education services.

California: Desired Results for Children and Families A multi-purpose, multi-year, state-level accountability system to inform instruction, target technical assistance, and monitor trends in publicly funded programs –Developmental observation profiles for children birth to age 14 to inform instruction –Family surveys and program self-assessments to target technical assistance –State-level aggregated data to monitor trends –Conducted in partnership with a university and the training system Source: California Department of Education. (2007). Introduction to desired results. Retrieved March 25, 2008.

Oklahoma's Quality Rating System: Reaching for the Stars A longitudinal study, with multiple phases and purposes, conducted by the Early Childhood Collaborative of Oklahoma and others –Observational study of implementation –Validation study of centers –2003: Outcome study to determine the impact of tiered rates on quality and the relative impact of specific indicators on overall quality –2004: Validation study of family child care homes Source: Norris, D., Dunn, L., & Dykstra, S. (2003). Reaching for the stars center validation study executive summary. Retrieved February 24, 2008.

Maryland's Model for School Readiness A multi-purpose, multi-year, state-level accountability system to inform instruction, target technical assistance, and monitor trends in publicly funded programs. Each fall, all kindergarten teachers assess children using a modified version of the Work Sampling System and report these data to the Department of Education. The Department of Education submits a report based on these and other data to the General Assembly each November on the level of school readiness statewide. The Department of Education, which includes child care, partners with a nonprofit to deliver and assess the training that supports this accountability effort. Source: Maryland State Department of Education. (n.d.). Maryland model for school readiness. Retrieved March 25, 2008.

Ohio Child Care/Head Start Partnership Project This is a research study, funded by the Child Care Bureau, conducted in collaboration with State policymakers. The goal of the partnership project is to provide high-quality, seamless services to families with low incomes and their children. The longitudinal survey research is designed to examine the nature and benefit of partnerships and their impact on outcomes for centers, teachers, and children.

A Systemic View of Child and Family Outcomes in Context

Assessment and Evaluation Lessons from Research and Professional Wisdom from the Field Clips from Child Care Works: Research to Practice, Assessment and Evaluation Module –Involving stakeholders in program evaluation –Developing systems of assessment –Challenges of measuring quality

10 Steps to the Information You Need to Make Good Decisions (and convince others too!) 1. Determine the purpose and scope 2. Agree on results 3. Select measures 4. Establish a baseline and objective 5. Determine and implement strategies aimed at positive change 6. Develop a performance agreement among the groups responsible 7. Collect data 8. Analyze the data 9. Assess progress and modify strategies and resources 10. Publicize results Source: The Finance Project. (2002). Accountability systems: Improving results for young children. Retrieved February 24, 2008.

What to Measure in a Partnership Project? It is important to be clear –Is increased collaboration a GOAL or an outcome in and of itself? AND…OR –Is increased collaboration/resource sharing a STRATEGY to achieve goals? AND…OR –Is effective administration of a project by multiple partners a CONDITION (theory of change) for success?

Short- & Intermediate-Term Objectives The Core Services describe activities that are designed to meet short- and intermediate-term objectives on the way to meeting the long-term goal. Tip/Challenge: As you identify program services, activities, and short- and intermediate-term objectives, you must continually recheck and loop back to be sure that each element is aligned and reasonably links to the long-term goal.

What Is a Theory of Change Logic Model? It is a TOOL to develop a common understanding of –Goals –A vision of how the program will effect change –Program Services –Outcomes It serves as a dynamic process to guide program development, implementation, and evaluation/accountability.

How to Develop a Logic Model Gather key stakeholders' perspectives on: 1. Long-term outcomes 2. Theory of change 3. Program services and activities 4. Short- & intermediate-term outcomes 5. Indicators/evidence of progress in meeting outcomes
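Because a logic model is essentially structured data, it can help to capture it in a form that makes gaps visible. Below is a minimal sketch in Python; the outcomes, services, and indicators are hypothetical examples, not drawn from the PACT materials.

```python
# A theory-of-change logic model captured as plain data.
# All names and entries below are hypothetical illustrations.

logic_model = {
    "long_term_outcome": "Children enter kindergarten ready to learn",
    "theory_of_change": "Improving teacher practice and family engagement "
                        "leads to gains in children's early learning",
    "services": ["On-site coaching for teachers", "Family literacy workshops"],
    "short_term_outcomes": ["Teachers use new instructional strategies"],
    "intermediate_outcomes": ["Classroom quality ratings improve"],
    "indicators": {
        "Teachers use new instructional strategies": "Observation checklist, fall and spring",
        "Classroom quality ratings improve": "Standardized quality rating, yearly",
    },
}

# The "loop back" check from the earlier tip: every outcome needs an indicator.
for outcome in (logic_model["short_term_outcomes"]
                + logic_model["intermediate_outcomes"]):
    assert outcome in logic_model["indicators"], f"No indicator defined for: {outcome}"
```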

Objective 4. Collecting Data and Reporting Findings …Making Mudpies!

Data Collection Identify data currently being collected to determine the fit with the indicators chosen. Review the quality of the data and identify gaps in the data needed to measure progress on the indicators. Start small. It's very easy, and pretty common, to go way overboard on data collection! It will keep you sane, and keep costs reasonable, if you choose a few data sources that have the intent and power to give you the information you need.

Multiple Levels of Data Collection System-Level Data – Data on key system or partnership indicators. Program/Service-Level Data – Implementation data in the first stages and program outcome data in the second stage. Individual-Level Data – Data on adults, children, or families, often from a sample, and best collected over time, with multiple measures.

Collect Powerful Data Data Power –What are the most accurate and reliable data sources available? Proxy Power –Are the indicators clearly within the control of the program, and have they been shown, in previous research, to predict later gains? Communication and Political Power –What outcomes are most important to key stakeholders? Source: Child Care Partnership Project. (2000). Using results to improve the lives of children and families: A guide for public-private partnerships. Washington, DC: Child Care Bureau, Administration for Children and Families, U.S. Department of Health and Human Services.

Decision Points and Options for Data Sources and Analysis
Level A (least intensity, rigor, cost) – Types of data: program records, reports, simple surveys. Types of analysis: report format with simple statistics and perhaps some quotes/anecdotes.
Level B (medium) – Types of data: as above, plus 1 or 2 standardized measures at multiple points in time. Types of analysis: as above, plus limited gain scores or growth (increase/decrease) analyses.
Level C (most) – Types of data: as above, plus focus groups/interviews, 2 or more standardized measures, and/or a control or comparison group. Types of analysis: as above, plus case studies, yearly or multiple-year gain scores, and/or longitudinal analyses.
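As a concrete illustration of the Level B row, the sketch below computes simple gain scores on one standardized measure administered at two points in time. The child IDs and scores are invented for illustration; this is a sketch, not a prescribed analysis method.

```python
# Level B sketch: gain scores on one standardized measure, two time points.
# All IDs and scores are hypothetical.

pre_scores = {"child_01": 42, "child_02": 55, "child_03": 48}
post_scores = {"child_01": 51, "child_02": 60, "child_03": 47}

gains = {cid: post_scores[cid] - pre_scores[cid] for cid in pre_scores}
mean_gain = sum(gains.values()) / len(gains)

print("Individual gains:", gains)    # {'child_01': 9, 'child_02': 5, 'child_03': -1}
print(f"Mean gain: {mean_gain:.1f}")  # 4.3
```

A negative gain (child_03 above) is exactly the kind of finding that, per the slides that follow, has no meaning until partners interpret it in context.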

Measuring Outcomes in Early Care and Education Not all measures used to assess child outcomes predict later outcomes, and they may not be sensitive to young children's dynamic growth or to cultural and linguistic differences. Observational measures of program quality are not applicable to all settings and may not adequately capture the nuances and complexity of quality. Measures of partnership effectiveness, systemic impact, and system integration are sparse, and it is difficult to adequately attribute causality/impact. Choosing measures and methods to document outcomes…is a fine art: balancing what is available, appropriate, and useful!

Findings…Meaning…Action It is all too easy to collect data…but much harder to analyze the findings appropriately, make meaning of them, and use them to take (appropriate) action. Source: Hebbeler, K. (2006, May). Now comes the fun part: Gleaning meaning from early childhood outcome data. Retrieved March 27, 2008.

Findings Findings are the numbers, the scores on measures, the summaries of quarterly reports…which in and of themselves are meaningless! While numbers are not debatable, it is important to include enough information about the numbers (and the context of the initiative) to make them meaningful. "Data add substance to what could otherwise be dismissed as anecdotes, while stories add a personal element to cold numbers on a page" (Using Results to Improve the Lives of Children and Families, p. 7). Hebbeler, 2006

Meaning The interpretation put on the numbers. Is this finding good news? Bad news? News we can't interpret? Meaning is debatable, and reasonable people can reach different conclusions from the same set of numbers. Stakeholder involvement can be helpful in making sense of findings. Meaning is derived from the goals and your theory of change (why you believe you can achieve results). Hebbeler, 2006

Reporting Results: Tell the Story Identify areas where changes may be needed for future implementation. Inform policy and/or funding decisions by telling the "story" of program implementation and demonstrating the impact of the program on participants. Build public awareness and support with legislators, parents, and community members. Choose a report format that is consistent with your program purpose and appeals to the target audience.

Take Powerful Action A key role of the partnership team is communicating results and determining how the evaluation results are used –To improve the program –To get more funding –To build public awareness –To plan next steps in the evaluation approach

In Summary: Building Capacity for Evaluation You have expertise and resources available to assist you. You can take a thoughtful, planned approach to getting the information and data you need. You, and your partners, play a key role in determining the purpose, gathering appropriate resources, providing oversight, and ensuring information is meaningful and useful.

Closing Personal learning plan Quality improvement Session evaluation

Reflections I learned … I relearned … I will apply … I would like to know more about … I am surprised by …

Thank you! Facilitated by the National Child Care Information and Technical Assistance Center Rosehaven Street, Suite 400 Fairfax, VA Phone: Fax: TTY: Web: PACT is an initiative of NCCIC, a service of the Child Care Bureau