The EPISCenter is a project of the Prevention Research Center, College of Health and Human Development, Penn State University, and is funded by the Pennsylvania Commission on Crime and Delinquency and the Pennsylvania Department of Public Welfare.

Similar presentations
Sustainability Planning Pat Simmons Missouri Department of Health and Senior Services.

CW/MH Learning Collaborative First Statewide Leadership Convening Lessons Learned from the Readiness Assessment Tools Lisa Conradi, PsyD Project Co-Investigator.
Program Evaluation and Measurement Janet Myers. Objectives for today… To define and explain concepts and terms used in program evaluation. To understand.
Requires DSHS and HCA to expend state funds on: (1) Juvenile justice programs or programs related to the prevention, treatment, or care of juvenile offenders.
Using Quarterly Report Data OR I’ve Been Collecting Systems Data All Along and Didn’t Know It!
NRCOI March 5th Conference Call
1 Minority SA/HIV Initiative MAI Training SPF Step 3 – Planning Presented By: Tracy Johnson, CSAP’s Central CAPT Janer Hernandez, CSAP’s Northeast CAPT.
PHAB's Approach to Internal and External Evaluation Jessica Kronstadt | Director of Research and Evaluation | November 18, 2014 APHA 2014 Annual Meeting.
From Silos to Systems: Performance Management in Public Health Turning Point Performance Management Collaborative October 2002.
Sustaining Local Public Health and Built Environment Programs Fit Nation NYC November 3, 2011 Annaliese Calhoun.
Competency Assessment Public Health Professional (2012)-
Continuous Quality Improvement (CQI)
Monitoring & Evaluation in World Bank Agriculture and Rural Development Operations SASAR Monitoring & Evaluation Workshop New Delhi; June 20, 2006.
GSU-NACDD-CDC Chronic Disease and Public Health Workforce Training Training Needs Survey and Public Health Certificate in Chronic Disease Training for.
Evidence-Based Programs The benefits, uses, and applicability of data driven programming and community collaboration.
Models for Program Planning in Health Promotion
Edward M. Haugh Jr. ESC Consultant. III. Recommendations for Applying Outcomes Planning to ESC  I. Introduction to Outcomes Planning II. A Sample ESC.
The Future of Continuous Quality Improvement in Wisconsin’s Child Welfare System.
Organization Mission Organizations That Use Evaluative Thinking Will Develop mission statements specific enough to provide a basis for goals and.
Funding Opportunity: Supporting Local Community Health Improvement Sylvia Pirani Director, Office of Public Health Practice New York State Department of.
School-Wide Positive Behavior Support District Planning Louisiana Positive Behavior Support Project.
1 Introduction to Evaluating the Minnesota Demonstration Program Paint Product Stewardship Initiative September 19, 2007 Seattle, WA Matt Keene, Evaluation.
USING DATA TO IMPROVE PROGRAM PERFORMANCE 2013 TIG CONFERENCE Presenters: Michael O’Connor, Prairie State Legal Services (IL) Rachel Perry, Cleveland Legal.
Fundamentals of Evaluation for Public Health Programs ROBERT FOLEY, M.ED. NIHB TRIBAL PUBLIC HEALTH SUMMIT MARCH 31,
Evaluation Assists with allocating resources what is working how things can work better.
Working Definition of Program Evaluation
Using Electronic Portfolios to Assess Learning at IUPUI. Trudy Banta, et. al. Indiana University-Purdue University Indianapolis 2007.
Cathy Burack and Alan Melchior The Center for Youth and Communities The Heller School for Social Policy and Management, Brandeis University Your Program.
Building State Capacity: Tools for Analyzing Transition- Related Policies Paula D. Kohler, Ph.D., Western Michigan University National Secondary Transition.
Georgetown University National Technical Assistance Center for Children’s Mental Health 1.
National Consortium On Deaf-Blindness Families Technical Assistance Information Services and Dissemination Personnel Training State Projects.
Julie R. Morales Butler Institute for Families University of Denver.
Comprehensive Educator Effectiveness: New Guidance and Models Presentation for the Special Education Advisory Committee Virginia Department of Education.
From Output to Outcome: Quantifying Care Management Kelly A. Bruno, MSW and Danielle T. Cameron, MPH National Health Foundation Background Objectives Methods.
Round Table: International Experiences in Research Governance Patricia Pitman June 10, 2008.
V Technical Assistance Center on Social Emotional Intervention (TACSEI)
Maria E. Fernandez, Ph.D. Associate Professor Health Promotion and Behavioral Sciences University of Texas, School of Public Health.
704: Conducting Business in Fiscally Challenging Times: Strategies and Tools to Get There PCYA Leadership Academy Presentation March 28, 2012.
What is HMN? Global partnership founded on the premise that better health information means better decisions and better health Partners reflect wide.
Module IV: Building Organizational Capacity and Community Support Cheri Hayes Consultant to Nebraska Lifespan Respite Statewide Sustainability Workshop.
Needs Assessment Presented By Ernest D. Pérez Capacity Building Assistance Trainer BORDER HEALTH FOUNDATION Tucson, Arizona CAPACITY BUILDING ASSISTANCE.
IMPLEMENTATION QUALITY RESEARCH OF PREVENTION PROGRAMS IN CROATIA MIRANDA NOVAK University of Zagreb, Faculty of Education and Rehabilitation Sciences.
Third Sector Evaluation: Challenges and Opportunities Presentation to the Public Legal Education in Canada National Conference on “Making an Impact” 26.
AAU Undergraduate STEM Education Initiative Tobin Smith AAU Vice President for Policy ISSUES Workshop January 30, 2014.
Office of Special Education Programs U.S. Department of Education GRANT PERFORMANCE REPORT FOR CONTINUATION FUNDING.
Advice on Data Used to Measure Outcomes Friday 20 th March 2009.
Presented at State Kindergarten Entry Assessment (KEA) Conference San Antonio, Texas February, 2012 Comprehensive Assessment in Early Childhood: How Assessments.
Washington’s Education Research & Data Center 26 th Annual Management Information Systems Conference Concurrent Session I-B: Using a Research Center or.
North Etobicoke LIP Summit Woodbine Convention Centre June 28 th, 2011.
Community Planning 101 Disability Preparedness Summit Nebraska Volunteer Service Commission Laurie Barger Sutter November 5, 2007.
Prepared by: Forging a Comprehensive Initiative to Improve Birth Outcomes and Reduce Infant Mortality in [State] Adapted from AMCHP Birth Outcomes Compendium.
Evaluation of Large Initiatives of Scientific Research at the National Institutes of Health Mary Kane Concept Systems Inc. William M. Trochim Cornell University.
Connecticut Part C State Performance Plan Indicator 11 State Systemic Improvement Plan Phase II.
Massachusetts Cancer Prevention Community Research Network (MCPCRN) CPCRN Atlanta Meeting October 15-16, 2009.
Help to develop, improve, and sustain educators’ competence and confidence to implement effective educational practices and supports. Help ensure sustainability.
Participants  n = 77 trainees  Mean Age (SD) = 42 years (11.7)  72% European American, 22% Latino/a, 6% Other  21% Male, 79% Female  Attended one.
 Kim Peters, Prevention Coordinator December 14, 2011.
INTRODUCING THE PSBA-GTO ACT FOR YOUTH CENTER OF EXCELLENCE IN CONSULTATION WITH HEALTHY TEEN NETWORK Planning for Evidence-Based Programming.
Assessment of Your Program Why is it Important? What are the Key Elements?
Comprehensive Youth Services Assessment and Plan February 21, 2014.
RE-AIM Framework. RE-AIM: A Framework for Health Promotion Planning, Implementation and Evaluation Are we reaching the intended audience? Is the program.
A Framework for Evaluating Coalitions Engaged in Collaboration ADRC National Meeting October 2, 2008 Glenn M. Landers.
Louise A. Parker & Laura G. Hill Washington State University October 2004 Scholarship of Outreach Positioning Research as an Asset to Attract Community.
Introduction to the Grant August-September, 2012 Facilitated/Presented by: The Illinois RtI Network is a State Personnel Development Grant (SPDG) project.
The PDA Center is funded by the US Department of Education Office of Special Education Programs Stories from the Field and from our Consumers Building.
Cross State Analyses of Results TELL Survey. New Teacher Center (NTC) worked collaboratively with 11 state coalitions—including governors,
Program Planning for Evidence-based Health Programs.
Monitoring and Evaluation Systems for NARS organizations in Papua New Guinea Day 4. Session 10. Evaluation.
Implementation Guide for Linking Adults to Opportunity
Community Benefit Activities
Presentation transcript:

Organizational Correlates of Local Outcome Assessment
Mary H. Geier & Stephanie A. Bradley
The EPISCenter & The Prevention Research Center, The Pennsylvania State University

The EPISCenter is a project of the Prevention Research Center, College of Health and Human Development, Penn State University, and is funded by the Pennsylvania Commission on Crime and Delinquency and the Pennsylvania Department of Public Welfare as a component of the Resource Center for Evidence-Based Prevention and Intervention Programs and Practices.

Background
- Increasing policy emphasis on evidence-based programs (EBPs) and results-based accountability (Bumbarger et al., 2010).
- The next step in EBP dissemination is to demonstrate impact and assess delivery locally.
- In the context of statewide natural dissemination, agencies assess process and outcomes to:
  - identify areas of program delivery to target for continuous quality improvement, and
  - garner support and buy-in from community stakeholders, an essential component of program sustainability (Rhoades et al., 2012).

Present Study
Within the context of a statewide scale-up of evidence-based programming, including 200+ replications of a menu of EBPs in Pennsylvania since 2001, this study:
- explores the landscape of data use among local EBP providers, and
- examines how capacity and motivation factors are associated with community providers' ability to conduct local outcome assessments.

Method
Analyses are based on data collected in 2011 as part of the Annual Survey of Evidence-based Programs (ASEP; Doyle et al., 2012) for 93 organizations.

Conceptual Model: Capacity and Motivation Factors Hypothesized to Influence Local Outcome Assessment
- Philosophy toward data use and evaluation (4 items, α = .86)
- Organizational capacity to evaluate (4 items, α = .64): knowledge of the data collection and reporting process; skill/expertise to analyze data; internal or external consultant
- Program alignment with agency mission and goals (4 items, α = .81): internal stakeholder buy-in
- Training and technical assistance (TA) factors (3 items, α = .69): logic model proficiency; support surrounding local evaluation
- Active planning for sustainability (4 items, α = .80): fiscal plan; garnering stakeholder support (understanding the program's value)
- External stakeholder buy-in and support for the program (6 items, α = .90): prioritizing, attitude; collaborative board, CTC, etc.

In the conceptual model, these factors are hypothesized to influence whether an organization assesses outcomes and implementation quality, which in turn supports continuous quality improvement and program sustainability.
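For readers unfamiliar with the scale reliability coefficients (α) reported for each factor above, the following is a minimal sketch of how Cronbach's alpha is commonly computed. The item data, scale size, and random seed are hypothetical placeholders, not the ASEP survey data.

```python
# Minimal sketch of Cronbach's alpha for a multi-item scale.
# All data below are simulated placeholders, not the ASEP data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for one scale (e.g., 93 x 4)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative example: 93 respondents answering a 4-item scale
rng = np.random.default_rng(0)
true_score = rng.normal(0, 1, size=(93, 1))          # shared component across items
items = true_score + rng.normal(0, 0.7, size=(93, 4))  # correlated item responses
print(f"alpha = {cronbach_alpha(items):.2f}")
```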
Results

Landscape of Data Use and Reporting
Respondents could indicate that they used data to:
A) meet grant requirements,
B) know if the program is working, or
C) garner financial support (grant writing, presentations to potential funders).

Respondents could indicate that they presented data to:
A) PCCD,
B) a coalition/collaborative board, or
C) community groups/organizations.

Most organizations report using data in three or more ways, and all organizations reported using data to know if a program is working. Most organizations reported sharing data with three audiences; reporting to multiple audiences has been associated with program sustainability (Rhoades et al., 2012).

Are organizations that assess local outcomes different from those that do not?
The probability that an organization conducts program assessments is highest in grant-funded years, although the differences across groups were not significant (χ² = .779, p = .377). Agencies that conduct assessments differ from those that do not on: 1) program alignment with agency mission and goals; 2) active planning for sustainability; and 3) philosophy toward data use and evaluation.
[Figure: comparison of organizations that conducted assessments vs. those that did not; group sizes n = 45, n = 44, n = 29, and n = 38 across panels; effect sizes d = .49 (*) and d = .70 (**); ** p < .01, * p < .05, † p < .11]

Correlations Across Factors Hypothesized to Influence Assessment
Each factor is significantly correlated with at least one other factor, except sustainability planning. (** p < .01, * p < .05, † p < .11)

Future Directions
- Examine factors associated with assessing implementation quality.
- Distinguish between different levels of evaluation quality (e.g., "high" vs. "low") and examine how assessment capacity factors are associated with different levels of quality.
- Conduct longitudinal analyses to determine a) how capacity to evaluate is related to sustainability and continuous quality improvement, and how these relationships change across time; and b) the direction of effect between the program operating level and the choice to assess.
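The group comparisons and factor correlations reported in the Results rely on standard effect-size and correlation computations. The sketch below illustrates both under stated assumptions: the factor names, group sizes, and simulated scores are placeholders for illustration only, not the study data.

```python
# Minimal sketch of the Cohen's d and pairwise-correlation statistics used above.
# All data here are simulated placeholders, not the ASEP study data.
import numpy as np

def cohens_d(group1: np.ndarray, group2: np.ndarray) -> float:
    """Standardized mean difference using a pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * group1.var(ddof=1) + (n2 - 1) * group2.var(ddof=1)) / (n1 + n2 - 2)
    return (group1.mean() - group2.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)

# Hypothetical factor scores for agencies that did vs. did not conduct assessments
conducted = rng.normal(4.0, 1.0, size=45)
did_not = rng.normal(3.4, 1.0, size=44)
print(f"d = {cohens_d(conducted, did_not):.2f}")

# Hypothetical scores for 93 organizations on the six hypothesized factors
factors = ["Philosophy", "Capacity", "Alignment", "Training/TA", "Sustainability", "External buy-in"]
scores = rng.normal(0, 1, size=(93, len(factors)))
corr = np.corrcoef(scores, rowvar=False)          # 6 x 6 pairwise correlation matrix
for name, row in zip(factors, corr):
    print(f"{name:>16}: " + " ".join(f"{r:5.2f}" for r in row))
```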