Supporting Community Priorities and Emphasizing Rigor: An Approach to Evaluation Capacity Building with Tribal Home Visiting Programs. Kate Lyon, MA; Julie Morales, PhD.

Presentation transcript:

Supporting Community Priorities and Emphasizing Rigor: An Approach to Evaluation Capacity Building with Tribal Home Visiting Programs. Kate Lyon, MA; Julie Morales, PhD; Erin Geary, MA (James Bell Associates, Tribal Home Visiting Evaluation Institute); Aleta Meyer (US Department of Health & Human Services). October 2013.

Overview
- Evaluation priorities: expanding the evidence base; addressing tribal and community questions
- Evaluation requirement and approach
- The PICO process
- What did we learn?

Evidence-Based Policy & MIECHV. MIECHV requires State grantees to implement evidence-based home visiting models. HHS conducted a systematic review of the evidence of effectiveness, known as Home Visiting Evidence of Effectiveness (HomVEE); results are available at http://homvee.acf.hhs.gov. Fourteen models currently meet the "evidence-based" criteria for the State MIECHV program.

Evidence-Based Policy & MIECHV cont. The review identified studies of home visiting programs implemented in Tribal communities, but no models met the criteria for evidence of effectiveness. Tribal grantees therefore can:
- Adapt an evidence-based model designed for the "general population" to a Tribal setting
- Use an evidence-based model developed for Tribal communities (which may still need adaptation to the specific setting)
- Develop their own model
See http://homvee.acf.hhs.gov/Tribal_Report_2012.pdf

Rigorous Evaluation Requirement. All Tribal MIECHV grantees are required to conduct a rigorous evaluation; the emphasis on rigor comes from the legislation. The goal is to inform practice and build the evidence base around effective home visiting interventions with Native populations. Rigorous evaluation activities include:
- Examining the effectiveness of home visiting models in serving Native populations
- Examining the effectiveness of adaptations of evidence-based home visiting models for Tribal communities
- Answering questions about implementation, or the infrastructure necessary to support implementation, of home visiting programs in Tribal communities

Evaluation Approach
- All knowledge will be generated through local evaluations; there is no cross-site evaluation.
- Evaluation questions are developed by grantees in consultation with their communities to reflect local interests and priorities.
- Evaluation questions are informed by needs assessment findings and connected to implementation decisions.
- Tribal ownership of the evaluation process, data, and dissemination is respected; IRB and Tribal approval is required.
- Program planning, development, implementation, and evaluation form an iterative, connected, and circular process.

Evaluation Approach cont.
- Evaluations can be limited in size and scope: a focused question answered with a rigorous design and methods.
- Grantees have the flexibility to focus the evaluation on a component of home visiting.
- Evaluations will inform grantees, communities, and the field about what works in implementing home visiting in Tribal communities.
- Intensive technical assistance is provided to increase Tribal capacity and empowerment to conduct different types of evaluation.
Speaker note: Grantees don't have large budgets, so we encourage them to look at a subset of outcomes or examine a component of the home visiting program, such as the prenatal component or a set of modules, or perhaps a cultural enhancement or an implementation strategy, like training and supervision of home visitors or family recruitment and retention strategies.

PICO: a framework for developing a well-built evaluation question.
- P: The target population you plan to serve
- I: The intervention or program to be evaluated
- C: The comparison you will make to understand how well the program works in your community
- O: The intended outcomes you want to see achieved
PICO was developed by Mark Testa (UNC Chapel Hill) and used extensively with the Children's Bureau's Permanency Innovations Initiative to reduce long-term foster care. Adapted from: Permanency Innovations Initiative Evaluation Team (2011). Logic Model and Theory of Change. Presented at the Children's Bureau's PII Kickoff Meeting, Washington, DC, November 2010.

PICO Question Example. Do Native families with a child under age 1 (P) that receive home visiting (I) demonstrate greater improvements in parent knowledge of child development and child development outcomes (O) than families who receive services as usual (C)? You'll see the PICO question on the posters.
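As an illustration (not part of the original presentation), the four PICO elements can be captured in a small data structure that composes a question in the same form as the example above. All names here are hypothetical; this is a sketch of the framework's structure, not a tool the project provides.

```python
from dataclasses import dataclass

@dataclass
class PicoQuestion:
    """The four elements of a well-built PICO evaluation question."""
    population: str    # P: the target population you plan to serve
    intervention: str  # I: the intervention or program to be evaluated
    comparison: str    # C: the alternative the comparison group experiences
    outcomes: str      # O: the intended outcomes you want to see achieved

    def as_question(self) -> str:
        """Compose the elements into a single evaluation question."""
        return (
            f"Do {self.population} (P) that receive {self.intervention} (I) "
            f"demonstrate greater improvements in {self.outcomes} (O) "
            f"than families who receive {self.comparison} (C)?"
        )

# Hypothetical example mirroring the question on this slide
question = PicoQuestion(
    population="Native families with a child under age 1",
    intervention="home visiting",
    comparison="services as usual",
    outcomes="parent knowledge of child development",
)
print(question.as_question())
```

Writing the elements down separately like this mirrors the PICO discussion process: each field can be revisited and refined on its own before the full question is recomposed.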

PICO Discussion
- Helps to identify model and evaluation priorities.
- Return to the PICO elements multiple times to refine the evaluation question: to identify potential subpopulations, consider potential comparison groups, home in on a discrete intervention component, prioritize outcomes, help articulate the theory of change, and refine hypotheses.

PICO Discussion cont. The discussion brings together program leadership and staff, stakeholders, and evaluators for a collaborative conversation about community needs, program development, and evaluation. PICO builds rigor into that discussion, allowing a participatory process within a prescribed framework.

Population. Who is your target population for home visiting? What are their prioritized needs? The needs assessment informs the choice of target population, moving from the broad (what is your community like?) to the specific (who will you serve?).

Intervention. This element highlights the linkages between the needs of the target population, the program, and the benefits you hope to achieve. What is the theory of change for the program(s) you have selected? What implementation supports need to be in place for a successful program?

Comparison. Walk through different types of comparisons. What alternative course of action will your comparison group experience? What is the strongest contrast that is feasible? The strongest possible contrast helps attribute any observed differences between the groups to your program.
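To make the comparison idea concrete (this sketch is illustrative and not from the original slides), a simple permutation test shows how a contrast between a program group and a comparison group supports attribution: it asks how often a difference as large as the observed one would arise by chance if group membership did not matter. All scores below are invented.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(program, comparison, n_perm=10000, seed=0):
    """Two-sided permutation test for a difference in group means.
    Returns the fraction of random relabelings whose group difference
    is at least as large as the observed one (an approximate p-value)."""
    rng = random.Random(seed)
    observed = abs(mean(program) - mean(comparison))
    pooled = list(program) + list(comparison)
    n = len(program)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:n]) - mean(pooled[n:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical parent-knowledge scores for two groups of families
program_scores = [14, 16, 15, 18, 17, 16, 19, 15]
services_as_usual = [12, 13, 14, 12, 15, 13, 14, 12]
p_value = permutation_test(program_scores, services_as_usual)
```

The sharper the contrast between the two conditions (and the stronger the design behind group assignment), the more confidently a small p-value can be read as a program effect rather than a preexisting difference between the groups.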

Outcomes. What are the short- and long-term outcomes you hope to achieve? Is there evidence that the program will impact those outcomes? Which outcomes are most critical for the community?

Implementation Planning. [Diagram: needs assessment, model selection, implementation planning, and the evaluation question, mapped to the P, I, C, and O elements.] PICO:
- Links the evaluation question to the needs assessment and model selection
- Connects the evaluation question to the intervention and helps the grantee articulate their theory of change
- Helps narrow the focus of the evaluation to a manageable piece
- Builds a rigorous design into the evaluation question
- Facilitates a community-engaged approach to evaluation planning

What Did We Learn?
- Recognizing and respecting the need for community input: allow time for input; ask whether the findings will be meaningful and useful and whether they address community priorities; recognize historical relationships with research and researchers.
- Understanding Tribal context: sample sizes vary widely; capacity and available resources differ; needed supports must be identified; acceptable research designs vary across tribes.

What Did We Learn? cont.
- Finding the balance: translating terms, staying aware of "loaded" words, and asking what "rigor" means.
- Drawing on multiple ways of knowing: Indigenous knowledge, evaluation science, and qualitative and quantitative data.
- Will the findings contribute to the community and to the evidence base?

Questions?

For more information on TEI, contact:
Nicole Denmark, Federal Project Officer, Office of Planning, Research and Evaluation, nicole.denmark@acf.hhs.gov
Kate Lyon, Project Director, James Bell Associates, Inc., lyon@jbassoc.com

The Tribal Home Visiting Evaluation Institute (TEI) is funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, Department of Health and Human Services, under contract number HHSP23320095644WC. TEI is funded to provide technical assistance to Tribal Home Visiting grantees on rigorous evaluation, performance measurement, continuous quality improvement, data systems, and ethical dissemination and translation of evaluation findings. TEI was awarded to James Bell Associates in partnership with the University of Colorado's Centers for American Indian and Alaska Native Health and the Michigan Public Health Institute. TEI1 was awarded to MDRC; James Bell Associates, Inc.; the Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; and the University of Colorado School of Public Health, Centers for American Indian and Alaska Native Health.