Creating National Performance Indicators that are Relevant to Stakeholders: Participatory Methods for Indicator Development, Selection, and Refinement.


Creating National Performance Indicators that are Relevant to Stakeholders: Participatory Methods for Indicator Development, Selection, and Refinement
Demia L. Sundra, M.P.H., Margaret Gwaltney, M.B.A., Lynda A. Anderson, Ph.D., Ross Brownson, Ph.D., Jennifer Scherer, Ph.D.
Evaluation 2004, November 3, 2004

Contributors to Project DEFINE [Developing an Evaluation Framework: Insuring National Excellence]
Evaluation Advisory Group: Ross Brownson, Ph.D. (co-chair); Alan Cross, M.D.; Robert Goodman, Ph.D., M.P.H.; Richard Mack, Ph.D.; Randy Schwartz, M.S.P.H.; Tom Sims, M.A.; Avra Warsofsky; Carol White, M.P.H.
CDC: Lynda A. Anderson, Ph.D.; Robert Hancock; Demia Sundra, M.P.H.
COSMOS Corporation: Jennifer Scherer, Ph.D.; Margaret Gwaltney, M.B.A. (currently with Abt Associates); Thérèse van Houten, D.S.W.; Cynthia Carter
Concept Systems, Inc.: Dan McLinden, Ed.D.; Mary Kane

Goals of Presentation
Describe the context in which the performance indicators for the CDC's Prevention Research Centers (PRC) program were developed
Review the participatory methodology used to develop and select national indicators
Discuss the benefits, challenges, and lessons learned from developing performance indicators in an established, diverse, national program

Background and Context

Features of CDC's Prevention Research Centers Program
33 academic-based extramural research centers across the United States
Academic centers partner with community and state organizations to develop, conduct, and disseminate prevention research through participatory methods
Diversity across centers:
  When founded (newly funded to 18 years)
  Community setting and partners
  Focus of research

Context for Developing a National Evaluation
National evaluation planning project initiated in response to:
  Institute of Medicine (IOM) report on the PRC program
  Support for evaluation at CDC
  Growth of the program and the related need for accountability
Project DEFINE goals (planning phase):
  Engage stakeholders, develop a logic model and performance indicators, and draft an evaluation plan
  Maintain a participatory and utilization-focused approach throughout

Intended Purposes of Performance Indicators
Individual data on each PRC:
  Evaluation
  Monitoring
  Technical assistance needs
Cross-center summary data:
  Accountability
  Program improvement
  Information sharing and communication with internal and external stakeholders

Anticipated Challenges in Developing PRC Indicators
Centers strive to achieve diverse health outcomes
The program had few previous cross-center requirements
Centers are at various stages of growth and maturity
Interests of diverse stakeholders had to be considered
Concern existed about how performance indicators would be used
Indicators had to be meaningful and impose minimal burden on PRCs in terms of time and cost

Methodology

Basis of Project DEFINE: Concept Mapping
Gained national and community perspectives on the PRC program through a two-tiered approach
Engaged diverse stakeholders in brainstorming statements describing the PRC program
Statements were analyzed to create visual maps of concepts
Concepts were used to build draft logic models
Community and national logic models were combined
Concept map clusters: Engage the Community; Diversity & Sensitivity; Relationships & Recognition; Active Dissemination; Technical Assistance; Training; Research Methods; Research Agenda; Core Expertise & Resources
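The slide describes the analysis only briefly. In Trochim-style concept mapping, which this approach resembles, participants' sorts of the brainstormed statements are combined into a co-occurrence matrix, projected into two dimensions with multidimensional scaling, and grouped with hierarchical cluster analysis to produce the visual map. A minimal, illustrative sketch in Python; the sort data and cluster count below are invented for illustration and are not from Project DEFINE:

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

# Each participant sorts the brainstormed statements into piles;
# a sort is a list of piles, each pile a list of statement indices.
sorts = [
    [[0, 1, 2], [3, 4], [5]],   # participant 1 (illustrative)
    [[0, 1], [2, 3, 4], [5]],   # participant 2 (illustrative)
]
n_statements = 6

# Co-occurrence: how often two statements were placed in the same pile.
co = np.zeros((n_statements, n_statements))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co[i, j] += 1

# Convert similarity to dissimilarity and project to 2-D with MDS.
dissimilarity = co.max() - co
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

# Group statements into concept clusters with hierarchical clustering
# on the MDS coordinates.
clusters = fcluster(linkage(coords, method="ward"), t=3, criterion="maxclust")
print(clusters)  # cluster label for each statement
```

In practice the cluster labels (e.g., "Engage the Community") are assigned by stakeholders interpreting the map, not by the algorithm.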

Development of Draft Indicators
More than 70 indicators were first drafted
Mapped to all components of the program logic model
Some indicators were dropped, others refined, based on input received at regional meetings and PRC contextual visits
52 candidate indicators remained on the list

Stakeholder Recommendations from Regional Meetings
Select a limited number of indicators focused on features common across centers
Collect data on some components of the logic model in other ways as part of the national evaluation, rather than through indicators
Develop indicators through an iterative process, with multiple opportunities for input
Link the performance indicators to the PRC Information System

Stakeholder Selection of the National Performance Indicators
52 indicators were listed in a structured feedback tool (workbook)
All stakeholder groups provided feedback and comments: PRCs, community, state, and CDC
Planned on having core and optional indicators

Results of Performance Indicator Feedback
100% response rate received on workbooks
Comments from workbooks summarized within each stakeholder group
3 of the 4 stakeholder groups recommended 8 indicators
2 of the 4 stakeholder groups recommended an additional 11 indicators

Resulting National Performance Indicators
Evaluation advisory group selected and refined 13 indicators based on:
  Results and feedback from the workbook
  Map of indicators across the logic model
  Cross-walk of recommended indicators with IOM report recommendations
Indicators correspond to various logic model components, e.g.:
  Community input on selecting health priorities (input)
  Existence of an explicit research agenda (activity)
  Evidence of peer-reviewed publications (output)

Collecting the Information
Performance indicators were integrated into the information system that was in development:
  Conceptualized from the beginning
  Reinforced through stakeholder feedback
Fields were created in the information system for each performance indicator
Information system was developed and reviewed by:
  Evaluation contractors
  Centers and partners (usability and pilot tests)
  CDC staff
  Evaluation Advisory Group
  PRC Steering Committee
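The presentation does not show the PRC Information System's actual data model, but as a purely illustrative sketch, a per-indicator reporting field could be represented as below; every name and value here is hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class LogicModelComponent(Enum):
    INPUT = "input"
    ACTIVITY = "activity"
    OUTPUT = "output"

@dataclass
class IndicatorRecord:
    """One reporting field for a single performance indicator (hypothetical schema)."""
    indicator_id: str               # e.g. "PI-01" (hypothetical)
    description: str                # text of the indicator
    component: LogicModelComponent  # where it sits in the logic model
    reporting_period: str           # e.g. "FY2005"
    value: str                      # narrative or numeric entry from a center

# Example entry mirroring an indicator named on the previous slides
record = IndicatorRecord(
    indicator_id="PI-01",
    description="Community input on selecting health priorities",
    component=LogicModelComponent.INPUT,
    reporting_period="FY2005",
    value="Community committee reviewed and ranked health priorities",
)
print(record.indicator_id, record.component.value)
```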

Core and Optional Indicators
Only core indicators were developed through Project DEFINE. Consensus allowed us to:
  Focus on 13 indicators
  Not use resources for optional indicator development and integration into the information system
11 of 28 PRCs developed center-specific indicators on their own
  Topic areas included community satisfaction with partnership, funding generated, web site hits, infrastructure measures, and research methods appropriate for minority populations

PRC Performance Indicators: Summary
Specific component requirements across all grantees
Indicators reflect both process and outcome measures, with a focus on process
Initial requirements as part of the new funding cycle
Prospective evaluation
Assess general information across PRCs rather than specific health topics
Defining common outcomes, e.g., community capacity
Indicators will be refined during evaluation implementation

Challenges, Benefits, and Lessons Learned

Current Challenges with Performance Indicators
Requests for more guidance on how to further define the indicators and collect data
Development of summary reports and provision of feedback to all stakeholders
Need to increase specificity of indicators over time
Balance between participatory processes and program requirements

Benefits of Participatory Approach for Performance Indicator Development
Buy-in, support, and ownership of indicators
Evaluation advisory group was critical for trust and support from larger stakeholder groups
Community voice is reflected in indicators
Perspective of the PRCs' staff and partners reflected in utility and feasibility issues surrounding indicators and the information system

Lessons Learned and Recommendations
Utilize participatory methods for selecting and refining indicators to increase stakeholder support
Build sufficient time into the schedule to allow multiple opportunities for stakeholder input
Acknowledge the inherent challenge in developing indicators for an established program
Include community input in indicator development to increase accountability to partners

For more information on the Prevention Research Centers Program
Click on "About the Program" to view the Conceptual Framework (logic model) and narrative
Contact information: Demia Sundra:

PRC IS: General Information Page

PRC IS: Community Committees

PRC IS: Health Priorities