Reduce Waiting & No-Shows  Increase Admissions & Continuation: Lessons Learned (www.NIATx.net)

Presentation transcript:

Lessons Learned for Creating a Process Improvement Performance Management System
Jay Ford, Ph.D., Director of Research, NIATx
March 2010

"Some is not a number, soon is not a time." -- Don Berwick, MD

Institute of Medicine Report
Promote patient-centered care
Foster the adoption of evidence-based practices
Develop and use process and outcome measures to enhance quality of care
Mandate the use of quality improvement measures

Improving the quality of care requires
Identifying problem areas accurately
Implementing creative interventions
Understanding customer data needs

Design Data Systems That
Capture and track essential measures of performance and quality
Teach staff how to collect, analyze, and learn from data
Develop systems to support organizational data needs

Is This Your Data System?

Inquiring Minds Want to Know
What factors were related to successful adoption of process-focused data?
What barriers were encountered in developing data expertise and focus?
What attributes do we need to think about when developing a data system?

PIPM (Process Improvement Performance Management) Hierarchy of Needs

How Much Data to Collect?

Key Lessons Learned: Data Collection
Key process improvement variables are often not available (e.g., date of first contact) or not adequately captured (e.g., no-shows) within existing systems.
Conduct a data walk-through of your system to assess its capabilities:
–Identify the PI data elements currently available
–Flowchart the provider submission process
–Evaluate the data submission instructions
–Pilot test the process with a small sample of records
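The pilot-test step above can be sketched as a small completeness check over a sample of records. The field names below (`first_contact_date`, `no_show`) are hypothetical, chosen to mirror the process-improvement variables the slide says existing systems often miss.

```python
# Minimal sketch of a pilot data walk-through: count, per field, how many
# records in a small sample are missing the process-improvement variables.
# Field names are illustrative, not any state system's actual schema.

REQUIRED_FIELDS = ["client_id", "first_contact_date", "admission_date", "no_show"]

def walk_through(records):
    """Return a per-field count of missing values in the sample."""
    missing = {field: 0 for field in REQUIRED_FIELDS}
    for record in records:
        for field in REQUIRED_FIELDS:
            if record.get(field) in (None, ""):
                missing[field] += 1
    return missing

sample = [
    {"client_id": 1, "first_contact_date": "2010-01-04",
     "admission_date": "2010-01-12", "no_show": 0},
    {"client_id": 2, "first_contact_date": "",
     "admission_date": "2010-01-15", "no_show": None},
]
print(walk_through(sample))
# {'client_id': 0, 'first_contact_date': 1, 'admission_date': 0, 'no_show': 1}
```

Running a check like this against a few dozen records makes the "missing gaps" and "errors" questions on the next slide concrete before any full-scale extraction is attempted.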

Data Walk-through Questions
Could the data easily be pulled from the state system? What barriers were encountered?
How complete and accurate were the data? Were there significant gaps? Did you notice any errors?
Write up and share the lessons learned with key stakeholders.

Is This Data Quality?

Key Lessons Learned: Data Quality
Establish a process for verifying and checking data accuracy. Failure to verify data entry for accuracy will limit the validity of performance management feedback reports related to process improvement.
Approaches to ensuring data integrity include:
–Automatic linkages (e.g., Washington)
–Built-in quality checks (e.g., Ohio and Maine)
–Feedback mechanisms (e.g., New York, South Carolina, and Oklahoma), and
–Ongoing training and technical assistance
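A built-in quality check of the kind listed above might, for example, reject records whose dates are logically inconsistent before they reach the feedback reports. The rules below are illustrative only, not any state's actual edit checks.

```python
# Illustrative built-in quality checks: flag records that fail simple
# consistency rules before they enter the performance data warehouse.
# The rules and field names are hypothetical examples.
from datetime import date

def quality_errors(record):
    """Return a list of rule violations for one record."""
    errors = []
    first_contact = record.get("first_contact_date")
    admission = record.get("admission_date")
    if first_contact is None:
        errors.append("missing first_contact_date")
    if admission is None:
        errors.append("missing admission_date")
    if first_contact and admission and admission < first_contact:
        errors.append("admission precedes first contact")
    return errors

bad = {"first_contact_date": date(2010, 3, 1), "admission_date": date(2010, 2, 1)}
good = {"first_contact_date": date(2010, 2, 1), "admission_date": date(2010, 3, 1)}
```

Checks like these are cheap to run at submission time, which is exactly where the slide's feedback mechanisms (returning errors to the provider's data coordinator) pay off.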

Examples of Ongoing Training and Technical Assistance
Oklahoma created a Data Integrity Review Team (DIRT) to provide on-site review and technical assistance on all data issues for any provider.
Maine created a change team to monitor data and performance of the contracted agencies and developed FAQs.
New York developed a series of data entry and report analysis training modules for the STAR-QI system.
Ohio offers technical assistance and follow-up through site visits, telephone calls, or conferences.

What Type of Feedback?

Comparative Feedback
Organizational performance versus:
–a target (internal) or
–a benchmark (external)
Types of comparisons:
–Internal comparisons over time
–External performance comparisons to other similar organizations
–External performance comparisons to other agencies within a state

Comparative Feedback
Measurement comparisons:
–Performance vs. outcomes
–Business process vs. treatment performance/outcomes
Importance of comparisons
Types of feedback reports:
–Data quality
–Performance reports
–Pay for performance

Comparative Feedback
Understand the whole picture.
Select a few key outcome measures.
Use reports to guide questions.
Benchmarks vs. targets: focus on the comparison (internal vs. external).
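The internal-versus-external distinction above can be sketched as a single calculation: compare an agency's current value of a measure against its own earlier baseline (internal target) and against a peer-group benchmark (external). All figures below are made up for illustration.

```python
# Sketch: internal vs. external comparative feedback for one measure
# (e.g., days from first contact to admission). Numbers are hypothetical.

def compare(current, baseline, benchmark):
    """Report change vs. own baseline (internal) and gap vs. peers (external)."""
    return {
        # Negative means the agency improved on its own earlier performance.
        "internal_change_pct": round(100.0 * (current - baseline) / baseline, 1),
        # Positive means the agency is still slower than the peer benchmark.
        "external_gap_days": round(current - benchmark, 1),
    }

report = compare(current=12.0, baseline=20.0, benchmark=10.0)
print(report)  # {'internal_change_pct': -40.0, 'external_gap_days': 2.0}
```

Reporting both numbers side by side keeps the "whole picture" in view: an agency can show a 40% internal improvement while still trailing its peers.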

Key Lessons Learned: Performance Management
Do not skimp on data quality efforts.
Ensure access for all persons who need the reports.
Create performance feedback loops that include, not isolate, the provider data coordinators.
Provide only reports that help providers use data effectively to make decisions.
Use pictures or graphs, but remember: one graph, one message.
Update reports over time as data are corrected.

State Examples
New York generates data warehouse reports by provider or in the aggregate.
Ohio links STAR-SI performance measures to the departmental Performance Target Outline (PTO).
South Carolina facilitates provider comparisons by preparing and disseminating monthly comparative reports.
Maine provides public access to the TDS reports and allows agencies to access the secure system and request specialized reports.
Oklahoma provides feedback through the Integrated Client Information System (ICIS), allowing monthly access to feedback reports.


Key Lessons Learned: Pay for Performance
Building the system
Pilot testing
Offering the right type of incentive
Overcoming potential obstacles
Implementing strategies for long-term success and sustainability

For further information, please visit www.NIATx.net.
Contact Information: Jay Ford, PhD; Mark Reynolds