The Objective Review & Monitoring Process

Presentation transcript:

The Objective Review & Monitoring Process
Presented by Rhonda Stewart, Director of Quality Improvement, Germane Solutions, for DSHS HIV Care Services
August 23, 2018

What is an objective review?
An objective review "is a process that involves the thorough and consistent examination of applications based on an unbiased evaluation of scientific or technical merits or other relevant aspects of the review."
Reviewers do not make assumptions or judgments during a review.
Germane Solutions reviewers follow all External Review Organization processes.

Things to know
Reviewers will not 'dig' for information; it should be very evident where items are located throughout a client file system.
If your charts are disorganized, your outcomes will reflect it, and the exit conference will feel the same way.

Standards & Monitoring
Standards tell you what is being measured.
The Standards incorporate all federal requirements and DSHS policies.
If one component is missing during monitoring, the indicator is not met.

Service Standards https://www.dshs.texas.gov/hivstd/taxonomy/#section6

Monitoring tools https://www.dshs.texas.gov/hivstd/taxonomy/#section6

Why do we monitor?
Monitoring is a program requirement for all federally funded Ryan White programs.
Germane Solutions conducts quality assurance monitoring for the DSHS HIV Care Services Group to determine provider compliance with the Standards.
Providers can then identify quality improvement processes that enhance their compliance with the Standards and ultimately improve the client system of care.

Tips to prepare
Have all items available and ready for the review team before the site visit.
Review the DSHS monitoring tools alongside the respective DSHS service standard to understand the numerators and denominators that determine the outcome for the indicators listed under "Performance Measures" (see the sketch after this slide).
Breathe. Monitoring is a process to help with improvement initiatives.
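The following is a minimal, hypothetical Python sketch of how a numerator/denominator pair produces an indicator outcome; the field names and data are illustrative and are not taken from the actual DSHS monitoring tools.

```python
# Assumption: an indicator outcome is the share of reviewed charts (denominator)
# that contain every required component (numerator). Field names are illustrative.

def indicator_outcome(charts, required_components):
    """Return the percent of charts meeting all required components."""
    denominator = len(charts)  # charts reviewed for this indicator
    numerator = sum(
        1 for chart in charts
        if all(chart.get(c) for c in required_components)  # one missing component = indicator not met
    )
    return 100.0 * numerator / denominator if denominator else 0.0

charts = [
    {"eligibility_doc": True, "third_party_verification": True},
    {"eligibility_doc": True, "third_party_verification": False},
]
print(indicator_outcome(charts, ["eligibility_doc", "third_party_verification"]))  # 50.0
```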

Recommendations for 2019 – Thoughts?
Randomized client lists – send directly to providers (see the sketch below).
Should we send them 2 weeks or 1 month prior? Sending 1 month prior means the rolling review does not cover a full 12-month period.
Reports: include recommendations discussed during the exit, and indicate trends seen (e.g., the number of charts missing third-party verification of eligibility).
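A minimal sketch of drawing a randomized client list, assuming the input is a plain list of unduplicated client IDs; the IDs, seed, and function name are illustrative.

```python
import random

def randomized_client_list(client_ids, n, seed=None):
    """Draw n client IDs at random, without replacement, for chart review."""
    rng = random.Random(seed)    # a fixed seed makes the draw reproducible for the review record
    n = min(n, len(client_ids))  # never request more charts than exist
    return rng.sample(sorted(client_ids), n)

# Illustrative IDs only
ids = ["C001", "C002", "C003", "C004", "C005"]
print(randomized_client_list(ids, 3, seed=2019))
```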

Sample Size for 2019 – Changes
1. Change the sampling methodology for most service categories to a tier system similar to HOPWA's.
Reduces the sampling burden for subrecipient chart pulls.
Still provides a significant outcome result; additional charts can be pulled when a trend cannot be established with the minimum number of files.
Follows HRSA guidance.

Proposed sample size guidelines, by unduplicated client ID count per service category:
1–24 clients: 100% of files
25–50 clients: 25 randomized files
51–100 clients: 30 randomized files
More than 100 clients: 40 randomized files
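A minimal sketch of the proposed tier lookup, using the thresholds from the guidelines above; the function name is illustrative.

```python
def proposed_sample_size(unduplicated_clients):
    """Return how many charts to pull under the proposed 2019 tier guidelines."""
    if unduplicated_clients <= 24:
        return unduplicated_clients  # 1-24 clients: review 100% of files
    if unduplicated_clients <= 50:
        return 25                    # 25-50 clients: 25 randomized files
    if unduplicated_clients <= 100:
        return 30                    # 51-100 clients: 30 randomized files
    return 40                        # more than 100 clients: 40 randomized files

for n in (10, 40, 75, 250):
    print(n, "->", proposed_sample_size(n))
```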

Sample Size 2019 – What Stays the Same
2. Continue to use the confidence level/interval sampling of 80% +/- 8 for the following: Universal Standards, OAHS, Oral Health, MCM and NMCM (a sample-size sketch follows below).
These are the most funded service categories.
This sampling ensures outcomes are statistically reflective of the population.
These services are closely linked to viral load.
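A minimal sample-size sketch for the 80% +/- 8 approach, assuming Cochran's formula with a finite population correction; the slides do not state which formula the review tool actually uses, so treat this only as an approximation of the idea.

```python
import math

def sample_size_80_8(population, z=1.2816, margin=0.08, p=0.5):
    """Charts needed for roughly 80% confidence and a +/-8 point interval,
    using the worst-case proportion p = 0.5. Cochran's formula with a finite
    population correction is an assumption, not the documented DSHS method."""
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population sample size (~65)
    n = n0 / (1 + (n0 - 1) / population)         # finite population correction
    return math.ceil(n)

for clients in (50, 150, 500):
    print(clients, "->", sample_size_80_8(clients))
```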

Monitoring Database Access and 2019 Rollout

Database access
Report viewing privileges.
Username and password: for AA use only at this time.
Training and TA on how to run reports.
AAs will be able to see all data, including other regions.

2019 Schedule Rollout
On-site training and TA to enter monitoring data directly into the database.
Training rollout is ongoing for calendar year 2019; the intent is to develop a training calendar by the end of 2018.
Very tentative: first rollout between January and July for BVCOG, STDC, TRG, and Lubbock; second rollout between August and November for Bexar, Dallas, and Tarrant, with Austin and Houston included if interested.
AAs must continue to use the Excel monitoring tools until their monitors are trained to use the database for 2019 reviews.
This is an implementation year: anticipate issues and practice flexibility.