Navy Data Quality Management Control Program (DQMCP) DQMCP Conference Navy Breakout.


Navy Data Quality Management Control Program (DQMCP) DQMCP Conference Navy Breakout

Dilbert on Data Quality…

Agenda
DQMCP Components
Commander’s Statement
Navy DQMCP Roles and Responsibilities
DQMC Process Flow and Deadlines

MTF DQMCP Components
Critical MTF staff: Commanding Officer / ESC, Data Quality Manager, Data Quality Assurance Team
DQMC Review List: internal tool to identify and correct financial / clinical workload data and processes
Monthly DQMC Commander’s Statement: forwarded through the MTF Regional Command to BUMED and TMA

DQMCP MTF Teams
Meets regularly with the DQMC Manager
Acts as subject matter experts
Identifies / resolves internal DQMC issues
Team membership (minimum):
– MEPRS
– Coding / PAD / Medical Records
– CHCS, AHLTA, and ADM experts
– Physician / Provider Champion
– Executive Link
– Business Analysts

DQMCP Review List
Organizational Factors: leadership commitment and DQMC structure
Data Input: timely and accurate
Data Output: ensure accurate, complete and timely data
Security: IA, access breach
System Design and Training: system administrator IDs, IT business processes

Navy DQMCP Roles and Responsibilities
BUMED: program management, oversight, policy and strategies.
MTFs: DQMCP execution, Review List, Commander’s Statement, CO briefs, and communication of issues to regional representatives.
NMSC: systems execution, website maintenance / development, and DQMCP support.
REGIONS: regional consolidation of Commander’s Statements, DQMCP coordination, issue resolution, audits and training.

DQMCP Points of Contact
BUMED, NMSC, NAVMISSA, NME, NCA, NMW

NAVMISSA Consolidated Call Center
“Who do I call?”
– Toll Free: NAVY (6289)
– Commercial:
“How do I know the status of my problem?”
– Reports of broken functions in existing products are closely monitored by the NMSC DQMC Program Manager.
– Weekly status reports are posted to the DM SharePoint site for visibility.
– Your problem is not considered solved until you say it is solved.
“What if I have a new need or a good idea?”
– MTFs are encouraged to provide proposed requirements or improvement ideas to their Regional DQ Manager.
– BUMED and Regional DQ Managers will vote on and prioritize items based on available resources.

Recurring DQMCP Tasks
Daily: SADR Transmission; End of Day (EOD) Coding Compliance
Monthly: SIDR Transmission; WWR Transmission; Appt. File Transmission; DRG File Transmission; EAS File Transmission; EAS / Financial Reconciliation; DMHRSi Timecards 100% Completed; MEWACS Review; Coding Audits; DQMCP Review List; Commander’s Statement
Annually: Coding Table Updates; DMIS ID Table Updates; EAS Table Updates; MEPRS Code Changes

DQMCP System Process Flow

Reporting Timeframes for DQMCP
* Timeframes may be updated as the year progresses; be sure to obtain the most current version from the BUMED Financial Guidance Portal.

Commander’s Statement Overview
11 questions, 37 individual elements
Submitted monthly to BUMED via the Regional Commands (and forwarded to TMA by BUMED)
Reviewed and signed by the Commanding Officer
The month reported on the statement is two months behind the current month (March’s submission covers January data)
When a system-wide issue prevents completing an element on the eDQ, BUMED will provide a standard response for the MTFs to use.

Commander’s Statement Overview (continued)
For any question where the difference between an MTF’s submission and the automatic eDQ calculation exceeds 2%, a NAVMISSA Trouble Ticket number (and the source for the local number) must be included in the comments section.
MTFs are required to provide comments, an MHS Trouble Ticket, and a POA&M for the actions being taken to resolve non-compliant (<80%) metrics and metrics that have decreased significantly (10% or more) from the prior month.

1a: EOD, Every Appointment, Every Day (Commander’s Statement – End of Day)
Methodology: two timeframes apply. Clinics with normal hours complete EOD by midnight; 24/7 clinics complete EOD by 0600 the next calendar day.
1a = # of appointments closed by midnight (or 0600) / # of appointments
The metric is dependent on the receipt of each site’s DQMC Appointment Audit File.
Includes: MEPRS codes B*** and FBN*; appointment status KEPT, WALK-IN or SICK CALL
Excludes: appointment status T-CON, CANCELLED, ADMIN or LWOBS; appointments not within the reporting month
*Auto-populated by the NAVMISSA eDQ*
*Local data should be calculated using the BUMED-approved CHCS ad hoc*
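The two EOD deadline regimes above can be sketched as a small check. This is a minimal illustration with hypothetical record fields; the real metric is auto-populated by the eDQ from the DQMC Appointment Audit File.

```python
from datetime import date, datetime, timedelta

def eod_deadline(appt_date, always_open):
    """Midnight at the end of the appointment day for normal-hours clinics,
    0600 the next calendar day for 24/7 clinics."""
    midnight = datetime.combine(appt_date, datetime.min.time()) + timedelta(days=1)
    return midnight + timedelta(hours=6) if always_open else midnight

def eod_rate(records):
    """records: (appt_date, closed_at, always_open) tuples; returns 1a as a %."""
    on_time = sum(closed <= eod_deadline(d, open24) for d, closed, open24 in records)
    return 100.0 * on_time / len(records)
```

A site would feed this only KEPT / WALK-IN / SICK CALL appointments in B*** and FBN* MEPRS codes, per the inclusion rules above.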

1a: Historical EOD (Commander’s Statement – End of Day)
*Note that the vertical axis on the historical charts is adjusted to better display trends*

1a: EOD Calculation Accuracy (Commander’s Statement – End of Day)
The NOLA transition impacted file receipts at NAVMISSA, inflating the calculation difference (blue data points are adjusted to reflect the actual data received).
Chart annotations: NMSC and NAVMISSA Tiger Team on-site; BUMED 2% goal

2a: SADRs Coded in 3 Business Days (Commander’s Statement – Coding Timeliness)
Methodology: compliance is determined by the number of business days between the appointment date and the date the SADR is transmitted.
2a = # of SADRs coded within 3 business days / total SADRs
Includes: MEPRS codes B*** and FBN*
Excludes: APVs; SADR appointment status CANCELLED, LWOBS, ADMIN or TCON; weekends and federal holidays
*Auto-populated by the NAVMISSA eDQ*
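The business-day window can be sketched as follows. This is an illustrative sketch, assuming a caller-supplied federal holiday set; the eDQ’s exact day-counting convention is an assumption here.

```python
from datetime import date, timedelta

def business_days_between(start, end, holidays=frozenset()):
    """Business days after `start` up to and including `end`,
    skipping weekends and any dates in `holidays`."""
    count, d = 0, start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5 and d not in holidays:
            count += 1
    return count

def sadr_timely(appt_date, transmit_date, holidays=frozenset()):
    # 2a: compliant when the SADR is transmitted within 3 business days
    return business_days_between(appt_date, transmit_date, holidays) <= 3
```

For example, a Friday appointment transmitted the following Wednesday spans only three business days, since the weekend is skipped.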

2a: Historical SADR Coding (Commander’s Statement – Coding Timeliness)

2b: APVs Coded in 15 Calendar Days (Commander’s Statement – Coding Timeliness)
Methodology: compliance is determined by the number of calendar days between the APV date and the date the SADR is transmitted.
2b = # of APVs coded within 15 calendar days / total APVs
Includes: MEPRS codes B**5, B**6 and B**7. Note: the APV flag is not currently used because it is not consistently populated.
Excludes: all other MEPRS codes; SADR appointment status CANCELLED, LWOBS, ADMIN or TCON
*Auto-populated by the NAVMISSA eDQ*

2b: Historical APV Coding (Commander’s Statement – Coding Timeliness)

2a and 2b: SADR Compliance Calculation Accuracy (Commander’s Statement – Coding Timeliness)
The NOLA transition impacted file receipts at NAVMISSA; sites with transmission issues were removed from the metrics on this slide (retransmitting affects the SADR extract date).
Tiger Team impact on the calculations was minimal, as only a small percentage of SADRs fall on the border of compliance where a methodology change would have an influence. Since January 2009 (FM4), sites have steadily become more accepting of the eDQ calculation as they are educated on how to calculate the two metrics. The elimination of TCONs from the FY10 metric has also reduced local variation.
Chart annotation: NMSC and NAVMISSA Tiger Team on-site

2c: SIDRs Coded in 30 Calendar Days (Commander’s Statement – Coding Timeliness)
Methodology: compliance is determined by the number of calendar days between the disposition date (“E” records) and the date the SIDR is coded (“D” records). The date coded is the DRG assignment date transmitted to NAVMISSA in the DRG file.
2c = # of SIDRs coded within 30 calendar days / total SIDRs
Includes: all “D” and “E” SIDRs
Excludes: SIDR files received after the 15th-of-the-month freeze; any “F” or “C” SIDRs; Resource Sharing and VA workload
*Auto-populated by the NAVMISSA eDQ*
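Unlike 2a, the 2c window counts plain calendar days. A minimal sketch, with illustrative field names:

```python
from datetime import date

def sidr_coded_timely(disposition_date, drg_assigned_date, limit_days=30):
    """2c: a SIDR is compliant when the DRG assignment date falls within
    `limit_days` calendar days of the disposition ("E" record) date."""
    return (drg_assigned_date - disposition_date).days <= limit_days

def sidr_rate(pairs):
    """pairs: (disposition_date, drg_assigned_date) tuples; returns 2c as a %."""
    return 100.0 * sum(sidr_coded_timely(d, c) for d, c in pairs) / len(pairs)
```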

2c: Historical SIDR Coding (Commander’s Statement – Coding Timeliness)

3a: MEPRS/EAS Financial Reconciliation; 3b: MEWACS Reviewed and Anomalies Explained (Commander’s Statement – MEPRS Reconciliation)
Methodology: both questions are answered “Yes” or “No” by each MTF.
3a: the financial reconciliation must be completed, validated and approved prior to the monthly MEPRS transmission. BUMED policy is to answer “Yes”, since this process is performed by BUMED.
3b: MTFs must review the current MEWACS version, whether or not it matches the reporting month (this question should always be “Yes”).
Includes / Excludes: not applicable

3a: Historical MEPRS/EAS Financial Reconciliation; 3b: Historical MEWACS Review (Commander’s Statement – MEPRS Reconciliation)

3c: DMHRSi Timecards Submitted by Suspense; 3d: DMHRSi Timecards Approved by Suspense (Commander’s Statement – MEPRS Reconciliation)
Methodology: both values are provided by BUMED.
3c = timecards “Submitted”, “Rejected” or “Approved” / total timecards on the BUMED DMHRSi interim report date
3d = timecards “Approved” / total timecards on the BUMED DMHRSi final report date
Includes: not applicable
Excludes: 3c does not include “Not Submitted” or “Working” timecards

3c: Historical DMHRSi Timecard Submission; 3d: Historical DMHRSi Timecard Approval (Commander’s Statement – MEPRS Reconciliation)

4a: MEPRS/EAS in 45 Calendar Days; 4b: SIDR by 4th Calendar Day; 4c: WWR by 4th Calendar Day (Commander’s Statement – Data Transmission)
Methodology: all three measures are “Yes” or “No” and are calculated based on the day the files were successfully transmitted to NAVMISSA, not when the transmissions were attempted. If 4a is “No”, questions 8c and 8d should use local WAM data.
Note: for 4b and 4c, compliance is measured by the 5th business day and 10th calendar day for TMA reporting purposes.
Includes: MEPRS/EAS, one file per parent DMIS; SIDR / WWR, the number of files expected is MTF-dependent
Excludes: re-submissions (updated data) do not count against this metric
*Auto-populated by the NAVMISSA eDQ*

4a: Historical MEPRS/EAS; 4b: Historical SIDR; 4c: Historical WWR (Commander’s Statement – Data Transmission)

4d: SADR Transmitted Daily (Commander’s Statement – Data Transmission)
Methodology: SADR transmissions are reported as a percentage, since the SADR is the only file transmitted multiple times in a month. Every DMIS (parent and child) should have a SADR file transmitted each day, even if the file is empty. Logic for sites (especially overseas) is based on time zones and CHCS ETU settings.
Includes: all Navy DMIS IDs
Excludes: not applicable
*Auto-populated by the NAVMISSA eDQ*

4d: Historical SADR (Commander’s Statement – Data Transmission)

5a: DRG Accuracy (Commander’s Statement – Inpatient Coding Audit)
Methodology: 5a = # of correct DRG codes / total # of DRG codes
Includes: 30 inpatient dispositions per reporting month (or 100% if fewer than 30 dispositions)
Excludes: Resource Sharing and VA facilities report “N/A” for this metric. MTFs without any inpatient services or external partnerships report “N/A” for the entire 5-series (5a-5f).

5a: Historical DRG Accuracy (Commander’s Statement – Inpatient Coding Audit)

5b: Inpatient Professional Services (IPS) Rounds E&M Accuracy; 5c: IPS Rounds ICD9 Accuracy; 5d: IPS Rounds CPT Accuracy (Commander’s Statement – Inpatient Coding Audit)
Methodology:
5b = # of correct E&M codes / total # of E&M codes documented and expected
5c = # of correct ICD9 codes / total # of ICD9 codes documented and expected
5d = # of correct CPT codes / total # of CPT codes documented and expected
Note: the denominator is not the number of IPS rounds audited.
Includes: one calendar day of attending professional services during each audited hospitalization (from 5a) is randomly selected. For admissions longer than one day, odd registration numbers have the first day audited, even numbers the second day.
Excludes: MTFs without any inpatient services or external partnerships report “N/A” for the entire 5-series (5a-5f).
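The denominator rule (total codes, not records) and the odd/even day selection can be sketched as follows; the record layout is hypothetical.

```python
def coding_accuracy(audited):
    """audited: (correct, expected) code counts per audited record.
    The denominator is the total number of codes documented and
    expected across all records, not the record count."""
    correct = sum(c for c, _ in audited)
    expected = sum(e for _, e in audited)
    return 100.0 * correct / expected

def audit_day(registration_number):
    # Admissions longer than one day: odd registration numbers get the
    # first day audited, even numbers the second day.
    return 1 if registration_number % 2 else 2
```

Note how a record with many codes weighs more than one with few, which is exactly why the record count is the wrong denominator.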

5b: Historical IPS Rounds E&M Accuracy; 5c: Historical IPS Rounds ICD9 Accuracy; 5d: Historical IPS Rounds CPT Accuracy (Commander’s Statement – Inpatient Coding Audit)

5e: DD Form 2569 Completed and Current; 5f: DD Form 2569 Correct in the CHCS Patient Insurance Information Module (PIIM) (Commander’s Statement – Inpatient Coding Audit)
Methodology:
5e = # of available DD 2569s (completed and signed within the last 12 months) / # of non-active-duty records audited
5f = # of records from the 5e numerator that are correct in PIIM / 5e numerator
Note that the basis for 5f is the number from 5e that are completed and signed within the last 12 months.
Includes: non-active-duty records. Overseas MTFs now report both 5e and 5f.
Excludes: active-duty records
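The chaining between 5e and 5f (5f’s denominator is 5e’s numerator, not the full audit sample) can be sketched as follows, with illustrative field names:

```python
def dd2569_metrics(records):
    """records: dicts with booleans `current_2569` (completed and signed
    within the last 12 months) and `correct_in_piim`; pass in only
    non-active-duty records."""
    available = sum(r["current_2569"] for r in records)
    correct = sum(r["current_2569"] and r["correct_in_piim"] for r in records)
    pct_5e = 100.0 * available / len(records)
    # 5f is based on the 5e numerator, not the full audited sample
    pct_5f = 100.0 * correct / available if available else None
    return pct_5e, pct_5f
```

So an MTF with few current forms can still score well on 5f if the forms it does have are entered correctly in PIIM.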

5e: Historical DD Form 2569 Completed and Current; 5f: Historical DD Form 2569 Correct in PIIM (Commander’s Statement – Inpatient Coding Audit)

6a: Encounter Documentation Available (Commander’s Statement – Outpatient Coding Audit)
Methodology: the sample consists of 30 randomly selected records. A record documented as checked out within the facility counts as available; a record documented as checked out to a patient does not.
6a = # of available records / 30
Includes: documentation from the medical record, loose (hard copy) or electronic documentation (AHLTA)
Excludes: not applicable

6a: Historical Encounter Documentation Availability (Commander’s Statement – Outpatient Coding Audit)

6b: Outpatient Encounter E&M Accuracy; 6c: Outpatient Encounter ICD9 Accuracy; 6d: Outpatient Encounter CPT Accuracy (Commander’s Statement – Outpatient Coding Audit)
Methodology:
6b = # of correct E&M codes / total # of E&M codes documented and expected
6c = # of correct ICD9 codes / total # of ICD9 codes documented and expected
6d = # of correct CPT codes / total # of CPT codes documented and expected
Note: the denominator is not the number of encounters audited.
Includes / Excludes: not applicable

6b: Historical Outpatient Encounter E&M Accuracy; 6c: Historical Outpatient Encounter ICD9 Accuracy; 6d: Historical Outpatient Encounter CPT Accuracy (Commander’s Statement – Outpatient Coding Audit)

6e: DD Form 2569 Completed and Current; 6f: DD Form 2569 Correct in the CHCS Patient Insurance Information Module (PIIM) (Commander’s Statement – Outpatient Coding Audit)
Methodology:
6e = # of available DD 2569s (completed and signed within the last 12 months) / # of non-active-duty records audited
6f = # of records from the 6e numerator that are correct in PIIM / 6e numerator
Note that the basis for 6f is the number from 6e that are completed and signed within the last 12 months.
Includes: non-active-duty records. Overseas MTFs now report both 6e and 6f.
Excludes: active-duty records

6e: Historical DD Form 2569 Completed and Current; 6f: Historical DD Form 2569 Correct in PIIM (Commander’s Statement – Outpatient Coding Audit)

7a: APV Encounter Documentation Available (Commander’s Statement – APV Coding Audit)
Methodology: the sample must be a minimum of 30 APVs (or 100% if fewer than 30 APVs were completed). A record documented as checked out within the facility counts as available; a record documented as checked out to a patient does not.
7a = # of available records / 30 (or all APVs if fewer than 30)
Includes: documentation from the medical record, loose (hard copy) or electronic documentation (AHLTA)
Excludes: not applicable

7a: Historical APV Encounter Documentation Availability (Commander’s Statement – APV Coding Audit)

7b: APV Encounter ICD9 Accuracy; 7c: APV Encounter CPT Accuracy (Commander’s Statement – APV Coding Audit)
Methodology: the sample must be a minimum of 30 APVs (or 100% if fewer than 30 APVs were completed).
7b = # of correct ICD9 codes / total # of ICD9 codes documented and expected
7c = # of correct CPT codes / total # of CPT codes documented and expected
Note: the denominator is not the number of encounters audited.
Includes / Excludes: not applicable

7b: Historical APV Encounter ICD9 Accuracy; 7c: Historical APV Encounter CPT Accuracy (Commander’s Statement – APV Coding Audit)

7d: DD Form 2569 Completed and Current; 7e: DD Form 2569 Correct in the CHCS Patient Insurance Information Module (PIIM) (Commander’s Statement – APV Coding Audit)
Methodology:
7d = # of available DD 2569s (completed and signed within the last 12 months) / # of non-active-duty records audited
7e = # of records from the 7d numerator that are correct in PIIM / 7d numerator
Note that the basis for 7e is the number from 7d that are completed and signed within the last 12 months.
Includes: non-active-duty records. Overseas MTFs now report both 7d and 7e.
Excludes: active-duty records

7d: Historical DD Form 2569 Completed and Current; 7e: Historical DD Form 2569 Correct in PIIM (Commander’s Statement – APV Coding Audit)

8a: SADR to WWR Comparison (Commander’s Statement – Workload Comparison)
Methodology: SADRs transmitted to NAVMISSA are used for the numerator; the WWR workload category “Outpatient Visits” is used for the denominator. The percentage should always be less than or equal to 100%; if it is greater, the number reported to TMA is adjusted (e.g. 102% is reported as 98%).
8a = # of count SADRs / WWR outpatient visits
Includes: MEPRS codes B*** and FBN*, count SADRs only; APVs and Resource Sharing are included
Excludes: SADR appointment status CANCELLED, LWOBS, or ADMIN
*Auto-populated by the NAVMISSA eDQ*
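The over-100% adjustment (102% reported as 98%) amounts to reflecting the percentage around 100, which can be sketched as:

```python
def wwr_comparison_pct(numerator, denominator):
    """Workload-comparison percentage as reported to TMA: values over
    100% are reflected around 100 (e.g. 102% becomes 98%)."""
    pct = 100.0 * numerator / denominator
    return 200.0 - pct if pct > 100.0 else pct
```

The same adjustment applies to the other workload-comparison ratios (8b, 8c, 8d), so overshooting the WWR count by 2% is penalized the same as undershooting it by 2%.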

8a: Historical SADR to WWR Comparison (Commander’s Statement – Workload Comparison)

8a: SADR Calculation Accuracy (Commander’s Statement – Workload Comparison)
The NOLA transition impacted file receipts at NAVMISSA, inflating the “COUNT” and “NON-COUNT” calculation difference (blue data points are adjusted to reflect the actual data received).
Chart annotations: NMSC and NAVMISSA Tiger Team on-site; BUMED 2% goal

8b: SIDR to WWR Comparison (Commander’s Statement – Workload Comparison)
Methodology: SIDRs transmitted to NAVMISSA are used for the numerator; the WWR workload category “Dispositions” is used for the denominator. The percentage should always be less than or equal to 100%; if it is greater, the number reported to TMA is adjusted (e.g. 102% is reported as 98%).
8b = # of SIDR dispositions / WWR dispositions
Includes: “D” SIDRs
Excludes: “E” or “F” SIDRs; Resource Sharing or VA workload
*Auto-populated by the NAVMISSA eDQ*

8b: Historical SIDR to WWR Comparison (Commander’s Statement – Workload Comparison)

8c: EAS Visits to WWR Visits Comparison (Commander’s Statement – Workload Comparison)
Methodology: EAS visits are pulled from the EAS repository by NAVMISSA; the WWR workload category “Outpatient Visits” is used for the denominator. The percentage should always be less than or equal to 100%; if it is greater, the number reported to TMA is adjusted (e.g. 102% is reported as 98%).
8c = # of EAS visits / WWR outpatient visits
Note: if an MTF answers “No” for 4a, it should use WAM data.
Includes: MEPRS codes B*** and FBN*; APVs and Resource Sharing are included
Excludes: not applicable
*Auto-populated by the NAVMISSA eDQ*

8c: Historical EAS Visits to WWR Visits Comparison (Commander’s Statement – Workload Comparison)

8d: EAS Dispositions to WWR Dispositions Comparison (Commander’s Statement – Workload Comparison)
Methodology: EAS dispositions are pulled from the EAS repository by NAVMISSA; the WWR workload category “Dispositions” is used for the denominator. The percentage should always be less than or equal to 100%; if it is greater, the number reported to TMA is adjusted (e.g. 102% is reported as 98%).
8d = # of EAS dispositions / WWR dispositions
Note: if an MTF answers “No” for 4a, it should use WAM data.
Includes: EAS dispositions and WWR dispositions should also match the “D” + “E” SIDR total used in 2c.
Excludes: Resource Sharing and VA workload
*Auto-populated by the NAVMISSA eDQ*

8d: Historical EAS Dispositions to WWR Dispositions Comparison (Commander’s Statement – Workload Comparison)

8e: Inpatient Professional Services (IPS) Rounds to WWR Bed Days + Dispositions Comparison (Commander’s Statement – Workload Comparison)
Methodology: IPS rounds are obtained from the MTF SADR transmissions; the WWR workload categories “Dispositions” and “Occupied Bed Days” are used for the denominator.
8e = # of IPS rounds / (WWR occupied bed days + dispositions)
Includes: all A*** MEPRS codes
Excludes: any E*** MEPRS codes
*Auto-populated by the NAVMISSA eDQ*

8e: Historical IPS Rounds to WWR Bed Days + Dispositions Comparison (Commander’s Statement – Workload Comparison)

8e: IPS Rounds Calculation Accuracy (Commander’s Statement – Workload Comparison)
The NOLA transition impacted file receipts at NAVMISSA (blue data points are adjusted to reflect the actual data received).
Chart annotations: NMSC and NAVMISSA Tiger Team on-site; BUMED 2% goal

9a: AHLTA Utilization (Commander’s Statement – AHLTA Utilization)
Methodology: the “Source System” field in MTF SADR transmissions is used to determine whether the encounter was created in AHLTA or another system (the same field used in M2). This metric only needs to be above 80% to be green, since AHLTA is not designed for all clinics.
9a = # of AHLTA encounters / total # of encounters
Includes: MEPRS codes B*** and FBN*; ER, Optometry and other MEPRS clinics are all included (BUMED 6040)
Excludes: updates (eliminates the issue of CHCS or ADM updates changing the source system)
*Auto-populated by the NAVMISSA eDQ*

9a: Historical AHLTA Utilization (Commander’s Statement – AHLTA Utilization)

9a: AHLTA Utilization Calculation Accuracy (Commander’s Statement – AHLTA Utilization)
One Navy MTF used an incorrect methodology to report its AHLTA utilization (AHLTA utilization cannot be greater than 100%).
Chart annotation: NMSC and NAVMISSA Tiger Team on-site

10a: Potential Duplicate Patient Records (Commander’s Statement – Duplicate Patients)
Methodology: a standard CHCS report provided to host sites supplies the raw data for this metric. Only sites that are a CHCS host report this metric (others report “N/A”). This metric is not “graded” (red/yellow/green) on the TMA report.
10a = # of potential duplicate encounters
Includes: CHCS host sites
Excludes: sites that are not CHCS hosts

10a: Historical Potential Duplicate Patient Records (Commander’s Statement – Duplicate Patients)

11a: Commander’s Signature (Commander’s Statement)
Methodology: the Commander or Officer in Charge signs the Commander’s Statement, indicating that it has been reviewed and acknowledged. The statement cannot be signed “By Direction”; if the CO/OIC is away, the “Acting” may sign. This metric should always be “Yes”.
Includes / Excludes: not applicable

Reporting Timeframe Issues

Navy DQMCP Roles and Responsibilities (Recap)
REGIONS: regional consolidation of Commander’s Statements, DQMCP coordination, issue resolution, audits and training.
MTFs: DQMCP execution, Review List, Commander’s Statement, CO briefs, and communication of issues to regional representatives.
NMSC: systems execution, website maintenance / development, and DQMCP support.
BUMED: program management, oversight, policy and strategies.