Presentation on theme: "TRICARE Data Quality Training Course" — Presentation transcript:
1 TRICARE Data Quality Training Course
Introduce Self: Currently serving as a ______________________. Background includes working with EIDS for X years supporting _______________.
Intro Material: Will be providing an overview of the data quality, completeness, and timeliness efforts that are done in conjunction with the ingest, preparation, and presentation of EIDS data.
GOAL: To provide an appreciation of how and who is doing data quality work throughout the MHS.
May 2010
2 Objectives
DHSS Supports Enterprise-Wide Data Quality Efforts
Why data quality matters
How our tools affect data quality
How you can use this information in your data quality program
DHSS: MHS centralized data store. Receives, analyzes, processes, and stores 100+ terabytes of data. Thousands of users worldwide.
EIDS is the centralized data store for the Military Health System with more than 2,500 users worldwide. With interfaces around the globe, EIDS manages the receipt, processing, and storage of hundreds of millions of health care records that characterize the Military Health System's operations and performance. In terms of scale, our current storage volume (disk and tape) is approaching 1 Petabyte (1,000 TB) with about 100 TB in active use. One of the largest "HMOs" in the world.
It is useful to provide some definition of Data Quality. Data Quality Defined - Textbook: "data fit for their intended uses in operations, decision making and planning" (J.M. Juran). Alternatively, the data are deemed of high quality if they correctly represent the real-world construct to which they refer.
We define Data Quality as: The state of completeness, validity, consistency, timeliness and accuracy that makes data appropriate for a specific use. Data quality is also the reliability and effectiveness of data. Gartner Inc. reports that 25% of critical data within Fortune 1000 companies will be inaccurate. Another Gartner study says that poor quality "person" data costs U.S. businesses an estimated $611 billion a year just in categories such as postage, printing, and staff overhead.
3 What is DHSS? 40,000 Foot View
TRICARE decision support that makes the vision of the Military Health System Plan possible. Military Health System technology that integrates and standardizes clinical, resource, population, logistics, and other referential information.
Clinical / Health Information Infrastructure / Logistics / Resources / DISN
This slide offers a logical, 40,000 foot view of the types of information contained within EIDS. Includes:
Clinical: e.g. SADR
Resources: e.g. MEPRS
Logistical: e.g. Non-Medical Applications
Sustaining Base / Deployed (Personnel, Finance, Logistics . . .)
4 What is DHSS? 20,000 Foot View
This slide is the 20,000 foot view and gives a bit more detail and description of the environment. Important to note that there are two main components of the MHS healthcare system feeding DHSS. As many of you may be fully engaged with only the direct care system data, you may not be aware of some of these terms and functions.
Direct Care: All are familiar; the 450+ clinics and 100+ hospitals.
Purchased or "Indirect" care: That referred to civilian providers and civilian (TRICARE) facilities.
Example: SADR data is a representation of outpatient DIRECT care. TED "NI" (Non-Institutional) is a representation (with some differences) of outpatient PURCHASED/INDIRECT care. SIDR data is a representation of inpatient DIRECT care. TED "I" (Institutional) is a representation (with some differences) of INPATIENT INDIRECT/PURCHASED CARE. CMS 1500 and UB92 are similar constructs for civilian care.
> 10 Million Beneficiaries
Direct (Military Provided) Care: Inpatient: 250,000 annually; Outpatient: 30 million annually.
Purchased (Civilian Provided) Care: Inpatient claims: 800,000 annually; Outpatient "claims": 100 million annually.
5 DHSS – The Healthcare Data Warehouse: 5,000 Foot View
A wide variety of healthcare data: Rx, Lab, Rad, etc.; Inpatient Episodes; Outpatient Encounters; Survey Data; Enrollment Data; Reference Data; Claims Data.
Collects and distributes data daily, weekly, and monthly, from over 460 freestanding clinics and 100 hospitals and from thousands of civilian facilities, with worldwide geographic distribution.
Healthcare Data Warehouse Overview. From 20,000 feet to 5,000 feet. Now we start to see some of the larger and most important data types. Run list. The intent of this slide, however, is to provide a conceptual understanding of what a data warehouse "looks like" in terms of functionality.
Emphasis: Warehouse Generation SW and ODS/Query & Reporting Tools.
Component Listing:
Warehouse Generation SW: e.g. SAS
M2: e.g. Query and Reporting Tool
Metadata: e.g. MCAT Database, the Datatracker (you will hear much more about this as we go on, as it is the primary tool used for EIDS internal DQ work)
ODS: e.g. Again, the Datatracker, which is a hybrid ODS that only captures Data Quality related information.
6 DHSS Architecture: 1,000 Foot View
A vastly simplified diagram, but now at 1,000 feet. Data Quality tools exist at all levels of the architecture.
On CHCS hosts there are tools and processes in place. Ask audience to name some? E.g. End of Day Reconcile of Appointments to SADRs. E.g. SIDR completion statistics. E.g. ???
At the repository, the Data Quality Operational Data Store (ODS), where benchmarking, trending, file integrity, selected field checks, and statistical process control tools are used.
As part of processing, Quality Control (QC): e.g. a variety of tools that compare (to a greater degree at the field level) this month's data being processed to prior months'.
Least desirable, in the Data Marts. This occurs generally as a byproduct of analytical work. This is where the analyst, or the algorithm in the case of Biosurveillance, discovers an issue missed in all prior checks. Example: Wendy Funk in later presentations will provide many.
7 TRICARE Management Activity Data Quality Program, June 2007: CHCS Host Architecture and DHSS Interfaces
CDM Hosted by DHSS: CDM, CDR, ELIGIBILITY & ENROLLMENT, CLINICAL DATA REPOSITORY
CHCS Host Patient Database
Standard Files and Tables (DMIS, ICD-9, CPT/HCPCS, DRG, HIPAA Taxonomy, National Drug Codes, Zip Code)
Site Defined Files and Tables (Hospital Locations, Providers, Users, Formulary, Tests/Procedures)
Application Business Rules
Inpatient Admissions and Dispositions (PAD)
Outpatient Appointment Scheduling / Managed Care Program (PAS/MCP)
Ambulatory Data Module (ADM)
Clinical Order Entry and Results Reporting: Laboratory (LAB), Radiology (RAD), Pharmacy (PHR), Consults, Nursing Orders
Medical Services Accounting (MSA)
Workload Assignment Module (WAM)
Great slide, courtesy of Charlene Colon. This is the 100 foot view of a CHCS host, from where many of the direct care data interfaces to DHSS systems occur. I have highlighted in red these data types and the locations of their interfaces. One of our challenges is that with over 100 hospitals worldwide, there are ALWAYS "things" (a technical term) going on with some aspect of these interfaces at some facility. DHSS "knows" of these "things" via rather sophisticated tools for monitoring data flow. We will talk more about these tools to monitor data flow in later slides. Bottom line?
The number one problem, in terms of DHSS effort expended on data quality, is the identification, correction, and recovery of data via these many interfaces.
CHCS Generic Interface System (GIS) for HL7 and Electronic Transfer Utility (Sy_ETU)
HL7, M/OBJECTS, OR CUSTOM INTERFACES
SFTP DATA TRANSFERS to DHSS and other Corporate Systems
ADT (Admit, Discharge, Transfer, other status) – DHSS HL7 ADT Capture
LAB INSTRUMENTS, CO-PATH, LAB-INTEROP, DBSS, HIV – DHSS HL7 LAB Capture
DIN-PACS, VOICE RAD – DHSS HL7 RAD Capture
PDTS, ATC, BAKER CELL, PYXIS, VOICE REFILL – DHSS HL7 PHR Capture
TRICARE OPS CTR (WW): SIDR/SADR/CAPER, EAS/MEPRS, WWR, APPOINTMENTS, ANCILLARY
GCPR Extracts, AHLTA, ICDB, DHSS, DoD/VA SHARE, CIS/ESSENTRIS, AUDIO CARE, TRANSPORTABLE CPR, TRAC2ES, CAC (Patient Look-Up), NMIS, CODING EDITOR (CCE)
Diagram courtesy of Charlene Colon, Clinical Data Analyst
8 MDR (MHS Data Repository)
Military Health System Data Repository: Centralized data capture and validation of MHS data worldwide. More than 5 billion records on-line with 10+ years of data. Provides repository for other systems/applications to receive extracts. Typical users: small cadre of high-level data analysts.
MAJOR APPLICATION – DIRECT CARE: M2, "the VIEW into the MDR".
Data: Workload, Population, Customer Satisfaction Surveys, Encounters, Master patient/person index, Population eligibility.
Tool: Business Objects. Fully ad-hoc; Windows-like interface.
Users: Data savvy, data analysts. Trickle up/down. Over 800 users; 61% at local treatment facilities.
9 M2 (MHS Management Analysis & Reporting Tool)
Powerful ad hoc query tool for detailed trend analysis such as patient and provider profiling. Typical users: Data analysts skilled in Business Objects software.
Simply a slide showing an M2 report. May digress into who should have one, and how to get M2 accounts if desired.
10 DHSS Data Quality Requirements
Capture and catalog data files. Assess and monitor data completeness. Perform data quality feed node assessments. Develop data quality software that: performs automatic data quality checks; implements data quality assessments; provides metrics and manages perspective of the files' data quality.
These simple requirements are essentially EIDS's data quality charter. ABSENCE of a focus on these elements leads to the following "SNIPPETS":
Gartner: Four of every 10 DW/BI projects fail (cite EDW and DMEI if so inclined).
Cutter Consortium, an IT analysis firm: 41 percent of data-warehousing projects fail.
Gartner: 50% will have limited acceptance or outright failure as a result of inattention to data quality issues.
In general then, if the data is not clean and accurate, the queries and reports will be wrong; the users will either make the wrong decisions or, if they recognize that the data is wrong, will mistrust the reports and not act on them. This, then, is the BASIC REASON why EIDS has a group that focuses on the issues of Data Quality.
11 DHSS Data Quality Metrics
Integrity: is it secure? Relevancy: is it appropriate? Reliability: is it rationally correlated? Validity: is it sound? Consistency: is it free from contradiction? Uniqueness: is it free from duplication? Timeliness: is it available when needed? Completeness: is it whole? Accuracy: is it free from error?
Most healthcare organizations, at many levels, need to respond to HEDIS or other sorts of compliance audits. These compliance audits all have some form of assessment of data completeness. For example, with data submitted, the organizations may be asked:
For claims/encounter data received from practitioners, provider groups, facilities and vendors:
How does your organization monitor and assess the completeness of data submitted?
How often does your organization monitor and assess the completeness of data submitted?
Has your organization established benchmarks to assess the completeness of data submitted? If so, describe.
Has your organization conducted additional studies or analyses of data completeness or under-reporting? (This includes studies of total claims volume and claims not received.) If so, describe.
Describe barriers to obtaining complete and accurate claims/encounter data. Consider all factors that influence your ability to collect such information from practitioners and contracted vendors, including, but not limited to, vendor system constraints or incompatibilities, lack of reporting requirements, payment arrangements (e.g., capitation), data integration issues.
Clearly, this week of classes will better prepare you to answer these questions, as EIDS supports, and is a part of, YOUR ORGANIZATION.
Rutan Defines – tell all how wonderful crusty old Rutan is.
Completeness Process Components
THREE MAIN COMPONENTS TO EIDS DQ WORK:
Monitoring: Script based "monitors" (data feed monitoring tools); examination of volumetrics.
Alerting: Based on process control or arbitrary thresholds. Alerts for "what is late" and "what is missing". Alerting functions call upon benchmarks of current data against historical data.
Reporting: Requires a "history" – a Data Quality/Completeness Datamart (or ODS), populated automatically in real time, with a few selected data completeness measures for identified data types. Provides the history used by alerting functions.
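The three components above can be sketched in miniature. This is a hypothetical illustration only, not the actual EIDS implementation: the site IDs, counts, and the three-sigma threshold are all invented for the example.

```python
# Sketch of the Monitoring / Alerting / Reporting cycle: examine today's
# volumetrics (Monitoring), compare against a historical benchmark
# (Alerting), and append accepted counts to the history (Reporting).
from statistics import mean, stdev

history = {  # daily record counts previously captured per site (the "history")
    "0001": [980, 1010, 995, 1005, 990],
    "0002": [440, 455, 450, 460, 445],
}

def check_site(site_id, todays_count, sigmas=3.0):
    """Alert when today's volume falls outside mean +/- sigmas * stdev."""
    counts = history[site_id]
    mu, sd = mean(counts), stdev(counts)
    if abs(todays_count - mu) > sigmas * sd:
        return f"ALERT site {site_id}: {todays_count} vs benchmark {mu:.0f}"
    history[site_id].append(todays_count)  # extend the history (Reporting)
    return None

print(check_site("0001", 40))   # far below the benchmark, so an alert fires
print(check_site("0002", 452))  # within the expected range, so None
```

The key design point, as the slide notes, is that alerting only works because a history exists to benchmark against.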
12 Data Quality Tools
Real-time key data quality/completeness DB2 database for: SIDR, SADR, HL7, PDTS, Appointment, Ancillary.
Database updated daily and scripted to provide "event-driven" alerts for critical data quality areas. DMIS IDs: "real time" and "snapshot" views of key data completeness measures. INTERNAL Web front-end access for standard reports. Multilayer data comparisons from raw to processed data for procedure-based actions. Statistical process control algorithms and control charts to detect data anomalies.
This slide introduces EIDS's Datatracker – this is the tool we use to do DQ work. This tool essentially developed because of a LESSON LEARNED: Data completeness measurement needs to be in sync with the utilization of the data.
By way of example: Typically, there is data that arrives daily and can be used daily, e.g. for epidemiologic surveillance or real time enterprise activity monitoring. The HL7 data stream (e.g. Lab, Rad, Pharmacy transactions) and the Outpatient data or equivalent data types represent that which is likely to be used for daily activity monitoring. "Monitoring" of Inpatient data is more often in accordance with monthly or quarterly management reporting cycles. This simply means that you should not have weekly or monthly activities or events or "monitors" for data that is provided and used on a daily basis.
The tool provides reports: MATT KAPUSTA – introduce Matt – he has examples of some recent reports, and I encourage you to see him during your breaks and see how your organization or facility is represented in these reports.
In addition to a "real time" data quality database, EIDS has developed a host of scripts that perform inventory functions, e.g.
announcement of file(s) arrival, which notifies of the arrival of key files, and monthly completeness inventories.
ALL THE TOOLS USED BY EIDS ARE TO ANSWER:
Timeliness questions: Timeliness quantified ("what arrived and when").
Completeness questions: Completeness quantified ("what did not arrive – what is missing").
Missing or late data from client facilities or fiduciary intermediaries has a profound impact on the bottom line for the enterprise.
13 Data Tracker
Essentially a "Mini MDR/M2". Data processed in real time.
Data Tracker tools and reports – SIDR, SADR, HL7, Appointment, Ancillary, TED Inst/Non-Inst reports provide:
File based accounting (e.g. gap reports)
Treatment based accounting (e.g. reports based on care date)
Timeliness reporting (e.g. lag from care rendered date to ingest)
Other statistical reports, including benchmarking against WWR
Statistical Process Control alerting for SADR anomalies
Other Data Tracker tools and reports: Monthly reports (SIDR and SADR vs WWR benchmarking); ad hoc queries to the Data Tracker; GCPR & PDTS gap reports, receipt reports, pull reports.
Current Data Tracker reports on the DHSS (EIDS) Web site:
Daily SADR by HOST DMIS (the "What Was Received Yesterday" report)
Daily SADR by Treatment ID – 90 Day (the daily "90 Day Roller" report)
Monthly SIDR by Treatment DMIS
Weekly HL7 Gaps
More Data Tracker "sales pitch", as something like it – call it a "DQ Datamart" – has been discussed as a product.
DATATRACKER IS: A real-time key data quality/completeness DB2 database for: SIDR, SADR, HL7, PDTS, GCPR, Appointment, Ancillary. Essentially a hybrid Operational Data Store (ODS). An operational data store (ODS) is a type of database often used as an interim area for a data warehouse. Unlike a data warehouse, which contains static data, the contents of the ODS are updated through the course of business operations.
An ODS is designed to quickly perform relatively simple queries on small amounts of data (such as finding the status of a customer order), rather than the complex queries on large amounts of data typical of the data warehouse.
Database updated daily. Database scripted to provide "event-driven" alerts for critical data quality areas. DMIS IDs: "real time" and "snapshot" views of key data completeness measures. Web access and front end for reporting (standard reports). Multilayer data comparisons from raw to processed data for procedure-based actions. Statistical process control algorithms and control charts to detect data anomalies.
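The timeliness measure mentioned above (lag from care rendered date to ingest) reduces to simple date arithmetic per record, summarized per site. A minimal sketch, with invented site IDs and dates rather than real Data Tracker data:

```python
# Sketch of a per-site reporting-lag summary: days from care-rendered date
# to ingest date, with mean and standard deviation per site (in the spirit
# of the SADR lag report). All records below are illustrative.
from collections import defaultdict
from datetime import date
from statistics import mean, stdev

records = [  # (site, care date, ingest date)
    ("0001", date(2010, 5, 1), date(2010, 5, 3)),
    ("0001", date(2010, 5, 2), date(2010, 5, 6)),
    ("0002", date(2010, 5, 1), date(2010, 5, 2)),
    ("0002", date(2010, 5, 2), date(2010, 5, 3)),
]

lags = defaultdict(list)
for site, care, ingest in records:
    lags[site].append((ingest - care).days)

for site, days in sorted(lags.items()):
    print(f"site {site}: mean lag {mean(days):.1f} days, sd {stdev(days):.1f}")
```

A growing mean lag for a site is itself a data quality signal, even when no files are outright missing.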
14 DHSS (EIDS) Web Portal Resource
Location of data quality reports in the DHSS EIDS Web portal. The web portal also contains some information if you wish to validate your site's daily transmissions to EIDS. Raw counts for SIDR, SADR, and HL7 data are included here.
Before launching into specific tools, some general background material. In data warehouses, processing and load times occur prior to healthcare data becoming available in the corporate or enterprise databases or datamarts. This processing and/or ETL may occur weekly, monthly, or on other schedules.
IMPORTANT POINT: If operational data completeness "processes" occurred only after scheduled processing and database loading, this would result in significant delay in the identification of missing or late data. Delays in the identification of missing or late data delay recovery and the ultimate availability of complete data.
Example: If a monthly file from a site is not provided for processing, processing occurs without it. Then the database/datamart is populated, which allows the identification of a data gap. Subsequently, there is recovery of the data, and the data, which has been unavailable for the entire preceding cycle, is then provided for the next processing cycle, with resultant late database/datamart availability.
Problems may not be detected until the time of a processing cycle – worst case, when users themselves identify missing data from downstream datamarts or databases. E.g. if data is received daily from sending sites, then simple scripts can report on sites that have not sent within 24 hours or a defined time period.
In the database you have to describe: What did happen – the universe of all possibilities. The difference or variation is "what didn't happen".
What needs to be examined? Structure checking identifies failed ingest (e.g.
malformed or truncated files), at both file and record level of detail. Content checking – reasonability at the field level.
Rutanisms: "We seek the absence of the expected" – "We have alerts to provide a positive affirmation of nothingness". More difficult than it sounds. When reporting on the completeness of a set of data, how do you mark a data element as missing if the record isn't in the database? How do you represent "what didn't happen" in a database? It is easier to spot an anomaly than to spot the absence of an anomaly.
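One common way to answer "how do you represent what didn't happen" is to materialize the universe of expected (site, date) pairs and subtract what actually arrived. A minimal sketch, with made-up site IDs and dates, not the actual EIDS schema:

```python
# Sketch: find "what didn't happen" by comparing an expected universe of
# (site, date) pairs against what was actually received. Everything below
# is illustrative data, not Data Tracker content.
from datetime import date, timedelta

sites = {"0001", "0002", "0003"}            # sites expected to send daily
received = {("0001", date(2010, 5, 3)),     # (site, file date) actually ingested
            ("0001", date(2010, 5, 4)),
            ("0002", date(2010, 5, 3)),
            ("0003", date(2010, 5, 4))}

start, days = date(2010, 5, 3), 2
expected = {(s, start + timedelta(d)) for s in sites for d in range(days)}

missing = sorted(expected - received)       # the "absence of the expected"
for site, day in missing:
    print(f"GAP: site {site} sent nothing for {day}")
```

The set difference is the whole trick: the gap exists only because the expected universe was written down first.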
15 Data Quality Assurance: Start with Run Charts
Facilities showing gaps in daily outpatient encounter data receipt. Investigation and data recovery actions required. Data set has no correlation with other source-system-provided data sets.
OLD SCHOOL: Run a chart for every clinic and hospital and eyeball it, looking for examples like the above. Use a single data type: 460+ clinics, 100+ hospitals, 30+ data types. Interesting problem. How does your organization scan multitudes of data sets for significant "variation" in completeness?
NEW SCHOOL: EIDS uses:
Monitoring: Script based "monitors" (data feed monitoring tools); examination of volumetrics.
Alerting: Based on process control or arbitrary thresholds. Alerts for "what is late" and "what is missing". Alerting functions call upon benchmarks of current data against historical data.
Reporting: Requires a "history" – a Data Quality/Completeness Datamart (or ODS), populated automatically in real time, with a few selected data completeness measures for identified data types. Provides the history used by alerting functions.
16 Data Completeness Determination
"Completeness" as a Process Control Problem, amenable to Statistical Process Control: Examine for special cause variation. Signals when a problem has occurred. Detects variation. Allows "process characterization". Reduces inspection need.
NEW, NEW SCHOOL: Data flows are predictable. Seek variation. Apply rules to identify variation. The red circles are a depiction of "rules" that become code, that identify variation. Essentially, and rather crudely described, we developed software to replace the human eyeball looking at hundreds of charts. If interested in the statistical approach used, refer to the Shoe.
Optional – if you are feeling didactic: Dr. Deming introduced the terms Common Cause and Special Cause, although he assigns the first usage to W. Shewhart. An understanding of the type of variation being observed is important if the correct course of action is to be taken. Common Cause variation is created by many factors that are commonly part of the process. Their origin can usually be traced to the key elements of the system in which the process operates (materials, equipment, people, environment, methods). Special Cause variation is created by a non-random event leading to an unexpected change in the output. The effects are intermittent and unpredictable. If Special Causes of variation are present, the output is not stable and is not predictable.
17 Compare Each Day To Itself
Red boxes/X's/etc. indicate "Alerts" sent automatically to the DQ Team. Project previous data to today, then compare this projection with newly arrived data.
The graphic (using actual "outpatient" data) now displayed represents, in classic run chart format, events that are derived from the aforementioned "Alarm Rules":
Open red square: point lies below the lowest control limit.
Red horizontal cross (+): point is the second out of three successive points below the centerline in Zone A or beyond.
Red diagonal cross (x): point is the fourth out of five successive points below the centerline in Zone B or beyond.
Red open diamond (the preceding 8 points are also marked): point is the ninth successive point below the centerline.
CONCEPTUALLY, what we are doing, and what this slide shows an example of, is: each Monday's "volume" compared to prior Mondays' volumes; each Tuesday's "volume" compared to prior Tuesdays' volumes. It is essentially a projection of previous data forward in time to today, then a comparison of this projection with the newly arrived data. Holiday logic pending.
Chart: Encounters by Day
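The four alarm rules above are one-sided (below-centerline) variants of the classic control chart zone rules. A hedged sketch under the usual zone conventions (Zone A beyond 2 sigma, Zone B beyond 1 sigma, control limit at 3 sigma); the data and baseline are invented, and the real EIDS code surely differs:

```python
# Sketch of the one-sided "alarm rules": each point is checked against the
# lower control limit and against runs of recent points below the zones.
from statistics import mean, stdev

def alarms(series, baseline):
    """Return (index, rule) pairs where a below-centerline rule fires."""
    mu, sd = mean(baseline), stdev(baseline)
    hits = []
    for i, x in enumerate(series):
        below_a = [v < mu - 2 * sd for v in series[max(0, i - 2):i + 1]]
        below_b = [v < mu - sd for v in series[max(0, i - 4):i + 1]]
        below_c = [v < mu for v in series[max(0, i - 8):i + 1]]
        if x < mu - 3 * sd:
            hits.append((i, "below lower control limit"))
        elif x < mu - 2 * sd and sum(below_a) >= 2:
            hits.append((i, "2 of 3 in Zone A or beyond"))
        elif x < mu - sd and sum(below_b) >= 4:
            hits.append((i, "4 of 5 in Zone B or beyond"))
        elif x < mu and len(below_c) == 9 and all(below_c):
            hits.append((i, "9 successive points below centerline"))
    return hits

baseline = [100, 102, 98, 101, 99]           # e.g. volumes from prior Mondays
print(alarms([100, 96, 96, 100], baseline))  # third point completes a Zone A run
```

Passing prior same-weekday volumes as the baseline is exactly the "compare each Monday to prior Mondays" projection the slide describes.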
18 Identifying Data Completeness Problems
Red boxes/Xs/etc. indicate automatic "Alerts" to the data quality team.
Alerting and Notification Issue: How do you identify and present "possible" problems when the "problem" is transient, it is one data point in a series, and it is from one of a vast number of daily input data sources?
I think you now see that even after development of the techniques for data incompleteness identification, we would end up with 460+ control charts DAILY for just one single data type out of the many data types ingested. Clearly we still had a problem, and it still required drudgery and a human eyeball.
New method: alerts based on SPC generated "alarms" using zone rules and trend rules. Essentially a projection of previous data forward in time to today, then a comparison of this projection with the newly arrived data.
19 Data Tracker Report Series, including: SADR vs Appt. Tracking (a real-time "Hutchinson" report); SADR vs Appointment Delta Alerting
Even with all the fancy alerting, we still need to line up data with other "related" data sets. In fact, we have even developed tools called "Delta Detectors" that look at related data sets and identify variation. This is an example of a report that looks at Appointment and SADR data, and this allows us to better distinguish variation from reality. By this I mean: if we are to look at appointments ONLY, a low number in a time period MAY reflect reality (temporary clinic closure, physician TDY, etc.). But if we compare that data to another (e.g. SADR) and there is a significant "DELTA", then we know it is not a "reflection of reality" and is instead a system problem of data generation, interface, transmission, or receipt.
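The Delta Detector idea reduces to a paired comparison of two related counts. A hypothetical sketch: the site IDs, counts, and the 25% threshold are invented for illustration, not EIDS parameters:

```python
# Illustrative "Delta Detector": compare two related daily counts (here,
# appointments vs. SADR encounters received) per (site, day) and flag
# cases where the relative difference exceeds a threshold.

def delta_detector(appointments, sadrs, threshold=0.25):
    """Flag (site, day) keys whose |appt - sadr| / appt exceeds threshold."""
    flags = []
    for key, appt in appointments.items():
        sadr = sadrs.get(key, 0)     # no SADRs at all is the largest delta
        if appt and abs(appt - sadr) / appt > threshold:
            flags.append((key, appt, sadr))
    return flags

appointments = {("0001", "2010-05-03"): 120, ("0002", "2010-05-03"): 80}
sadrs        = {("0001", "2010-05-03"): 118, ("0002", "2010-05-03"): 20}

for (site, day), appt, sadr in delta_detector(appointments, sadrs):
    print(f"DELTA: site {site} {day}: {appt} appointments but only {sadr} SADRs")
```

The point of pairing the streams is the one made in the slide: a low count that appears in both streams may reflect reality, while a large delta between them points at a system problem.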
20 Daily Ancillary Data Report – 90 Day Roller
Just a report example.
21 Data Quality Tools – "Timeliness"
Historical "Delta Detector" studies. EIDS has gone back (pre Delta Detector) and run deltas for Appointment and SADR data. This allowed discovery of a number of sites with what appear to be missing SADR data. Large reharvest efforts completed.
22 EIDS Web Portal display of daily FILES receipt
The HL7 Weekly Tracker – sorted by Service. Posted on the EIDS Web site and updated weekly from the Data Tracker database.
23 Data Quality Tools – "Delta Detectors"
CAPER vs SADR Delta Detector example. We found sites sending SADRs but no CAPERs, sites sending CAPERs but no SADRs, and all shades of grey in between.
25 Data Quality Tools – “Delta Detectors” Another Delta report – SADR vs WWR (no longer used) – Shows “Rule Violation”
26 Data Quality Tools – “Interface Monitoring” Report showing HL7 data receipt and whether we are getting Lab, Rad, Pharm, etc data as we should from each sending location.
27 Data Quality Processes: Problem Determination Process
"File Level" Gaps – Automated process, e.g. nothing received as of a date or for a duration. SADR example: nothing received in 3 days – Alert. (Note: files contain data from many encounter dates. Files may be received daily, but these files may not correlate with CURRENT data.)
"Encounter Date" Analysis – Monthly and ad hoc manual reviews, e.g. run charts. (NOTE: SPC control charts are designed to provide an automated means to perform this activity.)
This is the logical process we go through to identify data quality problems: look for file level gaps, then look into the files for significant variation in the "expected".
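The automated file-level gap check can be sketched directly from the SADR example (nothing received in 3 days triggers an alert). Site IDs and dates below are illustrative only:

```python
# Sketch of the "file level" gap check: alert when a site's most recent
# file is older than a fixed number of days (3 here, per the SADR example).
from datetime import date

last_received = {              # most recent file date seen per sending site
    "0001": date(2010, 5, 9),
    "0002": date(2010, 5, 4),  # stale: nothing received for several days
}

def file_level_gaps(today, max_gap_days=3):
    """Return sites whose latest file is more than max_gap_days old."""
    return sorted(site for site, d in last_received.items()
                  if (today - d).days > max_gap_days)

print(file_level_gaps(date(2010, 5, 10)))  # site "0002" is six days stale
```

The encounter-date analysis in the second bullet then digs inside the files that did arrive; this check only establishes that files arrived at all.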
29 Data Quality Processes: Problem Resolution Process
IF a "file level" OR "encounter date level" problem is detected:
Immediate MHS Help Desk Ticket.
Notification if the problem is deemed "significant and/or long standing". Determination if a "Blaster" message to the analytical community is appropriate or required. Note that individual site "transient" halts in transmission occur regularly and are usually resolved quickly. These "transient" problems are not reported in real time, as M2 utilizes a batch process and problems are often resolved between batch processing cycles.
Coordination with "Service POCs" to determine if the problem also exists in "Service" databases.
Recovery of files via sharing between Service databases and DHSS.
Tier III recovery/reharvest of missing data (except HL7 and Ancillary, as no reharvest mechanism exists).
Self Explanatory
30 Data Quality Processes: Common Problems – In Order of Occurrence
4 Broad Categories:
Provider/Coding Issues: "slow coding" from the data receipt perspective; provider "left".
Transmission/Send of Data: Sy_ETU problems; host issues (e.g. Change Package induced problems); network routing issues.
Ingest or Processing.
Does the Data Reflect Reality? Or is it one of these? What may appear to be data gaps or problems may be reflections of "real-life" events. Clinic closure for training, staff illness, or other events can mimic a data gap caused by transmission or receipt problems.
If it is identified as an issue, EIDS maintains information regarding "investigations" of data completeness issues, as data "caveats" are often required by analysts. Also, for the MHS as an enterprise, maintenance of information regarding "investigations" is mandated (e.g. reporting HEDIS measures).
31 Data Quality Tools
Partial List of Standard Reports from the EIDS Web Portal DataTracker Database:
HL7 tracking: Displays a tabular view of file submission history for each HL7 site.
SADR gaps: Displays a list of sites, by ADS version, that did not report data for at least a fixed number of days.
SADR lags: Displays the mean and standard deviation of the reporting lag for each site, by ADS version.
SADR scores: Displays a SADR transmission completeness report. For each site, by ADS version, a completion percentage is listed.
SADR tracking: A tabular view of file and record submission history for each site, by ADS version. Each column corresponds to a file date.
SADR treatment DMIS ID gaps: Displays a list of treatment DMIS IDs that did not report data for at least a fixed number of days.
SADR treatment DMIS ID scores: A SADR transmission completeness report. For each treatment DMIS ID, a completion percentage is listed.
SADR treatment DMIS ID tracking: Displays a tabular view of record submission history for each treatment DMIS ID.
SADR treatment DMIS ID (by visit type) tracking: Displays a tabular view of record submission history for each treatment DMIS ID. The displayed counts indicate the number of unique SADR data records, determined by appointment prefix and appointment identifier number.
SIDR gaps: A list of reporting sites that did not report data for a fixed number of SIDR months, up to and including the ending SIDR month.
SIDR tracking: Displays a tabular view of file and record submission history for each reporting site.
SIDR treatment DMIS ID tracking: Displays a tabular view of SIDR completion history for each treatment DMIS ID.
GCPR gap: Displays a list of sites that did not report data for at least a fixed number of days.
Just a laundry list of Data Tracker reports – another sales pitch for the Datatracker?
32 Data Quality Tools (continued)
GCPR sites: Displays a list of GCPR sites by Service, region, and DMIS ID, allowing the user to review the mapping of GCPR sites to DMIS IDs.
GCPR tracking: Displays a tabular view of file submission history for each GCPR site. Each column corresponds to a date within the range specified.
HL7 gap: Displays a list of sites that did not report data for at least a fixed number of days, as specified by the user query.
PDTS gap: Displays a line if PDTS data has not been reported for at least a fixed number of days, as specified by the user query.
PDTS tracking: Displays a tabular view of file submission history for PDTS. Each column corresponds to a file date within the range specified.
Ancillary tracking: Displays a tabular view of file and record submission history for each reporting DMIS ID. Each column corresponds to a file date within the selected range.
Ancillary gap: Displays a list of reporting DMIS IDs that did not report data for at least a fixed number of days.
Ancillary treatment DMIS ID tracking: Displays a tabular view of record submission history for each ancillary performing DMIS ID. Each column corresponds to a service date within the range specified. The displayed counts indicate the number of unique ancillary data records, as determined by the accession number for laboratory, exam number for radiology, and prescription number for pharmacy.
Ancillary treatment DMIS ID gap: Displays a list of performing DMIS IDs that did not report data for at least a fixed number of days, as specified by days, up to and including the ending service date, as specified.
Appointment treatment DMIS ID tracking: Displays a tabular view of record submission history for each appointment treatment DMIS ID. Each column corresponds to an appointment date within the inclusive range specified by the beginning appointment date, bgndate, and the ending appointment date, enddate.
The displayed counts indicate the number of unique appointment data records, as determined by the appointment identifier number and the node seed name.Appointment treatment DMIS ID Gap: Displays a list of treatment DMIS IDs that did not report data for at least a fixed number of days, as specified by days, up to and including the ending appointment date, as specified.
33 Data Quality Tools Allow DHSS to: Catalog data files; monitor data completeness; provide metrics to assess data quality/completeness of data received; design, develop and maintain data quality software.
SUMMARY AND RECAP:
EIDS performs an ongoing analysis of the quality of healthcare databases, which requires performance of basic data analysis and profiling including: completeness assessment; frequency distributions; volumetrics; outlier analysis and other reasonability analyses.
EIDS's assessments of data quality have value when they are used to raise awareness of process failure and result in process improvements that eliminate the causes of defective data.
EIDS performs a role that all healthcare data warehousing organizations must have, which is the ability to answer at any point in time "what is late and what is missing", as well as benchmark data against historical and current data.
34 The Key To Data Quality Success: Partnering with our users to maximize information sharing. Questions?
Real time data use demands that there be real time data completeness processes. Best practice for data warehousing organizations indicates that for data that is ingested/used daily, there are processes to identify and correct data receipt problems on a daily basis. For data that has a cyclical processing cycle, best practice identifies data not present PRIOR to processing, takes action to recover or repair, and caveats the processed data when data gaps are identified. The ability and degree to which an organization is able to answer these simple questions will have a profound impact on the bottom line for the enterprise.