1 AHRQ State and Regional Demonstration Project Evaluation: Barbeque, Blues, Beneficial Technology
Kevin B. Johnson, MD, MS
Associate Professor, Biomedical Informatics, Vanderbilt University Medical Center, Nashville, Tennessee

2 Project Overview

3 Project Drivers
- Incomplete information increases admission rates and ED length of stay (LOS)
- Poor communication impacts ED efficiency
- Less patient data at the point of care impacts the rate of test ordering
- Less patient data at the point of care impacts clinical outcomes

4 Data Exchange Has HUGE Potential ROI

Financial Measure                                    Dollar Savings (millions)
Reduced inpatient hospitalization                    $5.6
ED communication distribution                        $0.1
Reduced IP days due to missing Group B strep tests   $0.1
Decrease in # of duplicate radiology tests           $9.0
Decrease in # of duplicate lab tests                 $3.8
Lower emergency department expenditures              $5.5
Total Benefit                                        $24.2

If data are exchanged across all facilities within the three-county region, the overall savings have the potential to reach $48.1 million.

Note 1: Core healthcare entities include Baptist Memphis, Le Bonheur Children's Hospital, Methodist University Hospital, The Regional Medical Center (The MED), Saint Francis Hospital, St. Jude Children's Research Hospital, Shelby County/Health Loop, UTMG, LabCorp, Memphis Managed Care-TLC, and Omnicare.
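As a quick arithmetic sanity check on the table above (a hypothetical script, not part of the project), the per-measure savings can be summed:

```python
# Projected annual savings by financial measure, in millions of dollars
# (line items taken from the table above, rounded to one decimal place)
savings = {
    "Reduced inpatient hospitalization": 5.6,
    "ED communication distribution": 0.1,
    "Reduced IP days due to missing Group B strep tests": 0.1,
    "Decrease in # of duplicate radiology tests": 9.0,
    "Decrease in # of duplicate lab tests": 3.8,
    "Lower emergency department expenditures": 5.5,
}

total = round(sum(savings.values()), 1)
# The rounded line items sum to $24.1M; the slide's stated $24.2M total
# presumably reflects the unrounded underlying figures.
print(f"Total benefit: ${total}M")  # Total benefit: $24.1M
```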

5 System Implementation and Evaluation
[Process diagram: Get the model right → Build the team → ID the settings → Learn, collaborate, design → Implement → Outcomes research, with qualitative research spanning the entire process]

6 Key Aspects of Value Proposition
- Qualitative information
- Costs
- System usability
- System use and utility
- Clinical value (patient outcomes)
- Dollars saved in the care delivery process
- Workflow efficiency gains

7 Qualitative Questions
- Usability (focus groups in the ED), 1 month and 1 year after go-live
- Barriers to implementing infrastructure (cognitive artifacts), evaluated in year 4
- Drivers for adoption (interviews of governing board and ED staff), evaluated in year 5

8 Costs
- Personnel
- Training
- Community meetings, sales, legal agreements, organizational development
- Equipment
- Software development
- Site-specific customizations and costs

9 Assessing Usability: Questionnaire for User Interaction Satisfaction
The Questionnaire for User Interaction Satisfaction (QUIS) is a tool developed by a multi-disciplinary team of researchers in the Human-Computer Interaction Lab (HCIL) at the University of Maryland at College Park. The QUIS was designed to assess users' subjective satisfaction with specific aspects of the human-computer interface. The QUIS team addressed the reliability and validity problems found in other satisfaction measures, creating a measure that is highly reliable across many types of interfaces.

10 QUIS Details
- Six scales
- Eleven interface factors: screen, terminology/system feedback, learning factors, system capabilities, technical manuals, internet access, on-line tutorials, multimedia, voice recognition, virtual environments, and software installation


12 System Usability
- Will conduct usability testing of SPL; Vanderbilt serves as the pilot site for face validity, with the QUIS modified accordingly
- Will survey Memphis ED attendings and nursing staff 1 month after go-live and again 6 months later

13 System Usage and Epidemiology
- Help desk use
- Provider enrollment
- Patient enrollment (RHIO in versus RHIO out)
- Usage statistics
- Latency
- Downtime

14 Content Quality
- Accuracy
- Missing data
- Categorization errors

15 Disease-specific Hypotheses
- Improved neonatal group B strep (GBS) management
- Improved asthma controller medication use
- Improved ACE/ARB use in CHF
- Improved immunization rates (influenza, S. pneumoniae)
- Others?

16 ED Administrative Outcomes
- Reduced inpatient admissions
- Decreased duplicate testing (radiology and lab)
- Decreased ED expenses: workflow efficiency, costs per visit

17 Workflow Change
- Activity-based costing:
  - Model construction at Vanderbilt
  - Model validation in Memphis
  - Use the model to construct activity matrices in the EDs under study
  - Assess how activity matrices change pre-implementation and 1 year post-implementation

18 Model Construction: Data Collection
Trained observers will document key transition points in information flow:
- Eliciting prior medical history
- Triage and treatment processes
- Disposition/discharge from the ED
Data elements:
- Activity performed
- Agent (RN, MD, Clerk, etc.)
- Start/stop times (hh:mm:ss)

19 Sample of Activity-Based Data

20 Model Construction: Activity Matrices
- Standardize raw data into activity classifications
- Calculate activity:
  - durations (stop time - start time)
  - costs (duration * agent's rate of pay)
- Assemble elements into generic workflow diagrams and baseline cost estimates
- Transfer the model to Memphis teams for validation and adaptation to local sites
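The duration and cost arithmetic described above can be sketched as follows, assuming hh:mm:ss timestamps as on the previous slide; the agent pay rates here are illustrative placeholders, not project figures:

```python
from datetime import datetime

# Illustrative hourly pay rates by agent type (placeholders, not project data)
RATES_PER_HOUR = {"RN": 35.0, "MD": 120.0, "Clerk": 18.0}

def activity_cost(agent, start, stop):
    """Return (duration in seconds, cost) for one observed activity.

    duration = stop time - start time
    cost     = duration * agent's rate of pay
    """
    fmt = "%H:%M:%S"
    duration_s = (datetime.strptime(stop, fmt)
                  - datetime.strptime(start, fmt)).total_seconds()
    cost = duration_s / 3600.0 * RATES_PER_HOUR[agent]
    return duration_s, cost

# Example: a 6.5-minute triage activity performed by an RN
dur, cost = activity_cost("RN", "10:15:00", "10:21:30")
print(dur, round(cost, 2))  # 390.0 3.79
```

Rows computed this way can then be aggregated by activity classification to build the activity matrices.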

21 Standardization of Raw Data

22 Activity-Based Estimates (Aggregate)

23 Overall Methodology
- Descriptive analysis: usability, usage, content quality
- Pre-post design: workflow change, ED administrative outcomes
- Clinical outcomes design

24 Data Sources
- Usability survey, administered to all ED clinicians at the major hospitals in the project, 1 month and 6 months after go-live
- Use-log evaluation at 1, 6, and 12 months after go-live

25 Methodology: Usability
- Site leadership cooperation
- Sampling frame = all ED clinical staff (all who access the system)
- Site manages delivery; responses sent back to the evaluation team
- Non-respondents re-surveyed directly by the evaluation team

26 Methodology: Epidemiology
- Data from SPL usage logs
- Data from SPL patient enrollment
- Data from site provider enrollment
- Downtime logs
- Help desk logs
- Other cognitive artifacts (meeting minutes, etc.)

27 Data Sources
[2x2 diagram: patients with vs. without data in the vaults, crossed with whether the record was accessed during the study vs. no RHIO record accessed, defining the groups compared on the outcome of interest]

28 Using the Vault as the Primary Data Source for Outcomes
Baseline LOS = LOS of all encounters in the vaults (before go-live)
Change in LOS = LOS of all encounters in the vault whose records were accessed vs. LOS of all encounters in the vault whose records were not accessed
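The comparison sketched on this slide can be expressed as follows; the encounter records below are made-up toy data, not project results:

```python
def mean_los(encounters):
    """Average ED length of stay (hours) over a set of encounter records."""
    return sum(e["los_hours"] for e in encounters) / len(encounters)

# Toy post-go-live vault encounters; 'accessed' marks whether the patient's
# RHIO record was retrieved during the visit (all values are made up).
post = [
    {"los_hours": 3.0, "accessed": True},
    {"los_hours": 2.5, "accessed": True},
    {"los_hours": 4.5, "accessed": False},
    {"los_hours": 5.0, "accessed": False},
]
# All vault encounters before go-live
pre = [{"los_hours": 4.0}, {"los_hours": 5.0}]

baseline_los = mean_los(pre)
accessed_los = mean_los([e for e in post if e["accessed"]])
not_accessed_los = mean_los([e for e in post if not e["accessed"]])
print(baseline_los, accessed_los, not_accessed_los)  # 4.5 2.75 4.75
```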

29 Analytic Approach
- Calculate baseline (pre) rate
- Rate of outcome in patients whose record was accessed
- Rate of outcome in all other patients in the vault
- Rate of outcome in patients NOT in the vault

30 Clinical Outcomes Methodology
- Pre-post design:
  - Easy to implement
  - Will not impact rollout or clinic flow
  - Sensitive to existing trends
[Timeline graphic: off → rollout → stable (on)]

31 Other Approaches
- Assign times of day randomly to downtime status
- Assign patients randomly to a control group (no data retrieved for them)
- Assign retrieval events randomly to control (i.e., no-result) retrievals
[Timeline graphic: off → rollout → stable]

32 Covariate Analysis
- ED (site) characteristics survey, to be completed by ED administration
- Readiness survey, to be completed by ED administration and clinical leadership

33 IRB Approach: Five Approvals
- Activity-based costing (approved)
- Usability, readiness, and demographic surveys (letters of cooperation)
- Baseline data for administrative measures and activity costing
- System content quality
- Disease-specific hypotheses

34 Thanks!

