November 7, 2014 Health IT Implementation, Usability and Safety Workgroup David Bates, chair Larry Wolf, co-chair.


1 November 7, 2014 Health IT Implementation, Usability and Safety Workgroup David Bates, chair Larry Wolf, co-chair

2 Workgroup Members

- David W. Bates, Brigham and Women’s Hospital (Chair)
- Larry Wolf, Kindred Healthcare (Co-Chair)
- Joan Ash, Oregon Health & Science University
- Janey Barnes, User-View Inc.
- John Berneike, St. Mark's Family Medicine
- Bernadette Capili, New York University
- Michelle Dougherty, American Health Information Management Association
- Paul Egerman, Software Entrepreneur
- Terry Fairbanks, Emergency Physician
- Tejal Gandhi, National Patient Safety Foundation
- George Hernandez, ICLOPS
- Robert Jarrin, Qualcomm Incorporated
- Mike Lardieri, North Shore-LIJ Health System
- Bennett Lauber, The Usability People LLC
- Alisa Ray, Certification Commission for Healthcare Information Technology
- Steven Stack, American Medical Association

Ex Officio Members
- Svetlana Lowry, National Institute of Standards and Technology
- Megan Sawchuck, Centers for Disease Control and Prevention
- Jeanie Scott, Department of Veterans Affairs
- Jon White, Agency for Healthcare Research and Quality, Health and Human Services

ONC Staff
- Ellen Makar (Lead WG Staff)

3 Meeting Schedule

Monday, September 22, :00 PM-4:00 PM Eastern Time
- Review charge
- Work to date: background/history
- Preliminary goals; discussion of deliverable

Friday, October 10, 1:00 PM-3:00 PM Eastern Time
- Presentation of usability research: MedStar and NIST

Friday, October 24, 1:00 PM-3:00 PM Eastern Time
- ECRI and TJC results of adverse event database analysis
- Usability testing
- Implementation science (field reports)
- Certification – Alicia Morton

Friday, November 7, 1:00 PM-3:00 PM Eastern Time

Friday, December 12, 1:00 PM-3:00 PM Eastern Time
- Post-implementation usability & safety, risk management & shared responsibility
- Safety Center report out
- Realignment of timeline/goals

4 Agenda

Objective: ONC Health IT Certification Program

1:00 p.m. Call to Order/Roll Call – Michelle Consolazio, Office of the National Coordinator
1:05 p.m. Context: Usability and Safety Criteria for the ONC Health IT Certification Program – Larry Wolf, co-chair
1:20 p.m. ONC Health IT Certification – Alicia Morton, Office of the National Coordinator
1:40 p.m. How Usability of EHRs and Workflow Impact Patient Safety – Alicia Morton, Office of the National Coordinator; Lana Lowry, National Institute of Standards and Technology
2:55 p.m. Public Comment
3:00 p.m. Adjourn

5 ONC Health IT Certification

2014 Edition EHR Certification Criteria on “safety-enhanced design” (using UCD processes)
- Identify what is being done
- Increased transparency based on information available through certification; see ONC’s CHPL site

ONC Authorized Certifying Body (ACB) can conduct surveillance in live environments.
- ACBs are “health oversight agencies” under HIPAA
- See ONC FAQ #45

CMS fact sheet on the “flexibility rule”: Guidance/Legislation/EHRIncentivePrograms/Downloads/CEHRT2014_FinalRule_QuickGuide.pdf

6 Safety-Enhanced Design

New: Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria: § (a)(1), (2), (6) through (8), (16), and (18) through (20) and (b)(3), (4), and (9).

Current: Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria: § (a)(1), (2), (6) through (8), and (16) and (b)(3) and (4).

7 UCD Processes in 2014 Edition R2 Electronic Health Record (EHR) Certification Criteria

§ (g)(3) Safety-enhanced design. User-centered design processes must be applied to each capability an EHR technology includes that is specified in the following certification criteria:
- § (a)(1) Computerized provider order entry
- § (a)(2) Drug-drug, drug-allergy interaction checks
- § (a)(6) Medication list
- § (a)(7) Medication allergy list
- § (a)(8) Clinical decision support
- § (a)(16) Inpatient setting only – electronic medication administration record
- § (a)(18) Optional – computerized provider order entry – medications
- § (a)(19) Optional – computerized provider order entry – laboratory
- § (a)(20) Optional – computerized provider order entry – diagnostic imaging
- § (b)(3) Electronic prescribing
- § (b)(4) Clinical information reconciliation
- § (b)(9) Optional – clinical information reconciliation and incorporation

Certification test procedure: pproved_v1.3_0.pdf

8 Quality Management System

§ (g)(4) Quality management system. For each capability that an EHR technology includes and for which that capability's certification is sought, the use of a Quality Management System (QMS) in the development, testing, implementation, and maintenance of that capability must be identified.
- (i) If a single QMS was used for applicable capabilities, it would only need to be identified once.
- (ii) If different QMS were applied to specific capabilities, each QMS applied would need to be identified. This would include the application of a QMS to some capabilities and none to others.
- (iii) If no QMS was applied to all applicable capabilities, such a response is acceptable to satisfy this certification criterion.

Test procedure: p_approvedv1.2.pdf

9 ONC Health IT Certification Program
CAPT Alicia Morton, DNP, RN-BC

10 ONC Health IT Certification Program

Workgroup information request / items for today’s discussion:
- Brief overview of the ONC Health IT Certification Program
- Current, 2014 Edition certification requirements and how they are tested
- ONC CHPL site overview and revision plans
- Surveillance program for certified products

11 Both cars meet baseline safety standards, functional conformance testing, and vehicle emissions testing. Yet those baseline standards are not the differentiator.

Safe – Useful – Usable – Satisfying

12 How Does the ONC Health IT Certification Program Work?

Regulation: ONC issues a regulation that includes certification criteria (and associated standards) for health IT products and corresponding certification program requirements.
Developers: Create health IT products that, at a minimum, meet the standards and certification criteria adopted by HHS in regulation.
ONC-ATLs: Test health IT products based on the standards and certification criteria adopted by HHS.
ONC-ACBs: Issue certifications to tested health IT products; conduct surveillance; submit product information to ONC for posting on the Certified Health IT Product List (CHPL).
Providers & Hospitals: Have assurances that products meet specific certification criteria and associated standards. In the case of the EHR Incentive Program (aka MU), they use certified health IT in a specified manner to attest/report to the program to receive an incentive and avoid a payment adjustment.

13 ONC Health IT Certification Program Structure / Process

- ONC approves the ONC-AA (Approved Accreditor), which accredits ONC-ACBs (ONC-Authorized Certification Bodies) under ISO/IEC 17065.
- NIST NVLAP (National Voluntary Laboratory Accreditation Program, NIST Handbook 150) accredits ATLs (Accredited Testing Laboratories).
- An ATL performs testing against the certification criteria; the product successfully passes testing.
- An ONC-ACB certifies the tested product; the product achieves certification.
- ONC reviews and posts the certified product to the CHPL.

14 Seven Certification Criteria Categories in 2014 Edition

Clinical: CPOE; drug-drug, drug-allergy checks; demographics; vital signs; problem list; medication list; medication allergy list; CDS; drug-formulary; smoking status; image results; family health history; patient list creation; eMAR; advance directives; electronic notes
Care Coordination: transitions of care (1); transitions of care (2); e-prescribing; clinical information reconciliation; incorporate lab results; send labs to ambulatory providers; data portability
CQM: capture & export; import & calculate; electronic reporting
Privacy & Security: authentication, access control, authorization; auditable events; audit report; amendments (HIPAA privacy); auto-log off; emergency access; end-user device encryption; integrity; accounting of disclosures (HIPAA privacy)
Patient Engagement: VDT; clinical summary; secure messaging
Public Health: immunization information; immunization transmission; syndromic surveillance transmission; reportable labs transmission; cancer information; cancer transmission
Utilization: automated MU numerator calculation; automated MU measure calculation; safety-enhanced design; quality management system

(“Base EHR” in the original table marks the criteria included in the Base EHR definition.)

15 Helpful Clarifying Terminology

2014 Edition FR: Redefining Certified EHR Technology and Related Terms
1. Certified EHR Technology (CEHRT) – SPECIFIC to the EHR Incentive Program (MU)
2. Base EHR Definition (this subsumes the term “qualified EHR”)
3. Complete EHR Definition (going away with the next rule-making, as noted in the 2014 Edition Release 2 FR; no effect on certification to the 2014 Edition)

16 ONC Health IT Certification and CMS Meaningful Use Relationship

Certified Capability in EHR + MU Measure = Demonstration of MU
- ONC adopts certification criteria that specify technical capabilities for EHR technology.
- CMS sets specific provider performance metrics related to the use of certified capabilities.
- An eligible provider (EP) reports their performance on each MU metric to CMS in order to receive an incentive or avoid a penalty.

Examples:
- An EP must e-prescribe more than 40% of their Rx’s → EHR technology is required to be able to create a standardized e-Rx file → EP attests 73%
- An EP must implement 5 CDS interventions related to 4 or more CQMs → EHR technology is required to enable multiple CDS-related functionalities → EP attests “yes”
- An EP must record problems as standardized data for more than 80% of patients → EHR technology is required to be able to record problems in SNOMED CT → EP attests 93%
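The percentage-based measures above reduce to a simple threshold comparison between what the EP attests and what CMS requires. A minimal sketch, using the slide’s example numbers; the measure names and dict layout are hypothetical, not any actual CMS or ONC schema:

```python
# Hypothetical sketch of a MU percentage-threshold check.
# Thresholds and attested values mirror the slide's examples;
# the data structure is illustrative only.
MEASURES = {
    "e_prescribing":       {"threshold": 0.40, "attested": 0.73},  # >40% of Rx's
    "problem_list_snomed": {"threshold": 0.80, "attested": 0.93},  # >80% of patients
}

def meets_measure(name: str) -> bool:
    """True when the attested rate exceeds the required threshold."""
    m = MEASURES[name]
    return m["attested"] > m["threshold"]

for name in MEASURES:
    print(name, "meets threshold" if meets_measure(name) else "below threshold")
```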

17 Certified HIT Product List

All products are listed on the CHPL. Test result transparency: the final rule requires that ONC-ACBs submit a hyperlink to the test results used to issue a certification to a Complete EHR or EHR Module. This includes information on what was tested, write-ups on the “usability” assessments performed, and more.

18 Certified Health IT Surveillance

ONC-AA performs surveillance and technical assessment of the ONC-ACBs
- According to ONC program requirements and standards governing certification bodies

ONC-ACBs perform surveillance of certified health IT products
- Proactive and reactive
  - Proactive: ONC priority areas of exchange, safety, security, and population management (quality measurement)
  - Reactive: product complaints, large numbers of inherited certified status requests, etc.
- ONC surveillance guidance / priority areas:
  - First: http://www.healthit.gov/sites/default/files/onc-acb_2013annualsurveillanceguidance_final_0.pdf (report for CY 2014 due to ONC late Feb 2015)
  - Second: http://www.healthit.gov/sites/default/files/onc-acb_cy15annualsurveillanceguidance.pdf (plans for CY 2015 just received and under review by ONC)

NIST NVLAP performs surveillance of the ONC-ATLs

19 Questions / Discussion / Contacts

ONC Health IT Certification Program:
Certified Health IT Product List:
ONC Certification team mailbox:

20 Lana Lowry, NIST HIT Usability Project Lead

21 EHR Usability Protocol (EUP) – a Core Validation Tool

- Only safety-related usability must be evaluated.
- Helps vendors, hospitals, and other stakeholders ensure that EHR use errors are minimized.
- Provides technical guidance for summative usability evaluations prior to deployment or implementation of an EHR.
- The summative usability evaluation is meant to be independent from factors that engender creativity, innovation, or competitive features of the system.
- Examples of safety-related usability issues reported by healthcare workers include poorly designed EHR screens that slow down the user and can endanger patients; warning and error messages that are confusing and often conflicting; and alert fatigue (both visual and audio) from too many messages, leading users to ignore potentially critical messages.

22 Focus of the EUP

- Clearly distinguishes between usability aspects that pertain to user satisfaction and usability features that impact clinical safety.
- A limited set of critical usability aspects pertaining to clinical safety must be embedded into the system and required as core functionality, not a competitive feature; a “barrier to entry” to the marketplace on safety is an expected outcome.
- Typical measures for clinical safety are adverse events (wrong patient, wrong treatment, wrong medication, delay of treatment, unintended treatment).
- Accepted usability/safety standards should be considered industry-standard practices. Any company is free to go above and beyond the basic standard; however, the minimum standards for usability in safety-enhanced design should be established and articulated to address patient safety.

23 EUP Key Areas

EUP is a model for understanding the relationship between usability issues and patient safety outcomes through:
- Step I. Usability Application Analysis, led by the development team, which identifies the characteristics of the system’s anticipated users, use environments, scenarios of use, and use-related usability risks that may induce medical errors.
- Step II. Expert Review/Analysis of the EHR Application, an independent evaluation of the critical components of the user interface in the context of execution of various use-case scenarios and usability principles.
- Step III. Usability Testing, involving creation of a test plan and then conducting a test that assesses usability for the given EHR application, including use efficiency and the presence of features that may induce potential medical errors.

24 Three-step process for design evaluation and human user performance testing for the EHR

25 Objective of the EUP

- EUP emphasis is on ensuring that necessary and sufficient usability validation and remediation have been conducted so that (a) use error is minimized and (b) use efficiency is maximized.
- The EUP focuses on identifying and minimizing critical usability issues.
- The intent of the EUP is to validate that the application’s user interface is free from critical usability issues and supports error-free user interaction.

26 Goal of the EUP

- Elimination of “never events”
- Identification and mitigation of critical use errors
- Identification of areas for potential UI improvement; recording of user acceptance/satisfaction
- Reporting of summative testing results in CIF (Common Industry Format)

27 Relationship between EUP and NIST UCD Guidelines

28 EUP Is a Key Component of UCD

UCD                                  | EUP
User Needs, Workflows & Environments | Application Analysis
Engage Users                         | Expert Review/Analysis of EHRs
Set Performance Objectives           | Application Analysis
Design                               | Application Analysis
Test and Evaluate                    | Usability Testing

29 How to Measure Performance?

- Performance is examined by collecting user performance data that are relevant indicators of the presence of safety risks.
- These measures may include, but are not limited to, objective measures of successful task completion, the number of errors and corrected errors, performance difficulties, and failures to complete the task successfully or in the proper sequence.
- Performance is also evaluated by conducting post-test interviews focused on what users identify as risks, based on confusion or misunderstanding when carrying out directed scenarios of use.
- The goal of the validation test is to make sure that critical interface design issues are not causing patient safety-related use error; in other words, that the application’s user interface supports error-free user interaction.
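As a concrete illustration, the objective measures named above (successful task completion, errors, corrected errors) can be tallied from per-task observations. A minimal sketch with made-up task records; the task names and field names are assumptions, not part of the EUP:

```python
# Illustrative tally of summative usability measures from per-task
# observations. Records and field names are hypothetical.
tasks = [
    {"task": "order_medication", "completed": True,  "errors": 1, "corrected": 1},
    {"task": "review_labs",      "completed": True,  "errors": 0, "corrected": 0},
    {"task": "reconcile_meds",   "completed": False, "errors": 2, "corrected": 1},
]

# Share of tasks completed successfully across the session.
completion_rate = sum(t["completed"] for t in tasks) / len(tasks)
# Total use errors, and those the user never corrected.
total_errors = sum(t["errors"] for t in tasks)
uncorrected_errors = sum(t["errors"] - t["corrected"] for t in tasks)

print(f"task completion rate: {completion_rate:.0%}")  # 67%
print(f"use errors observed: {total_errors}")          # 3
print(f"uncorrected errors: {uncorrected_errors}")     # 1
```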

30 Theoretical Example of EUP Scenario

31 Theoretical Example of EUP Scenario

32 Theoretical Example of EUP Findings

N     | Use Error in Usability Test                                                                        | Severity Rating
6/18  | When reviewing a patient chart, the clinician does not detect that new critical lab results are available | 2
17/18 | User copies the wrong (older) data from one note into the current note being written               | 3
1/18  | When mcg/kg/min is ordered, the user incorrectly selects units of mcg/hr and accepts the wrong dose | 4
15/18 | A medication schedule is changed; the user fails to select the “refresh” button in the menu and does not comply with the new medication schedule | 2

Severity scale is 1 (least) to 4 (most).
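One way to read findings like these is to weight each use error’s observed frequency by its severity rating when prioritizing remediation. The frequency-times-severity score below is an illustrative convention of mine, not part of the NIST protocol:

```python
# Rank the slide's hypothetical findings by frequency x severity.
# The scoring rule is an illustrative convention, not from the EUP.
findings = [
    ("missed new critical lab results",           6, 18, 2),
    ("copied stale data into current note",      17, 18, 3),
    ("wrong dose units (mcg/hr vs mcg/kg/min)",   1, 18, 4),
    ("missed medication schedule refresh",       15, 18, 2),
]

def risk_score(observed: int, n: int, severity: int) -> float:
    """Fraction of participants who made the error, weighted by severity (1-4)."""
    return (observed / n) * severity

ranked = sorted(findings, key=lambda f: risk_score(f[1], f[2], f[3]), reverse=True)
for desc, obs, n, sev in ranked:
    print(f"{risk_score(obs, n, sev):.2f}  {obs}/{n}  severity {sev}  {desc}")
```

Note that under this weighting the frequent-but-moderate copy/paste error outranks the rare-but-severe dosing error; a real program would also set absolute severity gates rather than rely on a single score.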

33 Relating Usability and Patient Safety

34 Example: Wrong Patient Record Open
(screens shown: EHR: Patient A; EHR: Patient B; Imaging: Patient A)

35 Example: Wrong Mode for Action
(modes shown: Direct Dose Mode (mcg/min); Weight Dose Mode (mcg/kg/min))

36 Example: Incomplete Data Displayed
(truncated label shown: “Lidocaine Hydrochlor”)

37 Example: Data Not Readily Available
(value shown: 80 mg)

38 Definitions

- Usability: How useful, usable, and satisfying a system is for the intended users to accomplish goals in the work domain by performing certain sequences of tasks.
- Workflow: A set of tasks – grouped chronologically into processes – and the set of people or resources needed for those tasks that are necessary to accomplish a goal.
- Workaround: Actions that do not follow explicit rules, assumptions, workflow regulations, or intentions of system designers.

39 Relating Workflow and Patient Safety

- Poorly supported work processes → suboptimal nonstandard care, poor decision support, dropped tasks
- Missed information → delays in diagnosis, missed/redundant treatment, wrong patient
- Inefficient clinical documentation → copy/paste, “smart text”, templates, scribes
- Provider dissatisfaction → workarounds, slower adoption rates in specialty areas
- High rates of false alarms → ignored alarms, alerts, reminders

40 Methods: Modeling with SMEs

- Ambulatory care physicians; collegial discussions
- Interdisciplinary team meetings – human factors, informatics, physicians
- Process maps
- Goal-means decomposition diagram
- Insights for moving toward a “patient visit management system”

41 Workflow “Buckets” in Ambulatory Care

- Before patient visit
- During patient visit
- Physician encounter
- Discharge
- Visit documentation

42 Before Patient Visit

- Balance workload
- Decision: does the patient have significant complexity? (yes/no)
- Clinical overview and review of new findings/labs
- Review prior history and physical

46 Before Patient Visit → During Patient Visit → Physician Encounter → Discharge → Visit Documentation

47 Recommendations for EHR Developers

- Increase efficiency:
  - Reviewing results with the patient
  - Drafting pre-populated orders to be formally executed later
  - Supporting drafting of documentation with shorthand notations, without a keyboard
- Design for empathetic body positioning/eye contact
- Support dropping tasks and delaying task completion
- Verify alarms, alerts, and data entry without “hard stops”

48 Recommendations for Ambulatory Care

- Moderate organizational design flexibility
- Design the room to support patient rapport and EHR access
- Minimize redundant data entry via interoperability
- Reduce clinic pace or increase flexibility of pace
- Ensure functionality that supports continuity in task performance in the case of interruption
- Relax requirements to enter detailed data for others during fast-paced patient visits

49 Next Meeting: Friday, December 12, 1:00 PM-3:00 PM Eastern Time


