
1 Open Source Technical Support and Working Group Services for VA VistA
Contract Number: VA118-16-C-0841
April 21, 2016
SLIN 0002AD – Open Source Software and Product Selection Criteria
Initial Submission

2 Contents
Overview
Approach
Product Selection Criteria and Tool
Success Factors and Challenges
Next Steps
Note: Selection Criteria and Scoring Tool v1.0 (provided separately in Excel format)

3 Open Source Software and Product Selection Criteria Analysis Overview

4 Assess Open Source Product Candidates for VA VistA Intake
Conduct “Discovery” activities, performing research and analysis to identify open source EHR products, code, and toolsets that align to, or would further enhance/expand upon, the feature set requirements as defined in the VistA 4 Product Roadmap.
Support the alignment of open source products and VistA needs:
–Produce a Gap Analysis of the priority features and functions required to make progress toward VA’s VistA vision, with primary emphasis on how that vision is elaborated in the Feature Set delivery schedule per the VistA 4 Product Roadmap.
–Subsequently, overlay the findings of the Gap Analysis and SWOT Analysis to document detailed Open Source Software and Product Selection Criteria.
–The Contractor shall utilize the VA open source software selection criteria to measure the degree to which open source candidates may fulfill the capability gaps.

5 Open Source Software and Product Selection
SOW Requirement (section 5.2.1) | Slide Numbers
1. Consolidates and prioritizes with VA the functional, technical, and performance attributes of VistA Feature Set or non-VistA Feature Set variables for further investigation | 13-14
2. Documents the constraints and assumptions or “boundary conditions” which define imposed limitations that can be physical or programmatic (e.g., specifying the latest acceptable initial operational capability (IOC) date illustrates a programmatic constraint) | 15
3. Elaborates capability gaps identified in the respective BRDs and RSDs | 16, 19
4. Elaborates the extent to which the code has been vetted and tested by the open source community, and the extent to which that code may have been previously certified via automated testing and peer review which has verified the safety, compliance, and functionality of the code both prior to and after new code submissions | 17
5. Assigns a quantitative metric by which to measure open source product attributes against functional, technical, capacity, performance, interoperability, and security requirements criteria. Additionally, the Contractor’s assessment shall include implementation criteria by which to assess the ease of integrating the open source code in the corresponding VA VistA application and with the application’s internal VA VistA interfaces. | 20-22

6 Approach

7 Approach
Identify the initial set of open source criteria
Add identified Feature Set 3 gaps as criteria for the initial quarters
Develop the Selection Criteria and Scoring Tool
Plan to mature the content of the selection criteria over the next several quarters
Define and mature the relationships of other Capabilities Based Assessments (CBA) content to the selection criteria

8 Approach Overview

9 Selection Criteria – Quarterly Maturation Plan
Q1 (current)
–Feature Set 3 gaps included based on Q1 gap analysis
–Best practice criteria included
–Scoring tool developed based on criteria; will be used to evaluate the next set of relevant open source software candidates
Q2
–Include additional Feature Set 3 gaps based on Q2 gap analysis
–Add initial set of security selection criteria per TWG and related discussions
–Refine best practice criteria, scoring, and weighting based on feedback
–Incorporate stakeholder perspectives from interviews conducted during Q2
Q3+
–Incorporate new gaps as they are identified in the gap analysis
–Include additional security criteria
–Continue to refine best practice criteria, scoring, and weighting
–Continue to mature the product selection criteria

10 Integration with Work Products
The Open Source Software (OSS) and Product Selection Criteria will incorporate additional variances as they are identified through subsequent Gap Analyses.
The Product Selection Criteria, together with the Selection Criteria and Scoring Tool, will be used to screen OSS candidates for SWOT analysis.
The selection criteria will be iterated in conjunction with the Gap Analysis findings and the information gathered, in order to screen for the SWOT candidates with the greatest potential positive impact.

11 Product Selection Criteria and Tool

12 Product Selection Criteria Overview
Selection criteria developed for multiple areas
Criteria cover the full breadth of relevant elements
–Include VA-specific elements and gaps

13 Selection Criteria Areas
Programmatic Constraints & Boundary Conditions
Functional Fit / Capability Gaps
Technical, Capacity, Performance, and Interoperability
Implementation Risks
Specific VistA Gaps to be Filled
Security
–Weighting TBD; will be set as the criteria mature

14 Specific Criteria Applied to Each Area
Each criterion supports selection against functional, technical, and performance attributes
Specific VistA / VA criteria drawn from the Gap Analysis and newly emerging information from VA
Criteria phrased for consistent scoring

15 Programmatic Constraints & Boundary Conditions Criteria
Fits with Roadmap plans (timing)
No significant physical, logistical, or other constraints
No additional open source version improvements likely; timing of intake is good (vs. improvements by others anticipated, or too early to use)
Speeds substantive time-to-value for VA in the area
Complies with mandates relevant to implementation

16 Functional Fit / Capability Gaps
Fills Implementation Gaps
–Capability gaps identified in BRDs and RSDs
Fills Vision Gaps
–Capability gaps identified by comparing implementation plans against the broad VE vision
Measurably improves healthcare delivery and/or access
Software can perform business functions at a high level of quality and reliability
Software’s interface is user friendly

17 Technical, Capacity, Performance, and Interoperability
Application is interoperable and integrates well with the VistA architecture
Data and data exchange are interoperable
High level of code quality and reliability (certified)
High capacity and scalability
High quality of software documentation
Minimal-to-no infrastructure changes required
Software is rapidly responsive to users (speed of performance)
Minimal-to-no software modifications required
Software is easily maintainable – technical and business rules
Software has minimal-to-no operational support requirements
No licensing or copyright issues, such as license mismatch

18 Implementation Risks
Low level of business risk for implementation of new processes and cultural change
Low level of software technical integration and complexity risk
Impact and rollout risks are very low
Implementation cost is low

19 Specific VistA Gaps to be Filled
Scheduling risks include development of standardized information sharing for scheduling data exchange, both internal and external to the VHA
Ability to use population-level data to assess quality of care at the institutional protocol level (e.g., how well is one care team doing versus another with their pool of patients)
EHR with analytics, cloud, and patient experience capabilities
–VA CIO LaVerne Council, Congressional Testimony, April 14, 2016: http://www.healthcareitnews.com/news/cio-laverne-council-says-va-needs-new-ehr-analytics-cloud-patient-experience-capabilities

20 Product Selection Tool
The Product Selection Tool provides quantitative metrics by which to measure open source product attributes
–Provides a score for each relevant open source software candidate
–Assesses each candidate against the criteria across all areas
–Developed in Excel; specific criteria are weighted to emphasize the most important measures, balance criteria across areas, and provide a “tuning” capability for area and criterion emphasis (a sketch of the calculation follows slide 22)

21 Product Selection Tool Weighting Criteria
Areas are weighted evenly (20% each for the Q1 version)
–Security is currently not weighted since its criteria are TBD
–Weights can be adjusted later based on prioritization
Weighting Scale:
2 - Overweight
1 - Neutral
0.5 - Underweight

22 Product Selection Tool Criteria Scoring System
Criteria Scoring Scale:
1.0 - Candidate fully satisfies the business requirement or decision criterion.
0.5 - Candidate partially satisfies the business requirement or decision criterion.
0.0 - Unknown or null/balanced (the candidate neither satisfies nor dissatisfies the business requirement or decision criterion).
-0.5 - Candidate partially dissatisfies the business requirement or decision criterion.
-1.0 - Candidate fully dissatisfies the business requirement or decision criterion.
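To make slides 20-22 concrete, below is a minimal sketch of the scoring arithmetic, assuming the tool computes a weighted average of criterion scores within each area and then an area-weighted sum across areas. The deck does not spell out the tool's exact Excel formulas, and the area and criterion entries in the example are illustrative, not the tool's actual contents.

```python
# Minimal sketch of the weighted scoring on slides 20-22 (assumed model):
# criterion scores (-1.0 to 1.0) combine as a weighted average within
# each area (weights 2 / 1 / 0.5), then areas combine with their
# percentage weights (20% each in the Q1 version).

def area_score(criteria):
    """criteria: list of (score, weight) pairs.
    score is one of -1.0, -0.5, 0.0, 0.5, 1.0;
    weight is 2.0 (overweight), 1.0 (neutral), or 0.5 (underweight)."""
    total_weight = sum(weight for _, weight in criteria)
    return sum(score * weight for score, weight in criteria) / total_weight

def candidate_score(areas):
    """areas: dict of area name -> (area_weight, criteria list).
    Area weights should sum to 1.0; Security is omitted until its
    criteria are defined."""
    return sum(area_weight * area_score(criteria)
               for area_weight, criteria in areas.values())

# Illustrative candidate scored on two of the five weighted areas.
example = {
    "Functional Fit / Capability Gaps": (0.2, [(1.0, 2.0), (0.5, 1.0)]),
    "Implementation Risks": (0.2, [(-0.5, 1.0), (0.0, 0.5)]),
}
print(f"Weighted score: {candidate_score(example):.3f}")  # prints 0.100
```

Under this assumed model, overweighting a criterion (weight 2) pulls the area average toward that criterion's score, which matches the tool's stated purpose of emphasizing the most important measures.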

23 Success Factors and Challenges

24 Success Factors
Focus on business value
–Identify functional, business, and technical selection criteria to focus efforts on analysis of open source candidates that meet VA criteria
Use a flexible approach to content and document development which accommodates initial development of the selection criteria, with a longer-term plan for maturation based on usefulness and feedback from VA and the community:
–Most selection criteria will be stable over time, and weighting may evolve
–Criteria regarding specific gaps will evolve
–Considerations around security criteria are emerging

25 Challenges
There are many documents describing aspects of VistA Evolution (VE). These documents give rise to issues such as:
–Content overlap with varying degrees of currency
–Assorted elements (KPIs, metrics, etc.) describing aspects of the VE target
–No prior VE open source selection criteria to work from or use as templates
–VE plans and implementations are continuously evolving, and document updates lag the changes
Alignment of the selection criteria with specific gaps will be continuously shifting
–Criteria around VistA gaps and open source software security will need to adjust as these issues are discussed and identified

26 Next Steps

27 Next Steps
Use the selection prioritization criteria and the Selection Criteria and Scoring Tool to assess open source candidates for Q2
Mature the selection prioritization criteria for Q2
–Expand the criteria for security and Feature Set 3
–Adjust the criteria weighting and scoring if needed, based on experience using the criteria for Q2 candidates
–Incorporate feedback from this version
–Integrate with other work products
Incorporate OSEHRA community input

