XLC Gate Review Consolidated Slide Deck

1 XLC Gate Review Consolidated Slide Deck
Centers for Medicare & Medicaid Services eXpedited Life Cycle (XLC)
XLC Gate Review Consolidated Slide Deck
BlueButton On FHIR [Clarity ID:]
Phase: Architecture Review

Slide Deck Overview: The Expedited Life Cycle (XLC) Gate Review Consolidated Slide Deck encompasses the following XLC stage gate reviews:
- Architecture Review (AR)
- Preliminary Design Review (PDR)
- Detailed Design Review (DDR)
- Operational Readiness Review (ORR)
- Post Implementation Review (PIR)
- Disposition Review (DR)

Use the deck for the entire life of the project. The slide decks are consolidated into a single deck to integrate the information necessary to meet the needs of the stage gate reviews. The project team should always review the story and make updates as needed. Sections should not be removed from the presentation; if a section is not applicable, please indicate as such and provide an explanation. Additional slides may be added to convey information that you feel is important to share that is not addressed by this template. Please ensure that your presentation is Section 508 compliant by following this URL: Each slide in the deck has individual instructions; please see each slide's notes pages for more information about how to complete each slide. Questions and comments can be directed to the CMS Division of Information Technology Governance at

Slide Instructions: Enter the following information on the slide:
- Name of the project
- Clarity ID. If you do not have a Clarity ID, please enter "No Clarity ID" and one will be assigned by the Division of IT Governance. The Clarity ID will be used for identification and tracking purposes only.
- Target implementation date

2 Introductory Slides Slide Instructions:
All of the slides contained in this section (Introductory Slides) must be reviewed and updated (as needed) prior to every gate review throughout the life of your project.

3 Project Slide Deck Revision History
Introductory Slides

Version | Date    | Organization or Contributors | Description of Changes
1.0     | 05/2015 | OEDA                         | Architecture Review (AR)
2.0     | XX/XXXX |                              | Preliminary Design Review (PDR)
3.0     |         |                              | Detailed Design Review (DDR)
4.0     |         |                              | Operational Readiness Review (ORR)
5.0     |         |                              | Post Implementation Review (PIR)
6.0     |         |                              | Disposition Review (DR)

Slide Instructions: This slide should be reviewed and modified before every TRB consult and/or XLC stage gate review. This slide captures the revision history for the project's slide deck. Use this page whenever major changes are made to the slide deck. The text shown in the table is for example purposes only and may be modified.

4 System Summary Project Summary CMS BlueButton on FHIR (BBonFHIR)
Introductory Slides

CMS BlueButton on FHIR (BBonFHIR)

The current BlueButton web platform at CMS has been used by more than one million beneficiaries. It is a valuable beneficiary service, but it leaves the beneficiary to do the "heavy lifting" of reconciling and using the downloaded data. BBonFHIR implements an upgraded data service, enabling beneficiaries to connect their MyMedicare.gov data to the applications and services they trust, including research platforms such as PCORnet. This is a key tenet of the Precision Medicine Initiative.

BBonFHIR establishes structured data formats for BlueButton data and develops and implements secure Application Programming Interface (API) services that allow beneficiaries to automate the connection of their data to third-party applications, such as PCORI research tools.

BBonFHIR is based on open-source technologies. Code and file formats will be in the public domain, encouraging wider adoption across healthcare. The data service uses the public domain HL7 FHIR framework. This benefits application developers by creating standard integration design patterns, thereby speeding and simplifying implementation.

BBonFHIR directly benefits PCORI and other research entities while also creating an integration model for the industry, with structured data formats and standard interfaces making it simpler for beneficiaries to automate linking their data to research studies.

BBonFHIR benefits CMS: by creating an easily consumable data API, beneficiary requests for their health and claims information can be satisfied (the source of more than 80% of FOI requests). The API interface also enables more granular control of data use by third parties than is possible on the current platform.

Slide Instructions: This slide shouldn't change from one stage gate review to the next unless the scope of the project has changed. If the project's scope changes, please highlight the changes/additions on this slide.
The information on this slide should come from Section 3 of the Business Case, the Project Charter, or the Intake Form (or other source if this information makes it clearer). Here are some examples of the type of information that can be included on this slide; please feel free to include additional information, as needed.
- Business Need: Insert condensed business need and drivers from the Business Case. This is the description of how the Business Case is satisfied.
- Goals/Scope/Purpose: Insert condensed goals/scope from the Business Case.
For Disposition Review Only: For the Disposition Review (DR), describe the conditions supporting disposition of the system. For the Disposition Review, this slide should be updated to include specifics of the disposition effort. It should describe the problem to be solved and why the system is no longer necessary. Describe why disposition is called for (have requirements changed, is the need no longer valid, does technology improvement necessitate a migration, is the current implementation too expensive). This could be information for Lessons Learned to be applied to future projects.

5 High Level Business Concept Graphic
Introductory Slides

[High-level business concept graphic. Beneficiary enablement flow: Educate (what new features mean; taking responsibility for their data sharing; awareness of risks; managing the connections) -> Authenticate (identity verification; SLS verification) -> Acknowledge (agree to data sharing terms and responsibilities) -> beneficiary enables the advanced BlueButton API (original BlueButton unchanged). Third-party connection flow: Authenticate (beneficiary authenticates to CMS account) -> Authorize (beneficiary authorizes data sharing with the 3rd-party app on a per-app basis) -> Connect (the 3rd-party app performs unattended data requests on the beneficiary's behalf via the advanced BB API).]

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Insert a graphic providing a high-level overview of the business process.
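The authorize/connect flow described above follows the standard OAuth 2.0 authorization-code pattern: the beneficiary approves the app once, and the app then exchanges a one-time code for a token it can use for unattended requests. As a sketch only (the endpoint URLs, scope string, and parameter names below are illustrative placeholders, not the actual BBonFHIR endpoints), the two requests a third-party app would construct might look like:

```python
from urllib.parse import urlencode

# Illustrative endpoints only; the real BBonFHIR URLs are not specified in this deck.
AUTHORIZE_URL = "https://bbonfhir.example.gov/oauth/authorize"
TOKEN_URL = "https://bbonfhir.example.gov/oauth/token"

def build_authorize_url(client_id, redirect_uri, scope="patient/*.read", state="xyz"):
    """Step 1: the 3rd-party app sends the beneficiary to CMS to log in
    and approve data sharing (authorization is granted per app)."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,
    }
    return AUTHORIZE_URL + "?" + urlencode(params)

def build_token_request(client_id, client_secret, code, redirect_uri):
    """Step 2: after approval, the app exchanges the one-time code for an
    access token it can use for unattended data requests on the
    beneficiary's behalf (POSTed to TOKEN_URL)."""
    return {
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "client_id": client_id,
        "client_secret": client_secret,
    }
```

Because the beneficiary authorizes each app individually, revoking one app's token does not affect any other connection, which is the "per app basis" control the slide describes.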

6 Presenters
Introductory Slides

Gate Review         | Presenter's Name | Role and/or Organization
Architecture Review | Mark Scrimshire  | CMS Entrepreneur-in-Residence
Architecture Review | Niall Brennan    | CMS Chief Data Officer (OEDA)

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. Since presenters may change from one stage gate review to the next, please check this slide for each review and make any necessary changes. Identify the presenter(s) for each gate review, indicating each individual's role and/or organization. Please note that the Business Owner must be present.

7 Key Project Resources
Introductory Slides

Project Role                            | Name & Organization   | Project Responsibilities
Business Owner                          | Niall Brennan, OEDA   |
Project Sponsor                         |                       |
Project Manager                         | Mark Scrimshire, OEDA |
Architect                               |                       |
COR                                     | TBD                   | Program Coordination
System Maintainer                       |                       |
Application Development Organization(s) | OEDA / OC / OTS       |
Other Contracting Organizations         |                       |
Information System Security Officer     |                       |
Hosting Data Center                     |                       |

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If key resources have changed, this slide needs to be updated, as does the project charter. Insert the key resources as identified in Section 4.1 of the Project Charter. At a very minimum, this should include the business owner, project manager, COR, and contractors.

8 Project Stakeholders
Introductory Slides

Stakeholders:
- Beneficiaries: Ability to connect their CMS data with the applications and services that they trust
- OEDA: Fewer data requests
- OC: Enhanced ability to manage bad actors attempting to access CMS beneficiary data
- 3rd Party Developers: Easier access to CMS data via beneficiaries that provide permission
- Research Organizations (e.g., PCORI): Easier access to CMS data for beneficiaries engaged in research programs

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Project Stakeholders: Insert stakeholder information from Section 3.3 of the Business Case.

9 Business Risks & Impacts
Introductory Slides

- Initial Costs: OEDA has submitted a funding application to the PCOR Trust Fund. If successful, the funding would help to defray the integration, testing, and deployment costs involved in building and deploying the BBonFHIR platform.
- Project Management: BBonFHIR is currently seeking a qualified Project Manager to coordinate activities across the stakeholder groups involved with the project.
- Project Resources: Integration with the existing source systems for BlueButton data (IDR) and beneficiary-facing services (MyMedicare.gov, NGD) will require resource commitments from specialists and contractors who are outside the core BBonFHIR project team.

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Business Risks/Issues: Insert condensed business risks/issues from Section 3.4 of the Business Case.

10 Project Complexity Determination and PPA
Introductory Slides

Project Characteristic (Your Project's Level: 1, 2, or 3)
- Shared Services Implications: 3
- Program / Business Process Profile (with Design / Development Implications):
- Privacy Implications:
- Security Implications (based on Information Type [1] processed, accessed, stored, or transmitted): 2
- Data Complexity (ties to data's financial implications): 1
- Interface Complexity:

Slide Instructions: This slide should be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Update this slide whenever the PPA is updated. Complexity Level: Indicate your project's complexity level by using the Complexity Worksheet tab in the Project Process Agreement (PPA). Provide a copy of your Project Process Agreement.

11 Business Performance Goals and Measures
Introductory Slides

Table columns: Measurement Area | Measurement Category | Measurement Group | Measurement Indicator | Target | Surveillance Method | Benchmark (if applicable) | Performance Test Results
Measurement Areas: Mission and Business Results; Customer Results; Process and Activities

The information on this slide is to be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review, especially prior to the AR and PIR. Note: There may be multiple measurement categories, measurement groups, and/or indicators for each measurement area. You may provide a separate spreadsheet containing this information, if desired.

Slide Instructions: Using the table shown on this slide, provide the Business Performance Goals and Measures that will be used to assess the degree of success for this project. The Measurement Area, Measurement Category, and Measurement Indicator should be based upon the Federal Enterprise Architecture (FEA) Performance Reference Model (PRM), more information for which can be found within the FEA Consolidated Reference Model (CRM) document located here: Please reference the FEA CRM document as an aid to help establish and document performance goals and measures.

In preparation for the Architecture Review (AR) or the initial TRB consult, populate these table columns:
- Measurement Area ([1] Mission and Business Results; [2] Customer Results; [3] Process and Activities)
- Measurement Category
- Measurement Group
- Measurement Indicator
- Target
- Surveillance Method
- Benchmark (if applicable)

In preparation for the Post Implementation Review (PIR), populate these table fields:
- Update information previously entered (as needed)
- Enter Performance Test Results for each Measurement Indicator

Definitions (Note: Get more information about Measurement Areas, Categories, Groups, and Indicators from ): The Business Area Performance Measurement Categories are listed and defined below, along with their associated groups.
- Financial Category: Achieving financial measures, direct and indirect total and per-unit costs of producing products and services, and costs saved or avoided. Measurement Groups: Financial Management Group, Costs Group, Planning Group, Savings and Cost Avoidance Group.
- Productivity Category: The amount of work accomplished per relevant units of time and resources applied. Measurement Groups: Productivity Group, Efficiency Group.
- Cycle Time and Timeliness Category: The time required to produce products or services. Measurement Groups: Cycle Time Group, Timeliness Group.
- Quality Category: Error rates and complaints related to products or services. Measurement Groups: Errors Group, Complaints Group.
- Security and Privacy Category: The extent to which security is improved and privacy addressed. Measurement Groups: Security Group, Privacy Group.
- Management and Innovation Category: Management policies and procedures, compliance with applicable requirements, capabilities in risk mitigation, knowledge management, and continuous improvement. Measurement Groups: Participation Group, Policies Group, Compliance Group, Risk Group, Knowledge Management Group, Innovation and Improvement Group.

Measurement Indicator: A defined indicator that provides a way to measure performance. Note: Project teams should define measurement indicators based on the stated business need. Examples of Business Area Performance Measurement Indicators (Process and Activities) are as follows: Percentage of projects done on time (under the Financial Category and the Planning Grouping); Number of security incidents (under the Security and Privacy Category and the Security Grouping); or Number of risks identified (under the Management and Innovation Category and the Risk Grouping).
Target: The parameters that comprise the Measurement Indicator that may be documented as a performance requirement or derived from a high-level requirement.
Surveillance Method: Indicate how this measurement will be recorded. E.g., Survey, Audit, Report, Periodic Evaluation, etc.
Benchmark: The Proof of Concept or test results for the measurement indicator.
Performance Test Results: The results of the performance tests.

12 Technical Performance Goals and Measures
Introductory Slides

Table columns: Technology Component | Measurement Category | Measurement Group | Measurement Indicator | Target | Surveillance Method | Benchmark (if applicable) | Performance Test Results

The information on this slide is to be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review, especially prior to the PDR, DDR, and ORR. Note: There may be multiple measurement categories, measurement groups, and/or indicators for each measurement area. You may provide a separate spreadsheet containing this information, if desired.

Slide Instructions: Using the table shown on this slide, provide the Technical Performance Goals and Measures that will be used to assess the degree of success for this project. The Measurement Area, Measurement Category, and Measurement Indicator should be based upon the Federal Enterprise Architecture (FEA) Performance Reference Model (PRM), more information for which can be found within the FEA Consolidated Reference Model (CRM) document located here: Please reference the FEA CRM document as an aid to help establish and document performance goals and measures.

In preparation for the Preliminary Design Review (PDR) and the Detailed Design Review (DDR), populate these table columns:
- Technology Component (a component or service to be measured for performance)
- Measurement Category
- Measurement Group
- Measurement Indicator
- Target
- Surveillance Method
- Benchmark (if applicable)

In preparation for the Operational Readiness Review (ORR), populate these table fields:
- Update information previously entered (as needed)
- Enter Performance Test Results for each Measurement Indicator

Definitions (Note: Get more information about Measurement Areas, Categories, Groups, and Indicators from ): The Technology Area Performance Measurement Categories are listed and defined below, along with their associated groups. Please note these are NOT exhaustive lists.
- Technology Costs Category: Technology-related costs and costs avoided through reducing or eliminating IT redundancies. Measurement Groups: Overall Costs, Licensing Costs, Support Costs, Operations and Maintenance Costs, and Training and User Costs.
- Quality Assurance Category: The extent to which technology satisfies functionality or capability requirements or best practices, and complies with standards. Measurement Groups: Functionality, IT Composition, and Standards Compliance and Deviations.
- Efficiency Category: System or application performance in terms of response time, interoperability, user accessibility, and improvement in technical capabilities or characteristics. Measurement Groups: System Response Time, Interoperability, Accessibility, Load Levels, and Technology Improvement.
- Information and Data Category: Data or information sharing, standardization, reliability and quality, and storage capacity. Measurement Groups: External Data Sharing, Data Standardization or Tagging, Internal Data Sharing, Data Reliability and Quality, and Data Storage.
- Reliability and Availability Category: System or application capacity, availability to users, and system or application failures. Measurement Groups: Availability and Reliability.
- Effectiveness Category: Extent to which users are satisfied with the relevant application or system, whether it meets user requirements, and its impact on the performance of the process(es) it enables and the customer or mission results to which it contributes. Measurement Groups: User Satisfaction, User Requirements, IT Contribution to Process, Customer or Mission.

Measurement Indicator: A defined indicator that provides a way to measure performance. Note: Project teams should define measurement indicators based on the stated business need. Examples of Technology Area Performance Measurement Indicators are as follows: Maximum Transaction Response Time (under the Efficiency Measurement Category and the System Response Time Measurement Grouping); Maximum Payload Size (under the Efficiency Measurement Category and the Load Levels Measurement Grouping); or Mean Time Between Failures (MTBF) (under the Reliability and Availability Measurement Category and the Reliability Measurement Grouping).
Target: The parameters that comprise the Measurement Indicator that may be documented as a performance requirement or derived from a high-level requirement.
Surveillance Method: Indicate how this measurement will be recorded. E.g., Survey, Audit, Report, Periodic Evaluation, etc.
Benchmark: The Proof of Concept or test results for the measurement indicator.
Performance Test Results: The results of the performance tests.

13 Project Management Update: Milestones
Introductory Slides

Milestone                                                 | Target Date
Prototype design proposal completed                       | 11/15
Implementation of internal test platform                  | 01/16
Build out of BBonFHIR production service for pilot launch | 05/16
BBonFHIR pilot launch                                     | 09/16
Launch of full production of BBonFHIR service             | 01/17
Update data formats and software code for publication     | 03/17

Slide Instructions: The information on this slide is to be reviewed and modified before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Insert high level milestones from Section 7 of the Project Charter or from the project schedule.

14 Project Management Update: Status
Introductory Slides

# | Task                               | Status
1 | Review current BlueButton download | Complete
2 | Evaluate FHIR Code base            |
3 | Develop high level design          | In Progress
4 | Architecture Review                |

Slide Instructions: The information on this slide is to be reviewed and modified before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Provide an update on the overall health of the project. A dashboard may be appropriate for this slide.

15 Project Management Update: Risks & Issues
Introductory Slides

Support is needed from OTS & OC for detailed design work:
- Integration with MyMedicare.gov
- Integration with api.data.gov
- Adaptation of the IDR Export process to feed the FHIR server
- Development of a migration plan to replace the IDR Export with API access to a future beneficiary data lake
- Integration with EIDM for Developer Account creation
- Integration with SLS for beneficiary verification

Slide Instructions: The information on this slide is to be reviewed and modified before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Provide an update on the project risks and issues, including the probability of occurrence and the mitigation for each risk.

16 Project Management Update: Cost & Schedule
Introductory Slides

[Bulls Eye Chart: Cost Performance Index vs. Schedule Performance Index, with quadrants labeled "Under run, Behind", "Under run, Ahead", "Overrun, Behind", and "Overrun, Ahead".]

Slide Instructions: The information on this slide is to be reviewed and modified before every TRB consult and/or XLC stage gate review. Plot Cost Performance Index (CPI) against Schedule Performance Index (SPI) for reporting intervals (usually monthly) leading to PIR.

CPI = earned value (or budgeted cost of work performed) / actual cost of work performed
- This is a measure of cost efficiency or productivity: for every dollar spent, how much work is being completed.
- When planned costs for completed work match actual expenditures, CPI = 1.
- Expected range: .95 < Project CPI < 1.05

SPI = earned value (or budgeted cost of work performed) / budgeted cost of work scheduled
- This is a measure of general schedule status: how close to plan work is being completed.
- When planned costs for completed work match the planned timing for spending money, SPI = 1.
- Expected range: .95 < Project SPI < 1.05

For CPI or SPI < .95 or > 1.05, the project should have corrective action(s) in place and should report on the corrective actions' effectiveness. This information can be copied from monthly project reviews.

Project Duration:
- Short (< 12 months) Project: Report the cost and schedule numbers (option to replace the Bulls Eye Chart).
- Long (> 12 months) Project: Plot Cost Performance Index vs. Schedule Performance Index.

How to Update Graphic: Double click on the plot to open Excel. Select the "Data" tab on the lower left of the Excel window to enter/paste values from monthly reporting. Select the "Chart" tab to show the updated graph. Click outside the Excel window to return to PowerPoint.

Summarize the CPI and SPI trend, and summarize the effectiveness of ongoing corrective actions.

Cost (discuss budget sufficiency):
- Show Actuals-to-date compared to the Plan
- Report Estimate at Complete (EAC) and compare to budget
- Show management reserve (it is part of EAC) and compare to Budget-to-go
- Compare current CPI from the previous slide to the To-Complete Performance Index (TCPI)

Schedule (discuss schedule sufficiency):
- Show the top-level Integrated Master Schedule (IMS)
- Discuss any changes to the Critical Path
- Discuss stability of float
- Discuss any tasks that are running, or were completed, significantly behind schedule
- Highlight schedule reserve leading to the next major event
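The CPI and SPI formulas above are simple ratios, so the monthly reporting values can be computed and range-checked directly. A minimal sketch (function names and the 0.95-1.05 band are taken from the slide; everything else is illustrative):

```python
def cpi(earned_value, actual_cost):
    """Cost Performance Index: budgeted cost of work performed divided by
    actual cost of work performed. CPI = 1 means spending matches plan."""
    return earned_value / actual_cost

def spi(earned_value, planned_value):
    """Schedule Performance Index: budgeted cost of work performed divided
    by budgeted cost of work scheduled. SPI = 1 means work is on schedule."""
    return earned_value / planned_value

def needs_corrective_action(index, low=0.95, high=1.05):
    """Flag any CPI or SPI outside the expected .95 < index < 1.05 band,
    per the reporting guidance on this slide."""
    return not (low < index < high)
```

For example, a month with $100K of budgeted work performed at an actual cost of $125K gives CPI = 0.8, which falls below the 0.95 threshold and would trigger corrective-action reporting.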

17 508 Compliance Status
Introductory Slides

The solution's beneficiary-facing front end is a Python/Django web application, enabling 508 compliance for beneficiary web interactions. The core of the solution is a Data-as-a-Service that has no direct end-user interaction. Developer account management will also be handled via a Python/Django web front end that enables 508 compliance.

Slide Instructions: The information on this slide is to be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Provide an update on the system's 508 compliance status.

18 Records Management
Introductory Slides

- Record retention for BBonFHIR should replicate the controls used for the current BlueButton repository in the MyMedicare.gov platform.
- The three-year history limit for BlueButton data should be reviewed and a decision taken on extending the length of past history, based on the ability to satisfy FOI requests from beneficiaries using BlueButton data.
- A migration path should be developed to enable the transition of the BBonFHIR data repository from the IDR Export process to direct access to a planned beneficiary data lake provided by the Enterprise Data Warehouse.
- Beneficiary-generated health information is currently stored in NGD. A transition plan should be developed to migrate from using an MQ trigger to synchronize updates between NGD and BBonFHIR, to having MyMedicare.gov store and retrieve beneficiary-generated health data directly from BBonFHIR using the FHIR REST API.
- BBonFHIR will allocate a unique and anonymous GUID key to each beneficiary that enables BBonFHIR. This key will be used by external applications to identify a unique beneficiary account, and by back-end FHIR services to map to beneficiary accounts held in CMS enterprise data stores. This is an established approach used by other health applications; it avoids the third-party application having to store PHI for a data subject.

Slide Instructions: The information on this slide is to be reviewed and modified (if necessary) before every TRB consult and/or XLC stage gate review. If information changes, highlight the change and provide an explanation. Provide an update on Records Management status. In order to ensure compliance with CMS's Records Management policies, the project team is to consult with the Office of Strategic Operations and Regulatory Affairs (OSORA) for additional details on record retention and/or archival strategies.

Provide Record Retention and Archival Strategies: Describe where the system's data will reside and identify any data exchanges that may occur. Alternatively, you may attach Section of the project's Requirements document.
- Inputs: Identify all data (as well as the format of the data: paper, manual input, electronic data) supplied to the system, as well as who/what is supplying the data. Provide instructions on what happens to the manual/electronic inputs after they are entered into the master file/database and are verified.
- Master Files: Provide a detailed description of the data maintained in the system/database. Provide detailed instructions for the retention and disposition of this data (where will the data be maintained, when will the data be deleted or destroyed).
- Outputs: List all reports, data sharing with other Agencies/CMS systems, etc. Provide instructions for how long the reports are needed for Agency business and when they should be destroyed/deleted.
Is this system replacing a paper-based records system or an existing electronic system? If electronic, has the migration of the legacy data been addressed?
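The anonymous GUID scheme on the Records Management slide can be sketched as a two-way mapping: external apps see only the GUID, while back-end FHIR services resolve it to the internal account. This is an illustrative sketch only; a real implementation would back the mapping with a protected enterprise data store rather than an in-memory dict, and the class and method names are invented for illustration.

```python
import uuid

class BeneficiaryKeyMap:
    """Maps an internal beneficiary account ID to an anonymous GUID.
    The internal (PHI-linked) ID never leaves CMS systems; third-party
    applications store only the GUID."""

    def __init__(self):
        self._guid_to_account = {}
        self._account_to_guid = {}

    def enroll(self, internal_account_id):
        """Allocate a GUID when the beneficiary enables BBonFHIR.
        Re-enrolling returns the same GUID, keeping the key stable."""
        if internal_account_id in self._account_to_guid:
            return self._account_to_guid[internal_account_id]
        guid = str(uuid.uuid4())  # unique, anonymous, carries no PHI
        self._guid_to_account[guid] = internal_account_id
        self._account_to_guid[internal_account_id] = guid
        return guid

    def resolve(self, guid):
        """Back-end FHIR services map the GUID to the enterprise account;
        unknown GUIDs resolve to None."""
        return self._guid_to_account.get(guid)
```

Because the GUID is random rather than derived from the beneficiary's identifiers, possession of the GUID alone reveals nothing about the data subject, which is the property the slide relies on.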

19 Deliverables Status Table
Introductory Slides

Deliverable Name | Status

Slide Instructions: The information on this slide is to be reviewed and modified before every TRB consult and/or XLC stage gate review. Provide an update on the status of the deliverables required for the project as documented in the Project Process Agreement. You may need to insert additional slides to accommodate the number of required deliverables.

20 Acronyms
Introductory Slides

Acronym | Literal Translation
HL7 | Health Level Seven International: a not-for-profit, ANSI-accredited standards developing organization dedicated to providing a comprehensive framework and related standards for the exchange, integration, sharing, and retrieval of electronic health information that supports clinical practice and the management, delivery, and evaluation of health services.
FHIR | Fast Healthcare Interoperability Resources: a set of Resources that define clinical concepts.
BlueButton | The mechanism that allows patients and beneficiaries to download their health data from an institution that has provided them with health services.
BBonFHIR | The next-generation platform delivering beneficiary BlueButton data using the HL7 FHIR API framework.
PCOR | Patient-Centered Outcomes Research
EIDM | Enterprise IDentity Management

Slide Instructions: Provide a list of acronyms used in the presentation and specify each acronym's associated literal translation. List the acronyms in alphabetical order using a tabular format, as shown on the slide.

21 Architecture Review (AR)
[Date of AR] Slide Instructions: This slide provides separation between the Introductory slides and the Architecture Review (AR) slides. Enter the date of the AR.

22 Alternatives Analysis: Alternative A
Architecture Review (AR)

A draft proposal was reviewed from the MyMedicare.gov contractor. The design used the Microsoft .NET architecture and SOAP-based transactions to send CCDA documents to nominated Direct endpoints identified by the beneficiary. New information would be packaged as a CCDA document when it was received by the MyMedicare.gov platform and sent via SOAP to a HISP for transmission to a trusted endpoint. Estimated cost: $5M.

Issues:
- Limited consumer adoption of Direct Messaging (though established in the Provider community)
- Requires active management by the beneficiary
- Proprietary code base
- Perpetuates the current isolated infrastructure
- Provides no additional beneficiary identity verification

Slide Instructions: Alternative A: Insert condensed alternatives analysis for Alternative A from Section 6.1 of the Business Case; indicate if this alternative is the preferred solution.

23 Alternatives Analysis: Alternative B (Preferred)
Architecture Review (AR)

A FHIR-based alternative uses the public-domain, open-source Java-based FHIR Server as a core service. It provides a secure REST API with integrated OAuth authorization protocols, enabling approved third-party applications to connect to the FHIR server and pull the allowed beneficiary information in JSON or XML format. In addition, the FHIR server can generate CCDA documents and send them via Secure SMTP through an out-sourced HISP service. This provides a full-service solution of CCDA/Direct push and JSON/XML REST API pull services, all built on a common, open-source core code base.

As an interim solution, this design would "piggyback" on the existing IDR export routines to provide the data for the BBonFHIR data repository. Beneficiaries can choose to update their personal health information on the MyMedicare.gov portal; that information is stored in the beneficiary profile within NGD and is included in the BlueButton download. Therefore, an MQ Web Service would be required to capture updates to this information in NGD and pass the information to BBonFHIR to update the patient-generated data for the beneficiary.

The BBonFHIR Service would be built using open-source technologies, with a core code base provided by developers both inside and outside of the Federal Government. A Presidential Innovation Fellow attached to the ONC has leveraged the HL7 FHIR code base to create a BlueButton export in JSON format. The prototype codebase uses Enterprise Java with Spring and Hibernate to isolate the underlying database structure. The deployment project would migrate the prototype codebase to use a robust enterprise database platform such as PostgreSQL or MongoDB to store the BlueButton data repository. Early prototypes have been deployed using Red Hat Linux and JBoss, with database support provided by MySQL/MariaDB or PostgreSQL.
Slide Instructions: Alternative B: Insert condensed alternatives analysis for Alternative B from Section 6.2 of the Business Case; indicate if this alternative is the preferred solution.

24 Alternatives Analysis: Alternative C
Architecture Review (AR) The fully integrated FHIR service builds on the FHIR Server core and data repository outlined in Alternative B but includes the following optimizations:
- Remove patient-generated data from NGD and modify the MyMedicare.gov portal to instead read and write the data via the FHIR Server's REST API.
- Modify the MyMedicare.gov portal to generate the text and PDF BlueButton files using the FHIR Server's REST API.
- Replace the IDR export process with an API interface that allows the beneficiary BlueButton data to be pulled directly, in real time, from a beneficiary "data lake" managed by the Enterprise Data Group. This would reduce the number of copies of duplicated beneficiary-related data in the FHIR service and in MyMedicare.gov.
Replacing the current data export service also enables CMS to take advantage of back-end work to connect beneficiary data across plans, allowing additional data, such as for dual-eligible beneficiaries, to be incorporated into the BBonFHIR service with minimal changes. Slide Instructions: Alternative C: Insert condensed alternatives analysis for Alternative C from Section 6.3 of the Business Case; indicate if this alternative is the preferred solution.

25 Funding & Acquisition Architecture Review (AR) Solution design work for BBonFHIR is being covered under a CMS Entrepreneur-in-Residence program initiative. A $3.8M proposal has been submitted to the PCOR Trust Fund for FY’16–17 funding to support integration, piloting, and deployment of the BBonFHIR platform. Slide Instructions: Funding: Insert the funding source(s) for the preferred solution from the Business Case. Acquisition Strategy: Insert condensed acquisition strategy for the preferred solution from the Business Case or from the project’s Acquisition Strategy document.

26 Current System Architecture Review (AR) The current BlueButton service is hosted on MyMedicare.gov. The BlueButton file is created via a daily/weekly export from the IDR. The export is stored in a data repository that is accessible to the web portal and used to generate the BlueButton file when requested by the beneficiary. The process enables registered Medicare beneficiaries to download a text or PDF file. The file contains between 6 months and 3 years of personal claims data and any health information the beneficiary has recorded in NGD via the portal.
[Current-system diagram: IDR (Part D Claims/PDE, National Drug Codes/NDC), NGD, Fast Export, provider info (Rx, hospital, agency), PHR mainframe, LDAP, Connect:Direct, MyMedicare.gov COBOL header/trailer export, current BlueButton text/PDF download service]
Slide Instructions: If applicable, insert a graphic providing an overview of the current system from the High-Level Technical Design document.

27 DRAFT Proposed System Architecture Review (AR) Slide Instructions:
Insert graphic providing an overview of the proposed system from Section 5.2 of the High-Level Technical Design document.

28 DRAFT Proposed System Architecture Review (AR) Slide Instructions:
System Scope: Insert condensed system scope from the High-Level Technical Design document. High-Level Functional Requirements: Insert high-level functional requirements from the High-Level Technical Design document. Summary of Changes: Insert condensed summary of changes to the current system from the High-Level Technical Design document (if applicable).

29 Proposed System: Infrastructure & Security
Architecture Review (AR) DRAFT Slide Instructions: Platform – Insert platform information from Section of the High-Level Technical Design. System Hosting – Insert hosting information from Section of the High-Level Technical Design. Connectivity Requirements – Insert connectivity requirements from Section of the High-Level Technical Design. Modes of Operation – Insert modes of operation information from Section of the High-Level Technical Design. Assurance Level – Insert the system security level by information type from Section 2.9 of the Information Security Risk Assessment. E-Authentication Level – Insert the e-authentication level from Section 2.10 of the Information Security Risk Assessment. Authorization – Insert user authorization approach from Section of the High-Level Technical Design. Encryption – Insert any anticipated needs to encrypt data from Section of the High-Level Technical Design.

30 Consult Request Appendix C TRB Consult – 05/28/15
- Map conceptual architecture to CMS Enterprise Architecture (system architecture, network architecture, security architecture)
- Identity management workflows
- MDM integration for beneficiary GUID creation
- Identify migration sequencing: NGD beneficiary data migration to FHIR; transition to use of the beneficiary data lake; replacement of the IDR export process
Slide Instructions: This is a blank slide that can be used for Technical Review Board consultations. Insert the date of the consult at the top right-hand corner of the slide. Insert as many slides as needed for your consultation. Enter the slide title and other content.

31 Preliminary Design Review (PDR)
[Date of PDR] Slide Instructions: Note: Sample system architecture diagrams are available at this URL: If you do not have access to this URL, please contact your CMS point of contact to obtain them for you. This slide provides separation between the Architecture Review (AR) slides and the Preliminary Design Review (PDR) slides. Enter the date of the PDR. Note: In preparation for your project’s Design Reviews, model diagrams with examples of System Architecture, Technology Stack, Security Design, Performance Design, Physical Design, and Multi Data Center Integration can be accessed from the following SharePoint site pages. Since these example diagrams are located on an internal SharePoint site, if you would like to see the example diagrams but don’t have access to the site, please request them from your CMS project manager/project lead. PDF: Visio:

32 Architecture Review (AR)/Consult Findings
Preliminary Design Review (PDR) Slide Instructions: Identify any recommendations and/or findings from the Architecture Review (AR) and their resolution.

33 Design Considerations
Preliminary Design Review (PDR) Design parameters that emerged during initial discussions with OC and other IT representatives:
- Use separate infrastructure to minimize impact on the existing production infrastructure
- Enhance beneficiary verification beyond the current CMS-issued user ID and password
Slide Instructions: Model diagrams with examples of System Architecture, Technology Stack, Security Design, Performance Design, Physical Design, and Multi Data Center Integration can be accessed from the following SharePoint site pages. Since these example diagrams are located on an internal SharePoint site, if you would like to see the example diagrams but don’t have access to the site, please request them from your CMS project manager/project lead. PDF: Visio: Summarize the design goals and guidelines, development methods and contingencies, and architectural strategies as described in the System Design Document.

34 Design Considerations: 508 Compliance, Performance, and Security
Preliminary Design Review (PDR) Slide Instructions: 508 Compliance: If applicable, briefly describe what 508 testing will be done, from Sections 5.4 & 5.5 of the System Design Document. Performance: If applicable, briefly describe what hardware and software performance strategies will be implemented, from Sections 3, 4.2.2, & of the System Design Document. Security: If applicable, briefly describe what security features are being implemented, from Sections 3, & of the System Design Document. If applicable, briefly describe how any outstanding High or Moderate findings and/or SEV1 or SEV2 issues are being addressed.

35 Design Considerations: Technical Reference Architecture (TRA)
Preliminary Design Review (PDR) Slide Instructions: Technical Reference Architecture (TRA) If applicable, briefly describe from Section 3 & 4 of the System Design Document, how the system is TRA compliant and address any non-compliance issues, specifically: What application code is being reengineered to comply with the TRA? What business logic code is being separated from the data access logic code? What business logic is being moved from the data tier to the application tier? How will the system provide indirect access from the application to the data tier since direct access is not allowed?

36 Design Considerations: Shared Services
Preliminary Design Review (PDR) The Developer Account workflow will need to integrate with EIDM. Account workflows appear to have potential synergies with account workflows planned for CPI and NPPES/PCOS. Slide Instructions: Shared Services If applicable, briefly describe from Sections 3 & 4 of the System Design Document… What new shared services will be created? How will any new shared services being developed be consumable by other projects? What existing shared services will be consumed? Non-Shared Service Reusable Components What new reusable components will be developed? How can any new reusable components being developed be reused by other projects? What existing reusable components will be implemented?

37 Design Considerations: Software Architecture
Preliminary Design Review (PDR) FHIR is an industry-supported design for a developer-friendly API. It is stewarded by HL7 and is being advanced as a new HL7 standard. HL7 has created an open-source reference code base that implements the FHIR API. BBonFHIR is based on the HL7 open-source Java code base. Slide Instructions: Software Architecture If applicable, briefly describe from Sections 3 & 4.3 of the System Design Document… What are the key software design decisions based on the mapping of the requirements to the prioritized Quality Attributes (aka non-functional requirements)? E.g., if Modifiability is more important than Performance, then it is highly probable that a different solution will be designed. What is being designed to enhance automation and to ensure that parameters, URIs, URLs, port numbers, and other variables can be changed without recompiling the code and/or restarting the application? What industry-standard software patterns are being implemented in the software architecture to create repeatable development processes?
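The externalized-configuration question in the notes above (parameters, URIs, and port numbers changeable without recompiling) can be illustrated with a minimal sketch. The variable names and default values here are hypothetical placeholders, not actual BBonFHIR settings.

```python
import os

# Hypothetical setting names; defaults are placeholders, not real endpoints.
DEFAULTS = {"FHIR_BASE_URL": "http://localhost:8080/fhir", "DB_PORT": "5432"}

def load_config(env=None):
    """Resolve each setting from the environment, falling back to defaults."""
    env = os.environ if env is None else env
    return {key: env.get(key, default) for key, default in DEFAULTS.items()}

# Overriding the port without touching code, as if set in the deployment env.
cfg = load_config({"DB_PORT": "6543"})
```

Because every URI and port is resolved at startup from the environment, a deployment can change them without recompiling the code or editing the application.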

38 Design Considerations: Data Architecture
Preliminary Design Review (PDR) BBonFHIR is supporting three data design objectives:
- Minimize changes to current BlueButton data extract operations to enable agile implementation
- Develop a BBonFHIR data repository that can integrate with a beneficiary data lake when that service becomes available
- Make BlueButton data categories/content available via a secure REST API, enabling data reuse for other future applications
Slide Instructions: Data Architecture If applicable, briefly describe from Sections 3 & 4.4 of the System Design Document… What is being done to align with the Enterprise Data Models? What is being done to reduce the amount of data being stored for non-system-of-record data? What is being done to use existing system-of-record data, e.g., Integrated Data Repository (IDR) data, in a non-persistent or temporary manner? What conformed data entities are being created to support new/existing functionality? What existing data entities are being conformed to support new/existing functionality?

39 Design Considerations: Business Intelligence and Portal Architecture
Preliminary Design Review (PDR) Slide Instructions: Business Intelligence If applicable, briefly describe from Section 3 & 4.4 of the System Design Document… What enterprise data models will be used to support both reports and dashboards? What methodology will be implemented to ensure no duplication of data or logic is required to support both reports and dashboards? Portal Architecture If applicable, briefly describe from Section 3 & 4.3 of the System Design Document… What existing portlets will be consumed? What new portlets will be reusable by other projects?

40 Preliminary System Architecture
Preliminary Design Review (PDR) Slide Instructions: Insert the system architecture diagram depicting the overall, integrated structure of the system in terms of presentation, application and data regions including data storage and manipulation, user and external interfaces from the System Design Document.

41 Test Plan Preliminary Design Review (PDR) Slide Instructions:
Insert the testing approach / strategy from Section 4 of the Test Plan.

42 Release Plan Preliminary Design Review (PDR) Slide Instructions:
Insert the release content and schedule from Sections 3.3 and 3.4 of the Release Plan.

43 Data Conversion Preliminary Design Review (PDR) The current BlueButton service extracts data via an export process from the IDR. BBonFHIR plans to replicate that process and adapt it to create a JSON-structured data export that can be loaded into the BBonFHIR repository via a bulk data load process on the FHIR Server. Slide Instructions: Insert the data conversion scope from Section 5.1 of the Data Conversion Plan.
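The adapted export could be sketched as a mapping from flat export rows to FHIR-style JSON, emitted one object per line (NDJSON) for bulk loading. The field names and codes below are illustrative placeholders only; the real IDR export layout is not shown in this deck.

```python
import json

# Field names here are hypothetical, not the real IDR export layout.
def claim_to_fhir_json(row):
    """Map one flat claim-export row to a minimal FHIR-style resource dict."""
    return {
        "resourceType": "ExplanationOfBenefit",
        "id": row["claim_id"],
        "patient": {"reference": f"Patient/{row['bene_id']}"},
        "billablePeriod": {"start": row["from_date"], "end": row["thru_date"]},
    }

rows = [{"claim_id": "c1", "bene_id": "b9",
         "from_date": "2015-01-01", "thru_date": "2015-01-05"}]
# One JSON object per line (NDJSON) suits a bulk data load process.
ndjson = "\n".join(json.dumps(claim_to_fhir_json(r)) for r in rows)
```

Keeping the extract step unchanged and adding only this transform-and-load stage is what lets the interim design "piggyback" on the existing export routines.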

44 Detailed Design Review (DDR)
[Date of DDR] Slide Instructions: Note: Sample system architecture diagrams are available at this URL: If you do not have access to this URL, please contact your CMS point of contact to obtain them for you. This slide provides separation between the Preliminary Design Review (PDR) slides and the Detailed Design Review (DDR) slides. Enter the date of the DDR. Note: In preparation for your project’s Design Reviews, model diagrams with examples of System Architecture, Technology Stack, Security Design, Performance Design, Physical Design, and Multi Data Center Integration can be accessed from the following SharePoint site pages. Since these example diagrams are located on an internal SharePoint site, if you would like to see the example diagrams but don’t have access to the site, please request them from your CMS project manager/project lead. PDF: Visio:

45 Preliminary Design Review (PDR) Findings
Detailed Design Review (DDR) Slide Instructions: Identify any recommendations and/or findings from the Preliminary Design Review (PDR) and their resolution.

46 Data Design Detailed Design Review (DDR) Slide Instructions:
Model diagrams with examples of System Architecture, Technology Stack, Security Design, Performance Design, Physical Design, and Multi Data Center Integration can be accessed from the following SharePoint site pages. Since these example diagrams are located on an internal SharePoint site, if you would like to see the example diagrams but don’t have access to the site, please request them from your CMS project manager/project lead. PDF: Visio: Summarize the Logical and Physical Data Models for the application.

47 User & Machine Readable Interface
Detailed Design Review (DDR) Slide Instructions: Describe the user and machine-readable interfaces, inputs, and outputs as described in Sections 5.4, and of the System Design Document.

48 Detailed System Architecture
Detailed Design Review (DDR) Slide Instructions: Insert the system architecture diagram depicting the overall, integrated structure of the system in terms of presentation, application and data regions including data storage and manipulation, user and external interfaces from the System Design Document.

49 Physical Design Detailed Design Review (DDR) Slide Instructions:
From the information contained in the System Design Document, insert a diagram depicting the physical hardware design.

50 Software Design Detailed Design Review (DDR) Slide Instructions:
From the information contained in the System Design Document, provide information on each system software component, indicating the following:
- Service identifier
- Service-based design: decomposition of services being used and built from the application and data zones
- Definition: the specific purpose and semantic meaning of the service
- Responsibilities: the primary responsibilities and/or behavior of the service. What does the component accomplish? What role does it play? What kinds of services does it provide to its clients?
- Resources: a description of any and all resources that are managed, affected, or needed by the service. Resources are components external to the design such as memory, processors, printers, databases, or a software library. This should include a discussion of any possible race conditions and/or deadlock situations and how they might be resolved.
- Reporting design and integration: if built in, provide details on data traffic and volumes

51 Security Design Detailed Design Review (DDR) Slide Instructions:
From the information contained in the System Design Document, insert a diagram depicting the security design.

52 Performance Design Detailed Design Review (DDR) Slide Instructions:
From the information contained in the System Design Document, insert a diagram depicting the performance/reliability design.

53 Internal Communications Design
Detailed Design Review (DDR) Slide Instructions: From the information contained in the System Design Document, insert a diagram depicting the internal communications design.

54 System Integrity Controls
Detailed Design Review (DDR) Slide Instructions: From the System Design Document, provide design specifications for the following minimum levels of control and any additional controls as appropriate or necessary:
- Internal security to restrict access of critical data items to only those access types required by users/operators
- Audit procedures to meet control, reporting, and retention period requirements for operational and management reports
- Application audit trails to dynamically audit retrieval access to designated critical data
- Standard tables to be used or requested for validating data fields
- Verification processes for additions, deletions, or updates of critical data
- Ability to identify all audit information by user identification, network terminal identification, date, time, and data accessed or changed
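The last control above (identifying audit information by user, terminal, date, time, and data accessed) can be sketched as a minimal audit-record builder. The field names are illustrative only, not a CMS standard audit format.

```python
from datetime import datetime, timezone

# Hypothetical record shape; captures user, terminal, timestamp, action,
# and the data item accessed, per the minimum identification requirement.
def audit_record(user_id, terminal_id, action, data_item, now=None):
    """Build one audit-trail entry with a UTC timestamp."""
    ts = (now or datetime.now(timezone.utc)).isoformat()
    return {"user": user_id, "terminal": terminal_id, "time": ts,
            "action": action, "data": data_item}

# Fixed timestamp so the example is deterministic.
rec = audit_record("jdoe", "term-07", "READ", "Patient/1234",
                   now=datetime(2015, 6, 1, tzinfo=timezone.utc))
```

In practice such entries would be written to an append-only store so retrieval access to designated critical data can be audited dynamically.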

55 External Interfaces Detailed Design Review (DDR) Slide Instructions:
From the System Design Document and Section 6 of the Interface Control Document, if applicable, describe the interface(s) between the system being developed and other systems (e.g. batch transfers, queries, etc.), indicating the location of the interfacing system. Include the interface architecture(s) being implemented (e.g. wide area networks, gateways, etc.) and the interfacing mechanisms (e.g., MQ, Gentran, etc.). If remote connectivity is required, identify the method of access. Provide a diagram depicting the communications path(s) provided in Section 3, System Overview, of the System Design Document. The graphical representation should depict the connectivity between systems, showing the direction of data flow.

56 Planned Tests Detailed Design Review (DDR) Slide Instructions:
Insert information on the planned testing from the Test Plan. Describe the various types of testing (test functions) to be performed.

57 Features Not Being Tested
Detailed Design Review (DDR) Slide Instructions: From the Test Plan, list and describe the system functions/features not planned to be tested and explain why.

58 Defect Management Detailed Design Review (DDR) Slide Instructions:
From the Test Plan, describe the defect resolution process to be implemented during testing, including the operational definition and assignment of appropriate impact/severity levels.

59 Test Environment Detailed Design Review (DDR) Slide Instructions:
From the Test Plan, provide details and a graphical representation of the environmental components required to test the system, including hardware, software, communications, and any other resources used to configure the test environment, as well as any security considerations.

60 Test Schedule Detailed Design Review (DDR) Slide Instructions:
From the Test Plan, list the milestone events and dates for all testing activities.

61 Release Impacts Detailed Design Review (DDR) Slide Instructions:
From the Release Plan, describe the impacts of the release.

62 Release Notification Detailed Design Review (DDR) Slide Instructions:
From the Release Plan, describe any release-specific communication that needs to occur.

63 Data Conversion Approach
Detailed Design Review (DDR) Slide Instructions: From the Data Conversion Plan, describe the approach that will be used to extract, transform/cleanse and load data from the source to target destinations during the conversion/migration process.

64 Data Conversion Schedule
Detailed Design Review (DDR) Slide Instructions: From the Data Conversion Plan, provide a schedule of conversion activities to be accomplished.

65 Data Conversion Quality Assurance
Detailed Design Review (DDR) Slide Instructions: From the Data Conversion Plan, describe the strategy to be used to ensure data quality before and after all data conversions.

66 Implementation Plan Detailed Design Review (DDR) Slide Instructions:
From the Implementation Plan, describe the planned deployment, installation, and implementation approach.

67 Operational Readiness Review (ORR)
[Date of ORR] Slide Instructions: This slide provides separation between the Detailed Design Review (DDR) slides and the Operational Readiness Review (ORR) slides. Enter the date of the ORR.

68 Detailed Design Review (DDR) Findings
Operational Readiness Review (ORR) Slide Instructions: Identify any recommendations and/or findings from the Detailed Design Review (DDR) and their resolution.

69 Actual System Architecture
Operational Readiness Review (ORR) Slide Instructions: Insert the actual System Architecture for the application. Explicitly identify any differences between the actual system architecture and the detailed system architecture from the Detailed Design Review.

70 Actual Physical Design
Operational Readiness Review (ORR) Slide Instructions: Insert the actual physical design for the application. Explicitly identify any differences between the actual physical design and the proposed physical design from the Detailed Design Review.

71 Data Conversion Operational Readiness Review (ORR) Slide Instructions:
Provide a status on any data conversion effort required before the application can go live. If the data conversion has not yet occurred, insert the conversion schedule from Section 5.4 and the backup strategy and restore process from Sections 6.2 and 6.3 of the Data Conversion Plan.

72 Validation Results Operational Readiness Review (ORR)
Summary Assessment    | # Test Cases | % of Total | Comments
Test Cases Planned    |              | 100%       |
Test Cases Run        |              |            |
Test Cases Reviewed   |              |            |
Test Cases Passed     |              |            |
Test Cases Failed     |              |            |
Test Cases To Be Run  |              |            |
Test Cases Held       |              |            |
Slide Instructions: Provide a summary of the test case results obtained for the reported test effort from Table 2 of the Test Summary Report.
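The "% of Total" column is derived from the planned count, which serves as the 100% baseline. A small sketch with hypothetical counts (not actual project results):

```python
# Hypothetical counts for illustration; "planned" is the 100% baseline.
def pct_of_total(counts, baseline="planned"):
    """Express each test-case count as a percentage of the planned total."""
    total = counts[baseline]
    return {k: round(100.0 * v / total, 1) for k, v in counts.items()}

pcts = pct_of_total({"planned": 200, "run": 180, "passed": 171, "failed": 9})
```

With these numbers, 180 of 200 planned cases run gives 90.0%, and 171 passed gives 85.5%.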

73 Validation Results Operational Readiness Review (ORR)
Impact/Severity Level | Total Reported | Total # Resolved | % Total Resolved | Total # Unresolved | % Total Unresolved
Low                   |                |                  |                  |                    |
Moderate              |                |                  |                  |                    |
High                  |                |                  |                  |                    |
Combined Totals       |                |                  |                  |                    |
Slide Instructions: Provide a summary of the test incidents that were reported during the testing from Table 3 of the Test Summary Report.

74 Authorization to Operate
Operational Readiness Review (ORR) Slide Instructions: Provide a copy of your Authorization to Operate.

75 Operations & Maintenance Manual
Operational Readiness Review (ORR) Slide Instructions: Provide a copy of the Receiving/Hosting data center’s approval of your Operations & Maintenance Manual.

76 Post Implementation Review (PIR)
[Date of PIR] Slide Instructions: This slide provides separation between the Operational Readiness Review (ORR) slides and the Post Implementation Review (PIR) slides. Enter the date of the PIR.

77 Operational Readiness Review (ORR) Findings
Post Implementation Review (PIR) Slide Instructions: Identify any recommendations and/or findings from the Operational Readiness Review (ORR) and their resolution.

78 Customer Satisfaction and Assessment Method
Post Implementation Review (PIR) Slide Instructions: Briefly describe the system’s users and the process used to assess user or customer satisfaction (e.g., surveys, user group meetings, customer focus groups, etc.). See Section 3 of the Post Implementation Report.

79 Assessment Results Post Implementation Review (PIR)
Slide Instructions: Summarize the results of surveys or other user or customer inputs, and usage trends. Is the existing system providing customers the needed functionality and performance? Based on your user or customer inputs, is actual performance consistent with user or customer expectations, or do the current performance goals reflect current user or customer functional or performance requirements? See Section 3 of the Post Implementation Report.

80 Lessons Learned Post Implementation Review (PIR) Slide Instructions:
Negative Lessons Learned Insert key lessons learned throughout the XLC (including disposition) and how they might be applied to help other projects avoid similar hindrances. Positive Lessons Learned Insert key positive lessons learned throughout the XLC (including disposition), approaches and/or actions that worked and how other projects could realize similar positive benefits.

81 Recommendations Post Implementation Review (PIR) Slide Instructions:
Justify if the existing system should continue in operation as is, be enhanced, or terminated. If the system is to be enhanced or terminated, summarize the actions to be taken this fiscal year. See Section 5 of the Post Implementation Report.

82 Disposition Review (DR)
[Project Name] Disposition Review (DR) [Date of DR] Slide Instructions: This slide provides separation between the Post Implementation Review (PIR) slides and the Disposition Review (DR) slides. Enter the name of the project and the date of the DR.

83 System Closeout Impact: Technical
Disposition Review (DR) Slide Instructions: This slide is specific to Disposition activity. Describe and chart predicted and actual closeout impacts to dependent systems. This indicates how well the transition/shutdown was coordinated and communicated. Show how observed decommissioning impacts are aligned with planned impacts (e.g., the expectation was that requests for data decreased and then stopped 24 hours before the system was taken offline, or backlog queues remained steady once the system was transitioned to its replacement). How, and how well, did the Disposition Phase meet its Infrastructure Decommissioning requirements?
Impacts to Dependent Systems
- Summarize the impact of the system’s closeout on systems formerly dependent on the closed-out system’s output
- Summarize the impact of the system’s closeout on systems formerly dependent on the closed-out system’s ingest
Infrastructure Decommissioning
- Describe the results of infrastructure decommissioning actions. Include a comparison of planned and actual timeframes.

84 System Performance: People – Resource Analysis
Disposition Review (DR) Slide Instructions: This slide shows historical project data; it could include specifics of the disposition effort if unique training was required for disposition activity.
User/Customer
- It is suggested that users/customers who will provide assessment data be identified early in the life cycle so collection of this information can be planned as part of operating the project. Usability is a significant driver here. This assumes user expectations have been managed effectively, or this could be a subject of discussion at this review.
Staffing & De-staffing
- Show planned versus actual staffing
- Discuss successes – what actions helped the project meet the staffing plan
- Discuss challenges – what could be done to establish a better staffing plan in the future
- Covers Help Desk and maintainers (hardware and software)
- Discuss communications about staffing requirement changes to affected staff, their management, and other projects
Training Effectiveness
- The training program should include specific terminal objectives that specify students’ expected performance under certain conditions. Post-training knowledge/skill assessment should test students’ ability to accomplish these terminal objectives.
- One measure of training effectiveness is comparing pre-training and post-training knowledge assessment results. The same assessment is given before and after the training so improvement can be attributed to the training.
- Another measure of training effectiveness is to report the percentage of students passing the post-training knowledge assessment on their first attempt.
- Discuss training effectiveness for any closeout-specific training, e.g., training the help desk to address periods of parallel operations and closeout transition issues. Suggest tracking help desk call volumes and topics over time as an indicator of training success.
User/Customer Assessment
- Include User/Customer Assessment results (this may be a version of the RATER model). 
- Describe staffing successes and challenges
- Describe staffing communications
Training
- Describe training effectiveness: % of trained staff able to perform successfully, % of trainees able to pass the knowledge assessment without additional study or coaching
- Discuss any closeout-specific training
Insert graphics of the project’s:
- User/Customer assessment (Section 5 of the last Operational Analysis Report)
- Staffing/De-staffing plan vs. actuals

85 Closeout Performance: Process
Disposition Review (DR) Slide Instructions: This slide is specific to Disposition activity. How did the Disposition Phase meet its Security and Privacy requirements? Possible discussion points: volume of storage media sanitized, user login disabled, web page taken down, and 800 number retired. How did the Disposition Phase address required Records Management requirements? Files archived for TBD years and research data retrieved and searched for privacy compromises. Consider items such as contracts and proposals, business case, charter, scope statement, schedule, budget estimate, project management documents, surveys, status reports, checklists, and e-mails. The type of information actually archived will differ depending on the scope and type of project. Formal data archives should be stored in compliance with US National Archives and Records Administration (NARA) regulations. How did the Disposition Phase address required Administrative Closure requirements? Results of procedures to transfer the project products or services to production and/or operations; stakeholder approval for all deliverables; verification that all deliverables have been provided and accepted; confirmation that the project has met all sponsors’, clients’, and other stakeholders’ requirements; validation that completion and exit criteria have been met; regulatory compliance items. How did the Disposition Phase address required Contracts Closure requirements? Summarize project contract closure activities such as formally closing all contracts associated with the completed project. Discuss and explain any open action items and whether they transfer to another project. An example could be an ongoing data cleansing operation: this project stops collecting data, but the system that receives the transitioned data still needs to accomplish the data cleansing. 
Security & Privacy: Describe results of security and privacy closeout actions
Records Management: Describe results of records management closeout actions
Administrative Closure: Describe results of administrative closeout actions
Contract Closure: Describe results of contracts closeout actions
Open Action Items: Describe and explain any open actions

86 Appendices Slide Instructions:
This is the title slide for Appendices and does not need to be updated.

87 [Slide Title] Appendix C TRB Consult – [DATE] Slide Instructions:
This is a blank slide that can be used for Technical Review Board consultations. Insert the date of the consult at the top right-hand corner of the slide Insert as many slides as needed for your consultation Enter the slide title and other content

