Texas Nodal Program: Nodal Quality Assurance Overview (Gary Macomber, April 2008)


Slide 1: Texas Nodal Program, Nodal Quality Assurance Overview. Gary Macomber, April 2008

Slide 2: Why change the QA/QC process?

Lessons learned through the first year of operation:
– Program Mgmt support required
– Cumbersome and redundant: the information required was just slightly different from what the PMO was already requesting, creating a perception of additional work for little value
– Not fully integrated into Program reporting

Time to move from pilot to operational program:
– Complement existing reporting by providing improved visibility
– The old process didn't provide a complete picture: it focused on the timing of artifact delivery, while program quality also includes Testing and Traceability

Conclusions:
– Focus on artifact readiness
– Add Test and Traceability summary metrics

Slide 3: The business need

The new quality assurance framework is based on three fundamental components: Artifacts, Traceability, and Testing. Together they ensure that we are delivering what is needed and that we can maintain the system going forward.

Slide 4: Nodal Quality Assurance objectives

1. Understand, communicate, and execute a comprehensive quality assurance program.
2. Achieve systems compliance with 100% of the Service Level Agreement (SLA) performance and audit requirements.
3. Increase quality assurance awareness at all levels within ERCOT and initiate active quality assurance and risk-reduction behavior.
4. Ensure that 100% of protocols per the program baseline are covered and met, through quality metrics, reporting on the coverage matrix, and requirements traceability across business requirements, use cases, and test cases.
5. Ensure consistency of project artifacts with ERCOT standards, through guidelines, checklists, and templates.
6. Improve the consistency, predictability, and quality of the overall Nodal solution by implementing and following the Rational Unified Process, as customized by ERCOT.
7. Ensure quality of the overall Nodal solution design through fit with the overall architecture as represented in the SoSA.
8. Ensure quality assurance through tiered reviews and approvals.

These objectives are supported by the QA mechanisms: the PowERUP site, work products, review and approval, quality assessment, quality metrics, and the QA process. A Milestone has a definition, criteria, a list of work products, and reviewers. A Work Product has guidelines and a template.
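Objective 4's coverage-matrix and traceability metrics (for example, the "percentage of test cases to requirements" figures reported in the QA summary) reduce to a coverage percentage between two layers of artifacts. The sketch below is purely illustrative, not ERCOT's actual tooling; all identifiers and data are hypothetical:

```python
# Illustrative sketch of a traceability coverage metric: the percentage
# of items in one layer (e.g. requirements) that trace to at least one
# item in the next layer (e.g. test cases). Names and data are invented.

def coverage(items: list[str], trace: dict[str, list[str]]) -> float:
    """Percent of items that trace to at least one downstream artifact."""
    if not items:
        return 0.0
    covered = sum(1 for item in items if trace.get(item))
    return 100.0 * covered / len(items)

# Hypothetical data: four requirements, two of which trace to test cases.
requirements = ["PR-1", "PR-2", "PR-3", "PR-4"]
req_to_tests = {"PR-1": ["TC-10"], "PR-2": ["TC-11", "TC-12"], "PR-3": []}

print(f"{coverage(requirements, req_to_tests):.0f}%")  # 50%
```

A real coverage matrix would be computed per layer pair (protocols to business requirements, requirements to use cases, use cases to test cases), with uncovered items listed for follow-up.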

Slide 5: So what does the QA/QC evaluation look like?

Slide 6: Quality Assurance Summary

Project | Overall | Artifacts | Traceability | Testing | Next Review | Trend | Commentary
CRR | A | G | G | A | 4/16/08 | = | Amber due to pass rate of 81% in FAT_2 cycle
MIS | R | A | G | R | 4/1/08 | | Red due to pass rate of 33%
NMMS | R | R | G | R | 4/15/08 | | Red due to 11 sev 1s and 2s in testing
COMS – S&B | R | A | G | Testing | 4/22/08 | | Red due to 6 sev 2s in testing
COMS – CMM | R | R | A | A | 4/1/08 | = | Mtg rescheduled to this week
COMS – FT | A | A | G | N/A | 4/22/08 | | Amber due to artifacts at 83% complete in Pre-FAT
COMS – REG | A | A | G | G | 4/29/08 | | Amber due to artifacts 84% complete
COMS – DISP | R | R | R | N/A | 4/29/08 | | Red due to artifacts 69% complete and traceability at 58% for requirements to protocols; Test N/A at IOC
MMS | R | R | R | A | 4/28/08 | | Red due to artifacts 62% complete and traceability at 47% for percentage of test cases to requirements
EMS | R | R | A | G | 4/29/08 | | Red due to artifacts at 69% complete at FAT_1
MOTE/SOTE | G | G | N/A | N/A | 5/15/08 | = | Artifacts (82%) pending approval of requirements from TPTF on 3/4/08; Trace/Test N/A at LCO phase
OTS | G | G | N/A | N/A | 4/15/08 | = | Artifacts A (83%); Trace/Test N/A at LCO phase
OS | A | A | A | Testing | 5/6/08 | = |
EIP | TBD | | | | 4/21/08 | = | Follow-up meeting based on revised metrics for ongoing QA reviews
MPIM | TBD | | | | | = |
UI | TBD | | | | | = | Follow-up meeting to be scheduled week of 4/28 based on revised metrics for ongoing QA reviews
CDR | TBD | | | | | = | Follow-up meeting to be scheduled week of 4/14 based on revised metrics for ongoing QA reviews

Slide 7: What is the Artifact picture like by Project? [chart]

Slide 8: What does the Traceability Summary look like by Project? [chart]

Slide 9: What does the Testing Summary look like by Project?

FAT status as of March 19, 2008:

Application | Test Phase | % Pass | % Exec | Sev 1 | Sev 2 | Quality Assessment
S&B 2 | FAT | 0% | 19% | 3 | 4 |
MMS UI | FAT | 79% | 84% | 0 | 0 |
OS | FAT | 95% | 85% | 0 | 6 |
CDR 1B | FAT | 68% | 51% | 1 | 1 |
MMS 3 | FAT | 81% | Complete | 0 | 0 | Amber
MIS 3B | FAT | 37% | Complete | 0 | 0 | Red
COMS-REG | FAT | 97% | Complete | 0 | 0 | Green
EMS 4.1 | FAT | 100% | Complete | 0 | 0 | Green
NMMS 2A (Build 1, 2) | FAT | 95% | Complete | 12 | 0 | Red
CRR 1 (all patches) | FAT | 81% | Complete | 0 | 0 | Amber
CMM | FAT | 99% | Complete | 0 | 2 | Amber
MPIM 2 | iTEST | 65% | Complete | 0 | 0 | Red

Final Assessment thresholds:
Metric | Green | Amber | Red
Sev 1 and sev 2 defects | 0 | 1 to 5 | > 5
Pass rate | > 89 | 80 to 89 | < 80

NOTE: a "Complete" for % executed means that the project has executed all test scripts that will be executed for that release.
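The Final Assessment thresholds above amount to a simple classification rule. The sketch below encodes them; note that how the two dimensions combine is not stated on the slide, so taking the worse of the two component ratings is an assumption (it does, however, match the assessments shown in the table, e.g. MMS 3 at 81% pass with no sev defects is Amber, and NMMS 2A at 95% pass with 12 sev 1s is Red):

```python
# Sketch of the slide's Final Assessment thresholds. The combination
# rule (worst of the two component ratings) is an assumption.

RANK = {"Green": 0, "Amber": 1, "Red": 2}

def severity_rating(sev_defects: int) -> str:
    """Rate the combined count of open sev 1 and sev 2 defects."""
    if sev_defects == 0:
        return "Green"
    return "Amber" if sev_defects <= 5 else "Red"

def pass_rate_rating(pass_rate: float) -> str:
    """Rate the test pass rate (in percent)."""
    if pass_rate > 89:
        return "Green"
    return "Amber" if pass_rate >= 80 else "Red"

def final_assessment(pass_rate: float, sev1: int, sev2: int) -> str:
    """Assumed combination: the worse of the two component ratings."""
    ratings = (pass_rate_rating(pass_rate), severity_rating(sev1 + sev2))
    return max(ratings, key=RANK.get)

print(final_assessment(81, 0, 0))   # MMS 3  -> Amber
print(final_assessment(95, 12, 0))  # NMMS 2A -> Red
```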

Slide 10: What is next?

– Metrics updated weekly in the IDA status report
– Presented to TPTF monthly
– Questions for TPTF:
  – Is this the right information?
  – Is monthly the right frequency?
  – What else would you like to see?

