
1 University of Southern California Center for Systems and Software Engineering Quality Management & Architecture Review Board October 5, 2015 ©USC-CSSE1

2 Objectives of QM To ensure a high-quality process in order to deliver high-quality products

3 Quality Management in 577ab
–IIV&V
–Configuration Management
–Defect Reporting and Tracking
–Testing
–Buddy Review
–Architecture Review Board
–Core Capability Drive-through
–Design / Code Review
–Document template
–Sample artifacts

4 Quality Guidelines
Design Guidelines
–Describe design guidelines on how to improve or maintain modularity, reuse, and maintainability
–How the design will map to the implementation
Coding Guidelines
–Describe how to document the code in such a way that it can easily be communicated to others

5 Coding Guidelines
C: http://www.gnu.org/prep/standards/standards.html
C++: http://geosoft.no/development/cppstyle.html
Java: http://geosoft.no/development/javastyle.html
Visual Basic: http://msdn.microsoft.com/en-us/library/h63fsef3.aspx

6 Quality Guidelines
Version Control and History
–Chronological log of the changes introduced to this unit
Implementation Considerations
–Detailed design and implementation for as-built considerations
Unit Verification
–Unit / integration test
–Code walkthrough / review / inspection

7 Quality Assessment Methods
Methods, tools, techniques, and processes that can identify problems
–Detect and report the problem
–Measure the quality of the software system
Three methods of early defect identification
–Peer review, IIV&V, automated analysis

8 Peer Review
Reviews performed by peers in the development team
–Can range from Fagan inspections to simple buddy checks
–Peer Review Items
–Participants / Roles
–Schedule

9 Defect Removal Profiles

10 Outline
AT&T/Lucent ARB Concept
-Overview
-Results
-Recommendations
USC CS577 ARB Concept
-Participants
-Procedures
-Results

11–17 [image-only slides; no transcript text]

18 [CSCI 577a FCR] [CSCI 577a DCR and 577b RDCR]

19–26 [image-only slides; no transcript text]

27 Outline
AT&T/Lucent ARB Concept
-Overview
-Results
-Recommendations
USC CS577 ARB Concept
-Participants
-Procedures
-Results

28 The Incremental Commitment Spiral Model (ICSM)
4 Key Principles:
-Stakeholder value-based system definition and evolution
-Incremental commitment and accountability
-Concurrent system and software definition and development
-Evidence and risk-based decision making

29 ICSM for 24-week e-services projects

30 USC CS577 ARB Participants
Project Team
-Everybody presents something
Reviewers
-Clients
-Instructors and TAs
-Industry participants
80-minute time slots

31 ARB/milestones for two-semester teams
FCR ARB: October 12th, 14th, and 16th
–Based on preliminary FC package
–Focus on FCR success criteria
DCR ARB: November 30th, December 2nd, and 4th
–Based on draft DC package
–Focus on DCR success criteria

32 ARB/milestones for one-semester teams
FCR/DCR ARB: October 12th, 14th, and 16th
–Based on DC package
–Focus on DCR success criteria
CCD: November 18th
–Core Capability Drive-through
–Client(s) will get hands-on experience with your core capabilities
TRR: November 30th, December 2nd, and 4th
–Based on As-Built package

33 ARB Review Success Criteria
FCR: For at least one architecture, a system built to the architecture will:
–Support the Ops Concept
–Satisfy the Requirements
–Be faithful to the Prototype(s)
–Be buildable within the budgets and schedules in the Plan
–Show a viable business case
–Most major risks identified and resolved or covered by the risk management plan
–Key stakeholders committed to support the Foundations (née Architecting or Elaboration) Phase (to DCR)
DCR: For the selected architecture, a system built to the architecture will:
–Support the Ops Concept
–Satisfy the Requirements
–Be faithful to the Prototype(s)
–Be buildable within the budgets and schedules in the Plan
–All major risks resolved or covered by the risk management plan
–Key stakeholders committed to support the full life cycle

34 Commitment Review Success Criteria
TRR / OCR
–Show value: product works as expected (or not, with noted exceptions); product will help users do their job
–Show quality development, e.g., relevant parts of your IOC documentation; process
–Show sustainability, e.g., support requirements/plans; transition plan & status
–Show confidence that the product is/will be ready to be used, e.g., transition plan & status; see also value
CCD
–Determine whether the client needs anything further to ensure successful Transition and Operation
–Changes in priorities for remaining features?
–Changes being made to operational procedures?
–More materials needed for training?
–Changes in system data or environment we should prepare for?
–Anyone else who should experience the CCD?

35 Team Preparation for ARB Reviews
Week-1
–Within-team dry run of presentations and demo
–Further dry runs as necessary
ARB Week
–ARB presentation and discussion
–Follow-up team discussions, client discussions
Week+1
–Monday: FC packages due
–Monday: DC packages due

36 Grading Criteria (70 points)
Quality of Presentation (10 points)
Quality of Project (40 points)
–See Session Outline
Progress (10 points)
Consistency and project synchronization (5 points)
Time management (5 points)

37 FCR ARB Session Outline: Architected Agile Team
(x, y) = (presentation time, total time)
(5, 5) Remote Team Member(s). Team’s strong points & weak points (operational view and technical view); concerns & possible solutions; S/P Engineer observations
(10, 10) OCD. System purpose; shared vision; proposed new system; benefit-chain diagram; system boundary; desired capabilities and goals
(10, 10) Prototype. Most significant capabilities [buying information] (especially those with high risk if gotten wrong)
(5, 10) Requirements. Most significant requirements and their priority levels
(10, 10) Architecture. Top-level physical and logical architecture; use case diagram; status of NDI/reuse choices
(5, 10) Life Cycle Plan. Life cycle strategy; focus on Foundations phase; key stakeholder responsibilities; project plan; resource estimation
(10, 10) Feasibility Evidence. Business case (beginnings, including benefits analysis); NDI analysis results; major risks; 3 personas
(5, 5) QFP. Traceability matrix and summary; Quality Management Strategy; defect identification review type summary (what & how) by document section or UML, and current defect injection & removal matrix; technical debt
(20) Things done right; issues to address (Instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.

38 FCR ARB Session Outline: NDI/NCS Team (2 semesters)
(x, y) = (presentation time, total time)
(5, 5) Remote Team Member(s). Team’s strong points & weak points (operational view and technical view); concerns & possible solutions; S/P Engineer observations
(10, 10) OCD. System purpose; shared vision; proposed new system; benefit-chain diagram; system boundary; core capabilities, constraints, and goals
(5, 5) WinWin Agreements. Agreed win conditions in each category
(10, 10) Prototype. Most significant capabilities; NDI/NCS integration
(5, 10) Architecture. Top-level physical and logical architecture; use case diagram
(10, 10) Life Cycle Plan. Life cycle strategy; focus on Foundations phase; key stakeholder responsibilities; project plan; resource estimation
(10, 15) Feasibility Evidence. NDI/NCS alternatives; NDI/NCS evaluation & analysis results; business case (beginnings, including benefits analysis); major risks; capability and LOS feasibility evidence; 3 personas
(5, 5) QFP. Traceability matrix and summary; defect identification review type summary (what & how) by document section or UML, and current defect injection & removal matrix; Quality Management Strategy; technical debt
(20) Things done right; issues to address (Instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.

39 DCR ARB Session Outline: NDI/NCS Team (1 semester)
(x, y) = (presentation time, total time)
(5, 5) Remote Team Member(s). Team’s strong points & weak points (operational view and technical view); concerns & possible solutions; S/P Engineer observations
(10, 10) OCD. System purpose; shared vision; proposed new system; benefit-chain diagram; system boundary; core capabilities, constraints, and goals
(5, 5) WinWin Agreements. Agreed win conditions in each category
(10, 10) Prototype / Product Demo. Most significant capabilities; NDI/NCS integration
(5, 5) Architecture. Top-level physical and logical architecture (if applicable)
(10, 10) Life Cycle Plan. Life cycle strategy; focus on Development phase & transition increment; key stakeholder responsibilities; project plan; resource estimation
(10, 15) Feasibility Evidence. NDI/NCS alternatives; NDI/NCS evaluation & analysis results; business case (beginnings, including benefits analysis); major risks; capability and LOS feasibility evidence; 3 personas
(5, 5) QFP. Traceability matrix and summary; defect identification review type summary (what & how) by document section or UML, and current defect injection & removal matrix; Quality Management Strategy; technical debt
(20) Things done right; issues to address (Instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.

40 Resilient Agile Team
(x, y) = (presentation time, total time)
(5, 5) Remote Team Member(s). Team’s strong points & weak points (operational view and technical view); concerns & possible solutions; S/P Engineer observations
(10, 10) OCD. Program Model
(5, 5) Requirements. Use cases and use case scenarios; traceability between use case diagram and requirements
(10, 10) Storyboard
(5, 5) Domain Model; Robustness diagram
(10, 10) Life Cycle Plan. Stakeholder responsibilities; project plan
(10, 15) Feasibility Evidence. 3 personas; major risks
(5, 5) QFP. Technical debt
(20) Things done right; issues to address (Instructional staff)
Do not forget your slide numbers. Each chart MUST have information specific to your project.

41 Specifics for DEN students: Team’s strong points & weak points
List at least one item for each of the following:
–List your team’s strong points (operational view; technical view)
–List your team’s weak points (operational view; technical view)
–Identify specific technical concerns & possible solutions
–Identify operational risks & possible mitigations
Sources of observations
–Team activities, package evaluation, WinWin negotiation, etc.

42 Specifics for DEN students: System/Project Engineer at ARB
WinWin Shaping Status:
–Open WinCs: identified
–Agreed WinCs with issues & without issues
Overall project evaluation considerations:
–All SCS(s) CRACKness
–Complexity
–Precedentedness (for the team!)
–Communication & use of communication tools between on-campus team, S/PE, and client
–Skills/needs match
–Knowledge/experience mismatches

43 QFP – Defect Identification Review
For each document section, UML model, etc., identify the following:
–Type of review you used (peer review, agile artifact review, etc.)
–Other forms of defect identification, e.g., grading, client feedback, etc.
Current Defect Injection and Removal Matrix
–Current, total defect information from your progress report
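The injection-and-removal matrix called for above can be summarized mechanically. A minimal sketch, assuming a hypothetical set of artifacts and defect counts (none of these numbers come from an actual progress report):

```python
# Sketch: summarize a defect injection/removal matrix and compute
# removal efficiency per artifact. All counts below are hypothetical.

# matrix[artifact] = (defects injected, defects removed so far)
matrix = {
    "OCD": (12, 10),
    "Prototype": (8, 8),
    "SSRD": (15, 11),
    "SSAD": (20, 14),
}

def removal_efficiency(injected, removed):
    """Fraction of known injected defects removed so far."""
    return removed / injected if injected else 1.0

total_injected = sum(i for i, _ in matrix.values())
total_removed = sum(r for _, r in matrix.values())

for artifact, (injected, removed) in matrix.items():
    print(f"{artifact}: {removed}/{injected} "
          f"({removal_efficiency(injected, removed):.0%} removed)")

print(f"Overall: {total_removed}/{total_injected} "
      f"({removal_efficiency(total_injected, total_removed):.0%})")
```

A team would substitute the cumulative totals from its weekly progress reports for the hypothetical counts.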

44 QFP – Quality Management Strategy
Briefly explain the techniques and tools your team is using for quality management and configuration management. Are they useful? What could be improved?

45 ARB Session Outline: DCR
Similar format to FCR, different focus:
–Less time for OCD, Prototype
–More time for Architecture, Plans
–Details TBD based on FCR ARB experience
General rule on focus: emphasize your project's high-risk areas
–At FCR (in most cases) this will involve establishing the operational concept (including system analysis)
–At DCR (in most cases) this will involve the system design and development plan (especially schedule)

46 Results To Date
* Reasons:
- Poor performance
- Poor team management
- Poor communication (within team and with client)

47 ARB Packages
Bring 4 copies of your ARB presentation (4 slides per page)
Post your presentation on your team website > Valuation phase

48 Demos in ARB
For those teams doing a live demo in the ARB meeting, please include screenshots of your demo in your presentation:
–for your IV&Vers to see the demo in case the video connection is a problem
–for reviewers to make notes on

49 ARB Presentation Slides
Upload your ARB presentation slides (before your ARB) on your team website for off-campus students

50 Webex & Teleconf
Off-campus students who cannot attend the ARB in person will be connecting through Webex / Google+

51 ARB timeslots
Days: Monday Oct 12, Wednesday Oct 14, Friday Oct 16
Slots: 11:00–12:20pm, 12:30–1:50pm, 2:00–3:20pm, 3:30–4:50pm, 5:00–6:20pm
Make your reservation at http://doodle.com/poll/i4gepifithqic5xs
- Clients and all team members, including off-campus students, must be available to attend

52 Past FCR Experiences and General Guidelines

53 Outline
Previous FCR ARB Feedback Summary
Examples of Good and Bad Practices Seen at ARBs
ARB Chartsmanship & Presentation

54 Overall FCR Feedback
Generally done well: presentations, time management, client rapport
Reconcile FCR content with ARB Success Criteria
When asked a question:
–Give the answer in brief; this helps your time management, and the Review Board gets the desired information
–Listen carefully; one speaker at a time
Many had poor time management that indicated the presentation(s) had NOT been practiced
Occasional pointing at the laptop screen, not the projected image (better yet, over Webex, use the mouse)
Very occasionally, slides with NO value added

55 OCD Feedback (1)
Generally done well: organizational goals, operational concept, system boundary and organizational environment
Some benefits chains need rework:
–Added stakeholders: users, clients, developers, IIV&Vers, database administrators, maintainers, interoperators, suppliers
–Assumptions are about the environment, not about the outcome
–Involvement/use of the system before the system is built
–Some organization goal(s) are Benefits Chain end outcomes
System boundary diagram:
–If you are using a component in/for your system, remove it from the environment, e.g., PHP, .NET Framework

56 OCD Feedback (2)
Organization Goals
–Are Benefits Chain end outcomes (or maybe a subset)
–Are NOT project Initiative contributions
Identify Levels of Service properly
–100% availability, 100% reliability: not feasible!
–Make sure you can measure LOS goals
Prototypes and the System are NOT the same (usually)
Business Workflow
–Use an activity-type diagram
–Illustrate business activities: not technical/system activities; may not even “see” the system explicitly
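To see why “100% availability” is infeasible as stated, and what a measurable LOS goal looks like instead, here is a minimal sketch using the standard steady-state availability formula MTBF / (MTBF + MTTR); the failure and repair figures are hypothetical:

```python
# Steady-state availability = MTBF / (MTBF + MTTR).
# 100% would require MTTR = 0 (instantaneous repair), which no real system has.

def availability(mtbf_hours, mttr_hours):
    """Long-run fraction of time the system is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical: one failure every 500 hours, 2 hours to restore service.
a = availability(500.0, 2.0)
print(f"Estimated availability: {a:.2%}")  # prints 99.60%

# A measurable LOS goal states a checkable threshold instead of "100%":
assert a >= 0.99, "LOS goal of 99% availability not met"
```

Stating the goal as a threshold with a measurement method (e.g., monitored uptime over a semester) is what makes it verifiable at TRR.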

57 Prototype Feedback
Generally done well: GUI prototypes, good understanding of client’s needs
Prototype all high-risk elements, not just GUIs
–COTS interoperability, performance, scalability
Use user/client-friendly terms
–“John Doe, 22 Elm St.”, not generic substitutions like “Name1, Addr1”
–Use as an opportunity to gather more information and/or examples
Identify end users and try to get feedback from them
Focus on important and high-priority requirements (initially)

58 Requirements Feedback
Generally done well: project and capability requirements, OCD-requirements traceability
Prioritize all the requirements
Propagate LOS goals from the OCD into the SSRD, or drop LOS requirements from the SSRD (and SSAD)
Distinguish between client-imposed requirements and developer-chosen solutions (SSAD)
Make sure all requirements are testable
Qualify “24/7 availability” with exceptions
Update new requirements in the WinBook tool
There is no such thing as an “implicit requirement”

59 SSAD Feedback
Generally done well: overall views
Follow UML conventions (arrows, annotations, etc.)
Generalization of actors
Uncommon mistakes in use-case diagrams:
–Two actors to one use case (means BOTH must be present)
–Arrow direction for «include» or «extend»
The devil is in the details; simple is best
Only two teams had an adequate start on the Information & Artifacts Diagram
Read the exit criteria for the milestone carefully

60 LCP Feedback - 1
Generally done well: overall strategy, roles and responsibilities
Too many 577b TBDs
Identify required skills for NN new team member(s) (577b; if needed to meet "team size")
Show (concentrate on) your future plan, not the past
Full Foundations phase plan
Don’t plan ONLY for documentation:
–Include modeling
–Include prototyping; coding; executable architecture

61 LCP Feedback - 2
COCOMO drivers
–Often differ per module (type)
–PMAT rationale was often wrong: CS577 projects' process maturity should be between 2 and 3
–Some driver rationales were "ridiculous"
Add DEN student interactions to the Gantt chart:
–IIV&V
–System/Project Engineer
Add maintainer’s responsibilities
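For context on how those drivers enter the estimate, the COCOMO II effort equation has the form PM = A · Size^E · ∏EM, with E = B + 0.01 · ΣSF. A minimal sketch using the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91); the sample scale-factor and multiplier ratings below are hypothetical, not recommended values:

```python
# Sketch of the COCOMO II effort equation:
#   PM = A * Size^E * product(effort multipliers)
#   E  = B + 0.01 * sum(scale factors)
# A = 2.94 and B = 0.91 are the COCOMO II.2000 calibration constants.

A, B = 2.94, 0.91

def cocomo_effort(ksloc, scale_factors, effort_multipliers):
    """Estimated effort in person-months for `ksloc` thousand SLOC."""
    e = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** e
    for em in effort_multipliers:
        pm *= em
    return pm

# Hypothetical small project: 5 KSLOC with made-up ratings.
scale_factors = [3.72, 3.04, 4.24, 3.29, 4.68]  # PREC, FLEX, RESL, TEAM, PMAT
multipliers = [1.00, 0.87, 1.10]                 # a few sample cost drivers

print(f"Estimated effort: {cocomo_effort(5.0, scale_factors, multipliers):.1f} PM")
```

Note that PMAT enters through the scale factors, i.e., the exponent: a wrong PMAT rating distorts how the estimate grows with size, not just its scale, which is why the rationale matters.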

62 FED Feedback
Generally done well: business case framework, risk analysis
Specify LOS feasibility plans
Include training, operations, maintenance, and opportunity costs/effort
Few included developer hours as a cost (including them is correct)
Try to quantify benefits; show return on investment
Change ROI to reflect ongoing costs (possibly savings)
Distinguish one-time from annual costs in the business case
Benefits start in mid-2010 (go at 6-month granularity); costs start mid-2009
Elaborate the process rationale
Complete Section 6 – COTS Analysis
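The ROI points above (separate one-time from annual costs, and let ongoing costs show up in the return) can be made concrete with a short calculation; every dollar figure here is hypothetical:

```python
# Sketch: multi-year ROI that separates one-time development cost from
# recurring annual costs. All dollar figures are hypothetical.

one_time_cost = 20_000   # development (e.g., developer hours)
annual_cost = 4_000      # operations, maintenance, training
annual_benefit = 15_000  # quantified benefit per year

def cumulative_roi(years):
    """(total benefits - total costs) / total costs after `years`."""
    costs = one_time_cost + annual_cost * years
    benefits = annual_benefit * years
    return (benefits - costs) / costs

for y in range(1, 4):
    print(f"Year {y}: cumulative ROI = {cumulative_roi(y):.0%}")
```

With these numbers the ROI is negative in year 1 and turns positive in year 2, which is exactly the kind of break-even story a business case should make explicit.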

63 QFP
Generally done well
Some were missing the traceability and injection-removal matrices
Some seemed to try to "snow us with data" rather than present just a quick summary

64 Things to improve
Presentation and communication skills
–One wrong word could lead to a billion-dollar loss
Practice in front of others
Be concise and precise
Consistency among artifacts
Teamwork vs. integrated individual work
Prepare your client:
–Tell them what an ARB is (use the agenda, success criteria)
–Tell them what to expect from the ARB
Time management
–Get in and set up ASAP
–Have documents & client present

