Slide 1: System & Software Architecture Performance Measurement Workshop
31 July 2012
Paul Kohl – Lockheed Martin
Alejandro Bianchi – Liveware IS S.A.
Practical Software and Systems Measurement: Objective Information for Decision Makers
Copyright Lockheed Martin 2012

Slide 2: System & Software Architecture Performance Measurement

Slide 3: INTRODUCTION (Read Ahead Materials)

Slide 4: BACKGROUND

Slide 5: Why?
- Outgrowth of an NDIA/PSM study¹:
  - Identify a set of leading indicators that provide insight into technical performance
  - Build upon objective measures in common practice in industry, government, and accepted standards
  - Select objective measures based on essential attributes (e.g., relevance, completeness, timeliness, simplicity, cost effectiveness, repeatability, and accuracy)
  - Measures should be commonly and readily available
- Results published as the NDIA System Development Performance Measurement Report, December 2011
- Architecture was a high-priority area, but no indicators were identified that met the criteria
- This is an attempt to define measures that can become the leading indicators:
  - Introduce them into common practice
  - Using means that are easy to implement

¹ NDIA System Development Performance Measurement Report, December 2011

Slide 6: What is an Architecture?
ISO/IEC/IEEE 42010:2011, Systems and software engineering – Architecture description:
- Architecture (system): fundamental concepts or properties of a system in its environment embodied in its elements, relationships, and in the principles of its design and evolution
- Elements: structure, behavior, data, procedures
- Relationships: internal, external
- Principles: architecture rules and overarching guidance

Slide 7: Architecture Design Process Activities/Tasks

ISO/IEC/IEEE 15288 Tasks and Activities (reference 6.4.3.3):
- Define the architecture (6.4.3.3 a)
  - Define appropriate logical architecture designs (6.4.3.3 a 1)
  - Partition the system functions (6.4.3.3 a 2)
  - Define and document interfaces (6.4.3.3 a 3)
- Analyze and evaluate the architecture (6.4.3.3 b)
  - Analyze the resulting architectural design (6.4.3.3 b 1)
  - Determine which requirements are allocated to humans (6.4.3.3 b 2)
  - Determine if COTS is available (6.4.3.3 b 3)
  - Evaluate alternative design solutions (6.4.3.3 b 4)
- Document and maintain the architecture (6.4.3.3 c)
  - Specify the physical design solution (6.4.3.3 c 1)
  - Record the architectural design (6.4.3.3 c 2)
  - Maintain mutual traceability (6.4.3.3 c 3)

INCOSE Handbook Activities (reference 4.3.1.5):
- Define the architecture
  - Define consistent logical architecture designs
  - Partition system requirements
  - Identify interfaces and interactions, including human
  - Define V&V criteria
- Analyze and evaluate the architecture
  - Evaluate COTS
  - Evaluate alternative design solutions
  - Support definition of the system integration strategy
- Document and maintain the architecture
  - Document and maintain the design and relevant decisions
  - Establish and maintain traceability

Slide 8: Outcomes of the AD Process
Per ISO/IEC/IEEE 15288, "The purpose of the AD process is to synthesize a solution that satisfies system requirements," with the following outcomes:
- An architecture design baseline
- An implementable set of system element descriptions that satisfy the requirements for the system is specified
- The interface requirements are incorporated into the architecture design solution
- The traceability of the architectural design to system requirements is established
- A basis for verifying the system elements is defined
- A basis for the integration of system elements is established
These outcomes are all objectively measurable.

Slide 9: Traditional Architecture Measurement
Traditionally, architecture quality was determined at the milestone reviews and was a lagging indicator. Reviewers were briefed and had access to documents and artifacts to determine:
- Maturity and consistency
  - Are all the elements required at the current program phase present?
  - Are all requirements accounted for?
  - Does it tie together? Within an architecture level? Between levels? Between artifact types?
- Best architecture = product (solution) quality
  - Does it meet the stakeholder needs?
  - Does it avoid known architecture deficiencies?
  - Does it do so better than alternatives?
It was difficult to see the full picture, and even harder to determine consistency.

Slide 10: Program Manager Leading Indicator Needs
- Does the architecture provide the right solution to the problem, and does it meet all the requirements?
  - Best solution
  - Requirements traceability
- Is the architecture going to be done on time?
  - Progress / percent complete
  - Stability
- Will the architecture be low in defects?
  - No missing data
  - Entered data is correct
  - Data is consistent between artifacts and/or system elements

Slide 11: Additional Measurement Needs
- Process efficiency: Can the process be done better to reduce cost or improve quality?
- Size/complexity: How big and/or complex is the architecture effort, so I can compare it to other efforts?
- Cost: What was the total effort? What effort was required for each task / system element / artifact?

Slide 12: Measurement Beyond the Program
Enterprise-type metrics related to architecture:
- Process efficiency
- ROI in architecture
- Market share (meeting customer/stakeholder needs)
We need to identify base measures of architectures that can support the above.

Slide 13: MEANS OF MEASURING

Slide 14: Architecture Measures
Architecture measurement requires a set of measures to fully address the needs.
Measures may be:
- Objective (quantitative), where discrete elements can be counted or otherwise measured, or
- Subjective (quality), where human judgment is needed to fully evaluate an aspect of the architecture
Measures should be:
- Based on common practice and standards
- Readily obtainable
- Reflective of essential attributes of architecture

Slide 15: Measurement in a Model-Based Environment
Model-based architecting (or architecture modeling) makes the evaluation of completeness and consistency feasible as a leading indicator:
- Architecture tools provide better insight into consistency and completeness via pre-defined reports or by directly accessing the underlying database
- Easier to count artifacts and determine change dates
- Easier to determine empty data fields
- Easier to make consistency checks between architecture artifacts (parent-child, peer-to-peer)
Quantitative measures are now available.

Slide 16: Impact of Architecture Frameworks on Measurement
Architecture frameworks have defined stable sets of process activities (TOGAF) or viewpoints/models (DoDAF and FEAF). The latter provide items which may be measured. Combined with the advances in modeling tools, we have a standard set of products which may be measured with relative ease:
- Size
- % complete
- Conformance to standard
- Adequacy of representation (right viewpoints, well represented)

Slide 17: Quantitative Measurement
The goal is to measure whether an architecture is complete and consistent. This is easier with model-based architecting:
- Anticipated artifacts vs. completed artifacts
- Internal reports showing missing data and inconsistencies between artifacts
- Supported by many of the architecture tools, but requires effort on the part of the program to create and customize
- Models help visualize heuristics as well
Examples:
- Progress chart
- Requirements trace reports (SELI)
- TBx closure rate (SELI)
- Empty data field counts
- Visual reviews of artifacts
- Other reports from the modeling tool database that address consistency
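To illustrate the "empty data field counts" example above, here is a minimal sketch in Python. The CSV export, its column names, and the element data are all hypothetical, not any specific architecture tool's schema.

```python
import csv
import io

# Hypothetical export of model elements; columns and data are invented
# for illustration only.
EXPORT = """id,name,description,allocated_requirement
E1,Nav Subsystem,Computes position,REQ-101
E2,Comms Subsystem,,REQ-102
E3,Power Subsystem,Distributes power,
"""

def empty_field_counts(rows):
    """Count empty cells per column: a simple 'missing data' indicator."""
    counts = {}
    for row in rows:
        for field, value in row.items():
            if not (value or "").strip():
                counts[field] = counts.get(field, 0) + 1
    return counts

rows = list(csv.DictReader(io.StringIO(EXPORT)))
print(empty_field_counts(rows))  # {'description': 1, 'allocated_requirement': 1}
```

A nonzero count flags artifacts whose data entry is incomplete, which is exactly the kind of report many modeling tools can be customized to produce.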

Slide 18: Additional Quantitative Measurables
- % of functional requirements with elaborated behavior
- % of requirements allocated/traced to an element of the architecture
- Tables of normalized interface counts across elements
- Level of detail of behavior definition
- # of defects per element
From Seidl & Sneed, "Modeling Metrics for UML Diagrams," Testing Experience, Sep-Oct 2011: formulas for calculating measures of a UML software architecture. Sample measures include:
- Design complexity: 1 - (# design entities / # design relationships)
- Degree of coupling
- Degree of consistency
- Degree of completeness
These are applicable to system architectures as well, and adaptable to other modeling languages in a similar fashion.
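A short sketch of the design-complexity formula quoted above, plus an assumed reading of degree of completeness (the slide names that measure but does not give its formula, so the ratio below is our interpretation):

```python
def design_complexity(n_entities, n_relationships):
    # From the slide: complexity = 1 - (# design entities / # design relationships).
    # More relationships per entity pushes the value toward 1.
    return 1 - (n_entities / n_relationships)

def degree_of_completeness(n_traced, n_total):
    # Assumed interpretation: fraction of requirements traced to an
    # architecture element (the slide lists the measure without a formula).
    return n_traced / n_total

print(design_complexity(20, 50))       # 20 entities, 50 relationships -> 0.6
print(degree_of_completeness(45, 50))  # 45 of 50 requirements traced -> 0.9
```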

Slide 19: Example Progress Table/Chart

| Diagram type                 | Estimated # | Started | Definition TEM complete | Drawn | Inspected | ERBed | % Complete |
|------------------------------|-------------|---------|-------------------------|-------|-----------|-------|------------|
| System Behavior Diagrams     | 26          |         |                         |       |           |       | 100%       |
| Subsystem Behavior Diagrams  | 175         |         |                         | 170   | 160       | 150   | 86%        |
| Component Behavior Diagrams  | 300         |         |                         | 25    | 20        | 15    | 5%         |
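In the table above, the % Complete column is consistent with the final-stage count divided by the estimate, rounded to a whole percent. A sketch of that calculation:

```python
def percent_complete(finished, estimated):
    """Completion as a whole percent, as in the progress table."""
    return round(100 * finished / estimated)

# Reproduces the table's rows:
print(percent_complete(150, 175))  # subsystem:  86
print(percent_complete(15, 300))   # component:  5
print(percent_complete(26, 26))    # system:     100
```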

Slide 20: Qualitative Measurement
The goal is to ensure the architecture is correct and satisfies the needs:
- Does it meet stakeholder needs within the program constraints?
- Is it better than the alternative architectures in satisfying stakeholder needs?
This is still somewhat subjective but has aspects that can be measured:
- Can only be determined in comparison to the alternatives
- TPMs and MOE/KPP satisfaction compared
Examples:
- TPM/MOE radar charts
- Estimate at completion vs. TPM/MOE
- Architecture design trade study records

Slide 21: Additional Qualitative Measurables
- Reusability
- Maintainability
- Scalability
- Risk in execution
  - Architecture build-out
  - Implementation (manufacturability or missing skill sets)
  - Uncertainty of the evaluation of the other factors/measures, and the potential impact of being wrong
- Technical risk (has its own set of measures and won't be covered)

Slide 22: Example Architecture "Radar" Chart / Table

| Attribute    | Weight | Value | Weighted Value |
|--------------|--------|-------|----------------|
| Flexibility  | 25%    | 75%   | 19%            |
| Adaptability | 10%    | 80%   | 8%             |
| Modular      | 15%    | 25%   | 4%             |
| Simplicity   | 10%    | 75%   | 8%             |
| Usability    | 10%    | 75%   | 8%             |
| Performance  | 30%    | 100%  | 30%            |
| Total        | 100%   |       | 77%            |

The "utility function" for the architecture assessment is a simple weighted sum of the assessed attribute values; repeat for each candidate architecture. (The accompanying radar chart plots the attributes 1 through N as its axes.)
Key attributes ("must haves") are evaluated as true/false. Examples:
- Completeness of requirements coverage
- Threshold performance
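The weighted-sum utility above can be sketched as follows. Note the table's 77% total comes from summing per-row values that were already rounded to whole percents; the unrounded sum is 75.5%.

```python
# Weights and scores taken from the slide's attribute table.
WEIGHTS = {"Flexibility": 0.25, "Adaptability": 0.10, "Modular": 0.15,
           "Simplicity": 0.10, "Usability": 0.10, "Performance": 0.30}

def utility(scores, weights=WEIGHTS):
    """Simple weighted sum; repeat for each candidate architecture."""
    return sum(weights[a] * scores[a] for a in weights)

scores = {"Flexibility": 0.75, "Adaptability": 0.80, "Modular": 0.25,
          "Simplicity": 0.75, "Usability": 0.75, "Performance": 1.00}

print(round(100 * utility(scores), 1))  # 75.5 (unrounded weighted sum)

# The slide's 77% total: each row rounded to a whole percent, then summed.
row_totals = sum(round(100 * WEIGHTS[a] * scores[a]) for a in WEIGHTS)
print(row_totals)  # 77
```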

Slide 23: Structural Heuristics
"The eye is a fine architect. Believe it." – Wernher von Braun, 1950
"A good solution somehow looks nice." – Robert Spinrad, 1991

Slide 24: Heuristics
Heuristics are additional ways to measure architecture quality.
- Heuristics: "Does it look right?"
  - Review of the model artifacts can sometimes indicate whether an architecture exhibits good or bad characteristics, such as low cohesion or high levels of coupling
- Internal metrics
  - Number of internal interfaces
  - Number of requirements per architecture element, which can indicate an imbalance
  - Coupling counts
- Heuristics and expert review are experience-based
  - Not generally directly measurable using quantitative means
  - If not applied early, they become a lagging indicator

Slide 25: Heuristics
- Loose coupling (McCabe 1976, Carson 2000)
  - Number and type of interfaces
  - What is "too many" (i.e., "tight coupling")?
- Functional cohesion
  - Quantity of data between elements
  - What is "too much data between elements"?
Can we identify specific architecture measures for these and define thresholds? Can these be compensated for by other program elements (cf. Gau Pagnanelli et al., INCOSE 2012)?
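One way to operationalize the "too many interfaces" question above is a per-element interface count compared against a program-chosen threshold. A hypothetical sketch: the element names, interface list, and threshold value are all assumptions, not values from the slide.

```python
# Hypothetical interface list: (element_a, element_b) pairs from the model.
INTERFACES = [("Nav", "Comms"), ("Nav", "Power"), ("Nav", "Display"),
              ("Nav", "Storage"), ("Comms", "Power")]

COUPLING_THRESHOLD = 3  # assumed program-specific limit, not a standard value

def interface_counts(interfaces):
    """Count interfaces per element: a simple proxy for coupling."""
    counts = {}
    for a, b in interfaces:
        counts[a] = counts.get(a, 0) + 1
        counts[b] = counts.get(b, 0) + 1
    return counts

def tightly_coupled(interfaces, threshold=COUPLING_THRESHOLD):
    """Elements whose interface count exceeds the threshold."""
    return [e for e, n in interface_counts(interfaces).items() if n > threshold]

print(tightly_coupled(INTERFACES))  # ['Nav'] -- 4 interfaces exceeds the threshold
```

Where the threshold should sit is exactly the open question the slide raises; in practice a program would calibrate it against comparable past efforts.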

Slide 26: Heuristics Example
(Two candidate partitionings are shown: one with high external complexity, one with low external complexity.)
Which partitioning is better? Why?

Slide 27: Architectural Characteristics
Levels:
- System of Systems: multi-enterprise or multi-platform
- System: enterprise or platform
- Subsystem: self-contained functionality
- Component: (set of) OS address space(s)
Scope includes software, hardware, and users/operators.
Characteristics range across the levels:
- Organizational owner: one to many
- Autonomy: high to low
- Coupling: low to high
- Ability to enforce implementation uniformity: low to high
- Acceptance criteria: specific to general

Slide 28: Additional Heuristics
Design patterns in the architecture that provide warning that something might be amiss:
- System architectures
  - Functionality (for a single capability) scattered between multiple architecture elements
  - Functionality grouped with unlike functionality
  - Ambiguous interfaces without clear definitions
  - Functionality which requires extensive interaction between elements
- Software architectures
  - Ambiguous interfaces
  - Extraneous connectors (two types of connectors used to link software components)
  - Excessive interaction requirements between software components

Slide 29: Heuristics Application
Heuristics must be applied within the architecture team to be effective:
- Utilized as part of artifact/product inspections
- Applied, as a requirement, prior to baselining of products
Otherwise, heuristics become a lagging indicator:
- Found at milestone reviews
- Become defects

Slide 30: Other Sources of Heuristics
- Rechtin and Maier, The Art of Systems Architecting, 3rd ed., 2009
- Garcia, Popescu, Edwards, and Medvidovic, "Identifying Architectural Bad Smells" (undated), USC database
- Personal experience (usually hard won)

Slide 31: WORKSHOP OBJECTIVES

Slide 32: Workshop Objectives
- Identify the key attributes of architecture to be measured
- Define a set of architecture measures that provide insight into the architecture
  - Base and/or composite (derived)
  - Support program leadership needs for leading indicators
  - Are quantitative
  - Are readily obtainable
- Recommend means/methods for obtaining the measures
  - Modeling tools
  - Requirements tools
  - Outputs from related processes
- Fill in the PSM template for the measures

