
Quality Exercise 2 Instructions


1 Quality Exercise 2 Instructions
Quality Analysis
The current software upgrade project has been in progress for a year, and the project should be well into the Coding and Unit Testing phase. The PM believes that something is amiss and has asked your IPT to perform an in-depth analysis to assess the project's status from a quality perspective. The PM received the measurement indicators from the prime contractor (without any additional comments or information) and has given them to your IPT to analyze (see Contractor Data Presented to PM). The PM has also identified the program's specific Quality Attributes and is concerned that the data does not allow for a good quality assessment. The PM wants your IPT to complete the enclosed assessment templates.

Tasks: Within your IPT, review the set of charts carefully to identify problems, assess their impact, project the outcomes, and evaluate alternatives.
1. Compare/contrast the PM's Quality Attributes with the Software Quality Model Characteristics (ISO 9126-1:2001 Software engineering) presented in the lesson material and identify any areas you feel the PM should address.
2. Compare and contrast the ICM Quality measurable concepts (with indicators and measures) from Exercise 1 with the PM's Quality Attributes. Identify the differences between the ICM table and the PM's Quality Attributes.
3. Assess the data provided by the contractor: does the data provided meet information needs based on the PM's Quality Attributes? If not, identify the areas where you need additional data to support the PM's Quality Attributes. (Note: each team will focus on one Quality Attribute area - Team 1 = Attribute 1, etc.)
4. What recommendations would you make to the PM based on the insight gained from your analysis? Be sure to include recommendations for strengthening the overall SW quality program. What additional analyses would you perform to assess and strengthen the SW Quality activities?

SLIDE INFORMATION
Slide Type (Content or Exercise): Content
ELO ID:
ELO - Given several process-focused and product-focused software quality assurance methods, describe how each assures quality in a software acquisition
ELO - Given a software acquisition scenario, recognize the preferred method for identifying and tracking defects

2 PM Quality Attributes

1. Quality Attribute: Interoperability
Attribute Concerns:
A. Ease of interfacing with using applications
B. Ease of interfacing with other systems

2. Quality Attribute: Adaptability
Attribute Concerns:
A. Ability to accommodate new requirements
B. Ability to accommodate new technologies
C. Ability to field a subset of the current requirements (functionality)
D. Ability to support various platforms

3. Quality Attribute: Maintainability
Attribute Concerns:
A. Ability to accommodate new technologies
B. Ability to support isolation of faults
C. Minimize training for maintainers

4. Quality Attribute: Modularity
Attribute Concerns:
A. Replace an architectural component
B. Loose coupling / high cohesion
C. Design from a common set of components

5. Quality Attribute: Usability
Attribute Concerns:
A. Minimize user training (operator, user, non-signal user)
B. Enable user to perform tasks in allocated time
C. Minimize number of operators
D. Consistent GUI (common look and feel)

3 Quality Exercise 2 Answer Template
These are some examples of what the students may identify; the intent is to point out some areas where the PM could focus to assess quality.

Quality Exercise 2 Answer Template - Performance Analysis Worksheet

1. Compare/contrast the PM's Quality Attributes with the Software Quality Model Characteristics (ISO 9126-1:2001 Software engineering) presented in the lesson material and identify any areas you feel the PM should address.

Interoperability (usability, maintainability, portability): The high percentage of interfaces under configuration management suggests that attention is being paid to them, but there is not enough information to say whether they are capable of interfacing with other systems, how complex they are, or whether the interfaces are in compliance and verified.

Adaptability (functionality, reliability, portability): New requirements are currently not under any type of configuration management, which means they are not suited to integrate with in-place interfaces and cannot accommodate new technologies; systems and subsystems may not be available and thus not reliable; and there is no indication of the defects, or their severity, that are causing the new requirements to be out of control.

Maintainability (usability, functionality, efficiency, safety and security): There is no way of determining whether the system can be supported in its present state. Operator errors and error trends are unidentifiable in the system, and we are unable to determine system vulnerabilities.

Modularity (all ICM Quality Measurable Concepts): With the current set of data points, there is no way to determine the PM's Modularity concerns. There is no mention of any architectural components or system design.

Usability (reliability): Unable to determine the number of users. Based on the user error rate, the error rate appears to be declining over time, but the critical error rate remains constant (a simple trend check is sketched at the end of this slide). There is no way to determine whether this is due to formal training, on-the-job training, or system modifications making it easier for the user. Availability of the system and system element failures are not specified, so they cannot be analyzed.

2. Compare and contrast the ICM Quality measurable concepts (with indicators and measures) from Exercise 1 with the PM's Quality Attributes. Identify the differences between the ICM table and the PM's Quality Attributes.

SLIDE INFORMATION
Slide Type (Content or Exercise): Content
ELO ID:
ELO - Given several process-focused and product-focused software quality assurance methods, describe how each assures quality in a software acquisition
ELO - Given a software acquisition scenario, recognize the preferred method for identifying and tracking defects
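Supplementary sketch for the Usability discussion above: to check whether the user error rate is genuinely declining while the critical error rate stays flat, the IPT could fit a simple trend slope to the periodic error data. This is an illustrative sketch only; the monthly values and the errors-per-1,000-transactions units are assumptions, not contractor data.

```python
# Minimal sketch (hypothetical data): quantify whether an error-rate trend is
# actually declining by fitting a least-squares slope to periodic observations.
from statistics import mean

def trend_slope(rates):
    """Least-squares slope of rates vs. reporting period (negative = declining)."""
    n = len(rates)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(rates)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, rates))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical monthly rates (errors per 1,000 transactions), for illustration only.
user_errors = [12.0, 10.5, 9.8, 8.9, 8.1, 7.6]
critical_errors = [1.1, 1.0, 1.1, 1.0, 1.1, 1.0]

print(f"user error trend:     {trend_slope(user_errors):+.2f} per month")
print(f"critical error trend: {trend_slope(critical_errors):+.2f} per month")
```

A clearly negative slope supports the "declining" reading; a slope near zero for critical errors supports the "constant" reading.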

4 Quality Exercise 2 Answer Template
These are some examples of what the students may identify; the intent is to point out some areas where the PM could focus to assess quality.

Quality Exercise 2 Answer Template - Performance Analysis Worksheet

3. Assess the data provided by the contractor: does the data provided meet information needs based on the PM's Quality Attributes? If not, identify the areas where you need additional data to support the PM's Quality Attributes. (Note: each team will focus on one Quality Attribute area - Team 1 = Attribute 1, etc.)

No. Each measure provided is only a single data point, so it cannot be determined whether each measure shows success or failure in meeting each of the PM's quality attributes. Most importantly, what are the measures for (type of SI item, subsystem, or system)?

Examples:
- None of the measures specify whether they pertain to one SI, subsystem, or system.
- Additional data are needed for the requirements (number of new or modified), whether each requirement was critical, and how it was prioritized.
- Are the SIs and subsystems passing their integration testing? If not, how much rework is being generated, in the form of defect rate data? What is the severity of the defects and how are they prioritized?
- How many SI elements are being accepted, and what part of the system does each SI affect?
- For those subsystems passing integration testing, we would need to know their availability for operator training and whether they are having a higher-than-normal failure rate.
- Are known defects being corrected, and how long does it take to correct them? Are more problems being discovered than being fixed?
- Are the interfaces between systems verified? How many interfaces are there? Are the interfaces under CM? What other systems does this system have to interface with, and are all interfaces being verified?

4. What recommendations would you make to the PM based on the insight gained from your analysis? Be sure to include recommendations for strengthening the overall SW quality program. What additional analyses would you perform to assess and strengthen the SW Quality activities?

Recommend that the PM specify to the developer the types of measures they need to provide. Those measures need to be aligned so that they give insight into all system and subsystem development efforts that support the PM's Quality Attributes and the Measurable Concepts. For example, we need to see where we are, where we are going, and when we expect to get there. The current set of data points tells us nothing other than that there are problems: we have no idea what is causing them, we are unable to make a true assessment of the system's current health, and we cannot set up a mitigation strategy to make the system healthy again.

For each of the PM's attributes, the measures should identify what the developer plans to accomplish and what has been completed; a timeline for each effort; the number of requirements being completed; the number of new requirements being generated; the number and type of SW builds, showing the status of each; the number of defects being discovered versus the number being closed (see the sketch below); the results of each SI and subsystem test; the mean time between failures for each system function; planned versus actual interface completion over time; the number of actual interfaces to be completed; the type and number of other systems to be interfaced with and the status of those other systems' interfaces; and the TPM status and trends over time. I would also want to track interface complexity against the actual interfaces being developed.
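The defects-discovered-versus-closed recommendation above lends itself to a simple trend view. The sketch below is a minimal illustration with hypothetical counts and an assumed monthly reporting period; it is not contractor data.

```python
# Minimal sketch (hypothetical counts): track defects discovered vs. closed per
# reporting period and the resulting open backlog, one of the trend views the
# recommendation asks the contractor to provide.
discovered = [30, 42, 55, 61, 58, 64]   # defects found each month (illustrative)
closed     = [25, 30, 33, 35, 36, 38]   # defects closed each month (illustrative)

backlog = 0
print("month  found  closed  open backlog")
for month, (found, fixed) in enumerate(zip(discovered, closed), start=1):
    backlog += found - fixed
    print(f"{month:>5}  {found:>5}  {fixed:>6}  {backlog:>12}")

# A steadily growing backlog indicates discovery is outpacing repair, i.e.
# "more problems being discovered than being fixed."
```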

5 Software Quality Model Characteristics
ISO 9126-1:2001 Software engineering -- Product quality provides a SW quality model that identifies six main quality characteristics:
- Functionality: Suitability, Accurateness, Interoperability, Compliance, Security
- Reliability: Maturity, Fault tolerance, Recoverability
- Usability: Understandability, Learnability, Operability
- Efficiency: Time behavior, Resource behavior
- Maintainability: Analyzability, Changeability, Stability, Testability
- Portability: Adaptability, Installability, Conformance, Replaceability

SLIDE INFORMATION
Policy / Directive / Standard / DTM ID: ISO 9126 Software Engineering - Product Quality
Supporting ELO ID: ELO 24.X.X.X Define software quality
Slide Type (Content or Exercise): Content

ISO 9126 is an international standard for the evaluation of software. The standard is divided into four parts which address, respectively, the following subjects: quality model; external metrics; internal metrics; and quality in use metrics. ISO 9126 Part one, referred to as ISO 9126-1, is an extension of previous work done by McCall (1977), Boehm (1978), FURPS and others in defining a set of software quality characteristics. ISO 9126-1 represents the latest (and ongoing) research into characterizing software for the purposes of software quality control, software quality assurance and software process improvement (SPI). This article defines the characteristics identified by ISO 9126-1. The other parts of ISO 9126, concerning metrics or measurements for these characteristics, are essential for SQC, SQA and SPI, but the main concern of this article is the definition of the basic ISO 9126 Quality Model.

The ISO 9126-1 software quality model identifies six main quality characteristics, namely:

Key Points:
- Efficiency
- Maintainability
- Usability
- Reliability
- Functionality
- Portability

MT: Software Quality factors are attributes of the software that improve the efficient use (capabilities provided) and impact the life-cycle costs of the software. These characteristics are broken down into sub-characteristics; a high-level table is shown below under the Terms / Definitions / Acronyms section.

Key Questions to Ask and Anticipated Answers: How are these "ilities" being put to use in your Software Quality Assurance program?

Terms / Definitions / Acronyms: The main characteristics of the ISO 9126-1 quality model can be defined as follows.

- Suitability: This is the essential Functionality characteristic and refers to the appropriateness (to specification) of the functions of the software.
- Accurateness: This refers to the correctness of the functions; an ATM may provide a cash-dispensing function, but is the amount correct?
- Interoperability: A given software component or system does not typically function in isolation. This subcharacteristic concerns the ability of a software component to interact with other components or systems.
- Compliance: Where appropriate, certain industry (or government) laws and guidelines need to be complied with, i.e. SOX. This subcharacteristic addresses the compliant capability of software.
- Security: This subcharacteristic relates to unauthorized access to the software functions.
- Maturity: This subcharacteristic concerns the frequency of failure of the software.
- Fault tolerance: The ability of software to withstand (and recover from) component, or environmental, failure.
- Recoverability: The ability to bring back a failed system to full operation, including data and network connections.
- Understandability: Determines the ease with which the system's functions can be understood; relates to user mental models in Human Computer Interaction methods.
- Learnability: Learning effort for different users, i.e. novice, expert, casual, etc.
- Operability: Ability of the software to be easily operated by a given user in a given environment.
- Time behavior: Characterizes response times for a given throughput, i.e. transaction rate.
- Resource behavior: Characterizes resources used, i.e. memory, CPU, disk and network usage.
- Analyzability: Characterizes the ability to identify the root cause of a failure within the software.
- Changeability: Characterizes the amount of effort to change a system.
- Stability: Characterizes the sensitivity to change of a given system, that is, the negative impact that may be caused by system changes.
- Testability: Characterizes the effort needed to verify (test) a system change.
- Adaptability: Characterizes the ability of the system to change to new specifications or operating environments.
- Installability: Characterizes the effort required to install the software.
- Conformance: Similar to Compliance for Functionality, but this characteristic relates to portability. One example would be Open SQL conformance, which relates to portability of the database used.
- Replaceability: Characterizes the plug-and-play aspect of software components, that is, how easy it is to exchange a given software component within a specified environment.
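For teams that want to trace the PM's attributes or the exercise data against the standard programmatically, the six characteristics and their sub-characteristics can be held in a simple mapping. This is an illustrative sketch only; the structure mirrors the model listed above and is not itself part of ISO 9126 or the lesson material.

```python
# Illustrative sketch: the ISO 9126-1 characteristics and sub-characteristics
# captured as a mapping, so a PM attribute or concern can be traced to the model.
ISO_9126_1 = {
    "Functionality": ["Suitability", "Accurateness", "Interoperability", "Compliance", "Security"],
    "Reliability": ["Maturity", "Fault tolerance", "Recoverability"],
    "Usability": ["Understandability", "Learnability", "Operability"],
    "Efficiency": ["Time behavior", "Resource behavior"],
    "Maintainability": ["Analyzability", "Changeability", "Stability", "Testability"],
    "Portability": ["Adaptability", "Installability", "Conformance", "Replaceability"],
}

def characteristics_for(sub_characteristic):
    """Return the main characteristic(s) that own a given sub-characteristic."""
    return [c for c, subs in ISO_9126_1.items() if sub_characteristic in subs]

# Example: which main characteristic covers the PM's Interoperability attribute?
print(characteristics_for("Interoperability"))  # -> ['Functionality']
```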

6 Quality Exercise 1 Potential Solution
Columns: Measurable Concepts and Questions Addressed | Potential Indicators and Measures | Priority

Functionality
Questions addressed: Is the product good enough for delivery to the user? Are identified problems being resolved?
Potential indicators and measures: Defect Profiles / Defect Density; Technical Measurement Trends; System Elements Accepted; defects by status, severity, priority, distribution, age, etc.; technical measurement requirement, target, threshold, budget, and actual; system elements verified
Priority: 2

Reliability
Questions addressed: How often is service to users interrupted? Are failure rates within acceptable bounds?
Potential indicators and measures: Mean-Time-to-Failure; Availability; system element failures by severity, priority; system element start, end times
Priority: 1

Usability
Questions addressed: Is the user interface adequate and appropriate for operations? Are operator errors within acceptable bounds?
Potential indicators and measures: User Interface Acceptability; Operator Error Trends; actions from user interface reviews; operator errors
Priority: 5

Efficiency
Questions addressed: Does the target system make efficient use of system resources?
Potential indicators and measures: Utilization; Throughput; Response Time; system element capacity available, used; time for function (budget, actual)
Priority: 6

Maintainability
Questions addressed: How much support does the system require? How difficult is it to support? How big is, and how much change is occurring with, the product's functional size, content, or logical characteristics?
Potential indicators and measures: interface number (unique), complexity, growth, approval rates, changes, TBD/TBR closure per plan; Interface Complexity, Interface Compatibility, Lines of Code Trends; hours to restore; calendar hours and labor hours to repair; number of paths through system
Priority: 4

Portability
Questions addressed: To what extent can the functionality be rehosted on different platforms?
Potential indicators and measures: Interface Compliance; interfaces verified
Priority: 7

Safety and Security
Questions addressed: How many vulnerabilities are identified and remediated by life-cycle phase? How many relevant attack patterns have been covered by test cases?
Potential indicators and measures: vulnerabilities discovered, remediated; cost to fix vulnerabilities; test cases developed, verified per attack pattern
Priority: 3

SLIDE INFORMATION
Slide Type (Content or Exercise): Content
ELO ID:
ELO - Given several process-focused and product-focused software quality assurance methods, describe how each assures quality in a software acquisition
ELO - Given a software acquisition scenario, recognize the preferred method for identifying and tracking defects

The instruction version of the ICM chart is highlighted to show where potential answers may originate; prioritization is based on the current problems the program is having. Students can identify other priorities as long as they justify their answers.
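As a worked illustration of how raw measures in the "Potential Indicators and Measures" column roll up into indicators, the sketch below computes defect density, mean time to failure, and availability. All input numbers are hypothetical, and the defects-per-KSLOC normalization is an assumption, not part of the ICM table.

```python
# Minimal sketch (hypothetical numbers): compute a few of the indicators named
# in the ICM table -- defect density, mean time to failure, and availability.
defects_found = 180          # illustrative defect count for one software item
size_ksloc = 45.0            # illustrative size in thousands of source lines
uptime_hours = 1_900.0       # illustrative operating time between failures
failures = 4                 # illustrative failure count
repair_hours = 12.0          # illustrative total repair time

defect_density = defects_found / size_ksloc   # defects per KSLOC
mttf = uptime_hours / failures                # mean time to failure
mttr = repair_hours / failures                # mean time to repair
availability = mttf / (mttf + mttr)           # steady-state availability

print(f"defect density: {defect_density:.1f} defects/KSLOC")
print(f"MTTF: {mttf:.0f} h, MTTR: {mttr:.1f} h, availability: {availability:.3%}")
```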

