Quality Exercise 2 Instructions

Quality Exercise 2 Instructions

Quality Analysis

The current software upgrade project has been in progress for a year and should be well into the Coding and Unit Testing phase. The PM believes that something is amiss and has asked your IPT to perform an in-depth analysis of the project's status from a quality perspective. The PM received the measurement indicators from the prime contractor (without any additional comments or information) and has given them to your IPT to analyze (see Contractor Data Presented to PM). The PM has also identified the program's specific Quality Attributes and is concerned that the data does not allow for a good quality assessment. The PM wants your IPT to complete the enclosed assessment templates.

Tasks: Within your IPT, review the set of charts carefully to identify problems, assess their impact, project the outcomes, and evaluate alternatives.

1. Compare/contrast the PM's Quality Attributes with the Software Quality Model Characteristics (ISO 9126-1:2001 Software engineering) presented in the lesson material and identify any areas you feel the PM should address.
2. Compare and contrast the ICM Quality measurable concepts (with indicators and measures) from Exercise 1 with the PM's Quality Attributes. Identify the differences between the ICM table and the PM's Quality Attributes.
3. Assess the data provided by the contractor: does the data meet information needs based on the PM's Quality Attributes? If not, identify the areas where you need additional data to support the PM's Quality Attributes. (Note: Each team will focus on one Quality Attribute area; Team 1 = Attribute 1, etc.)
4. What recommendations would you make to the PM based on the insight gained from your analysis? Be sure to include recommendations for strengthening the overall SW quality program.
5. What additional analyses would you perform to assess and strengthen the SW Quality activities?
SLIDE INFORMATION
*Slide Type (Content or Exercise): Content
*ELO ID:
ELO 24.1.1.7 Given several process-focused and product-focused software quality assurance methods, describe how each assures quality in a software acquisition
ELO 24.1.1.8 Given a software acquisition scenario, recognize the preferred method for identifying and tracking defects

PM Quality Attributes

1. Quality Attribute: Interoperability
Attribute Concerns:
A. Ease of interfacing with using applications
B. Ease of interfacing with other systems

2. Quality Attribute: Adaptability
Attribute Concerns:
A. Ability to accommodate new requirements
B. Ability to accommodate new technologies
C. Ability to field a subset of the current requirements (functionality)
D. Ability to support various platforms

3. Quality Attribute: Maintainability
Attribute Concerns:
A. Ability to accommodate new technologies
B. Ability to support isolation of faults
C. Minimize training for maintainers

4. Quality Attribute: Modularity
Attribute Concerns:
A. Replace an architectural component
B. Loose coupling/high cohesion
C. Design from a common set of components

5. Quality Attribute: Usability
Attribute Concerns:
A. Minimize user training (operator, user, non-signal user)
B. Enable user to perform tasks in allocated time
C. Minimize number of operators
D. Consistent GUI (common look and feel)

Quality Exercise 2 Answer Template

These are some examples of what the students may identify; the intent is to point out some areas where the PM could focus to assess quality.

Performance Analysis Worksheet

1. Compare/contrast the PM's Quality Attributes with the Software Quality Model Characteristics (ISO 9126-1:2001 Software engineering) presented in the lesson material and identify any areas you feel the PM should address.

Interoperability (usability, maintainability, portability): The high percentage of interfaces under configuration management suggests that attention is being paid to them, but there is not enough information to say whether they are capable of interfacing with other systems, how complex they are, or whether the interfaces are in compliance and verified.

Adaptability (functionality, reliability, portability): New requirements are currently not under any type of configuration management, which means they are not suited to integrate with in-place interfaces and unable to accommodate new technologies; systems and subsystems may not be available, and thus not reliable; and there is no indication of the defects, or their severity, causing the new requirements to be out of control.

Maintainability (usability, functionality, efficiency, safety and security): There is no way of determining whether the system can be supported in its present state. Operator errors and error trends are unidentifiable in the system, and we are unable to determine system vulnerabilities.

Modularity (all ICM Quality Measurable Concepts): With the current set of data points, there is no way to address the PM's Modularity concerns. There is no mention of any architectural components or system design.

Usability (reliability): Unable to determine the number of users. Based on the user error rate, it appears that over time the error rate is declining, but the critical error rate remains constant. There is no way to determine whether this is the result of formal training, on-the-job training, or system modifications making the system easier to use. Availability of the system and system element failures are not specified for analysis.

2. Compare and contrast the ICM Quality measurable concepts (with indicators and measures) from Exercise 1 with the PM's Quality Attributes. Identify the differences between the ICM table and the PM's Quality Attributes.
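The Usability finding above reasons from error-rate trends: a declining user error rate against a constant critical error rate. As a minimal sketch of how such a trend could be checked numerically, the following fits a least-squares slope to each series; the period counts and rates are invented for illustration, not taken from the contractor data:

```python
# Hedged sketch: estimate the trend (slope) of periodic error-rate data.
# The sample rates below are invented for illustration only.

def trend_slope(rates):
    """Least-squares slope of a rate series over equally spaced periods."""
    n = len(rates)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(rates) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, rates))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

user_errors = [12.0, 10.5, 9.0, 8.2, 7.1]    # errors per 1000 operations, per period
critical_errors = [2.1, 2.0, 2.1, 2.0, 2.1]  # critical errors per 1000, per period

print(trend_slope(user_errors))      # negative slope: declining
print(trend_slope(critical_errors))  # near-zero slope: flat
```

A clearly negative slope for one series and a near-zero slope for the other is the pattern the worksheet describes; the sketch says nothing about the cause (training versus system changes), which is exactly the gap the worksheet identifies.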

Quality Exercise 2 Answer Template

These are some examples of what the students may identify; the intent is to point out some areas where the PM could focus to assess quality.

Performance Analysis Worksheet

3. Assess the data provided by the contractor: does the data provided meet information needs based on the PM's Quality Attributes? If not, identify the areas where you need additional data to support the PM's Quality Attributes. (Note: Each team will focus on one Quality Attribute area; Team 1 = Attribute 1, etc.)

No. Each measure provided is only a single data point. We are unable to determine whether each measure shows success or failure in achieving each of the PM's quality attributes. Most importantly, what are the measures for (type of SI item, subsystem, or system)?

Examples:
None of the measures specify whether they pertain to one SI, subsystem, or system.
We need additional data for the requirements (number of new or modified), whether each requirement was critical, and how it was prioritized.
Are the SIs and subsystems passing their integration testing? If not, how much rework is being generated, in the form of defect-rate data?
What is the severity of the defects, and how are they prioritized?
How many SI elements are being accepted, and what part of the system does each SI affect?
For the subsystems passing integration testing, we would need to know their availability for operator training and whether they are having a higher-than-normal failure rate.
Are known defects being corrected, and how long does it take to correct them? Are more problems being discovered than fixed?
Are the interfaces between systems verified? How many interfaces are there? Are the interfaces under CM? What other systems does this system have to interface with, and are all interfaces being verified?

4. What recommendations would you make to the PM based on the insight gained from your analysis? Be sure to include recommendations for strengthening the overall SW quality program. What additional analyses would you perform to assess and strengthen the SW Quality activities?

Recommend that the PM specify to the developer the types of measures they need to provide. Those measures need to be aligned so that they give insight into all system and subsystem development efforts that support the PM's Quality Attributes and the Measurable Concepts. For example, we need to see where we are, where we are going, and when we expect to get there. The current set of data points tells us only that there are problems; we have no idea what is causing them, we are unable to make a true assessment of the system's current health, and we cannot set up a mitigation strategy to make the system healthy again.

For each of the PM's attributes, we would like to see measures identified by what they plan to accomplish and what has been completed; a timeline for each effort; the number of requirements being completed; the number of new requirements being generated; the number and type of SW builds, showing the status of each; the number of defects being discovered versus the number being closed; the results of each SI and subsystem test; the mean time between failures for each system function; planned and actual interface completion over time; the number of actual interfaces to be completed; the type and number of other systems to be interfaced with, and the status of those systems' interfaces; and TPM status and trends over time. We would also want to track interface complexity against the actual interfaces being developed.
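One of the recommended measures, defects discovered versus defects closed, can be sketched as a simple open-backlog trend. The weekly counts below are invented sample data, not taken from the exercise charts:

```python
# Hedged sketch of the "discovered vs. closed" defect view recommended above.
# Weekly counts are invented sample data, not from the exercise.

def open_backlog(discovered, closed):
    """Cumulative open-defect backlog, one entry per reporting period."""
    backlog, total = [], 0
    for found, fixed in zip(discovered, closed):
        total += found - fixed
        backlog.append(total)
    return backlog

discovered = [14, 18, 22, 25, 30]  # new defects found each week
closed = [10, 12, 13, 14, 15]      # defects closed each week

print(open_backlog(discovered, closed))  # [4, 10, 19, 30, 45]
```

A backlog that grows week over week, as in this sample, is the "more problems discovered than fixed" signal the worksheet asks about; a flat or shrinking backlog would suggest the find/fix rates are in balance.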

Software Quality Model Characteristics

ISO 9126-1:2001 Software engineering -- Product quality provides a SW quality model that identifies six main quality characteristics and their sub-characteristics:

Functionality: Suitability, Accurateness, Interoperability, Compliance, Security
Reliability: Maturity, Fault tolerance, Recoverability
Usability: Understandability, Learnability, Operability
Efficiency: Time behavior, Resource behavior
Maintainability: Analyzability, Changeability, Stability, Testability
Portability: Adaptability, Installability, Conformance, Replaceability

SLIDE INFORMATION
*Policy / Directive / Standard / DTM ID: ISO 9126 Software Engineering – Product Quality
*Supporting ELOs ID: ELO 24.X.X.X Define software quality
*Slide Type: Content (Content or Exercise)

ISO 9126 is an international standard for the evaluation of software. The standard is divided into four parts, which address, respectively, the following subjects: quality model; external metrics; internal metrics; and quality-in-use metrics. Part one, referred to as ISO 9126-1, is an extension of previous work done by McCall (1977), Boehm (1978), FURPS and others in defining a set of software quality characteristics. ISO 9126-1 represents the latest (and ongoing) research into characterizing software for the purposes of software quality control, software quality assurance and software process improvement (SPI). This article defines the characteristics identified by ISO 9126-1. The other parts of ISO 9126, concerning metrics or measurements for these characteristics, are essential for SQC, SQA and SPI, but the main concern of this article is the definition of the basic ISO 9126 quality model.
The ISO 9126-1 software quality model identifies six main quality characteristics, namely:

Key Points:
Functionality
Reliability
Usability
Efficiency
Maintainability
Portability

MT1.4.1. Software Quality factors are attributes of the software that improve the efficient use (capabilities provided) and impact the life-cycle costs of the software. These characteristics are broken down into sub-characteristics; a high-level table is shown below under the Terms \ Definitions \ Acronyms section.

Key Questions to Ask and Anticipated Answers:
How are these "ilities" being put to use in your Software Quality Assurance program?

Terms \ Definitions \ Acronyms:
The sub-characteristics of the ISO 9126-1 quality model can be defined as follows:

Suitability: The essential Functionality sub-characteristic; refers to the appropriateness (to specification) of the functions of the software.
Accurateness: Refers to the correctness of the functions. An ATM may provide a cash-dispensing function, but is the amount correct?
Interoperability: A given software component or system does not typically function in isolation. This sub-characteristic concerns the ability of a software component to interact with other components or systems.
Compliance: Where appropriate, certain industry (or government) laws and guidelines need to be complied with, e.g. SOX. This sub-characteristic addresses the compliance capability of the software.
Security: Relates to unauthorized access to the software functions.
Maturity: Concerns the frequency of failure of the software.
Fault tolerance: The ability of software to withstand (and recover from) component or environmental failure.
Recoverability: The ability to bring a failed system back to full operation, including data and network connections.
Understandability: The ease with which the system's functions can be understood; relates to user mental models in Human-Computer Interaction methods.
Learnability: Learning effort for different users, i.e. novice, expert, casual, etc.
Operability: Ability of the software to be easily operated by a given user in a given environment.
Time behavior: Characterizes response times for a given throughput, i.e. transaction rate.
Resource behavior: Characterizes resources used, i.e. memory, CPU, disk and network usage.
Analyzability: Characterizes the ability to identify the root cause of a failure within the software.
Changeability: Characterizes the amount of effort needed to change a system.
Stability: Characterizes the sensitivity to change of a given system, that is, the negative impact that may be caused by system changes.
Testability: Characterizes the effort needed to verify (test) a system change.
Adaptability: Characterizes the ability of the system to change to new specifications or operating environments.
Installability: Characterizes the effort required to install the software.
Conformance: Similar to compliance for functionality, but relates to portability. One example would be Open SQL conformance, which relates to portability of the database used.
Replaceability: Characterizes the plug-and-play aspect of software components, that is, how easy it is to exchange a given software component within a specified environment.
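One way to put the "ilities" to work in an SQA checklist is to encode the characteristic/sub-characteristic taxonomy above as a lookup table. The mapping follows the slide; the data structure and function name are illustrative choices, not part of the standard:

```python
# The ISO 9126-1 characteristic / sub-characteristic taxonomy from the
# slide, encoded as a lookup table for use in a quality checklist.

ISO_9126_MODEL = {
    "Functionality": ["Suitability", "Accurateness", "Interoperability",
                      "Compliance", "Security"],
    "Reliability": ["Maturity", "Fault tolerance", "Recoverability"],
    "Usability": ["Understandability", "Learnability", "Operability"],
    "Efficiency": ["Time behavior", "Resource behavior"],
    "Maintainability": ["Analyzability", "Changeability", "Stability",
                        "Testability"],
    "Portability": ["Adaptability", "Installability", "Conformance",
                    "Replaceability"],
}

def characteristic_of(sub):
    """Return the main characteristic a given sub-characteristic belongs to."""
    for main, subs in ISO_9126_MODEL.items():
        if sub in subs:
            return main
    return None

print(characteristic_of("Testability"))  # Maintainability
```

A table like this lets an IPT tag each contractor measure with the sub-characteristic it informs, making gaps (such as the Modularity gap noted in the answer template) visible as characteristics with no measures attached.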

Quality Exercise 1 Potential Solution

Measurable Concepts, Questions Addressed, Potential Indicators and Measures, and Priority:

Functionality (Priority 2)
Questions: Is the product good enough for delivery to the user? Are identified problems being resolved?
Indicators: Defect Profiles / Defect Density; Technical Measurement Trends; System Elements Accepted
Measures: Defects by status, severity, priority, distribution, age, etc.; technical measurement requirement, target, threshold, budget, and actual; system elements verified

Reliability (Priority 1)
Questions: How often is service to users interrupted? Are failure rates within acceptable bounds?
Indicators: Mean-Time-to-Failure; Availability
Measures: System element failures by severity, priority; system element start and end times

Usability (Priority 5)
Questions: Is the user interface adequate and appropriate for operations? Are operator errors within acceptable bounds?
Indicators: User Interface Acceptability; Operator Error Trends
Measures: Actions from user interface reviews; operator errors

Efficiency (Priority 6)
Questions: Does the target system make efficient use of system resources?
Indicators: Utilization; Throughput; Response Time
Measures: System element capacity available, used; time for function (budget, actual)

Maintainability (Priority 4)
Questions: How much support does the system require? How difficult is it to support? How big is, and how much change is occurring in, the product's functional size, content, or logical characteristics?
Indicators: Interface Complexity; Interface Compatibility; Lines of Code Trends
Measures: Interface number (unique), complexity, growth, approval rates, changes, TBD/TBR closure per plan; hours to restore; calendar hours and labor hours to repair; number of paths through the system

Portability (Priority 7)
Questions: To what extent can the functionality be rehosted on different platforms?
Indicators: Interface Compliance
Measures: Interfaces verified

Safety and Security (Priority 3)
Questions: How many vulnerabilities are identified and remediated by life-cycle phase? How many relevant attack patterns have been covered by test cases?
Measures: Vulnerabilities discovered, remediated; cost to fix vulnerabilities; test cases developed, verified per attack pattern

The instruction version of the ICM chart is highlighted to show where potential answers may originate; prioritization is based on the current problems the program is having. Students can identify other priorities as long as they justify their answers.
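The Reliability row's indicators, Mean-Time-to-Failure and availability, can be derived from system element start and end times. A minimal sketch, with invented uptime/downtime interval data:

```python
# Hedged sketch of the Reliability indicators in the ICM table:
# Mean-Time-To-Failure and availability from uptime/downtime intervals.
# The interval data below is invented sample data.

def mttf(uptimes):
    """Mean time to failure: average uptime (hours) between failures."""
    return sum(uptimes) / len(uptimes)

def availability(uptimes, downtimes):
    """Fraction of total elapsed time the system was operational."""
    up = sum(uptimes)
    return up / (up + sum(downtimes))

uptimes = [120.0, 95.0, 160.0, 105.0]  # hours of operation before each failure
downtimes = [4.0, 2.0, 6.0, 3.0]       # hours to restore after each failure

print(mttf(uptimes))                    # 120.0
print(availability(uptimes, downtimes))
```

This is the kind of derived view the answer template says the single-point contractor data cannot support: without per-element start and end times over multiple periods, neither indicator can be trended.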