Recap
- What is testing? Who does testing? Why do we do testing?
- Software testing process
- Levels of testing
- Methods/techniques of testing
- Test cases; writing effective test cases
What is SQA?
Software Quality Assurance is an umbrella activity that is applied throughout the software process.
What is quality?
Quality refers to any measurable characteristic such as correctness, maintainability, portability, testability, usability, reliability, efficiency, integrity, reusability, and interoperability.
Quality terminologies
- Quality of design refers to the characteristics that designers specify for an item.
- Quality of conformance is the degree to which the design specifications are followed during manufacturing.
- Quality control is the series of inspections, reviews, and tests used throughout the development cycle to ensure that each work product meets the requirements placed upon it.
- Quality policy refers to the basic aims and objectives of an organization regarding quality, as stipulated by the management.
- Quality assurance consists of the auditing and reporting functions of management.
- Cost of quality includes all costs incurred in the pursuit of quality or in performing quality-related activities, such as prevention costs, appraisal costs, and failure costs (internal and external).
- Quality planning is the process of assessing the requirements of the procedure and of the product, and the context in which these must be observed.
- Quality testing is the assessment of the extent to which a test object meets given requirements.
- A quality assurance plan is the central aid for planning and checking the quality assurance.
- A quality assurance system is the organizational structure, responsibilities, procedures, processes, and resources for implementing quality management.
Elements of S/W Quality Assurance
- Standards
- Reviews and audits
- Testing
- Error/defect collection and analysis
- Change management
- Education
- Vendor management
- Security management
- Safety
- Risk management
SQA tasks
- Prepares an SQA plan for a project
- Participates in the development of the project's software process description
- Reviews software engineering activities to verify compliance with the defined software process
- Audits designated software work products to verify compliance with those defined as part of the software process
- Ensures that deviations in software work and work products are documented and handled according to a documented procedure
- Records any noncompliance and reports it to senior management
SQA Goals, Attributes and Metrics

Goal: Requirement quality
- Ambiguity: number of ambiguous modifiers (e.g., many, large, human-friendly)
- Completeness: number of TBAs, TBDs
- Understandability: number of sections/subsections
- Volatility: number of changes per requirement; time (by activity) when change is requested
- Traceability: number of requirements not traceable to design/code
- Model clarity: number of UML models; number of descriptive pages per model; number of UML errors

Goal: Design quality
- Architectural integrity: existence of an architectural model
- Component completeness: number of components that trace to the architectural model; complexity of procedural design
- Interface complexity: layout appropriateness
- Patterns: number of patterns used
SQA Goals, Attributes and Metrics (continued)

Goal: Code quality
- Complexity: cyclomatic complexity
- Maintainability: design factors
- Understandability: percent internal comments; variable naming conventions
- Reusability: percent reused components
- Documentation: readability index

Goal: QC effectiveness
- Resource allocation: staff-hour percentage per activity
- Completion rate: actual vs. budgeted completion time
- Review effectiveness: review metrics
- Testing effectiveness: number of errors found and their criticality; effort required to correct an error; origin of error
SQA plan
- Management section: describes the place of SQA in the structure of the organization
- Documentation section: describes each work product produced as part of the software process
- Standards, practices, and conventions section: lists all applicable standards/practices applied during the software process and any metrics to be collected as part of the software engineering work
- Reviews and audits section: provides an overview of the approach used in the reviews and audits to be conducted during the project
- Test section: references the test plan and procedure document and defines test record-keeping requirements
- Problem reporting and corrective action section: defines procedures for reporting, tracking, and resolving errors or defects, and identifies organizational responsibilities for these activities
- Other: tools, SQA methods, change control, record keeping, training, and risk management
Statistical SQA
- Information about software defects is collected and categorized
- An attempt is made to trace each defect to its underlying cause
- Isolate the vital few causes of the major source of all errors
- Then move to correct the problems that have caused the defects
Statistical SQA – Categories of errors
- Incomplete or erroneous specification (IES)
- Misinterpretation of customer communication (MCC)
- Intentional deviation from specification (IDS)
- Violation of programming standards (VPS)
- Error in data representation (EDR)
- Inconsistent module interface (IMI)
- Error in design logic (EDL)
- Incomplete or erroneous testing (IET)
- Inaccurate or incomplete documentation (IID)
- Error in programming language translation (PLT)
- Ambiguous or inconsistent human-computer interface (HCI)
- Miscellaneous (MIS)

Most often, IES, MCC, and EDR are the vital few causes of the majority of errors.
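The categorize-and-count step above is essentially a Pareto analysis. A minimal sketch, using a made-up defect log tagged with the category codes from this slide:

```python
from collections import Counter

# Hypothetical defect log: each entry is the category code assigned
# to one defect during causal analysis (codes from the slide above).
defect_log = [
    "IES", "MCC", "EDR", "IES", "EDL", "IES", "MCC",
    "EDR", "VPS", "IES", "MCC", "IET", "EDR", "IES",
]

counts = Counter(defect_log)
total = len(defect_log)

# Pareto-style report: categories sorted by frequency, with a
# cumulative share, to expose the "vital few" causes.
cumulative = 0.0
for category, n in counts.most_common():
    cumulative += n / total
    print(f"{category}: {n} ({n/total:.0%}, cumulative {cumulative:.0%})")
```

Corrective effort is then focused on the categories at the top of the list, which typically account for most of the defects.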
Statistical SQA – Six Sigma
- The most widely used strategy for statistical SQA
- Three core steps:
  - Define customer requirements, deliverables, and project goals via well-defined methods of customer communication
  - Measure the existing process and its output to determine quality
  - Analyze defect metrics and determine the vital few causes
- If an existing software process is in place but improvement is required, Six Sigma suggests:
  - Improve the process by eliminating the root causes of defects
  - Control the process to ensure that future work does not reintroduce the causes of defects
- If an organization is developing a software process, the core steps are augmented:
  - Design the process to (1) avoid the root causes of defects and (2) meet customer requirements
  - Verify that the process model will, in fact, avoid defects and meet customer requirements
Reviews
- To uncover errors/defects: errors in function, logic, or implementation for any representation of the software
- To verify that the software meets its requirements
- To ensure that the software representation meets predefined standards
- To achieve software development in a uniform manner
- To make projects more manageable
Review Roles
- Presenter (designer/producer)
- Coordinator (not the person who hires/fires)
- Recorder: records the events of the meeting; builds the paper trail
- Reviewers: maintenance oracle, standards bearer, user representative, others
Formal Technical Reviews
- Involve 3 to 5 people (including reviewers)
- Advance preparation (no more than 2 hours per person) is required
- Duration of the review meeting should be less than 2 hours
- Focus of the review is on a discrete work product
- The review leader organizes the review meeting at the producer's request
- Reviewers ask questions that enable the producer to discover his or her own errors (the product is under review, not the producer)
- The producer of the work product walks the reviewers through the product
- The recorder writes down any significant issues raised during the review
- Reviewers decide whether to accept or reject the work product and whether to require additional reviews of the product
Formality and Timing
- Formal review presentations resemble conference presentations
- Informal presentations are less detailed, but equally correct
- Early reviews tend to be informal but may not have enough information
- Later reviews tend to be more formal, but feedback may come too late to avoid rework
Formality and Timing
- After analysis is complete
- After design is complete
- After first compilation
- After first test run
- After all test runs
- Any time you complete an activity that produces a complete work product
Why do peer reviews?
- To improve quality
- Catch 80% of all errors if done properly
- Catch both coding errors and design errors
- Enforce the spirit of any organization's standards
- Training and insurance
Review Guidelines
- Review the product, not the producer
- Set an agenda and maintain it
- Limit the debate
- Enunciate problem areas; do not try to solve every problem noted
- Take written notes
- Allocate resources and a time schedule for FTRs
- Use standards to avoid style disagreements
- Let the coordinator run the meeting and maintain order
- Limit the number of participants and insist upon advance preparation
- Develop a checklist for each work product to be reviewed
- Provide training for all reviewers
- Review earlier reviews
- Keep it short (< 30 minutes)
- Don't schedule two in a row
- Don't review product fragments
Effectiveness of review – Defect Amplification and Removal
- Used to illustrate the generation and detection of errors during design and code generation
- Each development step is modeled as a box: errors from previous steps enter; some are passed through, some are amplified (1:x), and new errors are generated; a detection activity with a given percent efficiency removes a fraction; the remainder are passed to the next step
- [Figure: defect amplification model traced through the process twice, once with no reviews and once with reviews]
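The amplification model above can be sketched as a small calculation. The step structure follows the slide; all of the numeric values (error counts, amplification ratios, detection efficiencies) are illustrative assumptions, not figures from the lecture:

```python
def step(errors_in, amplify_ratio, new_errors, detection_eff):
    """One development step of the defect amplification model:
    incoming errors may be amplified (1:x), new errors are generated,
    and a review/test with the given efficiency removes a fraction."""
    total = errors_in * (1 + amplify_ratio) + new_errors
    detected = total * detection_eff
    return total - detected  # errors passed to the next step

# Illustrative pipeline: design -> code -> unit test, with NO reviews.
errors = step(0, amplify_ratio=0.0, new_errors=10, detection_eff=0.0)
errors = step(errors, amplify_ratio=0.5, new_errors=25, detection_eff=0.0)
errors = step(errors, amplify_ratio=0.0, new_errors=0, detection_eff=0.5)
print(f"Latent errors with no reviews: {errors:.1f}")   # 20.0

# Same pipeline, but with design/code reviews at 60% detection efficiency.
errors = step(0, 0.0, 10, 0.6)
errors = step(errors, 0.5, 25, 0.6)
errors = step(errors, 0.0, 0, 0.5)
print(f"Latent errors with reviews: {errors:.1f}")      # 6.2
```

The point of the model is visible even with made-up numbers: errors removed early are never amplified, so reviews sharply reduce the latent errors that reach later steps.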
Review metrics and their use
- Many metrics can be defined for technical reviews
- The following can be calculated for each review conducted:
  - Preparation effort (Ep)
  - Assessment effort (Ea)
  - Rework effort (Er)
  - Work product size (WPS)
  - Minor errors found (Errminor)
  - Major errors found (Errmajor)
Analyzing review metrics
- Total review effort (Ereview): Ereview = Ep + Ea + Er
- Total number of errors (Errtot): Errtot = Errminor + Errmajor
- Error density represents the errors found per unit of work product reviewed: Error density = Errtot / WPS
- Cost effectiveness of reviews: Effort saved per error = Etesting – Ereviews
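The formulas above translate directly into code. A minimal sketch; the input values describe a hypothetical review of an 18-page design document:

```python
def review_metrics(ep, ea, er, wps, err_minor, err_major):
    """Compute the review metrics defined above.
    wps is the work-product size (e.g., pages or models reviewed)."""
    e_review = ep + ea + er           # total review effort (person-hours)
    err_tot = err_minor + err_major   # total errors found
    density = err_tot / wps           # errors per unit of work product
    return e_review, err_tot, density

# Hypothetical review: 4 h preparation, 6 h assessment, 8 h rework.
e_review, err_tot, density = review_metrics(
    ep=4, ea=6, er=8, wps=18, err_minor=12, err_major=3)
print(f"Effort: {e_review} person-hours, errors: {err_tot}, "
      f"density: {density:.2f} errors/page")
```

Tracking error density across reviews lets an organization estimate how many errors to expect in a new work product of a given size.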
Software reliability
- Defined as the probability of failure-free operation of a computer program in a specified environment for a specified time
- Can be measured directly and estimated using historical and developmental data (unlike many other software quality factors)
- Software reliability problems can usually be traced back to errors in design or implementation
- Reliability metrics are units of measure for system reliability
- System reliability is measured by counting the number of operational failures and relating these to demands made on the system at the time of failure
- A long-term measurement program is required to assess the reliability of critical systems
Measuring S/W reliability
- A measure of software reliability is mean time between failures:
  MTBF = MTTF + MTTR
  where MTTF = mean time to failure and MTTR = mean time to repair
- Availability = MTTF / (MTTF + MTTR) * 100%
- Software availability is the probability that a program is operating according to requirements at a given point in time
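A quick worked example of the two formulas above, with made-up failure data:

```python
def availability(mttf_hours, mttr_hours):
    """MTBF and availability as defined above:
    MTBF = MTTF + MTTR, availability = MTTF / (MTTF + MTTR) * 100%."""
    mtbf = mttf_hours + mttr_hours
    return mtbf, mttf_hours / mtbf * 100

# Hypothetical system: fails on average every 980 hours of operation
# and takes an average of 20 hours to restore.
mtbf, avail = availability(mttf_hours=980, mttr_hours=20)
print(f"MTBF = {mtbf} h, availability = {avail:.1f}%")  # MTBF = 1000 h, 98.0%
```

Note that availability is sensitive to repair time as well as failure rate: halving MTTR raises availability even if the software fails just as often.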
Software reliability – Software safety
- Processes that help reduce the probability that critical failures will occur due to software
- Hazard analyses: identify hazards that could cause failure
- Develop a fault tree: identify all possible causes of the hazard; formally review the remedy for each
- Redundancy
- Require a written software safety plan
- Require independent verification & validation
Example Fault Tree – Thermal
[Figure: fault tree for the hazard "loss of heat". The top event branches into causes including power failure, computer failure, incorrect input, and software failed to throw the switch; the software failure is in turn traced to reversed logic.]
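A fault tree can be evaluated mechanically once its gates and leaf events are written down. A minimal sketch of the thermal example; the event names and gate structure are an illustrative reading of the figure, not a formal model from the lecture:

```python
# A hazard behind an OR gate occurs if ANY input event occurs;
# behind an AND gate only if ALL input events occur.
def or_gate(*events):
    return any(events)

def and_gate(*events):
    return all(events)

# Leaf events observed (or assumed) during hazard analysis.
power_failure = False
computer_failure = False
incorrect_input = False
logic_reversed = True          # root cause found in the software

# The software fails to throw the switch when its logic is reversed.
sw_failed_to_throw_switch = logic_reversed

# Top event: loss of heat (OR over the identified causes).
loss_of_heat = or_gate(power_failure, computer_failure,
                       incorrect_input, sw_failed_to_throw_switch)
print("Hazard (loss of heat):", loss_of_heat)
```

Walking the tree this way makes the remedy review concrete: each leaf that can set the top event to True needs a documented mitigation.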
Software Safety
- Redundancy: replicated at the hardware level; similar vs. dissimilar redundancy
- Verification: assuring that the software specifications are met
- Validation: assuring that the product functions as desired
- Independence
ISO 9000 Quality Standards
- ISO 9000 describes QA elements in generic terms
- Elements include organizational structure, procedures, processes, and resources
- It treats an enterprise as a network of interconnected processes
- To be ISO-compliant, processes should adhere to the standards described
- Ensures quality planning, quality control, quality assurance, and quality improvement
- From a software engineering viewpoint: an international standard that provides broad guidance to software developers on how to implement, maintain, and improve a quality system capable of ensuring high-quality software
- Consists of 20 requirements
- Differs from country to country
ISO 9001 requirements
- Management responsibility
- Quality system
- Contract review
- Design control
- Document and data control
- Purchasing
- Control of customer-supplied product
- Product identification and traceability
- Process control
- Inspection and testing
- Control of inspection, measuring, and test equipment
- Inspection and test status
- Control of non-conforming product
- Corrective and preventive action
- Handling, storage, packaging, preservation, and delivery
- Control of quality records
- Internal quality audits
- Training
- Servicing
- Statistical techniques
Summary
- SQA must be applied at each step
- SQA might be complex
- Software reviews are important SQA activities
- Statistical SQA helps improve product quality and the software process
- Software safety is essential for critical systems
- ISO 9001 standardizes SQA activities