2 Recap
- What is SQA?
- What is quality?
- Different terminologies
- Cost of correcting an error
- Elements of SQA
3 SQA Goals, Attributes and Metrics

Goal: Requirement quality
- Ambiguity: number of ambiguous modifiers (e.g., many, large, human-friendly)
- Completeness: number of TBAs, TBDs
- Understandability: number of sections/subsections
- Volatility: number of changes per requirement; time (by activity) when change is requested
- Traceability: number of requirements not traceable to design/code
- Model clarity: number of UML models; number of descriptive pages per model; number of UML errors

Goal: Design quality
- Architectural integrity: existence of an architectural model
- Component completeness: number of components that trace to the architectural model; complexity of procedural design
- Interface complexity: layout appropriateness
- Patterns: number of patterns used
4 SQA Goals, Attributes and Metrics

Goal: Code quality
- Complexity: cyclomatic complexity
- Maintainability: design factors
- Understandability: percent internal comments; variable naming conventions
- Reusability: percent reused components
- Documentation: readability index

Goal: QC effectiveness
- Resource allocation: staff hour percentage per activity
- Completion rate: actual vs. budgeted completion time
- Review effectiveness: review metrics
- Testing effectiveness: number of errors found and criticality; effort required to correct an error; origin of error
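Several of the requirement-quality metrics above are simple counts that can be automated. The sketch below is hypothetical (the word list and the function name `requirement_metrics` are illustrative, not part of any standard); it counts ambiguous modifiers and TBA/TBD placeholders in a requirements text:

```python
import re

# Illustrative word list of ambiguous modifiers -- an assumption,
# not a standard vocabulary.
AMBIGUOUS = {"many", "large", "fast", "user-friendly", "human-friendly"}

def requirement_metrics(text: str) -> dict:
    # Tokenize into lowercase words, keeping hyphenated terms intact.
    words = re.findall(r"[A-Za-z-]+", text.lower())
    return {
        "ambiguous_modifiers": sum(w in AMBIGUOUS for w in words),
        "tba_tbd": sum(w in {"tba", "tbd"} for w in words),
    }

spec = "The system shall support many users with a fast, user-friendly UI. Timeout: TBD."
print(requirement_metrics(spec))  # {'ambiguous_modifiers': 3, 'tba_tbd': 1}
```

Counts like these are cheap early-warning signals: a rising TBD count or modifier count flags requirements that need another pass before design begins.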
5 SQA Plan
- Management section: describes the place of SQA in the structure of the organization
- Documentation section: describes each work product produced as part of the software process
- Standards, practices, and conventions section: lists all applicable standards/practices applied during the software process and any metrics to be collected as part of the software engineering work
- Reviews and audits section: provides an overview of the approach used in the reviews and audits to be conducted during the project
- Test section: references the test plan and procedure document and defines test record-keeping requirements
- Problem reporting and corrective action section: defines procedures for reporting, tracking, and resolving errors or defects, and identifies organizational responsibilities for these activities
- Other: tools, SQA methods, change control, record keeping, training, and risk management
6 Statistical SQA
- Information about software defects is collected and categorized.
- An attempt is made to trace each defect to its underlying cause.
- The vital few causes that are the major source of all errors are isolated.
- Then the problems that have caused the defects are corrected.
7 Statistical SQA – Categories of Errors
- Incomplete or erroneous specification (IES)
- Misinterpretation of customer communication (MCC)
- Intentional deviation from specification (IDS)
- Violation of programming standards (VPS)
- Error in data representation (EDR)
- Inconsistent module interface (IMI)
- Error in design logic (EDL)
- Incomplete or erroneous testing (IET)
- Inaccurate or incomplete documentation (IID)
- Error in programming language translation (PLT)
- Ambiguous or inconsistent human-computer interface (HCI)
- Miscellaneous (MIS)

Most often, IES, MCC, and EDR are the vital few causes of the majority of errors.
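The "vital few" step above is a Pareto-style tally: sort categories by defect count and take the smallest set that accounts for a majority of all defects. The counts below are invented for illustration, and the 50% threshold is an assumption, not a fixed rule:

```python
from collections import Counter

# Invented defect counts per cause category (illustration only).
defects = Counter({"IES": 48, "MCC": 28, "EDR": 24, "VPS": 12,
                   "EDL": 10, "IET": 9, "IID": 8, "IMI": 7,
                   "HCI": 6, "IDS": 5, "PLT": 4, "MIS": 3})

def vital_few(counts: Counter, threshold: float = 0.5) -> list:
    """Smallest set of top categories covering `threshold` of all defects."""
    total = sum(counts.values())
    cumulative, causes = 0, []
    for cause, n in counts.most_common():  # descending by count
        causes.append(cause)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return causes

print(vital_few(defects))  # ['IES', 'MCC', 'EDR']
```

With these sample numbers the three top categories (IES, MCC, EDR) already cover the majority of defects, matching the observation on the slide; corrective effort would be directed there first.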
11 Statistical SQA – Six Sigma
- The most widely used strategy for statistical SQA.
- Three core steps:
  - Define customer requirements, deliverables, and project goals via well-defined methods of customer communication.
  - Measure the existing process and its output to determine quality.
  - Analyze defect metrics and determine the vital few causes.
- If an existing software process is in place but improvement is required, Six Sigma suggests:
  - Improve the process by eliminating the root causes of defects.
  - Control the process to ensure that future work does not reintroduce the causes of defects.
- If an organization is developing a new software process, the core steps are augmented:
  - Design the process to (1) avoid the root causes of defects and (2) meet customer requirements.
  - Verify that the process model will, in fact, avoid defects and meet customer requirements.
12 Reviews
Reviews are conducted to uncover errors/defects:
- To uncover errors in function, logic, or implementation for any representation of the software
- To verify that the software meets its requirements
- To ensure that the software representation meets predefined standards
- To achieve software development in a uniform manner
- To make projects more manageable
13 Review Roles
- Presenter: the designer/producer of the work product
- Coordinator: runs the review (not a person who hires/fires)
- Recorder: records the events of the meeting and builds a paper trail
- Reviewers: maintenance oracle, standards bearer, user representative, and others
14 Formal Technical Reviews
- Involves 3 to 5 people (including reviewers).
- Advance preparation (no more than 2 hours per person) is required.
- The review meeting should last less than 2 hours.
- The focus of the review is a discrete work product.
- The review leader organizes the review meeting at the producer's request.
- Reviewers ask questions that enable the producer to discover his or her own errors (the product is under review, not the producer).
- The producer of the work product walks the reviewers through the product.
- The recorder writes down any significant issues raised during the review.
- Reviewers decide whether to accept or reject the work product and whether to require additional reviews of the product.
15 Formality and Timing
- Formal review presentations resemble conference presentations.
- Informal presentations are less detailed, but equally correct.
- Early reviews tend to be informal, but may not have enough information available.
- Later reviews tend to be more formal, but feedback may come too late to avoid rework.
16 Formality and Timing
Reviews can be held at several points:
- When analysis is complete
- When design is complete
- After the first compilation
- After the first test run
- After all test runs
- Any time you complete an activity that produces a complete work product
17 Why Do Peer Reviews?
- To improve quality.
- They catch 80% of all errors if done properly.
- They catch both coding errors and design errors.
- They enforce the spirit of any organization standards.
- They provide training and insurance.
18 Review Guidelines
- Review the product, not the producer.
- Set an agenda and maintain it.
- Limit the debate.
- Enunciate problem areas, but do not attempt to solve every problem noted.
- Take written notes.
- Allocate resources and a time schedule for FTRs.
- Use standards to avoid style disagreements.
- Let the coordinator run the meeting and maintain order.
- Limit the number of participants and insist upon advance preparation.
- Develop a checklist for each work product to be reviewed.
- Provide training for all reviewers.
- Review earlier reviews.
- Keep it short (< 30 minutes).
- Don't schedule two in a row.
- Don't review product fragments.
19 Effectiveness of Review – Defect Amplification and Removal
- Used to illustrate the generation and detection of errors during design and code generation.
- [Figure: defect amplification model. Each development step receives errors from previous steps; some are passed through, some are amplified (1:x), and newly generated errors are added. A detection activity with a given percent efficiency for error detection removes defects before the remaining errors are passed to the next step.]
20 Effectiveness of Review – Defect Amplification and Removal
- [Figures: worked examples of the defect amplification model comparing error propagation with no reviews and with reviews.]
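The amplification model described above can be sketched numerically. All percentages and error counts below are invented for illustration (the slides' actual figures are not reproduced here); the point is that even a modest per-step detection efficiency sharply cuts the errors that survive to the end:

```python
# Sketch of the defect amplification model, using invented numbers.
# At each development step, errors arriving from the previous step
# either pass through or are amplified (1:x), new errors are generated,
# and a review/test with some percent detection efficiency removes a
# fraction of the total before the rest flows to the next step.

def step(errors_in: float, pct_amplified: float, amplification: float,
         new_errors: float, detection_efficiency: float) -> float:
    amplified = errors_in * pct_amplified * amplification
    passed_through = errors_in * (1 - pct_amplified)
    total = passed_through + amplified + new_errors
    return total * (1 - detection_efficiency)  # errors passed to next step

# Three steps (e.g., design, code, integration) generating 10, 25, and
# 0 new errors; compare 0% vs. 50% detection efficiency.
for label, efficiency in [("no reviews", 0.0), ("with reviews", 0.5)]:
    errors = 0.0
    for new in (10, 25, 0):
        errors = step(errors, 0.1, 1.5, new, efficiency)
    print(label, round(errors, 2))
```

With these made-up inputs, a 50% review efficiency at every step leaves only about a fifth as many latent errors as the no-review chain, which is the comparison the figures above illustrate.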
22 Review Metrics and Their Use
- Many metrics can be defined for technical reviews.
- The following can be calculated for each review conducted:
  - Preparation effort (Ep)
  - Assessment effort (Ea)
  - Rework effort (Er)
  - Work product size (WPS)
  - Minor errors found (Err_minor)
  - Major errors found (Err_major)
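From these per-review measures, derived values are commonly computed: total review effort, total errors found, and error density per unit of work product size. The sketch below uses an illustrative helper (`review_summary` is a hypothetical name) and made-up numbers:

```python
# Combine the per-review measures into derived review metrics.
# Ep, Ea, Er are efforts (e.g., person-hours); WPS is work product
# size (e.g., pages or UML models); the inputs below are invented.
def review_summary(Ep, Ea, Er, WPS, Err_minor, Err_major):
    effort = Ep + Ea + Er            # total review effort
    errors = Err_minor + Err_major   # total errors found
    return {
        "effort": effort,
        "errors": errors,
        "error_density": errors / WPS,  # errors per unit of size
    }

m = review_summary(Ep=4, Ea=6, Er=3, WPS=18, Err_minor=9, Err_major=3)
print(m["effort"], m["errors"], round(m["error_density"], 2))  # 13 12 0.67
```

Tracked across many reviews, error density gives a baseline for estimating how many errors remain in work products not yet reviewed, and effort-per-error indicates whether reviews are paying for themselves.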
23 Summary
- SQA goals, attributes, and metrics
- SQA plan
- Formal Technical Review (FTR)
- Statistical SQA
- Six Sigma
- Identifying the vital few causes
- Review efficiency