1 Chapter 1 The Software Quality Challenge
2 The uniqueness of software quality assurance. Do you think there is such a thing as bug-free software? Can software developers warrant that their software applications and their documentation are free of bugs or defects? What are the essential differences between software and other industrial products such as automobiles, washing machines, etc.?
3 The essential differences between software and other industrial products can be categorized as follows:
1. Product complexity: the number of operational modes the product permits.
2. Product visibility: software products are invisible.
3. Product development and production process.
4 The phases at which defects can be detected in industrial products versus software products: software products do not benefit from the opportunities for detecting defects at all three phases of the production process.
Industrial products: product development - QA applied to the product prototype; product production planning - QA applied to the production line; manufacturing - QA procedures applied to every product.
Software products: product development - QA applied to the product prototype; product production planning - not required; manufacturing - limited to copying the product and printing copies.
5 Factors affecting the detection of defects in software products versus other industrial products:
Characteristic | Software products | Other industrial products
Complexity | Usually very complex, allowing a very large number of operational options | Degree of complexity much lower
Visibility | Invisible; impossible to detect defects or omissions by sight (e.g., a diskette or CD storing the product) | Visible, allowing effective detection of defects by sight
Nature of the development and production process | Opportunities to detect defects arise in only one phase, namely product development | Opportunities to detect defects arise in all phases of development and production
6 Important conclusion: the great complexity as well as the invisibility of software, among other product characteristics, make the development of SQA methodologies and their successful implementation a highly professional challenge.
7 The environment for which SQA methods are developed: software is developed not only by software development professionals but also by pupils and students, hobbyists, and engineers, economists, managers and workers in other fields. All of these software developers are required to deal with software quality problems ("bugs").
8 SQA environment. The main characteristics of this environment:
1. Contractual conditions
2. Subjection to the customer-supplier relationship
3. Required teamwork
4. Cooperation and coordination with other software teams
5. Interfaces with other software systems
6. The need to continue carrying out a project despite team member changes
7. The need to carry out software maintenance for an extended period
9 Contractual conditions: the activities of software development and maintenance need to cope with a defined list of functional requirements, the project budget, and the project timetable.
10 Subjection to the customer-supplier relationship: the software developer must cooperate continuously with the customer: to consider his requests for changes, to discuss his criticisms, and to get his approval for changes.
11 Required teamwork. Factors motivating the establishment of a project team: timetable requirements; the need for a variety of specializations; the wish to benefit from mutual professional support and review for the enhancement of project quality.
12 Cooperation and coordination with other software teams. Cooperation may be required with: other software development teams in the same organization; hardware development teams in the same organization; software and hardware development teams of other suppliers; customer software and hardware development teams that take part in the project's development.
13 Interfaces with other software systems: input interfaces, output interfaces, and input/output interfaces to a machine's control board, as in medical and laboratory control systems.
14 The need to continue carrying out a project despite team member changes. During the project development period we may face: members leaving the team; employees switching roles; transfers to another city.
15 The need to carry out software maintenance for an extended period. For 5 to 10 years, customers need to continue utilizing their systems, which requires: maintenance; enhancement; changes (modification).
16 Chapter 2 What is Software Quality?
17 What is software? IEEE definition: software is computer programs, procedures, and possibly associated documentation and data pertaining to the operation of a computer system.
18 The IEEE definition is almost identical to the ISO definition (ISO/IEC 9000-3): computer programs ("code"); procedures; documentation; data necessary for operating the software system.
19 To sum up, software quality assurance always includes: code quality, the quality of the documentation, and the quality of the necessary software data.
20 Software errors, faults and failures (see the questions arising from the HRM conference, page 16). An error can be a grammatical error in one or more of the code lines, or a logical error in carrying out one or more of the client's requirements. Not all software errors become software faults; software failures are the faults that actually disrupt our use of the software.
21 The relationship between software faults and software failures: do all software faults end in software failures? Not necessarily - a software fault becomes a software failure only when it is activated. See the example on pages 17-18.
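The point that a fault turns into a failure only when it is activated can be illustrated with a small code sketch (the function and inputs here are invented for illustration, not taken from the book):

```python
def average(values):
    # FAULT: division fails when the list is empty, but the fault stays
    # dormant as long as callers only pass non-empty lists.
    return sum(values) / len(values)

# The fault is present in every run, yet no failure occurs here:
print(average([10, 20, 30]))  # 20.0

# A failure appears only when an input activates the faulty path:
try:
    average([])
except ZeroDivisionError:
    print("failure: the dormant fault was activated by an empty list")
```

The same fault may never surface in years of operation if the activating input never arrives, which is exactly why "no failures observed" does not mean "no faults present".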
22 Classification of the causes of software errors. Software errors are the cause of poor software quality. Software errors can be code errors, documentation errors, or software data errors. The causes of all these errors are human.
23 The nine causes of software errors:
1. Faulty requirements definition
2. Client-developer communication failures
3. Deliberate deviations from software requirements
4. Logical design errors
5. Coding errors
6. Non-compliance with documentation and coding instructions
7. Shortcomings of the testing process
8. Procedure errors
9. Documentation errors
24 Faulty requirements definition:
1. Erroneous definition of requirements
2. Absence of vital requirements
3. Incomplete definition of requirements
4. Inclusion of unnecessary requirements
25 Client-developer communication failures: misunderstandings resulting from defective client-developer communications, such as misunderstanding of the client's requirements changes presented to the developer in written form or orally, misunderstanding of the client's responses to design problems, and others.
26 Deliberate deviations from software requirements: the developer reuses software modules taken from an earlier project without full adaptation; functionality is dropped due to time or budget pressures; unapproved "improvements" are introduced.
27 Logical design errors. These arise in the work of systems architects, systems analysts and software engineers, for example: erroneous algorithms; process definitions that contain sequencing errors; erroneous definitions of boundary conditions; omission of required software system states; omission of definitions concerning reactions to illegal operations.
28 Coding errors: misunderstanding the design documentation; linguistic errors in the programming languages; errors in the application of CASE and other development tools; etc.
29 Non-compliance with documentation and coding instructions causes problems for: team members who need to coordinate their own code with code modules developed by non-complying team members; individuals replacing a non-complying team member, who will find it difficult to fully understand his work; design reviewers examining a non-complying team member's work.
30 Shortcomings of the testing process: incomplete testing plans; failure to document and report detected errors and faults; failure to promptly correct detected software faults as a result of inappropriate indications of the reasons for the fault; incomplete correction of detected errors.
31 Procedure errors and documentation errors: see the example on page 22.
32 Software quality - definition (IEEE): 1. The degree to which a system, component, or process meets specified requirements. 2. The degree to which a system, component, or process meets customer or user needs or expectations.
33 Software quality - Pressman's definition: conformance to explicitly stated functional and performance requirements, explicitly documented standards, and implicit characteristics that are expected of all professionally developed software.
34 Software quality assurance - the IEEE definition. SQA is: 1. A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements. 2. A set of activities designed to evaluate the process by which the products are developed or manufactured. Contrast with quality control.
35 The IEEE SQA definition excludes maintenance as well as timetable and budget issues. The author therefore adopts the following: SQA should not be limited to the development process; it should extend to cover the long years of service subsequent to product delivery, adding the software maintenance functions into the overall conception of SQA. Moreover, SQA actions should not be limited to the technical aspects of the functional requirements; they should also include activities that deal with scheduling, timetables and budgets.
36 SQA - expanded definition: a systematic, planned set of actions necessary to provide adequate confidence that the software development process or the maintenance of a software system product conforms to established functional technical requirements, as well as to the managerial requirements of keeping to the schedule and operating within the budget. This definition corresponds strongly with the concepts at the foundation of ISO 9000-3 (1997) and with the main outlines of the CMM for software. See Table 2.2, page 27.
37 Software quality assurance vs. software quality control. Quality control: a set of activities designed to evaluate the quality of a developed or manufactured product; it takes place before the product is shipped to the client. Quality assurance: its main objective is to minimize the cost of guaranteeing quality through a variety of activities performed throughout the development and maintenance processes, in order to eliminate the causes of errors and to detect and correct them early in the development process.
38 The objectives of SQA activities (see page 29): software development (process-oriented); software maintenance (product-oriented).
39 SQA vs. software engineering. Software engineering (IEEE definition): the application of a systematic, disciplined, quantifiable approach to the development and maintenance of software; that is, the application of engineering to software.
40 Chapter 3 Software Quality Factors
41 Software quality factors. From the previous chapters we have already established that the requirements document is one of the most important elements for achieving software quality. What makes a "good" software quality requirements document?
42 The need for comprehensive software quality requirements. Typical complaints: "Our sales information system seems very good, but it fails frequently - at least twice a day, for 20 minutes or more" (and the software house claims no responsibility); "Our local product contains software and everything is OK, but when we began planning the development of a European version, it turned out that almost all the design and programming would have to be new"; etc. See page 36.
43 There are some characteristics common to all these "buts": all the software projects satisfactorily fulfilled the basic requirements for correct calculations; all the software projects suffered from poor performance in important areas such as maintenance, reliability, software reuse, or training; the cause of the poor performance in these areas was the lack of predefined requirements covering these important aspects of software functionality. The solution: a comprehensive definition of requirements - the software quality factors.
44 Classification of software requirements into software quality factors: McCall's factor model. This model classifies all software requirements into 11 software quality factors, grouped into 3 categories:
Product operation: Correctness, Reliability, Efficiency, Integrity, Usability
Product revision: Maintainability, Flexibility, Testability
Product transition: Portability, Reusability, Interoperability
See McCall's model of software quality factors tree, page 38.
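The three categories and eleven factors of McCall's model can be captured in a simple data structure for reference; this sketch uses Python purely as notation:

```python
# McCall's factor model: 11 quality factors in 3 categories, as listed above.
MCCALL_FACTORS = {
    "Product operation": [
        "Correctness", "Reliability", "Efficiency", "Integrity", "Usability",
    ],
    "Product revision": [
        "Maintainability", "Flexibility", "Testability",
    ],
    "Product transition": [
        "Portability", "Reusability", "Interoperability",
    ],
}

# Sanity check: 3 categories, 11 factors in total.
assert len(MCCALL_FACTORS) == 3
assert sum(len(f) for f in MCCALL_FACTORS.values()) == 11
```

A mapping like this is also a convenient starting point for a requirements checklist: each factor becomes a heading under which the project's specific quality requirements are collected.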
45 Product operation software quality factors. Correctness: output specifications are usually multidimensional; some common dimensions include: the output mission; the required accuracy; the completeness of the output; the up-to-dateness of the information; the availability of the information (the reaction time); the standards for coding and documenting the software system. See the example on page 39.
46 Product operation software quality factors. Reliability: deals with failures to provide service. Reliability requirements determine the maximum allowed failure rate of the software system, and can refer to the entire system or to one or more of its separate functions. See the example on page 39 (heart-monitoring unit).
47 Product operation software quality factors. Efficiency: deals with the hardware resources needed to perform all the functions of the software system in conformance with all other requirements; see the examples on page 40 (CPU speed, etc.). Integrity: deals with software system security, that is, with requirements to prevent access by unauthorized persons; see the examples on page 40.
48 Product operation software quality factors. Usability: deals with the scope of staff resources needed to train a new employee and to operate the software system. See the examples on page 41.
49 Product revision software quality factors. Maintainability: maintainability requirements determine the efforts that will be needed by users and maintenance personnel to identify the reasons for software failures, to correct the failures, and to verify the success of the corrections. Typical maintainability requirements, for example: 1. the size of a software module will not exceed 30 statements; 2. the programming will adhere to the company's coding standards and guidelines.
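A rule like sample requirement 1 can be checked automatically. Below is a hypothetical sketch for Python source using the standard ast module; the 30-statement limit follows the slide's example, while the function names are invented for illustration:

```python
import ast

MAX_STATEMENTS = 30  # the limit given in the slide's sample requirement

def count_statements(source: str) -> int:
    """Count statement nodes (including nested ones) in Python source."""
    tree = ast.parse(source)
    return sum(isinstance(node, ast.stmt) for node in ast.walk(tree))

def module_is_maintainable(source: str) -> bool:
    """Check the sample rule: a module must not exceed 30 statements."""
    return count_statements(source) <= MAX_STATEMENTS

small_module = "x = 1\ny = 2\nprint(x + y)\n"
print(count_statements(small_module))        # 3
print(module_is_maintainable(small_module))  # True
```

A checker like this could run as part of the coding-standards step mentioned in sample requirement 2, flagging over-long modules before review.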
50 Product revision software quality factors. Flexibility: the capabilities and efforts required to support adaptive maintenance activities are covered by the flexibility requirements. These requirements also support perfective maintenance activities, such as changes and additions to the software in order to improve its service and to adapt it to changes in the firm's technical or commercial environment. Example: page 42.
51 Product revision software quality factors. Testability: deals with the testing of an information system as well as with its operation: providing predefined intermediate results and log files; automatic diagnostics performed by the software system before it starts, to find out whether all its components are in working order; obtaining a report about detected faults. Examples: pages 42-43.
52 Product transition software quality factors. Portability: tends to the adaptation of a software system to other environments consisting of different hardware or different operating systems. Example: software designed to work under a Windows 2000 environment is required to allow low-cost transfer to Linux.
53 Product transition software quality factors. Reusability: deals with the use of software modules originally designed for one project in a new software project currently being developed. Reuse of software is expected to save development resources, shorten the project period, and provide higher-quality modules. These higher-quality benefits are based on the assumption that most of the software faults have already been detected by the SQA activities performed on the original software.
54 Product transition software quality factors. Interoperability: focuses on creating interfaces with other software systems or with other equipment firmware. Example: the firmware of medical laboratory equipment is required to process its results according to a standard data structure that can then serve as input for a number of standard laboratory information systems.
55 Alternative models of software quality factors. Two other models: Evans and Marciniak, 1987 (12 factors); Deutsch and Willis, 1988 (15 factors). Five new factors were suggested: verifiability, expandability, safety, manageability, survivability.
56 Alternative models of software quality factors. The five new factors:
Verifiability: defines design and programming features that enable efficient verification of the design and programming (modularity, simplicity, adherence to documentation and programming guidelines).
Expandability: refers to the future efforts that will be needed to serve larger populations, improve service, or add new applications in order to improve usability.
Safety: meant to eliminate conditions hazardous to equipment as a result of errors in process control software.
Manageability: refers to the administrative tools that support software modification during the software development and maintenance periods.
Survivability: refers to the continuity of service; survivability requirements define the minimum time allowed between failures of the system and the maximum time permitted for the recovery of service.
57 Who is interested in the definition of quality requirements? The client is not the only party interested in defining the requirements that assure the quality of the software product. The developer is often interested as well, especially in: reusability, verifiability, and portability. Hence any software project may be carried out according to two requirements documents: the client's requirements document and the developer's additional requirements document.
58 Chapter 4 The Components of the SQA System - Overview
59 The SQA system - an SQA architecture. SQA system components can be classified into six classes:
1. Pre-project components
2. Components of project life cycle activities assessment
3. Components of infrastructure error prevention and improvement
4. Components of software quality management
5. Components of standardization, certification, and SQA system assessment
6. Organizing for SQA - the human components
60 Pre-project components assure that: 1. the project commitments have been adequately defined, considering the resources required, the schedule and the budget; 2. the development and quality plans have been correctly determined.
61 Components of project life cycle activities assessment. The project life cycle is composed of two stages:
1. The development life cycle stage: detects design and programming errors. Its components are divided into: reviews; expert opinions; software testing; assurance of the quality of subcontractors' work and of customer-supplied parts.
2. The operation-maintenance stage: includes specialized maintenance components as well as development life cycle components, which are applied mainly to functionality-improving maintenance tasks.
62 Components of infrastructure error prevention and improvement. The main objective of these components, which are applied throughout the entire organization, is to eliminate or at least reduce the rate of errors, based on the organization's accumulated SQA experience.
63 Components of software quality management. This class of components is geared toward several goals, the major ones being the control of development and maintenance activities and the introduction of early managerial support actions that mainly prevent or minimize schedule and budget failures and their outcomes.
64 Components of standardization, certification, and SQA system assessment. The main objectives of this class are: 1. utilization of international professional knowledge; 2. improvement of coordination of the organizational quality system with other organizations; 3. assessment of the achievements of quality systems according to a common scale. The various standards are classified into two groups: quality management standards and project process standards.
65 Organizing for SQA - the human components. The SQA organizational base includes: managers, testing personnel, the SQA unit, and practitioners interested in software quality. Its main objectives are: to initiate and support the implementation of SQA components; to detect deviations from SQA procedures and methodology; and to suggest improvements.
66 Part II Pre-project SQ components Chapter 5 Contract Review
67 Contract review is the software quality element that reduces the probability of undesirable situations like those in the CFV project. Contract review is a requirement of the ISO 9001 and ISO 9000-3 guidelines.
68 The contract review process and its stages. Several situations can lead a software company to sign a contract with a customer, such as: participation in a tender; submission of a proposal according to the customer's RFP; receipt of an order from a company's customer; receipt of an internal request or order from another department in the organization.
69 The contract review process and its stages. Contract review is the SQA component devised to guide the review of proposal drafts and contract documents and, if applicable, to provide oversight (supervision) of the contracts drawn up with potential project partners and subcontractors.
70 The contract review process itself is conducted in two stages. Stage 1 - review of the proposal draft prior to submission to the potential customer ("proposal draft review"): reviews the final proposal draft and the proposal's foundations: the customer's requirement documents; the customer's additional details and explanations of the requirements; the cost and resource estimates; existing contracts or contract drafts of the supplier with partners and subcontractors.
71 The contract review process itself is conducted in two stages. Stage 2 - review of the contract draft prior to signing ("contract draft review"): reviews the contract draft on the basis of the proposal and of the understandings (including changes) reached during the contract negotiation sessions. The individuals who perform the review thoroughly examine the draft; referring to a comprehensive range of review subjects (a checklist) is very helpful for assuring full coverage of the relevant subjects. See appendices 5A and 5B.
72 Contract review objectives. Proposal draft review objectives (assure the following): customer requirements have been clarified and documented; alternative approaches for carrying out the project have been examined; the formal aspects of the relationship between the customer and the software firm have been specified; development risks have been identified; adequate estimates of project resources and timetable have been prepared; the customer's capacity to fulfill his commitments has been examined; the conditions for partners' and subcontractors' participation have been defined; proprietary rights have been defined and protected.
73 Contract review objectives. Contract draft review objectives (assure the following): no unclarified issues remain in the contract draft; all the understandings reached between the customer and the firm are fully and correctly documented; no changes, additions, or omissions that have not been discussed and agreed upon have been introduced into the contract draft.
74 Factors affecting the extent of a contract review: project magnitude, usually measured in man-month resources; project technical complexity; the degree of staff acquaintance with and experience in the project area; project organizational complexity - the greater the number of organizations (partners, subcontractors, and customers) taking part in the project, the greater the contract review efforts required.
75 Who performs a contract review: The leader or another member of the proposal team The members of the proposal team An outside professional or a company staff member who is not a member of the proposal team. A team of outside experts.
76 Implementation of a contract review of a major proposal. The characteristics of a major proposal: a very large-scale project; very high technical complexity; a new professional area for the company; high organizational complexity. The difficulties of carrying out contract reviews for major proposals: time pressures; proper contract review requires substantial professional work; the potential contract review team members are very busy.
77 Implementation of a contract review of a major proposal. Recommended avenues (approaches) for implementing major contract reviews: the contract review should be scheduled; a team should carry out the contract review; a contract review team leader should be appointed. The activities of the team leader include: recruitment of the team members; distribution of the review tasks among them; coordination between the members; coordination between the review team and the proposal team; follow-up of activities, especially compliance with the schedule; summarization of the findings and their delivery to the proposal team.
78 Contract review for internal projects (see Table 5.1, page 86). The main point here is the internal relationship: loose relationships are usually characterized by insufficient examination of the project's requirements, resources, and development risks. To avoid these problems we have to apply contract review to internal projects just as to external ones, by implementing procedures that define: an adequate proposal for the internal project; a proper contract review process; an adequate agreement between the internal customer and the internal supplier.
79 Chapter 6 Development and quality plans. Development plans and quality plans are major elements needed for project compliance with the ISO 9000.3 and ISO/IEC 2001 standards and with IEEE 730. They are also important elements in the Capability Maturity Model (CMM) for assessing the maturity of a software development organization. A project needs development and quality plans that: are based on proposal materials that have been re-examined and thoroughly updated; are more comprehensive than the approved proposal, especially with respect to schedules, resource estimates, and development risk evaluations; include additional subjects, absent from the approved proposal; and others.
80 Development plan and quality plan objectives: 1. scheduling the development activities that will lead to successful and timely completion of the project, and estimating the required manpower resources and budget; 2. recruiting team members and allocating development resources; 3. resolving development risks; 4. implementing the required SQA activities; 5. providing management with the data needed for project control.
81 Elements of the development plan:
1. Project products
2. Project interfaces
3. Project methodology and development tools
4. Software development standards and procedures
5. The mapping of the development process (project management Gantt chart)
6. Project milestones (documents, code, reports)
7. Project staff organization (organizational structure, professional requirements, number of team members, names of team leaders)
8. Development facilities (software and hardware tools, space, the period required for each use)
9. Development risks (see next slide)
10. Control methods
11. Project cost estimation
82 A development risk is a state or property of a development task or environment which, if ignored, will increase the likelihood of project failure. Examples: 1. technological gaps; 2. staff shortages; 3. interdependence of organizational elements - the likelihood that suppliers of specialized hardware or software subcontractors, for example, will not fulfill their obligations on schedule.
83 Elements of the quality plan:
1. Quality goals (quantitative measures; see the example on page 102)
2. Planned review activities - for each review activity: its scope; its type; its schedule (priorities); the specific procedure to be applied; who is responsible for carrying it out
3. Planned software tests (a complete list of planned software tests should be provided) - for each test: the unit, integration, or complete system to be tested; the type of testing activities to be carried out; the planned test schedule; the specific procedure; who is responsible
84 Elements of the quality plan (continued): 4. planned acceptance tests for externally developed software; 5. configuration management - configuration management tools and procedures, including the change-control procedures meant to be applied throughout the project.
85 Development and quality plans for small projects and internal projects: see pages 105-106.
86 Chapter 7 Integrating quality activities in the project life cycle. Classic and other software development methodologies: SDLC (requirements definition, analysis, design, coding, system tests, installation and conversion, operation and maintenance).
87 Integrating Quality activities in the project life cycle Prototyping
88 Integrating quality activities in the project life cycle: the spiral model (see page 128). It is an improved methodology for overseeing large and complex projects, combining the SDLC and prototyping. At each iteration of the spiral the following activities are performed: planning; risk analysis and resolution; engineering activities; customer evaluation (comments, change requests, etc.).
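The spiral's repetition of the same four activities in every round can be sketched as a simple loop; this is only an illustrative schematic of the iteration structure, not an implementation of the model:

```python
# Per-iteration activities of the spiral model, repeated each round.
ACTIVITIES = [
    "planning",
    "risk analysis and resolution",
    "engineering",
    "customer evaluation",
]

def spiral(iterations):
    """Return the ordered activity log of a spiral with the given rounds."""
    log = []
    for i in range(1, iterations + 1):
        for activity in ACTIVITIES:
            log.append(f"iteration {i}: {activity}")
    return log

for line in spiral(2):
    print(line)
```

The key property the loop captures is that risk analysis and customer evaluation recur in every round, rather than happening once as in the plain SDLC.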
89 Integrating quality activities in the project life cycle: the object-oriented model. It allows easy integration of existing software modules (objects) into newly developed software systems. A software component library serves this purpose by supplying software components for reuse (see page 130). Advantages of library reuse: economy; improved quality; shorter development time. The advantages of the object-oriented approach grow as the store of reusable software components grows (for example, Microsoft and Unix).
90 Factors affecting the intensity of quality assurance activities in development projects. Quality assurance activities are integrated into a development plan that implements one or more software development models. Quality assurance planners for a project are required to determine: the list of QA activities needed for the project; and, for each QA activity, its timing, who performs it and the resources required (team members or an external body), and the resources required for the removal of defects and the introduction of changes.
91 Factors affecting the intensity of quality assurance activities in development projects. Project factors: the magnitude of the project; its technical complexity and difficulty; the extent of reusable software components; the severity of the failure outcomes if the project fails. Team factors: the professional qualifications of the team members; the team's acquaintance with the project and its experience in the area; the availability of staff members who can professionally support the team; familiarity with the team members, in other words the percentage of new staff members in the team. See the example on page 132.
92 Verification, validation and qualification. Three aspects of quality assurance of the software product are examined under the headings of verification, validation, and qualification (IEEE Std 610.12-1990). Verification: the process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase. It examines the consistency of the products being developed with the products developed in previous phases, so the examiner can assure that the development phases have been completed correctly.
93 Verification, validation and qualification. Validation: the process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements. It represents the customer's interest by examining the extent of compliance with his original requirements. Comprehensive validation reviews tend to improve customer satisfaction with the system.
94 Verification, validation and qualification. Qualification: the process used to determine whether a system or component is suitable for operational use. It focuses on operational aspects, where maintenance is the main issue. Planners are required to determine which of these aspects should be examined in each quality assurance activity.
95 A model for SQA defect removal effectiveness and cost. The model deals with two quantitative aspects: 1. the effectiveness of removing project defects; 2. the cost of removal. See page 135.
96 Defect removal effectiveness. It is assumed that any SQA activity filters (screens) a certain percentage of the existing defects. In most cases the percentage of removed defects is somewhat lower than the percentage of detected defects, as some corrections are ineffective or inadequate. The next SQA activity then faces both the remaining defects and the new defects created in the current development phase. It is assumed that the filtering effectiveness of each QA activity on accumulated defects is not less than 40%. Table 7.4 (page 136) lists the average filtering effectiveness by QA activity.
97 Cost of defect removal. The cost of defect removal varies by development phase, and costs rise substantially as the development process proceeds. For example, removal of a design defect detected in the design phase may require an investment of 2.5 working days; removal of the same defect during the acceptance tests may require 40 working days. Defect-removal costs based on several surveys are shown in Table 7.5, page 137.
98
98 The Model The model is based on the following assumptions: The development process is linear and sequential, following the waterfall model. A number of new defects are introduced in each development phase (see Table 7.3, page 135). Review and test SQA activities serve as filters, removing a percentage of the entering defects and letting the rest pass to the next phase. If we have 30 defects and the filtering effectiveness is 60%, then 18 defects will be removed and 12 will pass to the next phase. At each phase the incoming defects are the sum of the defects not yet removed and the new defects introduced (created) in the current development phase. The cost is calculated for each QA activity by multiplying the number of defects removed by the relative cost of removing a defect (Table 7.5). The defects remaining at the end of the process pass to the customer, who will detect them.
99
99 The model uses the following notation: POD: phase-originated defects (Table 7.3) PD: passed defects %FE: percentage filtering effectiveness (Table 7.4) RD: removed defects CDR: cost of defect removal (Table 7.5) TRC: total removal cost.
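As an illustrative sketch (not from the book), the filtering model above can be expressed in a few lines of Python. The phase names, defect counts, filtering percentages, and relative removal costs below are invented placeholders, not the values of Tables 7.3-7.5:

```python
# Sketch of the defect-removal filtering model: each phase introduces new
# defects (POD); the phase's QA filter removes a percentage (%FE) of the
# incoming defects (PD from the previous phase plus POD); the rest pass on.
phases = [
    # (phase, POD: new defects, %FE: filtering effectiveness, CDR: relative removal cost)
    ("requirements", 15, 0.50, 1.0),
    ("design",       30, 0.60, 2.5),
    ("coding",       35, 0.70, 6.5),
    ("system tests", 10, 0.80, 16.0),
]

passed = 0.0        # PD: defects escaping the previous phase
total_cost = 0.0    # TRC: total removal cost
for name, pod, fe, cdr in phases:
    incoming = passed + pod       # defects faced by this phase's QA filter
    removed = incoming * fe       # RD: defects the filter screens out
    passed = incoming - removed   # remainder flows to the next phase
    total_cost += removed * cdr   # cost = defects removed x relative cost here
    print(f"{name:12s} incoming={incoming:6.1f} removed={removed:6.1f} passed={passed:6.1f}")

print(f"defects reaching the customer: {passed:.1f}")
print(f"total removal cost (relative units): {total_cost:.2f}")
```

With these invented numbers, 5 of the 90 introduced defects reach the customer; the same loop reproduces the book's computation once the real table values are substituted.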
100
100 Chapter 8 Reviews IEEE definition of a review process: a process or meeting during which a work product or set of products is presented to project personnel, managers, users, customers, or other interested parties for comment or approval.
101
101 Methodologies for reviewing documents Reviews acquire special importance in the SQA process because they provide early direction and prevent the passing of design and analysis errors "downstream" to stages where error detection and correction are much more complicated and costly. The review methodologies: Formal design reviews Peer reviews (inspections and walkthroughs) Expert opinions. Standards for software reviews are the subject of IEEE Std 1028 (IEEE, 1997).
102
102 Reviews Objectives (Direct Objectives) To detect analysis and design errors as well as subjects where corrections, changes, and completions are required with respect to the original specifications and approved changes. To identify new risks likely to affect completion of the project. To locate deviations from templates, style procedures, and conventions; correction of these deviations is expected to contribute to the improved communication and coordination that result from greater uniformity of methods and documentation style. To approve the analysis or design product; approval allows the team to continue to the next development phase.
103
103 Reviews Objectives (Indirect Objectives) To provide an informal meeting place for the exchange of professional knowledge about development methods, tools, and techniques. To record analysis and design errors that will serve as a basis for future corrective actions. The corrective actions are expected to improve development methods by increasing effectiveness and quality, among other product features.
104
104 Formal design reviews (DRs) Formal design reviews are also called design reviews (DRs) or formal technical reviews (FTRs). Without their approval, the development team cannot continue to the next phase of the software development project. A formal design review can be conducted at any development milestone requiring completion of an analysis or design document, whether that document is a requirement specification or an installation plan.
105
105 A list of common formal design reviews: DPR - development plan review SRSR - software requirement specification review PDR - preliminary design review DDR - detailed design review DBDR - database design review TPR - test plan review STPR - software test procedure review VDR - version description review OMR - operator manual review SMR - support manual review TRR - test readiness review PRR - product release review IPR - installation plan review
106
106 The Formal Design Review will focus on : The participants The prior preparations The DR session The recommended post-DR activities
107
107 The participants in a DR All DRs are conducted by a review leader and a review team. The review leader's characteristics: Knowledge and experience in development of projects of the type reviewed. Seniority at a level similar to, if not higher than, that of the project leader. A good relationship with the project leader and his team. A position external to the project team. Small development departments and software houses typically have difficulty finding an appropriate candidate to lead the review team; one possible solution is the appointment of an external consultant.
108
108 The Review Team It is desirable for non-project staff to make up the majority of the review team. A review team of three to five members is considered efficient.
109
109 Preparation for a DR Preparations for a DR session are to be completed by all three main participants in the review: the review leader, the review team, and the development team. Each is required to focus on distinct aspects of the process. Review leader preparations (main tasks): To appoint the team members To schedule the review sessions To distribute the design document among team members (hard copy, electronic copy, etc.)
110
110 Preparation for a DR Review team preparations (main tasks): Review the design document and list comments prior to the review session; team members may use review checklists (see Chapter 15). Development team preparations (main tasks): Prepare a short presentation of the design document. The presentation should focus on the main professional issues awaiting approval rather than wasting time on a description of the project in general.
111
111 The DR session A typical DR session agenda: 1. A short presentation of the design document. 2. Comments made by members of the review team. 3. Verification and validation, in which each of the comments is discussed to determine the required actions (corrections, changes, and additions) that the project team has to perform. 4. Decisions about the design product (document), which determine the project's progress. These decisions take three forms:
112
112 Decision forms: Full approval: enables immediate continuation to the next phase. It may be accompanied by demands for some minor corrections to be performed by the project team. Partial approval: approval of immediate continuation to the next phase for some parts of the project, with major action items demanded for the remainder of the project. Denial of approval: demands a repeat of the DR.
113
113 The DR report (see Appendix 8A) One of the review leader's responsibilities is to issue a DR report immediately after the review session, so that the development team can perform the corrections early and minimize the attendant delays to the project schedule. The report's major sections contain: A summary of the review discussion The decision about continuation of the project A full list of the required actions (corrections, changes, additions) and the anticipated completion dates The name(s) of the review team member(s) assigned to follow up performance of the corrections.
114
114 The follow-up process The review leader himself is required to determine whether each action item has been satisfactorily accomplished as a condition for allowing the project to continue to the next phase. Follow-up should be documented to enable clarification.
115
115 Pressman (2000, Chapter 8) Pressman's 13 "golden guidelines" for a successful design review: see page 157.
116
116 Peer Reviews: two review methods (inspection and walkthrough) The major difference between formal design reviews and peer review methods is rooted in participants and authority. In peer reviews, as expected, the participants are the project leader's equals: members of his department and of other units. The other difference lies in the degree of authority and the objective of each review method. The main objective of peer reviews lies in detecting errors and deviations from standards. The appearance of CASE tools has reduced the value of manual reviews such as inspections and walkthroughs, yet research finds that peer reviews remain a highly efficient as well as effective method.
117
117 Inspection & Walkthrough What differentiates a walkthrough from an inspection is the level of formality; inspection is the more formal of the two. Inspection emphasizes the objective of corrective actions, while a walkthrough's findings are limited to comments on the document reviewed.
118
118 Inspection & Walkthrough Inspection is usually based on a comprehensive infrastructure, including: Development of inspection checklists for each type of design document as well as for each coding language and tool, which are periodically updated. Development of typical defect-type frequency tables, based on past findings, to direct inspectors to potential "defect concentration areas". Periodic analysis of the effectiveness of past inspections to improve the inspection methodology. Introduction of scheduled inspections into the project activity plan and allocation of the required resources, including resources for correction of detected defects.
119
119 Participants of peer reviews A review leader (main tasks and qualifications: page 161) The author, invariably a participant in each type of peer review. Specialized professionals: For inspections: a designer, a coder or implementer, a tester. For walkthroughs: a standards enforcer, a maintenance expert, a user representative.
120
120 Team assignments The presenter The Scribe
121
121 Preparations for a peer review session Leader preparation Team’s preparation
122
122 Session Documentation Inspection session findings report, prepared by the scribe. Inspection session summary report, prepared by the leader. See Appendices 8B and 8C.
123
123 Post-peer-review activities Post-inspection activities are conducted to attest to: The prompt, effective correction and reworking of all errors by the designer/author and his team, as verified by the inspection leader in the course of the assigned follow-up activities. Transmission of the inspection reports to the internal Corrective Action Board (CAB) for analysis. See Fig 8.2, a comparison of the peer review methods (page 166).
124
124 The efficiency of peer reviews Some of the more common metrics applied to estimate the efficiency of peer reviews: Peer review detection efficiency (average hours worked per defect detected) Peer review defect detection density (average number of defects detected per page of the design document) Internal peer review effectiveness (defects detected by peer reviews as a percentage of total defects detected by the developer).
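As an illustrative sketch, the three metrics above can be computed from review records; the figures below are invented example data, not survey results from the book:

```python
# Illustrative review records (invented numbers).
hours_worked = 40.0                      # total review effort, in hours
defects_detected_in_review = 16          # defects found by the peer review
pages_reviewed = 80                      # size of the reviewed design document
total_defects_detected_by_developer = 64 # all defects found before release

# Peer review detection efficiency: average hours worked per defect detected.
detection_efficiency = hours_worked / defects_detected_in_review

# Peer review defect detection density: average defects detected per page.
detection_density = defects_detected_in_review / pages_reviewed

# Internal peer review effectiveness: share of the developer's total
# detected defects that the peer review caught.
internal_effectiveness = defects_detected_in_review / total_defects_detected_by_developer

print(f"efficiency: {detection_efficiency:.1f} h/defect, "
      f"density: {detection_density:.2f} defects/page, "
      f"effectiveness: {internal_effectiveness:.0%}")
```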
125
125 Comparisons See tables page 167-169
126
126 Expert opinions (external) External expert opinions are useful in the following situations: Insufficient in-house professionals A temporary lack of in-house professionals Disagreements among the in-house staff In small organizations
127
127 Chapter 9 Software testing - Strategies Testing definition: Testing is the process of executing a program with the intention of finding errors. IEEE definition: 1. The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. 2. The process of analyzing a software item to detect the differences between existing and required conditions (that is, bugs) and to evaluate the features of the software item.
128
128 Software testing - Definition Software testing is a formal process (guided by a software test plan) carried out by an independent, specialized testing team, in which a software unit, several integrated software units, or an entire software package is examined by running the programs on a computer. All the associated tests are performed according to approved test procedures on approved test cases.
129
129 Software testing objectives Direct objectives: To identify and reveal as many errors as possible in the tested SW. To bring the tested SW, after correction of the identified errors and retesting, to an acceptable level of quality. To perform the required tests efficiently, within budgetary and scheduling limitations. Indirect objectives To compile a record of SW errors for use in error prevention ( by corrective & preventive actions )
130
130 Software testing strategies To test the software in its entirety, once the completed package is available, otherwise known as "big bang testing". Or to test the software piecemeal, in modules, as they are completed (unit tests); then to test groups of tested modules integrated with newly completed modules (integration tests). This process continues until the entire package is tested as a whole (system test). This testing strategy is usually termed "incremental testing".
131
131 Incremental testing is also performed according to two basic strategies: Bottom-up (4 stages; see the figure on page 183) Top-down (6 stages) The incremental paths: Horizontal sequence (breadth first) Vertical sequence (depth first)
132
132 Stubs & Drivers for incremental testing Stubs and drivers are software replacement simulators required for modules not available when performing a unit test or an integration test. A stub (often termed a "dummy module") replaces an unavailable lower-level module subordinate to the module tested. It is required for top-down testing of incomplete systems. See the example in fig 9.2.
133
133 Stubs & Drivers for incremental testing A driver is a substitute for the upper-level module that activates the module tested. The driver passes the test data on to the tested module and accepts the results calculated by it. It is required in bottom-up testing until the upper-level modules are developed. See the example in fig 9.2.
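A minimal sketch of both simulators, using a hypothetical module under test `compute_invoice` whose lower-level dependency `get_discount` has not yet been written (all names here are invented for illustration, not from fig 9.2):

```python
def get_discount_stub(customer_id):
    """Stub ("dummy module"): stands in for the unavailable lower-level
    module by returning a fixed, predictable value (top-down testing)."""
    return 0.10  # hypothetical canned discount rate

def compute_invoice(amount, customer_id, get_discount=get_discount_stub):
    """The module under test; its lower-level dependency is injected so a
    stub can replace the real module until it exists."""
    return amount * (1 - get_discount(customer_id))

def driver():
    """Driver: stands in for the missing upper-level module by feeding
    test data to the tested module and collecting its results
    (bottom-up testing)."""
    cases = [(100.0, "c1"), (250.0, "c2")]
    return [compute_invoice(amount, cid) for amount, cid in cases]

results = driver()
print(results)
```

Once the real `get_discount` module is completed, it replaces the stub; once the real upper-level module exists, it replaces the driver.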
134
134 Bottom-up vs. top-down strategies Main advantage of bottom-up: the relative ease of its performance. Main disadvantage: the lateness at which the program as a whole can be observed (at the stage following testing of the last module). Main advantage of top-down: the possibility it offers to demonstrate the entire program's functions shortly after activation of the upper-level modules has been completed; this characteristic allows for early identification of analysis and design errors related to algorithms, functional requirements, and the like. Main disadvantages: the relative difficulty of preparing the required stubs, which often require very complicated programming, and the relative difficulty of analyzing the results of the tests.
135
135 Big bang vs. incremental testing The main disadvantages of big bang: Identification of errors becomes difficult. Corrections must all be made at the same time. Error correction makes estimation of the required testing resources and testing schedule a rather fuzzy endeavor. Incremental testing advantages: Usually performed on relatively small software modules, as unit or integration tests (more errors detected). Identification and correction of errors is much simpler and requires fewer resources because it is performed on a limited volume of software.
136
136 Software test classification Classification according to testing concept: two testing classes have been developed. Black box (functionality) testing: identifies bugs only according to software malfunctioning as revealed in erroneous output. In cases where the outputs are found to be correct, black box testing disregards the internal path of calculations and processing performed. White box (structural) testing: examines internal calculation paths in order to identify bugs. The term "white" is meant to emphasize the contrast between the methods.
137
137 Software test classification. Classification according to requirements: See table 9.1 Page 188
138
138 White Box Testing The white box testing concept requires verification of every program statement and comment. As shown in Table 9.2, white box testing enables performance of: Data processing and calculation correctness tests Software qualification tests Maintainability tests Reusability tests Every computational operation in the sequence of operations created by each test case ("path") must be examined. This type of verification allows us to decide whether the processing operations and their sequences were programmed correctly for the path in question, but not for other paths.
139
139 Data processing and calculation correctness tests Path coverage and line coverage. Path coverage: the total number of possible paths; for example, 10 consecutive if-then-else statements yield 2^10 = 1,024 paths. Line coverage: for full line coverage, every line of code must be executed at least once during the process of testing. The line coverage metric for completeness of a line-testing plan is defined as the percentage of lines indeed executed during the tests. A flow chart and a program flow graph are used; see the example on page 191.
140
140 McCabe’s cyclomatic complexity metrics Measures the complexity of a program or module at the same time as it determines the maximum number of independent paths needed to achieve full line Coverage of the program. The measure is based on graph theory using program flow graph. An independent path is defined with reference to the succession of independent paths accumulated. Independent path is any path on the program flow graph that includes at least one edge that is not included in any former independent paths. See table 9.5 page 194
141
141 McCabe’s cyclomatic complexity metrics The cyclomatic complexity metric V(G) V(G) = R = E – N + 2 = P + 1 Where R = the number of regions E = number of edges N = number of nodes P = number of decisions ( nodes having more than one leaving edge). See example page 195 An empirical studies show that if V(G) < 5 considered simple If it is 10 or less considered not too difficult If 20 or more it is high If it is exceed 50 the SW for practical purposes becomes untestable.
142
142 SW qualification & reusability testing SW qualification testing SW reusability testing Advantages and disadvantages of white box testing: see page 197.
143
Black box testing Allows performing output correctness tests and most other classes of tests. Output correctness tests consume the greater part of testing resources and apply the concept of test cases. Equivalence class partitioning: an equivalence class (EC) is a set of input variable values that produce the same output results or that are processed identically. Its boundaries are defined by a single numeric or alphabetic value, a group of numeric or alphabetic values, a range of values, etc. Partitioning aims to increase test efficiency and improve coverage of potential error conditions. 143
144
Black box testing Equivalence class (EC) An EC that contains only valid states is defined as a "valid EC"; an EC that contains only invalid states is defined as an "invalid EC". Valid and invalid ECs should be defined for each variable in the software. Each valid EC and each invalid EC is included in at least one test case. The total number of test cases required to cover the valid ECs is equal to, and in most cases significantly below, the number of valid ECs (why?). For invalid ECs, we must assign one test case to each "new" invalid EC (why?). Equivalence classes save testing resources as they eliminate duplication of the test cases defined for each EC. See the example on page 199. 144
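A minimal sketch of EC partitioning for a single hypothetical input: an age field specified as valid in the range 18-65 (the field, ranges, and function name are invented for illustration, not the book's page-199 example):

```python
# One representative value per equivalence class.
valid_ecs = {
    "18-65": 40,            # the single valid EC for this variable
}
invalid_ecs = {
    "below 18": 12,
    "above 65": 80,
    "non-numeric": "abc",
}

def accept_age(age):
    """Hypothetical unit under test: accepts only integer ages 18-65."""
    return isinstance(age, int) and 18 <= age <= 65

# One test case per EC: the valid representative must be accepted,
# each invalid representative rejected.
for name, value in valid_ecs.items():
    assert accept_age(value), name
for name, value in invalid_ecs.items():
    assert not accept_age(value), name

print(f"{len(valid_ecs) + len(invalid_ecs)} test cases cover all ECs")
```

Four test cases cover every class; testing more values from the same class would duplicate coverage without revealing new error conditions.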
145
Black box testing Documentation tests Common components of documentation: Functional descriptions of the software system Installation manual User manual Programmer manual. Document testing should include the following: Document completeness check Document correctness tests Document style and editing inspection. 145
146
Black box testing Availability tests (reaction time): The time needed to obtain the requested information. The time required for firmware to react. Of greater importance in on-line applications. Reliability tests deal with events occurring over time, such as: average time between failures, average time for recovery after system failure, average downtime per month. It is better to carry them out once the system is completed. Problems: the testing process may need hundreds of hours, and a comprehensive test case file must be constructed. 146
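As an illustrative sketch, the three reliability measures listed above can be computed from a failure log; the log below is invented example data (times in hours):

```python
# Invented failure log: operating time before each failure, and the
# downtime needed to recover after it.
uptimes_between_failures = [120.0, 200.0, 160.0]
recovery_times = [2.0, 1.0, 3.0]

# Average time between failures (MTBF) and average recovery time (MTTR).
mtbf = sum(uptimes_between_failures) / len(uptimes_between_failures)
mttr = sum(recovery_times) / len(recovery_times)

# Average downtime per month: downtime share of total time, scaled to
# a 30-day month.
hours_per_month = 24 * 30
total_hours = sum(uptimes_between_failures) + sum(recovery_times)
downtime_per_month = sum(recovery_times) / total_hours * hours_per_month

print(f"MTBF: {mtbf:.1f} h, MTTR: {mttr:.1f} h, "
      f"avg downtime/month: {downtime_per_month:.2f} h")
```

Collecting enough events for stable averages is exactly why these tests may need hundreds of operating hours.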
147
Black box testing Stress tests: load tests Relate to the functional performance of the system under maximal operational load (maximal transactions per minute, etc.). They are usually realized for loads higher than those indicated in the requirements specification. 147
148
Black box testing Stress tests: durability tests They are typically required for real-time firmware. These tests include: Firmware responses to climatic effects such as extreme hot and cold temperatures, dust, etc. Operation failures resulting from sudden electrical failures, voltage "jumps", and sudden cutoffs in communications. 148
149
Black box testing Software system security tests aim at: Preventing unauthorized access to the system or parts of it. Detection of unauthorized access and of the activities performed by the penetrator. Recovery from damage caused by unauthorized penetration. 149
150
Black box testing Training usability tests Used to estimate the resources needed to train a new employee: how many hours of training are required for a new employee to achieve a defined level of acquaintance with the system, or to reach a defined hourly production rate. Operational usability tests focus on the operator's productivity (quantitatively and qualitatively). 150
151
Black box testing Maintainability tests Concerned with: System structure compliance with the standards and development procedures. The programmer's manual, prepared according to approved documentation standards. Internal documentation, prepared according to coding procedures and conventions and covering the system's documentation requirements. 151
152
Black box testing Flexibility tests The efforts required to adapt the software to customer needs and to make additional changes improving system functionality. Testability tests The ease of testing the software system; the addition of special features that help the testers in their work, such as obtaining intermediate results and adding diagnostic tools for the analysis of the system and the reporting of any failure. 152
153
Black box testing Portability tests The possibility of using the tested software with other operating systems, hardware, or communication equipment standards. Interoperability tests Test the capability of interfacing with other hardware or software to receive inputs or send outputs. 153
154
Advantages and disadvantages of black box testing Advantages: Allows carrying out the majority of testing classes, such as load tests and availability tests. Requires fewer resources than those required for white box testing. Disadvantages: Coincidental aggregation of several errors may produce a correct response for a test case, thus preventing error detection. Absence of control over line coverage. Impossibility of testing the quality of coding and its strict adherence to the coding standards. 154
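The first disadvantage is worth a concrete sketch: two coded errors can cancel each other for a given test case, so a black box test sees a correct output. The buggy function below is an invented example:

```python
def average(values):
    """Intended: sum(values) / len(values).
    Bug 1: an extra term (the first element) is added to the sum.
    Bug 2: an off-by-one in the divisor."""
    return (sum(values) + values[0]) / (len(values) + 1)

# For this test case the two bugs cancel: (15 + 5) / 4 == 5.0, which is
# also the correct average of [5, 5, 5] -- the black box test passes.
masked = average([5, 5, 5])

# A second test case unmasks the defect: the true average of [2, 4, 6]
# is 4.0, but the function returns (12 + 2) / 4 == 3.5.
revealed = average([2, 4, 6])

print(masked, revealed)
```

White box path examination would catch both errors regardless of which inputs happen to cancel them.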
155
Chapter 10 Software testing Implementation 155
156
The testing process Determining the appropriate software quality standard Planning the tests Designing the tests Performing the tests (implementation) 156
157
Determining the appropriate software quality standard Depends on the characteristics of the software's application: the nature and magnitude of damage in case of system failure. The higher the expected level of damage, the higher the standard of software quality needed. See Table 8.1 (page 219). 157
158
Determining the software testing strategy This step decides: The testing strategy: a big bang or incremental testing strategy? Bottom-up or top-down? Which parts of the testing are to be performed using the white box testing model. Which parts of the testing must be performed using automated testing. 158
159
Planning the tests Unit tests Deal with small units of software or modules Integration tests Deal with several units that combine into a subsystem System tests Refer to the entire software package/system. 159
160
Planning the tests Commonly documented in a “software test plan” (STP) Issues to consider before initiating a specific test plan: What to test? Which sources to use for test cases? Who is to perform the tests? Where to perform the tests? When to terminate the tests? 160
161
What to test? It is always preferable to perform full and comprehensive testing, which would ensure top-quality software but requires the investment of vast resources. So we must decide: Which modules should be unit tested. Which integrations should be tested. How to allocate the available resources when performing the tests (priorities). 161
162
Priority rating method Uses two factors. Factor A, damage severity level: the severity of the results in case the module or application fails. Factor B, software risk level: the probability of failure. Combined rating (C): C = A + B, C = k × A + m × B, or C = A × B, where k and m are constants (see page 222). 162
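As an illustrative sketch, the three combined-rating forms can be computed side by side; the module names, the 1-5 scales for A and B, and the constants k and m below are invented placeholders:

```python
# Invented modules with damage severity (A) and risk level (B) on 1-5 scales.
modules = {
    "billing":   (5, 3),
    "reporting": (2, 2),
    "login":     (4, 4),
}

k, m = 2, 1  # illustrative constants for the weighted form

# Compute all three combined-rating forms for each module.
ratings = {
    name: {"A+B": a + b, "kA+mB": k * a + m * b, "A*B": a * b}
    for name, (a, b) in modules.items()
}

for name, r in ratings.items():
    print(f"{name:10s} {r}")
```

Whichever form is chosen, modules are then tested in descending order of C until the testing budget is exhausted.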
163
Priority rating method Issues affecting the software risk level Module/application issues: magnitude, complexity, difficulty, percentage of original software (vs. percentage of reused software). Programmer issues: professional qualifications, experience with the module's specific subject matter, availability of professional support, acquaintance with the programmer (the ability to evaluate his capabilities). 163
164
Which sources to use for test cases? Should we use synthetic test cases or real-life test cases? Stratified sampling breaks down a random sample into sub-populations of test cases, reducing the sampled proportion of the majority "regular" population while increasing the sampling proportion of small populations and high-potential-error populations, thus minimizing the number of repetitions. 164
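As an illustrative sketch of stratified sampling of real-life test cases: the "regular" majority is sampled at a low rate and the small, error-prone sub-populations at high rates. The sub-population names, sizes, and rates below are invented:

```python
import random

# Invented sub-populations: (population size, sampling proportion).
sub_populations = {
    "regular transactions":   (10_000, 0.01),  # majority: under-sampled
    "rare transaction types": (200,    0.25),  # small population: over-sampled
    "known error-prone area": (100,    0.50),  # high error potential: over-sampled
}

random.seed(0)  # reproducible sketch
sample = {}
for name, (size, rate) in sub_populations.items():
    n = max(1, round(size * rate))            # at least one case per stratum
    sample[name] = random.sample(range(size), n)

sizes = {name: len(cases) for name, cases in sample.items()}
print(sizes)
```

The stratified sample tests 200 cases instead of the 10,300-case population, while still covering every stratum.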
165
Which sources to use for test cases? For each component of the testing plan we must decide: Whether to use a single source of test cases or combined sources. How many test cases from each source are to be prepared. The characteristics of the test cases. 165
166
Who performs the tests? Integration tests and unit tests are generally performed by the software development team, in some instances by the testing unit. System tests are usually performed by an independent testing team (internal or external). In large software systems more than one testing team can be employed. In small software development organizations: another development team in the enterprise, or outsourcing of testing responsibilities. 166
167
Where to perform the tests? Testing is usually performed at the software developer's site. When a test is performed by external testing consultants, it is performed at the consultant's site. As a rule, the environment at the customer's site differs from that at the developer's site, despite efforts to "simulate" that environment. 167
168
When are tests terminated? The completed implementation route. The mathematical models application route. The error seeding route. The dual independent testing teams route. Termination after resources have petered out. See pages 226-227. 168
169
Test design Composed of: Detailed design of and procedures for each test (the software test procedure document). The test case database/file (the test case file document). In some cases the two documents are integrated into one document called the software test description (STD). 169
170
Test implementation In general it consists of: A series of tests. Corrections of detected errors. Re-tests (regression tests). Regression testing is done to verify that the errors detected in previous test runs have been properly corrected and that no new errors have entered as a result of faulty corrections. 170
171
Test implementation It is advisable to re-test according to the original test procedure. Usually only a portion of the original test procedure is re-tested to save testing resources, but this involves the risk of not detecting new errors introduced while performing the corrections. 171
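A minimal sketch of a regression suite: after a correction, the case that exposed the defect is kept alongside the original cases, which are re-run to catch faulty corrections. The function, its boundary bug, and the cases are invented for illustration:

```python
def discount(amount):
    """Corrected version of a hypothetical function: a 10% discount is
    granted only above 100 (the original defect applied it at exactly 100)."""
    return amount * 0.9 if amount > 100 else amount

# Regression suite: (input, expected output) pairs.
regression_cases = [
    (100, 100),    # the boundary case that exposed the original defect
    (50, 50),      # original cases, re-run to detect faulty corrections
    (200, 180.0),
]

for given, expected in regression_cases:
    assert discount(given) == expected, (given, expected)

print("regression suite passed")
```

Re-running the full suite (rather than only the failing case) is what guards against a correction that silently breaks a previously passing case.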
172
Automated testing This process includes Test planning Test design Test case preparation Test performance Test log and report preparation Re-testing after correction of detected errors (regression tests) Final test log and report preparation including comparison reports 172
173
Automated testing Types of automated tests Code auditing (qualification testing): Does the code fulfill the code structure/style requirements? Module size, levels of loop nesting and subroutine nesting, prohibited constructs such as GOTO, naming conventions for variables, files, etc. Does the internal program documentation follow the coding style procedures? Location of comments in the file, help index and presentation style. 173
174
Automated testing Types of automated tests Functional tests Replace manual black-box correctness tests These tests can be executed with minimal effort or professional resources. Coverage monitoring Produce reports about the line coverage. 174
175
Automated testing Types of automated tests Load tests: If performed manually, load testing is in most cases impractical and in others impossible. The solution is to use computerized simulations. In general it is combined with availability and efficiency tests. In this test the load is gradually increased to the specified maximal load and beyond. 175
176
Automated testing Types of automated tests Load tests: The tester may wish to: Change the hardware. Change the scenario in order to reveal the load contributed by each user or event. Test an entirely different scenario. Test new combinations of hardware and scenario components. The tester continues these iterations until he finds the appropriate hardware configuration. 176
177
Automated testing Types of automated tests Test management: Monitors the performance of every item on long lists of test case files, provides testers with reports, and supports timetable follow-up. 177
178
Automated testing Advantages of automated tests Accuracy and completeness of performance Accuracy of results log and summary reports Comprehensiveness of information Few manpower resources required to perform tests Shorter duration of testing Performance of complete regression tests Performance of test classes beyond the scope of manual testing. 178
179
Automated testing Disadvantages of automated tests High investments required in package purchasing and training. High package development investment costs. High manpower requirements for test preparation. Considerable testing areas left uncovered. See the example on page 245. 179
180
Alpha site tests "Alpha site tests" are tests of a new software package performed at the developer's site by the customer. The identified errors are expected to include errors that only a real user can reveal, and they should therefore be reported to the developer. 180
181
Beta site tests Beta site tests are much more commonly applied than alpha site tests. They are applied to an advanced version of the software package, which the developer offers free of charge to one or more potential users, who test it. 181
182
Alpha and beta site testing Advantages Identification of unexpected errors A wider population in search of errors Low costs Disadvantages A lack of systematic testing Low quality error reports Difficult to reproduce the test environment Much effort is required to examine reports 182
183
Chapter 11 Assuring the quality of software maintenance components 183
184
Maintenance service components Corrective maintenance: support services and software corrections. Adaptive maintenance: adapts the software package to differences in new customer requirements. Functionality improvement maintenance: perfective maintenance, in which new functions are added to the software so as to enhance performance, and preventive maintenance, activities that improve reliability and system infrastructure for easier and more efficient future maintainability. 184
185
Causes of user difficulties Code failure: software failure. Documentation failure: user's manual, help screens; incomplete, vague, or imprecise documentation. Insufficient knowledge of the software system: failure to use the documentation supplied. 185
186
Software maintenance QA activities: objectives Assure, with an accepted level of confidence that: the software maintenance activities conform to the functional technical requirements. the software maintenance activities conform to managerial scheduling and budgetary requirements Initiate and manage activities to improve and increase the efficiency of software maintenance and SQA activities 186
187
The foundations of high quality Foundation 1: software package quality Foundation 2: maintenance policy 187
188
Foundation 1: software package quality 188
189
Foundation 2: maintenance policy 189 Version development policy How many versions of the software should be operative simultaneously The number of versions becomes a major issue for COTS software packages Can take a “sequential” or “tree” form
Foundation 2: maintenance policy
Version development policy: sequential version policy
One version is made available to the entire customer population. It includes a profusion of applications that exhibit high redundancy, an attribute that enables the software to serve the needs of all customers. The software must be revised periodically, but once a new version is completed, it replaces the version currently used by the entire user population.
Foundation 2: maintenance policy
Version development policy: tree version policy
Supports marketing efforts by developing specialized, targeted versions for groups of customers. A new version is inaugurated by adding special applications or omitting applications. Versions vary in complexity and level of application.
Foundation 2: maintenance policy
Version development policy: tree version policy (cont.)
The software package can evolve into a multi-version package: a tree with several main branches and numerous secondary branches, each branch representing a version with specialized revisions. This is more difficult and time-consuming to maintain, so some organizations apply a limited tree version policy (see the example on p. 260).
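The contrast between the two version policies can be illustrated with a small sketch (this is an illustration of the idea, not material from the book; all version names are hypothetical). A sequential policy keeps a single line of versions, so only one leaf version is ever maintained; a tree policy lets specialized versions branch off, and each branch tip must be maintained in parallel.

```python
# Illustrative sketch: modeling version development policies.
# A sequential policy is a degenerate tree with one branch; a tree
# policy adds branches for customer groups, multiplying the number
# of versions the maintenance team must support simultaneously.

class Version:
    def __init__(self, name, parent=None):
        self.name = name
        self.children = []
        if parent:
            parent.children.append(self)

    def maintained_versions(self):
        """Number of leaf versions that must be maintained in parallel."""
        if not self.children:
            return 1
        return sum(c.maintained_versions() for c in self.children)

# Sequential policy: each version simply replaces its predecessor.
v1 = Version("1.0")
v2 = Version("2.0", parent=v1)

# Tree policy: specialized versions branch off version 2.0 for
# hypothetical customer groups.
retail = Version("2.0-retail", parent=v2)
banking = Version("2.0-banking", parent=v2)

print(v1.maintained_versions())  # 2 parallel versions to maintain
```

The sketch makes the maintenance cost of the tree policy visible: every added branch raises the count of versions that bug repairs and improvements must be propagated to, which is why some organizations limit the tree.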
Foundation 2: maintenance policy
Change policy
Refers to the method of examining each change request and the criteria used for its approval. A permissive policy contributes to an often-unjustified increase in the change task load.
Foundation 2: maintenance policy
Change policy (cont.)
A balanced policy is preferred. It allows staff to focus on the most important and beneficial changes, as well as those that they will be able to perform within a reasonable time and according to the required quality standards.
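A balanced change policy could be sketched as a simple scoring rule: weigh each request's importance and expected benefit, and reject requests that cannot be completed within a reasonable time. The weights, scales and threshold below are invented for illustration and are not values from the book.

```python
# Hypothetical sketch of a "balanced" change policy: approve a change
# request only if it scores high enough on importance and benefit AND
# can be completed within a reasonable effort budget.

def approve_change(importance, benefit, effort_days,
                   max_effort_days=20, threshold=6.0):
    """importance and benefit are rated on a 0-10 scale (assumed).
    Returns True if the request should be approved."""
    if effort_days > max_effort_days:   # cannot finish in reasonable time
        return False
    score = 0.6 * importance + 0.4 * benefit
    return score >= threshold

print(approve_change(importance=9, benefit=8, effort_days=10))   # True
print(approve_change(importance=3, benefit=4, effort_days=2))    # False
print(approve_change(importance=9, benefit=9, effort_days=60))   # False
```

The point of the sketch is the balance itself: a high-value change is still rejected when its effort exceeds what the team can deliver to the required quality standard in reasonable time.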
Maintenance software quality assurance tools
SQA tools for corrective maintenance
SQA tools for functionality improvement maintenance
SQA infrastructure tools for software maintenance
SQA tools for managerial control of software maintenance
Maintenance software quality assurance tools
SQA tools for corrective maintenance
These entail user support services and software corrections ("bug repairs"). Most bug repair tasks require the use of a mini-testing tool, needed to handle small-scale repair patch tasks (a small number of code lines to be corrected rapidly).
Maintenance software quality assurance tools
SQA tools for functionality improvement maintenance
The same project life cycle tools are applied (reviews, testing, etc.). They are also implemented for large-scale adaptive maintenance tasks.
Maintenance software quality assurance tools
SQA infrastructure components for software maintenance
SQA infrastructure tools are needed for:
Maintenance procedures and work instructions
Supporting quality devices
Preventive and corrective actions
Configuration management
Documentation and quality record control
Training and certification of maintenance teams
Maintenance software quality assurance tools
Maintenance procedures and work instructions
Remote handling of requests for service
On-site handling
User support service
Quality assurance control
Customer satisfaction surveys
Maintenance software quality assurance tools
Supporting quality devices
Checklists for locating the causes of a failure
Templates for reporting how software failures were solved
Checklists for preparing a mini-testing procedure document
Maintenance software quality assurance tools
Preventive and corrective actions
Directed and controlled by the CAB (Corrective Action Board). Indications that call for action include:
Changes in the content and frequency of customer requests for user support services
Increased average time invested in complying with customers' user support requests
Increased average time invested in repairing customers' software failures
Increased percentage of software correction failures
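Indicators like the ones above are computed from routine maintenance records. The following sketch shows how a Corrective Action Board might derive two of them; the record fields and sample data are invented for illustration.

```python
# Hypothetical sketch: computing maintenance indicators from service
# records so the CAB can spot trends (rising repair time, rising
# percentage of failed corrections). Field names are assumptions.

repair_records = [
    {"hours": 4.0, "repair_failed": False},
    {"hours": 9.0, "repair_failed": True},   # the correction itself failed
    {"hours": 5.0, "repair_failed": False},
]

def avg_repair_hours(records):
    """Average time invested in repairing customers' software failures."""
    return sum(r["hours"] for r in records) / len(records)

def correction_failure_pct(records):
    """Percentage of software corrections that themselves failed."""
    failed = sum(1 for r in records if r["repair_failed"])
    return 100.0 * failed / len(records)

print(avg_repair_hours(repair_records))                   # 6.0
print(round(correction_failure_pct(repair_records), 1))   # 33.3
```

Comparing these figures period over period is what turns raw service records into the early-warning signals the CAB acts on.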
Maintenance software quality assurance tools
Configuration management
Failure repair: identifying the software system version installed at the customer's site and obtaining a copy of the current code and its documentation.
Group replacement: decision making about the advisability of performing a group replacement; planning the group replacement, allocating resources and determining the timetable.
Maintenance documentation and quality records.
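The configuration records underlying a group replacement decision can be sketched as a simple mapping from customer sites to installed versions; planning then starts from the list of sites still running an older version. Site names and version numbers below are invented for illustration.

```python
# Illustrative sketch (assumed, not from the book): configuration
# records used to plan a group replacement to a target version.

installed = {
    "site-A": "3.0",
    "site-B": "3.1",
    "site-C": "3.2",   # already on the target version
}

def sites_needing_replacement(installed, target):
    """Sites not yet running the target release, in stable order."""
    return sorted(site for site, ver in installed.items()
                  if ver != target)

print(sites_needing_replacement(installed, "3.2"))  # ['site-A', 'site-B']
```

With this list in hand, the maintenance team can estimate the resources and timetable for the replacement campaign, and the same records later tell a failure-repair engineer exactly which code copy matches each site.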
Chapter 12 Assuring the quality of external participants' contributions
Types of external participants
Subcontractors ("outsourcing"): undertake to carry out parts of a project. Advantages: staff availability, special expertise, or low prices.
Suppliers of COTS software and reused software modules. Advantages: reduced time and cost, and increased quality, since these components have already been tested and corrected by the developers and previous customers.
Types of external participants
The customers themselves. Why? To apply the customers' special expertise, respond to commercial or other security needs, keep internal development staff occupied, and prevent future maintenance problems.
Disadvantage: a good customer-supplier relationship is needed.
Risks and benefits of introducing external participants
Assuring the quality of external participants' contributions: objectives
Prevent delays and ensure early alert of anticipated delays.
Assure acceptable quality levels of the parts developed.
Assure adequate documentation to serve the maintenance team.
Assure continuous, comprehensive and reliable control over external participants' performance.
SQA tools for assuring the quality of external participants' contributions
1. Requirements document reviews: the contractor takes the role of the customer.
2. Participation in design reviews and software testing.
3. Preparation of progress reports on development activities.
4. Review of deliverables (documents) and acceptance tests.
SQA tools for assuring the quality of external participants' contributions
5. Establishment of a project coordination and joint control committee: confirmation of the project timetable and milestones; follow-up according to project progress reports; making decisions about problems arising during follow-up and about problems identified in design reviews and software tests.
6. Evaluation of choice criteria regarding external participants: collection of information; systematic evaluation of the suppliers.
SQA tools for assuring the quality of external participants' contributions
Collection of information
Internal information about suppliers and subcontractors: a past performance file based on cumulative experience, which requires systematic reporting.
Auditing the supplier's quality system.
Opinions of regular users of the supplier's products: internal units, other organizations, and professional organizations that certified the supplier as qualified to specialize in the field.
Chapter 17 Corrective and preventive actions
Corrective and preventive actions (CAPA): definitions
Corrective actions: a regularly applied feedback process that includes collection of information on quality non-conformities, identification and analysis of sources of irregularities, and development and assimilation of improved practices and procedures, together with control of their implementation and measurement of their outcomes.
Corrective and preventive actions (CAPA): definitions
Preventive actions: a regularly applied feedback process that includes collection of information on potential quality problems, identification and analysis of departures from quality standards, and development and assimilation of improved practices and procedures, together with control of their implementation and measurement of their outcomes.
The corrective and preventive actions process
Information collection
Analysis of information
Development of solutions and improved methods
Implementation of improved methods
Follow-up
Information collection
Analysis of collected information
Screening the information and identifying potential improvements.
Analysis of potential improvements: expected types and levels of damage; causes of faults; estimates of the extent of organization-wide potential faults of each type.
Generating feedback.
Development of solutions and their implementation
1. Updating relevant procedures
2. Changes in practices, including updating of relevant work instructions
3. Shifting to a development tool that is more effective and less prone to the detected faults
4. Improvement of reporting methods
5. Initiatives for training, retraining or updating staff
(See examples on p. 357.)
Follow-up of activities
1. Follow-up of the flow of development and maintenance CAPA records
2. Follow-up of implementation
3. Follow-up of outcomes
Organizing for preventive and corrective actions
The performance of CAPA activities depends on the existence of a permanent organizational core as well as ad hoc team participants.
Organizing for preventive and corrective actions
CAPA activities depend on a permanent organizational core and ad hoc team participants. Participants can be members of the SQA unit, top-level professionals, and development and maintenance department managers.
Organizing for preventive and corrective actions
CAB committee tasks include:
Collecting CAPA records
Screening the collected information
Nominating ad hoc CAPA teams
Promoting implementation of CAPA
Following up