CS 5150 Software Engineering, Lecture 20: Reliability 2



Administration

Metrics: Cost of Improved Reliability
[Graph: time and cost against the reliability metric, from 99% toward 100%.]
Will you spend your money on new functionality or on improved reliability? When do you ship?

Example: Central Computing System
A central computer system (e.g., a server farm) is vital to an entire organization. Any failure is serious.
Step 1: Gather data on every failure.
- Many years of data in a database
- Every failure analyzed: hardware, software (the default category), environment (e.g., power, air conditioning), human (e.g., operator error)

Example: Central Computing System
Step 2: Analyze the data.
- Weekly, monthly, and annual statistics: number of failures and interruptions, mean time to repair
- Graphs of trends by component, e.g., failure rates of disk drives, hardware failures after power failures, crashes caused by software bugs in each component
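The statistics in Step 2 can be computed directly from the failure log. A minimal sketch in Python (the lecture gives no code; the log format and component names are invented for illustration):

```python
from statistics import mean

# Hypothetical failure log: (component, hours_to_repair) pairs.
failures = [
    ("disk", 2.0), ("disk", 3.5), ("software", 0.5),
    ("power", 6.0), ("software", 1.0),
]

def mean_time_to_repair(log):
    """Mean time to repair (MTTR) across all recorded failures."""
    return mean(hours for _, hours in log)

def failures_by_component(log):
    """Count failures per component, the raw data for trend graphs."""
    counts = {}
    for component, _ in log:
        counts[component] = counts.get(component, 0) + 1
    return counts

print(mean_time_to_repair(failures))   # 2.6
print(failures_by_component(failures))
```

Run weekly, monthly, and annually over the accumulated database, these two numbers give exactly the trend lines the slide describes.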

Example: Central Computing System
Step 3: Invest resources where the benefit will be greatest, e.g.:
- Priority order for software improvements
- Changed procedures for operators
- Replacement hardware
- Orderly shutdown after power failure
Example: supercomputers may average only 10 hours of productive work per day.

Static and Dynamic Verification
Static verification: techniques of verification that do not involve executing the software. May be manual or may use computer tools.
Dynamic verification: testing the software with trial data, and debugging to remove errors.

Static Verification: Reviews
Static verification is carried out throughout the software development process.
[Diagram: reviews apply at every stage — requirements specification, design, and program.]

Reviews of Design or Code
Concept: colleagues review each other's work.
- Can be applied to any stage of software development, but is particularly valuable for reviewing program design or code
- Can be formal or informal
Reviews are a fundamental part of good software development.

Review Process
Preparation:
- The developer provides colleagues with documentation (e.g., specification or design) or a code listing
- Participants study the documentation in advance
Meeting:
- The developer leads the reviewers through the documentation, describing what each section does and encouraging questions
- Allow plenty of time, and be prepared to continue on another day

Benefits of Design Reviews
Benefits:
- Extra eyes spot mistakes and suggest improvements
- Colleagues share expertise, which helps with training
- An occasion to tidy loose ends
- Incompatibilities between components can be identified
- Helps scheduling and management control
Fundamental requirements:
- Senior team members must show leadership
- Good reviews require good preparation
- Everybody must be helpful, not threatening

Roles of the Review Team
A review is a structured meeting with the following roles:
- Moderator: ensures that the meeting moves ahead steadily
- Scribe: records the discussion in a constructive manner
- Developer: person(s) whose work is being reviewed
- Interested parties: people above and below in the software process
- Outside experts: knowledgeable people who are not working on this project
- Client: representatives of the client who are knowledgeable about this part of the process

Static Verification: Pair Design and Pair Programming
Concept: achieve the benefits of review through shared development. Two people work together as a team on:
- design and/or coding
- testing and system integration
- documentation and hand-over
Benefits include:
- two people create better software with fewer mistakes
- cross-training
Many software houses report excellent productivity.

Static Verification: Analysis Tools
Program analyzers scan the source of a program for possible faults and anomalies (e.g., Lint for C programs).
- Control flow: loops with multiple exit or entry points
- Data use: undeclared or uninitialized variables, unused variables, multiple assignments, array bounds
- Interface faults: parameter mismatches, non-use of function results, uncalled procedures
- Storage management: unassigned pointers, pointer arithmetic
Good programming practice eliminates all warnings from source code.
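To make the idea concrete, here is a toy analyzer in the spirit of Lint's data-use checks, sketched in Python with the standard `ast` module (this is not one of the lecture's tools; it detects only one anomaly, names assigned but never read):

```python
import ast

def unused_locals(source):
    """Report names assigned but never read: the kind of
    data-use anomaly a tool like Lint would flag."""
    tree = ast.parse(source)
    assigned, used = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            if isinstance(node.ctx, ast.Store):
                assigned.add(node.id)
            else:
                used.add(node.id)
    return sorted(assigned - used)

code = """
def f(x):
    total = x + 1
    scratch = 0      # assigned, never used
    return total
"""
print(unused_locals(code))  # ['scratch']
```

Real analyzers do the same kind of walk over the program's syntax tree and flow graph, just for many more fault classes at once.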

Static Analysis Tools (continued)
- Cross-reference table: shows every use of a variable, procedure, object, etc.
- Information flow analysis: identifies the input variables on which each output depends
- Path analysis: identifies all possible paths through the program

Static Analysis Tools in Programming Toolkits
[Screenshot slides: examples of static analysis tools in programming toolkits.]


Static Verification: Program Inspections
Formal program reviews whose objective is to detect faults.
- Code may be read or reviewed line by line
- 150 to 250 lines of code in a two-hour meeting
- Use a checklist of common errors
- Requires team commitment, e.g., trained leaders
So effective that it has been claimed inspections can replace unit testing.

Inspection Checklist: Common Errors
- Data faults: initialization, constants, array bounds, character strings
- Control faults: conditions, loop termination, compound statements, case statements
- Input/output faults: all inputs used; all outputs assigned a value
- Interface faults: parameter numbers, types, and order; structures and shared memory
- Storage management faults: modification of links, allocation and de-allocation of memory
- Exceptions: possible errors, error handlers
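An example of what an inspector reads for, sketched in Python (the function is invented for illustration, not from the lecture). The first version contains a loop-termination fault from the "control faults" checklist item; the second is the corrected code:

```python
def find_index_buggy(items, target):
    """Return the index of target in items, or -1."""
    # Control fault: the loop stops one short of the end,
    # so the final element is never examined.
    for i in range(len(items) - 1):
        if items[i] == target:
            return i
    return -1

def find_index_fixed(items, target):
    """Corrected version with the right loop bound."""
    for i in range(len(items)):
        if items[i] == target:
            return i
    return -1

print(find_index_buggy(["a", "b", "c"], "c"))  # -1: the fault hides the match
print(find_index_fixed(["a", "b", "c"], "c"))  # 2
```

Reading the loop bound against the checklist question "does the loop terminate after examining every element?" is exactly how a line-by-line inspection catches this without ever running the code.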

Dynamic Verification: Stages of Testing
Testing is most effective if divided into stages:
- User interface testing (carried out separately in the process)
- Unit testing: unit test
- System testing: integration test, function test, performance test, installation test
- Acceptance testing

Testing Strategies
- Bottom-up testing: each unit is tested in its own test environment.
- Top-down testing: large components are tested with dummy stubs. Useful for user interfaces, work-flow, and client and management demonstrations.
- Stress testing: tests the system at and beyond its limits. Important for real-time systems and transaction processing.
Most systems require a combination of all three strategies.
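A dummy stub, as used in top-down testing, can be as small as this Python sketch (the invoicing scenario and all names are illustrative, not from the lecture):

```python
def tax_stub(amount):
    """Dummy stub standing in for the unwritten tax component.
    Returns a canned answer instead of computing anything."""
    return 0.0

def invoice_total(amount, tax_fn=tax_stub):
    """High-level component under test, parameterized on
    its lower-level collaborator."""
    return amount + tax_fn(amount)

# The top-level work-flow can be exercised, and even demonstrated
# to clients, before the real tax calculation exists.
print(invoice_total(100.0))  # 100.0
```

When the real component is written, it replaces the stub without changing the high-level code, and the same tests are rerun.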

Methods of Testing
Closed box testing: carried out by people who do not know the internals of what they are testing. Example: an IBM educational demonstration that was not foolproof.
Open box testing: carried out by people who know the internals of what they are testing. Example: tick marks on the graphing package.

Testing: Unit Testing
- Tests on small sections of a system, e.g., a single class
- Emphasis is on accuracy of the actual code against the specification
- Test data is chosen by the developer(s), based on their understanding of the specification and knowledge of the unit
- Can be at various levels of granularity
- Open box or closed box: by the developer(s) of the unit or by special testers
If unit testing is not thorough, system testing becomes almost impossible. If you are working on a project that is behind schedule, do not rush the unit testing.
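A minimal unit test in Python's standard `unittest` framework (the unit and its specification are invented for illustration); note how each case comes from the developer's reading of the spec:

```python
import unittest

def word_count(text):
    """Unit under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Test data chosen from the developer's understanding of the
    # specification: ordinary input, empty input, extra whitespace.
    def test_simple(self):
        self.assertEqual(word_count("one two three"), 3)

    def test_empty(self):
        self.assertEqual(word_count(""), 0)

    def test_extra_spaces(self):
        self.assertEqual(word_count("  a   b "), 2)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because the tests run against a single small unit, a failure points directly at the code at fault, which is what makes later system testing tractable.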

Testing: System and Sub-System Testing
- Tests on components or the complete system, combining units that have already been thoroughly tested
- Emphasis on integration and interfaces
- Trial data that is typical of the actual data, and/or stresses the boundaries of the system, e.g., failures, restart
- Carried out systematically, adding components until the entire system is assembled
- Open or closed box: by the development team or by special testers
System testing is finished fastest if each component is completely debugged before the next is assembled.

Dynamic Verification: Test Design
Testing can never prove that a system is correct. It can only show that (a) the system is correct in a special case, or (b) the system has a fault.
The objective of testing is to find faults, or to demonstrate that the program is correct in specific instances.
Testing is never comprehensive. Testing is expensive.

Test Cases
Test cases are specific tests chosen because they are likely to find specific faults. They are chosen to balance expense against the chance of finding serious faults.
- Cases chosen by the development team are effective in testing known vulnerable areas.
- Cases chosen by experienced outsiders and clients are effective in finding gaps left by the developers.
- Cases chosen by inexperienced users will find other faults.
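One common way to choose cases likely to find faults is to cluster them at specification boundaries, where off-by-one mistakes live. A Python sketch under an invented specification (a quantity discount rule; not from the lecture):

```python
# Hypothetical spec: orders of 100 units or more get a 10% discount.
def unit_price(quantity, base=2.0):
    return base * 0.9 if quantity >= 100 else base

# Cases clustered at the boundary, where faults are most likely:
cases = [
    (99, 2.0),    # just below the discount threshold
    (100, 1.8),   # exactly at the threshold
    (101, 1.8),   # just above it
    (1, 2.0),     # an ordinary small order, for contrast
]
for qty, expected in cases:
    assert unit_price(qty) == expected
print("all boundary cases pass")
```

If the implementer had written `>` instead of `>=`, only the case at exactly 100 would catch it, which is why that case earns its cost.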

Dynamic Verification: Regression Testing
Regression testing is one of the key techniques of software engineering. The basic technique is to repeat the entire testing process after every change, however small.
When software is modified, regression testing provides confidence that the modifications behave as intended and do not adversely affect the behavior of unmodified code.

Regression Testing: Program Testing
1. Collect a suite of test cases, each with its expected behavior.
2. Create scripts to run all test cases and compare with expected behavior. (Scripts may be automatic or have human interaction.)
3. When a change is made to the system, however small (e.g., a bug is fixed), add a new test case that illustrates the change (e.g., a test case that revealed the bug).
4. Before releasing the changed code, rerun the entire test suite.
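The four steps above can be sketched as a tiny regression harness in Python (the function under test and the cases are invented for illustration):

```python
def slugify(title):
    """Function under regression test: turn a title into a URL slug."""
    return "-".join(title.lower().split())

# Step 1: a suite of test cases, each with its expected behavior.
suite = [
    ("Hello World", "hello-world"),
    ("  Reliability  2 ", "reliability-2"),
]

# Step 3: after a bug fix, a case that revealed the bug joins the suite.
suite.append(("MiXeD Case", "mixed-case"))

# Steps 2 and 4: a script reruns every case and compares results.
def run_suite(fn, cases):
    """Return the list of failing cases; empty means all pass."""
    return [(arg, expected, fn(arg))
            for arg, expected in cases if fn(arg) != expected]

print(run_suite(slugify, suite))  # [] means every case still passes
```

Because the suite only ever grows, each rerun checks both the new behavior and everything that worked before the change.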

Incremental Testing (e.g., Daily Testing)
- Create a first iteration that has the structure of the final system and some basic functionality.
- Create an initial set of test cases.
- Check in changes to the system on a daily basis; rebuild the system daily.
- Run the full regression test daily; identify and deal with any new faults.
- Add new test cases continually.

Incremental Testing: a Small Example
Example: a graphics package consisting of a pre-processor, a runtime package (a set of classes), and several device drivers.
Starting point: a prototype with a good overall structure and most of the functionality, but badly coded and full of bugs.
Approach, on a daily cycle:
- Design and code one small part of the package (e.g., an interface, a class, a dummy stub, an algorithm within a class).
- Integrate it into the prototype.
- Create additional test cases if needed.
- Regression test.

Documentation of Testing
Testing should be documented for thoroughness, visibility, and maintenance:
(a) Test plan
(b) Test specification and evaluation
(c) Test suite and description
(d) Test analysis report