1 Software Verification & Validation
Karen Smiley, ABB Inc., US Corporate Research
Duke SE, Fall 2003 – 11/04/2003

2 Topics
Classifications of Testing – 5 min
Testing Models: V, W, TDD – 10 min
Software Lifecycle and Cost-Effectiveness of V&V – 5 min
Types of “Tests” – 5 min
Static Verification Methods – 5 min
– Personal and Peer Reviews – 10 min
– Inspections – 5 min
Predicting Software Quality – 5 min
Programming Defensively – 5 min
Test Programming & Automation – 5 min
Key Questions in Test Planning – 5 min
Some Testing Experiences – 5 min
Questions?

3 Classifications of Testing
Test Categories: White (Clear) Box, Black Box, “Gray” Box
Distinguishing Questions:
– Verification – did we build the thing right? (correctly)
– Validation – did we build the right thing? (what the customer desired)
Customer requirements should be “testable”.

4 Testing Models: Traditional (“V”)

5 Testing Models: Contemporary (“W”)

6 Agile Testing Models: Test-Driven Development (TDD)
Start Development with a High-Level Design for the planned features (“user stories”), then repeat until all features are done:
– Add 1 new user story
– Write unit test(s) for the new user story
– Run all unit tests: all old UTs should pass and all new UTs should fail (if an old UT fails, or a new UT doesn’t fail, fix that first)
– Write functionality for the new user story (“write” => detailed design and coding, with “refactoring” as needed)
– Run all unit tests again; continue until all UTs pass (old and new)
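
Below is a minimal sketch of one TDD cycle using Python’s built-in unittest; the user story and the add_story_points helper are hypothetical examples chosen for illustration, not taken from the slides.

```python
import unittest

# Step 1: write the unit test for the new user story first.
# (add_story_points is a hypothetical example function, not from the slides.)
class TestAddStoryPoints(unittest.TestCase):
    def test_sums_points_of_completed_stories(self):
        stories = [{"points": 3, "done": True},
                   {"points": 5, "done": False},
                   {"points": 2, "done": True}]
        self.assertEqual(add_story_points(stories), 5)

# Step 2: run the suite -- the new test fails because the function
# does not exist yet (all old tests should still pass).

# Step 3: write just enough functionality to make the new test pass.
def add_story_points(stories):
    """Sum the points of all completed user stories."""
    return sum(s["points"] for s in stories if s.get("done"))

# Step 4: run all tests again; refactor while keeping them green.
if __name__ == "__main__":
    unittest.main()
```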

7 V&V in the Software Lifecycle

8 Cost-Effectiveness of V&V

9 Types of “Tests”
Static – do not involve executing the software under test – examples: inspections, reviews, scenario walkthroughs, tools
Integration – check unit-tested components for how well they fit together – examples: SW modules, SW-HW, “smoke”, GUI, interfaces
Functional – execute the software to see how well it performs – examples: unit test, component, system, transaction, compliance
Non-functional – verify compliance with non-functional requirements – examples: configuration, usability, security
Performance – verify resilience and reliability of the system – examples: stress, reliability/soak, availability, fail-over

10 Static Verification Methods
Reviews
– Personal reviews
– Peer reviews
Inspections
Walkthroughs
Automated tools
All are white-box, can be done on any kind of software, and are far more effective than testing at finding defects.
Note: “Pair Programming” involves continuous peer review/inspection of design and code.

11 Personal Reviews
Use your own personal checklists
– look for the defects you tend to inject in each type of work product
Review code before you compile – why?
– You have to review to find all of your syntax errors anyhow
– Reviews are 1-pass; getting to a clean compile can take multiple passes
– It allows you to use the compiler as an objective way to assess (before testing) how effective your code review was
Yield = (number of bugs found by your review) / (number of bugs that existed when you started the review*)
* you don’t know this total until later, when you find bugs that were missed
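
As a small illustration of the yield formula above (the helper and numbers are made up, not from the slides), note that the denominator is only known in hindsight, once later phases reveal the defects the review missed:

```python
def review_yield(found_in_review, found_later):
    """Yield = defects found by the review / defects present when the review started.

    found_later counts defects that escaped the review and were found in later
    phases (compile, test, field), so yield can only be computed retrospectively.
    """
    present_at_review = found_in_review + found_later
    if present_at_review == 0:
        return None  # nothing to find; yield is undefined
    return found_in_review / present_at_review

# Example: 9 defects found in the personal review, 3 slipped through.
print(review_yield(9, 3))  # 0.75 -> 75% yield
```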

12 Peer Reviews
Asking one or more of your peers to review your work and provide feedback will:
– Leverage the experience of your colleagues to benefit your work products:
  authors are inherently biased reviewers
  you learn what kinds of problems they have learned, from their experience, to look for
– Improve a development team’s “truck number”
Peer reviews can be done:
– real-time, as a group
– asynchronously (e.g., via e-mail)
“Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.” – Douglas Adams

13 Inspections
More formal than peer reviews – example: “Fagan inspections”
Also use checklists, but apply them more thoroughly – line by line, paragraph by paragraph
Require advance preparation by:
– Author (e.g., clean up typos, fix any compilation errors)
– Multiple reviewers
– Moderator
Take more time (slower ‘review rate’ than reviews)
Can deliver excellent overall yield
– The “capture-recapture” method can predict how many defects are likely to remain (see the sketch below)
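
A sketch of the basic two-reviewer capture-recapture (Lincoln-Petersen) estimate referred to above; the formula is the standard one, but the function and variable names here are illustrative, not taken from the slides.

```python
def estimate_remaining_defects(found_by_a, found_by_b, found_by_both):
    """Two-reviewer capture-recapture (Lincoln-Petersen) estimate.

    Estimated total defects ~= (A * B) / AB, where A and B are the counts
    each inspector found and AB is the overlap.  Remaining defects =
    estimated total - unique defects actually found.
    """
    if found_by_both == 0:
        raise ValueError("no overlap between inspectors: estimate is unbounded")
    estimated_total = (found_by_a * found_by_b) / found_by_both
    unique_found = found_by_a + found_by_b - found_by_both
    return max(0.0, estimated_total - unique_found)

# Example: inspector A finds 12 defects, B finds 10, and 8 are common.
# Estimated total ~= 15, so about 1 defect likely remains after the inspection.
print(estimate_remaining_defects(12, 10, 8))
```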

14 Key Success Factors for Reviews and Inspections
Training in how to do them
Proper preparation by participants
Good checklists
Reasonable review rate (amount of work inspected per hour)
– Review rate can be a good predictor of yield
– Optimal: neither too slow (low) nor too fast (high)
If review rates are too high, consider re-reviewing or re-inspecting.
If modules prove to be more defective in later phases than expected, consider re-reviewing / re-inspecting or redesigning / rewriting the module.

15 Predicting Software Quality
Measure, for each type of development activity: size, time, and defects.
Calculate (a small computational sketch follows below):
– yields (# of defects found vs. # present)
  typical yields from reviews and inspections: ~70–90%
  typical yields from System Test and Acceptance Test: ~50%
– projected number of defects remaining
  # of defects remaining after ST/AT ~= the # you found in ST/AT
– defect densities (# of defects found / size)
  reviews, inspections, compile, unit test
– time ratios
  design review vs. design (>= 100% for Pair Programming?)
  code review vs. coding (>= 100% for Pair Programming?)
  design vs. coding
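
A small sketch of two of the calculations listed above: defect density and the “defects remaining after system test” heuristic (remaining ~= number found in ST/AT). The activity names and numbers are made up for illustration.

```python
def defect_density(defects_found, size_kloc):
    """Defects found per KLOC for one activity (review, compile, unit test, ...)."""
    return defects_found / size_kloc

def projected_remaining_after_st(defects_found_in_st):
    """Heuristic from the slide: defects remaining after system/acceptance test
    is roughly equal to the number found during system/acceptance test
    (because ST/AT yield is typically only ~50%)."""
    return defects_found_in_st

# Example: a 4 KLOC component with 18 defects found in code review,
# and 6 defects found during system test.
print(defect_density(18, 4.0))          # 4.5 defects/KLOC found in review
print(projected_remaining_after_st(6))  # ~6 defects likely still latent
```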

16 PSP(SM)/TSP(SM) Quality Profile
Five components (plotted as the axes of a radar chart):
– Design Time / Code Time (ratio >= 100%)
– Design Review Time / Design Time (ratio >= 50%)
– Code Review Time / Code Time (ratio >= 50%)
– Compile Defect Density (< 10 defects/KLOC)
– Unit Test Defect Density (< 5 defects/KLOC)
Each value is 0 at the center and 1 at the outer edge if the goal value is met or bettered. PQI = product of all 5 values; a PQI >= 0.4 indicates a likely high-quality component.
Source: Software Engineering Institute (http://sei.cmu.edu) at Carnegie Mellon University; see the SEI’s TSP pages for further information on the Quality Profile. PSP, TSP, Personal Software Process, and Team Software Process are service marks of CMU.
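
A hedged sketch of the PQI idea: each of the five components is normalized to [0, 1] (reaching 1 when its goal is met or bettered) and the PQI is their product. The slide does not give the SEI’s exact normalization curves, so the simple linear capping below is an assumption made only for illustration.

```python
def capped_ratio(actual, goal):
    """Scale a 'bigger is better' measure to [0, 1], capping at 1 when the goal is met.
    NOTE: the real PSP/TSP profile uses SEI-defined curves; this linear cap is
    only an illustrative assumption."""
    return min(1.0, actual / goal)

def capped_density(actual, limit):
    """Scale a 'smaller is better' defect density to [0, 1] (1 when under the limit)."""
    return min(1.0, limit / actual) if actual > 0 else 1.0

def pqi(design_time, code_time, design_review_time, code_review_time,
        compile_defects_per_kloc, unit_test_defects_per_kloc):
    components = [
        capped_ratio(design_time / code_time, 1.0),           # design/code time >= 100%
        capped_ratio(design_review_time / design_time, 0.5),  # design review/design >= 50%
        capped_ratio(code_review_time / code_time, 0.5),      # code review/code >= 50%
        capped_density(compile_defects_per_kloc, 10.0),       # compile density < 10/KLOC
        capped_density(unit_test_defects_per_kloc, 5.0),      # unit test density < 5/KLOC
    ]
    product = 1.0
    for c in components:
        product *= c
    return product

# Example: 8 h design vs. 10 h coding, review times a bit short,
# compile density 12/KLOC, unit-test density 6/KLOC -> PQI ~ 0.33 (< 0.4).
print(pqi(8, 10, 3, 4, 12.0, 6.0))
```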

17 Programming Defensively
Date input
– First UI program – line input overflow
– The same type of problem is still being exploited by modern viruses!
Most security vulnerabilities in software are due to [design] flaws.
– Many such bugs can be found with a good review or inspection
Range checking
– Dates (Y2K)
“The best defense is a good offense.”
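
A small defensive-input sketch (the function names and limits are illustrative, not from the slides): bound the length of line input instead of trusting it, and range-check date fields rather than assuming they are valid.

```python
MAX_LINE = 256  # illustrative input limit, analogous to a fixed buffer size

def read_line_safely(raw):
    """Reject over-long input instead of letting it overflow a buffer."""
    if len(raw) > MAX_LINE:
        raise ValueError(f"input too long ({len(raw)} > {MAX_LINE} chars)")
    return raw.rstrip("\n")

def check_date(year, month, day):
    """Range-check a date instead of trusting it (e.g., require a 4-digit year
    rather than a Y2K-style 2-digit one)."""
    if not (1900 <= year <= 2100):
        raise ValueError(f"year out of range: {year}")
    if not (1 <= month <= 12):
        raise ValueError(f"month out of range: {month}")
    if not (1 <= day <= 31):
        raise ValueError(f"day out of range: {day}")
    return (year, month, day)

# Example usage
print(read_line_safely("2003-11-04\n"))
print(check_date(2003, 11, 4))
```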

18 Test Programming & Automation
Test software – “quick and dirty” or real?
Automation gotchas – e.g., using WinRunner
Testing the tests / use of simulators
Building in testability
Troubleshooting on regression and soak tests

19 Key Questions in Test Planning
Why test? And how do you know...
– what to test? (and what not to test)
  Should 100% of the work be reviewed and/or inspected?
  What level of “test coverage” should be targeted?
  What should the system be able to do at this point of the project?
– how to execute the tests?
– where a bug really is
  in the product? (which part?)
  in the test? (procedure or software)
– when you’ll be done testing? (so the product can be delivered to its customers)

20 Some Testing Experiences
MPLOT – expandability & test automation
Space Shuttle database – accommodating spare parts, impact of expanding the database structure, changing dash numbers
NavCal – ‘fixed’ bugs resurfacing, pseudo-code “N” error, table automation
Fleet Advisor – PCMCIA card removal, scale reading
Domino – interactions with French ISDN and Win DLLs
Pliant 3000 – integrating host and remote, SW + HW, GR303, SNMP

21 Contact Information
Phone:

22 References
– “Software Testing 101”, Rick Clements
– “Unit Testing – Approaches, Tools, and Traps”, Tom Rooker, presentation at RTP SPIN (Aug)
– “Software Testing and PR – What They Didn’t Teach You in School”, Bob Galen, presentation at RTP SPIN (Dec)
– Agile Testing (and links)
– “Test Infected: Programmers Love Writing Tests”, Kent Beck & Erich Gamma
– Exploratory testing articles by Cem Kaner, James Bach, and Bret Pettichord
– “Lessons Learned in Software Testing: A Context-Driven Approach”, Kaner, Bach & Pettichord, John Wiley & Sons (Dec)
– Bret Pettichord’s Publications, “Software Testing Hotlist”

