
1 Automatic Test Data Generation: Who, When and Where?
Jeff Offutt
Software Engineering, George Mason University, Fairfax, VA, USA
www.cs.gmu.edu/~offutt/
offutt@gmu.edu

2 OUTLINE
1. Industrial Software Problems
2. Automatic Test Data Generation
3. Input Validation Testing
4. Bypass Testing of Web Applications
5. The Future of Web Testing and ATDG

3 Mismatch in Needs and Goals
Industry wants testing to be simple and easy
– Testers with no background in computing or math
Universities are graduating scientists
– Industry needs engineers
Testing needs to be done more rigorously
Agile processes put lots of demands on testing
– Programmers have to do unit testing – with no training, education, or tools!
– Tests are key components of functional requirements – but who builds those tests?
Bottom line: lots of crappy software

4 Failures in Production Software
NASA's Mars lander, September 1999, crashed due to a units integration fault – over $50 million US!
Huge losses due to web application failures
– Financial services: $6.5 million per hour
– Credit card sales applications: $2.4 million per hour
In Dec 2006, amazon.com's BOGO offer turned into a double discount
2007: Symantec says that most security vulnerabilities are due to faulty software
Stronger testing could solve most of these problems
World-wide monetary loss due to poor software is staggering
(Thanks to Dr. Sreedevi Sampath)

5 How to Improve Testing?
Testers need to adopt practices and techniques that lead to more efficient and effective testing
– More education
– Different management organizational strategies
Testing / QA teams need more technical expertise
– Developer expertise has been increasing dramatically
Testing / QA teams need to specialize more
– This same trend happened for development in the 1990s
Testers need more and better software tools

6 Quality of Industry Tools
My student recently evaluated three industrial automatic unit test data generators
– JCrasher, TestGen, JUB
– Generate tests for Java classes
– Evaluated on the basis of mutants killed
Compared with two test criteria
– Random test generation (by hand)
– Edge coverage criterion (by hand)
Eight Java classes
– 61 methods, 534 LOC, 1070 mutants (muJava)
— Shuang Wang and Jeff Offutt, Comparison of Unit-Level Automated Test Generation Tools, Mutation 2009

7 Unit Level ATDG Results
These tools essentially generate random values!

8 Quality of Criteria-Based Tests
Two other students recently compared four test criteria
– Edge-pair, All-uses, Prime path, Mutation
– Generated tests for Java classes
– Evaluated on the basis of finding hand-seeded faults
Twenty-nine Java packages
– 51 classes, 174 methods, 2909 LOC
Eighty-eight hand-generated faults
— Nan Li, Upsorn Praphamontripong and Jeff Offutt, An Experimental Comparison of Four Unit Test Criteria: Mutation, Edge-Pair, All-uses and Prime Path Coverage, Mutation 2009

9 Criteria-Based Test Results
Researchers have invented very powerful techniques

10 Industry and Research Tool Gap
We cannot compare these two studies directly
However, we can summarize their conclusions:
– Industrial test data generators are ineffective
– Edge coverage is much better than the tests the tools generated
– Edge coverage is by far the weakest criterion
Biggest challenge was hand generation of tests
Software companies need to test better
Luckily, we have lots of room for improvement!

11 Four Roadblocks to Adoption
1. Lack of test education
– Bill Gates says half of MS engineers are testers, and programmers spend half their time testing
– Number of undergrad CS programs in the US that require testing? 0
– Number of MS CS programs in the US that require testing? 0
– Number of undergrad testing classes in the US? ~30
2. Necessity to change process
– Adoption of many test techniques and tools requires changes in the development process
– This is very expensive for large software companies
3. Usability of tools
– Many testing tools require the user to know the underlying theory to use them
– Do we need to know how an internal combustion engine works to drive? Do we need to understand parsing and code generation to use a compiler?
4. Weak and ineffective tools
– Most test tools don't do much – but most users do not know it!
– Few tools solve the key technical problem – generating test values automatically

12 OUTLINE
1. Industrial Software Problems
2. Automatic Test Data Generation
3. Input Validation Testing
4. Bypass Testing of Web Applications
5. The Future of Web Testing and ATDG

13 Automatic Test Data Generation
ATDG tries to create effective test input values
– Values must match syntactic input requirements
– Values must satisfy semantic goals
The general problem is formally unsolvable
Syntax depends on the test level
– System: Create inputs based on user-level interaction
– Unit: Create inputs for method parameters and non-local variables
Semantic goals vary
– Random values
– Special values, invalid values
– Satisfy test criteria
I will start by considering test criteria applied to program units
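As a point of reference for the weakest semantic goal above, here is a minimal sketch of what purely random unit-level generation amounts to. The choice of Math.max as the method under test and the [-100, 100] range are illustrative assumptions, not from the talk; a real tool would also have to construct objects and call sequences.

```java
import java.lang.reflect.Method;
import java.util.Random;

// Sketch: random test data generation for a unit, via reflection.
public class RandomValueGen {
    private static final Random RAND = new Random();

    // Produce a random argument for a parameter type (primitives only here).
    static Object randomArg(Class<?> type) {
        if (type == int.class) return RAND.nextInt(201) - 100; // int in [-100, 100]
        if (type == boolean.class) return RAND.nextBoolean();
        throw new IllegalArgumentException("unsupported type: " + type);
    }

    public static void main(String[] args) throws Exception {
        Method m = Math.class.getMethod("max", int.class, int.class);
        for (int i = 0; i < 5; i++) {
            Object a = randomArg(int.class), b = randomArg(int.class);
            // No oracle: the tool only records outputs for a human to inspect.
            System.out.printf("max(%s, %s) = %s%n", a, b, m.invoke(null, a, b));
        }
    }
}
```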

14 Unit Level ATDG Origins
Late '70s, early '80s †
– Fortran and Pascal functions
– Symbolic execution to create constraints and LP-like solvers to find values
– 10-15 line functions; algorithms often failed at statement coverage
Early '90s ††
– Heuristics for solving constraints
– Revised algorithms for symbolic evaluation
– Larger functions; edge coverage, >90% data flow, >80% mutation
Mid to late '90s †††
– Dynamic symbolic evaluation (concolic)
– Dynamic domain reduction algorithm for solving constraints
– Handled loops, arrays, pointers; >90% mutation scores
Current: search-based procedures
† Boyer, Elspas, and Levitt. SELECT – a formal system for testing and debugging programs by symbolic execution. SIGPLAN Notices, 10(6), June 1975
Clarke. A system to generate test data and symbolically execute programs. TSE, 2(3):215-222, September 1976
Ramamoorthy, Ho, and Chen. On the automated generation of program test data. TSE, 2(4):293-300, December 1976
Howden. Symbolic testing and the DISSECT symbolic evaluation system. TSE, 3(4), July 1977
Darringer and King. Applications of symbolic execution to program testing. IEEE Computer, 11(4), April 1978
†† Korel. Automated software test data generation. TSE, 16(8):870-879, August 1990
DeMillo and Offutt. Constraint-based automatic test data generation. TSE, 17(9):900-910, September 1991
††† Korel. Dynamic method for software test data generation. Software Testing, Verification and Reliability, 2(4):203-213, 1992
Jeff Offutt, Zhenyi Jin and Jie Pan. The dynamic domain reduction approach to test data generation. SP&E, 29(2):167-193, January 1999

15 Dynamic Domain Reduction
Previous techniques generated complete systems of constraints to satisfy test requirements
– Memory requirements blow up quickly
DDR does its work "on the fly":
1. Defines an initial symbolic domain for each input variable
2. Picks a test path through the program
3. Symbolically evaluates the path, reducing the input domains at each branch
4. Evaluates expressions with domain-symbolic algorithms
5. After walking the path, values in the input variables' domains ensure execution of the path
6. If a domain is empty, the path is re-evaluated with different decisions at branches
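A minimal sketch of step 3, the per-branch domain reduction, assuming integer interval domains and a midpoint split heuristic. The initial domains of [-10, 10] are illustrative; the real DDR algorithm also evaluates expressions symbolically (step 4) and backtracks when a domain becomes empty (step 6).

```java
// Sketch: reduce interval domains at one branch so that "a < b" must hold.
final class Interval {
    final int lo, hi;                               // closed interval [lo, hi]
    Interval(int lo, int hi) { this.lo = lo; this.hi = hi; }
    boolean isEmpty() { return lo > hi; }           // empty domain => backtrack
    @Override public String toString() { return "[" + lo + ", " + hi + "]"; }
}

public class DomainReduction {
    // Shrink the domains of a and b so every remaining choice satisfies a < b.
    static Interval[] takeTrueBranch(Interval a, Interval b) {
        // Pick a split point inside the overlap of the two domains
        // (midpoint heuristic; DDR can revisit this choice when backtracking).
        int lo = Math.max(a.lo, b.lo), hi = Math.min(a.hi, b.hi);
        int split = lo + (hi - lo) / 2;             // assumes the domains overlap
        Interval aNew = new Interval(a.lo, Math.min(a.hi, split));     // a <= split
        Interval bNew = new Interval(Math.max(b.lo, split + 1), b.hi); // b >= split+1
        return new Interval[] { aNew, bNew };
    }

    public static void main(String[] args) {
        Interval[] r = takeTrueBranch(new Interval(-10, 10), new Interval(-10, 10));
        System.out.println("a in " + r[0] + ", b in " + r[1]);
        // Prints: a in [-10, 0], b in [1, 10] -- any choice takes the true branch
    }
}
```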

16 DDR Example
[Figure: control-flow graph of the mid program, nodes 1-10, with assignments mid = z, mid = y, mid = x and branch conditions y < z, y >= z, x < y, x >= y, x < z, x > z, x > y, x <= y; initial domains for x, y, and z shown at the top]
Test path [1 2 3 5 10]
1. Edge (1, 2): y < z, split point is 0
2. Edge (2, 3): x >= y, split point is -5
3. Edge (3, 5): x < z, split point is 2
Any values from the reduced domains for x, y, and z will execute test path [1 2 3 5 10]
For example: (x = 0, y = -10, z = 8)
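The program in this figure appears to be the classic "mid" example (return the middle value of three integers). Below is a hypothetical Java reconstruction from the branch conditions shown, with the slide's node numbers in comments; with (x = 0, y = -10, z = 8) it follows test path [1 2 3 5 10].

```java
// Hypothetical reconstruction of the program under test on this slide.
public class Mid {
    static int mid(int x, int y, int z) {
        int mid = z;                     // node 1
        if (y < z) {                     // edge (1,2) when y < z
            if (x < y)                   // node 2; edge (2,3) when x >= y
                mid = y;                 // node 4
            else if (x < z)              // node 3; edge (3,5) when x < z
                mid = x;                 // node 5
        } else {                         // node 6
            if (x > y)
                mid = y;                 // node 7
            else if (x > z)              // node 8
                mid = x;                 // node 9
        }
        return mid;                      // node 10
    }

    public static void main(String[] args) {
        // The slide's example values execute test path [1 2 3 5 10]:
        System.out.println(mid(0, -10, 8)); // prints 0
    }
}
```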

17 ATDG Adoption
These algorithms are very complicated
– But very powerful
Three companies have attempted to build commercial tools based on these algorithms
– Two failed and generate random values
– Agitar created Agitator, which used algorithms very similar to the DDR …
– But Agitar went out of business
Search-based procedures are easier but less effective
A major question is how to solve ATDG beyond the unit testing level?
– For example … web applications?

18 OUTLINE
1. Industrial Software Problems
2. Automatic Test Data Generation
3. Input Validation Testing
4. Bypass Testing of Web Applications
5. The Future of Web Testing and ATDG

19 Validating Inputs
Input validation: deciding if input values can be processed by the software
Before starting to process inputs, wisely written programs check that the inputs are valid
How should a program recognize invalid inputs?
What should a program do with invalid inputs?
If the input space is described as a grammar, a parser can check for validity automatically
– This is very rare
– It is easy to write input validators – but also easy to make mistakes!
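As a small illustration of the grammar idea: a regular expression is a grammar for a simple input language, and the regex engine is the parser that checks validity automatically. A sketch in Java (the US phone number format is a made-up example, not one from the talk):

```java
import java.util.regex.Pattern;

public class PhoneValidator {
    // The "grammar" for the input, written as a regular expression:
    // three digits, dash, three digits, dash, four digits.
    private static final Pattern PHONE = Pattern.compile("\\d{3}-\\d{3}-\\d{4}");

    static boolean isValid(String input) {
        return input != null && PHONE.matcher(input).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("703-555-0123")); // true
        System.out.println(isValid("703-555-012"));  // false: too short
        System.out.println(isValid("703.555.0123")); // false: wrong separators
    }
}
```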

20 Representing Input Domains
Goal domains are often irregular
Goal domain for credit cards †
– First digit is the Major Industry Identifier
– First 6 digits and length specify the issuer
– Final digit is a "check digit"
– Other digits identify a specific account
Common specified domain
– First digit is in { 3, 4, 5, 6 } (travel and banking)
– Length is between 13 and 16
Common implemented domain
– All digits are numeric
† More details are on: http://www.merriampark.com/anatomycc.htm
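For real credit card numbers the "check digit" is verified with the Luhn algorithm. A sketch of that check in Java (the sample numbers below are made-up values chosen to pass and fail the check, not real accounts):

```java
public class CheckDigit {
    // Luhn check: from the right, double every second digit, subtract 9 from
    // any product over 9, and require the total to be a multiple of 10.
    static boolean luhnValid(String digits) {
        int sum = 0;
        boolean doubleIt = false;            // double every second digit from the right
        for (int i = digits.length() - 1; i >= 0; i--) {
            int d = digits.charAt(i) - '0';
            if (doubleIt) {
                d *= 2;
                if (d > 9) d -= 9;           // same as summing the product's digits
            }
            sum += d;
            doubleIt = !doubleIt;
        }
        return sum % 10 == 0;
    }

    public static void main(String[] args) {
        System.out.println(luhnValid("4539148803436467")); // true: check digit is valid
        System.out.println(luhnValid("4539148803436468")); // false: check digit off by one
    }
}
```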

21 Representing Input Domains
[Figure: three nested regions – desired inputs (goal domain) inside described inputs (specified domain) inside accepted inputs (implemented domain)]
The region of accepted inputs outside the goal domain is a rich source of software errors …
… and security vulnerabilities!!!

22 OUTLINE
1. Industrial Software Problems
2. Automatic Test Data Generation
3. Input Validation Testing
4. Bypass Testing of Web Applications
5. The Future of Web Testing and ATDG

23 Web Application Input Validation
[Figure: a client sends data to a server holding sensitive data; the client checks data before sending, but malicious data can "bypass" that checking]
Bad data corrupts the database, crashes the server, and causes security violations

24 Bypass Testing
Web apps often validate on the client (with JavaScript)
Users can "bypass" the client-side constraint enforcement by skipping the JavaScript
Bypass testing constructs tests to intentionally violate validation constraints
– Eases test automation
– Validates input validation
– Checks robustness
– Evaluates security
Case study on commercial web applications …
— Offutt, Wu, Du and Huang, Bypass Testing of Web Applications, ISSRE 2004
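A minimal sketch of a bypass test in Java, assuming a hypothetical registration form at http://example.com/register whose client-side JavaScript restricts the name and age fields: POSTing directly to the server skips the JavaScript entirely, so the server receives values a browser user could never submit.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BypassTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Values that intentionally violate the client-side constraints:
        // a script tag in "name" and a negative "age".
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://example.com/register"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("name=<script>&age=-1"))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // A robust server must re-validate and respond gracefully; a 500
        // status or a stack trace in the body counts as a test failure.
        System.out.println("Status: " + response.statusCode());
    }
}
```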

25 Bypass Testing Results
— Vasileios Papadimitriou, Automating Bypass Testing for Web Applications, Masters thesis, GMU 2006

26 Theory to Practice – Bypass Testing
Six screens tested from "production ready" software
Tests are invalid inputs – exceptions are expected
Effects on back-end were not checked

Web Screen           | Tests | Failing Tests | Unique Failures
Points of Contact    |  42   |      23       |      12
Time Profile         |  53   |       2       |       3
Notification Profile |  34   |      12       |       6
Notification Filter  |  26   |      16       |       7
Change PIN           |   5   |       1       |       1
Create Account       |  24   |      17       |      14
TOTAL                | 184   |      92       |      63

33% "efficiency" rate is spectacular!
— Offutt, Wang and Ordille, An Industrial Case Study of Bypass Testing on Web Applications, ICST 2008

27 OUTLINE
1. Industrial Software Problems
2. Automatic Test Data Generation
3. Input Validation Testing
4. Bypass Testing of Web Applications
5. The Future of Web Testing and ATDG

28 21st Century Software Testing
We are going through a time of change
Industry is going through a revolution in what testing means to the success of software products
Today's software market:
– is much bigger
– is more competitive
– has more users
Agile processes put increased pressure on testers
More safety critical, real-time, embedded software
– Software defines behavior
Security is now all about software faults
– Secure software is reliable software
The web offers a new deployment platform
– Very competitive and available to more users
– Web apps are distributed
– Web apps must be highly reliable
Industry desperately needs our inventions!

29 Major Problems with ATDG
ATDG is not used because
– Existing tools only support weak ATDG or are extremely difficult to use
– Tools are difficult to develop
– Companies are unwilling to pay for tools
Researchers want theoretical perfection
– Testers expected to recognize infeasible test requirements
– Tools expected to satisfy all test requirements
This requires testers to become experts in ATDG!
Practical testers want easy-to-use engineering tools that make software better – not perfect tools!

30 Needed
ATDG tools must be integrated into development
Unit level ATDG tools must be designed for developers
ATDG tools must be easy to use
ATDG tools must give good tests … but not perfect tests

31 A Practical Unit-Level ATDG Tool
Principles:
– Users must not be required to know testing
– Tool must ignore theoretical problems of completeness and infeasibility – an engineering approach
– Tool must integrate with the IDE
– Must automate tests in JUnit
Process:
– After my unit compiles cleanly, ATDG kicks in
– Generates tests, runs them, returns a list of results
– If any results are wrong, the tester can start debugging
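A hypothetical example of what such a tool might emit for the mid() method sketched earlier: an ordinary JUnit test the developer can read and re-run, where the expected value is the output observed at generation time and the developer only has to judge whether it is correct.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class MidGeneratedTest {
    // Generated to cover test path [1 2 3 5 10]; the input values came from
    // the reduced domains, and the expected value is the output observed
    // when the test was generated.
    @Test
    public void midPath_1_2_3_5_10() {
        assertEquals(0, Mid.mid(0, -10, 8));
    }
}
```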

32 A Practical Unit-Level ATDG Tool
A power level dial should be available:
– Level 1 (Edge coverage)
– Level 2 (Edge-pair coverage)
– Level 3 (Prime path coverage)
– Level 4 (Active clause coverage)
– Level 5 (All-uses coverage)
– Level 6 (Mutation coverage)
Theoretical compromises:
– Infeasible test requirements simply ignored
– 100% coverage is not required
Advanced:
– Return a report on coverage
– Let developers mark infeasible test requirements (or subpaths)

33 A Practical System-Level ATDG Tool
Principles:
– Tests should be based on an input domain description
– The input domain should be extracted from the UI
– Tool must not need source
– Tests must be automated
– Humans must be allowed to provide values and tests
Process:
– Tests should be created as soon as the system is integrated (ATDG part of the integration tool)
– Should support testers, allowing them to accept, override, or modify any parameters and test values

34 Summary
Researchers strive for perfect solutions
Universities teach CS students to be theoretically very strong – almost mathematicians
Industry needs usable, useful engineering tools
Industry needs engineers to develop software
ATDG is ready for technology transition
A successful tool should probably be free – open source

35 Contact
Jeff Offutt
offutt@gmu.edu
http://cs.gmu.edu/~offutt/

