
1 Automated Generation of Test Suites from Formal Specifications
Alexander K. Petrenko
Institute for System Programming of the Russian Academy of Sciences (ISP RAS), Moscow
February 2000

2–3 Ideal Testing Process
Why formal specification? What kinds of specifications?
Forward engineering: design specifications provide the sources, the coverage criteria, the oracles, and the partition.
Reverse engineering: post-hoc specifications provide the sources, the criteria, the oracles, and the partition.
Pre- and post-conditions and invariants are used for oracles and for the partition; algebraic specifications are used for test sequences.

4 KVEST project history
Started in 1994, under contract with Nortel Networks, to develop a system that automatically generates test suites for regression testing from formal specifications reverse-engineered from the existing code.
A collaboration between Nortel Networks and ISP RAS.
ISP RAS background:
—Soviet Space Mission Control Center OS and networks;
—OS and real-time programming language of the Soviet space shuttle “Buran”;
—formal specification of that real-time programming language.

5 What is KVEST?
KVEST: Kernel Verification and Specification Technology.
Area of application: specification, test generation, and test execution for APIs such as an OS kernel interface.
Specification language: RAISE/RSL (VDM family).
Specification style: state-oriented, implicit (pre- and post-conditions, subtype restrictions).
Target language: a programming language such as C/C++.
Size of application: over 600 Klines. Size of specifications: over 100 Klines. Size of test suites: over 2 Mlines.
Results: over a hundred errors have been detected in several projects.

6 Position
Constraint specification.
Semi-automated test production.
Fully automated test execution and test result analysis.
Oriented toward use in industrial software development processes.

7 Research and design problems
Test system architecture.
Mapping between specification and programming languages.
Integration of generated and manual components; re-use of manual components.
Test sequence and test case generation.

8 Verification processes
Reverse engineering: (post-) specification, then testing based on the specification.
Forward engineering: specification design, development, test production.
Co-verification: specification design, with simultaneous development and test production.

9 Reverse engineering: Technology stream
Phase 1, interface definition: from the source code and the actual documentation, the software contract contents are extracted (interface A1, interface A2, ...).
Phase 2, specification: the contract is formalized.
Phase 3, test suite production: test plans, test drivers, and test cases are generated.
Phase 4, test execution analysis: detected-error and test coverage reports are produced.

10 Key features of KVEST test suites
Phase 1: a minimal and orthogonal API (Application Programming Interface) is determined.
Phase 2: a formal specification of the API is developed in the RAISE Specification Language.
Phase 3: sets of test suites (test cases and test sequences) are generated automatically in the target language.
Phase 4: the generated test suites are executed automatically. A pass/fail verdict is assigned to every test case execution, and an error summary is provided at the end of the run. The user can specify the completeness of the test coverage and the form of tracing.

11 An example of a specification in RAISE

DAY_OF_WEEK : INT >< INT ~> RC >< WEEKDAY
DAY_OF_WEEK( tday, tyear ) as ( post_rc, post_Answer )
post
  if tyear <= 0 \/ tday <= 0 \/ tday > 366 \/
     tday = 366 /\ ~a_IS_LEAP( tyear )
  then
    BRANCH( bad_param, "Bad parameters" );
    post_Answer = 0 /\ post_rc = NOK
  else
    BRANCH( ok, "OK" );
    post_Answer = ( a_DAYS_AFTER_INITIAL_YEAR( tyear, tday )
                    + a_INITIAL_DAY_OF_WEEK ) \ a_DAYS_IN_WEEK /\
    post_rc = OK
  end
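To suggest what an oracle derived from such a post-condition does, here is a minimal Java sketch. It is only illustrative: the constants, the class name, and the auxiliary functions (isLeap, daysAfterInitialYear) are assumed stand-ins for the specification's a_IS_LEAP and a_DAYS_AFTER_INITIAL_YEAR, not code KVEST actually generates.

// Hedged sketch: a test oracle derived from the DAY_OF_WEEK post-condition.
// All names and constants are illustrative assumptions.
final class DayOfWeekOracle {
    static final int OK = 0, NOK = 1;         // assumed return codes
    static final int INITIAL_DAY_OF_WEEK = 1; // assumed constant
    static final int DAYS_IN_WEEK = 7;

    // Compares the (rc, answer) pair produced by the implementation
    // against the post-condition and returns a pass/fail verdict.
    static boolean check(int tday, int tyear, int rc, int answer) {
        boolean badParam = tyear <= 0 || tday <= 0 || tday > 366
                || (tday == 366 && !isLeap(tyear));
        if (badParam)                                   // "Bad parameters" branch
            return rc == NOK && answer == 0;
        int expected = (daysAfterInitialYear(tyear, tday)
                + INITIAL_DAY_OF_WEEK) % DAYS_IN_WEEK;  // "OK" branch
        return rc == OK && answer == expected;
    }

    static boolean isLeap(int y) {                      // stand-in for a_IS_LEAP
        return y % 4 == 0 && (y % 100 != 0 || y % 400 == 0);
    }

    static int daysAfterInitialYear(int year, int day) { // simplified stand-in
        int days = 0;
        for (int y = 1; y < year; y++) days += isLeap(y) ? 366 : 365;
        return days + day - 1;
    }
}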

12 Partition based on the specification
Specification:
  post
    if a \/ b \/ c \/ d /\ e
    then BRANCH( bad_param, "Bad parameters" )
    else BRANCH( ok, "OK" )
    end
Partition (branches and Full Disjunctive Normal Forms, FDNF):
  BRANCH "Bad parameters":
    a /\ b /\ c /\ d /\ e
    ~a /\ b /\ c /\ d /\ e
    ...
  BRANCH "OK":
    ~a /\ ~b /\ ~c /\ ~d /\ e
    ...
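To make the partition concrete, a small hedged Java sketch (illustrative only, not KVEST output) enumerates all 2^5 truth assignments of the atoms a..e and buckets each full conjunction under the branch it triggers. Note that /\ binds tighter than \/, so the guard reads a or b or c or (d and e).

import java.util.*;

// Hedged sketch: enumerating FDNF disjuncts of the guard and
// grouping them by the branch they exercise.
final class FdnfPartition {
    public static void main(String[] args) {
        List<String> badParam = new ArrayList<>(), ok = new ArrayList<>();
        String[] names = {"a", "b", "c", "d", "e"};
        for (int mask = 0; mask < 1 << 5; mask++) {
            boolean[] v = new boolean[5];
            for (int i = 0; i < 5; i++) v[i] = (mask & 1 << i) != 0;
            boolean guard = v[0] || v[1] || v[2] || (v[3] && v[4]);
            StringBuilder d = new StringBuilder();
            for (int i = 0; i < 5; i++)
                d.append(i > 0 ? " /\\ " : "").append(v[i] ? "" : "~").append(names[i]);
            (guard ? badParam : ok).add(d.toString()); // bucket the full conjunction
        }
        System.out.println("BRANCH \"Bad parameters\": " + badParam.size() + " disjuncts");
        System.out.println("BRANCH \"OK\": " + ok.size() + " disjuncts");
        ok.forEach(System.out::println); // includes ~a /\ ~b /\ ~c /\ ~d /\ e
    }
}

Running it shows 29 disjuncts under "Bad parameters" and 3 under "OK", matching the samples on the slide.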

13 Test execution scheme
The test suite generators derive test drivers and test case parameters from the specifications. On the target platform, the test harness feeds the test cases to the SUT (system under test); on the UNIX side, a program behavior model computes the expected outcome. Comparing the two yields the verdict and the trace.

14 Test execution management
On the UNIX workstation, the repository and the Navigator provide test suite generation, repository browsing, and test plan runs. On the target platform, the test bed provides process control, communication, and basic data conversion. The test suite itself consists of a script driver, basic drivers, and MDCs (Manually Developed Components).

15 KVEST Test Drivers
Hierarchy of test drivers (sketched in the code below):
—Basic test drivers: test a single procedure by receiving input, calling the procedure, recording the output, and assigning a verdict.
—Script drivers: generate sets of input parameters, call basic drivers, evaluate the results of test sequences, and monitor test coverage.
—Test plans: define the order of script driver calls with given test options and check their execution.
KVEST uses a set of script driver skeletons to generate script drivers.
Test drivers are compiled from RAISE into the target language.
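A hedged Java sketch of the two lower driver levels; the names, the toy SUT, and the trivial oracle are assumptions (KVEST generates such drivers from RAISE in the target language, it does not ship this code).

// Hedged sketch of the driver hierarchy; all names are illustrative.
enum Verdict { PASS, FAIL }

final class Drivers {
    // Basic test driver: receive input, call the single procedure under
    // test, record the output, assign a verdict via the oracle.
    static Verdict basicDriver(int tday, int tyear) {
        int answer = sutDayOfWeek(tday, tyear);                 // call the SUT
        boolean valid = tday >= 1 && tday <= 365 && tyear >= 1; // simplified oracle
        return valid == (answer >= 0) ? Verdict.PASS : Verdict.FAIL;
    }

    // Stand-in for the implementation under test.
    static int sutDayOfWeek(int tday, int tyear) {
        return tday >= 1 && tday <= 365 && tyear >= 1 ? tday % 7 : -1;
    }

    // Script driver: generate sets of input parameters, call the basic
    // driver for each tuple, and summarize the test sequence.
    public static void main(String[] args) {
        int failures = 0, total = 0;
        for (int tday : new int[]{-1, 0, 1, 200, 365, 366})
            for (int tyear : new int[]{0, 1, 1999, 2000}) {
                total++;
                if (basicDriver(tday, tyear) == Verdict.FAIL) failures++;
            }
        System.out.println(total - failures + "/" + total + " test cases passed");
    }
}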

16 Test generation scheme
Inputs: RAISE specifications and script driver skeletons.
Tools (UNIX): basic driver generator, script driver generator, test case generator, and a RAISE-to-target-language compiler.
Outputs: basic drivers, script drivers, and test case parameters, which together form the test suites run on the target platform.

17 Test generation scheme, details
The scheme above is completed by the Manually Developed Components that feed the generators: iterators, data converters, state observers, and filters (a sketch follows below).
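A hedged Java sketch of what these manually developed components might look like; the interfaces and names are assumed for illustration, not KVEST's real plug-in points.

import java.util.*;

// Hedged sketch of Manually Developed Components (assumed interfaces).
final class MdcSketch {
    // Data type iterator: supplies representative, hand-chosen
    // boundary values of one type.
    static Iterator<Integer> dayIterator() {
        return Arrays.asList(-1, 0, 1, 59, 60, 365, 366, 367).iterator();
    }

    // Data converter: maps a model-level value to the implementation's
    // representation (trivial here).
    static int toTargetRepr(int modelDay) { return modelDay; }

    // Filter: rejects parameter tuples whose pre-condition fails, so the
    // script driver issues only meaningful calls (illustrative condition).
    static boolean preDayOfWeek(int tday, int tyear) {
        return tyear > -10000;
    }

    // State observer: reads the part of the SUT state that the
    // post-condition mentions (trivial for a stateless procedure).
    static int observeState() { return 0; }

    public static void main(String[] args) {
        for (Iterator<Integer> it = dayIterator(); it.hasNext(); ) {
            int day = it.next();
            if (preDayOfWeek(day, 2000))
                System.out.println("would test with tday = " + toTargetRepr(day));
        }
    }
}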

18 Test sequence generation based on an implicit Finite State Machine (FSM)
–Partition based on pre- and post-conditions.
–Implicit FSM definition.
(Diagram: states S1–S4 with transitions labeled op1, op2, op3.)

19 Test sequence generation based on an implicit FSM
Each FDNF disjunct of the partition is associated with an operation, i.e. with a transition of the FSM:
  BRANCH "Bad parameters":
    a /\ b /\ c /\ d /\ e     -- op1
    ~a /\ b /\ c /\ d /\ e    -- op2
    ...
  BRANCH "OK":
    ~a /\ ~b /\ ~c /\ ~d /\ e -- op i
    ...
(Diagram: the same FSM, states S1–S4, with the transitions exercised by op1–op3.)

20 Conclusions from the KVEST experience
Code inspection during formal specification can detect up to 1/3 of the errors.
Code inspection cannot replace testing: up to 2/3 of the errors are detected during and after testing.
Testing is necessary to develop correct specifications: up to 1/3 of the errors were caused by lack of knowledge of the pre-conditions and of some details of the called procedures’ behavior.

21 What part of the testware is generated automatically?

Kind of source for test generation                   | Percentage in the sources | Source size : generated size | Kind of generation result
Specification                                        | 50                        | 1:5                          | Basic drivers
Data converters, iterators and state observers (MDC) | 50                        | 1:10                         | Script drivers

22 Solved and unsolved problems in test automation

Phase                            | Automated, or simple                                     | Not automated and not simple
Phase 1: Interface definition    | For well-designed software                               | For legacy software
Phase 2: Specification           | For well-designed software                               | For legacy software
Phase 3: Test suite production   | Test oracles, partition, filters (single operations)    | Test sequence design for operation groups
Phase 4: Test execution analysis | Test plans, execution and analysis, browsing, reporting | Test result understanding

23 Specification based testing: problems and prospects
Problems:
–Lack of correspondence between specification languages and programming languages.
–Users resist learning any specification language and any additional SDE (software development environment).
–Methodology of test sequence generation.
–Testing methodologies for specific software areas.
Prospects:
–Use a specification extension of an OO programming language and a standard SDE instead of a specific SDE.
–FSM extraction from implicit specifications; FSM factorization.
–Research on specification and testing of distributed software.

24 Part II. KVEST revision

25 Specification notation revision. UniTesK: Universal TEsting and Specification toolKit
Formal methods deployment problems:
—lack of users with a theoretical background;
—lack of tools;
—non-conventional languages and paradigms.
UniTesK solutions:
—the first step is possible without “any theory”;
—extension of C++ and Java (sketched below);
—integration with a standard software development environment.
Related work: ADL/ADL2; Eiffel, Larch, iContract.
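In the spirit of the related work (Eiffel, iContract, ADL), the flavour of such an extension can be suggested with plain Java assertions. This is only a hedged illustration of the pre/post-condition style; it is not the actual UniTesK notation, and the class is invented.

// Hedged sketch: contract-style pre- and post-conditions in plain Java,
// in the spirit of Eiffel/iContract. Not UniTesK syntax.
final class Account {
    private int balance;

    // pre:  amount > 0
    // post: balance = old(balance) + amount
    int deposit(int amount) {
        assert amount > 0 : "pre-condition violated";
        int oldBalance = balance;          // snapshot for the post-condition
        balance += amount;
        assert balance == oldBalance + amount : "post-condition violated";
        return balance;
    }

    public static void main(String[] args) {
        System.out.println(new Account().deposit(10)); // run with -ea to check contracts
    }
}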

26 UniTesK: Test generation scheme
Inputs: specifications in a Java or C++ extension, plus use cases.
Tools: a test oracle generator and an OO test suite generator, supported by iterators, FSM path-builder engines, and a test sequence factory.
Outputs: test oracles and test suites in the target language, run on the target platform.

27 Integration of constraint verification tools into the software development environment
A UML-based design environment and a standard Software Development Environment are combined with specification and verification tools for the standard notation.

28 Part III. Test generation inside

29 Requirements. Test coverage criteria
–All branches.
–All disjuncts (all accessible disjuncts).
Specification:
  post
    if a \/ b \/ c \/ d /\ e
    then BRANCH( bad_param, "Bad parameters" )
    else BRANCH( ok, "OK" )
    end
Partition (branches and Full Disjunctive Normal Forms, FDNF):
  BRANCH "Bad parameters":
    a /\ b /\ c /\ d /\ e
    ~a /\ b /\ c /\ d /\ e
    ...
  BRANCH "OK":
    ~a /\ ~b /\ ~c /\ ~d /\ e
    ...

30 Test sequence kinds: kinds 1, 2, and 3
Procedures of these kinds can be tested separately, because no other target procedure is needed to generate input parameters or to analyze the outcome.
—Kind 1: the input is data that can be represented in literal (text) form and produced without accounting for any interdependencies between the values of different parameters.
—Kind 2: no interdependencies exist between the input items (values of input parameters); the input need not be in literal form.
—Kind 3: some interdependencies exist, but separate testing is still possible.

31 Kinds 1, 2, and 3: what is automated?

Kind | Automatically                                | Manually
1    | Everything                                   | Nothing
2    | Test sequences and parameter tuple iterators | Data type iterators
3    | Test sequences                               | Parameter tuple iterators

A sketch contrasting the two tuple-iteration styles follows below.
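A hedged Java sketch of the difference: for kind 2, a plain cartesian product over the per-type iterators suffices and can be generated; for kind 3, the tuple iterator must encode the interdependencies by hand. The constraint used here (tday must fit within tyear) is an assumed example, not taken from KVEST.

import java.util.*;

// Hedged sketch: parameter tuple iteration for kinds 2 and 3.
final class TupleIterators {
    static final int[] DAYS  = {1, 59, 60, 365, 366};
    static final int[] YEARS = {1, 1999, 2000};

    public static void main(String[] args) {
        // Kind 2: no interdependencies -- the full cartesian product
        // over the data type iterators can be generated automatically.
        List<int[]> kind2 = new ArrayList<>();
        for (int d : DAYS) for (int y : YEARS) kind2.add(new int[]{d, y});

        // Kind 3: interdependencies exist -- the tuple iterator is
        // written by hand to enumerate only consistent combinations.
        List<int[]> kind3 = new ArrayList<>();
        for (int d : DAYS) for (int y : YEARS)
            if (d <= (isLeap(y) ? 366 : 365)) kind3.add(new int[]{d, y});

        System.out.println("kind 2 tuples: " + kind2.size()
                + ", kind 3 tuples: " + kind3.size());
    }

    static boolean isLeap(int y) { return y % 4 == 0 && (y % 100 != 0 || y % 400 == 0); }
}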

32 Test sequence kinds: kinds 4 and 5
The operations cannot be tested separately, because some of the input can be produced only by calling another operation from the group, and/or some of the outcome can be analyzed only by calling other procedures.

33 Requirements for kinds 4 and 5
The same requirements apply: all branches / all disjuncts.
Additional problem: how to traverse all the states?

34 FSM use for API testing
Traditional FSM approach (explicit FSM definition):
—define all states;
—for each state, define all transitions (operation, input parameters, outcome, next state).
ISP RAS approach (implicit FSM definition, sketched below):
—the state is defined by a type definition;
—for each state, the applicable operations and inputs are defined by pre-conditions, and the outcome and next state are defined by post-conditions.
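A hedged Java sketch of the implicit-FSM idea: states are never enumerated up front; the traversal discovers them on the fly, using pre-conditions to select applicable operations and a model of the post-condition to compute the next state. The state type, operations, and guards are invented for illustration.

import java.util.*;
import java.util.function.*;

// Hedged sketch of implicit-FSM test sequence generation.
// State = Integer; operations = guarded transitions; illustrative only.
final class ImplicitFsm {
    record Op(String name, IntPredicate pre, IntUnaryOperator post) {}

    public static void main(String[] args) {
        List<Op> ops = List.of(
            new Op("op1", s -> s < 3, s -> s + 1),  // pre: s < 3; post: s + 1
            new Op("op2", s -> s > 0, s -> s - 1),
            new Op("op3", s -> true,  s -> 0));

        // Breadth-first traversal: states are discovered as they are
        // reached, never listed in advance (the implicit definition).
        Deque<Integer> frontier = new ArrayDeque<>(List.of(0));
        Set<Integer> seen = new HashSet<>(List.of(0));
        while (!frontier.isEmpty()) {
            int s = frontier.poll();
            for (Op op : ops) {
                if (!op.pre().test(s)) continue;    // pre-condition filters operations
                int next = op.post().applyAsInt(s); // post-condition gives the next state
                System.out.println(s + " --" + op.name() + "--> " + next);
                if (seen.add(next)) frontier.add(next);
            }
        }
    }
}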

35 Advanced FSM use
—FSM factorization.
—Optimization of exhaustive FSM traversal.
—Use-case-based test sequence generation.
—Test scenario modularization.
—Friendly interface for test sequence generation and debugging.

36 References
–Igor Bourdonov, Alexander Kossatchev, Alexander Petrenko, and Dmitri Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608–621.
–Igor Burdonov, Alexander Kosachev, Victor Kuliamin. FSM using for Software Testing. Programming and Computer Software, Moscow / New York, No. 2, 2000.

37 Contacts
Alexander Petrenko
Institute for System Programming of the Russian Academy of Sciences (ISP RAS), Moscow, Russia
petrenko@ispras.ru
phone: +7 (095) 912-5317 ext 4404
fax: +7 (095) 912-1524
http://www.ispras.ru/~RedVerst/index.html

