Towards Automatic Generation of Parameterized Test Cases from Abstractions
Jens R. Calamé, Natalia Ioustinova, Jaco van de Pol
Centrum voor Wiskunde en Informatica, Amsterdam

Agenda
1. Testing Theory
2. Data Abstraction
3. Test Case Parameterization
4. Some Lemmas
5. Conclusion

Testing Theory
Conformance testing:
– An implementation Imp conforms to a specification Spec iff, for every trace t of Spec, Imp accepts all inputs after t but produces at most the outputs specified after t.
– Based on Tretmans' ioco theory
Test generation with the tool TGV (Test Generation with Verification techniques):
– Based on state enumeration
– Limitation: data often leads to state-space explosion
– Solution: data abstraction mitigates this limitation
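As a rough illustration of the output part of this relation (ignoring input-enabledness, quiescence and the full suspension-trace machinery of ioco), the check can be sketched as follows; the trace encoding and the function names are assumptions made only for this example:

```python
# Sketch only: ioco-style output containment on finite, explicitly listed traces.
# An LTS is modelled as a mapping from an observed trace (tuple of actions) to
# the set of outputs that may follow it.

def out(lts, trace):
    """Outputs possible after the given trace (empty set if the trace is unknown)."""
    return lts.get(trace, set())

def output_conforms(imp, spec):
    """After every specified trace, Imp may produce at most the specified outputs."""
    return all(out(imp, trace) <= allowed for trace, allowed in spec.items())

spec = {("getPin",): {"pinCorrect", "pinIncorrect"}}
print(output_conforms({("getPin",): {"pinCorrect"}}, spec))  # True
print(output_conforms({("getPin",): {"eatCard"}}, spec))     # False
```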

Our Test Generation Process
A system specification (e.g. in UML) is formalized into a system specification (formal). Data abstraction then yields an abstract system specification. Test case generation with TGV takes this abstract specification together with a test purpose and produces an abstract test case. From the abstract test case, constraint generation derives a rule system for data selection, and TTCN-3 generation produces a parameterizable test case (TTCN-3).

Data Abstraction
Motivation: input and output data from large (or infinite) domains lead to state explosion, making state-based test generation tools inapplicable.
Solution (borrowed from model checking): data abstraction
– Introduce a chaotic value ⊤ for each datatype D
– Lift the functions over the original values of D
– Replace input variables by ⊤ and propagate ⊤ through the system
– Introduce may functions for guards over a three-valued logic (semantics: may(⊤) = true)
– Consequence: the non-determinism introduces extra traces
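A minimal sketch of this abstraction scheme, with TOP, lift and may as invented helper names (the original is defined on µCRL specifications, not Python):

```python
# Sketch: extend each datatype with a chaotic value TOP (written ⊤ above),
# lift operations so that TOP propagates, and evaluate guards three-valued
# with may(TOP) = True.

TOP = object()  # the chaotic value

def lift(f):
    """Lift a concrete operation: the result is chaotic if any argument is."""
    def lifted(*args):
        return TOP if any(a is TOP for a in args) else f(*args)
    return lifted

leq = lift(lambda a, b: a <= b)
gt = lift(lambda a, b: a > b)

def may(v):
    """Three-valued guard evaluation: a chaotic guard may be taken."""
    return True if v is TOP else bool(v)

# Replacing the input y by TOP makes both guarded branches possible,
# which is exactly the extra non-determinism mentioned above.
balance, y = 100, TOP
print(may(leq(y, balance)))  # True -> the !Money(y) branch may be taken
print(may(gt(y, balance)))   # True -> the !LowBalance(b) branch may also be taken
```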

Data Abstraction – Example
[Figure: state machine of an ATM-like example, states 0–8, with transitions
?initPin(p), ?initBalance(b), ?getPin(x),
!pinCorrect [x = p], !pinIncorrect [x ≠ p],
?getBalance, !Balance(b),
?getAmount(y), !Money(y); b := b − y [y ≤ b], !LowBalance(b) [y > b]]
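To make the example concrete, the un-abstracted behaviour can be read as the following sketch (the original is a state machine in a µCRL specification, not Python; the function name atm and the return-value encoding are invented for illustration):

```python
# Sketch of the concrete example above: a PIN check followed by either a
# withdrawal (y <= b) or a low-balance response (y > b).

def atm(p, b, x, y):
    """Observable outputs of one run: initPin(p), initBalance(b), getPin(x), getAmount(y)."""
    if x != p:                          # guard x ≠ p
        return ["pinIncorrect"]
    outputs = ["pinCorrect"]            # guard x = p
    if y <= b:                          # guard y ≤ b
        outputs.append(("Money", y))
        b = b - y                       # assignment b := b - y
    else:                               # guard y > b
        outputs.append(("LowBalance", b))
    return outputs

print(atm(p=1234, b=100, x=1234, y=60))   # ['pinCorrect', ('Money', 60)]
print(atm(p=1234, b=100, x=1234, y=250))  # ['pinCorrect', ('LowBalance', 100)]
```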

Data Abstraction – Test Case by TGV

Test Case Parameterization
To be solved after data abstraction:
– Pruning of the traces introduced by overapproximation
– Finding possible value ranges for the test data
Rule system (Prolog):
– Represents the SUT specification
– Defines rules on data (addition, subtraction; and, or, etc.)
– Defines rules for the process behaviour of Spec
Query (Prolog):
– Represents the abstract test case (one rule per trace)
– All transitions of the trace form the query body
– Alternative: one query for the whole CTG (complete test graph)

Testing with Abstraction – Rule System (Prolog)
…
getPin(state(2,P,B,X,Y), state(3,P,B,X1,Y), param(X1)).
pinIncorrect(state(3,P,B,X,Y), state(8,P,B,X,Y), param(_)) :- X ≠ P.
pinCorrect(state(3,P,B,X,Y), state(4,P,B,X,Y), param(_)) :- X = P.
…

Testing with Abstraction – Query
oracle(P,B,X,Yin,Yout) :-
    initPin(state(0,0,0,0,0), G1, param(P)),
    initBalance(G1, G2, param(B)),
    getPin(G2, G3, param(X)),
    pinCorrect(G3, G4, _),
    getAmount(G4, G5, param(Yin)),
    money(G5, _, param(Yout)).
Solver answer:
P = P{-1.0inf..1.0inf}
B = B{-1.0inf..1.0inf}
X = P{-1.0inf..1.0inf}
Yin = Yout{-1.0inf..1.0inf}
Yout = Yout{-1.0inf..1.0inf}
Yout{-1.0inf..1.0inf} - B{-1.0inf..1.0inf} =< 0
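The answer constraints (X = P, Yin = Yout, Yout − B ≤ 0 over unbounded intervals) still have to be instantiated with concrete test data. A minimal data-selection sketch, using a naive finite search as a stand-in for the real constraint solver (the function name and the 0..100 domain are assumptions for illustration):

```python
# Sketch: pick concrete parameter values satisfying the constraints derived
# for the 'money' trace: X = P, Yin = Yout and Yout - B =< 0.
from itertools import product

def select_test_data(domain=range(0, 101)):
    for p, b, yout in product(domain, repeat=3):
        x, yin = p, yout                 # X = P, Yin = Yout
        if yout - b <= 0:                # Yout - B =< 0
            return {"P": p, "B": b, "X": x, "Yin": yin, "Yout": yout}
    return None                          # no solution in the chosen finite domain

print(select_test_data())  # e.g. {'P': 0, 'B': 0, 'X': 0, 'Yin': 0, 'Yout': 0}
```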

Test Execution
1. Pre-solve one trace to PASS statically (skipping internal steps)
2. Execute this trace until the SUT leaves it
3. Try to find another trace to PASS, then solve and execute it (see the sketch below)
4. If there is no trace to PASS: try to find a trace to INCONC and solve it
5. If there is no trace to INCONC either: set the test verdict to FAIL
[Figure: parameterized test case over the actions !initPin(P), !initBalance(B), !getPin(X), ?pinCorrect, ?pinInCorrect, ?lowBalance, !getAmount(Yin), !getBalance, ?Balance(x50), !getAmount(x60), ?Money(Yout), ?eatCard, with PASS and INCONC verdict states]
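A compact sketch of this adaptive execution loop; the helpers solve and run and the trace representation are assumptions standing in for the constraint solver and the SUT adapter, not part of the original tool:

```python
# Sketch of the adaptive test execution loop described above.

def execute(sut, pass_traces, inconc_traces, solve, run):
    remaining = list(pass_traces)
    trace = solve(remaining.pop(0))          # 1. pre-solve one trace to PASS
    while trace is not None:
        if run(sut, trace):                  # 2. execute; True = SUT stayed on the trace
            return "PASS"
        trace = None
        while remaining:                     # 3. try to find another trace to PASS
            candidate = solve(remaining.pop(0))
            if candidate is not None:
                trace = candidate
                break
    for candidate in inconc_traces:          # 4. otherwise look for a trace to INCONC
        if solve(candidate) is not None:
            return "INCONC"
    return "FAIL"                            # 5. no trace to INCONC either -> FAIL

# Tiny demo with stub solver/executor (illustration only):
verdict = execute(sut=None,
                  pass_traces=["money-trace", "balance-trace"],
                  inconc_traces=["lowBalance-trace"],
                  solve=lambda t: t,                    # pretend every trace is solvable
                  run=lambda sut, t: t == "balance-trace")
print(verdict)                                          # PASS
```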

Some Lemmas
– The original system Spec is simulated by Spec⊤ in all details (and some more).
– The control flow of the synchronous product Spec × TP is simulated by Spec⊤ × TP.
– The set of accepting/refusing traces of Spec × TP is a subset of that of Spec⊤ × TP.
– If the test oracle holds for a trace of the CTG under a certain valuation, then this trace is valid in the original system under that valuation.
– The test algorithm terminates with a sound verdict.
For further details and proofs, see the technical report.

Case Study: CEPS
Common Electronic Purse Specifications: a protocol for electronic payment with a multi-currency smart card
– Input and output parameters of card actions are mainly natural numbers, i.e. from an (in)finite domain
– Some variables are arrays (up to 16 elements in the simplified µCRL realization)
As a µCRL specification:
– 54 summands
– 44 process variables (net)
– 207 process variables (gross), due to arrays of structures

Case Study: CEPS (cont'd)
– Instantiation and reduction of the abstracted specification: ca. 16 min. on five 2.2 GHz 64-bit Athlon single-CPU machines (1 GB RAM each)
– Generation of two test cases (594 and 109 states, resp.): less than one second on one 2.2 GHz AMD Athlon XP 32-bit CPU with 1 GB RAM
– Constraint solving produces results in negligible time

Conclusion
– Data abstraction makes state-based test generation applicable to systems with large data domains
– The approach was successfully evaluated on the CEPS case study
Ongoing and future work:
– Redesign of the tools (add "real" constraint solving)
– On-the-fly constraint solving
– Treatment of τ-steps
– (Generation and) execution of TTCN-3 test cases
– Integration of UML as specification language

Related Links
– TT-Medal project
– Test generator TGV: www-verimag.imag.fr/~async/TGV
– Data abstraction tools
– Technical report and papers
Jens R. Calamé, Natalia Ioustinova, Jaco van de Pol