
1 Model-Based Testing with Labelled Transition Systems
Jan Tretmans
Embedded Systems Institute Eindhoven, NL
Radboud University Nijmegen, NL

2 Testing
Testing: checking or measuring some quality characteristics of an executing object by performing experiments in a controlled way, with respect to a specification.
[Figure: a tester, the IUT, and the specification.]

3 Types of Testing
[Figure: three dimensions of testing: level of detail (unit, module, integration, system), accessibility (white box, black box), and characteristics (functionality, usability, reliability, efficiency, maintainability, portability).]

4 Automated Model-Based Testing
[Figure: a test generation tool derives test cases (TTCN) from the model; a test execution tool runs them against the IUT, giving pass or fail verdicts; IUT passes tests ⇔ IUT conf-to model.]

5 Towards Model-Based Testing
- Increase in complexity, and quest for higher-quality software:
  - testing effort grows exponentially with complexity
  - testing cannot keep pace with development
- More abstraction, less detail: model-based development
- Checking quality:
  - practice: testing (ad hoc, too late, expensive, takes a lot of time)
  - research: formal verification (proofs, model checking, ...), with disappointing practical impact

6 Towards Model-Based Testing
- Model-based testing has the potential to combine:
  - practice: testing
  - theory: formal methods
- Model-based testing is testing with respect to a (formal) model / specification (state models, pre/post-conditions, CSP, Promela, UML, Spec#, ...)
- It promises better, faster, cheaper testing:
  - algorithmic generation of tests and test oracles: tools
  - formal and unambiguous basis for testing
  - measuring the completeness of tests
  - maintenance of tests through model modification

7 A Model-Based Development Process
[Figure: informal requirements, specification, design, realization (code); validation, formal verification, and testing connect the informal world, the world of models (formalizable, model-based), and the real world.]

8 Formal Verification
[Figure: in the real world, implementation i should satisfy ('sat') specification s; in the formal world, a model checker checks model m against s: m modcheck s.]
- sound: m modcheck s ⇒ m sat s
- complete: m sat s ⇒ m modcheck s
- assumption: m is a valid model of i

9 Formal Testing
[Figure: in the formal world, test generation derives a test suite T_s from specification s; in the real world, test execution runs T_s against the implementation IUT, resulting in IUT passes T_s or IUT fails T_s.]
Correctness of test generation: IUT passes T_s ⇔ IUT conf-to s ?

10 Approaches to Model-Based Testing
Several modelling paradigms:
- Finite State Machines
- pre/post-conditions
- Labelled Transition Systems
- programs as functions
- abstract data type testing
This talk: Labelled Transition Systems.

11 Model-Based Testing for LTS
[Figure: test generation derives tests from the model; test execution runs them against the IUT, which should be conf-to the model; verdict: pass / fail.]
Involves:
- model / specification
- implementation IUT, and models of IUTs
- correctness
- tests
- test generation
- test execution
- test result analysis (pass / fail)

12 Models of Specifications: Labelled Transition Systems
A labelled transition system ⟨S, L, T, s0⟩ consists of:
- states S
- actions (labels) L
- transitions T ⊆ S × (L ∪ {τ}) × S
- initial state s0 ∈ S
[Figure: example LTS with actions ?coin, ?button, !coffee, !alarm.]
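As an illustration, a minimal Python sketch of how such an LTS could be represented; all names are assumptions of this sketch, and the example transitions are only a guess at the pictured machine:

    # A labelled transition system <S, L, T, s0>; "tau" models the internal action.
    TAU = "tau"

    class LTS:
        def __init__(self, states, labels, transitions, s0):
            self.states = states            # S
            self.labels = labels            # L, observable actions (?inputs, !outputs)
            self.transitions = transitions  # T, a set of (state, action, state) triples
            self.s0 = s0                    # initial state, member of S

    # Illustrative coffee machine (transitions guessed, not taken from the figure):
    coffee = LTS(
        states={0, 1, 2},
        labels={"?coin", "?button", "!coffee", "!alarm"},
        transitions={(0, "?coin", 1), (1, "?button", 2),
                     (2, "!coffee", 0), (1, "!alarm", 0)},
        s0=0,
    )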

13 Example Models
[Figure: several example (input-enabled) transition systems of coffee machines, with inputs ?dub, ?kwart and outputs !coffee, !tea, !choc.]

14 Correctness: Implementation Relation ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
Intuition: i ioco-conforms to s iff
- if i produces output x after trace σ, then s can produce x after σ
- if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence δ)

15 Correctness: Implementation Relation ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
where
- p --δ--> p  ⇔  ∀ !x ∈ L_U ∪ {τ} : p cannot do !x   (quiescence)
- out(P)  =  { !x ∈ L_U | p --!x-->, p ∈ P }  ∪  { δ | p --δ--> p, p ∈ P }
- Straces(s)  =  { σ ∈ (L ∪ {δ})* | s ==σ==> }
- p after σ  =  { p' | p ==σ==> p' }
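To make these definitions concrete, here is a minimal Python sketch of out, after, and the ioco check; it assumes finite, τ-free, deterministic systems and reuses the LTS class sketched above, so it is an illustration rather than the tool implementation:

    DELTA = "delta"   # the quiescence observation

    def out(lts, states):
        """out(P): outputs enabled in the states of P, plus delta for quiescent states."""
        result = set()
        for p in states:
            outs = {a for (s, a, t) in lts.transitions if s == p and a.startswith("!")}
            result |= outs if outs else {DELTA}
        return result

    def after(lts, states, sigma):
        """States reachable from 'states' by the suspension trace sigma."""
        current = set(states)
        for a in sigma:
            if a == DELTA:
                current = {p for p in current if DELTA in out(lts, {p})}
            else:
                current = {t for (s, b, t) in lts.transitions
                           for p in current if s == p and b == a}
        return current

    def ioco(i, s, straces):
        """i ioco s, checked over a given finite set of suspension traces of s."""
        return all(out(i, after(i, {i.s0}, sigma)) <= out(s, after(s, {s.s0}, sigma))
                   for sigma in straces
                   if after(s, {s.s0}, sigma))   # only traces that s can actually do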

16 Implementation Relation ioco
[Figure: example coffee-machine implementations with inputs ?dub, ?kwart and outputs !coffee, !tea, !choc; some are ioco-conforming to the specification s and some are not.]

17 Test Cases
Model of a test case = a transition system that is:
- equipped with the 'quiescence' label θ
- tree-structured
- finite and deterministic
- with final states pass and fail
- from each state other than pass and fail:
  - either one input !a
  - or all outputs ?x and θ
[Figure: example test case with stimuli !dub, !kwart and observations ?coffee, ?tea, θ, ending in pass or fail.]

18 ioco Test Generation Algorithm
To generate a test case from a transition system specification with initial state s0: compute T(S), with S a set of states, initially S = s0 after ε.
For T(S), apply one of the following choices, recursively and non-deterministically:
1. end the test case: verdict pass
2. supply an input !a: continue with T(S after ?a)
3. observe outputs: for each allowed output (or δ), i.e. !x ∈ out(S), continue with T(S after !x); for each forbidden output (or δ), i.e. !y ∉ out(S), the verdict is fail.
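A possible rendering of this algorithm in Python, reusing the out and after helpers sketched above; it returns a test case as a nested dictionary and is illustrative only:

    import random

    def gen_test(spec, S, depth):
        """Generate a test case of bounded depth from the set S of specification states."""
        if depth == 0 or not S:
            return "pass"                                   # choice 1: end the test case
        if random.random() < 0.5:                           # choice 2: supply an input
            inputs = sorted({a for (s, a, t) in spec.transitions
                             if s in S and a.startswith("?")})
            if inputs:
                a = random.choice(inputs)
                return {a: gen_test(spec, after(spec, S, [a]), depth - 1)}
        test = {}                                           # choice 3: observe outputs
        for x in sorted(out(spec, S)):                      # allowed outputs, or delta
            test[x] = gen_test(spec, after(spec, S, [x]), depth - 1)
        test["otherwise"] = "fail"                          # any forbidden output: fail
        return test

    # Example use, with the illustrative 'coffee' specification sketched earlier:
    test_case = gen_test(coffee, after(coffee, {coffee.s0}, []), depth=4)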

19 Test Generation Example
[Figure: from the specification s (input ?dub followed by output !coffee), the algorithm generates a test case: supply !dub, then observe ?coffee, ?tea, or quiescence θ, each branch ending in pass or fail.]

20 Test Execution Example
Two test runs of test t against implementation i:
- t || i  ==dub · δ==>  pass || i'
- t || i  ==dub · coffee==>  pass
Both runs end in pass, so i passes t.
[Figure: the test case (stimulus !dub, observations ?coffee, ?tea, θ) composed with implementation i, which outputs !coffee after ?dub.]
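A matching execution sketch in Python (illustrative: it assumes an input-enabled, deterministic IUT model and the helpers above, whereas a real tester interacts with a running system):

    def run_test(test, iut):
        """One test run of a generated test case against a model of the IUT."""
        state = {iut.s0}
        while isinstance(test, dict):
            keys = [k for k in test if k != "otherwise"]
            if len(keys) == 1 and keys[0].startswith("?"):   # stimulus node: send the input
                a = keys[0]
                state = after(iut, state, [a])
                test = test[a]
            else:                                            # observation node
                produced = sorted(out(iut, state))[0]        # whatever the IUT offers, or delta
                if produced in test:
                    state = after(iut, state, [produced])
                    test = test[produced]
                else:
                    return "fail"                            # forbidden output observed
        return test                                          # "pass" or "fail"

    # Example use: verdict = run_test(test_case, some_iut_model)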

21 Test Result Analysis: Completeness of ioco Test Generation
For every test t generated with the algorithm we have:
- Soundness: t will never fail with a correct implementation: i ioco s implies i passes t.
- Exhaustiveness: each incorrect implementation can be detected with some generated test: if not (i ioco s), then ∃ t : i fails t.

22 Formal Testing with Transition Systems
- gen : LTS → P(TTS),  with s ∈ LTS and T_s ⊆ TTS
- exec : TESTS × IMPS → P(OBS),  with IUT ∈ IMPS and i_IUT ∈ IOTS
- passes : IOTS × TTS → {pass, fail}
Test hypothesis:  ∀ IUT ∈ IMPS. ∃ i_IUT ∈ IOTS. ∀ t ∈ TTS. IUT passes t ⇔ i_IUT passes t
Proof of soundness and exhaustiveness:  ∀ i ∈ IOTS. ( ∀ t ∈ gen(s). i passes t )  ⇔  i ioco s

23 Variations on a Theme
- i ioco s    ⇔  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
- i ≤ior s    ⇔  ∀σ ∈ (L ∪ {δ})* : out(i after σ) ⊆ out(s after σ)
- i ioconf s  ⇔  ∀σ ∈ traces(s) : out(i after σ) ⊆ out(s after σ)
- i ioco_F s  ⇔  ∀σ ∈ F : out(i after σ) ⊆ out(s after σ)
- i mioco s : multi-channel ioco
- i uioco s : universal ioco
- i wioco s : non-input-enabled ioco
- i sioco s : symbolic ioco
- i (r)tioco s : (real) timed ioco (Aalborg, Twente, Grenoble, Bordeaux, ...)
- i ioco_r s : refinement ioco
- i hioco s : hybrid ioco
- ...

24 Testing Transition Systems: Status and Extensions
[Figure: extensions of the basic framework, illustrated on vending-machine models and test cases: with a data model (guards over n : int involving 35 and 50), with time (a clock c, resets c := 0, and bounds 5, 10, 15), with hybrid behaviour (differential equations dV_t/dt = 3, dV_c/dt = 2 and guards on V_c, V_t), and with action refinement (?money refined into ?coin1, ?coin2, ?coin3).]

25 Component-Based Testing
[Figure: component implementations and specifications over actions but, ok, err, x, y; given i1 ioco s1 and i2 ioco s2, does i1 || i2 ioco s1 || s2 hold?]

26 Compositional Testing / Component-Based Testing
If i1 ioco s1 and i2 ioco s2, and s1 and s2 are input-enabled (s1, s2 ∈ IOTS), then i1 || i2 ioco s1 || s2: ioco is preserved!

27 Variations on a Theme: uioco
i ioco s  ⇔  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s0 after σ)
[Figure: specification s with states s0, s1, s2, inputs ?a, ?b and outputs !x, !y, !z.]
- out(s0 after ?b) = ∅, but ?b ∉ Straces(s): under-specification, so anything is allowed after ?b.
- out(s0 after ?a·?a) = {!x} and ?a·?a ∈ Straces(s), but from s2 the trace ?a·?a is under-specified: should anything be allowed after ?a·?a?

28 Variations on a Theme: uioco
i uioco s  ⇔  ∀σ ∈ Utraces(s) : out(i after σ) ⊆ out(s0 after σ)
Utraces(s) = { σ ∈ Straces(s) | there is no decomposition σ = σ1·?a·σ2 with a state s' such that s ==σ1==> s' and s' refuses ?a }
Now s is under-specified in s2 for ?a: anything is allowed.
ioco is stronger than uioco: i ioco s implies i uioco s.
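A small membership check for Utraces along these lines (illustrative only, with the same τ-free simplification and the after helper from the earlier sketch):

    def is_utrace(spec, sigma):
        """sigma is a Utrace iff it is a suspension trace of spec and no prefix
        reaches a state in which the next input of sigma is refused."""
        for k, action in enumerate(sigma):
            if action.startswith("?"):
                for p in after(spec, {spec.s0}, sigma[:k]):
                    enabled = {a for (s, a, t) in spec.transitions if s == p}
                    if action not in enabled:
                        return False          # some reachable state refuses this input
        return bool(after(spec, {spec.s0}, sigma))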

29 Variations on a Theme: uioco
i uioco s  ⇔  ∀σ ∈ Utraces(s) : out(i after σ) ⊆ out(s0 after σ)
Alternatively, uioco can be obtained via a chaos process for the under-specified inputs.
[Figure: the specification from the previous slides, with under-specified inputs leading to a chaos state that allows any behaviour over L_I ∪ L_U.]

30 Testing Components
[Figure: an IUT component interacts through method invocations: calls and returns of the methods it offers, and calls and returns of the methods it uses.]

31 Testing Components
[Figure: a tester and the IUT component exchange method calls and method returns.]
- L_I = calls of offered methods ∪ returns of used methods
- L_U = returns of offered methods ∪ calls of used methods
- specification s ∈ LTS(L_I, L_U)

32 Testing Components
[Figure: a tester and the IUT component exchange method calls and method returns.]
Input-enabledness: ∀ state s of the IUT, ∀ ?a ∈ L_I : s ==?a==> ?  No!

33 Correctness: Implementation Relation wioco
i uioco s  =def  ∀σ ∈ Utraces(s) : out(i after σ) ⊆ out(s after σ)
i wioco s  =def  ∀σ ∈ Utraces(s) : out(i after σ) ⊆ out(s after σ)  and  in(i after σ) ⊇ in(s after σ)
where
- in(s after σ) = { a? ∈ L_I | s after σ must a? }
- s after σ must a?  ⇔  ∀ s' ( s ==σ==> s'  implies  s' ==a?==> )
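A sketch of the in-set under the same τ-free simplification, where 'must' collapses to an intersection of enabled inputs (illustrative only):

    def inputs_in(spec, states):
        """in(P): inputs that every state of P accepts, approximating 'P must a?'."""
        if not states:
            return set()
        common = None
        for p in states:
            enabled = {a for (s, a, t) in spec.transitions
                       if s == p and a.startswith("?")}
            common = enabled if common is None else common & enabled
        return common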

34 Formal Testing with Transition Systems
- gen : LTS → P(TTS),  with s ∈ LTS and T_s ⊆ TTS
- exec : TESTS × IMPS → P(OBS),  with IUT ∈ IMPS and i_IUT ∈ IOTS
- passes : IOTS × TTS → {pass, fail}
Test hypothesis:  ∀ IUT ∈ IMPS. ∃ i_IUT ∈ IOTS. ∀ t ∈ TTS. IUT passes t ⇔ i_IUT passes t
Proof of soundness and exhaustiveness:  ∀ i ∈ IOTS. ( ∀ t ∈ gen(s). i passes t )  ⇔  i ioco s

35 Test Assumption
The IUT behaves as an IOTS (input-enabled LTS):
- inputs a? ∈ L_I, outputs x! ∈ L_U, quiescence δ
- sequencing of inputs, outputs, and δ
- input-enabledness: ∀ state s of the IUT, ∀ ?a ∈ L_I : s ==?a==>

36 Comparing Transition Systems: Testing Equivalences
[Figure: an environment interacting with systems S1 and S2.]
Suppose an environment interacts with the systems:
- the environment tests each system as a black box, by observing and actively controlling it;
- the environment acts as a tester;
- two systems are equivalent if they pass the same tests.

37 Formal Testing: Test Assumption
Test assumption:  ∀ IUT. ∃ i_IUT ∈ MOD. ∀ t ∈ TEST. IUT passes t  ⇔  i_IUT passes t
[Figure: the physical IUT and its assumed model i_IUT, each subjected to test t.]

38 Completeness of Formal Testing
Goal: IUT passes T_s  ⇔  IUT conf-to s ?
- Definition: IUT passes T_s  ⇔def  ∀ t ∈ T_s. IUT passes t
- Definition: IUT conf-to s  ⇔def  i_IUT imp s
- Test hypothesis: ∀ t ∈ TEST. IUT passes t  ⇔  i_IUT passes t
- Proof obligation: ∀ i ∈ MOD. ( ∀ t ∈ T_s. i passes t )  ⇔  i imp s
Then: IUT conf-to s  ⇔  i_IUT imp s  ⇔  ∀ t ∈ T_s. i_IUT passes t  ⇔  ∀ t ∈ T_s. IUT passes t  ⇔  IUT passes T_s.

39 ioco Test Assumption
[Figure: two coffee-machine models with input ?dub and outputs !tea, !coffee, !choc.]
More tests may be needed, each starting in the initial state:
- meta-assumption: reliable restart

40 Alternative Test Assumption
[Figure: two models with input ?dub and outputs !tea, !choc.]
Test: 1. do ?dub; 2. make a core dump; 3. make many copies of the core dump; 4. continue the test with each copy.
An "Abramsky"-test can distinguish them.

41 Alternative Test Assumption
[Figure: two models with inputs ?dub, ?kwart and outputs !tea, !choc.]
With the test ?dub.?kwart.undo you can distinguish them.

42 Concluding
- Testing can be formal, too (M.-C. Gaudel, TACAS'95)
- Testing shall be formal, too
- A test generation algorithm is not just another algorithm; it requires:
  - a proof of soundness and exhaustiveness
  - a definition of the test assumption and the implementation relation
- For labelled transition systems:
  - ioco for expressing conformance between implementation and specification
  - a sound and exhaustive test generation algorithm
  - tools generating and executing tests: TGV, TestGen, Agedis, TorX, ...

43 Perspectives
Model-based formal testing can improve the testing process:
- the model is a precise and unambiguous basis for testing ⇒ design errors are found during validation of the model
- longer, cheaper, more flexible, and provably correct tests ⇒ easier test maintenance and regression testing
- automatic test generation and execution ⇒ full automation: test generation + execution + analysis
- the extra effort of modelling is compensated by better tests

44 Thank You