
Model-Based Testing and Test-Based Modelling


1 Model-Based Testing and Test-Based Modelling
Jan Tretmans Embedded Systems Institute, Eindhoven, NL and Radboud University, Nijmegen, NL Quasimodo

2 Overview Model-Based Testing
Model-Based Testing with Labelled Transition Systems Model-Based Testing: A Wireless Sensor Network Node Test-Based Modelling

3 Software Testing

4 (Software) Testing SUT
checking or measuring some quality characteristics of an executing object by performing experiments in a controlled way w.r.t. a specification tester specification SUT System Under Test

5 Sorts of Testing
[diagram: three dimensions of testing]
phases: unit, module, integration, system
accessibility: white box, black box
aspects: functionality, reliability, usability, efficiency, maintainability, portability

6 Paradox of Software Testing
Testing is: important; much practiced: 30-50% of project effort; expensive; time critical; not constructive (but sadistic?) But also: ad-hoc, manual, error-prone; hardly any theory / research; no attention in curricula; not cool: “if you’re a bad programmer you might be a tester” Attitude is changing: more awareness, more professional

7 Trends in Software Development
Testing Challenges. Trends in Software Development: Increasing complexity: more functions, more interactions, more options and parameters. Increasing size: building new systems from scratch is not possible anymore; integration of legacy, outsourced, and off-the-shelf components. Blurring boundaries between systems: more, and more complex, interactions between systems; systems dynamically depend on other systems; systems of systems. Blurring boundaries in time: requirements analysis, specification, implementation, testing, installation, and maintenance overlap; more different versions and configurations. What is a failure?

8 Models

9 Formal Models !coffee ?coin !alarm ?button
Prose vs. formal model. Note: a model allows no assumptions; all (relevant) behaviour must be made explicit. Relevant, because not everything needs to be tested with MBT. Place in the V-model: parallel to the specification phase. Note: questions and ambiguities result in changes/improvements to the specifications; found later, they would cause an avalanche of changes with high(er) repair costs. Iterative functional design and test modelling. MBT becomes MBV. Note: the emphasis shifts from testing to validation. Constructing the test model takes much more time than the testing itself, but yields a net gain in time (inherent in Boehm's curve). BPV quality attributes. Note: check the test model against the 12/13 rules. Show attributes? (Klaas Smit)

10 Model-Based Testing

11 Developments in Testing 1
Manual testing SUT System Under Test pass fail

12 Developments in Testing 2
TTCN test cases Manual testing Scripted testing test execution SUT pass fail

13 Developments in Testing 3
high-level test notation Manual testing Scripted testing High-level scripted testing test execution SUT pass fail

14 Developments in Testing 4
model-based test generation system model Test cases TTCN TTCN Manual testing Scripted testing High-level scripted testing Model-based testing test execution SUT pass fail

15 Model-Based . . . . . Verification, Validation, Testing, . . . . .

16 Validation, Verification, and Testing
[diagram: ideas and wishes relate to properties and models via validation; abstract models (math) relate to concrete realizations via verification; the model relates to the SUT via testing]

17 Verification and Testing
Model-based verification : formal manipulation prove properties performed on model Model-based testing : experimentation show error concrete system formal world concrete world Verification is only as good as the validity of the model on which it is based Testing can only show the presence of errors, not their absence

18 Code Generation from a Model
A model is more (and less) than code. Generation of: views, abstraction, testing of aspects, verification and validation of aspects

19 Model-Based Testing with Labelled Transition Systems

20 model-based test generation
Model-Based Testing model-based test generation system model Test cases TTCN TTCN test execution SUT pass fail

21 MBT with Labelled Transition Systems
LTS model SUT behaving as input-enabled LTS TTCN Test cases pass fail LTS test execution ioco test generation input/output conformance ioco set of LTS tests

22 Models: Labelled Transition Systems
Labelled Transition System: ⟨S, LI, LU, T, s0⟩, with states S, input actions LI, output actions LU, transitions T, and initial state s0. Notation: ? = input, ! = output (e.g. ?coin, ?button, !alarm, !coffee).
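The ⟨S, LI, LU, T, s0⟩ structure above maps directly onto a transition table. A minimal sketch in C (the struct, the `lts_step` helper, and the exact transition layout of the coffee machine are my own assumptions, not taken from the slides):

```c
#include <string.h>

/* One labelled transition: source state, action label, target state.
   By the slide's convention, labels starting with '?' are inputs
   and labels starting with '!' are outputs. */
typedef struct { int src; const char *label; int dst; } Transition;

/* A plausible reading of the coffee-machine example:
   a coin then a button yields coffee; a button without a coin
   yields an alarm. */
static const Transition T[] = {
    {0, "?coin",   1},
    {1, "?button", 2},
    {2, "!coffee", 0},
    {0, "?button", 3},
    {3, "!alarm",  0},
};
static const int NT = sizeof T / sizeof T[0];
static const int S0 = 0;  /* initial state s0 */

/* lts_step: the target state reached from 'state' by 'label',
   or -1 if no such transition exists (deterministic LTS assumed). */
int lts_step(int state, const char *label) {
    for (int i = 0; i < NT; i++)
        if (T[i].src == state && strcmp(T[i].label, label) == 0)
            return T[i].dst;
    return -1;
}
```

With the '?'/'!' prefix convention from the slide, the same table can both drive inputs and check observed outputs.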

23 Models: Generation of Test Cases
specification model test case model ! coin ! button ?alarm ?coffee --- pass ?coin ?button !alarm !coffee fail fail

24 Models: Generation of Test Cases
specification model test case model ! button ! coin ? alarm ? coffee --- fail pass ?coin ?button !alarm !coffee

25 Conformance: ioco
i ioco s =def ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
p −δ→ p ⇔def ∀ !x ∈ LU ∪ {τ} : ¬( p −!x→ )
Straces(s) = { σ ∈ ( L ∪ {δ} )* | s =σ⇒ }
p after σ = { p′ | p =σ⇒ p′ }
out(P) = { !x ∈ LU | p −!x→, p ∈ P } ∪ { δ | p −δ→ p, p ∈ P }

26 Conformance: ioco
i ioco s =def ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
Intuition: i ioco-conforms to s, iff:
if i produces output x after trace σ, then s can produce x after σ;
if i cannot produce any output after trace σ (quiescence δ), then s cannot produce any output after σ.
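The out-set and the quiescence observation δ used in the definition above can be sketched for a concrete state as follows (a toy machine; the function names `quiescent` and `out_contains` and the `"delta"` string encoding are illustrative assumptions):

```c
#include <string.h>

typedef struct { int src; const char *label; int dst; } Trans;

/* Tiny example system: state 0 accepts ?dub, state 1 offers !tea.
   State 2 has no outgoing transitions at all. */
static const Trans T[] = {
    {0, "?dub", 1},
    {1, "!tea", 2},
};
static const int NT = 2;

/* By the slides' convention, '!' marks an output action. */
static int is_output(const char *l) { return l[0] == '!'; }

/* quiescent(p): p can produce no output and, in this sketch,
   no internal tau step either -- the delta observation. */
int quiescent(int p) {
    for (int i = 0; i < NT; i++)
        if (T[i].src == p &&
            (is_output(T[i].label) || strcmp(T[i].label, "tau") == 0))
            return 0;
    return 1;
}

/* out_contains(p, l): is output l (or "delta") in out({p})? */
int out_contains(int p, const char *l) {
    if (strcmp(l, "delta") == 0) return quiescent(p);
    for (int i = 0; i < NT; i++)
        if (T[i].src == p && strcmp(T[i].label, l) == 0) return 1;
    return 0;
}
```

State 0 only accepts an input, so δ is in its out-set; state 1 offers !tea, so it is not quiescent.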

27 Example: ioco
[diagrams: candy-machine LTSs over ?dime, ?quart, !coffee, !tea, !choc, with arrows marking which models are, and are not, ioco-related to the specification model]

28 Example: ioco ! x ! -x ? x (x < 0) SUT models specification model
! y (|yxy–x| < ε) ! -x ? x (x < 0) ? x (x >= 0) ? x !error LTS and ioco allow: non-determinism under-specification the specification of properties rather than construction

29 i ioco s =def ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
[diagrams: i and s over ?dub, !tea, !coffee]
i ioco s, but not s ioco i:
out(i after ?dub.?dub) = out(s after ?dub.?dub) = { !tea, !coffee }
out(i after ?dub.δ.?dub) = { !coffee } ⊆ out(s after ?dub.δ.?dub) = { !tea, !coffee }

30 Test Case
test case = labelled transition system, with:
'quiescence' label θ
tree-structured, finite, deterministic
final states pass and fail
from each state ≠ pass, fail: either one input !a, or all outputs ?x and θ
[diagram: example test case over !dub, !kwart, ?coffee, ?tea, with pass and fail leaves]

31 Test Generation Algorithm: ioco
Algorithm to generate a test case t(S) from a transition system state set S, with S ≠ ∅ (initially S = s0 after ε). Apply the following steps recursively, non-deterministically:
1. end test case: verdict pass
2. supply input !a: allowed outputs ?x (!x ∈ out(S)) continue with t(S after !x), forbidden outputs ?y lead to fail, and the input branch continues with t(S after ?a)
3. observe all outputs: allowed outputs ?x (or θ), with !x ∈ out(S), continue with t(S after !x); forbidden outputs ?y, with !y ∉ out(S), lead to fail
(test cases take the mirrored view: an input ?a of the SUT is an output !a of the test, and vice versa)
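The "observe all outputs" step of the algorithm can be sketched for a deterministic specification (the ?dime/!coffee machine of the next slides; the `observe` function and the `"delta"` encoding of quiescence are illustrative assumptions, not the tools' actual API):

```c
#include <string.h>

typedef struct { int src; const char *label; int dst; } Trans;

/* Deterministic toy specification: ?dime then !coffee. */
static const Trans SPEC[] = {
    {0, "?dime",   1},
    {1, "!coffee", 2},
};
static const int NT = 2;

enum Verdict { VERDICT_CONTINUE = 0, VERDICT_FAIL = 1 };

/* observe: one observation step. 'obs' is the SUT's output
   ("delta" for quiescence). Allowed outputs advance the spec
   state; forbidden ones yield the verdict fail. */
enum Verdict observe(int *state, const char *obs) {
    if (strcmp(obs, "delta") == 0) {
        /* delta is allowed iff no output is enabled in the state */
        for (int i = 0; i < NT; i++)
            if (SPEC[i].src == *state && SPEC[i].label[0] == '!')
                return VERDICT_FAIL;
        return VERDICT_CONTINUE;
    }
    for (int i = 0; i < NT; i++)
        if (SPEC[i].src == *state && strcmp(SPEC[i].label, obs) == 0) {
            *state = SPEC[i].dst;
            return VERDICT_CONTINUE;
        }
    return VERDICT_FAIL;
}
```

Steps 1 and 2 (ending with pass, or supplying an input) would wrap around this: supplying !dime advances the specification state before the next observation.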

32 Example: ioco Test Generation
specification test ?dime ?dime !coffee

33 Example: ioco Test Generation
specification test ?dime ?dime !coffee

34 Example: ioco Test Generation
specification test ?dime ?dime !coffee

35 Example: ioco Test Generation
specification test !dime ?dime ?dime !coffee

36 Example: ioco Test Generation
specification test ?coffee !dime ?tea ?dime ?dime fail fail !coffee

37 Example: ioco Test Generation
specification test ?coffee !dime ?tea ?dime ?dime fail fail ?coffee ?tea !coffee pass fail

38 Example: ioco Test Generation
specification test ?coffee !dime ?tea ?dime ?dime fail fail ?coffee ?tea !coffee pass fail ?coffee ?tea fail fail pass

39 Test Result Analysis: Completeness
For every test t generated with the ioco test generation algorithm we have:
Soundness: t will never fail with a correct implementation: i ioco s implies i passes t
Exhaustiveness: each incorrect implementation can be detected with some generated test t: ¬( i ioco s ) implies ∃ t : i fails t

40 Completeness of MBT with ioco
LTS model SUT behaving as input-enabled LTS TTCN Test cases pass fail LTS test execution ioco test generation SUT ioco model input/output conformance ioco exhaustive sound  set of LTS tests SUT passes tests

41 Model-Based Testing More Theory

42 Testing Equivalences
S1 ≈ S2 ⇔ ∀ e ∈ E : obs(e, S1) = obs(e, S2)
(two systems are testing equivalent iff no environment e can observe a difference between them)

43 MBT: Test Assumption
Test assumption: ∀ SUT . ∃ mSUT ∈ MODELS . ∀ t ∈ TEST : SUT passes t ⇔ mSUT passes t

44 Soundness and Completeness
Test assumption : SUTIMP . mSUT IOTS . tTTS . SUT passes t  mSUT passes t s  LTS SUT i ioco s test tool gen : LTS  (TTS) t  SUT Prove soundness and exhaustiveness: mIOTS . ( tgen(s) . m passes t )  m ioco s SUT passes gen(s) SUT comforms to s   sound exhaustive pass fail

45 MBT: Completeness: SUT passes Ts ⇔ SUT conforms to s ?
SUT passes Ts ⇔def ∀ t ∈ Ts . SUT passes t
define: SUT conforms to s iff mSUT imp s
test hypothesis: ∀ t ∈ TEST . SUT passes t ⇔ mSUT passes t
prove: ∀ m ∈ MOD . ( ∀ t ∈ Ts . m passes t ) ⇔ m imp s
then: SUT passes Ts ⇔ mSUT imp s ⇔ SUT conforms to s

46 Genealogy of ioco ioco Labelled Transition Systems
IOTS ( IOA, IA, IOLTS ) Trace Preorder Canonical Tester conf Testing Equivalences (Preorders) Quiescent Trace Preorder Repetitive Quiescent Trace Preorder (Suspension Preorder) Refusal Equivalence (Preorder) ioco

47 Variations on a Theme
i ioco s ⇔ ∀ σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
i ior s ⇔ ∀ σ ∈ ( L ∪ {δ} )* : out(i after σ) ⊆ out(s after σ)
i ioconf s ⇔ ∀ σ ∈ traces(s) : out(i after σ) ⊆ out(s after σ)
i iocoF s ⇔ ∀ σ ∈ F : out(i after σ) ⊆ out(s after σ)
i uioco s ⇔ ∀ σ ∈ Utraces(s) : out(i after σ) ⊆ out(s after σ)
i mioco s : multi-channel ioco
i wioco s : non-input-enabled ioco
i eco e : environmental conformance
i sioco s : symbolic ioco
i (r)tioco s : (real) timed ioco (Aalborg, Twente, Grenoble, Bordeaux, ...)
i rioco s : refinement ioco
i hioco s : hybrid ioco
i qioco s : quantified ioco
i poco s : partially observable game ioco
i stiocoD s : real time and symbolic data

48 Model-Based Testing : There is Nothing More Practical than a Good Theory
Arguing about validity of test cases and correctness of test generation algorithms Explicit insight in what has been tested, and what not Use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, Implementation relations for nondeterministic, concurrent, partially specified, loose specifications Comparison of MBT approaches and error detection capabilities

49 Test Selection in Model-Based Testing

50 Test Selection
Exhaustiveness is never achieved in practice.
Test selection, to achieve confidence in the quality of the tested product: select the best test cases, capable of detecting failures, and measure to what extent testing was exhaustive.
Optimization problem: the best possible testing within cost/time constraints.

51 Test Selection: Approaches
random
domain / application specific: test purposes, test goals, ...
model / code based: coverage, usually structure based
[diagram: two models with transitions a? and x!; the test a!·x? achieves 100% transition coverage on one model but only 50% on the other]
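Transition coverage, the structural measure mentioned above, is simply the fraction of the model's transitions that a test run exercises. A minimal sketch (the function name and the bit-vector encoding are assumptions for illustration):

```c
/* transition_coverage_pct: given a flag per model transition
   (1 = exercised by the test run, 0 = not), report coverage as
   an integer percentage, as on the slide's 100% vs 50% example. */
int transition_coverage_pct(const int *covered, int n_transitions) {
    int hit = 0;
    for (int i = 0; i < n_transitions; i++)
        if (covered[i]) hit++;
    return (100 * hit) / n_transitions;
}
```

The same test can thus score differently against different models, which is exactly the slide's point about structure-based selection.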

52 Test Selection: Semantic Coverage
correct implementations: CS = { i | i ioco s }
passing implementations: PT = { i | i passes T }
measure for test quality: the area PT \ CS
weaker model s′: { i | i ioco s } ⊆ { i | i ioco s′ }

53 Test Selection: Lattice of Specifications
top element χ (chaos) allows any implementation
s1 is stronger than s2: s1 ⊑ s2 ⇔ { i | i ioco s1 } ⊆ { i | i ioco s2 }
if specs are input-enabled, then ioco is a preorder and ⊑ coincides with ioco
[diagram: lattice of specifications S1, S2, S3 over LI?, LU!, with implementation sets CS1, ...]

54 Test Selection by Weaker Specification
[diagrams: an on/off switch (but?, on!, off!) weakened towards chaos χ over LI?, LU!; and the square-root specification ?x (x ≥ 0), !y (|y·y − x| < ε) weakened to ?x (0 < x < 10), !y (|y·y − x| < 2ε)]

55 Model-Based Testing A Wireless Sensor Network Node

56 Wireless Sensor Networks
warehouses: sense & control; tracing products: active labels; trains: seat reservation; health care: on-body networks

57 Myrianed: Wireless Sensor Network
RF TRANSCEIVER (NORDIC SEMI) CPU (ATMEL XMEGA128) RF ANTENNA I/O INTERFACES

58 Myrianed: a WSN with Gossiping
Communication inspired by biology and human interaction: epidemic communication; analogy: spreading a rumor or a virus. MYRIANED “GOSSIP” protocol: RF broadcast (2.4 GHz ISM)

59 WSN: Typical Test Put WSN nodes together: end-to-end testing
Test all WSN nodes together, i.e., test the network: provide inputs to the network, observe outputs from the network

60 WSN: Model-Based Conformance Test
Model-based testing of a single node: protocol conformance test of the gMAC protocol according to ISO 9646 local test method time is important in gMAC: real-time model-based testing

61 WSN Node in Protocol Layers
Upper Tester Lower Tester Application Layer gMAC Radio Application Layer gMAC Radio Application Layer gMAC Radio application interface radio Medium Layer

62 Local Test Method
application interface, radio; Upper Tester, Lower Tester, clock tick; gMAC layer; Hardware / Software boundary
Approach: only software, on host; simulated, discrete time

if (gMacFrame.currentSlotNumber > gMacFrame.lastFrameSlot) {
    gMacFrame.currentSlotNumber = 0;
    gMacFrame.syncState = gMacNextFrame.syncState;
    if (semaphore.appBusy == 1) {
        mcTimerSet(SLOT_TIME);
        mcPanic(MC_FAULT_APPLICATION);
        return;
    }
}

63 WSN: Test Architecture
Upper Tester Lower Tester GMAC layer: Software Model-Based Test Tool: TorXakis JTorX Uppaal-Tron Adapter transform messages synchronize simulated time sockets sockets

64 Models (repeats slide 9)

65 model-based test generation
Model-Based Testing model-based test generation system model Test cases TTCN TTCN test execution SUT pass fail

66 WSN: Model-Based Testing
MBT tool model-based test generation system model Test cases TTCN TTCN pass fail test runs C test adapter SUT (vague) descriptions guru ad-hoc model learning WSN software on PC

67 WSN: Test-Based Modeling
Uppaal-Tron TorXakis JTorX model-based test generation ??? Make a model from observations made during testing system model Test cases TTCN TTCN pass fail test runs adapter SUT WSN software on PC

68 Test-Based Modelling

69 Model-Based Testing
IF there exists a model THEN automatic generation of tests
But often there are no models, or models are difficult to obtain: systems are complex, designers make no models, systems are evolving; legacy, third-party, outsourced, or reused components come without documentation

70 Model-Based Testing SUT model-based test generation system model
algorithm system model Test cases TTCN TTCN test execution SUT

71 Test-Based Modelling SUT learning algorithm system model Test cases
Learning a model of SUT behaviour from observations made during test, also known as: black-box reverse engineering, test-based modelling, automata learning, observation-based modelling, behaviour capture and test. Learning can be active (driving the SUT with tests) or passive (observing existing runs).

72 Validation, Verification, and Testing
[diagram: as slide 16 (ideas and wishes, properties and abstract models, concrete realizations, SUT), but now with learning going from the SUT up to the model]

73 Empirical Cycle phenomenon model theory predict validate
model world physical world

74 MBT and TBM
[flowchart, spanning the model world and the physical world: MBT checks whether the SUT conforms to the model; while not satisfied, run more tests; on a failure, either improve the SUT (the MBT view) or refine/improve the model (the TBM view)]

75 MBT and TBM
Basic MBT process: make model, take SUT, generate tests, execute tests, assign verdict.
But MBT is also an iterative and incremental decision process (agile MBT): improve MBT with more tests, improve MBT with a better model, repair the SUT or repair the model, until satisfied.
Agile MBT starting point: the SUT plus an initial model, or no model at all: learning the model from scratch by improving and refining until satisfied.
Satisfaction: for MBT, sufficient confidence in the correctness of the SUT; for TBM, sufficient confidence in the preciseness of the model.

76 Learning: Precision
measure for learning precision: (A \ B) / A
S: precise, expensive; χ (chaos over LI?, LU!): not precise, cheap
[diagram: lattice of models S ⊑ S′ ⊑ S″ over the set of implementations, with sets A and B and the SUT marked]

77 Learning: Lattice of Models
ST  chaos  model for any SUT ST s1 is stronger than s2  s1  s2  { i | i ioco s1 }  { i | i ioco s2 } most precise model is testing equivalent to SUT S2 S3 if specs are input-enabled then ioco is preorder then   ioco` S1 CS1 implementations

78 Refining Learned Models
[diagrams: a learned on/off switch (but?, on!, off!) refined from chaos χ over LI?, LU!; and the square-root model refined from ?x (0 < x < 10), !y (|y·y − x| < 2ε) towards ?x (x ≥ 0), !y (|y·y − x| < ε)]

79 Refining Learned Models
MBT and TBM: two sides of the same coin
[diagrams: the on/off switch and square-root models as on slide 78]

80 Test-Based Modelling
Algorithms, active:
D. Angluin (1987): learning regular sets from queries and counter-examples
LearnLib: adaptation of Angluin's algorithm for FSMs; a tool searching for a unique, precise model
T. Willemse (2007): ioco-based Test-Based Modelling; approximation via n-bounded ioco
Algorithms, passive:
process mining: ProM, many algorithms and a tool; model generation from a set of traces
Use the result as a complete model, or as a starting point for active testing
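The passive approach in its simplest form folds a set of observed traces into a prefix-tree automaton; tools such as ProM and LearnLib then generalize further, e.g. by merging states. A sketch under that assumption (all names and the fixed-size tables are illustrative):

```c
#include <string.h>

#define MAX_EDGES 64

typedef struct { int src; char label[16]; int dst; } Edge;
static Edge edges[MAX_EDGES];
static int n_edges = 0;
static int n_states = 1;   /* state 0 is the root */

/* find the successor of 'state' under 'label', creating a fresh
   state when the label has not been seen there before */
static int step_or_add(int state, const char *label) {
    for (int i = 0; i < n_edges; i++)
        if (edges[i].src == state && strcmp(edges[i].label, label) == 0)
            return edges[i].dst;
    edges[n_edges].src = state;
    strcpy(edges[n_edges].label, label);
    edges[n_edges].dst = n_states;
    n_edges++;
    return n_states++;
}

/* add_trace: fold one observed trace (an array of labels) into the
   prefix tree; shared prefixes share states. */
void add_trace(const char **trace, int len) {
    int s = 0;
    for (int i = 0; i < len; i++)
        s = step_or_add(s, trace[i]);
}

int num_states(void) { return n_states; }
```

Two traces sharing the prefix ?dime end up sharing its state, which is exactly the kind of observation-based model the slide describes as a starting point for active testing.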

81 Wireless Sensor Network: Passive Learning with ProM

82 TBM: Use of Learned Models
A learned model is of no use for testing the SUT from which it was learned, but it is useful for: understanding and communication; analysis, simulation, model checking; regression testing: testing a re-implemented or re-factored system, legacy replacement; testing w.r.t. a reference implementation

83 Thank You !

