1. Model-Based Testing and Test-Based Modelling
Jan Tretmans
Embedded Systems Institute, Eindhoven, NL
and Radboud University, Nijmegen, NL
Quasimodo

2. Overview
1. Model-Based Testing
2. Model-Based Testing with Labelled Transition Systems
3. Model-Based Testing: A Wireless Sensor Network Node
4. Test-Based Modelling

3. Software Testing

4. (Software) Testing
Testing: checking or measuring some quality characteristics of an executing object by performing experiments in a controlled way, with respect to a specification.
[Diagram: a tester exercises the SUT (System Under Test) against a specification.]

5. Sorts of Testing
Phases: unit, module, integration, system.
Aspects: functionality, reliability, usability, efficiency, maintainability, portability.
Accessibility: white box vs. black box.

6. Paradox of Software Testing
Testing is:
- important
- much practiced: 30-50% of project effort
- expensive, time critical
- not constructive (but sadistic?)
But also:
- ad-hoc, manual, error-prone
- hardly any theory / research
- no attention in curricula
- not cool: "if you're a bad programmer you might be a tester"
Attitude is changing:
- more awareness
- more professional

7. Testing Challenges: Trends in Software Development
Increasing complexity: more functions, more interactions, more options and parameters.
Increasing size: building new systems from scratch is not possible anymore; integration of legacy, outsourced, and off-the-shelf components.
Blurring boundaries between systems: more, and more complex, interactions between systems; systems dynamically depend on other systems (systems of systems).
Blurring boundaries in time: requirements analysis, specification, implementation, testing, installation, and maintenance overlap; more different versions and configurations.
What is a failure?

8. Models

9. Formal Models
[Diagram: a labelled transition system of a coffee machine with transitions ?coin, ?button, !alarm, ?button, !coffee.] (Klaas Smit)

10. Model-Based Testing

11. Developments in Testing 1
1. Manual testing
[Diagram: a tester manually exercises the SUT (System Under Test) and assigns pass or fail.]

12. Developments in Testing 2
1. Manual testing
2. Scripted testing
[Diagram: TTCN test cases drive automated test execution against the SUT; verdict pass or fail.]

13. Developments in Testing 3
1. Manual testing
2. Scripted testing
3. High-level scripted testing
[Diagram: a high-level test notation drives test execution against the SUT; verdict pass or fail.]

14. Developments in Testing 4
1. Manual testing
2. Scripted testing
3. High-level scripted testing
4. Model-based testing
[Diagram: a system model feeds model-based test generation, producing TTCN test cases for test execution against the SUT; verdict pass or fail.]

15. Model-Based ..... Verification, Validation, Testing, .....

16. Validation, Verification, and Testing
[Diagram: ideas and wishes are validated against abstract models (math); properties are verified against the model; the SUT, a concrete realization, is tested against model and properties; validation, verification, and testing link the abstract and concrete worlds.]

17. Verification and Testing
Model-based verification: formal manipulation, proves properties, performed on the model (formal world).
Model-based testing: experimentation, shows errors, performed on the concrete system (concrete world).
Verification is only as good as the validity of the model on which it is based.
Testing can only show the presence of errors, not their absence.

18. Code Generation from a Model
A model is more (and less) than code generation:
- views
- abstraction
- testing of aspects
- verification and validation of aspects

19. Model-Based Testing with Labelled Transition Systems

20. Model-Based Testing
[Diagram: system model → model-based test generation → TTCN test cases → test execution against the SUT; verdict pass or fail.]

21. MBT with Labelled Transition Systems
[Diagram: an LTS model feeds ioco test generation, producing a set of LTS tests for LTS test execution against the SUT, which is assumed to behave as an input-enabled LTS; verdict pass or fail. Correctness criterion: input/output conformance (ioco).]

22. Models: Labelled Transition Systems
A labelled transition system is a tuple ⟨S, L_I, L_U, T, s0⟩:
- S: states
- L_I: input actions (prefixed "?")
- L_U: output actions (prefixed "!")
- T: transitions
- s0: initial state
[Diagram: the coffee-machine LTS with transitions ?coin, ?button, !alarm, ?button, !coffee.]
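
To make the tuple concrete, here is a minimal Python sketch of an LTS data structure encoding the coffee-machine diagram; the class, method, and label names are assumptions for illustration, not part of the slides.

```python
# A minimal sketch (assumed names, not from the slides) of an LTS
# <S, L_I, L_U, T, s0>. '?' marks inputs and '!' marks outputs,
# following the slides' notation.
from collections import defaultdict

class LTS:
    def __init__(self, inputs, outputs, transitions, s0):
        self.inputs = set(inputs)        # L_I, e.g. {"?coin", "?button"}
        self.outputs = set(outputs)      # L_U, e.g. {"!coffee", "!alarm"}
        self.s0 = s0                     # initial state s0
        self.trans = defaultdict(list)   # T: state -> [(label, state')]
        for s, label, s2 in transitions:
            self.trans[s].append((label, s2))

    def succ(self, state, label):
        """States reachable from `state` by one `label`-transition."""
        return {s2 for l, s2 in self.trans[state] if l == label}

# One plausible encoding of the coffee-machine diagram: a coin followed
# by the button yields coffee; the button without a coin raises the alarm.
coffee = LTS(
    inputs={"?coin", "?button"},
    outputs={"!coffee", "!alarm"},
    transitions=[(0, "?coin", 1), (1, "?button", 2), (2, "!coffee", 0),
                 (0, "?button", 3), (3, "!alarm", 0)],
    s0=0,
)
```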

23. Models: Generation of Test Cases
[Diagram: from the coffee-machine specification model (?coin, ?button, !alarm, ?button, !coffee) a test case is derived: supply !coin, then !button, then observe; ?coffee leads to pass, ?alarm leads to fail.]

24. Models: Generation of Test Cases
[Diagram: a second test case: supply !button, then !coin, and observe; expected observations (?alarm) lead to pass, unexpected ones (?coffee) lead to fail.]

25. Conformance: ioco
Quiescence: δ(p) ⟺ ∀ !x ∈ L_U ∪ {τ}: p has no !x-transition
out(P) = { !x ∈ L_U | p —!x→, p ∈ P } ∪ { δ | δ(p), p ∈ P }
Straces(s) = { σ ∈ (L ∪ {δ})* | s =σ⇒ }
p after σ = { p' | p =σ⇒ p' }
i ioco s ⟺def ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)

26. Conformance: ioco
i ioco s ⟺def ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
Intuition: i ioco-conforms to s, iff:
- if i produces output x after trace σ, then s can produce x after σ
- if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence δ)
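
These definitions can be executed directly for finite models. Below is a minimal sketch, reusing the LTS class sketched after slide 22, of a depth-bounded ioco check; the bound is needed because Straces(s) is infinite in general (compare the n-bounded ioco approximation mentioned under test-based modelling later). This is illustrative, not the algorithm inside any MBT tool.

```python
# A bounded ioco check for finite, tau-free LTSs. Quiescence δ is handled
# as an extra observable output, exactly as in the definitions above;
# the depth bound keeps the explored set of suspension traces finite.
DELTA = "δ"

def quiescent(lts, state):
    """δ(p): no output action is enabled in `state`."""
    return not any(l in lts.outputs for l, _ in lts.trans[state])

def out(lts, states):
    """out(P): enabled outputs, plus δ if some state in P is quiescent."""
    obs = {l for s in states for l, _ in lts.trans[s] if l in lts.outputs}
    if any(quiescent(lts, s) for s in states):
        obs.add(DELTA)
    return obs

def after(lts, states, label):
    """P after one suspension-trace step."""
    if label == DELTA:                       # δ keeps only quiescent states
        return {s for s in states if quiescent(lts, s)}
    return {s2 for s in states for s2 in lts.succ(s, label)}

def ioco_bounded(i, s, depth):
    """out(i after σ) ⊆ out(s after σ) for all σ in Straces(s), |σ| ≤ depth.
    Assumes i is input-enabled, as the theory requires."""
    stack = [({i.s0}, {s.s0}, 0)]
    while stack:
        i_states, s_states, d = stack.pop()
        if not out(i, i_states) <= out(s, s_states):
            return False
        if d < depth:
            for label in out(s, s_states) | s.inputs:
                s_next = after(s, s_states, label)
                if s_next:                   # follow only traces of the spec
                    stack.append((after(i, i_states, label), s_next, d + 1))
    return True

# e.g. ioco_bounded(coffee, coffee, depth=5) -> True
```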

27. Example: ioco
[Diagram: a drinks-machine specification over ?dime, ?quart, !coffee, !tea, !choc, compared with several candidate implementations; ioco verdicts are indicated per candidate.]

28. Example: ioco
LTS and ioco allow:
- non-determinism
- under-specification
- the specification of properties rather than construction
[Diagram: a square-root-style specification with input ?x (x >= 0) and output !y (|y·y − x| < ε), together with SUT models that additionally handle ?x (x < 0), e.g. answering !−x or !error.]

29. Example: ioco
i ioco s ⟺def ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
out(i after ?dub·?dub) = out(s after ?dub·?dub) = { !tea, !coffee }
out(i after ?dub·δ·?dub) = { !coffee }
out(s after ?dub·δ·?dub) = { !tea, !coffee }
Hence i ioco s, but not s ioco i.
[Diagram: the two machines i and s over ?dub, !tea, !coffee.]

30. Test Case
A test case is itself a labelled transition system:
- with a quiescence label θ
- tree-structured
- finite, deterministic
- final states pass and fail
- from each non-final state: either one input !a, or all outputs ?x and θ
[Diagram: an example test case over !dub, !kwart, ?tea, ?coffee with pass and fail leaves.]

31. Test Generation Algorithm: ioco
Algorithm to generate a test case t(S) from a set S of specification states, S ≠ ∅ (initially S = s0 after ε). Apply the following steps recursively, non-deterministically:
1. End the test case: verdict pass.
2. Supply an input !a: continue with t(S after ?a).
3. Observe all outputs (and quiescence θ):
   - allowed outputs !x ∈ out(S): continue with t(S after !x)
   - forbidden outputs !y ∉ out(S): verdict fail
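
A sketch of this algorithm on the helpers defined above: a test case is represented as a nested dict (a finite tree) with "pass" and "fail" leaves. Note the mirrored directions: the tester supplying input !a corresponds to the specification's ?a, and the tester observes the specification's outputs. The random choices and probabilities are illustrative assumptions, not a tool's actual strategy.

```python
# A minimal sketch of the three-step ioco test-generation algorithm.
# Each call makes its non-deterministic choices at random and therefore
# yields one test case.
import random

def gen_test(spec, states, depth):
    # Step 1: non-deterministically end the test case with verdict pass.
    if depth == 0 or random.random() < 0.2:
        return "pass"
    enabled = [a for a in spec.inputs if after(spec, states, a)]
    # Step 2: supply an input enabled in the specification.
    if enabled and random.random() < 0.5:
        a = random.choice(enabled)
        return {a: gen_test(spec, after(spec, states, a), depth - 1)}
    # Step 3: observe all outputs and quiescence (θ/δ): allowed outputs
    # continue the test, forbidden outputs go straight to fail.
    allowed = out(spec, states)
    return {x: (gen_test(spec, after(spec, states, x), depth - 1)
                if x in allowed else "fail")
            for x in spec.outputs | {DELTA}}

# e.g. one random test case for the coffee machine:
#   gen_test(coffee, {coffee.s0}, depth=4)
```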

32–38. Example: ioco Test Generation
[Animation over seven slides: from the specification (?dime, !coffee, ?dime) a test case is built step by step: supply !dime; observe — ?coffee continues the test, ?tea leads to fail; after further steps the allowed and forbidden observations yield the remaining pass and fail leaves.]

39. Test Result Analysis: Completeness
For every test t generated with the ioco test-generation algorithm we have:
- Soundness: t will never fail with a correct implementation: i ioco s implies i passes t.
- Exhaustiveness: each incorrect implementation can be detected with some generated test: not (i ioco s) implies ∃ t: i fails t.

40. Completeness of MBT with ioco
[Diagram: as on slide 21, the LTS model feeds ioco test generation and LTS test execution against the SUT, assumed to behave as an input-enabled LTS.]
SUT passes tests ⟺ SUT ioco model (⇐ soundness, ⇒ exhaustiveness)

41. Model-Based Testing: More Theory

42. Testing Equivalences
S1 ≈ S2 ⟺ ∀ e ∈ E: obs(e, S1) = obs(e, S2)
[Diagram: two systems S1 and S2 placed in an environment e; they are testing-equivalent iff no environment can observe a difference.]

43. MBT: Test Assumption
Test assumption: ∀ SUT. ∃ m_SUT ∈ MODELS. ∀ t ∈ TEST: SUT passes t ⟺ m_SUT passes t
[Diagram: the physical SUT is assumed to behave like some model m_SUT on every test t.]

44. Soundness and Completeness
Given a specification s ∈ LTS and a test tool implementing gen: LTS → P(TTS):
SUT passes gen(s) ⟺ SUT conforms to s (⇐ sound, ⇒ exhaustive)
Test assumption: ∀ SUT ∈ IMP. ∃ m_SUT ∈ IOTS. ∀ t ∈ TTS: SUT passes t ⟺ m_SUT passes t
Prove soundness and exhaustiveness: ∀ m ∈ IOTS: ( ∀ t ∈ gen(s): m passes t ) ⟺ m ioco s

45. MBT: Completeness
Define: SUT passes T_s ⟺def ∀ t ∈ T_s: SUT passes t
Test hypothesis: ∀ t ∈ TEST: SUT passes t ⟺ m_SUT passes t
Prove: ∀ m ∈ MOD: ( ∀ t ∈ T_s: m passes t ) ⟺ m imp s
Define: SUT conforms to s iff m_SUT imp s
Then SUT passes T_s ⟺ SUT conforms to s:
SUT conforms to s ⟺ m_SUT imp s ⟺ ∀ t ∈ T_s: m_SUT passes t ⟺ ∀ t ∈ T_s: SUT passes t ⟺ SUT passes T_s

46. Genealogy of ioco
[Diagram: a family tree leading to ioco:]
- Labelled Transition Systems; IOTS (IOA, IA, IOLTS)
- Testing Equivalences (Preorders)
- Refusal Equivalence (Preorder)
- Trace Preorder; Quiescent Trace Preorder
- Repetitive Quiescent Trace Preorder (Suspension Preorder)
- Canonical Tester, conf
- ioco

47. Variations on a Theme
i ioco s ⟺ ∀ σ ∈ Straces(s): out(i after σ) ⊆ out(s after σ)
i ior s ⟺ ∀ σ ∈ (L ∪ {δ})*: out(i after σ) ⊆ out(s after σ)
i ioconf s ⟺ ∀ σ ∈ traces(s): out(i after σ) ⊆ out(s after σ)
i ioco_F s ⟺ ∀ σ ∈ F: out(i after σ) ⊆ out(s after σ)
i uioco s ⟺ ∀ σ ∈ Utraces(s): out(i after σ) ⊆ out(s after σ)
i mioco s — multi-channel ioco
i wioco s — non-input-enabled ioco
i eco e — environmental conformance
i sioco s — symbolic ioco
i (r)tioco s — (real) timed ioco (Aalborg, Twente, Grenoble, Bordeaux, .....)
i rioco s — refinement ioco
i hioco s — hybrid ioco
i qioco s — quantified ioco
i poco s — partially observable game ioco
i stioco_D s — real time and symbolic data
...

48. Model-Based Testing: There is Nothing More Practical than a Good Theory
- Arguing about validity of test cases and correctness of test-generation algorithms
- Explicit insight in what has been tested, and what not
- Use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, .....
- Implementation relations for nondeterministic, concurrent, partially specified, loose specifications
- Comparison of MBT approaches and error-detection capabilities

49. Test Selection in Model-Based Testing

50. Test Selection
Exhaustiveness is never achieved in practice.
Test selection aims to achieve confidence in the quality of the tested product:
- select the test cases best capable of detecting failures
- measure to what extent testing was exhaustive
An optimization problem: the best possible testing within cost/time constraints.

51. Test Selection: Approaches
1. random
2. domain / application specific: test purposes, test goals, ...
3. model / code based: coverage, usually structure based
[Diagram: a test run a!·x?·a?·x!·... measured against a model, achieving e.g. 100% vs. 50% transition coverage.]
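
As a sketch of the structure-based option, the following hypothetical helper (same LTS setting as the earlier sketches, not from the slides) replays test runs on the specification and reports the fraction of transitions they exercise.

```python
# A minimal sketch of transition coverage: replay each run (a label
# sequence) over the possibly non-deterministic LTS, collect the
# transitions visited, and divide by the total number of transitions.
def transition_coverage(spec, runs):
    all_trans = {(s, l, s2) for s in spec.trans for l, s2 in spec.trans[s]}
    covered = set()
    for run in runs:
        frontier = {spec.s0}
        for label in run:
            step = {(s, label, s2)
                    for s in frontier for s2 in spec.succ(s, label)}
            covered |= step
            frontier = {s2 for _, _, s2 in step}
    return len(covered) / len(all_trans)

# e.g. transition_coverage(coffee, [["?coin", "?button", "!coffee"]]) -> 0.6
```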

52. Test Selection: Semantic Coverage
Correct implementations: C_S = { i | i ioco s }
Passing implementations: P_T = { i | i passes T }
Measure for test quality: the area P_T \ C_S
A weaker model s enlarges { i | i ioco s }.
[Diagram: nested sets of implementations, with C_S contained in P_T.]

53. Test Selection: Lattice of Specifications
s1 is stronger than s2: s1 ⊑ s2 ⟺ { i | i ioco s1 } ⊆ { i | i ioco s2 }
If specifications are input-enabled, then ioco is a preorder.
The top element S_T allows any implementation: chaos (all inputs L_I?, all outputs L_U!).
[Diagram: specifications S1, S2, S3, ..., S_T ordered in a lattice by the sets of implementations C_S they allow.]

54. Test Selection by Weaker Specification
[Diagram: two examples of weakening: the switch (but? on!, but? off! vs. but? on! off!) and the square-root specification ?x (x >= 0), !y (|y·y − x| < ε) restricted to inputs ?x (0 ... — slide text truncated.]

55. Model-Based Testing: A Wireless Sensor Network Node

56. Wireless Sensor Networks
- warehouses: sense & control
- tracing products: active labels
- trains: seat reservation
- health care: on-body networks

57. Myrianed: Wireless Sensor Network Node
[Photo: the node hardware: RF antenna, RF transceiver (Nordic Semiconductor), CPU (Atmel XMEGA128), I/O interfaces.]

58. Myrianed: a WSN with Gossiping
- Communication inspired by biology and human interaction
- Epidemic communication; analogy: spreading a rumor or a virus
- MYRIANED gossip protocol
- RF broadcast (2.4 GHz ISM)
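
As a toy illustration of epidemic dissemination (emphatically not the Myrianed implementation), the sketch below spreads a "rumor" by random broadcast rounds; the fanout, the fully random topology, and all names are made-up assumptions.

```python
# A toy sketch of epidemic/gossip dissemination: every informed node
# reaches `fanout` random nodes per round, mimicking periodic RF broadcast.
import random

def gossip_rounds(n_nodes, fanout=3, seed=0):
    rng = random.Random(seed)
    informed = {0}                        # node 0 injects the "rumor"
    rounds = 0
    while len(informed) < n_nodes:
        for _ in list(informed):          # one broadcast per informed node
            informed |= {rng.randrange(n_nodes) for _ in range(fanout)}
        rounds += 1
    return rounds

# e.g. gossip_rounds(100): the rumor typically reaches all nodes in a
# number of rounds logarithmic in the network size
```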

59. WSN: Typical Test
Put WSN nodes together: end-to-end testing.
Test all WSN nodes together, i.e. test the network:
- provide inputs to the network
- observe outputs from the network

60. WSN: Model-Based Conformance Test
Model-based testing of a single node:
- protocol conformance test of the gMAC protocol according to ISO 9646
- local test method
- time is important in gMAC: real-time model-based testing

61. WSN Node in Protocol Layers
[Diagram: nodes 1–3, each with an Application Layer, gMAC Layer, and Radio Layer, connected via the Medium Layer; an Upper Tester attaches at the application interface, a Lower Tester at the radio interface.]

62. Local Test Method
Approach: test only the software, on a host, with simulated, discrete time (clock ticks), via the application interface (Upper Tester) and radio interface (Lower Tester).
Excerpt from the gMAC layer software, at the hardware/software boundary:
    if (gMacFrame.currentSlotNumber > gMacFrame.lastFrameSlot) {
        gMacFrame.currentSlotNumber = 0;
        gMacFrame.syncState = gMacNextFrame.syncState;
        if (semaphore.appBusy == 1) {
            mcTimerSet(SLOT_TIME);
            mcPanic(MC_FAULT_APPLICATION);
            return;
        }
    }

63. WSN: Test Architecture
[Diagram: a model-based test tool (TorXakis, JTorX, or Uppaal-Tron) connects via sockets to an adapter, which connects via sockets to the gMAC layer software.]
The adapter:
- transforms messages
- synchronizes simulated time
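
A hypothetical sketch of the adapter's role follows: ports, message formats, and the "tick" convention are invented for illustration — the slides do not show the real adapter's protocol, only that it translates messages and synchronizes simulated time over sockets.

```python
# A made-up sketch of an adapter bridging an MBT tool and the SUT over
# sockets: abstract labels from the tool become concrete messages, and
# simulated clock ticks are forwarded to keep time synchronized.
import socket

def run_adapter(tool_port=5000, sut_port=6000):
    tool = socket.create_connection(("localhost", tool_port))
    sut = socket.create_connection(("localhost", sut_port))
    with tool, sut:
        while True:
            label = tool.recv(1024).decode().strip()
            if not label:                        # tool closed the connection
                break
            if label == "tick":                  # synchronize simulated time
                sut.sendall(b"CLOCK_TICK\n")
            else:                                # transform abstract label
                sut.sendall(f"MSG {label}\n".encode())
            tool.sendall(sut.recv(1024))         # observed output, verbatim
```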

64. Models (Klaas Smit)

65. Model-Based Testing
[Diagram: system model → model-based test generation → TTCN test cases → test execution against the SUT; verdict pass or fail.]

66. WSN: Model-Based Testing
[Diagram: the generic MBT picture instantiated: an MBT tool generates test runs from a system model and executes them through a C test adapter against the SUT, the WSN software on a PC; verdict pass or fail. The model itself comes from (vague) descriptions, a guru, and ad-hoc model learning.]

67. WSN: Test-Based Modeling
Make a model from observations made during testing.
[Diagram: Uppaal-Tron, TorXakis, or JTorX executes test runs through an adapter against the WSN software on a PC, but the system model itself is the unknown (???).]

68. Test-Based Modelling

69. Model-Based Testing
IF there exists a model THEN automatic generation of tests.
The THEN-branch holds (TRUE); the IF often fails (FALSE): no models, or models difficult to obtain:
- complex systems; designers make no models; evolving systems
- legacy, third-party, outsourced, or reused components; no documentation

70. Model-Based Testing
[Diagram: system model → model-based test generation algorithm → TTCN test cases → test execution against the SUT.]

71. Test-Based Modelling
Learning a model of SUT behaviour from observations made during test. Also known as:
- black-box reverse engineering
- test-based modelling
- automata learning
- observation-based modelling
- behaviour capture and test
Learning can be active (driving the SUT with tests) or passive (observing existing runs).
[Diagram: the MBT picture reversed: a learning algorithm derives the system model from test execution against the SUT.]

72. Validation, Verification, and Testing
[Diagram: the earlier validation/verification/testing picture, now with learning going from the concrete SUT back to abstract models.]

73. Empirical Cycle
[Diagram: in the model world, a theory/model predicts the phenomenon; observations in the physical world validate the model.]

74. MBT and TBM
[Diagram: a combined loop between the model world and the physical world: run MBT; if not conforming, improve the SUT or improve the model; if conforming but not yet satisfied, run more tests or refine the model.]

75. MBT and TBM
Basic MBT process (as a pure decision process):
1. make model
2. take SUT
3. generate tests
4. execute tests
5. assign verdict
But MBT is also:
1. SUT repair
2. model repair
3. improve MBT: more tests
4. improve MBT: better model
until satisfied — an iterative and incremental process: agile MBT.
Starting point:
1. SUT
2. initial model
3. or no model at all: learning the model from scratch by improving and refining until satisfied
Satisfaction:
- MBT: sufficient confidence in the correctness of the SUT
- TBM: sufficient confidence in the preciseness of the model

76. Learning: Precision
Measure for learning precision: the area (A \ B) / A over the set of implementations.
- S, a precise model of the SUT: precise, but expensive
- chaos (L_I?, L_U!): not precise, but cheap
[Diagram: nested sets of implementations A and B around the SUT, between the precise model S and chaos.]

77. Learning: Lattice of Models
s1 is stronger than s2: s1 ⊑ s2 ⟺ { i | i ioco s1 } ⊆ { i | i ioco s2 }
If specifications are input-enabled, then ioco is a preorder.
chaos is a model for any SUT; the most precise model is testing-equivalent to the SUT.
[Diagram: the lattice of models S1, S2, S3, ..., S_T.]

78–79. Refining Learned Models
[Diagram, two animation steps: the switch model (but? on!, but? off! vs. but? on! off!) and the square-root specification ?x (x >= 0), !y (|y·y − x| < ε) refined towards inputs ?x (0 ... — slide text truncated.]

80. Test-Based Modelling Algorithms
Active:
- D. Angluin (1987): Learning regular sets from queries and counter-examples
- LearnLib: adaptation of Angluin's algorithm for FSMs — a tool searching for a unique, precise model
- T. Willemse (2007): TBM — ioco-based Test-Based Modelling, approximation via n-bounded ioco
Passive:
- process mining: ProM — many algorithms and a tool; model generation from a set of traces
- use as a complete model, or as a starting point for active testing
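
As a minimal sketch of the passive direction, the following folds a set of observed traces into a prefix-tree LTS — only the trivial starting point of what mining tools such as ProM do (they go on to generalize, fold loops, and so on); the function name is an illustrative assumption.

```python
# A minimal sketch of passive, observation-based modelling: build a
# prefix-tree transition system from traces observed on the SUT.
def prefix_tree(traces):
    """traces: iterable of label sequences observed on the SUT."""
    transitions, index, next_state = [], {(): 0}, 1
    for trace in traces:
        prefix = ()
        for label in trace:
            ext = prefix + (label,)
            if ext not in index:                 # new branch in the tree
                index[ext] = next_state
                transitions.append((index[prefix], label, next_state))
                next_state += 1
            prefix = ext
    return transitions                           # root is state 0

# e.g. prefix_tree([["?dub", "!tea"], ["?dub", "?dub", "!coffee"]])
# shares the common ?dub prefix and branches afterwards
```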

81. Wireless Sensor Network: Passive Learning with ProM

82. TBM: Use of Learned Models
No use for testing the SUT from which it was learned, but useful for:
- understanding, communication
- analysis, simulation, model-checking
- regression testing
- testing of re-implemented or re-factored systems
- legacy replacement
- testing w.r.t. a reference implementation

83. Thank You!

