Black-box conformance testing for real-time systems. Stavros Tripakis, VERIMAG. Joint work with Moez Krichen.


Slide 2: Black-box conformance testing. A tester interacts with the SUT (system under test), a black box: it feeds inputs, observes outputs, and issues verdicts (pass/fail/?), comparing against the Specification. Question: does the SUT conform to the Specification?

Slide 3: Model-based testing. The specification is given as a formal model. The SUT also behaves according to an unknown model (black box). Conformance of the SUT to the specification is formally defined w.r.t. these models.

Slide 4: Real-time testing. The tester feeds inputs to the SUT, observes output events with their time-stamps, and issues verdicts (pass/fail). Our models of preference: in theory, timed automata; in practice, the IF language (www-verimag.imag.fr/~async/IF/).

Slide 5: Plan of talk. Specification model; conformance relation; analog and digital tests; test generation; tool and case studies.


Slide 7: Specification model: general timed automata with input/output/unobservable actions. Timed automata = finite-state machines + clocks. Input/output actions: the interface with the environment and the tester. Unobservable actions: model the partial observability of the tester; useful for compositional specifications.

Slide 8: Simple example 1. Automaton: on input a?, reset clock x := 0; then output b! under guard x ≤ 4. "Output b at most 4 time units after receiving input a."
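The automaton on this slide can be written down concretely. Below is a minimal sketch of one possible encoding in Python (an illustrative data structure, not IF syntax; the names `Edge` and `SPEC` and the guard/reset tuples are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Edge:
    src: str        # source location
    action: str     # "a?" marks an input, "b!" an output
    guard: tuple    # e.g. ("x", "<=", 4), or None for no constraint
    resets: tuple   # clocks reset to 0 when the edge fires
    dst: str        # target location

# Slide 8's spec: l0 --a?, x:=0--> l1 --b!, x <= 4--> l0
SPEC = [
    Edge("l0", "a?", None, ("x",), "l1"),
    Edge("l1", "b!", ("x", "<=", 4), (), "l0"),
]
```

Such an explicit edge list is enough to drive the state-estimation machinery sketched later in the talk.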

Slide 9: Compositional specifications with internal (unobservable) actions (components A, B, C composed).


Slide 11: Modeling assumptions on the environment. Compose the specification with a model of the environment and export the interactions between them (make them observable).

Slide 12: Simple example 2. Automaton: input a? under guard x ≤ 10, resetting x := 0; then output b! under guard x ≤ 4. "Output b at most 4 time units after receiving input a, provided a is received no later than 10 time units." Constraints on the inputs model assumptions; constraints on the outputs model requirements.

Slide 13: Simple example 2, compositionally. Environment automaton: output a! under guard y ≤ 10, resetting y := 0. Specification automaton: input a? resets x := 0; output b! under guard x ≤ 4. A compositional modeling of the same example.

Slide 14: Plan of talk. Specification model; conformance relation; analog and digital tests; test generation; tool and case studies.

Slide 15: Conformance relation: tioco. A timed extension of Tretmans' ioco (input-output conformance relation). Informally, A tioco B if every output of the implementation, including time delays, is allowed by the specification. A: implementation/SUT (input-complete). B: specification (not necessarily input-complete, so it can model environment assumptions).

Slide 16: Conformance relation, formally. A tioco B (A: implementation, B: specification) iff ∀σ ∈ Traces(B): out(A after σ) ⊆ out(B after σ).

Slide 17: where A after σ = {s | ∃ρ ∈ Seq: s0 -ρ-> s and proj(ρ, Obs) = σ}, and out(S) = delays(S) ∪ outputs(S).

Slide 18: where outputs(S) = {a ∈ Outputs | ∃s ∈ S: s -a->} and delays(S) = {t ∈ R | ∃s ∈ S, ∃ρ ∈ UnobsSeq: time(ρ) = t and s -ρ->}.
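The out-set inclusion can be made concrete on slide 8's example with a toy, integer-time sketch (an assumption of this illustration: dense time and unobservable moves are ignored, which the real definition handles with symbolic machinery). Each model is summarized by the delays t after input a at which it can emit b:

```python
def out_after(can_output_b, t):
    """Toy out-set t time units after input a: delays plus, possibly, b!."""
    outs = {("delay", d) for d in range(10)}   # delaying is always allowed here
    if can_output_b(t):
        outs.add("b!")
    return outs

spec  = lambda t: t <= 4   # Spec: b! allowed while x <= 4
impl1 = lambda t: t == 4   # Impl that outputs b! exactly at x = 4
impl3 = lambda t: t == 5   # Impl that outputs b! exactly at x = 5

# tioco on this trace family: implementation out-set included in spec's?
conforms = lambda impl: all(out_after(impl, t) <= out_after(spec, t)
                            for t in range(10))
print(conforms(impl1), conforms(impl3))   # True False
```

The first implementation's outputs all fall inside the spec's window, so the inclusion holds at every delay; the second can emit b! at a delay where the spec cannot, breaking the inclusion.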

Slides 19-23: Examples. Spec: on a?, reset x := 0; output b! under guard x ≤ 4 ("Output b at most 4 time units after receiving input a."). Impl 1 outputs b! exactly at x = 4: OK. Impl 2 outputs b! under guard x ≤ 2: OK.

Slides 24-27: Examples (cont.). Same Spec. Impl 3 outputs b! exactly at x = 5: NOT OK (too late). Impl 4 accepts a? but never outputs b: NOT OK.

Slide 28: Plan of talk. Specification model; conformance relation; analog and digital tests; test generation; tool and case studies.

Slide 29: Timed tests. Two types of tests. Analog-clock tests: can measure real time precisely; difficult to implement for real-time SUTs; flexible for discrete-time SUTs with an unknown time step. Digital-clock tests: can only count "ticks" of a periodic clock/counter; implementable for any SUT; conservative (may say PASS when the correct verdict is FAIL).

Slide 30: Timed tests. Analog-clock tests observe real time precisely, e.g., events b, a, c time-stamped 1.3, 2.4, 2.7. Digital-clock (or periodic-sampling) tests only have access to a periodic clock, e.g., the same events observed relative to ticks 1, 2, 3.


Slide 32: Note. Digital-clock tests do not mean we discretize time: the specification is still dense-time; only the capabilities of the observer are discrete-time. Many dense-time traces will look the same to the digital observer (verdict approximation).

33 33 Plan of talk Specification model Conformance relation Analog & digital tests Test generation Tool and case studies

Slide 34: Untimed tests. Can be represented as finite trees ("strategies"): from the root, the tester sends an input i; each possible output o1, o2, o3, o4 leads either to a verdict (pass/fail) or to a further input i1, i2, i3, and so on.

Slide 35: Digital-clock tests. Can also be represented as finite trees: the same shape as untimed trees, with an extra branch for tick, modeling the tick of the tester's clock.

Slide 36: Analog-clock tests. Cannot be represented as finite trees: the possible delay observations (0.1, 0.11, 0.2, ...) form an infinite set of branches. Solution: on-the-fly testing.

Slide 37: On-the-fly testing. Generate the testing strategy during test execution, by symbolic generation. Can be applied to digital-clock testing as well.

Slide 38: Test generation principle. Maintain the current estimate: the set of possible states of the specification. On each observation (event or delay), compute the next estimate from the runs matching the observation. If the estimate becomes empty, FAIL.

Slide 39: Test generation algorithmics. Sets of states are represented symbolically (standard timed-automata technology: DBMs, etc.). Updates amount to a form of symbolic reachability, as implemented in verification tools, e.g., Kronos. IF has more: dynamic clock creation/deletion, activity analysis, parametric DBMs, etc.

Slide 40: Digital-clock test generation. Can be on-the-fly or static, with the same algorithms. Trick: compose the original specification automaton with a "Tick" automaton (a self-loop tick! under guard z = 1 with reset z := 0) to obtain a new specification automaton in which tick is observable, then generate an "untimed" tester. Other "Tick" automata can also model clock skew, etc.
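Because the Tick automaton has a single location, its product with the specification is just the specification with a tick! self-loop added at every location. A toy sketch of that construction (the tuple encoding of edges and the function name are assumptions of this illustration):

```python
def with_ticks(spec_edges):
    """Add a tick! self-loop (guard z == 1, reset z := 0) to every
    location of the spec. Edges are (src, action, guard, resets, dst)."""
    locations = {e[0] for e in spec_edges} | {e[4] for e in spec_edges}
    tick_loops = [(loc, "tick!", ("z", "==", 1), ("z",), loc)
                  for loc in sorted(locations)]
    return list(spec_edges) + tick_loops

# Slide 8's spec in the same tuple encoding
spec = [("l0", "a?", None, ("x",), "l1"),
        ("l1", "b!", ("x", "<=", 4), (), "l0")]
print(with_ticks(spec))
```

On the enlarged automaton, tick is an ordinary observable output, so the untimed tree-generation algorithms of slide 35 apply unchanged.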

Slide 41: Recent advances. Representing analog-clock tests as timed automata; coverage criteria.

Slide 42: Timed automata testers. On-the-fly testing needs to be fast: the tester reacts in real time with the SUT, but reachability computations can be costly. Can we instead generate a timed-automaton tester? The problem is undecidable in general, because timed automata are not determinizable. Pragmatic approach: fix the number of clocks of the tester and their reset positions, then synthesize the rest (locations, guards, etc.).

Slide 43: Timed automata testers, example. Spec: on a?, reset x := 0; output b! under guard 1 ≤ x ≤ 4. Tester: send a!, tracking x; receiving b? with 1 ≤ x ≤ 4 yields PASS, while b? with x < 1 or waiting beyond x > 4 yields FAIL.

Slide 44: Coverage. A single test is not enough, but an exhaustive test suite up to a given depth explodes: the number of tests grows exponentially. Coverage criteria give few tests with some guarantees. Various criteria: location (cover the locations of the specification), edge (cover its edges), state (cover its states: location plus clock values). Algorithms are based on the symbolic reachability graph, and performance can be impressive: 8 tests instead of 15000.
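A reduction like "8 instead of 15000" comes from selecting tests by what they cover. Here is a toy greedy selection for edge coverage (one plausible heuristic, not the algorithm of the paper; the test names and edge sets are hypothetical):

```python
def select_for_edge_coverage(tests):
    """Greedy suite selection. tests: dict mapping a test name to the
    set of specification edges that test exercises."""
    uncovered = set().union(*tests.values())
    suite = []
    while uncovered:
        # pick the test covering the most not-yet-covered edges
        best = max(tests, key=lambda name: len(tests[name] & uncovered))
        if not tests[best] & uncovered:
            break                      # remaining edges are unreachable
        suite.append(best)
        uncovered -= tests[best]
    return suite

tests = {"t1": {"e1", "e2"}, "t2": {"e2", "e3"},
         "t3": {"e3"}, "t4": {"e4"}}
print(select_for_edge_coverage(tests))   # ['t1', 't2', 't4']
```

Three tests cover all four edges here; t3 is redundant and is dropped, which is the kind of pruning that shrinks exhaustive suites so dramatically.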

Slide 45: Plan of talk. Specification model; conformance relation; analog and digital tests; test generation; tool and case studies.

Slide 46: Implementation. Implemented on top of the IF environment. TTG: Timed Test Generation.

Slide 47: Tool. Input language: IF timed automata, with dynamic creation of processes/channels; synchronous/asynchronous communication; priorities, variables, buffers, external data structures, etc. Tool options: generate an analog tester (or monitor); generate a digital test/monitor suite, either interactively (user-guided), exhaustively up to a given length, or by coverage (current work).

Slide 48: Real-time monitoring/testing. In testing, the tester feeds inputs to the SUT and observes its outputs, issuing verdicts (pass/fail). In monitoring, a monitor only observes the SUT's inputs and outputs and issues the same verdicts.

Slide 49: A sample test generated by TTG.

Slide 50: Case studies. A number of simple examples were tried out, e.g., a simple light controller: 15000 digital tests up to depth 8, yet 8 tests suffice to cover the specification. A larger example: the NASA K9 Rover executive. SUT: 30000 lines of C++ code; TA specifications generated automatically from mission plans; monitors generated automatically from the TA specs; traces generated by NASA and tested by us.

Slide 51: Papers. 1. Krichen, Tripakis, "Black-box conformance testing for real-time systems", SPIN'04, LNCS 2989. 2. Bensalem, Bozga, Krichen, Tripakis, "Testing conformance of real-time applications by automatic generation of observers", Runtime Verification'04, ENTCS. 3. Krichen, Tripakis, "Real-time testing with timed automata testers and coverage criteria", submitted.

Slide 52: Thank you! Questions?

