
1 Separating Test Execution from Test Analysis
StarEast 2011
Jacques Durand (Fujitsu America, Inc.), jdurand@us.fujitsu.com

2 Separating Test Execution from Test Analysis
- The case for "2-phase" testing
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the tamelizer tool (open source)

3 A Bit of Background
- These ideas come from testing service-oriented environments.
- Testing a service vs. testing a component:
  - A service is a "contract": the "right" use gives you good "service".
  - A service is reusable, but with a lot of variation in its contexts of use.
  - An invocation uses a stack of technologies/standards, the integration of which needs to be included in testing.

4 A Bit of Background (2)
- It became obvious that:
  - There are far more ways of using a service than you can afford to test, so you need to make the most of each service invocation.
  - Understanding the context of use is key to understanding failures, so you need to factor in prior invocations and the other services and middleware the service is composed with.
- Application building looks more and more like assembling services: reused components, mash-ups...

5 System Testing Today (in general)
- Test suite = a workflow of test cases.
- Test case = test execution + analysis + reporting.
[Diagram: test cases 1, 2, and 3 each call an operation (opA, opB, opC) on the SUT through its API, then write their verdicts to the test report: opA FAIL, opB FAIL, opC PASS.]

6 Test-Case-Level Integration of Execution and Analysis
- Each test case does the following:
  - Prepare the test (input data...)
  - Call operation XYZ on the system under test
  - Compare output data with reference data
  - Report error / success
- This mixes two different activities: test execution and test analysis.
- Why is this NOT great?

7 Issues with Each Test Case Doing It All (Execution + Analysis + Reporting)
- Under-reporting:
  - A test case is designed to test ONE feature, but needs accessory use of OTHER features.
  - It is designed to detect and report failures of the "main" feature, NOT of the "accessory" features.
- Mis-diagnosis:
  - Accessory features are assumed to work, but may fail, causing the test case to fail.
  - A FAIL is then reported for the "main" feature under test.

8 Main Feature Under Test and Accessory Features
- Example: "array structure" under test; test case 123 is testing the "sum" function (main feature).
- Sequence of operations:
  (1) Create the array, e.g. size 20 (accessory)
  (2) Set each array entry to some value (accessory)
  (3) Calculate the "sum" of all values (main)
  (4) Compare the output of (3) with the reference sum (analysis)
- What if "set" does not work above index 10?
  - The sum test case will FAIL (mis-diagnosis), as the sketch below illustrates.
  - The "set" failure will not be reported by test 123 (under-reporting).
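To make the failure mode concrete, here is a minimal Python sketch of such a do-it-all test case (the names sut, create, set, and sum are hypothetical; the talk shows no code):

```python
# Hypothetical sketch of a "streaming" (do-it-all) test case for "sum".

def test_case_123_streaming(sut):
    arr = sut.create("ABC", size=20)        # accessory feature
    for i in range(1, 21):
        sut.set(arr, index=i, value=i)      # accessory feature
    total = sut.sum(arr)                    # main feature under test
    expected = sum(range(1, 21))            # reference sum = 210
    # Execution and analysis are fused: if set() silently fails above
    # index 10, this reports FAIL against "sum" (mis-diagnosis), and the
    # set() failure itself is never reported (under-reporting).
    return "PASS" if total == expected else "FAIL"
```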

9 The Need to Use Accessory Features
- Example: array object under test; test case 122 is testing the "set" function.
- Sequence of operations for test 122:
  (1) Create the array, e.g. size 10 (accessory feature)
  (2) Set each array entry to some value (main feature)
  (3) Read each array entry and make a list (accessory feature)
  (4) Compare the result list with the reference list (analysis)

10 "Streaming" Test Suites: a Workflow of "Do-It-All" Test Cases
[Diagram: test case 100 executes a scenario against the SUT, invoking its main feature under test along with accessory features (F2, F3, F4), then analyzes the output and writes a single FAIL entry for the main feature to the test report.]

11 Separating Test Execution from Test Analysis
[Diagram: Phase 1 runs test scenarios (test operations) against the SUT, producing an execution report. Phase 2 applies test assertions (test analysis) to the execution report, producing the test report.]

12 2-Phase Test Suites vs. Streaming Test Suites
- In a streaming test suite:
  - Test suite = a workflow of test cases.
  - Each test case has a main feature in focus.
  - Each test case executes AND analyzes.
  - Each test case produces a validation report item (pass/fail).
- In a 2-phase test suite:
  - Phase 1 = a workflow of test cases; Phase 2 is a separate, global analysis phase.
  - Every SUT feature still needs to be exercised, but there is no "main feature" to report on.
  - A test case executes a test scenario with NO analysis; it produces an execution report item (an operation trace).

13 Use Case: WS-Interoperability Testing (Test Tools Overview)
[Diagram: in Phase 1, test scenarios drive client code against the web service under test (SUT 1, possibly SUT 2); a MONITOR (interceptor + logger) captures message artifacts into an execution report. In Phase 2, an ANALYZER processes the execution report into the test report.]

14 Testing EVERY Operation Used (Accessory or Main) in EVERY Test Case
- Example: test the "create" array operation; all initial values must be 0.
- "Tracing" the create operation:
  (1) Trace input (nm="ABC", sze="10")
  (2) Create ABC[10]
  (3) Read ABC[1], ABC[2] ... ABC[10]
  (4) Trace output
- Resulting execution report item: Create ABC: (Sze = 10; Out = (1,0)(2,0)...(10,0))

15 Testing EVERY Operation Used (Accessory or Main) in EVERY Test Case (2)
- "Tracing" the set operation:
  (1) Trace input (index="2", val="50")
  (2) Set array entry 2 to the input value 50
  (3) Read ABC[2]
  (4) Trace output
- Resulting execution report item: Set ABC: (Ind = 2; Val = 50; Out = (2,50))

16 "Tracing" Each Operation
[Diagram: test case 100 executes features F1-F4 on the SUT through a library of test wrappers; each wrapper call (Trace(F1), Trace(F2), Trace(F3), ...) appends a trace to the execution report.]
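As a rough illustration, a wrapper library along these lines could look like the following Python sketch (the trace function and report format are assumptions, not code from the talk):

```python
# Hypothetical tracing wrapper: it executes an SUT operation, records the
# inputs and the observed outputs, and performs NO analysis.

execution_report = []  # Phase 1 output: a list of operation traces

def trace(op_name, operation, **inputs):
    """Invoke an SUT operation and append its trace to the execution report."""
    out = operation(**inputs)
    execution_report.append({"op": op_name, "in": inputs, "out": out})
    return out

# Usage, mirroring slides 14-15 (sut.create and sut.set are hypothetical):
# trace("Create", sut.create, nm="ABC", sze=10)
# trace("Set", sut.set, index=2, value=50)
```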

17 Rewriting Test Case 123 for Phase 1 of a 2-Phase Test Suite
- Test case 123 executes (does NOT verify) the "sum" function.
- Sequence of operations:
  (1) Trace(Create the array ABC, size 11)
  (2) Trace(Set each array entry, from -5 to +5)
  (3) Trace(Sum of all values) (should be 0)
- Resulting execution report:
  Create ABC: (Sze = 11; Out = (1,0)...(11,0))
  Set ABC: (Ind = 1; Val = -5; Out = (1,-5))
  Set ABC: (Ind = 2; Val = -4; Out = (2,-4))
  ...
  Set ABC: (Ind = 11; Val = 5; Out = (11,5))
  Sum ABC: (Out = 0)
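Using the hypothetical trace wrapper sketched at slide 16, the Phase 1 version of test case 123 might read:

```python
# Phase 1 version of test case 123: pure execution and tracing, no analysis.

def test_case_123_phase1(sut):
    trace("Create", sut.create, nm="ABC", sze=11)
    for i, v in enumerate(range(-5, 6), start=1):   # 11 values, -5 .. +5
        trace("Set", sut.set, index=i, value=v)
    trace("Sum", sut.sum)   # should be 0, but NOT checked here
    # No PASS/FAIL verdict: the execution report is handed to Phase 2.
```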

18 "2-Phase" Test Suites
[Diagram: in Phase 1, test case 100 executes features F1-F4 on the SUT, tracing each operation (Trace F1, F2, F3, ...) into the execution report. In Phase 2, the ANALYSIS step verifies every trace in the execution report and writes the failures (e.g., F3 FAIL) to the test report.]

19 SUT Feature Coverage
[Diagram: test cases 1-4 exercising features A-D. In a "streaming" test suite, each test case reports only on its main feature; the accessory features it uses are exercised but NOT reported. In a 2-phase test suite, every exercised feature is also reported on.]

20 Phase 2: Test Analysis
- Verifying the set operation = verifying EVERY Set trace, from every test case (addresses under-reporting).
- Verifying the sum operation = verifying each Sum trace only after its Create and Set traces have been verified (addresses mis-diagnosis); see the sketch below.
- Input: the execution report from Phase 1, e.g.:
  Create ABC: (Sze = 11; Out = (1,0)...(11,0))
  Set ABC: (Ind = 1; Val = -5; Out = (1,-5))
  Set ABC: (Ind = 2; Val = -4; Out = (2,-4))
  ...
  Set ABC: (Ind = 11; Val = 5; Out = (11,5))
  Sum ABC: (Out = 0)
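A minimal sketch of this ordered analysis, over the hypothetical trace format used in the earlier sketches (the real WS-I analyzer works on XML, as later slides show):

```python
# Phase 2 sketch: verify EVERY Set trace first, then the Sum traces.
# (Create traces would be verified the same way; omitted for brevity.)

def analyze(report):
    results = []
    sets_ok = True
    # Verify every Set trace, from every test case.
    for item in (r for r in report if r["op"] == "Set"):
        ok = item["out"] == (item["in"]["index"], item["in"]["value"])
        sets_ok = sets_ok and ok
        results.append(("Set", "PASS" if ok else "FAIL"))
    # Verify Sum only after its accessory traces are verified, so a Sum
    # failure is not mis-attributed: the expected value is derived from
    # the traced Set inputs themselves.
    if sets_ok:
        expected = sum(r["in"]["value"] for r in report if r["op"] == "Set")
        for item in (r for r in report if r["op"] == "Sum"):
            results.append(("Sum", "PASS" if item["out"] == expected else "FAIL"))
    return results
```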

21 Test Analysis: Test(sum) = { create + set + sum }
- Ensuring correct diagnosis:
  - Failure of an accessory feature may cause failure of the test case.
  - Test output is meaningful for the main feature only if no accessory feature failed.
- In streaming test suites it is hard to isolate the cause of a failure: you would need ordering (accessory tests first), kept separate and exhaustive.
- In 2-phase test suites the situation is much better:
  - Only the test ANALYSIS needs to be ordered and kept separate.
  - Accessory features are always tested in their real context of use.

22 Advanced Test Analysis
- Tough diagnosis: sometimes you need to look at several test case outputs to understand which feature failed.
- Dependencies: Test(sum) => Create + Set + Sum; Test(set) => Set + Read.

  Test(sum) | Test(set) | Most likely cause
  OK        | FAIL      | Read FAILED
  FAIL      | OK        | Sum (or Create?) FAILED
  FAIL      | FAIL      | Set (or Read + Sum) FAILED
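The table amounts to a simple lookup over the two verdicts; a toy Python rendering (illustrative only, not from the talk):

```python
# Cross-test diagnosis as a lookup table, mirroring the table above.
DIAGNOSIS = {
    ("OK",   "FAIL"): "Read FAILED",
    ("FAIL", "OK"):   "Sum (or Create?) FAILED",
    ("FAIL", "FAIL"): "Set (or Read + Sum) FAILED",
}

def diagnose(test_sum_verdict, test_set_verdict):
    return DIAGNOSIS.get((test_sum_verdict, test_set_verdict),
                         "no failure detected")
```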

23 Separating Test Execution from Test Analysis
- The case for 2-phase test suites
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the tamelizer tool (open source)

24 Use Case: WS-Interoperability Testing (What's Under Test?)
[Diagram: the SUT spans the web service run-time stack (HTTP, SOAP, XML) and its description artifacts (WSDL, XML schemas). Test scenarios drive client code against the web service; a MONITOR (interceptor + logger) captures messages and meta-data into an execution report, which an ANALYZER turns into the test report.]

25 Use Case: WS-Interoperability Testing
- Several "WS profiles" to be tested; a profile is a way to combine underlying standards (SOAP, HTTP, XML...).
- Phase 1 (for Basic Profile 2.0): about 20 test scenarios; the execution report is an XML-formatted log.
- Phase 2: about 150 executable test assertions (XML + XPath); standard HTML test reports.

26 WS-Interoperability Testing (1)
[Diagram: HTTP message capture collects messages 1 to 100, which are consolidated with the WSDL and schemas (Schema A, Schema B) into a single XML execution report; the analyzer (test assertion engine) then checks that report against the WS-I profiles.]

27 WS-Interoperability Testing (2)
- Basic Profile 2.0, Phase 1: ~20 test scenarios, each producing on average 2 messages in the execution report (~40 messages).
- Phase 2 "coverage": ~150 test assertions (TAs); each message hits an average of 20 TAs, giving ~800 test item reports.
- COMPARE WITH a "streaming" test suite, which would have required executing more test scenarios (ideally 1 per TA), yet would produce only 150 to 300 test items.

29 Separating Test Execution from Test Analysis
- The case for 2-phase test suites
- Case study: the "WS-Interoperability" test suite
- Test analysis with test assertions
- Leveraging XML: the tamelizer tool (open source)

30 Back to Fundamentals: Test Assertions
- A definition: "a testable or measurable expression for evaluating the adherence of [part of] an implementation to a normative statement in a specification" (OASIS Test Assertions Guidelines TC).
- Usually distinct from a test case:
  - Test case = an executable set of test tools, programs, and files.
  - Test assertion = a declarative, logical statement.

31 The Role of Test Assertions
[Diagram: a specification contains normative statements (which specify SUT behavior, APIs, etc.) and a conformance clause that defines a conformance profile. Each test assertion addresses one or more (1..n) normative statements; each test case is derived from one or more (1..n) test assertions. The test suite measures or indicates the conformance of an implementation.]

32 Test Suite Design Methodologies
- Designing a streaming test suite: normative statement -> test assertion -> test case (executable scenario + analysis + reporting).
- Designing a 2-phase test suite: normative statement -> test assertion, from which are derived BOTH the test case (scenario + tracing) AND the executable TA (analysis + reporting).

33 Anatomy of a Test Assertion (according to the OASIS TAG)
- ID
- Normative Source: the normative statement in the specification being addressed
- Target: [part of] an SUT
- Prerequisite: allows conditional execution of the TA
- Predicate: the analysis logic
- Prescription: MUST / SHOULD / MAY

34 Leveraging XML + XPath
[Slide shows the test assertion fields (ID, Normative Source, Target, Predicate, Prescription, Prerequisite) rendered as XML markup, with the Target and Predicate expressed in XPath.]

35 Separating Test Execution from Test Analysis
- The case for 2-phase test suites
- The "WS-Interoperability" test suite use case
- Test analysis with test assertions
- Leveraging XML: the tamelizer tool (open source) + demonstration

36 Test Assertion Markup + XPath = an Executable Test "Rule"
- Core fields: ID, Normative Source, Target (XPath), Predicate (XPath), Prescription level, Prerequisite (XPath), Tags.
- Additional features: reporting (error message, diagnostic details), variables, an ID scheme for target instances, convenient references to other documents.

37 Test Assertion Mark-Up from OASIS
- See: http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=tag
- Simplest evaluation semantics: for every target instance in the input XML document that matches the Target expression, if the Predicate expression is false, report "failed"; else report "passed".
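A sketch of that evaluation loop in Python with lxml; the assertion is held in a plain dict here, whereas the actual OASIS markup is XML (see slide 36):

```python
from lxml import etree

def apply_assertion(doc, ta):
    """Evaluate one test assertion (a dict with 'id', 'target', 'predicate'
    XPath strings, plus an optional 'prerequisite') over an XML report."""
    results = []
    for instance in doc.xpath(ta["target"]):
        pre = ta.get("prerequisite")
        if pre is not None and not instance.xpath(pre):
            continue  # prerequisite not met: TA not applicable here
        verdict = "passed" if instance.xpath(ta["predicate"]) else "failed"
        results.append((ta["id"], verdict))
    return results

# Hypothetical usage against a tiny execution report:
report = etree.fromstring(b"<log><msg code='200'/><msg code='500'/></log>")
ta = {"id": "TA-1", "target": "//msg", "predicate": "@code = '200'"}
print(apply_assertion(report, ta))  # [('TA-1', 'passed'), ('TA-1', 'failed')]
```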

38 Test Analysis with Tamelizer: Generating an Analyzer from a Set of XML Test Assertions
[Diagram: a generator (XSLT 2.0) takes the test assertions (XML + XPath) and generates an analyzer (XSLT 2.0). At testing run-time, the analyzer takes the execution report (XML) as input and produces the test report (XML), which an XSLT 1.0 rendering turns into HTML.]
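Since the generator and the generated analyzer are XSLT 2.0 (which Saxon can run) and the rendering is XSLT 1.0, the whole pipeline reduces to three transforms. A hypothetical driver sketch (all file names are assumptions; tamelizer's actual layout may differ):

```python
# Driving a tamelizer-style pipeline with Saxon-HE from Python.
# File names (generator.xsl, render.xsl, etc.) are hypothetical placeholders.
import subprocess

def xslt(src, sheet, out, saxon_jar="saxon9he.jar"):
    subprocess.run(
        ["java", "-jar", saxon_jar, f"-s:{src}", f"-xsl:{sheet}", f"-o:{out}"],
        check=True)

# Step 1: generate the analyzer (XSLT 2.0) from the test assertions (XML).
xslt("test_assertions.xml", "generator.xsl", "analyzer.xsl")
# Step 2: run the analyzer over the execution report to get the test report.
xslt("execution_report.xml", "analyzer.xsl", "test_report.xml")
# Step 3: render the test report to HTML (XSLT 1.0 rendering sheet).
xslt("test_report.xml", "render.xsl", "test_report.html")
```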

39 Conclusion
- 2-phase test suites separate execution from analysis: more reliable and more "productive" than conventional (streaming) test suites.
- Test assertions are key to test analysis (Phase 2).
- Mature XML processing technology can automate the analysis phase end-to-end.

40 Useful Links
- OASIS Test Assertions Guidelines: http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=tag
- Tamelizer analysis tool (open source): http://code.google.com/p/tamelizer/

41 Separating Test Execution from Test Analysis
StarEast 2011
Jacques Durand (Fujitsu America, Inc.), jdurand@us.fujitsu.com

