1 Overview of ETSI Testing Methodology. ICT-OSA/Parlay Workshop, Brazil, March 2006. Anthony Wiles, Manager, ETSI Protocol and Testing Competence Centre

2 Interoperability is ...
- The ultimate aim of ICT standardisation
- The red thread running through the entire standards development process; it is not an isolated issue
- Not something to be somehow fixed at the end
- The ability of systems and products to interwork is fundamental
ETSI approach:
- ETSI produces Base Standards and Test Specifications with the required degree of parallelism
- Base standards should be designed with interoperability in mind
- Profiles designed to reduce potential non-interoperability
- Protocol conformance and interoperability statements
- Two complementary forms of testing: conformance testing and interoperability testing (formal IOT)
- Testing provides vital feedback into the standardization work

3 ETSI Resources for Interoperability
- Technical Committee MTS (Methods for Testing and Specification): development of methodologies, techniques and languages (including TTCN-3). http://portal.etsi.org
- ETSI's Protocol and Testing Competence Centre (PTCC): supports ETSI committees on the application of formal techniques in standards on a daily basis; development of test specifications (conformance and interop). http://www.etsi.org/ptcc
- ETSI Plugtests Service: validation of standards and prototypes through interoperability events. http://www.etsi.org/plugtests
These resources are highly complementary.

4 Typical Standards Development Process in ETSI
[Diagram: over time, products mature from prototypes to commercial products. ETSI develops the base standards, then Conformance Test Specifications and Interoperability Test Specifications, with iterative feedback. Industry performs (unit) conformance testing, interoperability testing and certification.]

5 Different Kinds of ETSI Test Specifications: Conformance, Robustness, Performance, Interoperability, Network Integration, RF/EMC

6 Progressive Testing
- All engineering disciplines accept that testing should be applied to progressively more complex units, from individual components to entire systems
- This demands a number of different testing solutions; there is no silver bullet
- At the two extremes: just testing the components is not enough, and just testing the system is not enough
- Limitations are economic, not technical: What can I afford to test? What can I afford not to test? What should be covered by standardised test specifications?
- What is the right kind of testing for the job? What kind of thing is being tested: device, system, service? What is the most economic method (or methods)? What resources are (or must be made) available? Tools, test scripts, testbeds etc.

7 Conformance and Interoperability Testing are Complementary
ETSI experience: as you move up a system stack, the emphasis should change from conformance to IOT.
- Lower layer protocols, infrastructure: emphasis on conformance
- Middleware, enablers: combination of conformance + IOT
- Services, applications, systems: emphasis on IOT
Conformance testing is a pre-requisite to IOT.

8 Conformance Testing is Black-Box Testing
[Diagram: the tester stimulates the device and observes its responses at a Point of Control and Observation (PCO), checking them against the requirements.]

9 Conformance Testing Usually Tests Layer by Layer
[Diagram: a test system connected to the SUT, whose stack comprises API/MMI, L3, L2 and PHY; the L3 layer is the IUT, exercised through an Upper Tester and a Lower Tester. SUT = System Under Test, IUT = Implementation Under Test.]

10 Characteristics of Conformance Testing (1)
- Is unit testing: tests a single part of a device (e.g., a protocol layer), often incrementally, layer by layer
- Tests against well-specified requirements: for conformance to the requirements in a base specification or profile (standard); usually limited to one requirement per test
- Tests at a 'low' level: at the protocol (message/behaviour) level (bits), or at a service primitive, API or interface
- Tests over standardised interfaces, which may not be available to the normal user
- Requires a test system (and executable test cases); can be expensive (e.g., radio-based test system)
- Tests under ideal conditions

11 Characteristics of Conformance Testing (2)
- High control and observability means we can explicitly test error behaviour, provoke and test non-normal (but legitimate) scenarios, and extend the tests to include robustness tests
- Usually automated, and tests are repeatable
- Conformance testing is DEEP and NARROW: thorough and accurate but limited in scope
- Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do

12 Limitations of Conformance Testing
- Does not prove end-to-end functionality (interoperability) between communicating systems: conformance-tested implementations may still not interoperate. This is often a specification problem rather than a testing problem! Minimum requirements or profiles are needed
- Does not test a complete system: tests individual system components, not the whole. A system is often greater than the sum of its parts!
- Does not test functionality: does not test the user's perception of the system
- Standardised conformance tests do not include proprietary aspects, though this may well be done by a manufacturer with its own conformance tests for proprietary requirements

13 Conformance Testing Documentation and Process: ISO 9646 Methodology
[Diagram: from the Base Standard (or Profile) are derived a requirements checklist (the ICS), the Test Purposes (TSS & TP) and the Abstract Test Suite (ATS, the test cases). The ATS is compiled into an Executable Test Suite (e.g., C++), which industry validates on a Test System. The product implements the standard as the Implementation Under Test; testing, logging and analysis produce a Test Report.]

14 Implementation Conformance Statement (ICS)
The ICS is a checklist of the capabilities supported by the Implementation Under Test (IUT). It provides an overview of the features and options that are implemented in any given product. The ICS can be used to select and parameterise test cases, and can be used as an indicator of basic interoperability between different products. Conditional support means, for example, that Qn must be supported if Q1 is supported, and otherwise not.

Q#   ICS Question                      Ref            Status        Support
Q1   Support of Feature F1             [x] Clause a   OPTIONAL
Q2   Support of Feature F2             [x] Clause b   MANDATORY
:    :                                 :              :
Qn   Support of Message M1             [y] Clause c   CONDITIONAL
:    :                                 :              :
Qk   Support of message Parameter P1   [y] Clause d   OPTIONAL

15 Profile ICS
A profile ICS reflects how a (protocol) profile restricts the scope of a base standard by making certain options mandatory or excluding certain options.

Q#   ICS Question              Ref            Status        Profile Status   Support
Q1   Support of feature F1     [x] Clause a   OPTIONAL      MANDATORY
Q2   Support of feature F2     [x] Clause b   MANDATORY
:    :                         :              :             :
Qn   Support of Message M1     [y] Clause c   CONDITIONAL   MANDATORY
:    :                         :              :             :
Qk   Support of parameter P1   [y] Clause d   OPTIONAL      N/A

16 Selecting Tests with the ICS
A filled-in ICS can be used to select tests. For example, a test for an OPTIONAL feature which is not supported in a given product will be deselected from the test suite.

Q#   ICS Question                      Ref            Status        Support
Q1   Support of Feature F1             [x] Clause a   OPTIONAL
Q2   Support of Feature F2             [x] Clause b   MANDATORY
:    :                                 :              :
Qn   Support of Message M1             [y] Clause c   CONDITIONAL
:    :                                 :              :
Qk   Support of message Parameter P1   [y] Clause d   OPTIONAL
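As a rough illustration of how ICS-driven selection can look in practice, the sketch below uses TTCN-3 module parameters to stand in for the filled-in ICS answers; the control part then skips test cases for unsupported capabilities. The parameter and test case names are invented for the example, and current TTCN-3 syntax (modulepar, setverdict) is used rather than the notation shown on the older slides.

    module IcsSelectionSketch {

      // Module parameters standing in for the filled-in ICS answers (hypothetical names)
      modulepar boolean PICS_F1 := false;  // Q1: optional Feature F1 supported?
      modulepar boolean PICS_M1 := true;   // Qn: conditional Message M1 supported?

      type port DummyPort message { inout charstring }
      type component TesterComp { port DummyPort P; }

      testcase TC_F1_01() runs on TesterComp { setverdict(pass) }  // exercises Feature F1
      testcase TC_M1_01() runs on TesterComp { setverdict(pass) }  // exercises Message M1

      control {
        // Selection: tests for unsupported capabilities are simply not executed
        if (PICS_F1) { execute(TC_F1_01()) }
        if (PICS_M1) { execute(TC_M1_01()) }
      }
    }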

17 Parameterizing Tests with eXtra Information for Testing (IXIT)
A filled-in IXIT can be used to parameterize tests. The Value column is used to specify explicit IUT or test system dependent values.

Q#   IXIT Question       Ref            Allowed Values     Value
Q1   Network address     [x] Clause a   Valid IP address   12.34.56.76
Q2   Value of Timer T1   [x] Clause b   1..7 s             5 s
:    :                   :              :                  :
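A corresponding sketch of how IXIT values typically reach the test code: in TTCN-3 they usually become module parameters whose concrete values are supplied at execution time. The parameter names and the trivial ping-style behaviour below are invented for illustration only.

    module IxitParameterSketch {

      // Module parameters standing in for the filled-in IXIT values (hypothetical names)
      modulepar charstring PX_NET_ADDRESS := "12.34.56.76";  // Q1: network address
      modulepar float      PX_T1          := 5.0;            // Q2: value of timer T1, in seconds

      type port TextPort message { inout charstring }
      type component TesterComp {
        port TextPort P;
        timer T1;
      }

      testcase TC_PARAM_01() runs on TesterComp {
        P.send("PING " & PX_NET_ADDRESS);  // IUT address taken from the IXIT
        T1.start(PX_T1);                   // timer duration taken from the IXIT
        alt {
          [] P.receive  { setverdict(pass) }
          [] T1.timeout { setverdict(fail) }
        }
      }
    }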

18 Test Purposes (TP)
Test Purposes (TP) are natural-language, precise descriptions of the purpose of a test in relation to a particular (base standard) requirement. They are very focussed. They are not code: a TP is WHAT to test, not HOW to test.

Example:
TP Id: SIP_SS_PR_CE_V_012
Selection: Mandatory [Q2]
Precondition: Registration of both simulated UAs to the IUT and an initiated session
Ref: 5.1.1 [1], 17.3.6 [1], 17.4 [1], 10.46.6 [1]
Purpose: Ensure that the IUT, on receipt of a Server Internal Error (500 Server Internal Error) server failure response, sends an ACK request, with the same Via field and Branch parameter as in the previous INVITE, to the UAS and forwards it to the UAC.

19 Test Suite Structure (TSS)
Test Purposes are grouped into a logical Test Suite Structure (TSS) according to suitable criteria (e.g., basic interconnection, error handling, functionality etc.). The complete document is usually called the Test Suite Structure and Test Purposes (TSS&TP).

Example structure:
Registration
  Registrant
    Valid
    Invalid
  Registrar
  :
Session
  Originating Endpoint
    Call Establishment
      TP_S_OE_CE_01
      TP_S_OE_CE_02
      TP_S_OE_CE_03
      :
    Call Release
    :
  Terminating Endpoint
  Proxy
  Redirect Server
  :

20 Abstract Test Suite (ATS)
- A Test Case is the implementation of a test purpose: it is the HOW to test, not the WHAT to test
- The Abstract Test Suite (ATS) is the collection of all Test Cases
- Most ETSI Test Suites are written in TTCN: predominantly in TTCN-2, with a progression to TTCN-3 for new work; not RF testing (generally not the physical layer)
- TTCN-3 (Testing and Test Control Notation) allows tests to be specified in detail (test code) independently of the (eventual) real test system on which they may be run, i.e., TTCN-3 will run on any test system that supports a standardised TTCN-3 execution environment
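To make the TP-to-test-case relation concrete, here is a hypothetical, heavily simplified TTCN-3 sketch of a test case implementing a purpose like the one on slide 18. The record type, templates, component and branch value are all invented; a real SIP ATS is far more detailed.

    module SipAckSketch {

      // Invented, simplified SIP message type: just enough for the example
      type record SipMsg {
        charstring method,              // "INVITE", "ACK", or "" for responses
        integer    statusCode optional,
        charstring viaBranch
      }

      type port SipPort message { inout SipMsg }
      type component SipTester {
        port SipPort UAS;               // port playing the simulated UAS
        timer Tack := 5.0;
      }

      template SipMsg m_response500(charstring branch) := {
        method     := "",
        statusCode := 500,
        viaBranch  := branch
      }
      template SipMsg mw_ack(charstring branch) := {
        method     := "ACK",
        statusCode := omit,
        viaBranch  := branch            // same Via branch as the previous INVITE
      }

      testcase TC_SIP_SS_PR_CE_V_012() runs on SipTester {
        // Precondition steps (registration, INVITE) omitted for brevity
        var charstring inviteBranch := "z9hG4bK-0001";
        UAS.send(m_response500(inviteBranch));   // simulated UAS returns 500
        Tack.start;
        alt {
          [] UAS.receive(mw_ack(inviteBranch)) { setverdict(pass) }
          [] UAS.receive                       { setverdict(fail) }
          [] Tack.timeout                      { setverdict(fail) }
        }
      }
    }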

21 Executable Test Suites
- Executable test suites are compiled from the abstract test suite, either directly to binary or, more commonly, to some intermediate programming language (C++, Java, etc.)
- ETSI does not deliver executables, but we try to ensure that all code can be compiled, and validated by executing the tests against a real implementation on at least one test system (5 test systems in the case of UMTS, for example)

22 Interoperability Testing
End-to-end interoperability of devices (Device 1, Device 2, ... Device n).
- What is being tested?
- Assignment of verdicts?
- Need to distinguish between formal interoperability testing and informal interoperability events used to validate/develop technologies

23 Interoperability Testing Tests Functionality Between Devices
[Diagram: the QE and the EUT, each with a full stack (API/MMI, L3, L2, PHY), form the SUT and are connected over a Means of Communication (MoC). Interoperability testing (of terminals) is performed end to end, optionally with conformance monitoring of the MoC. QE = Qualified Equipment, EUT = Equipment Under Test.]

24 Characteristics of Interoperability Testing
- Is system testing: tests a complete device or a collection of devices
- Shows that (two) devices interoperate, albeit within a limited scenario
- Tests at a high level (as perceived by users)
- Tests the whole, not the parts, e.g., protocol stacks + applications
- Tests functionality: shows that a function is accomplished (but not how)
- Does not necessarily require a test system; uses existing interfaces (standard/proprietary)
- Interoperability testing is BROAD and SHALLOW: less thorough but wide in scope
- Gives a high level of confidence that devices (or components in a system) will interoperate with other devices (components)

25 Limitations of Interoperability Testing
- Does not prove interoperability with other implementations with which no testing has been done: A may interoperate with B, and B may interoperate with C, but it doesn't necessarily follow that A will interoperate with C (combinatorial explosion)
- Does not prove that a device is conformant: devices may still interoperate even though they are non-conformant
- Cannot explicitly test error behaviour or unusual scenarios, or other conditions that may need to be forced (lack of controllability)
- Has limited coverage (does not fully exercise the device)
- Not usually automated and may not be repeatable

26 Interoperability Test Methodology: ETSI TS 102 237-1
Uses the best principles of ISO 9646 and adapts them for interoperability testing.
- Definitions: EUT (Equipment Under Test), QE (Qualified Equipment)
- Concepts: IFS (Interoperable Functions Statement, like the PICS); generic testing architecture; Test Descriptions (in prose, tabular format); if suitable interfaces (API/MMI) are exposed, the tests can be automated by running TTCN-3 scripts
- Processes: basic test specification development process

27 Interoperability Testing Documentation and Process: TS 102 237 Methodology
[Diagram: from the Base Standard (or Profile) are derived the IFS (functional checklist), the Interop Test Purposes and the Interop Test Cases. Two implementations of the standard take part: Product 1 as the Qualified Equipment and Product 2 as the Equipment Under Test. Testing, logging and analysis produce a Test Report.]

28 Test Purpose Language (TPLan)
- An ETSI language, not necessarily restricted to IOP testing
- Keywords and syntax provide clear and consistent structure
- Keywords chosen for communications applications (sends, receives etc.)
- Text between keywords is not part of the syntax, so free expression is possible
- A TP's basic structure (corresponding keyword): header (TP id), pre-conditions (with), stimulus (when), expected response (then)

29 TPLan Example for Interoperability

TP id   : TP_COR_1719_02
Summary : 'EUT sends packet to All-RT LL M/cast address'
RQ ref  : RQ_COR_1719
Config  : CF_021_I
TD ref  : TD_COR_1719_02

with
  { QE1 'configured as a router' and
    QE2 'configured as a router' }
ensure that {
  when { EUT is requested to 'send packet to All-Routers Link-Local M/cast addr' }
  then { QE1 indicates 'receipt of the packet' and
         QE2 indicates 'receipt of the packet' }
}

30 Interoperability Test Descriptions
- Specify the detailed steps to be followed to achieve the stated test purpose
- Steps are specified clearly and unambiguously, without unreasonable restrictions on the actual method. Example: "Answer incoming call", NOT "Pick up telephone handset"
- Written in a structured and tabulated natural language so that tests can be performed manually
- Can be automated using TTCN-3 when the EUT has software interfaces

31 Example Test Description

Test: COR_001_02
Selection Criteria: Mandatory      Selected: Yes
Title: Auto-configure QE using a unique address in a simple network
Test Purpose: ensure that { when { etc. ...
Pre-test conditions:
- Configure the equipment to ensure that QE1 will not intend to use the same link-local address as the one used by the EUT (EUT and QE1 interfaces carry different interface identifiers and both run stateless autoconfig, or the EUT link-local address is configured manually)
- EUT interface is up and carries a link-local address
- QE1 interface is down

Step   Test description                                             Verdict (Pass/Fail)
1      Turn on QE1 interface and wait a few seconds
2      EUT sends an echo request to the Link-Local address of QE1
3      Check: does EUT receive an echo reply from QE1?              Yes/No
4      QE1 sends an echo request to the Link-Local address of EUT
5      Check: does QE1 receive an echo reply from EUT?              Yes/No
Observations:

32 Automated IOP Testing Using TTCN-3
[Diagram: a TTCN-3 Test Driver attached to each item of equipment (e.g., via AT commands), coordinated by a TTCN-3 Test Controller.]
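As a rough sketch of what such automation can look like, the TTCN-3 fragment below drives steps 2 and 3 of the test description on slide 31 through an assumed software interface on the EUT side. The message types, port, parameter and addresses are all invented for the illustration; a real IPv6 interop suite would use proper message definitions and one driver per node.

    module AutomatedIopSketch {

      // Invented message types for the echo exchange
      type record EchoRequest { charstring dstAddress }
      type record EchoReply   { charstring srcAddress }

      type port IcmpPort message { out EchoRequest; in EchoReply }

      type component NodeDriver {
        port IcmpPort ICMP;
        timer Twait := 5.0;
      }
      type component SystemIf {
        port IcmpPort EUT_IF;   // software interface assumed to be exposed by the EUT
      }

      // IXIT-style parameter: link-local address of QE1 (invented value)
      modulepar charstring PX_QE1_LL_ADDR := "fe80::2";

      testcase TC_COR_001_02_steps2_3() runs on NodeDriver system SystemIf {
        map(self:ICMP, system:EUT_IF);
        // Step 2: EUT sends an echo request to the link-local address of QE1
        ICMP.send(EchoRequest: { dstAddress := PX_QE1_LL_ADDR });
        Twait.start;
        // Step 3: check that the EUT receives an echo reply from QE1
        alt {
          [] ICMP.receive(EchoReply: { srcAddress := PX_QE1_LL_ADDR }) { setverdict(pass) }
          [] Twait.timeout { setverdict(fail) }
        }
      }
    }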

33 Network Integration Testing (NIT)
End-to-end testing of the system: the components in the network must interoperate.
[Diagram: two test drivers attached at either end of the SUT, which includes SIP, RADIUS, DIAMETER and PoC components.]

34 Example NIT for IMS


36 TTCN-3 can be used for Conformance and Interoperability Testing

37 Benefits of TTCN-3
- Specifically designed for testing; concentrates on the test, not the test system
- Commonly understood syntax and operational semantics
- Constantly maintained and developed
- Off-the-shelf tools and TTCN-based test systems are readily available
- A single language for many (all?) testing activities: education and training costs can be rationalized, and maintenance of test suites (and products) is easier
- Allows the application of a common methodology and style, both on a corporate level and within standardization

38 Main Capabilities of TTCN-3
- Dynamic concurrent testing configurations
- Various communication mechanisms (synchronous and asynchronous)
- Data and signature templates with powerful matching mechanisms (including regular expressions)
- Specification of encoding information
- Display and user-defined attributes
- Test suite parameterization
- Control of test case execution and selection mechanisms
- Control of complex test configurations
- Assignment and handling of test verdicts
- Harmonized with ASN.1 (XML and IDL coming)
- Different presentation formats
- Well-defined syntax, static and operational semantics
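To give a feel for the matching mechanisms mentioned above, the sketch below defines a receive template combining a value list, a character pattern and a wildcard for an optional field. The type and field names are invented for the example.

    module MatchingSketch {

      type record Response {
        integer    code,
        charstring reason,
        charstring tag optional
      }

      // Receive template: value list for the code, character pattern for the
      // reason phrase, and "any value or omitted" for the optional tag field
      template Response mw_success := {
        code   := (200, 202),
        reason := pattern "*OK*",
        tag    := *
      }

      type port RespPort message { inout Response }
      type component TesterComp {
        port RespPort P;
        timer Tresp := 5.0;
      }

      testcase TC_MATCH_01() runs on TesterComp {
        Tresp.start;
        alt {
          [] P.receive(mw_success) { setverdict(pass) }
          [] P.receive             { setverdict(fail) }   // any other response
          [] Tresp.timeout         { setverdict(inconc) }
        }
      }
    }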

39 The Core Language and Other Presentation Formats
- The core format is text-based (and the most popular)
- TTCN-3 can be edited or viewed in other formats: a tabular format (for TTCN-2 people) and a graphical format (good for a visual overview)
- Other standardized formats in the future? Proprietary formats are possible
[Diagram: the TTCN-3 Core Language (text format) with the Tabular Format, Graphical Format and further Presentation Formats 3..n layered on top.]

40 Example Core (Text) Format

function PO49901(integer FL) runs on MyMTC {
  L0.send(A_RL3(FL, CREF1, 16));
  TAC.start;
  alt {
    [] L0.receive(A_RC1((FL + 1) mod 2)) {
         TAC.stop;
         setverdict(pass)
       }
    [] TAC.timeout      { setverdict(inconc) }
    [] any port.receive { setverdict(fail) }
  }
}

41 Example Graphical Format

42 Example Tabular Format

43 Use of TTCN-3 With Other Languages
TTCN-3 can be integrated with other languages: it is fully harmonized with ASN.1 (1997) and harmonized with other languages such as IDL, XML and C/C++.
[Diagram: the TTCN-3 Core Language importing ASN.1, IDL, XML and other types and values.]

44 Major Elements of a TTCN-3 Test Suite
- Test data types: definition of data types for protocol messages and information elements (fields, parameters), internal data (e.g., for computation), test coordination messages etc. Predefined simple types (integer, boolean, float, bitstring, hexstring, octetstring, charstring, universal charstring), complex types (record, record of, set, set of, union, enumerated) and special types such as verdict and address.
- Actual test data: the values used during testing; message templates, (remote) signature templates, matching mechanisms (values, lists, wildcards, regular expressions etc.), encoding information.
- Test system architecture: test components, test ports, connections between test components and to the System Under Test, static and dynamic configurations.
- Test behaviour: test cases, test steps, default behaviour, management of test components, sending/receiving messages, verdict assignment, computation (e.g., checksums), timers and timeouts, test execution and control etc.

45 Simple TTCN-3 Architecture
[Diagram: a TTCN-3 based test system connected to the black-box Implementation Under Test (IUT) through a test port at a Point of Control and Observation (PCO); the test system sends stimuli and observes responses.]

46 TTCN-3 Architecture: Test Components
[Diagram: inside the test system, a Main Test Component (MTC) and Parallel Test Components (PTCs) communicate internally with each other and externally with the IUT.]
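A small sketch of how such a configuration is set up in TTCN-3: the MTC creates a PTC, connects a coordination port between them, maps the PTC's port to the system interface towards the IUT, and starts the PTC's behaviour. All names and the trivial message exchange are invented for the example.

    module ComponentConfigSketch {

      type port CoordPort message { inout charstring }   // internal MTC<->PTC coordination
      type port IutPort   message { inout charstring }   // communication towards the IUT

      type component MtcType    { port CoordPort CP; }
      type component PtcType    { port CoordPort CP; port IutPort P; }
      type component SystemType { port IutPort P1; }

      function f_ptcBehaviour() runs on PtcType {
        P.send("HELLO");    // talk to the IUT
        CP.send("done");    // report back to the MTC
      }

      testcase TC_CONFIG_01() runs on MtcType system SystemType {
        var PtcType ptc := PtcType.create;
        connect(mtc:CP, ptc:CP);   // internal communication
        map(ptc:P, system:P1);     // external communication to the IUT
        ptc.start(f_ptcBehaviour());
        CP.receive("done");
        ptc.done;
        setverdict(pass);
      }
    }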

47 TTCN-3 Module

module Example {
  // Definitions part

  control {
    // Control part
  }
} with { encode "BER" }

A module consists of a definitions part, an optional control part, and optional attributes.

48 Definitions Part

module Example {
  group TestArchitecture {
    // Port and Test Component definitions
  }
  group MessageTypes {
    // Definitions of message types
  }
  group ActualMessages {
    // Templates for instances of actual messages
  }
  group TestCases {
    // Test Case definitions
  }
}

Groups are optional; the same definitions can also be placed directly in the module:

module Example {
  // Port and Test Component definitions
  // Definitions of message types
  // Templates for instances of actual messages
  // Test Case definitions
}

49 Specifying Test Messages

type record MsgType {
  integer    msgId,
  charstring msgName,
  charstring msgInfo
}

template MsgType INVITE := {
  msgId   := 0,
  msgName := "INVITE",
  msgInfo := "For a good Brazilian Dinner"
}

template MsgType ACCEPT := {
  msgId   := 1,
  msgName := "ACCEPT",
  msgInfo := ?
}

50 TTCN-3 Behaviour
The test component sends an INVITE on port P and expects either ACCEPT or DECLINE in return.

testcase TC1() runs on PTC1 {
  P.send(INVITE);
  T1.start(30E3);
  alt {
    [] P.receive(ACCEPT)  { setverdict(pass) }
    [] P.receive(DECLINE) { setverdict(inconc) }
    [] P.receive          { setverdict(fail) }
    [] T1.timeout         { setverdict(fail) }
  }
}

51 The Control Part

module Example {
  // Definitions part
  modulepar boolean Q1;
  // :

  control {
    if (Q1) {
      execute(TC1())
    }
    // :
  }
} with { encode "BER" }

Module parameters (such as Q1) can be used in the control part to select which test cases are executed.

52 TTCN-3 Test System
[Diagram: the TTCN-3 test suite (source) from industry is compiled into an executable test suite, which is parameterised and selected using the PICS etc. (requirements catalogue). The executable is connected to the SUT through adaptation layers, a protocol-specific encoder/decoder and the underlying (N-1) protocol stack, and to the user's control and logging; the TTCN-3 Runtime Interface (TRI) and TTCN-3 Control Interface (TCI) define these interfaces.]

53 TTCN-3 Standards (http://www.ttcn-3.org)
- ES 201 873-1 (Z.140): TTCN-3 Core Language
- ES 201 873-2 (Z.141): TTCN-3 Tabular Presentation Format (TFT)
- TR 101 873-3 (will eventually be ES 201 873-3) (Z.142): TTCN-3 Graphical Presentation Format (GFT)
- ES 201 873-4 (Z.143): TTCN-3 Operational Semantics
- ES 201 873-5: TTCN-3 Runtime Interface (TRI)
- ES 201 873-6: TTCN-3 Control Interfaces (TCI)
- ES 201 873-7 and upwards: using ASN.1, XML, IDL, C/C++ with TTCN-3

54 Concluding Thoughts (1)
In the past, interoperability failures meant:
- Bad publicity in trade magazines
- Embarrassment for the manufacturer
- Annoyance of the end customer
Today, interoperability failures in the field mean:
- Front-page headlines in the Financial Times
- A fall in the manufacturer's stock price
- Loss of investor confidence
- Unrecoverable damage to the brand name
- Irretrievable loss of customers
We can no longer afford to get it wrong! Testing is part of ETSI's strategy to get it right!

55 Concluding Thoughts (2)
ETSI membership believes in methodical testing.
- A good example: GSM/UMTS is growing by 1 million users per day! Can you imagine what would happen if users were experiencing interoperability problems? More than 1 million euros are invested per year in writing formal test specifications and test suites. We have experienced what it means to get it right!
- A bad example: we have not always got it right! For GPRS we did not take a formal approach to testing, and the first products on the market did not interoperate.
We have learnt that testing is not cheap, but it pays in the long run.

56 Thank You!
Anthony Wiles, ETSI Protocol and Testing Competence Centre
Phone: +33 (0)4 92 94 43 26
E-mail: ptcchelp@etsi.org
Website: www.etsi.org/ptcc

