1 Performance Measurements of IMS systems, Concepts (© Copyright 2008, SoftWell Performance AB)

2 Background to this presentation and proposal

3 Background: A key to continued success (IMS)
Quality needs to be defined and measured. The involved standards (IMS) should include definitions of:
- Quality metrics
- Measurements of quality metrics
- Evaluation and verification of measured quality figures
Metrics should include:
- Product characteristics
- Production characteristics
This was our message in a panel discussion at VON Rome.

4 Background: Why should performance testing have a standard?
General reasons:
- We gain a general understanding of what should be regarded as performance
- We will get a standard vocabulary for performance-related issues
- We will have one interpretation of every metric
- We will get a standard notation for performance test cases
- The Babylon of performance testing will come to an end
Practical reasons:
- Different tests performed according to the standard can be compared
- Performance test tools can be certified
- Performance test cases will be transferable between test tools that comply with the standard
- Standardized performance test case descriptions will be easier to maintain
- It becomes possible to define and certify performance test suites
- Regression testing is simplified (set-up and comparison of results)
The goal: to expand the notation of TTCN-3 to cover performance testing. This presentation is a short introduction to performance testing in general, with extensions to IMS and its architecture. We hope to initiate a performance task.

5 Performance Objectives & Performance Metrics

6 Performance Objectives
Three classes of performance objectives:
- Powerful: can the test object deliver what is required?
- Reliable: can the test object maintain services as required?
- Efficient: is the test object designed to be powerful and reliable?
[Diagram: the measured object viewed from three aspects: Powerful, Reliable, Efficient]

7 Performance Metrics: Metrics describing powerfulness (delivery aspects)
- Capacity: performance measurements of output (throughput), input capacity (peak load) and concurrency (transactions concurrently being processed)
- Speed of operation: performance measurements of time for tested services: response time (time to deliver a response to a request), latency time (time to process or react), round-trip time
- Scalability: performance measurements of improvements gained by adding more hardware or faster hardware
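As a concrete illustration of these delivery metrics, the sketch below derives throughput, response-time percentiles and peak concurrency from a log of request start and end times. This is not from the original slides; the log format and the numbers are assumptions made for illustration only.

```python
# Minimal sketch: deriving throughput, response-time percentiles and peak
# concurrency from a log of (start_time, end_time) pairs per request.
# The log format and values are illustrative assumptions.
from statistics import quantiles

# (start, end) timestamps in seconds for each completed request
request_log = [(0.00, 0.12), (0.05, 0.31), (0.10, 0.18),
               (0.40, 0.95), (0.50, 0.61), (0.90, 1.10)]

durations = [end - start for start, end in request_log]
test_window = max(e for _, e in request_log) - min(s for s, _ in request_log)

throughput = len(request_log) / test_window        # completed requests per second
cuts = quantiles(durations, n=100)
p50, p95 = cuts[49], cuts[94]                      # median and 95th percentile

# Peak concurrency: maximum number of requests in flight at the same time.
events = sorted([(s, 1) for s, _ in request_log] + [(e, -1) for _, e in request_log])
in_flight = peak_concurrency = 0
for _, delta in events:
    in_flight += delta
    peak_concurrency = max(peak_concurrency, in_flight)

print(f"throughput {throughput:.1f} req/s, p50 {p50 * 1000:.0f} ms, "
      f"p95 {p95 * 1000:.0f} ms, peak concurrency {peak_concurrency}")
```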

8 Performance Metrics (cont.): Metrics describing reliability (carrier-grade aspects)
- Stability: does the tested system show trends or patterns in powerfulness and/or efficiency over time?
- Availability: unavailability in service delivery over time due to physical problems (platform) or logical problems (application)
- Robustness: service levels during extreme conditions, externally (DoS attacks, peak loads) or internally (partial HW outage)
- Recovery: time to recover from a partial or full restart of the system, or from an update of system HW/SW
- Correctness: does the tested system deliver correct results under load?
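As a small worked example of the availability metric, the sketch below computes availability over a measurement period from recorded outage intervals. The interval format and the figures are assumptions, not part of the original material.

```python
# Sketch: availability over a measurement period, given recorded outage
# intervals (start, end, cause) in seconds. Format and values are invented.
measurement_period = 24 * 3600.0                  # one day, in seconds

outages = [
    (3600.0, 3660.0, "platform restart"),         # physical problem
    (50000.0, 50012.0, "application crash"),      # logical problem
]

downtime = sum(end - start for start, end, _ in outages)
availability = 1.0 - downtime / measurement_period

print(f"downtime {downtime:.0f} s, availability {availability:.5%}")
```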

9 Performance Metrics (cont.): Metrics describing efficiency (design aspects)
- Resource usage: performance limits due to bottlenecks (design) or overly dynamic resource allocation (peak-on-peak)
- Resource utilization: performance limits due to load distribution or resource access (queues)
- Resource balance: performance limits due to shortage of one resource while plenty is left of other resources
- Linearity & scalability: performance limits due to non-linear processing or multi-CPU/core design limits
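To make the linearity and scalability aspect concrete, the following sketch computes a scaling efficiency from throughput measured at different hardware sizes; an efficiency well below 1.0 points to non-linear processing or multi-CPU/core design limits. The throughput figures are invented for illustration.

```python
# Sketch: scaling efficiency from measured throughput at different hardware
# sizes. The throughput figures are invented for illustration.
measured = {1: 1000.0, 2: 1900.0, 4: 3400.0, 8: 5200.0}   # servers -> req/s

base_servers, base_throughput = 1, measured[1]
for servers, throughput in sorted(measured.items()):
    ideal = base_throughput * servers / base_servers      # perfectly linear scaling
    efficiency = throughput / ideal                       # 1.0 = linear
    print(f"{servers} server(s): {throughput:>7.0f} req/s, "
          f"scaling efficiency {efficiency:.2f}")
```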

10 Performance Metrics (cont.): Requirements on performance metrics
Performance metrics must be:
- Uniquely identified (metrics identifiers)
- Understandable (metrics specifications)
- Comparable (recording conditions and methods)
- Repeatable (test case specs, test tool specs and test object specs)
- Accurate (data recording methods and configurations)

11 Performance Metrics (cont.): Performance metrics descriptions
- Metrics identifiers: what, where, when
- Metrics format: value type, unit, accuracy (+/- %)
- Measurement value(s)
- Measurement conditions: how the measurement was taken
- Metrics types: raw data, normalized data, derived data from ...
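These description fields map naturally onto a structured record. The sketch below shows one possible encoding; the field names and enum values are illustrative assumptions, not a proposed standard.

```python
# Sketch: one possible record structure for a performance metric description,
# following the fields on the slide. All names are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum

class MetricType(Enum):
    RAW = "raw data"
    NORMALIZED = "normalized data"
    DERIVED = "derived data"

@dataclass
class PerformanceMetric:
    # Metrics identifiers: what is measured, where and when
    what: str
    where: str
    when: str
    # Metrics format
    value_type: type = float
    unit: str = ""
    accuracy_pct: float = 0.0          # +/- percent
    # Measurement value(s) and conditions
    values: list = field(default_factory=list)
    conditions: str = ""               # how the measurement was taken
    metric_type: MetricType = MetricType.RAW
    derived_from: list = field(default_factory=list)

m = PerformanceMetric(what="response time", where="P-CSCF Gm interface",
                      when="steady-state load phase", unit="ms",
                      accuracy_pct=1.0, values=[42.0, 45.5, 41.2],
                      conditions="measured at the test tool client side")
print(m.what, m.values, m.metric_type.value)
```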

12 Performance Test Bench Overview

13 Performance Test Bench overview: Performance test bench surroundings
[Diagram: the measured unit at the centre, driven by a performance test specification and producing measurement data and evaluation against the Powerful, Reliable and Efficient objectives.]

14 Performance Test Bench overview: Performance test bench components
[Diagram: test tool clients and test tool servers exchange traffic with the measured unit and record external performance data; probes record internal performance data; all data feeds the measurement evaluation of the Powerful, Reliable and Efficient objectives.]

15 Performance Test Bench overview: Performance test specifications
A performance test specification consists of:
- Test sizing
- Test bench configuration
- Test application (services)
- Test data specification
- Measurement specifications
- Test evaluation
- Test reporting

16 Performance test specifications (cont.): Test sizing specifications
- Specify test duration: how long will the test run?
- Specify load: load levels, load patterns, load level variations
- Specify simulated volumes: number of simulated users, amounts of test data
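A test sizing specification of this kind can be captured as plain data. The sketch below (assumed phase names and numbers) expresses duration, load levels and a load pattern as a list of phases, then expands it into a target request rate per second.

```python
# Sketch: a test sizing specification as data, expanded to a target load
# level per second. Phase names and numbers are illustrative assumptions.
load_pattern = [                      # (phase, duration_s, start_rate, end_rate) in req/s
    ("ramp-up",    60,   0.0, 100.0),
    ("steady",    300, 100.0, 100.0),
    ("peak",       60, 100.0, 250.0),
    ("ramp-down",  60, 250.0,   0.0),
]
simulated_users = 10_000

def target_rate(t: float) -> float:
    """Target request rate at second t, linearly interpolated within a phase."""
    elapsed = 0.0
    for _, duration, start, end in load_pattern:
        if t < elapsed + duration:
            frac = (t - elapsed) / duration
            return start + frac * (end - start)
        elapsed += duration
    return 0.0                        # after the test has ended

total = sum(d for _, d, _, _ in load_pattern)
print(f"test duration {total} s, {simulated_users} simulated users")
for t in (0, 30, 200, 390, 470):
    print(f"t={t:>3} s -> {target_rate(t):6.1f} req/s")
```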

17 Performance test specifications (cont.): Test bench configuration
- Test bench components (physical and logical): test object components (SUT), test tool components
- Test bench transmission specs: used protocols, addresses and ports for traffic, load level variations
- Test bench traffic mapping: mapping test tool components to test object components, addresses and ports for traffic
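Such a test bench configuration is essentially a mapping of test tool components onto test object components, together with protocols, addresses and ports. A minimal sketch of how this could be represented follows; all component names, addresses and ports are invented.

```python
# Sketch: a test bench configuration as plain data, mapping test tool
# components to test object (SUT) components. All names, addresses and
# ports are invented for illustration.
test_bench = {
    "test_object": {
        "p-cscf": {"address": "10.0.0.10", "port": 5060, "protocol": "SIP/UDP"},
        "hss":    {"address": "10.0.0.20", "port": 3868, "protocol": "Diameter/SCTP"},
    },
    "test_tool": {
        "client-1": {"address": "10.0.1.1", "role": "client"},
        "server-1": {"address": "10.0.1.2", "role": "server"},
    },
    "traffic_mapping": [
        # (test tool component, test object component, interface)
        ("client-1", "p-cscf", "Gm"),
        ("server-1", "hss",    "Cx"),
    ],
}

for tool, sut, interface in test_bench["traffic_mapping"]:
    t, s = test_bench["test_tool"][tool], test_bench["test_object"][sut]
    print(f"{tool} ({t['address']}) -> {sut} ({s['address']}:{s['port']}, "
          f"{s['protocol']}) over {interface}")
```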

18 Performance test specifications (cont.): Test application specifications (client side)
- Test application content (service profile): which services should be requested
- Test application flow (traffic profile): in what order services shall be requested, intelligent handling of the different results of a service request, randomized order of service requests
- Test application customization: service request formats (multiple formats per service type), individually modified service request messages
- Test application processing specifications: specification of the processing flow of a service, timer settings
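A client-side service and traffic profile can be sketched as a weighted, randomized choice of service requests with simple branching on the result. The sketch below is illustrative only; the service names, weights and status handling are assumptions.

```python
# Sketch: a client-side traffic profile as weighted random service selection
# with simple branching on the outcome. Service names, weights and outcomes
# are invented for illustration.
import random

service_profile = {                   # service -> relative weight
    "REGISTER": 1, "INVITE": 5, "MESSAGE": 3, "SUBSCRIBE": 1,
}

def send_request(service: str) -> int:
    """Stand-in for the real test tool client; returns a SIP-like status code."""
    return random.choice([200, 200, 200, 486, 500])

def run_session(requests: int = 10) -> None:
    services, weights = zip(*service_profile.items())
    for _ in range(requests):
        service = random.choices(services, weights=weights)[0]
        status = send_request(service)
        if status >= 500:             # intelligent handling of a failed request
            print(f"{service}: {status}, re-registering before retry")
            send_request("REGISTER")
        elif status >= 400:
            print(f"{service}: {status}, skipping")
        else:
            print(f"{service}: {status}")

run_session()
```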

19 Performance test specifications (cont.): Test application specifications (server side)
- Test application flow (response profile): normal response mode, negative response mode (errors, time-outs, disconnects, etc.)
- Test application customization: service response formats (multiple formats per service type), individually modified service response messages
- Test application processing specifications: specification of the processing flow of a service, timer settings
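The server side can be sketched in the same spirit: a responder that normally answers correctly but can be switched into negative response modes (errors, time-outs, disconnects). The mode names and probabilities below are assumptions.

```python
# Sketch: a server-side response profile with a normal mode and negative
# modes (error, time-out, disconnect). Mode names and probabilities are
# invented for illustration.
import random
import time
from typing import Optional

def respond(request: str, mode: str = "normal") -> Optional[str]:
    if mode == "normal":
        return f"200 OK ({request})"
    if mode == "error":
        return f"500 Server Internal Error ({request})"
    if mode == "timeout":
        time.sleep(0.05)              # stand-in for exceeding the client's timer
        return None                   # no answer within the timer
    if mode == "disconnect":
        return None                   # stand-in for dropping the connection
    raise ValueError(f"unknown response mode: {mode}")

# Mixed negative testing: mostly normal answers, occasional faults.
modes = ["normal"] * 8 + ["error", "timeout"]
for i in range(5):
    mode = random.choice(modes)
    print(f"request {i}: mode={mode}, response={respond(f'INVITE-{i}', mode)}")
```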

20 Performance test specifications (cont.): Test data specifications
- User name specification: user names to be inserted in messages (client user name spaces, server user name spaces)
- User data specification: user data variables to be inserted in messages
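In practice a test data specification boils down to substituting user names and user data variables into message templates. A minimal sketch, assuming a simple $variable template syntax and invented user name spaces:

```python
# Sketch: inserting user names and user data variables into message
# templates. The template syntax and the user name spaces are assumptions.
from string import Template

# Client and server user name spaces, generated rather than listed by hand.
client_users = [f"sip:user{i}@client.example.com" for i in range(1, 4)]
server_users = [f"sip:service{i}@server.example.com" for i in range(1, 4)]

register_template = Template(
    "REGISTER $registrar SIP/2.0\r\n"
    "From: <$user>\r\n"
    "Expires: $expires\r\n"
)

for user in client_users:
    message = register_template.substitute(
        registrar="sip:server.example.com", user=user, expires=3600)
    print(message)
```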

21 Performance test specifications (cont.): Test measurement specifications
- Measurement outside the test object: what should be measured at the test tool interfaces (traffic rates, traffic volumes, response time, latency time, etc.)
- Measurement inside the test object: what should be measured inside the test object (resources, measurement points: per server, per process, ...)
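A measurement specification can likewise be written down as data: what to record at the test tool interfaces outside the test object, and what to collect inside it per measurement point. The structure below is an illustrative assumption.

```python
# Sketch: a measurement specification as data, split into measurements taken
# outside the test object (at the test tool interfaces) and inside it.
# All names and measurement points are invented for illustration.
measurement_spec = {
    "outside_test_object": {
        "at": "test tool client and server interfaces",
        "metrics": ["traffic rate", "traffic volume",
                    "response time", "latency time"],
        "sampling_interval_s": 1,
    },
    "inside_test_object": [
        {"point": "per server",  "resources": ["cpu", "memory", "disk io"]},
        {"point": "per process", "resources": ["cpu", "open sockets"]},
    ],
}

print("External:", ", ".join(measurement_spec["outside_test_object"]["metrics"]))
for point in measurement_spec["inside_test_object"]:
    print(f"Internal ({point['point']}):", ", ".join(point["resources"]))
```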

22 Performance Measurement Methods

23 Performance Measurement Methods: Three performance measurement methods

Method            When             Why                             How
Measured Unit     Pre-deployment   Measuring limits                Isolated functions, simple traffic
Load Simulation   Pre-deployment   Verify production requirements  Complex traffic, what-if testing
Monitoring        Post-deployment  Verifying requirements          Active monitoring, passive monitoring

24 IMS Performance Test Objects

25 IMS Performance Test Bench: Performance test bench components
[Diagram: as on slide 14, but with the IMS function(s) as the measured unit; test tool clients and servers record external performance data, probes record internal performance data, and all data feeds the measurement evaluation of the Powerful, Reliable and Efficient objectives.]

26 IMS Performance Test Bench: The IMS architecture (simplified)
[Diagram: the IMS signaling plane (UE, P-CSCF, I-CSCF, S-CSCF, BGCF, MGCF, SGW, MRF-C, HSS, SLF, SIP-AS, OSA-SCS, IM-SSF) and the IMS media plane (UE, MGW, MRF-P, RTP), interconnected over interfaces such as SIP Gm/Mw/ISC/Mr/Mi/Mj/Mg, Diameter Cx/Dx/Sh, HTTP Ut, H.248 Mn/Mp and ISUP over IP.]
IMS is an architecture containing a large number of interacting functions, where each function performs its tasks based on traffic over connected protocols. Overall performance is never better than that of the weakest function in the service chain: any function that does not match the performance of adjacent functions is a bottleneck or a risk.

27 IMS Performance Test Bench: Performance testing IMS functions
An integrated IMS system must be performance tested starting from individual functions, via sets of interacting functions, up to entire systems and interconnected systems. A test bed for even a single function may be quite complex to configure and set up.
[Diagram: test tool traffic servers surrounding the control functions (P-CSCF, I-CSCF, S-CSCF, BGCF, or an SBC, MRF-C, ...), the AAA services (HSS, SLF) and the application services (such as a PTT server or a Presence server), exercising interfaces such as SIP Gm/Mw/ISC/Mg/Mi/Mj/Mr, Diameter Cx/Dx/Sh and HTTP/XCAP Ut.]

28 IMS Performance Test Bench: IMS function interfaces and requirements
A test bed for a single function may be quite complex to configure and set up. Each IMS function exposes interfaces that carry application protocols (SIP, Diameter, XCAP or others) over transmission protocols (TCP, UDP, SCTP or others), and the traffic on an interface may be multiplexed, use a single user per port, or use an individual IP address per user.
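A test tool therefore needs, for every interface it terminates, a description of the application protocol, the transmission protocol and how users map onto connections. The sketch below shows one possible descriptor; the example interfaces and their transports are assumptions, not requirements.

```python
# Sketch: describing an IMS function's interfaces for a test tool, covering
# application protocol, transmission protocol and user-to-connection mapping.
# The example values are illustrative only.
from dataclasses import dataclass

@dataclass
class InterfaceSpec:
    name: str                  # 3GPP reference point
    app_protocol: str          # SIP, Diameter, XCAP, ...
    transport: str             # UDP, TCP, SCTP, ...
    user_mapping: str          # "multiplexed", "single user per port",
                               # or "individual IP address per user"

p_cscf_interfaces = [
    InterfaceSpec("Gm", "SIP",      "UDP",  "individual IP address per user"),
    InterfaceSpec("Mw", "SIP",      "TCP",  "multiplexed"),
    InterfaceSpec("Rx", "Diameter", "SCTP", "multiplexed"),
]

for spec in p_cscf_interfaces:
    print(f"{spec.name}: {spec.app_protocol} over {spec.transport}, "
          f"{spec.user_mapping}")
```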

29 IMS Performance Test Bench: IMS function test tool interfaces
Test tools must have traffic units that match the interfaces of all IMS components.
[Diagram: the IMS function's client-side and server-side interfaces terminated by test tool clients, test tool servers, test tool peers and test tool bridges.]

30 IMS Performance Test Bench: IMS function test tool interfaces (cont.)
A test bed may be set up with several IMS functions interconnected by the test tool.
[Diagram: several IMS functions chained together, with test tool clients, servers, peers and bridges terminating the client-side and server-side interfaces around and between them.]

31 What is next: suggestions

32 What is next: suggestions. A standard for performance testing
The track will contain a number of tasks:
- Create a proposal for performance-related definitions and methods
- Create an abstract but precise notation for performance test cases
- Create a standard for test tool components in a test bench
- Create a proposal for language extensions to TTCN-3
First of all, some practical activities:
- Investigate the interest for this, and if there is interest, set up a task force
Thank you for your time.

33 "To measure is to know." (Lord Kelvin)

