Application Layer Testing: An Example


1 Application Layer Testing: An Example
Doc.: IEEE 802.11-04/1582r0
Date: January 2005
Authors: Tom Alexander, VeriWave; Dr. Michael D. Foegelle, ETS-Lindgren
Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.
Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE’s name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE’s sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.
Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures <http://ieee802.org/guides/bylaws/sb-bylaws.pdf>, including the statement "IEEE standards may include the known use of patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility for delays in the development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair as early as possible, in written or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group. If you have questions, contact the IEEE Patent Committee Administrator at patcom@ieee.org.

2 Abstract and Outline
Considerable discussion has taken place recently on the need to correlate application layer tests with controlled lower-layer tests. This presentation gives an example:
An end-to-end application layer test revealed anomalous performance loss
The performance loss was modeled, and traced to an unexpected MAC layer effect
The MAC layer effect could then be the subject of a controlled test on the WLAN APs or clients alone
Outline:
A brief introduction to the application layer test, and some results in a controlled open-air test environment
The MAC layer effect, and the relevant controlled lower-layer test

3 The Controlled Open-Air Test

4 Application Layer Test: VoWLAN Performance
Objective: to characterize the performance of a DUT (APs + WLAN switch) when used in enterprise-class VoIP applications
Various performance parameters measured:
Call quality (R-value, jitter, loss); a sketch of how an R-value relates to delay and loss follows below
Call quality degradation with data loading
Failover time and call drop counts
Impact of failover on R-values
Various test conditions and DUT configuration parameters:
Different numbers of handsets
QoS enabled and disabled on DUT
Artificial delays representing WAN links between sites
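R-value scoring of the kind used here generally follows the ITU-T G.107 E-model. As a rough illustration of how such a score relates to the measured delay and loss, here is a minimal Python sketch using a common simplified form of the E-model; the constants and the codec loss curve are textbook approximations, not values taken from the actual test tool.

```python
# Hypothetical sketch: estimating a call's R-value from measured one-way
# delay and packet loss, using a simplified ITU-T G.107 E-model.
# The impairment constants below are illustrative approximations.

def delay_impairment(d_ms: float) -> float:
    """Id: impairment from one-way (mouth-to-ear) delay, simplified."""
    # Common simplification: delay hurts noticeably past ~177 ms.
    impairment = 0.024 * d_ms
    if d_ms > 177.3:
        impairment += 0.11 * (d_ms - 177.3)
    return impairment

def loss_impairment(loss_pct: float, ie: float = 0.0, bpl: float = 25.1) -> float:
    """Ie-eff: codec plus packet-loss impairment (G.113-style curve)."""
    return ie + (95.0 - ie) * loss_pct / (loss_pct + bpl)

def r_value(d_ms: float, loss_pct: float) -> float:
    """Simplified R-factor: R = 93.2 - Id - Ie-eff (noise terms ignored)."""
    return 93.2 - delay_impairment(d_ms) - loss_impairment(loss_pct)

# 60 ms delay, 1% loss -> R ~ 88 (good); 250 ms, 5% loss -> R ~ 63 (poor).
print(round(r_value(60, 1.0), 1))
print(round(r_value(250, 5.0), 1))
```

Under this approximation, modest increases in delay and loss push R from the high 80s to the low 60s, which is consistent with the dramatic R-value swings reported later in the deck.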

5 Test Setup
18 WLAN devices:
14 VoIP handsets
2 Enterprise APs
2 Traffic Generator / Analyzer units
Wired network:
VoIP call server/gateway
WLAN switch/router
Other LAN devices
OTA test:
1 or 2 BSSIDs set up with physical and channel separation
Handsets had integrated antennas
Some DUTs had integrated antennas

6 Controlled Open-Air Testing
Several measures taken to achieve repeatability:
Control of propagation environment:
Enclosed room, concrete walls, minimal metallic content within LOS zones
Careful equipment & furniture positioning (secured in place for duration of tests)
Minimize movement of scatterers (metallic objects) and absorbers (people)
Control of interference:
Eliminate RF interference (cordless phones, Bluetooth, etc.)
Eliminate other WLAN devices (scan for and shut off)
Precision traffic generation and analysis (see the timing sketch below):
Traffic generator offered load could be controlled to 5% accuracy
Handset clocks aligned to within 4 ppm
Analysis timestamps aligned to within 50 nsec
Detailed monitoring of environment and devices during test:
Traffic analyzer reported duration & strength of noise bursts
Offered load monitored from trial to trial
FER levels monitored from trial to trial
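To give a feel for what those timing tolerances mean in practice, the following back-of-envelope calculation (assuming a 10-minute trial, a duration not stated in the slide) shows why the clock and timestamp alignment figures matter for jitter measurements.

```python
# Back-of-envelope check (assumed 10-minute trial, not from the slide):
# how much error the stated timing tolerances could contribute.

trial_s = 600                    # assumed trial duration: 10 minutes
clock_ppm = 4                    # handset clocks aligned to within 4 ppm
ts_align_ns = 50                 # analyzer timestamp alignment

drift_us = trial_s * clock_ppm   # 1 ppm of rate error = 1 us drift per second
print(f"worst-case clock drift over trial: {drift_us} us "
      f"({drift_us / 1000:.1f} ms)")
print(f"analyzer timestamp alignment: {ts_align_ns} ns")
```

At 4 ppm, two free-running clocks can drift apart by roughly 2.4 ms over ten minutes, on the order of the jitter being measured; the analyzer's 50 ns timestamp alignment keeps the analysis side of the measurement effectively error-free by comparison.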

7 Repeatability Observed During Tests
Actual repeatability observed, within trial and across trials (see the sketch below for one way such variation figures can be computed):
R-factor variation across flows (handsets) within trial: < ±5%, typically under ±2%
By comparison, variation from DUT to DUT could be > 50%
R-factor variation across trials (same DUT & handsets): < ±1%, typically under ±0.2%
Failover roaming time variation across handsets within trial: < ±15%
By comparison, variation from DUT to DUT could be as much as 500%
Failover roaming time variation across trials (same DUT & handset): < ±5%
Data throughput (forwarding rate) variation across trials: < ±2%, typically < ±0.5%
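For concreteness, here is one plausible way to compute ± variation figures like those above: worst-case deviation from the per-trial mean, expressed as a percentage. The sample R-factors are made-up illustrative numbers, and the slide does not state which variation statistic was actually used.

```python
# A minimal sketch of a +/- variation statistic: worst-case deviation
# from the mean, as a percentage. Sample values are illustrative only.

def pct_variation(samples: list[float]) -> float:
    """Return max |x - mean| / mean as a percentage."""
    mean = sum(samples) / len(samples)
    return max(abs(x - mean) for x in samples) / mean * 100.0

# e.g., R-factors for 14 handsets within one trial (made-up numbers):
r_factors = [86.1, 85.8, 86.4, 85.9, 86.0, 86.2, 85.7,
             86.3, 86.1, 85.9, 86.0, 86.2, 85.8, 86.1]
print(f"within-trial R-factor variation: +/-{pct_variation(r_factors):.2f}%")
```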

8 Driving WLAN Metrics From Application Layer Measurements

9 Anomalous Performance Issues
Performance results were counter-intuitive:
Very limited number of calls supported (most DUTs did not support more than 6 calls, i.e. under 1 Mb/s of voice traffic!)
Injection of just 1 Mb/s of data with < 1 Mb/s of voice traffic caused dramatic R-value reductions, voice dropouts, and calls dropping off line
Very poor call roaming times (up to 15 seconds!) during failover test
Excessive roaming time was particularly puzzling:
Time to restore VoIP streams far exceeded time for a “data” client to disassociate from AP #1 and authenticate/associate with AP #2
One example:
Total time for 128 “golden” Layer 4 data clients to roam: 307 milliseconds
Average time for 14 VoIP handsets to roam: seconds
Worst-case handset roaming time > 10 seconds!

10 Analysis of Underlying Behavior
Handsets were making continuous channel/AP availability measurements:
Packet delay / loss variations monitored on a packet-by-packet basis
Excessive loss or jitter triggered active probing (probe request/response)
If active probing indicated response times > 30-60 msec, channel scanning started
Once channel scanning started, load went up and throughput went down
Result: dramatic variations in R-value
DUTs were introducing probe request/response handshake delays:
Primary AP goes down; handsets move to backup AP in less than 100 milliseconds
Handset sends probe requests to backup AP
AP fails to return probe response, or is too late (> 30 msec)
Handset assumes AP is not present / overloaded; does not associate, keeps scanning
Sometimes probes took > 15 sec before acceptable probe responses were received
This had a significant impact on both voice quality and failover time:
Long dead times in voice streams
Sometimes calls dropped entirely due to timeouts
Long dead times and delays in roaming
Enormous disconnect between L2 and L7 events:
Throughput tests using “data” traffic indicated ample bandwidth for VoIP traffic
Failover roaming tests using “data” traffic indicated millisecond failover times
Actual VoIP handsets have very different behavior from data client adapters (a schematic model of the handset logic follows below)
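The handset behavior described above amounts to a small state machine. The sketch below is a schematic model, not vendor firmware: the 30 msec probe deadline and the 30-60 msec scan trigger come from the slide, while the `link` object, its methods, and the loss threshold are hypothetical.

```python
# Schematic model of the handset logic described above (not vendor code).
# PROBE_TIMEOUT_MS and JITTER_TRIGGER_MS come from the slide;
# `link`, its methods, and LOSS_TRIGGER_PCT are assumptions.
import time

PROBE_TIMEOUT_MS = 30    # probe responses later than this are "too late"
JITTER_TRIGGER_MS = 60   # response times past 30-60 ms start a channel scan
LOSS_TRIGGER_PCT = 5.0   # assumed loss level that triggers active probing

def scan_until_ap_found(link, channels):
    """Scan channels until some AP returns a timely probe response."""
    while True:
        for ch in channels:
            rtt_ms = link.probe_channel(ch, timeout_ms=PROBE_TIMEOUT_MS)
            if rtt_ms is not None and rtt_ms <= PROBE_TIMEOUT_MS:
                return ch
        time.sleep(0.1)  # nothing answered in time; voice is dead air here

def handset_roam_loop(link, channels):
    """Monitor the call; probe on degradation; scan if probes miss the deadline."""
    while True:
        stats = link.sample_window()  # per-packet delay/loss statistics
        if stats.loss_pct < LOSS_TRIGGER_PCT and stats.jitter_ms < JITTER_TRIGGER_MS:
            continue                  # link healthy: keep the call up
        rtt_ms = link.probe_current_ap(timeout_ms=PROBE_TIMEOUT_MS)
        if rtt_ms is not None and rtt_ms <= PROBE_TIMEOUT_MS:
            continue                  # timely probe response: stay put
        # No acceptable response: assume the AP is absent or overloaded.
        # This scanning loop is what stretched failover to many seconds
        # whenever the DUT delayed its probe responses under load.
        link.associate(scan_until_ap_found(link, channels))
```

Note how a DUT that merely delays probe responses past the 30 msec deadline never fails outright; it just traps every handset in the scanning loop, which is invisible to data-style throughput tests.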

11 Layer 2 Tests Indicated
(Figure: protocol stack with layers Voice/RTP, TCP, IP, MAC, PHY)
“Expected” QoS testing:
End-to-end delay
Delay variation
Packet loss
Burst loss profile over time
Impact of background data traffic on the above
Additional tests (a sketch of the probe-response test follows below):
Probe request/response time
Especially in the presence of multiple concurrent probe requests/responses
Impact of background data on probe requests/responses
Some DUTs worked well provided they were not required to transport data concurrently with voice, even though the level of bandwidth was well below the DUT capacity
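A controlled lower-layer version of the probe request/response test might look like the sketch below. The `tgen` object and its methods stand in for a hypothetical traffic generator/analyzer API (no real product interface is implied); the probe count, 100 msec timeout, and report percentiles are arbitrary choices, while the 30 msec deadline reflects the handset behavior observed earlier.

```python
# Sketch of a controlled lower-layer test: measure probe request/response
# latency against the DUT alone, with and without background data load.
# `tgen` and its methods are a hypothetical traffic generator API.

def probe_response_latencies(tgen, n_probes=1000, bg_load_mbps=0.0):
    """Return sorted probe-response latencies in ms (inf = no response)."""
    if bg_load_mbps:
        tgen.start_background_traffic(rate_mbps=bg_load_mbps)
    latencies = []
    for _ in range(n_probes):
        tgen.send_probe_request()                 # timestamped on the air
        resp = tgen.wait_probe_response(timeout_ms=100)
        latencies.append(resp.latency_ms if resp else float("inf"))
    if bg_load_mbps:
        tgen.stop_background_traffic()
    return sorted(latencies)

def report(latencies, deadline_ms=30):
    """Report the figures that matter to a handset's probe deadline."""
    p50 = latencies[len(latencies) // 2]
    p99 = latencies[int(len(latencies) * 0.99)]
    misses = sum(1 for lat in latencies if lat > deadline_ms)
    print(f"p50={p50:.1f} ms  p99={p99:.1f} ms  "
          f"missed {deadline_ms} ms deadline: {misses}/{len(latencies)}")
```

Running this once with no background load and once with roughly 1 Mb/s of data would exercise, in isolation, exactly the MAC layer effect that first surfaced in the application layer test.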

12 Conclusion
The 802.11 protocol is complex and has many moving parts
Some of these moving parts have significant and non-intuitive impact on application layer performance
While the user may not know (or care) about 802.11 effects, he/she certainly cares about application layer performance
We should devote some energy to picking metrics that are driven by actual application layer measurements and modeling

