Doc.: IEEE 802.11-05/1582r1, Submission, January 2005, Dr. Michael D. Foegelle, ETS-Lindgren

Environment and Metrics: Laboratory vs. Real World


Slide 1: Environment and Metrics: Laboratory vs. Real World

Notice: This document has been prepared to assist IEEE 802.11. It is offered as a basis for discussion and is not binding on the contributing individual(s) or organization(s). The material in this document is subject to change in form and content after further study. The contributor(s) reserve(s) the right to add, amend or withdraw material contained herein.

Release: The contributor grants a free, irrevocable license to the IEEE to incorporate material contained in this contribution, and any modifications thereof, in the creation of an IEEE Standards publication; to copyright in the IEEE’s name any IEEE Standards publication even though it may include portions of this contribution; and at the IEEE’s sole discretion to permit others to reproduce in whole or in part the resulting IEEE Standards publication. The contributor also acknowledges and accepts that this contribution may be made public by IEEE 802.11.

Patent Policy and Procedures: The contributor is familiar with the IEEE 802 Patent Policy and Procedures (http://ieee802.org/guides/bylaws/sb-bylaws.pdf), including the statement "IEEE standards may include the known use of patent(s), including patent applications, provided the IEEE receives assurance from the patent holder or applicant with respect to patents essential for compliance with both mandatory and optional portions of the standard." Early disclosure to the Working Group of patent information that might be relevant to the standard is essential to reduce the possibility for delays in the development process and increase the likelihood that the draft publication will be approved for publication. Please notify the Chair (stuart.kerry@philips.com) as early as possible, in written or electronic form, if patented technology (or technology under patent application) might be incorporated into a draft standard being developed within the IEEE 802.11 Working Group. If you have questions, contact the IEEE Patent Committee Administrator at patcom@ieee.org.

Date: 2005-01-17

Authors:

Slide 2: Abstract

There remains some confusion and disagreement about the goals of TGT. This presentation attempts to define some basic concepts more clearly, in an effort to illustrate the need for certain approaches. The goal is then to determine if and how the needs of the various groups involved in TGT can be satisfied, given this information. From there, a better understanding of the document requirements (framework) can be reached.

Slide 3: Overview

– Review of TGT PAR
– Background Concepts
– Environment vs. Metrics
– Environments
– Metrics
– Measurement Framework
  – Metrics
  – Environment

Slide 4: Review of TGT PAR

Official scope of TGT (from PAR):
– The scope of the project is to provide a set of performance metrics, measurement methodologies, and test conditions to enable measuring and predicting the performance of 802.11 WLAN devices and networks at the component and application level.

Official purpose of TGT (from PAR):
– The purpose of the project is to enable testing, comparison, and deployment planning of 802.11 WLAN devices based on a common and accepted set of performance metrics, measurement methodologies and test conditions.

Slide 5: Background Concepts

There are two principal classes of testing commonly referred to for wireless devices:
– Interoperability Testing
  – Refers to testing a DUT with other DUTs or an operational network to verify expected performance.
  – Looks for incompatibilities and combined performance behavior.
– Conformance Testing (a general definition, NOT current Wi-Fi Alliance conformance)
  – Refers to testing the performance of an individual DUT using standardized methods and traceable test equipment.
  – Conformance implies testing performance to a requirement.
  – Usually serves as a baseline prior to performing interoperability testing.
  – Often involves comparing results to industry standards/requirements.
  – Customer-driven testing.

Slide 6: Environment vs. Metrics

There tends to be some confusion when discussing usage cases and the impact of the real world on those cases. It is important to show how the environment can be considered completely independent of the application.
– Environmental effects are inputs to the low-level metrics, which eventually impact application-level performance.
– Methodology for measuring application-level metrics, or corresponding methods of predicting application-level performance from lower-level metrics, is far removed from the inputs to those low-level metrics.

Slide 7: Environments

Slide 8: Environments

Real-world environments represent the actual user experience – in that particular environment.
– Any sufficiently rigorous test method should be able to reproduce the same results in that same environment. If the tests are designed properly, these results should represent the user experience in this environment.
– A given real-world environment is likely to demonstrate effects similar to those of another real-world environment.
  – Qualitative measurements of the relative performance between two or more devices are possible.
  – The relative differences between devices should be similar for different real-world environments, given appropriate methodology.
– Absolute performance metrics of an individual device are not practical in a real-world environment.
  – Can’t compare DUT1 from RWE1 to DUT2 in RWE2.

Slide 9: Environments – Case Study: Real-World Environment Testing

– Assume two different real-world environments.
– Three APs are tested by placing each one in the same place and comparing the performance between a number of clients placed in various locations (varied distance and relationship).
– Similar clients are used in both cases.

Slide 10: Environments – Case Study: Real-World Environment Testing – Comparison #1

– Identical clients are placed the same distance from the AP.
– Environment #1 has a throughput of 5 Mb/s.
  – What’s the throughput in Environment #2?

Slide 11: Environments – Case Study: Real-World Environment Testing – Comparison #2

– AP2 is substituted for AP1 in the exact same position.
– Environment #1 has a throughput of 2 Mb/s.
  – What’s the throughput in Environment #2?
  – Is AP1 better than AP2? Does Environment #2 prove this too?

Slide 12: Environments – Case Study: Real-World Environment Testing – Comparison #3

– A client is moved away from each AP in a straight line in 5 m steps until the throughput drops from maximum.
– In Environment #1, AP1’s throughput drops at 25 m and AP2’s at 35 m.
  – At what distance does throughput drop in Environment #2?
  – Is AP1 better than AP2? Does Environment #2 prove this too?

Slide 13: Environments – Case Study: Real-World Environment Testing

– Absolute comparisons between environments are obviously impractical.
  – The meaning of distance in one environment can be considerably different than in another, due to reflections, etc.
– Relative comparisons between DUTs aren’t guaranteed to be equivalent between different real-world environments.
  – Without specific controls, it is possible to confuse issues related to test setup in a specific test environment with DUT performance issues. (Case in point: the previously described methodology.)
– Standardization of methodology would have to address these setup issues in an effort to compensate for the non-standardized environment.
  – Likely to entail specialized testing of the environment to determine the levels of contributing factors before each test.
  – A statistical approach would be necessary to remove fixed errors.
    – E.g., the location and orientation of the antenna may be more critical than the location and orientation of the base. Thus, multiple locations and orientations of DUT and antenna are necessary to determine the relevant factors.
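The statistical approach suggested above can be sketched numerically: repeat the same throughput measurement over several DUT base locations and antenna orientations, then compare how strongly each factor moves the mean. All numbers below are hypothetical illustration values, not real measurements.

```python
# Separate placement effects from orientation effects by repeating the
# measurement over a small grid of factors (hypothetical data).
from statistics import mean

# throughput_mbps[location][orientation]
throughput_mbps = [
    [18.2, 11.5, 16.9, 12.4],   # location A, four antenna orientations
    [17.8, 12.1, 16.4, 11.9],   # location B
    [18.5, 11.2, 17.1, 12.7],   # location C
]

grand = mean(v for row in throughput_mbps for v in row)

# Spread of the per-factor means indicates which factor dominates.
loc_means = [mean(row) for row in throughput_mbps]
ori_means = [mean(row[j] for row in throughput_mbps) for j in range(4)]

loc_spread = max(loc_means) - min(loc_means)
ori_spread = max(ori_means) - min(ori_means)

print(f"grand mean: {grand:.1f} Mb/s")
print(f"location effect: {loc_spread:.1f} Mb/s, "
      f"orientation effect: {ori_spread:.1f} Mb/s")
```

In this invented data set the orientation spread dwarfs the location spread, which is the kind of finding the slide anticipates: antenna orientation mattering more than base position.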

Slide 14: Environments

Controlled laboratory environments provide ways of simulating real-world behavior in a repeatable, comparable, and traceable manner.
– The laboratory environment includes both the DUT test environment and the test equipment used to simulate real-world effects.
– The DUT test environment can be kept conceptually simple so that it is easily replicated, allowing tests to be duplicated by anyone.
– Traceability of laboratory test equipment is a common practice and allows results from different labs to be compared with a known confidence (uncertainty) level.
  – It is not necessary to test two devices in the same lab in order to compare results.
  – Traceability is to a standardized calibration, test method, and/or test system design.
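As a sketch of how traceability lets two labs compare results (with hypothetical numbers, and using the normalized-error statistic from proficiency testing rather than anything defined by TGT): each lab reports a value with an expanded uncertainty, and E_n decides whether the results agree within their combined uncertainty.

```python
# Inter-lab agreement check via the normalized error statistic.
# Values are hypothetical illustration data.
from math import sqrt

def normalized_error(x1, U1, x2, U2):
    """E_n = |x1 - x2| / sqrt(U1^2 + U2^2); |E_n| <= 1 means agreement
    within the labs' combined expanded (k=2) uncertainties."""
    return abs(x1 - x2) / sqrt(U1**2 + U2**2)

# Lab A reports 14.2 Mb/s +/- 0.8 Mb/s; Lab B reports 13.6 Mb/s +/- 0.9 Mb/s
en = normalized_error(14.2, 0.8, 13.6, 0.9)
print(f"E_n = {en:.2f} -> {'agree' if en <= 1.0 else 'disagree'}")
```

This is exactly the kind of quantitative comparison that is impossible between two uncharacterized real-world environments.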

Slide 15: Environments

Environmental effects can be separated into two basic categories:
– Purely Physical:
  – Separation distance/path loss
  – Multipath fading
  – Adjacent channel interference
  – Noise
– Systemic (MAC):
  – Network loading
  – Hidden nodes

Real-world environments combine all of these simultaneously at varying levels. Lab environments can simulate each one individually.
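As an illustration of how one of the "purely physical" effects can be modeled numerically, the widely used log-distance path loss model gives PL(d) = PL(d0) + 10·n·log10(d/d0), where n is the path loss exponent (about 2 in free space, roughly 3 to 4 indoors). The reference values below are illustrative assumptions, not TGT-specified parameters.

```python
# Log-distance path loss model: a simple simulation of the
# "separation distance/path loss" effect listed above.
from math import log10

def path_loss_db(d_m, pl_d0_db=40.0, d0_m=1.0, n=3.0):
    """Path loss in dB at distance d_m, given an assumed reference loss
    pl_d0_db at d0_m and path loss exponent n (indoor-ish default)."""
    return pl_d0_db + 10.0 * n * log10(d_m / d0_m)

for d in (1, 5, 10, 25):
    print(f"{d:3d} m: {path_loss_db(d):.1f} dB")
```

A lab channel emulator can apply exactly this kind of curve deterministically, whereas in a real building the effective exponent n varies from hallway to hallway, which is why "25 m" means different things in different environments.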

Slide 16: Environments

This topic will be expanded further after the next section on Metrics.

Slide 17: Metrics

There seems to be an inherent assumption that real-world testing implies real-world (application-level) metrics, while laboratory testing implies low-level (sub-metric) testing. While this seems logical, it doesn’t have to be so.
– There’s no reason application-level metrics can’t be measured in a laboratory environment.

Slide 18: Metrics

Application-level metrics entail directly measuring QoS performance exactly as the user encounters it. User experience is both qualitative and quantitative.
– Qualitative measurements are often:
  – Subjective: “Sounds good to me…”
  – Expensive: user panels
  – Difficult to repeat
  – Hard to compare
– Quantitative measurements are more useful, but may be difficult to perform at the application level for some usage cases.
  – Implies a measurement system inherent in the application, or a custom application built for the purpose of measurement.
    – The latter is no longer truly a real-world case.

Slide 19: Metrics

All applications (usage cases) share the effects of basic sub-metrics such as throughput, delay, jitter, etc.
– The exact relationship of these sub-metrics to the usage case metric(s) is currently undefined.

There is a difference between application level testing and testing with a particular usage case application (i.e. VoIP, streaming video, etc.):
– Application level testing involves testing a given metric at the Application Layer of the ISO model.
  – Ensures that effects due to things like drivers are accounted for.
– Application testing (usage case testing) involves testing a given application (actual user software) or usage case (application simulation) to determine its performance, thus combining the effects of all of the basic metrics into one measurement. By definition, this occurs at the Application Layer.
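The basic sub-metrics named above can all be computed from per-packet send/receive timestamps. The sketch below uses the running interarrival-jitter estimate from RFC 3550 (J += (|D| - J)/16); the timestamps and packet sizes are hypothetical.

```python
# Throughput, mean delay, and RFC 3550-style jitter from packet
# timestamps (hypothetical data: (send_time_s, recv_time_s, size_bytes)).
packets = [
    (0.000, 0.020, 1500),
    (0.010, 0.032, 1500),
    (0.020, 0.041, 1500),
    (0.030, 0.055, 1500),
]

delays = [rx - tx for tx, rx, _ in packets]
mean_delay = sum(delays) / len(delays)

# Goodput over the receive-side span: count bytes arriving after the
# first packet, divided by the time between first and last arrival.
span = packets[-1][1] - packets[0][1]
throughput_bps = sum(s * 8 for _, _, s in packets[1:]) / span

# RFC 3550 running jitter estimate over successive delay differences.
jitter = 0.0
for prev, cur in zip(delays, delays[1:]):
    jitter += (abs(cur - prev) - jitter) / 16.0

print(f"mean delay {mean_delay*1000:.1f} ms, jitter {jitter*1000:.2f} ms, "
      f"throughput {throughput_bps/1e6:.2f} Mb/s")
```

The open question the slide raises is precisely how numbers like these map onto a usage-case metric such as voice quality; the computation of the sub-metrics themselves is well understood.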

Slide 20: Metrics

Application Level Testing
– Run a network traffic generation/measurement application to generate a given type of traffic and test any given metric (throughput, forwarding rate, etc.) through all layers.

Application Testing
– Run real application software to determine QoS.
– Run a simulation application that sends/receives real data streams (e-mail, streaming video, VoIP) in order to record application-specific metrics (video frame rate, audio noise ratio, etc.).
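A toy version of the first bullet, to make "through all layers" concrete: push bytes through a real TCP socket (here over loopback) so the measurement traverses the full stack down from the Application Layer. Loopback numbers say nothing about wireless performance; this only illustrates the mechanics of measuring at the application level.

```python
# Minimal application-level throughput test over a real TCP connection.
import socket
import threading
import time

PAYLOAD = b"x" * 65536
TOTAL_BYTES = 8 * 1024 * 1024   # 8 MiB test transfer

def sink(server_sock, result):
    """Accept one connection and count every byte received."""
    conn, _ = server_sock.accept()
    received = 0
    while True:
        chunk = conn.recv(65536)
        if not chunk:
            break
        received += len(chunk)
    conn.close()
    result["bytes"] = received

server = socket.socket()
server.bind(("127.0.0.1", 0))       # ephemeral port on loopback
server.listen(1)
result = {}
t = threading.Thread(target=sink, args=(server, result))
t.start()

client = socket.create_connection(server.getsockname())
start = time.perf_counter()
sent = 0
while sent < TOTAL_BYTES:
    client.sendall(PAYLOAD)
    sent += len(PAYLOAD)
client.close()                      # EOF ends the sink loop
t.join()
elapsed = time.perf_counter() - start
server.close()

print(f"{result['bytes'] * 8 / elapsed / 1e6:.0f} Mb/s over loopback")
```

Replace the loopback address with a host reached across the wireless link under test and the same code measures application-level throughput through the DUT, drivers and all.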

Slide 21: Metrics

Example application: VoIP
– QoS is a function of the cabled network, the wireless networks, AP DUTs, the operating systems of client DUTs, the VoIP applications, etc.
– It is apparent that application level testing is important to QoS.

Slide 22: Metrics

Currently, client testing requires application level (Application Layer) testing anyway.
– Without a test API, some sort of test traffic client application must be run to create or record the traffic used for testing.
– A test API may allow a way to test at a lower level, but this would provide additional diagnostic capabilities without necessarily eliminating the need to perform application level tests.

Tests of an AP may or may not involve higher layers, depending on the metric or sub-metric being tested.
– Forwarding rate through an AP from client to client does not involve the application layer of the AP.
– Server-side metrics of higher network layers may affect end-user performance, but are these part of the AP’s performance metric? Certainly not part of its wireless performance!
  – A question of the scope of TGT…

Slide 23: Measurement Framework – Metrics

It’s apparent that application level testing is necessary, at least on the client side.
– Need to ensure that we’ve tested all components of a device.

Application (usage case) testing is a slightly greyer area.
– Ideally, any application-specific testing would be standardized.
  – Requires specialized applications for measuring things like video frame rate, audio reproduction quality, etc.
– Making the link between sub-metrics and the associated performance of a given usage case should be relatively straightforward.
  – There are already applications for measuring throughput, etc. at the client application level.
  – An opportunity for some R&D in this area.

Slide 24: Measurement Framework – Environment

The usefulness of real-world testing environments is less obvious.
– There are significant limitations to the usefulness of data generated in this manner.
– In light of the previous metric discussion, much of the value of real-world testing lies in application level metrics, not real-world environments.
– Most metrics can be measured in a low-cost conducted laboratory environment, giving less credence to concerns over the cost of testing.
– The fact that “anyone can do it” is not sufficient to justify incorporation of real-world testing into TGT.

However, real-world testing does have a place.

Slide 25: Measurement Framework – Environment

Slide 26: Measurement Framework – Environment

Slide 27: Measurement Framework – Environment

Real-World Environment Tests are useful for:
– Initial R&D work:
  – Correlation to laboratory tests to develop appropriate models.
  – Validation of models.
  – Such R&D work is input to TGT, so should the associated test methodology really be an output?
– Verification of predicted results:
  – Allows the user to confirm that an installation is performing as expected.
  – Allows adjustment of model inputs after testing a small sub-installation, before completing the entire installation.
  – Methodology for this purpose is a legitimate output of TGT.

Slide 28: Measurement Framework – Environment

A goal of TGT is to define controlled environment tests that can suitably predict performance in the real world. Real-world environment testing provides an initial input and feedback mechanism for this process.
– Problems identified in the real world can be investigated to determine how to simulate them in a controlled environment.
– Real-world environment test results can be compared to predictions based on controlled tests to determine if adjustments are required.

There is also a level of political pressure for real-world product testing.

Slide 29: Measurement Framework – Environment

Real-World Environment Tests are primarily interoperability tests.
– They can’t provide the traceability needed for conformance testing.
– They provide relative interoperability information under uncontrolled or semi-controlled environmental influences.

Conformance-type performance testing requires a controlled environment to provide the necessary traceability. Controlled environment testing can also provide interoperability test capability.

Slide 30: Measurement Framework – Environment

Real-World Environment Tests need a level of standardization to obtain useful results.
– Refer to the previous case study.
– A statistical approach is critical.
  – Move the DUT(s) ~1/2 wavelength in each direction and repeat the test to determine multipath and/or near-field contributions.

Real-World Environment Tests ARE NOT SUBSTITUTES for Controlled Environment Tests!
– The real long-term value of TGT will be to produce standardized tests, performed in controlled environments, capable of simulating all critical real-world interactions and producing results suitable for comparing products and predicting real-world performance without the need for real-world tests.
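The "~1/2 wavelength" step size above can be made concrete: λ/2 = c/(2f), which for the 802.11 bands works out to a few centimeters.

```python
# Half-wavelength displacement step for the 802.11 operating bands.
C = 299_792_458.0  # speed of light, m/s

for label, f_hz in [("2.4 GHz (802.11b/g)", 2.4e9),
                    ("5.25 GHz (802.11a)", 5.25e9)]:
    half_wl_cm = C / (2 * f_hz) * 100
    print(f"{label}: half wavelength ~= {half_wl_cm:.1f} cm")
```

So repositioning the DUT by roughly 3 to 6 cm between repeats is enough to sample a statistically different point of the standing-wave pattern.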

Slide 31: Conclusion

– It’s important to separate test metrics from test environments.
– Both application level and low level metrics are useful, and both can be measured in the same test environment.
– A laboratory test environment can simulate the necessary real-world effects and provide the traceability needed for product comparison and prediction.
– Real-World Environment Tests are useful for R&D validation and verification of predicted results (feedback loop), but a level of standardization is required to ensure the validity of results.

Slide 32: Conclusion

– Real-World Environment Tests are NOT suitable for product comparison testing or for predicting performance in any resulting environment.
– Real-World Environment Tests ARE NOT SUBSTITUTES for Controlled Environment Tests!
– The goal of TGT (per the PAR) is to develop test methods for predicting real-world performance. This implies defining controlled environment tests to simulate real-world environment effects.
  – Otherwise, what’s the point? There is no prediction involved in real-world testing.

