

2 Developing for Optimal VoIP Service Quality: Understanding Quality From the User's Perspective
Arun Bhardwaj, Director, Business Development, Keynote Systems, Inc.
arun.bhardwaj@keynote.com

3 Agenda
– VoIP Market View
– A Unique Approach to Voice Service Quality Measurement
– Comparative Analysis of Various Voice Technologies
– Often Ignored Factors Affecting Quality
– Service Quality Trends in the VoIP Industry

4 Unique Nature of VoIP

5 Voice Communication Landscape
[Diagram: PSTN users connected through PSTN and long-distance carrier (LDC) networks; wireless users through BTS/BSC/MSC elements; and VoIP users (soft phone, VoIP phone, WiFi phone, PacketCable phone) connected through access networks to an IP network, with IP-PSTN gateways bridging the IP and PSTN worlds; POTS shown on the PSTN side.]

6 Why is Voice Quality Essential?

VoIP Growth:
Year  Subscribers  Revenue
2004  1.3 million  $200 million
2005  4.5 million  $1 billion

Barriers to VoIP Adoption:
– US RBOCs are losing 150,000 subscriber lines per month
– VoIP providers are gaining 100,000 subscribers per month
– Wireless-only service is gaining 50,000 subscribers per month
– VoIP will capture 22% of LECs' existing market
– LECs will lose $18.2 billion between 2006 and 2010

7 Competitive Research Study Objectives
– Rank the relative performance of PSTN, PacketCable, VoIP hard phone, and VoIP soft phone service providers.
– Identify industry trends in service level performance since the last Keynote study.
– Identify the range of performance between the best voice service providers and the worst.
– Examine peak and prime-time performance variations.
– Identify the strengths and weaknesses of each service provider and voice service technology.

8 Measurement Topology

9 Monitoring Scope
[Diagram: PacketCable Services, VoIP Hard Phone Services, and VoIP Soft Phone Services; some services were measured from the New York location only, others from the San Francisco location only.]

10 Audio Characteristics Analysis
Last Mile Impairments: Measuring Within the Network is Not Enough

Impairment           All Calls (# / %)   Calls with MOS < 3.0 (# / %)
Hiss                 0 / 0.0%            0
Static               539 / 2.9%          407 / 71.3%
Hum                  7 / 0.0%            0
Frequency Clipping   0 / 0.0%            0
Front Clipping       5 / 0.0%            1 / 0.2%
Back Clipping        422 / 2.3%          405 / 70.9%
Other Clipping       607 / 3.3%          520 / 91.1%
Holdover             1,099 / 5.6%        32 / 5.6%
Total                18,456              571
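The percentage columns in the table above can be recomputed from the raw counts. A minimal sketch (counts taken from the slide; the selection of rows is illustrative):

```python
# Recompute the impairment percentages from the slide's raw counts.
all_calls = 18_456
low_mos_calls = 571  # calls with MOS < 3.0

impairments = {
    # name: (count in all calls, count in calls with MOS < 3.0)
    "Static": (539, 407),
    "Back Clipping": (422, 405),
    "Other Clipping": (607, 520),
}

for name, (total, low) in impairments.items():
    print(f"{name:15s} {total / all_calls:6.1%} of all calls, "
          f"{low / low_mos_calls:6.1%} of low-MOS calls")
```

This makes the slide's point concrete: clipping and static are rare overall (2–3% of calls) but account for the large majority of calls that fall below a 3.0 MOS.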

11 Keynote Voice Perspective Agent Technology
– New York: Cable/DSL/Sprint Caller & Responder
– San Francisco: Cable/DSL/Sprint Caller & Responder
– Responder Agent: accepts calls; sends an audio sample
– Caller Agent: initiates calls; requests an audio sample; compares the received and reference audio samples

12 Measured Parameters
Voice Service Quality = Reliability + Audio Clarity + Responsiveness = Holistic Customer Experience

Audio Clarity:
– Average Mean Opinion Score (MOS)
– % Calls > Acceptable MOS
– MOS Geographic Variability
Responsiveness:
– Average Audio Delay
– % Calls > Acceptable Delay
– Audio Delay Geographic Variability
Reliability:
– Service Availability
– # Dropped Calls
– Average Answer Time
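The parameter groups above can be sketched as KPI computations over per-call measurement records. The record fields, sample values, and threshold names below are illustrative assumptions, not Keynote's implementation; the 4.0 and 150 ms thresholds come from later slides:

```python
# Sketch: computing clarity, responsiveness, and reliability KPIs
# from per-call records (field names are assumptions for illustration).
from dataclasses import dataclass
from statistics import mean

ACCEPTABLE_MOS = 4.0       # "toll quality" threshold cited later in the study
ACCEPTABLE_DELAY_MS = 150  # ITU-T G.114 one-way delay guideline

@dataclass
class CallRecord:
    mos: float       # Mean Opinion Score of the received audio
    delay_ms: float  # one-way audio delay
    answered: bool   # call completed vs. dropped/failed

def kpis(calls):
    ok = [c for c in calls if c.answered]
    return {
        "availability": len(ok) / len(calls),
        "avg_mos": mean(c.mos for c in ok),
        "pct_acceptable_mos": sum(c.mos >= ACCEPTABLE_MOS for c in ok) / len(ok),
        "avg_delay_ms": mean(c.delay_ms for c in ok),
        "pct_acceptable_delay": sum(c.delay_ms <= ACCEPTABLE_DELAY_MS for c in ok) / len(ok),
    }

sample = [CallRecord(4.2, 80, True), CallRecord(3.1, 300, True),
          CallRecord(4.4, 120, True), CallRecord(0.0, 0, False)]
print(kpis(sample))
```

The point of grouping the metrics this way is that no single number captures the customer experience: a provider can have perfect availability and still fail on delay or clarity.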

13 True End-to-End Monitoring Methodology
[Diagram: the voice path runs from a PSTN user through the PSTN network, IP-PSTN gateway, core network, and access network to a VoIP phone or soft phone (POTS on the PSTN side). Others measure only the core network; Keynote measures the full path that the customer experiences.]

14 Competitive Research Report: VoIP Phone, Digital Cable Phone, Soft Phone, PSTN

15 Case Study: Invisible Annoyance
Customer Problem: low MOS score for > 90% of calls.
Keynote Analysis: analyzed the audio characteristics of all calls for the problem period using Voice Perspective; the silence-period frequency profile showed an audible hum on 70% of the VoIP agents.

16 Case Study: Invisible Annoyance
Diagnosis: the hum problem and the hardware ATA model type showed a strong correlation.

VoIP Perspective Agent        ATA Model   % of Calls with Hum
New York AT&T                 Model B     96.9%
New York Sprint               Model B     87.6%
New York Time Warner Cable    Model B     97.6%
New York UUNet                Model B     97.7%
New York Verizon DSL          Model B     92.3%
San Francisco AT&T            Model A     0.0%
San Francisco Comcast Cable   Model B     97.1%
San Francisco SBC DSL         Model B     97.4%
San Francisco Sprint          Model A     0.0%
San Francisco UUNet           Model A     0.1%

17 Case Study: Invisible Annoyance
Diagnosis: the problem was in a specific telephone adapter model type.
Improvement: the Audio Clarity ranking improved by two places after replacing the adapters, and customer satisfaction increased (Mean Opinion Score up by 0.3).

18 Data Collection Period and Size
– Data collected from August 1 to August 31, 2006.
– Long-distance and local VoIP-to-PSTN calls between New York and San Francisco every 30 minutes.
– Calls placed on every VoIP provider and network combination.
– Over 125,000 phone calls were placed during the one-month period.
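A quick back-of-the-envelope check on the stated call volume (the per-combination schedule is from the slide; the implied number of provider/network combinations is my inference, not a figure from the study):

```python
# Sanity-check the stated 125,000-call volume against the call schedule.
days = 31
calls_per_day_per_combo = 24 * 60 // 30        # one call every 30 minutes
calls_per_combo = days * calls_per_day_per_combo  # calls per provider/network pair
combos_implied = 125_000 / calls_per_combo

print(f"{calls_per_combo} calls per provider/network combination")
print(f"~{combos_implied:.0f} combinations implied by 125,000 total calls")
```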

19 Study Results Overview

20 Summary of Results
– Key performance indicators such as Service Availability and Average MOS improved for most providers.
– Average one-way audio delay was between 150 and 250 ms [best 62 ms; worst 335 ms].
– Average Mean Opinion Scores ranged from 3.0 to 4.0 [best 4.24; worst 2.64].
– Calls on most providers showed clipping or audio holdover, degrading service quality.

21 Summary of Results: Primetime Versus Non-primetime Performance
– Higher variation in audio delay than in Mean Opinion Score.
– DSL connections offered less audio delay variance.
– Cable modem connections delivered more consistent MOS.

22 Reliability Overview – Service Types
PacketCable service providers are more reliable than PSTN, VoIP Hard Phone, and VoIP Soft Phone service providers.

23 Audio Quality Overview – Service Types
PacketCable service providers had better overall audio quality than PSTN, VoIP Hard Phone, and VoIP Soft Phone service providers.

24 Trends – Service Availability
Of the eleven service providers measured in previous studies, seven had a better Service Availability percentage in August 2006 than in any previous Keynote study.

25 Trends – Average MOS
Of the eleven service providers measured in previous studies, seven had a higher Average MOS in August 2006 than in any previous Keynote study.

26 Audio Delay and MOS Trends

27 Performance Ranges – Audio Delay
Only four of the fourteen providers measured in August had an Average Audio Delay below 150 ms. The best Average Audio Delay was 62 ms; the worst was 335 ms.

28 Performance Ranges – Average MOS
Only four of the fourteen providers measured had an Average MOS above the 4.0 "toll quality" threshold. The best Average MOS was 4.24; the worst was 2.64.

29 Codecs Used
– The most commonly used codec is ITU-T G.711 PCMU.
– Every VoIP Hard Phone provider with an Average MOS over 4.0 used the ITU-T G.711 PCMU codec.
– ITU-T G.721 and ITU-T G.729 are still in use by a few VoIP service providers.
[Note: the codec used cannot be determined for PacketCable providers and some VoIP Soft Phone software clients with proprietary signaling protocols. No codec is used in the customer premises equipment for analog PSTN service.]
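G.711 PCMU's audio quality comes from mu-law companding: samples are mapped through a logarithmic curve before quantization, so quiet passages keep more resolution. A minimal sketch of the continuous companding curve (real G.711 additionally quantizes to 8-bit segmented values, which this sketch omits):

```python
# Continuous mu-law companding curve behind ITU-T G.711 PCMU.
import math

MU = 255.0  # mu parameter used by G.711 PCMU

def mulaw_compress(x: float) -> float:
    """Map a sample in [-1, 1] to a companded value in [-1, 1]."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mulaw_expand(y: float) -> float:
    """Inverse of mulaw_compress."""
    return math.copysign((math.exp(abs(y) * math.log1p(MU)) - 1.0) / MU, y)

# Round-trip is exact before quantization; quiet samples are boosted
# into a larger share of the output range.
x = 0.1
y = mulaw_compress(x)
assert abs(mulaw_expand(y) - x) < 1e-12
print(f"compress(0.1) = {y:.3f}")
```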

30 Analog Telephone Adaptors and Software Clients

31 Summary

32 Industry Trends
– Most providers measured in previous studies are improving their reliability.
– PacketCable providers now exceed the overall reliability of PSTN service.
– VoIP providers as an industry need to improve service availability.
– As a whole, the industry's responsiveness and audio clarity continue to improve, and PacketCable service providers lead the other voice technologies.

33 How to Improve VoIP Quality
– Watch the competition: ensure that your service not only performs well all the time, but also performs better than or at par with your competition.
– Focus on end-user experience: measure VoIP performance as close as possible to the end-user experience. Actual waveform analysis of call audio brings the measurement perspective as close as possible to what your customers are experiencing.
– Measure service holistically: small things can ruin the best service experience. Measure every aspect of the VoIP call experience, and use the insight gained to tune your network infrastructure for few outages and excellent call quality.

34 Public Agents Based Contact Center Monitoring
[Diagram: public (caller) agent infrastructure in SF, CHI, NY, DAL, and FL places calls over the PSTN and VoIP networks, through an IP-PSTN gateway, to the ABC Enterprise contact center; a Keynote Responder terminates VoIP or PSTN calls.]

35 Appendix B: Measurement Technology

36 Ranking Methodology – Audio Quality
The Audio Quality index ranking is based on Keynote extensions of the Apdex* standard to represent user satisfaction with audio quality:
– Mean Opinion Score (MOS) [T, F] = [4.0, 3.1]**
– Audio Delay (ms) [T, F] = [150, 400]***
Each call is determined to be in the Satisfied, Tolerating, or Frustrated performance range for MOS and audio delay, based upon industry-standard thresholds. The index is then computed as:

Index = 1000 x (Satisfied count + Tolerating count / 2) / Total samples

* See http://www.apdex.org/
** Thresholds based on Telecommunications Industry Association Technical Services Bulletin TSB-116, "Voice Quality Recommendations for IP Telephony".
*** Thresholds based on International Telecommunication Union standard ITU-T G.114, "One-way transmission time".
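The classification and index computation above can be sketched directly. The sample delay values are illustrative; note that MOS is "higher is better" while delay is "lower is better", so the Satisfied test flips between the two metrics:

```python
# Sketch of the Apdex-style audio quality index: classify each call as
# Satisfied, Tolerating, or Frustrated against the [T, F] thresholds,
# then score as 1000 x (satisfied + tolerating/2) / total.
MOS_T, MOS_F = 4.0, 3.1          # TIA TSB-116 thresholds (higher is better)
DELAY_T, DELAY_F = 150.0, 400.0  # ITU-T G.114 thresholds in ms (lower is better)

def classify(value, t, f, higher_is_better):
    if higher_is_better:
        return "satisfied" if value >= t else "tolerating" if value >= f else "frustrated"
    return "satisfied" if value <= t else "tolerating" if value <= f else "frustrated"

def apdex_index(values, t, f, higher_is_better):
    labels = [classify(v, t, f, higher_is_better) for v in values]
    satisfied = labels.count("satisfied")
    tolerating = labels.count("tolerating")
    return 1000 * (satisfied + tolerating / 2) / len(values)

delays = [62, 120, 180, 250, 500]  # illustrative one-way delays in ms
print(apdex_index(delays, DELAY_T, DELAY_F, higher_is_better=False))  # -> 600.0
```

With two Satisfied, two Tolerating, and one Frustrated call, the index is 1000 x (2 + 2/2) / 5 = 600.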

