
VoIP Competitive Intelligence Survey




Presentation Transcript


  1. VoIP Competitive Intelligence Survey – Understanding Voice Quality from an End User's Perspective Rajeev Kutty, Product Manager – Web Performance, Keynote Systems Inc, Rajeev.Kutty@Keynote.com

  2. Session Objectives • Competitive VoIP Landscape • Hidden Factors Affecting Voice Service Quality • Comparative Analysis of Voice Technologies • You can’t manage what you don’t measure • Service Quality Trends in the VoIP Industry

  3. Unique Nature of VoIP

  4. Methodology
  1. Investigation
  • What voice providers are the largest in two major metropolitan areas: San Francisco and New York?
  • What types of Internet Service Providers are being used to access these voice providers?
  2. Deployment
  • Accounts are created at voice providers using their normal customer sign-up processes
  • Calls are defined across all provider and network connection combinations
  • Each call uses the same reference audio to ensure consistency in measurement from call to call
  • All calls are made to PSTN numbers, to ensure comparable measurements
  • Measurement configurations are deployed to the Keynote agents
  3. Data Collection
  • Measurements are taken over a month
  4. Analysis
  • Data sample undergoes statistical analysis
  • Providers and carriers are investigated individually
  • Industry trends are noted
  5. Rankings
  • Rankings are created and presented
  • Best and worst providers named

  5. Voice Service Providers Profiled

  6. Network Carriers in the Study
  • New York: Time Warner Cable, Verizon DSL
  • San Francisco: Comcast Cable, at&t DSL

  7. Agent Topology

  8. Calls Compared
  • Keynote Agent dials a PSTN phone number
  • Call audio is sent and recorded for analysis
  • Keynote Agent repeats the process with the next provider
  • Each call includes the one-way transmission of a reference audio file specifically created with characteristics appropriate to voice audio quality measurement
  • All calls are placed with the Analog Telephone Adaptor hardware or the software client provided by the service provider

  9. Data Collection Period and Size
  • Data collected from August 1st – August 31st, 2007
  • Long distance and local PSTN and VoIP-to-PSTN calls were placed in both directions between New York and San Francisco on every VoIP provider and network combination once every 30 minutes
  • Time Warner Digital Phone calls were placed every 30 minutes from New York to destinations in New York and San Francisco
  • Comcast Digital Voice calls were placed every 30 minutes from San Francisco to destinations in New York and San Francisco
  • A total of over 102,000 phone calls were placed: over 10,000 calls per VoIP service provider, and over 2,800 calls on each PSTN and PacketCable service provider

  10. Ranking Methodology – Reliability Performance Factors • Service Availability • Call Completion • Average Answer Time • Dropped Audio • The Reliability index ranking was computed based on the Service Availability, Call Completion, Average Answer Time, and Dropped Audio Performance Factors. • Each provider earns points based on their performance relative to the range of performance measured for each factor. • Service Availability and Call Completion are each worth 40% of the total, Dropped Audio is worth 15% of the total, and Average Answer Time is worth 5% of the total. • The final score for each provider is scaled out of a possible 1000 points.
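The weighted, range-normalized scoring described above can be sketched in a few lines. This is a hypothetical reconstruction, not Keynote's actual implementation: the factor names, sample numbers, and exact normalization are illustrative assumptions; only the weights (40/40/15/5) and the 1000-point scale come from the slide.

```python
# Sketch of the Reliability index: each factor is normalized against the
# range measured across providers, weighted, and scaled to 1000 points.
WEIGHTS = {
    "service_availability": 0.40,  # higher is better
    "call_completion": 0.40,       # higher is better
    "dropped_audio": 0.15,         # lower is better
    "answer_time": 0.05,           # lower is better
}
HIGHER_IS_BETTER = {"service_availability", "call_completion"}

def reliability_index(providers):
    """providers: dict of name -> dict of factor -> measured value."""
    scores = {name: 0.0 for name in providers}
    for factor, weight in WEIGHTS.items():
        values = [p[factor] for p in providers.values()]
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0  # avoid division by zero when all providers tie
        for name, p in providers.items():
            frac = (p[factor] - lo) / span
            if factor not in HIGHER_IS_BETTER:
                frac = 1.0 - frac  # a lower measured value earns more points
            scores[name] += weight * frac * 1000
    return scores

# Invented sample measurements for two providers
providers = {
    "A": {"service_availability": 99.9, "call_completion": 99.5,
          "dropped_audio": 0.2, "answer_time": 2.1},
    "B": {"service_availability": 99.0, "call_completion": 97.0,
          "dropped_audio": 1.5, "answer_time": 3.4},
}
print(reliability_index(providers))  # A, best on every factor, earns the full 1000 points; B earns 0
```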

  11. True End-to-End Monitoring Methodology
  [Diagram: the complete voice path: SoftPhone or VoIP phone, user access network, core network, IP-PSTN gateway, PSTN network, and POTS endpoint. Other tools measure only within the core network; Keynote measures the full path the customer actually experiences.]

  12. Last Mile Impairments: Measuring Within Network is Not Enough

  13. Holistic Customer Experience
  Voice Service Quality breaks down into three areas:
  • Audio Clarity – Average Mean Opinion Score (MOS); % Calls > Acceptable MOS; MOS Geographic Variability
  • Responsiveness – Average Audio Delay; % Calls > Acceptable Delay; Audio Delay Geographic Variability
  • Reliability – Service Availability; # Dropped Calls; Average Answer Time
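The three quality dimensions above lend themselves to a simple per-provider record. Below is a sketch; the field names are assumptions (not Keynote's schema), and the two flag thresholds are the TIA TSB-116 MOS target of 4.0 and the ITU-T G.114 one-way delay target of 150 ms cited later in the deck.

```python
from dataclasses import dataclass

@dataclass
class VoiceScorecard:
    """Hypothetical per-provider scorecard mirroring the three dimensions."""
    # Audio Clarity
    avg_mos: float                  # average Mean Opinion Score (1.0 to 5.0)
    pct_calls_acceptable_mos: float
    mos_geo_variability: float
    # Responsiveness
    avg_audio_delay_ms: float
    pct_calls_acceptable_delay: float
    delay_geo_variability: float
    # Reliability
    service_availability: float     # percent
    dropped_calls: int
    avg_answer_time_s: float

    def flags(self):
        """Coarse pass/fail flags against commonly cited thresholds."""
        return {
            "clarity_ok": self.avg_mos >= 4.0,           # TIA TSB-116 'satisfied'
            "delay_ok": self.avg_audio_delay_ms <= 150,  # ITU-T G.114 target
        }

# Invented sample values for one provider
card = VoiceScorecard(4.1, 92.0, 0.1, 140.0, 88.0, 12.0, 99.9, 3, 2.5)
print(card.flags())  # {'clarity_ok': True, 'delay_ok': True}
```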

  14. Keynote Voice Perspective Agent Technology
  • Caller Agent – initiates calls, requests an audio sample, and compares received and reference audio samples
  • Responder Agent – accepts calls and sends the audio sample
  Agent locations:
  • New York – Cable/DSL/Sprint (Caller & Responder)
  • San Francisco – Cable/DSL/Sprint (Caller & Responder)
  • Hackensack, New Jersey – Cable/DSL (Caller)
  • Alexandria, Virginia – Cable/DSL (Caller)
  • Tampa – Cable/DSL (Caller)
  • Plano – FiOS
  • Carrollton/Dallas – Cable/DSL (Caller)

  15. Key Performance Indicator Scorecard

  16. Case Study: Invisible Annoyance
  Customer Problem – Low MOS score for > 90% of calls
  Keynote Analysis – Analyzed the audio characteristics of all calls in the problem period using Voice Perspective; the silence-period frequency profile showed an audible hum on 70% of the VoIP agents

  17. Case Study: Invisible Annoyance (continued)
  Diagnosis – The hum problem correlated strongly with a specific hardware ATA model type

  18. Case Study: Invisible Annoyance (continued)
  Improvement – The problem was traced to a specific telephone adapter model type; after replacing the adapters, the Audio Clarity ranking improved by two places and customer satisfaction increased (Mean Opinion Score rose by 0.3)

  19. Study Results Overview

  20. Summary of Results • The two PSTN service providers outperformed the other service providers in both Reliability and Audio Quality • PacketCable providers suffered from weaker performance in this study than in the previous study • All VoIP providers had a lower rate of calls with dropped audio on the DSL network connection than on the cable modem network connection

  21. Summary of Results – Audio Delay • PSTN and PacketCable service providers measured a geometric mean one-way audio delay below the 150 ms threshold for end user satisfaction • Most VoIP service providers measured a geometric mean one-way audio delay between 150 and 250 ms • The best geometric mean audio delay for VoIP providers was 149 ms, and the worst was 279 ms • Performance issues with the San Francisco cable modem connection on the 14th adversely affected some VoIP providers
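The geometric mean figures quoted in this and the following slides can be reproduced from per-call samples; the delay values below are invented for illustration. The geometric mean is less inflated by occasional spikes than the arithmetic mean, which suits skewed delay distributions.

```python
import math

def geometric_mean(samples):
    """Geometric mean via logs, numerically stable for long sample lists."""
    return math.exp(sum(math.log(x) for x in samples) / len(samples))

# Invented one-way audio delay samples (ms) for one provider,
# including a single congestion spike
delays_ms = [120, 130, 145, 160, 400]
print(round(geometric_mean(delays_ms)))  # 171, well below the arithmetic mean of 191
```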

  22. Summary of Results – MOS • Only two providers in this study had a geometric mean MOS below 3.7 • Five service providers had a geometric mean MOS over 4.0; the best geometric mean MOS was a 4.20, and the worst was a 3.05 • Problems on the San Francisco cable modem connection affected Comcast Digital Voice and all VoIP providers on the 25th, but affected those with the highest geometric mean MOS values the most • Most service providers do not have problems with hiss, static, or high frequency clipping, but can have many calls with temporal clipping or audio holdover

  23. Summary of Results – Variations • In general, voice service providers had a higher worst-case hourly variation between the prime and non-prime hours of the day in audio delay than in Mean Opinion Score • Cable modem connections delivered more consistent prime vs. non-prime worst-case hourly audio delay performance • All VoIP providers had a lower rate of calls with dropped audio on the DSL network connection than on the cable modem network connection

  24. Reliability – Service Types • During this study period, PSTN service providers were more reliable than PacketCable or VoIP Hard Phone service providers.

  25. Audio Quality – Service Types • PSTN service providers had better overall audio quality than PacketCable or VoIP Hard Phone service providers

  26. Audio Characteristics of PSTN • The most common poor audio characteristics encountered on PSTN are audio holdover, other clipping, front clipping, and hum • Back clipping, front clipping, hiss, audio holdover, hum, and other clipping occur with a much higher frequency in calls that measured a MOS below 3.1 • Back clipping, high frequency clipping, hiss, and static occur only rarely [Note: Combined totals and percentages for at&t PSTN and Verizon PSTN service providers]

  27. Audio Characteristics of PacketCable Providers • The most common poor audio characteristics encountered on PacketCable providers are audio holdover and other clipping • Back clipping and other clipping occur with much higher frequency in calls that measured a MOS below 3.1 • No calls on the two PacketCable service providers had measurable levels of hiss or static [Note: Combined totals and percentages for Comcast Digital Voice and Time Warner Digital Phone service providers]

  28. Audio Characteristics of Hard Phone Providers • The most common poor audio characteristics encountered on Hard Phone providers are audio holdover, front clipping, other clipping, and hum • Each of these poor audio characteristics occurs with a much higher frequency in calls that measured a MOS below 3.1 • Back clipping, high frequency clipping, hiss, and static occur only rarely [Note: Combined totals and percentages for AT&T CallVantage, EarthLink trueVoice, Packet8, Primus Lingo, SunRocket, Verizon VoiceWing, Vonage, and Vonics Digital service providers]

  29. Summary

  30. Industry Trends • Continuous improvement in reliability • PSTN and PacketCable service quality gap narrowing • VoIP Service Availability still needs improvement • VoIP service as a whole is improving, with PacketCable leading the other voice technologies

  31. Industry Areas of Focus • Improving call completion rate – Only two of the twelve voice service providers had a call completion rate of 99.5% or higher. The worst VoIP provider had a call completion rate below 90%. • Lower audio delay – Only one of the VoIP service providers had a geometric mean one-way audio delay below 150 ms, a target value recommended in ITU-T Standard G.114. The worst VoIP service provider had a geometric mean audio delay of 279 ms. • Better MOS performance – While the best voice service providers have geometric mean MOS over 4.0, the worst VoIP service providers have very poor MOS performance. One provider measured a geometric mean MOS below 3.1.

  32. Network Challenges Faced by Contact Centers
  [Diagram: Keynote public (caller) agents place calls over the PSTN and VoIP networks, through an IP-PSTN gateway, to contact centers in New York, Chicago, Dallas, LA, and SFO; Keynote Responders (KR) are deployed at the contact centers and a branch office.]

  33. Improving VoIP Quality • Invest in planning • Measure service holistically • Focus on end user experience • Watch the competition

  34. Q&A Thanks

  35. Ranking Methodology – Audio Quality
  The Audio Quality index ranking is based on Keynote extensions of the Apdex* standard to represent user satisfaction with audio quality:
  • Mean Opinion Score (MOS) [T, F] = [4.0, 3.1]**
  • Audio Delay (ms) [T, F] = [150, 400]***
  Each call is determined to be in the Satisfied, Tolerating, or Frustrated performance range for MOS and audio delay, based upon industry standard thresholds. The index is then computed as:

  Index = 1000 × (Satisfied count + Tolerating count / 2) / Total samples

  * See http://www.apdex.org/
  ** Thresholds based on Telecommunications Industry Association Technical Services Bulletin 116, “Voice Quality Recommendations for IP Telephony”.
  *** Thresholds based on International Telecommunication Union standard ITU-T G.114, “One-way transmission time”.
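The Apdex-style index on the slide above can be sketched in a few lines. The [T, F] thresholds are taken from the slide; the per-call MOS samples and function names are invented for illustration.

```python
def classify(value, t, f, higher_is_better=True):
    """Bucket one measurement as satisfied, tolerating, or frustrated
    against an Apdex-style [T, F] threshold pair."""
    if higher_is_better:
        if value >= t:
            return "satisfied"
        if value >= f:
            return "tolerating"
    else:  # e.g. audio delay, where lower is better
        if value <= t:
            return "satisfied"
        if value <= f:
            return "tolerating"
    return "frustrated"

def apdex_index(buckets):
    """Index = 1000 * (Satisfied + Tolerating/2) / Total samples."""
    satisfied = buckets.count("satisfied")
    tolerating = buckets.count("tolerating")
    return 1000 * (satisfied + tolerating / 2) / len(buckets)

# Invented per-call MOS values for one provider; MOS [T, F] = [4.0, 3.1]
mos_values = [4.2, 4.1, 3.8, 3.5, 2.9]
labels = [classify(m, t=4.0, f=3.1) for m in mos_values]
print(apdex_index(labels))  # 2 satisfied, 2 tolerating, 1 frustrated -> 600.0
```

For audio delay the same helper applies with `higher_is_better=False` and [T, F] = [150, 400] ms.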
