1 Quality of Service Development Group (QSDG) Meeting (2016)
Parameters/Thresholds for Quality of Voice and Data Services (Case of Ghana)
Isaac Annan Laryea, Monitoring and Compliance, NCA-Ghana
Amsterdam 2016

2 Outline
QoS Legal Framework
Role of NCA / Monitoring & Compliance
What We Measure
Verification of Compliance Statistics
Operations
Tools
Main Indicators Measured
Targets for Indicators
Impact (Operators & Consumers)
Way Forward
Amsterdam 2016

3 Legal Framework
Electronic Communications Act, 2008 (Act 775), Section 6(2): The Authority shall specify
(a) quality of service indicators for classes of public telecommunications service, and
(b) the means to enforce a licensee's compliance with its stated quality of service standards, including measures by which a licensee shall compensate users adversely affected by a failure to provide electronic communications service in accordance with the standards.
For any QoS regulation to be effective, there must be a legal framework.
Amsterdam 2016

4 Role of NCA / Monitoring & Compliance in Ghana
QoS assurance through periodic monitoring and evaluation.
Adopts internationally interoperable standards.
Establishes equipment type approval regulations, and carries out and certifies type approval of equipment.
Sets/adopts standards and ensures compliance with them.
Establishes and maintains a mutually conducive environment for operators, the public and authorities that promotes and safeguards consumer interests.
Monitors operators' performance and directs improvements where necessary.
Publishes industry performance in the media on a quarterly basis for consumers.
Amsterdam 2016

5 What We Measure !!!
In defining parameters, the following factors, among others, are generally taken into consideration:
The practicability for operators to make the required measurements
The practicability for regulators or any independent entity to audit the results
The measurements made should capture the customer experience and its influence on satisfaction.
In effect, both subjective and objective measurements are applied:
Subjective methods: surveying users
Objective methods: making test calls, sampling calls, counting complaints
Amsterdam 2016

6 Verification of Compliance Statistics
Operators periodically submit statistics on network performance and customer support, and these are independently verified.
Verification also takes place during monitoring; the statistics include network coverage, outages, consumer complaint reports and network performance statistics.
Only customer-centric statistics are monitored in the drive tests.
Operators are also required to report any major network faults, and a database is constantly updated.

7 Operations
Voice services are evaluated end to end, using the "voice call" as the basic test unit. Data is collected in all areas, with more concentration on major towns, highways and key installations.
Data measurements are done in a stationary vehicle at selected hotspots with a mobile measurement system.
Tests are performed automatically, with no human intervention, regardless of the technology used.
The setup ensures that both the slave and the master assess the field; the MO and MT calls are assessed at the master.
Network coverage is assessed by measuring downlink signal levels: RxLev (Received Signal Level) for GSM, CPICH RSCP (Common Pilot Channel Received Signal Code Power) for WCDMA, the 1x/EV-DO signal level for CDMA, and RSRP for LTE.
Drive testing across GPRS/EDGE/WCDMA/CDMA 1x-EVDO/LTE.
Amsterdam 2016
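To make the coverage assessment concrete, here is a minimal illustrative sketch (Python) of binning drive-test signal samples into coverage classes. The dBm thresholds below are assumptions chosen for illustration only, not NCA-published limits, and each technology is keyed by its own metric (RxLev, RSCP, RSRP).

```python
# Minimal sketch: classify drive-test signal samples into coverage classes.
# The dBm thresholds are illustrative assumptions, not published NCA values.
from collections import Counter

# Hypothetical per-technology thresholds (dBm): (good, fair) boundaries.
THRESHOLDS = {
    "GSM_RxLev":  (-75, -90),
    "WCDMA_RSCP": (-85, -100),
    "LTE_RSRP":   (-95, -110),
}

def classify(metric: str, dbm: float) -> str:
    good, fair = THRESHOLDS[metric]
    if dbm >= good:
        return "good"
    if dbm >= fair:
        return "fair"
    return "poor"

# Example drive-test samples: (metric, measured level in dBm).
samples = [("GSM_RxLev", -68), ("WCDMA_RSCP", -97), ("LTE_RSRP", -112)]
print(Counter(classify(m, v) for m, v in samples))  # tally of good/fair/poor samples
```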

8 Tools
The tool is a multichannel benchmarking platform:
Provides a direct comparison of multiple networks during a single test drive
Captures quality and radio parameters from actual subscriber devices and utilizes standardized algorithms (PESQ and POLQA)
Evaluates the network end to end, utilizing the devices and services used by customers, to provide QoE
Utilizes multiple devices (phones, modems, scanners) all running in parallel
Speech quality testing: PESQ (ITU-T P.862) and POLQA (ITU-T P.863) algorithms for MOS quality; mobile-to-PSTN and mobile-to-mobile calls
Data testing: ping, FTP (get and put), HTTP (download and upload), iperf (UDP and TCP), IE browser
Uses complex scripting capability to emulate customer activity, which helps in post-processing analysis
Amsterdam 2016
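As a rough illustration of one of the data tests listed above, the sketch below measures downlink throughput with a plain HTTP download. The URL is a placeholder, and the benchmarking platform's own test logic is certainly more elaborate than this.

```python
# Minimal sketch of an HTTP download throughput test, in the spirit of the
# data tests listed above (ping, FTP, HTTP, iperf). The URL is a placeholder.
import time
import urllib.request

def http_download_kbps(url: str, chunk_size: int = 64 * 1024) -> float:
    """Download the resource and return the average throughput in kbps."""
    start = time.monotonic()
    total_bytes = 0
    with urllib.request.urlopen(url, timeout=30) as resp:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return (total_bytes * 8 / 1000) / elapsed  # kilobits per second

# Example (placeholder test file):
# print(http_download_kbps("http://example.com/testfile.bin"))
```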

9 Main Indicators Measured – Voice
Network Coverage: signal strength of the radio networks' coverage
Voice Call Set-up Time: period of time the network takes to establish the communication, after the correct sending of the request (target telephone number)
Voice Call Set-up Congestion: probability of failure to access a signalling channel during set-up
Voice Call Completion Rate: probability that a call, after being successfully set up, is maintained during a period of time and ends normally, i.e. according to the user's will
Voice Call Congestion Rate: probability of failure to access a traffic channel during call set-up
Voice Call Drop Rate: probability of a call terminating against the users' will
Voice Call Audio Quality: perceptibility of the conversation during a call
Amsterdam 2016

10 Main Indicators Measured – Data
Data Access Time: the time taken to activate a PDP Context for data service (from the moment the PDP Request message is sent to the moment the PDP Accept message is received)
Data Access Success Rate: the probability of success in connecting to the public server
Data Drop Rate: the probability of the connection to the public server dropping without the end user's intervention
Throughput: the rate of data transfer, in kbps
Amsterdam 2016
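A minimal sketch of how these four data KPIs could be computed from per-session test records follows. The record layout is an assumed structure for illustration, not the format produced by the measurement tool.

```python
# Minimal sketch computing the data KPIs above from per-session test records.
# The DataSession layout is an assumed structure for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataSession:
    pdp_request_s: float           # time the PDP Context Request was sent
    pdp_accept_s: Optional[float]  # time the PDP Context Accept was received (None = failed)
    dropped: bool                  # connection lost without the end user's intervention
    bytes_transferred: int
    transfer_time_s: float

def data_kpis(sessions):
    attempts = len(sessions)
    successes = [s for s in sessions if s.pdp_accept_s is not None]
    return {
        "access_time_s": [s.pdp_accept_s - s.pdp_request_s for s in successes],
        "access_success_rate_pct": 100.0 * len(successes) / attempts,
        "drop_rate_pct": 100.0 * sum(s.dropped for s in successes) / len(successes),
        "throughput_kbps": [s.bytes_transferred * 8 / 1000 / s.transfer_time_s
                            for s in successes],
    }

# Example with three illustrative sessions (one failed PDP activation, one drop):
sessions = [
    DataSession(0.0, 2.3, False, 5_000_000, 40.0),
    DataSession(0.0, None, False, 0, 0.0),
    DataSession(0.0, 1.8, True, 2_000_000, 25.0),
]
print(data_kpis(sessions))
```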

11 Targets for Indicators – Voice
Voice Call Set-up Time
Call set-up time is the period of time elapsing from the sending of a complete destination address (target telephone number) to the setting up of the call.
Call set-up time [s] = t(calling signal) − t(address sending)
t(address sending): moment the user presses the send button
t(calling signal): moment the call signal is heard on the caller's terminal
Compliance requirement: Call set-up time is better than ten (10) seconds 95% of the time.
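A minimal sketch of checking this target against drive-test timestamps, assuming each test call records the two moments defined above:

```python
# Minimal sketch of the call set-up time target: better than 10 s in 95% of cases.
# Each sample is (t_address_sending, t_calling_signal) in seconds, per the formula above.

def setup_time_compliant(samples, limit_s=10.0, required_fraction=0.95):
    setup_times = [t_signal - t_send for t_send, t_signal in samples]
    within = sum(t < limit_s for t in setup_times)
    return within / len(setup_times) >= required_fraction

# Example: three test calls with set-up times of 4.2 s, 6.8 s and 11.5 s.
print(setup_time_compliant([(0.0, 4.2), (10.0, 16.8), (30.0, 41.5)]))  # False: only 2/3 within 10 s
```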

12 Targets for Indicators cont.
Voice Call Set-up Congestion
The probability of failure to access a signalling channel during set-up.
Set-up congestion [%] = (Number of blocked calls / Total number of call attempts) × 100%
Compliance requirement: Stand-alone Dedicated Control Channel (SDCCH) congestion less than or equal to 1%.
Voice Call Congestion
The probability of failure to access a traffic channel during call set-up.
Call congestion [%] = (Number of failed calls / Total number of call attempts) × 100%
Compliance requirement: Traffic channel congestion less than or equal to 1%.
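A minimal sketch of both congestion KPIs and their 1% thresholds, using the counts named in the formulas above:

```python
# Minimal sketch of the two congestion KPIs and their 1% compliance thresholds.

def setup_congestion_pct(blocked_calls: int, call_attempts: int) -> float:
    """SDCCH set-up congestion: blocked calls as a share of all attempts."""
    return 100.0 * blocked_calls / call_attempts

def call_congestion_pct(failed_calls: int, call_attempts: int) -> float:
    """Traffic channel (TCH) congestion during call set-up."""
    return 100.0 * failed_calls / call_attempts

attempts = 2000
print(setup_congestion_pct(12, attempts) <= 1.0)  # 0.6%  -> compliant
print(call_congestion_pct(35, attempts) <= 1.0)   # 1.75% -> not compliant
```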

13 Targets for Indicators cont.
Voice Call Completion Rate
The probability that a call, after being successfully set up, is maintained during a period of time and ends normally, i.e. according to the user's will.
Call completion [%] = (Number of normally ended calls / Total number of call attempts) × 100%
Compliance requirement: Call completion greater than or equal to 70%.
Voice Call Drop Rate
The probability of a call terminating against the users' will.
Call drop rate [%] = (Number of calls terminated unwillingly / Total number of successfully established calls) × 100%
Compliance requirement: Call drop rate less than or equal to 3%.
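A minimal sketch of the completion and drop checks; note that the drop-rate denominator (successfully established calls) is an assumption, since the slide's formula omits it.

```python
# Minimal sketch of call completion (>= 70%) and call drop (<= 3%) checks.
# The drop-rate denominator (successfully established calls) is an assumption.

def call_completion_pct(normally_ended: int, call_attempts: int) -> float:
    return 100.0 * normally_ended / call_attempts

def call_drop_pct(dropped: int, established: int) -> float:
    return 100.0 * dropped / established

attempts, established, normal, dropped = 1000, 960, 930, 22
print(call_completion_pct(normal, attempts) >= 70.0)  # 93.0%  -> compliant
print(call_drop_pct(dropped, established) <= 3.0)     # ~2.3%  -> compliant
```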

14 Targets for Indicators – Data
Data Access Time
From the moment the PDP Request message is sent to the moment the PDP Accept message is received.
Compliance requirement: Data access time better than five (5) seconds 100% of the time.
Data Access Success Rate
The probability of success in connecting to the public server.
Compliance requirement: Data access success rate greater than or equal to 95%.
Data Drop Rate
The probability of the connection to the public server dropping without the end user's intervention.
Compliance requirement: Data drop rate equal to or less than one per cent (1%).
Throughput
Minimum downlink data speed rate of greater than kbps in 90% of connections.
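A minimal sketch tying the four data targets together; the throughput floor is left as a parameter because the slide does not state the kbps figure.

```python
# Minimal sketch checking the data targets above against measured KPI values.
# The throughput floor is a parameter because the slide omits the kbps figure.

def data_targets_met(access_times_s, success_rate_pct, drop_rate_pct,
                     throughputs_kbps, min_throughput_kbps):
    return {
        "access_time":  all(t < 5.0 for t in access_times_s),   # < 5 s, 100% of the time
        "success_rate": success_rate_pct >= 95.0,
        "drop_rate":    drop_rate_pct <= 1.0,
        "throughput":   (sum(t > min_throughput_kbps for t in throughputs_kbps)
                         / len(throughputs_kbps)) >= 0.90,      # 90% of connections
    }

# Placeholder floor below; the slide leaves the kbps figure unspecified.
print(data_targets_met([2.1, 3.4, 4.9], 97.0, 0.8, [900, 1200, 400],
                       min_throughput_kbps=256))
```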

15 2G and 3G coverage representation of a suburb in the Northern Region of Ghana

16 2G RxLev Representation

17 3G (RSCP) Coverage representation

18 POLQA Narrow band representation

19 3D Speech Quality representation with POLQA algorithm

20 POLQA narrow band representation

21 Impact (Operators & Consumers)
These actions ensure that communications providers comply with their regulatory obligations and that our market objectives and purpose are achieved.
Improvement of coverage due to directives to improve coverage in certain areas.
Helped operators identify network defects (CST) on their networks and task their vendors to find solutions.
Vast improvement in QoS performance, giving consumers good value for money.
Consumers affected by poor QoS performance have been compensated.
Amsterdam 2016

22 Way Forward
Live reporting of QoS performance on a web publisher
Standalone QoS regulations
Technical studies into quality of experience parameters for voice and data
Monitoring the QoS and QoE of data services in 4G networks (such as LTE)
Amsterdam 2016

23 THANK YOU !!! Amsterdam 2016

