When studying rankings of mobile operators in specialist publications, you often learn about maximum download and upload speeds, the reliability and availability of offered services, and data transport performance. These technical parameters describing quality of service (QoS) are the bedrock of a reliable mobile service and a well-performing network; they are the basis for subscriber satisfaction. But are they enough to reduce subscriber churn and to improve customer loyalty?
In the age of video calling – via third-party media services such as Facebook and WhatsApp – and live video streaming, for example on YouTube, it is imperative that large-scale benchmarking campaigns examine and weight not only QoS but also the end user’s quality of experience.
A campaign to evaluate quality of experience (QoE) in real field measurements requires a different setup than the typical before-and-after analysis of dedicated technical parameters and campaign key performance indicators (KPIs). Measurement campaigns based on QoE need:
- a comprehensive set of emulated use cases and QoE-centered KPIs that are supported by the measurement equipment
- a user profile structured around the needs of real human users
- measurements that include all places and venues where subscribers typically use mobile services
KPIs and measures that quantify quality of experience
In general, QoE-centered KPIs and measures are values that reflect what a user really perceives. One example is the mean opinion score (MOS) for voice and video quality. MOS is a perceptual measure. For voice, it indicates the naturalness of transmitted speech; for video, it expresses how a viewer rates the visual quality of a video.
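For voice, MOS can be estimated from network impairments via the E-model of ITU-T G.107, which condenses delay, loss and codec effects into a transmission rating factor R and then maps R to a MOS estimate. A minimal sketch of that standardized R-to-MOS mapping (computing R itself requires the full E-model and is omitted here):

```python
def r_to_mos(r: float) -> float:
    """Map an E-model transmission rating factor R to an estimated MOS
    (ITU-T G.107). R is clamped to the defined range [0, 100]."""
    if r <= 0:
        return 1.0   # worst perceivable quality
    if r >= 100:
        return 4.5   # saturation: estimated MOS never exceeds 4.5
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

# Example: a very good narrowband connection (R around 93)
print(round(r_to_mos(93.2), 2))  # → 4.41
```

Note how the mapping saturates at both ends: once R exceeds 100, further technical improvement no longer changes the perceived quality.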
QoS is something you configure; QoE is something you experience.
Apart from MOS values, in telephony it is fundamental to measure the call setup time, i.e. the overall time required to establish a call and start a conversation. For video, the equivalent is the time between requesting a video from a service and actually seeing it play.
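In a measurement log, such setup times reduce to differences between timestamped events. The sketch below assumes a hypothetical log with `dial` and `connect` timestamps; the event names are illustrative, not a standardized log format:

```python
from datetime import datetime

def setup_time_seconds(events: dict) -> float:
    """Call setup time: from initiating the call ('dial') to the moment
    the conversation can start ('connect'). Timestamps are ISO 8601 strings.
    The event names are made up for illustration."""
    t0 = datetime.fromisoformat(events["dial"])
    t1 = datetime.fromisoformat(events["connect"])
    return (t1 - t0).total_seconds()

log = {"dial": "2017-06-01T10:15:03.200", "connect": "2017-06-01T10:15:07.450"}
print(setup_time_seconds(log))  # → 4.25
```

The same pattern applies to video: replace the call events with "video requested" and "first frame rendered" timestamps.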
For voice and video, QoE measures appear obvious. Measures describing the QoE of data or messaging services are also defined in our test solutions. These include the download time of highly popular web pages or files, such as new applications, and the time it takes to deliver messages or post pictures on a social media timeline.
Compared with plain technical metrics that map directly to a technical value, such as dB or Mbit/s, measures quantifying quality of experience are non-linear: they are bounded by saturation regions where perception no longer changes.
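This saturation behavior can be illustrated with a purely hypothetical logistic mapping from download throughput to a perceptual score on a 1 to 5 scale. The midpoint and slope parameters below are invented for illustration and are not taken from any standard:

```python
import math

def perceived_score(mbit_s: float, midpoint: float = 10.0, slope: float = 0.4) -> float:
    """Illustrative logistic mapping: the score saturates near 5 once
    throughput is 'fast enough' and near 1 when it is very low.
    midpoint and slope are invented parameters, not standardized."""
    return 1.0 + 4.0 / (1.0 + math.exp(-slope * (mbit_s - midpoint)))

# Doubling throughput at the high end barely changes perception:
print(round(perceived_score(100) - perceived_score(50), 4))  # → 0.0
```

At 50 Mbit/s the score is already within a fraction of a percent of its maximum, so a network that doubles throughput beyond that point gains nothing in perceived quality.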
With this in mind, it becomes clear that it is not enough to test a network’s core services, such as telephony, SMS and data throughput. Popular media services, such as Facebook, WhatsApp and YouTube, also need to be considered since customers are increasingly using these types of services in their everyday routines.
Transparency, fairness, comparative results
How can service providers guarantee transparency and fairness and obtain comparative results? On the one hand, they must evaluate all available networks under the same conditions. This means all operators need to be measured simultaneously, at the same location and under the same test conditions. On the other hand, to perform extensive drive and walk tests within a very short time frame, they need equipment that is highly reliable, up and running quickly and capable of continuously collecting masses of data for hours.
Beyond the reliability of the equipment, mobile benchmarking defines its own set of additional requirements. The basic idea is to compare different networks on a fair and equal basis. Fairness entails the discerning selection of tests, transparent descriptions of all test cases and the preferred use of standardized KPIs and metrics. Moreover, to guarantee congruent conditions across all networks, all test cases must be identical and conducted synchronously.
Commercial phones measure quality of experience
Regardless of whether driving or walking, all of our measurement solutions use commercially available smartphones as measurement probes to ensure equivalent test procedures and results – and to be as close as possible to what end users perceive. To compare networks on a fair basis, the smartphones have to be state-of-the-art and support all the latest technologies and services.
Benchmarking, especially when performed for the public and focused on QoE metrics, requires bulletproof data. This is ensured by the consistent use of standardized measurement procedures wherever applicable.
Along with standardized procedures, full transparency is indispensable. Benchmarked operators have to know what was measured, how it was rated and how the final ranking was calculated. A clear rating procedure based on statistically significant amounts of data obtained in a transparent manner makes the ranking easy to understand.
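A rating of this kind typically boils down to a transparent weighted aggregation of per-category scores. The category names and weights in the sketch below are hypothetical placeholders, not the weighting actually applied in any published campaign:

```python
# Hypothetical category weights (summing to 1); real campaign weights
# would be documented in the published ranking methodology.
WEIGHTS = {"voice": 0.35, "data": 0.45, "media_services": 0.20}

def overall_score(category_scores: dict) -> float:
    """Weighted aggregate of per-category scores (each on a 0-100 scale)."""
    assert set(category_scores) == set(WEIGHTS), "need a score for every category"
    return sum(WEIGHTS[c] * s for c, s in category_scores.items())

operator_a = {"voice": 92.0, "data": 88.0, "media_services": 81.0}
print(round(overall_score(operator_a), 1))  # → 88.0
```

Publishing the weights alongside the per-category results is what makes the final ranking reproducible by the benchmarked operators themselves.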
For a detailed ranking overview, including the underlying principle and the weightings as applied in the ranking of network operators in the German-speaking market (Germany, Austria, Switzerland) in 2017, read the customer case study “Independent measurement campaigns – mobile operators in the public eye”. The case study also illustrates the benefits of close collaboration between service providers and equipment vendors such as ourselves.