The Network Performance Score (NPS) reflects the perceived technical performance of a mobile network or one of its subsets, such as a region, period, or technology. The overall NPS combines carefully selected and weighted main key performance indicators (KPIs) for a wide range of services that are essential to and exemplary of service quality. But what needs to be planned and prepared to get meaningful results from NPS benchmarking campaigns?
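The idea of combining weighted KPI sub-scores into one overall score can be sketched in a few lines of Python. The KPI names, point values, and weights below are invented for illustration and are not the official ETSI weighting:

```python
# Illustrative sketch: combine per-KPI point scores into an overall score
# via a normalized weighted sum. All names, values, and weights below are
# hypothetical, not the ETSI TR 103 559 weighting.

def combine_score(kpi_points: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of per-KPI points, normalized by the total weight used."""
    total_weight = sum(weights[k] for k in kpi_points)
    return sum(kpi_points[k] * weights[k] for k in kpi_points) / total_weight

# Hypothetical sub-scores on a 0-1000 point scale with hypothetical weights.
kpi_points = {"data": 850.0, "voice": 790.0}
weights = {"data": 0.6, "voice": 0.4}

print(round(combine_score(kpi_points, weights), 1))  # prints 826.0
```

Normalizing by the total weight keeps the overall score on the same point scale as the sub-scores, even if a KPI is missing from a particular data set.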
This post explains how to conduct an NPS benchmarking campaign so that it gives you statistically sound insight into network quality. It describes the selection of critical test configuration elements, such as websites for browsing tests, and how you can gain fast insights into your network from hundreds of hours of measurement data.
Governing principles for NPS campaigns
Rohde & Schwarz has implemented the NPS according to the methodology published by ETSI in TR 103 559. According to the ETSI report, there are some important principles to be followed to achieve fair and accurate benchmarking results. These principles include, for example,
- fair play and test method transparency;
- appropriate test device selection and similar placement of devices/antennas in the test setup;
- selection of a neutral test server that is not, for example, hosted by one of the competing providers;
- selection of a representative and sufficiently large set of popular web pages for browsing tests.
Fair play also means that the devices used for testing should not be treated differently by the operators, for example, by installing special firmware or by detecting the recognizable traffic pattern produced during testing. Boosting the performance of devices used for testing would lead to results that are not comparable to the real customer experience.
Test method transparency includes not only an explicit declaration of the scoring model used, the underlying KPI definitions, the number of collected samples, and the detailed test configuration; it also covers the device model, firmware version, and the tariff used for data collection, as these specifications are critical for interpreting the results.
A systematic approach to NPS campaigns
It is a rather complex task to set up a countrywide benchmarking campaign that yields representative results. We can break it down into a series of steps.
1. Plan regional categories and routes
One of the first topics to consider is how to get a data sample that is representative of a whole country or market. There is no single answer that fits all cases; it depends on the benchmarking campaign’s focus and goals.
The underlying metric could be representative of the whole population or of the number of active mobile network users. It could put special weight on a certain area, either because it is highly important, like an international airport, or because a mobile connection might be the only connectivity available in that region.
Once the focus and goals are clear, the next step is to transfer this into a set of regional categories and weight their importance. Usually, a campaign for testing the network quality in a big European country should include categories for big cities, smaller towns, connecting roads, trains, and hotspots such as highly frequented airports.
In other countries, public transport may be less important, and train measurements can be omitted, or there might not be any big cities or hotspots to measure. It is also possible to set up a campaign for just one city and define the different city quarters as categories so that it is easy to see which part of the city has the most potential for mobile network improvements.
Absent other factors, we propose a population-centered approach to region selection, with the goal of covering at least 25% of the population. The choice of measurement locations has to be representative of the overall distribution so that the measurement results resemble the hypothetical distribution across the whole population, not only for the overall score but also for each KPI.
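A simple way to operationalize the 25% coverage goal is a greedy selection: pick the most populous candidate regions until the cumulative population reaches the target. This is a minimal sketch under assumed region names and population figures; real campaigns would also balance the regional categories described above:

```python
# Minimal sketch of population-centered region selection: greedily pick
# the largest candidate regions until at least `coverage` of the national
# population is reached. Region names and populations are made up.

def select_regions(region_pop: dict[str, int], national_pop: int,
                   coverage: float = 0.25) -> list[str]:
    """Return the largest regions whose cumulative population meets the target."""
    selected, covered = [], 0
    for name, pop in sorted(region_pop.items(), key=lambda kv: kv[1], reverse=True):
        if covered >= coverage * national_pop:
            break
        selected.append(name)
        covered += pop
    return selected

regions = {"Big City A": 2_000_000, "Big City B": 1_500_000,
           "Town C": 400_000, "Town D": 300_000}
print(select_regions(regions, national_pop=12_000_000))
# prints ['Big City A', 'Big City B']  (3.5M >= 25% of 12M)
```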
2. Set up the measurement system
A typical system setup includes three separate devices for data and voice measurements (A- and B-side of telephony tests) per operator and optionally an RF scanner. Data and voice tests are always executed at the same time and in the same geographic location, and the devices should be synchronized between the different operators to conduct the tests in parallel and keep the results comparable.
The device selection considers the country’s current technology requirements for a fair evaluation of all network operators. The devices should, for example, be 5G-capable, even if only one of the local operators has rolled out 5G. Not only existing radio technologies have to be covered but also special services such as VoLTE, and, of course, the devices must support the frequency bands licensed in the country.
In this context, the installed firmware version and the used SIM cards are of high importance. It might be necessary to set up different systems for drive and walk tests, for example, using a non-portable system in a drive test vehicle and a portable system in a backpack, respectively. Nevertheless, the tests and campaign structure must be strongly aligned between the used systems for drive and walk tests.
3. Select the websites for browsing tests
HTTP browsing is one of the most commonly used web applications in mobile networks. In this context, browsing stands for all applications downloading HTML content from multiple sources to the end-user device, which is also typical for most smartphone apps. Consequently, HTTP browsing performance is a core metric for benchmarking the quality of experience (QoE) and is weighted strongly in the overall NPS.
When testing HTTP browsing, several factors need to be considered. These include the success ratio and download time, both of which depend heavily on the website structure and the connection to the content delivery network (CDN).
Today’s popular websites are highly dynamic, which means that content and advertisements change within short periods. Therefore, multiple different websites are included in benchmarking campaigns to diversify and average the sites’ individual behaviors. Moreover, they have to follow a specific set of rules: it is advisable to avoid websites
- of services that are predominantly accessed via a dedicated app on the smartphone, for example, Facebook,
- with a straightforward structure and a small amount of data, for example, a login or landing page,
- with an exceptional amount of advertisements or embedded long or live video content, as these elements appear complete to the user before the download finishes.
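The rules above can be turned into a simple pre-selection filter over candidate sites. The metadata fields (`page_size_kb`, `app_dominant`, `ad_or_live_video_heavy`) and the size threshold are illustrative assumptions, not part of the ETSI methodology; in practice, such attributes would come from a manual or automated site survey:

```python
# Hypothetical pre-selection filter applying the website rules above.
# Field names and the size threshold are illustrative assumptions.

MIN_PAGE_SIZE_KB = 500  # assumed cutoff to skip trivial login/landing pages

def eligible(site: dict) -> bool:
    """A site qualifies if it is not app-dominant, not ad/video-heavy,
    and carries a non-trivial amount of data."""
    return (not site["app_dominant"]
            and not site["ad_or_live_video_heavy"]
            and site["page_size_kb"] >= MIN_PAGE_SIZE_KB)

candidates = [
    {"url": "https://news.example",   "page_size_kb": 2400,
     "app_dominant": False, "ad_or_live_video_heavy": False},
    {"url": "https://social.example", "page_size_kb": 3100,
     "app_dominant": True,  "ad_or_live_video_heavy": False},
    {"url": "https://login.example",  "page_size_kb": 80,
     "app_dominant": False, "ad_or_live_video_heavy": False},
]

print([s["url"] for s in candidates if eligible(s)])
# prints ['https://news.example']
```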
4. Configure the campaign in SmartBenchmarker
After the system setup and the website selection, the task of setting up the actual measurement campaigns is next. This used to be an error-prone task, requiring a complex test structure to be configured with all parameters compliant with the measurement scenario defined by Rohde & Schwarz based on the ETSI documentation.
A sufficiently performant, neutrally hosted test server has to be set up for HTTP and ping testing, and some video streams, preferably with live content, have to be selected for video testing. After the setup, it is recommended to perform a small pre-study to validate the system configuration.
5. Do the drive testing and data import
It can take many days or several weeks of drive testing to collect the needed data. The number of days can be reduced by increasing the number of measurement systems used in parallel.
During a running campaign, it is important to keep an eye on the test results and validate them at least daily. For example, the URL of a YouTube live video stream might change, causing all video tests to fail from that point on. These kinds of problems should be spotted early to avoid redrives.
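A daily sanity check of this kind can be as simple as comparing each test type's failure ratio against its running baseline. The threshold logic and the numbers below are illustrative assumptions, not part of any product:

```python
# Illustrative daily validation check: flag test types whose failure ratio
# jumps far above the campaign baseline (e.g. a changed live-stream URL
# making every video test fail). Threshold and data are assumptions.

def flag_anomalies(daily_fail_ratio: dict[str, float],
                   baseline: dict[str, float],
                   factor: float = 3.0, floor: float = 0.05) -> list[str]:
    """Return test types whose failure ratio exceeds factor x baseline
    (with an absolute floor to ignore noise on near-zero baselines)."""
    return [t for t, r in daily_fail_ratio.items()
            if r > max(factor * baseline.get(t, 0.0), floor)]

baseline = {"video": 0.02, "http_browsing": 0.03, "voice": 0.01}
today    = {"video": 0.97, "http_browsing": 0.04, "voice": 0.01}

print(flag_anomalies(today, baseline))  # prints ['video']
```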
6. Gain insight into network quality
After importing the data into SmartAnalytics, the NPS dashboard and the various analysis and benchmarking scenarios offer immediate insight into the network quality.
In the following example, both operators achieved a very good score of more than 800 points in the benchmark. Operator 2 leads by a narrow margin. Splitting the score into its data and voice parts shows that operator 2 is ahead in the data service score but behind in the voice part.
Imagine operator 2 wants to improve its network to become the leading provider for voice services, too. The analysis should start with the voice KPIs and the improvement potential they reveal.
In Figure 5, the filled part of each bar indicates the achieved points; the empty part indicates the improvement potential. These are absolute numbers in points, so a longer empty bar indicates more improvement potential. In this example, the greatest potential for operator 2 in an individual KPI lies in the ratio of calls that take longer than 15 s to set up (CST > 15 s Ratio).
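The "empty bar" reading of Figure 5 amounts to a simple subtraction: for each KPI, potential = maximum achievable points minus achieved points. The KPI names and point values below are invented for illustration and do not reproduce the figure's actual data:

```python
# Sketch of the improvement-potential reading of Figure 5:
# potential = max achievable points - achieved points, per KPI.
# KPI names and point values are hypothetical, not the figure's data.

def improvement_potential(achieved: dict[str, float],
                          max_points: dict[str, float]) -> dict[str, float]:
    """Absolute improvement potential in points for each KPI."""
    return {k: max_points[k] - achieved[k] for k in achieved}

achieved   = {"Call Setup Success Ratio": 92.0,
              "CST > 15 s Ratio": 55.0,
              "Speech Quality (MOS)": 78.0}
max_points = {k: 100.0 for k in achieved}  # assumed uniform maximum

potential = improvement_potential(achieved, max_points)
# The longest "empty bar" is the KPI with the largest potential.
print(max(potential, key=potential.get))  # prints CST > 15 s Ratio
```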
Drilling down into the call technologies used reveals something interesting: although all calls of both operators start in LTE on both the A and B side, in almost 40% of its tests operator 2 falls back to CSFB instead of using VoLTE. This call setup strategy takes much longer on average and leads to a non-negligible portion of calls taking longer than 15 s to connect.
To improve its voice score, operator 2 should start at this point and revisit the selection strategy for the call mode used in LTE.
Locate fundamental weaknesses
The planning, configuration, and execution of a Network Performance Score (NPS) campaign take effort and careful consideration to yield representative results that lead to real insights into network quality. Once the data has been collected, the NPS and its underlying point score system provide an immediate overview of each operator’s network quality, along with rich data and many drill-down possibilities that greatly facilitate locating fundamental weaknesses and improvement potential.
Learn more about the NPS and its implementation in our benchmarking and analytics tools in the NPS white paper.