Can NPS be used with machine learning to help enhance customer experience?

NPS is a valuable tool and covers a range of interactions between customers and network operators. It includes perceptions of metrics that we want to understand, such as network quality and service performance. We also use network and signalling data to try to build an objective picture. Can we align offline subjective survey results with online, objective measures to build new KPIs and gain fresh insights?

NPS is subjective, valuable and delivers insights – but it does have limitations

Net Promoter Score – NPS – is widely used by operators to try to gauge the quality of their networks. An NPS is typically generated when customers complete multi-dimensional questionnaires, which are then evaluated to produce a single score. These scores are very useful and have long acted as proxies for metrics that are captured from the network.

However, they don’t actually relate to specific KPIs that an operator can use to enhance performance. NPS is subjective, based on offline reporting. We measure and track NPS scores, but they don’t inherently tell us about the real experience, only the experience as perceived by the customer.

Aligning NPS with other metrics has long been desired

Wouldn’t it be great if NPS could be turned into something that can match the kinds of things operators track, so that a real feedback loop can be created – converting customer NPS into metrics that can, in turn, drive optimisation and performance enhancements in the network?

Of course, there have been various efforts to achieve this – using different algorithms to try to measure voice or video quality, for example, often through the translation of a subjective evaluation to an empirical one, based on selected and specific calls or user sessions. These have had, and continue to have, value. But there remains a disconnect between such results and those obtained from NPS surveys – which can be influenced by factors such as recency, excitement or other sentiments. We want to see how these can be aligned further – and a new technique we’ve developed for KALIX may just help.

A new technique with KALIX

As many will know, KALIX uses KPIs to report different metrics. We’ve created a new one, which we think is a great step in bridging the gap between NPS and other factors. Most of our current KPIs use service quality indicators (SQIs) and measure different things, such as voice or messaging quality.

Different users weight these differently, creating formulas that match their priorities and specific reporting objectives. Once established, these often just run, accumulating data and generating the appropriate reports. This is a tried and trusted method – but we want to go further and see how we can take NPS data that relates specifically to network and service performance, and include this in continuous evaluations.

Using Machine Learning to process NPS data

If you add Machine Learning (ML) techniques, you can process large volumes of data to determine which KPIs have the greatest impact on customer experience and how they relate to each other. If something changes, then the algorithms will detect this, and the results will shift – positively or negatively. We need to know what data is most relevant here.
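As a rough illustration of the idea, a tree-based model can rank which KPIs carry the most signal for an experience target. The KPI names and data below are entirely synthetic and illustrative – they are not KALIX fields or real subscriber data:

```python
# Sketch: ranking which network KPIs most influence a customer-experience
# target. All names and numbers here are made up for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 500
# Hypothetical per-subscriber KPI samples.
kpis = {
    "call_drop_rate": rng.uniform(0, 0.05, n),
    "avg_throughput_mbps": rng.uniform(1, 100, n),
    "latency_ms": rng.uniform(10, 200, n),
    "sms_delivery_rate": rng.uniform(0.9, 1.0, n),
}
X = np.column_stack(list(kpis.values()))
# Synthetic target: experience driven mainly by drops and latency.
y = (-50 * kpis["call_drop_rate"] - 0.01 * kpis["latency_ms"]
     + 0.005 * kpis["avg_throughput_mbps"] + rng.normal(0, 0.1, n))

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
# Feature importances show which KPIs the model leans on most.
ranking = sorted(zip(kpis, model.feature_importances_),
                 key=lambda p: p[1], reverse=True)
for name, importance in ranking:
    print(f"{name}: {importance:.3f}")
```

In a real deployment the target would come from survey responses and the features from observed network data; the same ranking step then tells you which KPIs matter most, and re-running it over time shows when the relationships shift.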

Well, NPS involves lots of questions that cover the way in which a customer interacts with their provider. Some of these are related to the perception of specific quality performance, while others relate to interactions with customer care. We don’t want the resulting NPS values as such; we only want to understand the questions that relate to perceived quality.

So, our team has developed a new technique. That is, we take only the results that relate to service performance and network quality and then match this with real data generated by customers and their devices. We can correlate these to understand how what they say and report through NPS questions matches the real, observed data regarding their interaction with the network – because we can filter down to individual users. So, if we have an NPS report from customer Mrs X, we can relate the bits that cover quality to the network data generated by Mrs X. Do they match? Is what she says reflected in the actual data that captures her experience?
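The matching step can be pictured as a simple per-subscriber join followed by a correlation check. The column names and values below are hypothetical placeholders, not the actual survey or KALIX schema:

```python
# Sketch: matching each surveyed subscriber's perceived-quality answer to
# their own observed network metrics. All fields and values are illustrative.
import pandas as pd

# Perceived-quality answers only (e.g. scored 1-10), keyed by subscriber.
survey = pd.DataFrame({
    "subscriber_id": ["A", "B", "C", "D"],
    "perceived_voice_quality": [9, 4, 7, 3],
})
# Objective per-subscriber metrics observed over the same period.
network = pd.DataFrame({
    "subscriber_id": ["A", "B", "C", "D"],
    "call_drop_rate": [0.01, 0.08, 0.02, 0.10],
})

# Join on subscriber ID: Mrs X's answers meet Mrs X's network data.
merged = survey.merge(network, on="subscriber_id")

# Do the subjective answers track the observed data?
corr = merged["perceived_voice_quality"].corr(merged["call_drop_rate"])
print(f"correlation: {corr:.2f}")
```

Here the toy data is constructed so that higher drop rates coincide with lower perceived quality, giving a strongly negative correlation; a weak correlation on real data would itself be an interesting finding.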

ML allows us to do this at scale and to automate the processing. Note that we’re not trying to predict NPS for a given customer, or across the subscriber base. Instead, we want to know if we can build a new KPI for perceived quality that draws on both offline feedback and, crucially, online objective data.

Correlating NPS and network data

As a result, for any given cohort of subscribers that answered a survey, say in March 2021, we can extract all of the signalling events that KALIX has captured for them and use this as our dataset to train the ML models and then extract the results. We can also take the resulting indicators and check with the rest of the subscriber base to see what might have been the result, had they also answered the same survey questions.
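The cohort-then-base step sketched above amounts to fitting a model on subscribers who answered the survey and applying it to those who didn't. Again, the data and feature choices below are synthetic stand-ins, not the actual KALIX signalling features:

```python
# Sketch: train on the cohort that answered the survey, then score the
# rest of the base as if they had answered. Data is synthetic; in practice
# the features would come from captured signalling events.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_cohort, n_rest = 300, 1000

# Per-subscriber features for the survey cohort (e.g. normalised drop
# rate and latency), plus their perceived-quality answers (synthetic).
X_cohort = rng.uniform(0, 1, (n_cohort, 2))
y_cohort = (10 - 6 * X_cohort[:, 0] - 3 * X_cohort[:, 1]
            + rng.normal(0, 0.5, n_cohort))

model = Ridge().fit(X_cohort, y_cohort)

# Apply the learned mapping to subscribers who never saw the survey.
X_rest = rng.uniform(0, 1, (n_rest, 2))
predicted_kpi = model.predict(X_rest)
print(f"estimated perceived quality, base-wide mean: {predicted_kpi.mean():.2f}")
```

The per-subscriber predictions can then be aggregated into the new perceived-quality KPI and tracked over time, by segment or by location.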

In other words, we have now created a new KPI for perceived network quality that seeks to close the gap between offline survey results and online network data. What can we do with this?

Well, while you can continue to use existing service quality indicators, the Customer Experience Index and the Technical Promoter Score, we now have a more reliable KPI of how customers perceive quality. It allows us to:

  • Review trends through time
  • Correlate changes with network enhancements, updates and revisions
  • Segment the customer base
  • Understand quality perception in different locations
  • Understand quality perception at different times (people respond to surveys during defined time periods – might the results have been different the week before, or the week after?)
  • Benchmark different devices, based on user opinions, not only absolute performance


We now have a brand-new measure that can be used alongside formula-driven KPIs, but which brings in a valuable new data source: the relevant elements of NPS studies. That’s quite exciting, as it could help to close a known gap, helping you both to understand how users perceive the services you deliver, at scale, and to enhance your quality performance and improvement programmes.

If you’d like to discuss this topic with us further, please get in touch!

You might also be interested in watching our recent webinar, where we covered this topic in more detail.


POLYSTAR MEDIA CONTACT

Inna Ott
Director of Marketing
Phone: +46 8 50 600 600
Email: inna.ott@polystar.com