Keeping people connected has been one of the biggest themes of 2020. No one can be in any doubt about just how network-dependent society and businesses have become. In fact, the deepening extension of connectivity into our lives has even begun to create a Luddite-like reaction from those who feel disengaged from the connected economy, or even hostile towards it (see 'Tower inferno as conspiracy theories get out of hand').
Yet there’s an interesting paradox here. Despite the critical importance of connectivity to people and businesses, they continue to place an ever-lower value on it.
One of the challenges has been that the telecoms industry has failed to transition from a well-understood unit of measurement (the minute) to a data-centric business model that is simple, appealing and yet still lucrative. The industry has largely focused on fulfilling the first of these qualities – simplicity – replacing the minute with a single dimension of data: its size. There’s been a nod to other parameters such as speed, but with no guarantees and a lot of opacity and caveats in the small print (to the point where no one trusts such measures any more).
Using a simple parameter has a certain appeal – it’s very easy to understand and measure. But even in the past, telecoms operators used more than just the minute to charge for their services – tariffs included other billing parameters such as distance and time of day. Focusing on a single parameter leaves little scope to communicate quality or anything else the customer values, which makes differentiation very difficult. A single parameter creates the illusion that the product (connectivity measured in gigabytes) is identical everywhere. However, in the real world, while products may be similar, few are completely identical. This approach has driven commoditisation and meant firms cannot sustain (higher) pricing but are locked into brutal price-based competition.
But think about it. A kilo of gold sounds like a precise measure, but it quickly becomes apparent that it isn’t. ‘Gold’, as sold to the public, is an alloy, and without knowing how many parts per thousand are actual gold rather than another metal (its purity), we cannot value it. 9-carat gold is not as valuable as 24-carat gold, simply because the latter contains more actual gold.
In a similar way, not all networks are equal and not all customers or services need the same thing. Valuing connectivity by the gigabyte does not tell us what we’re buying. And in an increasingly network-dependent age we need more information and more granularity than just the ‘weight’ of connectivity. We need to know things such as whether we can use it when we want (availability, error rates, reliability), how well we can use it (throughput and latency), and whether using it will make us sick (because the connectivity is ‘infected’ with malware or anomalous traffic).
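To make that concrete, here is a minimal sketch (in Python, with illustrative field names and figures of my own choosing, not an operator’s actual product definition) of what a connectivity ‘label’ might record beyond the gigabyte:

```python
from dataclasses import dataclass

@dataclass
class ConnectivityProfile:
    """Illustrative dimensions that describe connectivity beyond raw volume."""
    monthly_volume_gb: float      # the 'weight' most tariffs stop at
    availability_pct: float       # can we use it when we want to?
    packet_loss_pct: float        # error rate seen by applications
    throughput_mbps: float        # sustained speed actually delivered
    latency_ms: float             # round-trip delay, critical for interactive use
    anomalous_traffic_pct: float  # share of traffic flagged as malware or anomalous

# Two connections with the same 'weight' can differ wildly on every other axis.
home_broadband = ConnectivityProfile(500, 99.0, 0.5, 40, 35, 0.2)
business_line  = ConnectivityProfile(500, 99.95, 0.05, 40, 12, 0.01)
```

Sold by the gigabyte alone, these two products look identical; the other five fields are where the real difference in value sits.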
No single element is more important than another, because the balance of importance depends on us – the customer – and what we’re using connectivity for. Each service and each user has different needs.
The current situation is a bit like going to the grocer’s and buying a kilo of food. Even if the food is nutritious, you have no idea whether it’s going to be potatoes, tins of beans, chocolate or chilli peppers. Few people live on just potatoes, and few people use just one connected service. In fact, you need a number of key ingredients to make a nutritious meal. Similarly, you need different ingredients for a quality experience of different connected services.
All of this is immensely complicated, and therein lies the problem. Because it’s challenging to package and market quality in telecoms – which is, after all, somewhat subjective – we’ve largely chickened out and defaulted to a simplistic measure (gigabytes) which doesn’t tell us very much at all, other than that we’re not confident enough about quality to put any meaningful guarantees against it.
Some specialist buyers – mainly large enterprises – can get into the weeds of connectivity and buy services based on a range of more detailed parameters. But even here, it’s only recently that the black box of the network has really opened up to expose the granular performance individual users are actually experiencing, rather than the theoretical performance promised.
But why does any of this matter?
It matters because, like a perennial weed that keeps coming back, QoS has once again been raised as incredibly important to new business models and charging parameters. Every now and again over (at least) the last 20 years we’ve had a stab at nailing this sucker, and every now and again a new generation of marketing managers becomes convinced that quality is what matters. Vodafone, for example, recently followed the fixed-line broadband providers away from consumption-based pricing towards speed-based pricing. This is a half step towards quality, but the big fly in the ointment is that quality of experience doesn’t equate to just the speed of the network. A text message or IM doesn’t require much data or much speed; video calling, on the other hand, is notoriously latency-sensitive. Reliability is an essential, but often overlooked, component of quality of experience, and its importance has been underlined by our experiences during the COVID-19 crisis.
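A toy illustration of why speed alone misleads: if each service weights the quality dimensions differently, the same network measurements produce very different experiences. The weights and scoring function below are hypothetical, chosen purely to show the shape of the argument.

```python
# Hypothetical per-service weighting of quality dimensions.
SERVICE_WEIGHTS = {
    #             (throughput, latency, reliability)
    "messaging":  (0.1, 0.2, 0.7),   # tiny payloads; it just needs to arrive
    "video_call": (0.3, 0.5, 0.2),   # latency-sensitive, modest bandwidth
    "4k_stream":  (0.7, 0.1, 0.2),   # bandwidth-hungry, buffering hides delay
}

def experience_score(service, throughput_ok, latency_ok, reliability_ok):
    """Weighted score in [0, 1]; each *_ok value is how fully the network meets that need."""
    w_tp, w_lat, w_rel = SERVICE_WEIGHTS[service]
    return w_tp * throughput_ok + w_lat * latency_ok + w_rel * reliability_ok

# A fast but high-latency connection: fine for streaming, poor for a video call.
print(experience_score("4k_stream", 0.9, 0.4, 0.8))   # 0.83
print(experience_score("video_call", 0.9, 0.4, 0.8))  # 0.63
```

The point is not the particular numbers but that a single headline figure (speed) cannot capture how the same network serves different services.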
And yet quality of service is the big new idea for monetising 5G. To even have a stab at doing this requires us to have a far higher level of control over our networks than we’ve ever had before. It means being able to analyse and expose network experience, and adjust the network to maintain experiences at a level we’ve promised to deliver.
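In practice that control tends to look like a closed loop: measure what users actually experience, compare it with what was promised, and intervene when the promise is at risk. The sketch below is illustrative only; measure_experience() and reallocate_capacity() stand in for whatever telemetry and orchestration interfaces a given operator actually has.

```python
import time

# What we promised to deliver (illustrative targets).
PROMISED = {"latency_ms": 20, "availability_pct": 99.9}

def assurance_loop(measure_experience, reallocate_capacity, interval_s=60):
    """Keep observed experience within the promised bounds."""
    while True:
        observed = measure_experience()  # e.g. {"latency_ms": 27, "availability_pct": 99.95}
        breaches = {
            metric: observed[metric]
            for metric in PROMISED
            if (metric.endswith("_ms") and observed[metric] > PROMISED[metric])    # delays: lower is better
            or (metric.endswith("_pct") and observed[metric] < PROMISED[metric])   # availability: higher is better
        }
        if breaches:
            reallocate_capacity(breaches)  # steer traffic or add capacity where the promise is slipping
        time.sleep(interval_s)
```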
It also means being able to articulate to customers why sometimes they might need a 24-carat network experience rather than just a 9-carat network experience, or that slightly dodgy market-stall ‘pure gold’ network experience that leaves your finger with a green ring around it.
If QoS-based charging models are to prosper, the technology has to work hand in glove with the marketing offer. This is not beyond our grasp, but it is a level of sophistication that many service providers have still to deliver.