Now with added video!
Depending on who you listen to, the severity of the digital divide varies greatly. But with so many conflicting opinions, how do you actually know what is going on? And if you don’t have a clue, how can you possibly solve the problem?
This topic carries a particularly heavy political charge, for good reason we might add, and is not limited to the US. Digital inclusion is a buzzword and objective in almost every nation, due to the increasingly complex and embedded role digital technology plays in our lives. Every society should be considering strategies to ensure everyone is carried forward into the digital utopia, but the success of such initiatives is questionable.
Here we are going to look at the US market, not to question how successful the political administration and telcos have been at closing the gap, but to ask whether they have the right foundations in the first place. To tackle a problem, you have to actually know what it is, and this is where we feel the industry is failing right now.
First of all, let’s start with the obvious issue. The telcos clearly favour denser urban environments because of the economics of connectivity; providing customers with internet access is an expensive job in the beginning. Not only do you have to buy the materials and the equipment, you have to process planning permission, deal with lawyers and do the dirty job of civil engineering. You also have to be confident customers will buy services from you. When the residential population of a region is sparse, it can be difficult to make the equation add up.
This is the issue in the US, and perhaps why the digital divide is so much bigger than somewhere like the UK. The land mass is substantially bigger, there are a huge number of isolated communities and connectivity tariffs are much more expensive. The problem has been compounded every time connectivity infrastructure improves, creating today’s problem of a digital divide.
But, here lies the issue. How do you solve a problem when you have no idea what the extent actually is?
An excellent way to illustrate this is with a road-trip. You know the final destination, as does everyone trying to conquer the digital divide, but if you don’t know the starting point how can you possibly plan the route? You don’t know what obstacles you might encounter on the way to Eden, or even how much money you will need for fuel (investment), how many packets of crisps you’ll need (raw materials such as fibre) or how many friends you’ll need to share time at the wheel (workforce).
In short, the industry is trying to solve a problem without understanding what it actually is.
The FCC doesn’t seem to be helping matters. During Tom Wheeler’s time in charge of the agency, the minimum requirement for universal broadband speeds was tabled at 25 Mbps, though this was then dropped to 10 Mbps by today’s Chairman Ajit Pai. Rumour has it these requirements will once again be increased to 25 Mbps.
Not only does this distort the picture of how many people have fallen into the digital divide, it messes around with the CAPEX and OPEX plans of the telcos. Higher requirements mean more upgrades, or perhaps even greenfield projects. Drop the speeds and regions will once again be ignored because they have been deemed served. Raise them and the question becomes whether the telcos will find a loophole to dodge the obligations, or whether some regions will unintentionally slip through the net.
Under the 25 Mbps requirement it has been suggested that 24 million US customers, just over 7% of the population, fall into the digital divide, though this is an estimate. And of course, this 24 million figure is only meaningful if you count as digitally served those who can theoretically access these products.
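To show how far apart the competing estimates sit, here is a back-of-envelope sanity check. The population figure is our assumption (roughly 327 million, the approximate 2018 US estimate), not a number from either the FCC or Microsoft:

```python
# Rough sanity check: the two competing digital-divide estimates as a share
# of the US population. The population figure is an assumption (~2018 level).
us_population = 327_000_000          # assumed approximate 2018 US population
underserved_fcc = 24_000_000         # FCC-derived estimate at the 25 Mbps threshold
underserved_microsoft = 150_000_000  # Microsoft's usage-based estimate

fcc_share = underserved_fcc / us_population
ms_share = underserved_microsoft / us_population
print(f"FCC-based estimate: {fcc_share:.1%} of the population")
print(f"Microsoft estimate: {ms_share:.1%} of the population")
```

Measured by theoretical access the divide is around 7% of the country; measured by actual usage it balloons to nearly half, which is the whole point about methodology.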
A couple of weeks ago, Microsoft released research which suggested the digital divide could be as wide as 150 million people. We suspect Microsoft is massaging the figures, but there will certainly be a difference because of the way the digital divide has been measured.
In the research, Microsoft measured internet usage across the US, including those who have broadband but are not able to surf the web at acceptable speeds. Microsoft considers those in the digital divide as those who are being under-served, or have no internet at all, whereas the FCC seems to be taking the approach of theoretical accessibility. There might be numerous reasons people fall into the digital divide but are not counted by the FCC, price of broadband for example, but this variance shows the issue.
Another excellent example is Ookla’s speed tests across Q2-Q3, released this week. The Ookla data suggests a 35.8% increase in mean download speed over the last year, ranking the US 7th worldwide for broadband download speeds. According to this data, the average download speed across the US for Q2-Q3 was 96.25 Mbps. This research would suggest everything is rosy in the US and there is no digital divide at all.
As you can see, there is no consolidated approach to measuring the digital divide. Before we know it, campaigning for the next Presidential Election will begin and the digital divide will become another political tool. Republicans will massage the figures to make the four-year period seem a successful one, while Democrats will paint a post-apocalyptic image.
And of course, it is not just the politicians who will play these political games. Light Reading’s Carol Wilson pointed out that Microsoft has a commercial stake in getting more bandwidth to more people, so that more people can access its cloud apps and make it more money. Should we trust the firm to be objective in contributing to the digital divide debate? Even if the digital divide is narrowing, Microsoft will want to paint a gloomy picture to encourage more investment, as this would increase its own commercial prospects.
The issue at the heart of the digital divide is investment and infrastructure. The telcos need to be incentivised to put networks in place, irrespective of the commercial rewards from the customer. Seeing as this bridge is being built at a snail’s pace, you would have to assume the current structure and depth of federal subsidies is simply not good enough.
The final complication to point out is the future. Ovum’s Kristin Paulin pointed out that those counted in the digital divide are only those not passed by fixed broadband, ignoring the fact that almost every US citizen has access to one of the four LTE networks. Fixed Wireless Access will certainly play a role in the future of broadband, but whether it is enough to satisfy the increasingly intensive data diets of users is unknown. 5G will certainly assist, but you have to wonder how long it will take to bring 5G to the regions suffering in the divide today.
Paulin points to the affordability question as well. With the FCC only counting those US citizens who cannot access the internet, who knows how many citizens there are who simply can’t afford broadband. A New York Times article from 2016 suggested the average broadband tariff was $55 a month, a price roughly 25% of New York City residents, and 50% of those earning under $20,000 a year, could not afford. The Lifeline broadband subsidy programme is supposed to help here, but Paulin politely stated it is suffering some hiccups right now.
If citizens cannot afford broadband, is this even a solution? It’s like trying to sell a starving man, with $10 in his wallet, a sandwich for $25. What’s the point?
Mobile broadband might well be the answer; Nokia certainly believes a fibre network with wireless wings is the way forward, though progress is slow here. Congestion is increasingly becoming a problem, while video, multi-screen and IoT trends will only make matters more complicated.
As it stands, the digital divide is a political ping-pong ball being battered as it ducks and dives all over the landscape. But the US technology industry needs to ask itself a very honest question: how big is the digital divide? Right now we’re none the wiser, and it will never be narrowed without first understanding the problem.
Research from Ovum, commissioned by Three UK, has concluded 5G-powered fixed wireless access could replace fixed connections for most UK households.
The report, entitled 5G Wireless Home Broadband: A Credible Alternative to Fixed Broadband, was commissioned to assess the potential of 5G as a substitute to fixed wired broadband in the UK. The bandwidth promised by 5G rivals what is currently available to most UK households via traditional fixed lines and it doesn’t involve digging up the pavements, so what’s not to like?
In fact Ovum anticipates speeds of 80-100 Mbps from FWA, compared to the current fixed-line average of 46 Mbps. Furthermore Ovum reckons 85% of urban punters currently get less than 80 Mbps, so they would receive a boost from 5G FWA.
“Advantages of 5G wireless broadband technology are not just in speed: wireless is more flexible, does not require long-term contracts, is faster and cheaper to deploy and less of a burden for customers – no waiting time, no engineer visits,” said Dario Talmesio of Ovum, who wrote the report. “With low availability of fibre and high cost of deployment, 5G Wireless becomes a viable alternative to fixed-line broadband. While the UK continues its fibre roll-out, this is a quicker and more economical way to satisfy customers’ fast-growing demand for data.”
“5G gives consumers the opportunity to bin their fixed line, enjoy faster speeds and save money,” said Three UK CEO Dave Dyson. “Wireless home broadband means that we can speed up access to super-fast internet services at a lower cost, without installation delays or inflexible contracts.
“The efficient and widespread rollout of superfast broadband across households and businesses is crucial to the growth of our economy. Wireless home broadband de-risks government’s ambitions for a Digital Britain by providing alternatives to a fibre-to-the-home solution.”
Now it should be noted that Three UK doesn’t have a stake in the UK fixed line market and that it’s keen to show something for its £250 million acquisition of UK Broadband, part of the stated reason for which was to offer 5G FWA over the 3.4 GHz spectrum that came with it. Three expects to launch a UK FWA service sometime next year, so it’s fair to say it has a strong commercial interest in bigging up the potential of FWA.
At Ovum’s Digital Futures event Telefónica’s Chief Digital Officer talked about the importance of making our interactions with technology more intuitive and natural.
Chema Alonso is an expert in cognitive intelligence, which in the tech context seems to be all about making computers think and act in a way that is more ‘human’. He heads up the team within Telefónica that is dedicated to artificial intelligence and its commercial use. His keynote at the event was entitled ‘How AI is Changing the Customer Experience and Telefónica’s Business’.
“Data is good,” opened Alonso, before adding “It’s time for computers to learn the human way of doing things.” The point of these two statements is that, while we’re in a digital era, we’re not so good at making use of all the data we’re constantly generating and accumulating. He danced around various considerations such as security but soon got to the core of his talk: AI and what Telefónica is doing with it.
Telefónica launched a platform/service called Aura at MWC 2017 that is designed to repurpose all the clever AI and cognitive intelligence stuff it’s doing internally into something it can offer third parties. Right now this mainly means Telefónica’s Spanish operator Movistar, but the plan seems to be for anyone to use it. You can see a video explaining the point of Aura below.
Alonso refers to all this stuff as the ‘4th platform’, in reference to its internal role in unifying how data is handled within Telefónica, across systems and geographies. But on top of being some kind of middleware it seems to be all about using AI to make the user interface with technology more intuitive in all scenarios.
In a subsequent panel session the Orange VP of Digital Innovation said “AI is the new UI,” which is designed to be short and memorable but is only partially true. In practice this AI is increasingly manifested through the voice UI, as first introduced to the mass market when Apple launched Siri and now commonplace thanks to voice-driven smart speakers.
Where AI comes in is in improving the voice UI. This doesn’t so much mean using data to anticipate your needs like some kind of creepy digital stalker, but using computing power to enable natural language processing, machine learning and context awareness to make voice interactions with machines at least as easy and productive as those with people. Some would argue this is a low bar, but it’s where we need to start nonetheless.
The main illustration Alonso, who used his keynote largely to big up his employer, offered of how great Aura is was Movistar Home, which is positioned as a superior smart home experience to the kind of Alexa-driven setup we currently have. It ultimately seems to come down to an improved voice UI and, perhaps, a more extensively connected home.
In the introductory presentation Ovum’s Richard Mahony warned of the dangers of AI concentrating power in too few hands. To illustrate this point he flagged up China’s plans to introduce ‘social credits’ – a system that tracks individuals constantly and gives or takes away social credit depending on how closely their behaviour conforms to the will of the Communist Party. The AI genie is out of the bottle and it will doubtlessly confer many benefits, but in the wrong hands it will enable and concentrate control on an unprecedented scale and so should be treated with profound caution.
Analyst firm Ovum hosted its Digital Futures event at which the Nokia CTO explained why we need to fundamentally redesign the network.
The keynote was delivered by Marcus Weldon, CTO of Nokia Networks and CEO of Bell Labs where they, as Weldon put it, “invent stuff”. “We have to rebuild the infrastructure in order to digitize our world,” he announced, before vowing to explain why.
Weldon said we’re at the start of only the 6th technological revolution there has ever been, the previous one being the internet. This one he named ‘the automation of everything’, and it is concerned with the movement of knowledge and insight rather than raw data. The internet connects people to each other and to digital media, but not to the rest of the world, which is where this next revolution – generally referred to as IoT – comes in.
This should create new opportunities for telcos to extract themselves from the ‘dumb pipe’ trap they seem to be constantly battling. While the transfer of unadorned data has become a commoditized, thankless task, the transfer of knowledge – in the form of data collected from countless sensors and then processed to enable informed actions – could give telcos the chance to once more add value and thus margin.
But here lies the technological challenge. For a lot of IoT cleverness, such as VR or autonomous vehicles, to work we need network latency in the region of 1 millisecond. The technology to enable that is a core part of the 5G cunning plan, but no amount of clever tinkering from the likes of Weldon can overcome the fact that light in optical fibre covers only around 200 km in a millisecond. Spend a 1 ms budget on a round trip and the distance between user and network edge is capped at roughly 100 km – hence the need to radically re-architect networks.
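The physics constraint above can be sketched in a few lines. This is a back-of-envelope model only; the 200 km-per-millisecond figure is the standard approximation for light in fibre (about two-thirds of the vacuum speed), not a number from Weldon’s talk:

```python
# Back-of-envelope latency budget: how far can a signal travel in fibre
# before propagation delay alone consumes the whole latency budget?
# Assumes light in fibre moves at roughly 2/3 of c (refractive index ~1.5).
C_VACUUM_KM_PER_MS = 300.0  # speed of light in vacuum, km per millisecond
FIBRE_KM_PER_MS = 200.0     # approximate speed of light in optical fibre

def max_one_way_km(round_trip_budget_ms: float,
                   km_per_ms: float = FIBRE_KM_PER_MS) -> float:
    """Max user-to-server distance if the entire budget is spent in flight
    (i.e. ignoring switching, queuing and processing delays)."""
    return km_per_ms * round_trip_budget_ms / 2

print(max_one_way_km(1.0))  # 1 ms round trip in fibre -> 100.0 km
```

Real networks also burn budget on switching and processing, so the practical radius is even smaller, which only strengthens the case for pushing compute towards the edge.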
So the network needs to move from being quite centralised to massively distributed and the broader trend, says Weldon, is the shift from global to local. This could have all manner of implications over and above the need to localise the network and could possibly begin to reverse the trend of the internet being dominated by a few ‘webscale’ players.
The discussion then moved to a panel consisting of Weldon, CEO of Iron Group (startups) Anne de Kerckove, CEO of Cisco UK and Ireland Scot Garner, MD of Bain Capital Melissa Bethell and CSO of Liberty Global Jim Ryan. It was chaired by Informa’s own head honcho Stephen Carter (both Telecoms.com and Ovum are part of Informa) and coincidentally the photo of the panel below provides a great illustration of latency at work.
Compare Carter’s hand gesture on the video screen with his one in real-life. His fingers are clearly positioned differently, thus illustrating the slight delay experienced even in the closed network between a camera and screen only a few meters apart.
The panel explored a wide variety of topics stemming from the general digital futures theme. Bethell said equity values are significantly overblown due to the unprecedented period of loose monetary policy (low interest rates) that we’ve had since the 2008 financial SNAFU. Ryan said he thinks the real internet growth opportunities lie not in the advertising model typified by the likes of Google and Facebook, but in finding more stuff people are willing to actually pay for, ideally by subscription.
The whole panel agreed that we’re only at the very start of this latest technological revolution. When asked by Telecoms.com why things weren’t progressing more quickly since everyone seems to be holding their breath for it, some panellists blamed it on a lack of talent and ambition while others pointed out that there’s no point in moving faster than the consumer market can tolerate.