Depending on who you listen to, the severity of the digital divide varies greatly. But with so many different opinions, how do you actually know what is going on? And if you don’t have a clue, how can you possibly solve the problem?
This topic carries a particularly heavy political charge, for good reason we might add, and is not limited to the US. Digital inclusion is a buzzword and an objective for almost every nation, due to the increasingly complex and embedded role digital technology plays in our lives. Every society should be considering strategies to ensure everyone is taken forward into the digital utopia, but the success of such initiatives is questionable.
Here we are going to look at the US market, but not to question how successful the political administration and telcos have been at closing the gap; rather, whether they have the right foundations in the first place. To tackle a problem you have to actually know what it is, and this is where we feel the industry is failing right now.
First of all, let’s start with the obvious issue. The telcos clearly favour the denser urban environments due to the economics of connectivity; providing customers with internet access is an expensive job in the beginning. Not only do you have to buy the materials and the equipment, process planning permission, deal with lawyers and do the dirty job of civil engineering, you also have to have confidence customers will buy services from you. When the residential population of a region is sparse, it can be difficult to make the equation add up.
This is the issue in the US, and perhaps why the digital divide is so much wider than in somewhere like the UK. The land mass is substantially bigger, there are a huge number of isolated communities and connectivity tariffs are much more expensive. The problem has been compounded with every improvement in connectivity infrastructure, creating today’s digital divide.
But, here lies the issue. How do you solve a problem when you have no idea what the extent actually is?
An excellent way to illustrate this is with a road-trip. You know the final destination, as does everyone trying to conquer the digital divide, but if you don’t know the starting point how can you possibly plan the route? You don’t know what obstacles you might encounter on the way to Eden, or even how much money you will need for fuel (investment), how many packets of crisps you’ll need (raw materials such as fibre) or how many friends you’ll need to share time at the wheel (workforce).
The industry is trying to solve a problem when it doesn’t understand what that problem actually is.
The FCC doesn’t seem to be helping matters. During Tom Wheeler’s time in charge of the agency, minimum requirements for universal broadband speeds were tabled at 25 Mbps, though this was then dropped to 10 Mbps by today’s Chairman Ajit Pai. Rumours are these requirements will once again be increased to 25 Mbps.
Not only does this distort the picture of how many people have fallen into the digital divide, it plays havoc with the telcos’ CAPEX and OPEX plans. With higher requirements more upgrades will be needed, or perhaps even a greenfield project. Drop the speeds, and regions will once again be ignored because they have been deemed served. Increase them, and will the telcos find a loophole to avoid the newly under-served regions, or might those customers unintentionally slip through the net?
Under the 25 Mbps requirements it has been suggested 24 million US customers, just over 7%, fall into the digital divide, though this is an estimate. And of course, this 24 million figure is only meaningful if you count as digitally served those customers who can theoretically access these products.
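As a quick sanity check, the "just over 7%" figure lines up with the 24 million estimate if you assume a population base of roughly 325 million; that round-number base is our assumption, not a figure quoted by the FCC or in this piece.

```python
# Sanity check of the FCC-based estimate quoted above.
# The ~325 million population base is an assumption of ours,
# not a figure from the FCC.
US_POPULATION = 325_000_000
DIGITAL_DIVIDE = 24_000_000  # FCC-based estimate

share = DIGITAL_DIVIDE / US_POPULATION
print(f"{share:.1%}")  # prints "7.4%" - just over 7%
```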
A couple of weeks ago, Microsoft released research which suggested the digital divide could be as wide as 150 million people. We suspect Microsoft is massaging the figures, but there will certainly be a difference because of the way the digital divide has been measured.
In the research, Microsoft measured actual internet usage across the US, including those who have broadband but cannot surf the web at acceptable speeds. Microsoft counts as part of the digital divide anyone who is under-served or has no internet at all, whereas the FCC seems to be measuring theoretical accessibility. There are numerous reasons people might fall into the digital divide without being counted by the FCC, the price of broadband for example, but this variance illustrates the issue.
Another excellent example is Ookla’s speed tests across Q2-Q3, released this week. The Ookla data suggests a 35.8% increase in mean download speed over the last year, ranking the US 7th worldwide for broadband download speeds. According to this data, the average download speed across the US for Q2-Q3 was 96.25 Mbps. This research would suggest everything is rosy in the US and there is no digital divide at all.
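Those two speed-test figures also let you work backwards: a 35.8% annual increase landing at a 96.25 Mbps mean implies last year's mean was around 71 Mbps. This is simple arithmetic on the quoted numbers, nothing more.

```python
# Back out the prior-year mean from the quoted speed-test figures:
# 96.25 Mbps now, after a 35.8% year-on-year increase.
current_mean_mbps = 96.25
yoy_increase = 0.358

prior_mean_mbps = current_mean_mbps / (1 + yoy_increase)
print(f"{prior_mean_mbps:.1f} Mbps")  # prints "70.9 Mbps"
```

A healthy mean, of course, says nothing about the tail of the distribution, which is exactly where the digital divide lives.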
As you can see, there is no consolidated approach to measuring the digital divide. Before we know it, campaigning for the next Presidential Election will begin and the digital divide will become another political tool. Republicans will massage the figures to make the four-year period seem a successful one, while Democrats will paint a post-apocalyptic picture.
And of course, it is not just the politicians who will play these games. Light Reading’s Carol Wilson pointed out Microsoft has a commercial stake in getting more bandwidth to more people: the more people who can access its cloud apps, the more money it makes. Should we trust this firm to be objective in the digital divide debate? Even if the divide is narrowing, Microsoft will want to paint a gloomy picture to encourage more investment, as this would boost its own commercial prospects.
The issue at the heart of the digital divide is investment and infrastructure. The telcos need to be incentivised to put networks in place, irrespective of the commercial rewards from the customer. Seeing as this bridge is being built at a snail’s pace, you would have to assume the current structure and depth of federal subsidies is simply not good enough.
The final complication to point out is the future. Ovum’s Kristin Paulin pointed out the digital divide only counts those who are not passed by fixed broadband, taking no account of the fact that almost every US citizen has access to one of the four LTE networks. Fixed Wireless Access will certainly play a role in the future of broadband, but whether this is enough to satisfy the increasingly intensifying data diets of users is unknown. 5G will certainly assist, but you have to wonder how long it will take to get 5G to the regions suffering in the divide today.
Paulin points to the affordability question as well. With the FCC only counting those US citizens who cannot access the internet, who knows how many there are who simply can’t afford broadband. A New York Times article from 2016 suggested the average broadband tariff was $55 a month, meaning 25% of the city, and 50% of those earning under $20,000, could not afford broadband. The Lifeline broadband initiative is supposed to help here, but Paulin politely stated it is suffering some hiccups right now.
If citizens cannot afford broadband, is this even a solution? It’s like trying to sell a starving man, with $10 in his wallet, a sandwich for $25. What’s the point?
Mobile broadband might well be part of the answer; Nokia certainly believes in a fibre network with wireless wings, though progress is slow here. Congestion is increasingly becoming a problem, while video, multi-screen and IoT trends will only complicate matters further.
As it stands, the digital divide is a political ping-pong ball, battered as it ducks and dives all over the landscape. But the US technology industry needs to ask itself a very honest question: how big is the digital divide? Right now, we’re none the wiser, and it will never be narrowed without understanding the problem in the first place.