The US digital divide – does anyone have a clue what’s going on?

Depending on who you listen to, the severity of the digital divide varies greatly. But with so many different opinions, how do you actually know what is going on? And if you don’t have a clue, how can you possibly solve the problem?

This topic carries a particularly heavy political charge, for good reason we might add, and is not limited to the US. Digital inclusion is a buzzword and an objective for almost every nation, owing to the increasingly complex and embedded role digital technology plays in our lives. Every society should be considering strategies to ensure everyone is taken forward into the digital utopia, but the success of such initiatives is questionable.

Here we are going to have a look at the US market, not to question how successful the political administration and the telcos have been at closing the gap, but to ask whether they have the right foundations in the first place. To tackle a problem, you have to actually know what it is, and this is where we feel the industry is failing right now.

First of all, let’s start with the obvious issue. The telcos clearly favour the denser urban environments due to the economics of connectivity; getting customers onto the internet is an expensive job in the beginning. Not only do you have to buy the materials and the equipment, you have to process planning permission, deal with lawyers and do the dirty job of civil engineering. But you also have to have the confidence customers will buy services off you. When the residential population in a region is so sparse, it can be difficult to make the equation add up.

This is the issue in the US, and perhaps why the digital divide is so much bigger than in somewhere like the UK. The land mass is substantially bigger, there are a huge number of isolated communities and connectivity tariffs are much more expensive. And the problem has been compounded every time connectivity infrastructure has improved, leaving the unconnected further behind and creating today’s digital divide.

But, here lies the issue. How do you solve a problem when you have no idea what the extent actually is?

An excellent way to illustrate this is with a road-trip. You know the final destination, as does everyone trying to conquer the digital divide, but if you don’t know the starting point how can you possibly plan the route? You don’t know what obstacles you might encounter on the way to Eden, or even how much money you will need for fuel (investment), how many packets of crisps you’ll need (raw materials such as fibre) or how many friends you’ll need to share time at the wheel (workforce).

The industry is trying to solve a problem without understanding what it actually is.

The FCC doesn’t seem to be helping matters. During Tom Wheeler’s time in charge of the agency, minimum requirements for universal broadband speeds were tabled at 25 Mbps, though this was then dropped to 10 Mbps by today’s Chairman Ajit Pai. Rumour has it these requirements will once again be increased to 25 Mbps.

Not only does this distort the picture of how many people have fallen into the digital divide, it messes around with the CAPEX and OPEX plans of the telcos. With higher requirements, more upgrades will be needed, or perhaps even a greenfield project. Once you drop the speeds, regions will once again be ignored because they have been deemed served. If you increase these speeds, will the telcos find a loophole to ignore them, or might some regions unintentionally slip through the net?

Under the 25 Mbps requirement, it has been suggested that 24 million US customers, just over 7%, fall into the digital divide, though this is only an estimate. And of course, this 24 million figure is only meaningful if you judge the digitally served customers to be those who can theoretically access these products.

A couple of weeks ago, Microsoft released research which suggested the digital divide could be as wide as 150 million people. We suspect Microsoft is massaging the figures, but much of the difference certainly comes down to the way the digital divide has been measured.

In the research, Microsoft measured actual internet usage across the US, including those who have broadband but are not able to surf the web at acceptable speeds. Microsoft counts those who are under-served, or have no internet at all, as part of the digital divide, whereas the FCC seems to be taking the approach of theoretical accessibility. There might be numerous reasons people fall into the digital divide without being counted by the FCC, the price of broadband for example, but this variance shows the issue.
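To make that variance concrete, here is a minimal sketch, with invented numbers rather than FCC or Microsoft methodology, of how the same population can produce very different headline counts depending on whether you measure theoretical availability or actual usage:

```python
# Illustrative only: five hypothetical households, measured two ways.
households = [
    # (advertised_mbps, measured_mbps, has_subscription)
    (100, 96, True),   # well served
    (25, 12, True),    # "served" on paper, under-served in practice
    (10, 8, True),     # served at a 10 Mbps threshold, not at 25 Mbps
    (25, 0, False),    # service available but not taken up (e.g. price)
    (0, 0, False),     # no service available at all
]

def availability_divide(hh, threshold):
    """FCC-style count: no *theoretical* access at the threshold speed."""
    return sum(1 for advertised, _, _ in hh if advertised < threshold)

def usage_divide(hh, threshold):
    """Microsoft-style count: not *actually* online at the threshold speed."""
    return sum(1 for _, measured, sub in hh if not sub or measured < threshold)

for threshold in (10, 25):
    print(f"{threshold} Mbps threshold: availability-based divide = "
          f"{availability_divide(households, threshold)}, "
          f"usage-based divide = {usage_divide(households, threshold)}")
```

The same five households yield a divide of anywhere between one and four, depending entirely on the definition and the speed threshold chosen; that is exactly the sort of gap separating the FCC’s 24 million from Microsoft’s 150 million.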

Another excellent example is Ookla’s speed test data for Q2-Q3, released this week. The Ookla data suggests a 35.8% increase in mean download speed over the last year, ranking the US 7th worldwide for broadband download speeds. According to this data, the average download speed across the US for Q2-Q3 was 96.25 Mbps. Taken at face value, this research would suggest everything is rosy in the US and there is no digital divide at all.
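There is a simple statistical reason to be sceptical of that conclusion: a mean is dragged upwards by the fastest connections and says nothing about the tail of the distribution, which is precisely where the digital divide lives. A quick illustration with invented figures:

```python
import statistics

# Invented sample of download speeds (Mbps); not Ookla data.
speeds = [250, 180, 150, 120, 95, 8, 4, 1]

print(f"mean:   {statistics.mean(speeds):.1f} Mbps")    # looks rosy
print(f"median: {statistics.median(speeds):.1f} Mbps")
print(f"below 25 Mbps: {sum(1 for s in speeds if s < 25)} of {len(speeds)}")
```

A headline mean of roughly 100 Mbps here coexists with three of the eight connections falling below the 25 Mbps threshold.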

As you can see, there is no consolidated approach to measuring the digital divide. Before we know it, campaigning for the next Presidential Election will begin and the digital divide will become another political tool. Republicans will massage the figures to make the four-year period seem like a successful one, while Democrats will paint a post-apocalyptic picture.

And of course, it is not just the politicians who will play these political games. Light Reading’s Carol Wilson pointed out that Microsoft has a commercial stake in getting more bandwidth to more people: the more people online, the more who can access its cloud apps and make it money. Should we trust this firm to be objective in the digital divide debate? Even if the digital divide is narrowing, Microsoft will want to paint a gloomy picture to encourage more investment, as this would boost its own commercial prospects.

The issue at the heart of the digital divide is investment and infrastructure. The telcos need to be incentivised to put networks in place, irrespective of the commercial rewards from the customer. Seeing as this bridge is being built at a snail’s pace, you would have to assume the current structure and depth of federal subsidies is simply not good enough.

The final complication to point out is the future. Ovum’s Kristin Paulin pointed out that those counted in the digital divide are only those not passed by fixed broadband, taking no account of the fact that almost every US citizen has access to one of the four LTE networks. Fixed Wireless Access will certainly play a role in the future of broadband, but whether this is enough to satisfy the increasingly intensifying data diets of users is unknown. 5G will certainly assist, but you have to wonder how long it will take to get 5G to the regions suffering in the divide today.

Paulin points to the affordability question as well. With the FCC only counting those US citizens who cannot access the internet, who knows how many citizens there are who simply can’t afford broadband. A New York Times article from 2016 suggested the average broadband tariff was $55 a month, a price 25% of the city, and 50% of those earning under $20,000, would not be able to afford. The Lifeline broadband initiative is supposed to help here, but Paulin politely stated it is suffering some hiccups right now.
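The arithmetic is straightforward. Taking the $55-a-month figure from that article, and an illustrative affordability ceiling of 2% of household income (our assumption for the sketch, not an official threshold), broadband quickly becomes a stretch at the lower end of the income scale:

```python
MONTHLY_TARIFF = 55                  # $/month, per the New York Times figure
ANNUAL_COST = MONTHLY_TARIFF * 12    # $660 per year

for income in (20_000, 33_000, 66_000):
    share = ANNUAL_COST / income * 100
    verdict = "affordable" if share <= 2.0 else "a stretch"  # assumed 2% ceiling
    print(f"income ${income:,}: broadband takes {share:.1f}% of income ({verdict})")
```

For a household earning under $20,000, that tariff consumes well over 3% of gross income before a single other bill is paid.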

If citizens cannot afford broadband, is this even a solution? It’s like trying to sell a starving man, with $10 in his wallet, a sandwich for $25. What’s the point?

Mobile broadband might well be part of the answer, and Nokia certainly believes a fibre network with wireless wings is the way forward, though progress is slow here. Congestion is becoming an increasing problem, while video, multi-screen and IoT trends will only complicate the matter further.

As it stands, the digital divide is a political ping-pong ball, battered around as it ducks and dives all over the landscape. But the US technology industry needs to ask itself a very honest question: how big is the digital divide? Right now, we’re none the wiser, and the gap will never be narrowed without understanding the problem in the first place.

AI plays critical role in network security, according to BT boffin

Artificial intelligence (AI) is going to play a critical role in network security in the coming years and is already helping BT defend its infrastructure.

Ben Azvine, the Global Head of Security Research & Innovation at BT, has been at the heart of cutting-edge network security developments at BT for several years and has helped develop a cybersecurity strategy that combines AI-enabled visualization of cybersecurity threats with highly-trained network security personnel. He shared some of his thoughts on the matter with attendees at this week’s Broadband World Forum event.

“We are taking AI and making it help humans to be better… We are more about the Iron Man version of AI than the Terminator version,” he said, sparking ludicrous cinematic pitch ideas in the minds of some of his audience (I mean, Alien vs Predator sort of worked, right?).

Azvine pointed out that with the number of connected devices growing rapidly, old ways of securing assets were no longer relevant: now, companies (including network operators) need to think about a cybersecurity strategy comprising three steps – prevention, detection/prediction and response. The response needs to be much quicker than in the past (hours, not days), while detection/prediction is tough to do without sophisticated analytics and AI algorithms.
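As a flavour of what the detection step involves (and only a flavour; BT’s production tooling is vastly more sophisticated than this), here is a minimal statistical sketch that flags hours whose security-event counts sit far outside the historical norm, using invented data:

```python
import statistics

# Invented baseline: hourly counts of failed logins under normal conditions.
history = [120, 115, 130, 125, 118, 122, 127, 119]
mean = statistics.mean(history)
stdev = statistics.stdev(history)

def is_anomalous(count, z_threshold=3.0):
    """Flag counts more than z_threshold standard deviations from the norm."""
    return abs(count - mean) / stdev > z_threshold

for count in (124, 180, 950):
    status = "anomaly, trigger response" if is_anomalous(count) else "normal"
    print(f"{count} failed logins this hour: {status}")
```

The point of layering AI on top of this kind of statistic is to shrink the gap between detection and response from days to hours, or less.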

What BT is doing is a great example of analytics and AI in action in the communications networking sector, rather than AI as a marketing hype machine — see ‘Why BT’s Security Chief Is Attacking His Own Network’ for more details.

But security is just one of seven key telecom AI use cases, as identified in a recent report, Artificial Intelligence for Telecommunications Applications, from research house Tractica (a sister company to Telecoms.com).

That report identified the seven main use cases as:

1) Network operations monitoring and management

2) Predictive maintenance

3) Fraud mitigation

4) Cybersecurity

5) Customer service and marketing virtual digital assistants (or ‘bots’)

6) Intelligent CRM systems

7) Customer Experience Management.

“The low hanging fruit seems to be chat bots to augment call center workers,” said Heavy Reading Senior Analyst James Crawshaw, who will be one of the expert moderators digging deeper into the use of AI tools by telcos during Light Reading’s upcoming ‘Software-Defined Operations & the Autonomous Network’ event.

“The more challenging stuff is making use of machine learning in network management. That’s still a science project for most operators,” he added. Verizon’s Matt Tegerdine was pretty frank about that in his recent interview with Light Reading. (See Verizon: Vendor AI Not Ready for Prime Time.)

That analysis from the Verizon executive shows it’s still early days for the application of machine learning in production communications networks. And, as Crawshaw noted, AI is not a magic wand and can’t be applied to anything and everything. “It can be applied to the same things you would apply other branches of mathematics to, such as statistics. But it’s only worth using if it brings some advantage over simpler techniques. You need to have clean data and a clear question you are seeking to answer — you can’t just invoke machine learning to magically make everything good,” adds the analyst, bringing a Harry Potter element to the proceedings.
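Crawshaw’s point can be made in a few lines. Before reaching for machine learning, measure a trivial baseline; if the simple technique is already accurate, an ML model has little headroom left to justify its complexity. The traffic figures below are invented for illustration:

```python
# Hourly link utilisation (%), invented data.
traffic = [100, 102, 101, 103, 105, 104, 106, 107]

# Naive baseline: forecast each hour as a copy of the previous hour.
errors = [abs(traffic[i] - traffic[i - 1]) for i in range(1, len(traffic))]
mae = sum(errors) / len(errors)

print(f"naive-baseline mean absolute error: {mae:.2f} percentage points")
# Any ML model must beat this number by enough to cover its training data,
# engineering and maintenance costs; otherwise the simple statistics win.
```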

So what should network operators be doing to take advantage of AI capabilities? BT appears to have set a good example by hiring experts, investing in R&D, applying AI tools in a very focused way (on its cybersecurity processes) and combining the resulting processes with human intelligence and know-how.

“You don’t need to recruit an army of data scientists to take advantage of machine learning,” said Crawshaw. “Nor should you remain totally reliant on third parties. Develop a core team of experts and then get business analysts to leverage their expertise into the wider organisation.”

Blockchain Set to Play Key Role in Telco Operations: Analyst

Blockchain technology is set to be used by telcos in multiple applications across all areas of operations in the coming years, according to an industry analyst who has delved into the potential use of distributed ledger technology (DLT) in the space.

James Crawshaw, a senior analyst at Heavy Reading, says communications service providers (CSPs) see significant potential for the use of the much-hyped technology, which is best known for underpinning cryptocurrencies such as Bitcoin.

“Today, CSPs use databases for thousands of applications. Blockchain might reach dozens of applications in the next few years. Examples include mobile number portability, SLA monitoring, or replacing CDRs for billing,” says the analyst, who describes blockchain, in essence, as a “decentralized, immutable electronic ledger; a write-once-read-many record of historical transactions, as opposed to a database that can be written over.”
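The “write-once-read-many” property Crawshaw describes falls out of a simple construction: each block commits to the hash of its predecessor, so rewriting any historical record invalidates everything that follows. Here is a minimal sketch (our illustration, not any CSP’s implementation) using his mobile number portability example; real blockchains add consensus, signatures and distribution on top:

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash of the block's contents, excluding the stored hash itself."""
    payload = {k: block[k] for k in ("prev_hash", "timestamp", "transactions")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, transactions):
    block = {"prev_hash": prev_hash,
             "timestamp": time.time(),
             "transactions": transactions}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain):
    """Each block must match its own hash and commit to its predecessor's."""
    return all(
        chain[i]["hash"] == block_hash(chain[i])
        and (i == 0 or chain[i]["prev_hash"] == chain[i - 1]["hash"])
        for i in range(len(chain))
    )

# A toy number-portability ledger: two port events for one phone number.
chain = [make_block("0" * 64, [{"msisdn": "+15550100", "ported_to": "carrier-a"}])]
chain.append(make_block(chain[-1]["hash"],
                        [{"msisdn": "+15550100", "ported_to": "carrier-b"}]))
print(verify_chain(chain))   # True

chain[0]["transactions"][0]["ported_to"] = "carrier-x"   # rewrite history
print(verify_chain(chain))   # False: block 0 no longer matches its own hash
```

Recomputing the tampered block’s hash would not help either, as the second block’s prev_hash would then point at a hash that no longer exists in the chain.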

Currently, CSPs are considering using blockchain in three key areas, according to Crawshaw:

  1. Fraud management: for roaming and subscription identity fraud.
  2. Identity management: storing identity transactions (network logins, purchases, etc.).
  3. IoT connectivity: a blockchain could enable secure and error-free peer-to-peer connectivity for thousands of IoT devices with cost-efficient self-managed networks.

Crawshaw examined those use cases in depth in a recent report, Blockchain Opportunities for CSPs: Separating Hype From Reality.

And while there is a certain level of marketing enthusiasm around blockchain currently, that shouldn’t get in the way of real-world tests and deployments, notes the analyst.

“Like all complex new technologies, there is a degree of hype and bandwagon-jumping with blockchain. Its main purpose is as an alternative to centralized systems for recording information (primarily databases). By distributing the control, you eliminate the risk of a hack of the central controller and the information being altered fraudulently. By using clever computer science you can replace the central controller (and the fees they normally charge) with software and get a cheaper, more reliable solution. But in most cases where we use a database today we will continue to use them in the future,” notes Crawshaw.

So which CSPs are taking the lead with the exploration of blockchain as a useful tool? Colt is one network operator that has been taking a close look at multiple ways to exploit blockchain’s potential for some time.

The operator, in collaboration with Zeetta Networks, is also set to deliver a proof-of-concept demonstration of a blockchain-based offering that enables network carriers to buy and sell network services in a secure, distributed marketplace. That PoC will be unveiled at the upcoming MEF2018 show in Los Angeles.

And Colt is one of the operators participating in a panel discussion – What Opportunities Are There For Blockchain In Telecoms & How Can These Aid Automation? – on November 8 in London as part of Light Reading’s ‘Software-Defined Operations & the Autonomous Network’ event. PCCW Global and Telefónica will also be involved in that discussion.

There are also a number of industry initiatives involving multiple CSPs. The key ones related to blockchain are:

  • The Carrier Blockchain Study Group, which counts Axiata, Etisalat, Far EasTone, KT, LG Uplus, PLDT, SoftBank, Sprint, Telin, Turkcell, Viettel and Zain among its participants
  • The Mobile Authentication Taskforce, which includes AT&T, Sprint, T-Mobile and Verizon
  • The International Telecoms Week Global Leaders’ Forum, in which BT, HGC Global Communications, Telefónica and Telstra are involved.

In time, blockchain might be joined in CSP back offices by other DLTs. “Blockchain is a particular type of DLT that uses cryptographically hashed blocks to record transactions in a time series or chain. If security is less of an issue you could use a simpler DLT. But then again, you might just use a regular database,” notes Crawshaw.