Telecoms.com periodically invites other scribes to share their views on the industry’s most pressing issues. In this piece Ray Le Maistre, Editor-in-Chief at Light Reading, notes that with mobile operators and vendors scrambling for new business opportunities, the burgeoning wireless private networks sector could be a mini honey pot.
One of the biggest, and hardest to answer, questions asked every day at telecoms operators and vendors concerns new business opportunities – the identity of new revenue streams.
That question has been highlighted even more in the past few years as operators, in particular, seek to justify their 5G investments and develop a return on investment (ROI) plan that combines lower opex with additional revenues.
There won’t be a single answer to that question, of course: It will be a mixed bag of savings (hopefully!) and sales opportunities.
One of the more immediate revenue stream opportunities right now is wireless private networks, and the good news is that this opportunity doesn’t require 5G, though the potential looks set to be enhanced by the availability of a full set of 5G standards (including the yet-to-be-concluded core network specs) and the maturity of associated technology.
In the meantime, 4G/LTE has already been the cellular foundation for an increasingly thriving wireless private networks sector that, according to ABI Research, will be worth $16.3 billion by 2025 (see https://www.abiresearch.com/press/private-lte-will-be-us163-billion-opportunity-2025-and-foundation-5g-services-end-vertical-markets/).
Another market sizing prediction, this time by SNS Telecom & IT, pitches annual spending on private 4G and 5G networks at $4.7 billion by the end of 2020 and almost $8 billion by 2023.
However this plays out, there’s clear anticipation of growing investment. What’s particularly interesting, though, is which organizations might pocket that investment. That’s because enterprises and organizations looking to benefit from a private wireless network have a number of options once they decide to move ahead – here are the three permutations that look most likely to me:
- Build and run it themselves – technology vendors get some sales in this instance
- Outsource the network planning, construction and possibly even the day-to-day management of the network to a systems integrator (SI) – the SI and some vendors get the spoils. It’s possible here, of course, that the SI could be a technology vendor.
- Outsource to a mobile network operator – the operator and some vendors will get some greenbacks.
For sure there will be other permutations, but it shows how many different parts of the ecosystem have some skin in the game, which is what makes this sector so interesting.
What’s also interesting, of course, is what the enterprises do with their private networks: Does it enhance operations? Help reduce costs? Create new business opportunities? All of the above?
Let’s not forget the role of the regulators in all of this. In the US the private wireless sector has been given a shot in the arm by the availability of CBRS (Citizens Broadband Radio Service) shared spectrum in the currently unlicensed 3.5 GHz band: This has given rise to numerous trials and deployments in locations such as sports stadiums, Times Square and even prisons.
In Germany, the regulator’s decision to set aside 100 MHz of 5G spectrum for private, industrial networks has caused a storm and even led to accusations from the mobile operators that the move ramped up the cost of licenses in the spectrum auction held earlier this year. (See Telcos complain about auction as German regulator bags €6.5bn https://telecoms.com/497897/telcos-complain-about-auction-as-german-regulator-bags-e6-5bn/)
In the UK, Ofcom is making spectrum available in four bands:
- the 1800 MHz and 2300 MHz shared spectrum bands, which are currently used for mobile services;
- the 3.8-4.2 GHz band, which supports 5G services; and
- the 26 GHz band, which has also been identified as one of the main bands for 5G in the future.
The process to enable companies and organizations (Ofcom has identified manufacturers, business parks, holiday/theme parks and farms as potential users) in the UK to apply for spectrum will go live before the end of this year, with Ofcom believing that thousands of private networks could be up and running in the coming years.
Network infrastructure giant Nokia believes more than a dozen countries are planning to allocate spectrum specifically for private wireless networks, and that will suit the Finnish vendor just fine as it’s building a sizeable business by focussing on the specific needs of such buildouts.
It recently boasted that it already supports more than 120 private wireless networks around the world, with its customers including 24 in transportation, 35 in the energy sector, 32 in public sector and smart cities, and 11 in manufacturing and logistics. The company believes that the number of private network base stations could ultimately outnumber the 7 million deployed by the world’s commercial mobile network operators, possibly by as much as two to one.
Nokia isn’t alone in spying this business opportunity, of course, although it has clearly built itself a solid foundation: Sweden’s Ericsson, as you’d expect, is also hot for such business, while specialists such as Redline and Federated Wireless are chasing it too.
So where is this market heading? And what are the drivers for further private network rollout? That’s something I’ll be checking into at the Private Networks in a 5G World (https://tmt.knect365.com/private-networks/) event in London (November 26-27), where the likes of Shell, Heathrow Airport, the Antwerp Port Authority and multiple major mobile network operators and technology developers will be sharing their stories.
Telecoms.com periodically invites third wheels to share their views on the industry’s most pressing issues. In this piece Ray Le Maistre, Editor-in-Chief at Light Reading, notes that the traditional telecom community needs to look in the mirror and ask itself some pretty tough questions.
Thirty years ago, telcos and their technology suppliers were living a life of relative ease and luxury, milking the early days of cellular/mobile (3G was in pre-launch hype mode!) and able to take years to make strategic decisions without fear of having the profit rug pulled from under their feet.
Not any more.
Communications service providers and the attendant vendor community are currently in a state of controlled panic, going about their daily business while the foundations of the industry that used to prop up the balance sheet become ever more eroded by competition, regulation and a pace of business model change that has left them in a spin.
Currently, most are clinging to 5G as a potential saviour and the catalyst for major change. But change what? And how? The industry has to answer some very tough questions right now and figure out a new plan of action – doing nothing will only accelerate their demise.
The list of potential questions is ‘super long’ as the youth of today might say, but my close friend, Mr J. Daniels of Tennessee, and I have come up with a few that we think are worth exploring to get the ball rolling, including: Should telcos own media/video/tv companies?; Should telcos be banks?; Is it possible to get a decent cup of coffee at a telecoms trade show? (OK, so that last one got axed from the final list, but I still think it’s a key industry concern…)
We’ve compiled them in the form of a brief survey that I hope you’ll find 2 minutes to complete, so we can get a sense of where you, the people actually going on this journey, might be heading.
The survey is here: https://www.surveygizmo.com/s3/5327993/2020-Vision-Summit-Survey-2019
The results will be compiled during early December and we’ll be sharing key findings initially with attendees at Light Reading’s annual 2020 Vision Executive Summit in Vienna and then with the wider world shortly after that: All answers are anonymous, so there’s no comeback or any chance of being hounded by related spam.
I hope you’ll share your view with us and check back in December to see what the broader community thinks.
Depending on who you listen to, the severity of the digital divide varies greatly. But with so many different opinions, how do you actually know what is going on? And if you don’t have a clue, how can you possibly solve the problem?
This topic is one which carries a particularly heavy amount of political charge, for good reason we might add, and is not limited to the US. Digital inclusion is a buzzword and objective in almost every nation due to the increasingly complex and embedded role digital technology plays in our lives. Every society should be considering strategies to ensure everyone is taken forward into the digital utopia, but the success of such initiatives is questionable.
Here we are going to have a look at the US market, not to question how successful the political administration and telcos have been at closing the gap, but to ask whether they have the right foundations in the first place. To tackle a problem, you have to actually know what it is, and this is where we feel the industry is failing right now.
First of all, let’s start with the obvious issue. The telcos clearly favour the denser urban environments due to the economics of connectivity; providing customers with internet access is an expensive job in the beginning. Not only do you have to buy the materials and the equipment, you have to process planning permission, deal with lawyers and do the dirty job of civil engineering. You also have to have the confidence that customers will buy services from you. When the residential population of a region is so sparse, it can be difficult to make the equation add up.
This is the issue in the US, and perhaps why the digital divide is so much bigger than in somewhere like the UK. The land mass is substantially bigger, there are a huge number of isolated communities and connectivity tariffs are much more expensive. The gap has been compounded every time connectivity infrastructure improves, creating today’s digital divide.
But, here lies the issue. How do you solve a problem when you have no idea what the extent actually is?
An excellent way to illustrate this is with a road-trip. You know the final destination, as does everyone trying to conquer the digital divide, but if you don’t know the starting point how can you possibly plan the route? You don’t know what obstacles you might encounter on the way to Eden, or even how much money you will need for fuel (investment), how many packets of crisps you’ll need (raw materials such as fibre) or how many friends you’ll need to share time at the wheel (workforce).
The industry is trying to solve a problem when it doesn’t understand what that problem actually is.
The FCC doesn’t seem to be helping matters. During Tom Wheeler’s time in charge of the agency, minimum requirements for universal broadband speeds were tabled at 25 Mbps, though this was then dropped to 10 Mbps by today’s Chairman Ajit Pai. Rumour has it these requirements will once again be increased to 25 Mbps.
Not only does this distort the image of how many people have fallen into the digital divide, it messes around with the CAPEX and OPEX plans of the telcos. With higher requirements, more upgrades will be needed, or perhaps it would require a greenfield project. Once you drop the speeds, regions will once again be ignored because they have been deemed served. If you increase these speeds, will the telcos find a loophole to ignore them, or might they unintentionally slip through the net?
Under the 25 Mbps requirements it has been suggested 24 million US customers, just over 7%, fall into the digital divide, though this is an estimate. And of course, this 24 million figure is only meaningful if you judge the digitally served customers as those who can theoretically access these products.
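As a quick sanity check, the 24 million figure does indeed come out at just over 7%, assuming a US population of roughly 327 million (a figure not given in the original and used here purely for illustration):

```python
# Rough sanity check on the FCC-based estimate: 24 million people in the
# digital divide against an assumed US population of ~327 million.
population = 327_000_000
in_divide = 24_000_000
print(f"{in_divide / population:.1%}")  # → 7.3%
```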
A couple of weeks ago, Microsoft released research which suggested the digital divide could be as wide as 150 million people. We suspect Microsoft is massaging the figures, but there will certainly be a difference because of the way the digital divide has been measured.
In the research, Microsoft measured internet usage across the US, including those who have broadband but are not able to surf the web at acceptable speeds. Microsoft counts as in the digital divide those who are under-served or have no internet at all, whereas the FCC seems to be taking the approach of theoretical accessibility. There might be numerous reasons people fall into the digital divide but are not counted by the FCC, the price of broadband for example, but this variance shows the issue.
Another excellent example is in Ookla’s speed tests across Q2-Q3, which were released this week. The Ookla data suggests a 35.8% increase in mean download speed during the last year, ranking the US as the 7th best worldwide for broadband download speeds. According to this data, the average download speed across the US for Q2-Q3 was 96.25 Mbps. This research would suggest everything is rosy in the US and there is no digital divide at all.
As you can see, there is no consolidated approach to measuring the digital divide. Before we know it, campaigning for the next Presidential Election will begin and the digital divide will become another political tool. Republicans will massage the figures to make it seem like the four-year period has been a successful one, while Democrats will paint a post-apocalyptic image.
And of course, it is not just the politicians who will play these political games. Light Reading’s Carol Wilson pointed out that Microsoft has a commercial stake in getting more bandwidth to more people: the more people who can access its cloud apps, the more money it makes. Should we trust this firm to be objective in contributing to the digital divide debate? Even if the digital divide is narrowing, Microsoft will want to paint a gloomy picture to encourage more investment, as this would increase its own commercial prospects.
The issue at the heart of the digital divide is investment and infrastructure. The telcos need to be incentivised to put networks in place, irrespective of the commercial rewards from the customer. Seeing as this bridge is being built at a snail’s pace, you would have to assume the current structure and depth of federal subsidies is simply not good enough.
The final complication to point out is the future. Ovum’s Kristin Paulin pointed out that those counted in the digital divide are only those not passed by fixed broadband, which doesn’t take into account that almost every US citizen has access to one of the four LTE networks. Fixed Wireless Access will certainly play a role in the future of broadband, but whether this is enough to satisfy the increasingly intensifying data diets of users is unknown. 5G will certainly assist, but you have to wonder how long it will take to get 5G to the regions which are suffering in the divide today.
Paulin points to the affordability question as well. With the FCC only counting those US citizens who cannot access the internet in the digital divide, who knows how many citizens there are who can’t afford broadband. A New York Times article from 2016 suggested the average broadband tariff was $55 a month, meaning 25% of the city’s residents, and 50% of those who earned under $20,000, would not be able to afford broadband. The Lifeline broadband initiative is supposed to help here, but Paulin politely stated this is suffering some hiccups right now.
If citizens cannot afford broadband, is this even a solution? It’s like trying to sell a starving man, with $10 in his wallet, a sandwich for $25. What’s the point?
Mobile broadband might well be the answer (Nokia certainly believes a fibre network with wireless wings is the way forward), though progress is slow here. Congestion is increasingly becoming a problem, while video, multi-screen and IoT trends will only make the matter more complicated.
As it stands, the digital divide is a political ping-pong ball being battered as it ducks and dives all over the landscape. But the US technology industry needs to ask itself a very honest question: how big is the digital divide? Right now, we’re none the wiser, and it will never be narrowed without understanding the problem in the first place.
Artificial intelligence (AI) is going to play a critical role in network security in the coming years and is already helping BT defend its infrastructure.
Ben Azvine, the Global Head of Security Research & Innovation at BT, has been at the heart of cutting-edge network security developments at BT for several years and has helped develop a cybersecurity strategy that combines AI-enabled visualization of cybersecurity threats with highly-trained network security personnel. He shared some of his thoughts on the matter with attendees at this week’s Broadband World Forum event.
“We are taking AI and making it help humans to be better… We are more about the Iron Man version of AI than the Terminator version,” he said, sparking ludicrous cinematic pitch ideas in the minds of some of his audience (I mean, Alien vs Predator sort of worked, right?).
Azvine pointed out that with the number of connected devices growing rapidly, old ways of securing assets were no longer relevant: Now, companies (including network operators) need to think about having a cybersecurity strategy comprising three steps – prevention, detection/prediction and response. The response needs to be much quicker than in the past (hours, not days) while the detection/prediction is tough to do without sophisticated analytics and AI algorithms.
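To make the detection step concrete, here is a minimal sketch of the kind of statistical baseline that more sophisticated AI-driven detection builds on. This is purely illustrative (a simple z-score test on a made-up series of failed-login counts), not a description of BT's actual system:

```python
import statistics

def flag_anomalies(samples, threshold=3.0):
    """Flag indices whose value lies more than `threshold` standard
    deviations from the mean of the series (a classic z-score test)."""
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(samples)
            if abs(x - mean) / stdev > threshold]

# Hourly counts of failed logins on a hypothetical network element;
# the spike in the final hour is the kind of event a detection step
# needs to surface in hours, not days.
failed_logins = [12, 9, 11, 10, 13, 8, 10, 11, 9, 12, 10, 240]
print(flag_anomalies(failed_logins))  # → [11]
```

The point Azvine and others make is that production systems go well beyond this, using learned models rather than fixed thresholds, but the principle of baselining normal behaviour and flagging deviations is the same.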
What BT is doing is a great example of analytics and AI in action in the communications networking sector, rather than AI as a marketing hype machine — see ‘Why BT’s Security Chief Is Attacking His Own Network’ for more details.
But security is just one of seven key telecom AI use cases, as identified in a recent report, Artificial Intelligence for Telecommunications Applications, from research house Tractica (a sister company to Telecoms.com).
That report identified, alongside security, six further use cases:
1) Network operations monitoring and management
2) Predictive maintenance
3) Fraud mitigation
4) Customer service and marketing virtual digital assistants (or ‘bots’)
5) Intelligent CRM systems
6) Customer experience management.
“The low hanging fruit seems to be chat bots to augment call center workers,” said Heavy Reading Senior Analyst James Crawshaw, who will be one of the expert moderators digging deeper into the use of AI tools by telcos during Light Reading’s upcoming ‘Software-Defined Operations & the Autonomous Network’ event.
“The more challenging stuff is making use of machine learning in network management. That’s still a science project for most operators; Verizon’s Matt Tegerdine was pretty frank about that in his recent interview with Light Reading.” (See Verizon: Vendor AI Not Ready for Prime Time.)
That analysis from the Verizon executive shows it’s still early days for the application of machine learning in production communications networks. And, as Crawshaw noted, AI is not a magic wand and can’t be applied to anything and everything. “It can be applied to the same things you would apply other branches of mathematics to, such as statistics. But it’s only worth using if it brings some advantage over simpler techniques. You need to have clean data and a clear question you are seeking to answer; you can’t just invoke machine learning to magically make everything good,” adds the analyst, bringing a Harry Potter element to the proceedings.
So what should network operators be doing to take advantage of AI capabilities? BT appears to have set a good example by hiring experts, investing in R&D, applying AI tools in a very focused way (on its cybersecurity processes) and combining the resulting processes with human intelligence and know-how. “You don’t need to recruit an army of data scientists to take advantage of machine learning,” said Crawshaw. “Nor should you remain totally reliant on third parties. Develop a core team of experts and then get business analysts to leverage their expertise into the wider organisation.”
Blockchain technology is set to be used by telcos in multiple applications across all areas of operations in coming years, according to an industry analyst who has delved into the potential use of distributed ledger technology (DLT) in the space.
James Crawshaw, a senior analyst at Heavy Reading, says communications service providers (CSPs) see significant potential for the use of the much-hyped technology, which is best known for underpinning cryptocurrencies such as Bitcoin.
“Today, CSPs use databases for thousands of applications. Blockchain might reach dozens of applications in the next few years. Examples include mobile number portability, SLA monitoring, or replacing CDRs for billing,” says the analyst, who describes blockchain, in essence, as a “decentralized, immutable electronic ledger; a write-once-read-many record of historical transactions, as opposed to a database that can be written over.”
Currently, CSPs are considering using blockchain in three key areas, according to Crawshaw:
- Fraud management: for roaming and subscription identity fraud.
- Identity management: storing identity transactions (network logins, purchases, etc.).
- IoT connectivity: a blockchain could enable secure and error-free peer-to-peer connectivity for thousands of IoT devices with cost-efficient self-managed networks.
Crawshaw examined those use cases in depth in a recent report, Blockchain Opportunities for CSPs: Separating Hype From Reality.
And while there is a certain level of marketing enthusiasm around blockchain currently, that shouldn’t get in the way of real-world tests and deployments, notes the analyst.
“Like all complex new technologies there is a degree of hype and bandwagon-jumping with blockchain. Its main purpose is as an alternative to centralized systems for recording information (primarily databases). By distributing the control, you eliminate the risk of a hack of the central controller and the information being altered fraudulently. By using clever computer science you can replace the central controller (and the fees they normally charge) with software and get a cheaper, more reliable solution. But in most cases where we use a database today we will continue to use them in the future,” notes Crawshaw.
So which CSPs are taking the lead with the exploration of blockchain as a useful tool? Colt is one network operator that has been taking a close look at multiple ways to exploit blockchain’s potential for some time.
The operator, in collaboration with Zeetta Networks, is also set to deliver a proof-of-concept demonstration of a blockchain-based offering that enables network carriers to buy and sell network services in a secure, distributed marketplace. That PoC will be unveiled at the upcoming MEF2018 show in Los Angeles.
And Colt is one of the operators participating in a panel discussion – What Opportunities Are There For Blockchain In Telecoms & How Can These Aid Automation? – on November 8 in London as part of Light Reading’s ‘Software-Defined Operations & the Autonomous Network’ event. PCCW Global and Telefónica will also be involved in that discussion.
There are also a number of industry initiatives involving multiple CSPs; the key ones related to blockchain are:
- The Carrier Blockchain Study Group, which counts Axiata, Etisalat, Far EasTone, KT, LG Uplus, PLDT, SoftBank, Sprint, Telin, Turkcell, Viettel and Zain among its participants
- The Mobile Authentication Taskforce, which includes AT&T, Sprint, T-Mobile and Verizon
- The International Telecoms Week Global Leaders’ Forum, in which BT, HGC Global Communications, Telefónica and Telstra are involved.
In time, blockchain might be joined in CSP back offices by other DLTs. “Blockchain is a particular type of DLT that uses cryptographically hashed blocks to record transactions in a time series or chain. If security is less of an issue you could use a simpler DLT. But then again, you might just use a regular database,” notes Crawshaw.
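The "cryptographically hashed blocks... in a time series or chain" structure Crawshaw describes can be sketched in a few lines. This is an illustrative toy, not any CSP's implementation, and the CDR identifiers are made up; the point is the write-once-read-many property, where any tampering with a historical record breaks the chain of hashes:

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Create a block whose hash covers its contents and the previous block's hash."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash; any altered block (or broken link) fails verification."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain of hypothetical call-detail records
chain = [make_block(["CDR-001"], prev_hash="0" * 64)]
chain.append(make_block(["CDR-002"], prev_hash=chain[-1]["hash"]))
chain.append(make_block(["CDR-003"], prev_hash=chain[-1]["hash"]))
assert verify_chain(chain)

# Rewriting a historical record is immediately detectable
chain[1]["transactions"] = ["CDR-002-forged"]
assert not verify_chain(chain)
```

A conventional database, by contrast, can simply be written over, which is exactly the distinction Crawshaw draws between the two.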