A look back at the biggest stories this week

Whether it’s important, depressing or just entertaining, the telecoms industry is always one which attracts attention.

Here are the stories we think are worth a second look this week:


GSMA cosies up to O-RAN Alliance

The GSMA, the telco industry lobby group, has announced a new partnership with the O-RAN Alliance to accelerate the adoption of Open Radio Access Network (RAN) technologies.

Full story here


Europe backtracks on market consolidation opposition

The General Court of the Court of Justice of the European Union has annulled the European Commission's 2016 decision to block the merger between O2 and Three in the UK, potentially opening the door for consolidation.

Full story here


Huawei CFO loses first legal battle in extradition case

Huawei CFO Meng Wanzhou, the daughter of founder Ren Zhengfei, has lost her first legal battle in Canada, meaning the extradition case against her will now proceed.

Full story here


Data privacy is in the same position as cybersecurity five years ago

It has taken years for the technology and telecoms industry to take security seriously, and now we are at the beginning of the same story arc with privacy.

Full story here


Indian telco association pushes for ‘floor tariffs’ on data pricing

In an open letter to India’s telecoms regulator, the Cellular Operators Association of India (COAI) has pressed for quicker decision making on pricing restriction rules.

Full story here


UK’s National Cyber Security Centre launches another Huawei probe

The National Cyber Security Centre (NCSC) has confirmed it is attempting to understand what impact potential US sanctions directed towards Huawei would have on UK networks.

Full story here



Making Sense of the Telco Cloud

In recent years the cloudification of communication networks, or “telco cloud”, has become a byword for telecom modernisation. This Telecoms.com Intelligence Monthly Briefing aims to analyse what telcos’ transition to the cloud means for the stakeholders in the telecom and cloud ecosystems. Before exploring the nooks and crannies of telco cloud, however, it is worthwhile first taking an elevated view of cloud native in general. On one hand, telco cloud is a subset of the overall cloud native landscape; on the other, “telco cloud” almost sounds like an oxymoron. Telecom operators’ monolithic networks and cloud architecture are often seen as two different species, but such impressions are wrong.

(Here we are sharing the opening section of this Telecoms.com Intelligence special briefing, which looks into how telco cloud is changing both the industry landscape and operator strategies.

The full version of the report is available for free to download here.)

What cloud native is, and why we need it

“Cloud native” has been a buzzword for a couple of years, though, as with many other buzzwords, different people often mean different things when they use the same term. As the authors of a recently published Microsoft ebook quipped, ask ten colleagues to define cloud native, and there’s a good chance you’ll get eight different answers. (Rob Vettor, Steve “ardalis” Smith: Architecting Cloud Native .NET Applications for Azure, preview edition, April 2020)

Here are a couple of “cloud native” definitions that more or less agree with each other, though with different emphases.

The Cloud Native Computing Foundation (CNCF), an industry organisation with over 500 member organisations from different sectors of the industry, defines cloud native as “computing (that) uses an open source software stack to deploy applications as microservices, packaging each part into its own container, and dynamically orchestrating those containers to optimize resource utilization.”

Gabriel Brown, an analyst from Heavy Reading, has a largely similar definition for cloud native, though he puts it more succinctly. For him, cloud native means “containerized micro-services deployed on bare metal and managed by Kubernetes”, the de facto standard for container orchestration.
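To make that definition concrete, below is a minimal sketch, using only Python's standard library, of the kind of small, single-purpose service Brown describes. The endpoint names and port are our own illustrative assumptions, not anything prescribed by Kubernetes: packaged into a container image and run in many replicas, this is the sort of unit Kubernetes would schedule, scale and restart, typically by polling a health endpoint like the /healthz route shown here.

```python
# A minimal sketch (our illustration, not from the report) of a containerised
# microservice: one small, self-contained HTTP service exposing one business
# capability plus a health endpoint a Kubernetes liveness probe could poll.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class MicroserviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            # Hypothetical liveness probe target: the orchestrator restarts
            # the container if this stops answering.
            self._respond(200, {"status": "ok"})
        elif self.path == "/subscribers/count":
            # Stand-in for a single, narrowly scoped business function.
            self._respond(200, {"subscribers": 42})
        else:
            self._respond(404, {"error": "not found"})

    def _respond(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # In a cluster, Kubernetes would run many replicas of this container and
    # route traffic between them; locally it is just a server on port 8080.
    HTTPServer(("", 8080), MicroserviceHandler).serve_forever()
```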

Although cloud native has a strong inclination towards containers, or containerised services, it is not just about containers. An important element of cloud native computing is its deployment model, built on DevOps. This is duly stressed by Omdia, a research firm, which prescribes cloud native as follows: “the first foundation is to use agile methodologies in development, building on this with DevOps adoption across IT and, ideally, in the organization as well, and using microservices software architecture, with deployment on the cloud (wherever it is, on-premises or public).”

Some would argue the continuous nature of DevOps is as important to cloud native as the infrastructure and containerised services. Red Hat, an IBM subsidiary and one of the leading cloud native vendors and champions of DevOps practices, sees in cloud native a number of common themes, including “heavily virtualized, software-defined, highly resilient infrastructure, allowing telcos to add services more quickly and centrally manage their resources.”

These themes are aligned with Telecoms.com Intelligence’s understanding of cloud native, and this report will discuss cloud native and telco cloud along these lines. (A full Q&A with Azhar Sayeed, Chief Architect, Service Provider at Red Hat, can be found at the end of this report.)

The main benefits of cloud native computing are speed, agility, and scalability. As CNCF spells it out, “cloud native technologies empower organizations to build and run scalable applications in modern, dynamic environments such as public, private, and hybrid clouds. Containers, service meshes, microservices, immutable infrastructure, and declarative APIs exemplify this approach. These techniques enable loosely coupled systems that are resilient, manageable, and observable. Combined with robust automation, they allow engineers to make high-impact changes frequently and predictably with minimal toil.”

To adapt such thinking to the telecom industry, the gains from migrating to cloud native are primarily a reflection of, and driven by, the increasing convergence between the network and IT domains. The first candidate domain that cloud technology can vastly improve on, and to a certain degree replace the heavy infrastructure of, is the support for the telcos’ own IT systems, including the network-facing Operational Support Systems and customer-facing Business Support Systems (OSS and BSS).

But the IT cloud alone falls far short of what telcos can gain from the migration to cloud native. The rest of this report will discuss how telcos can and do embark on the journey to cloud native, as a means to deliver true business benefits, through improved speed, agility, and scalability, to their own networks and their customers.

The rest of the report includes these sections:

  • The many stratifications of telco cloud
  • Clouds gathering on telcos
  • What we can expect to see on the telco cloud skyline
  • Telco cloud openness leads to agility and savings — Q&A with Azhar Sayeed, Chief Architect, Service Provider, Red Hat
  • Additional Resources

The full version of the report is available for free to download here.

US consumers don’t feel there are benefits to data-sharing economy

Only 7.6% of US consumers feel they benefit from the behavioural data collected through user tracking, as research demonstrates pessimism towards the digital economy.

The reason companies want to track existing or potential customers, while also collecting insight on these individuals, is simple; it is easier to sell goods and services to someone you know more about. But in order to take something for free, you have to offer a benefit in return. This equation does not seem to be balanced currently.

Research from AI firm Cujo suggests 64.2% of the surveyed consumers do not believe tracking is beneficial to the user, while only 28.2% said it could be. A meagre 7.6% believe they get the benefits of tracking.

If users do not see the benefits of tracking and personalisation, there will be resistance and push-back against these practices. Data and insight are being touted as a central cog of new business models, but these strategies will fail if the consumer is not brought along on the same mission.

Sentiment is clearly moving against data collection, so much so that 61.9% of respondents to the survey would be happy to be tracked less, even if personalisation was affected.

The question is: what service is being provided in return for tracking users and collecting data?

Google clearly tracks users, though the benefits emerge in several different ways. For example, more accurate results are shown when using the search engine, and more favourable restaurants are shown on the mapping services. This benefits the user, while also making Google money.

Netflix is another example where the benefits are clear. The recommendation engine will, theoretically, help customers navigate the extensive back catalogue, while understanding consumer behaviour will also inform decisions on what content is created in the future.

These are logical applications of data insight, something the user can see benefits from, even if they might not appreciate them. The larger issue, however, is with the majority of companies which collect data with no obvious reason why, and no clear indication of where the benefits are.

For the most part, this might be viewed as a security risk, an unnecessary ‘transaction’ to make, and considering the security credentials of most of these organisations, the consumer is right not to place trust in them.

City of Austin jumps in bed with NTT Data for smart city project

While some of the buzz surrounding smart cities has quietened, NTT Data is boasting of a new initiative with the City of Austin to address traffic-related issues.

Initially focused on easing congestion through the Texan city, the initiative could be expanded to numerous other areas of traffic management. This is a very small trial for the moment, focused on the intersections of Cesar Chavez Street and Trinity Street, and of Neches Street and 8th Street, though additional locations will be added in time.

“We are piloting NTT because these solutions have the potential to help Austin digitally transform how people move safely through the city,” said Jason JonMichael, Assistant Director of Smart Mobility at Austin Transportation. “By better understanding the data and causal effects of problems we see in challenging areas, we can develop effective solutions that meet the community’s needs.

“Evaluating data is key to reaching our Vision Zero goal of eliminating fatalities and serious injuries on Austin roadways. Smart technologies like this one will help us prioritize improvements to make our streets safer.”

Using NTT’s Accelerate Smart data platform, the project will collect traffic-related and mobility data through vehicle counting and classification, as well as wrong-way vehicle occurrences. The project will allow for the delivery of real-time alerts and traffic statistics that improve predictions, traffic management and future infrastructure planning.
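As a rough illustration of what such a pipeline involves, here is a hypothetical sketch in Python; the event fields, intersection names and alerting logic are our own assumptions for illustration, not NTT's Accelerate Smart API.

```python
# Hypothetical sketch: turn a stream of per-vehicle detection events into
# per-intersection traffic statistics plus real-time wrong-way alerts.
from dataclasses import dataclass
from collections import Counter

@dataclass
class VehicleEvent:
    intersection: str      # e.g. "Cesar Chavez & Trinity" (illustrative)
    vehicle_class: str     # e.g. "car", "truck", "bus"
    wrong_way: bool        # assumed to be flagged by a camera-side classifier

def process(events):
    counts = Counter()
    alerts = []
    for e in events:
        # Vehicle counting and classification, aggregated per intersection.
        counts[(e.intersection, e.vehicle_class)] += 1
        if e.wrong_way:
            # Real-time alert for a wrong-way vehicle occurrence.
            alerts.append(f"ALERT: wrong-way {e.vehicle_class} at {e.intersection}")
    return counts, alerts

counts, alerts = process([
    VehicleEvent("Cesar Chavez & Trinity", "car", False),
    VehicleEvent("Neches & 8th", "truck", True),
])
print(counts)   # the statistics that feed predictions and planning
print(alerts)   # ['ALERT: wrong-way truck at Neches & 8th']
```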

This initiative ties into the city’s overarching Vision Zero project, which aims to end deaths and serious injuries on Austin roadways. The majority of these projects are civil-engineering based, adding a second left-turn lane to Slaughter Lane at South 1st Street for example, but all are underpinned by data collection and analysis.

Arizona Attorney General sues Google for misleading data collection practices

Arizona Attorney General Mark Brnovich has filed a lawsuit against Google for what he describes as ‘deceptive and unfair’ methods to secure valuable personal data.

While it is hardly unusual for Google to find itself on the wrong side of right when it comes to data collection and privacy practices, attracting the attention of a single Attorney General could be a worrying start. These lawyers have a tendency to swarm around an adversary, collecting support from counterparts in other states. Simply look at how easily New York Attorney General Letitia James rallied disciples in failed opposition to the T-Mobile US and Sprint mega-merger, as well as a previous antitrust case against Google.

“While Google users are led to believe they can opt-out of location tracking, the company exploits other avenues to invade personal privacy,” said Brnovich. “It’s nearly impossible to stop Google from tracking your movements without your knowledge or consent. This is contrary to the Arizona Consumer Fraud Act and even the most innovative companies must operate within the law.”

The basis of this lawsuit is whether Google is acting in accordance with the rules set forth in Arizona consumer law. Brnovich details that the majority of Google’s revenues are derived from the collection of valuable personal information, though he also claims this is often done without the users’ consent or knowledge.

In 2018, the Associated Press ran an article which claimed Google was continuing to collect data even when the user had explicitly removed consent. This practice seemingly carried on until mid-2018 and forms the basis of the case for Arizona. However, this is only the tip of the iceberg.

Following a two-year investigation, the Arizona Attorney General’s office has filed a 50-page complaint against Google in the Maricopa County Superior Court, featuring internal documents, under-oath testimony from Google employees, as well as external opinions from academics condemning the activities.

A significant proportion of the information has been redacted and will be examined in private, thanks to confidentiality claims from Google, but the State lawyers will be pushing for more to be made public. Over the course of the next few weeks this could be a very interesting case to keep an eye on as details of the internal workings of Google are potentially exposed. Few people genuinely understand how Google works, so this could be very illuminating.

This will be an interesting case, though Brnovich will have to rally some support very quickly. The privacy advocacy organisations are remaining quiet for the moment, as are other politicians and Attorneys General. That might change by this afternoon as our transatlantic cousins wake up, but fighting the powerful Google legal department solo is unlikely to end well for Arizona’s Attorney General.

China deliberates privacy law in the midst of increased state surveillance

China’s parliament has said it will legislate on privacy protection, while the state has vastly increased surveillance since the outbreak of COVID-19.

The National People’s Congress, China’s highest legislature, is back in session after being delayed by two months by COVID-19. In his work report, the Chairman of the People’s Congress’s Standing Committee singled out three pieces of legislation related to state security and social control as priority tasks for the immediate future. Privacy protection is one of them, the other two being laws on data security and biosecurity, according to reporting by People’s Daily, one of China’s main propaganda outlets.

This does not come as a complete surprise. At the end of last year, the People’s Congress announced at a press conference that a comprehensive privacy law would go through the legislative process in 2020. So far China’s privacy protection legislation is dispersed across different criminal, civil, and commercial laws, and it often relies on the interpretation of judges when it comes to specific litigation. This gives those organisations, businesses, and individuals with almost unbridled access to personal and private data an almost free hand in deciding how to use it. A group of consumers in China actually lost their case against Amazon when their private data on the e-commerce giant’s China site was compromised, which led to them losing large amounts of money to phishing schemes.

Tencent and Alibaba have deployed facial recognition solutions at retail outlets, where users of their online payment systems can pay for their purchases by looking at the camera at the checkout point. It is true that such solutions are both convenient and add fun to the shopping experience, and it may also be true that attitudes towards privacy in China are different from those in Europe. “In China, and across Asia, data is not seen as something to be locked down, it’s something that can be used,” according to a Hong Kong-based lawyer.

More recently, while the country was combating COVID-19, various tracing applications were developed and deployed using personal data including names, dates of birth, physical addresses, ID numbers, geolocation records, and the like. Some of these apps were jointly developed by commercial entities and public authorities, including law enforcement agencies. Some people have raised the concern of who should keep such sensitive data once the emergency is over, and for how long.

Probably more important is the scope of application of the impending law. The discussion in China’s official media is all about how to protect private data from being misused or abused by businesses, in particular the internet companies that have both access to the data and the technologies to benefit from it. It cannot help but give the impression that the law is designed primarily to keep big businesses in check, without tying the government’s hands.

While the state legislature was announcing the new law to be codified, China has vastly increased surveillance of its people, especially during the COVID-19 pandemic. Reuters reported that the country has seen “hundreds of millions of cameras in public places” being set up in cities and villages, as well as “increasing use of techniques such as smartphone monitoring and facial recognition.” The authorities have successfully located people infected with COVID-19 using surveillance images and facial recognition technologies, state media reported.

However, despite all the talk about AI, big data, and facial recognition, surveillance in China is still largely done by human beings constantly watching surveillance camera footage on screens and smartphones, which does not come cheap. Some 4,400 cameras were installed in a village in Hubei, the province where COVID-19 first started, at a cost of $5.6 million, or roughly $1,270 per camera, according to the Reuters report.


Telecoms.com Daily Poll:

Should privacy rules be re-evaluated in light of a new type of society?


51% of IT pros disagree with Gov approach to COVID-19 app

With the UK’s COVID-19 tracing application being tested on the Isle of Wight, only 24% of IT professionals believe the initiative will be successful.

Research from BCS, an association for IT professionals, suggests the Government is struggling to source support from the IT community. This is not to say the efforts will not be a success, but it is hardly a confidence boost.

“BCS is clear that if done ethically and competently a tracing app can make a huge contribution to stopping the spread of COVID-19; but a majority of our members don’t believe the current model will work and are worried about the reliance on a centralised database,” said Bill Mitchell, Director of Policy at BCS.

“Yet despite their doubts 42% would still install the app and 21% are undecided. It feels like there is a lot of goodwill out there to give a tracing app a chance – if it can be shown to work. That means if these concerns are fully addressed then maybe over 60% of the population will install a high-quality app. That’s the magic adoption figure we need for the app to have real impact on stopping COVID-19.”

According to the research, only 24% believe the application will succeed. 32% explicitly believe it will fail and the remainder are still undecided. Interestingly enough, 51% believe the Google/Apple approach, the decentralised model where data is stored on user devices, should have been taken forward by the Government.

This is an argument which will persist as long as the coronavirus does. Some countries have opted for the decentralised model, which is being championed by Silicon Valley, and others have gone for centralised. What is worth noting is there is a very valid argument for the centralised data approach.

“If you don’t have the data at the starting point of the tunnel, you are facing a challenge,” said Sebastien Ourselin, a professor at King’s College London, during an industry conference. “When you want to react quickly, access to the data is key.”

Ourselin’s argument is that a centralised data model means you have access to the data all the time, whenever you want. It means you can run different models and apply different conditions to forecasting models, which is a lot more difficult when you only have access to the insight, not the raw data.
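A toy example makes the distinction concrete. The records and fields below are invented for illustration; the point is simply that centralised raw data can be re-queried under new conditions after the fact, while a pre-computed insight cannot.

```python
# Illustrative sketch of Ourselin's point (the data and fields are invented):
# raw, centralised records support any new question; a fixed aggregate does not.
raw_contacts = [
    {"age": 23, "contacts": 14, "region": "Isle of Wight"},
    {"age": 67, "contacts": 3,  "region": "Isle of Wight"},
    {"age": 45, "contacts": 9,  "region": "Portsmouth"},
]

# Decentralised-style "insight": one aggregate, computed once and published.
published_average = sum(r["contacts"] for r in raw_contacts) / len(raw_contacts)

# Centralised raw data: new cuts and conditions can be applied at any time.
over_60 = [r["contacts"] for r in raw_contacts if r["age"] > 60]
island = [r["contacts"] for r in raw_contacts if r["region"] == "Isle of Wight"]

print(published_average)            # all you can work with without raw data
print(sum(over_60) / len(over_60))  # new cut: average contacts among over-60s
print(sum(island) / len(island))    # new cut: average contacts on the island
```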

The issue with the Government’s decision to centralise data is one of credentials.

When asked what the IT pros were concerned about, and why they would not download the app, 69% said data security, 67% pointed to privacy, 59% worried it was a pointless exercise and 49% lacked trust in the Government.

The final concern is why some might suggest opting for the Google/Apple route would have been more successful. People don’t trust Governments, but the majority have already handed personal information over to Silicon Valley. There is a respect for the smarts and capabilities of these companies. The Government could have weaponised Silicon Valley’s credibility to drive user adoption of the application.

If the Government fails to convince the general public to adopt this app, it will not succeed as imagined. There will be a valid contribution, but for a material success 60% of the population will have to download the application. This is a tough ask, though the lessons learned from the Isle of Wight trials should provide some valuable insight.


Telecoms.com Daily Poll:

Who would you consider the King of Innovation in the telco industry currently?


Finland joins the quest for quantum computing strength

The Technical Research Centre of Finland is going to build the country’s first quantum computer, joining a growing European contingent competing at the frontier of next generation computing technology.

VTT, Finland’s state-owned Technical Research Centre (Teknologian tutkimuskeskus VTT Oy), announced that it will design and build the country’s first quantum computer, in partnership with “progressive Finnish companies from a variety of sectors”, aiming to “bolster Finland’s and Europe’s competitiveness” in this cutting-edge technology.

“In the future, we’ll encounter challenges that cannot be met using current methods. Quantum computing will play an important role in solving these kinds of problems,” said Antti Vasara, CEO of VTT. Referring to the country’s challenge of post-COVID-19 recovery, Vasara said “it’s now even more important than ever to make investments in innovation and future technologies that will create demand for Finnish companies’ products and services.”

The multi-year project, with a total cost estimated at €20-25 million, will run in phases. The first checkpoint will come about a year from now, when VTT aims to “get a minimum five-qubit quantum computer in working order”, it said in the press release. The qubit, or “quantum bit”, is the basic information unit in quantum computing, analogous to the binary digit, or “bit”, in classical computing.
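For readers who want the textbook formulation: where a classical bit is either 0 or 1, a qubit can sit in a superposition of both basis states at once, conventionally written as

```latex
% State of a single qubit: a complex superposition of the two basis states,
% with the amplitudes alpha and beta normalised to unit total probability.
\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
```

An n-qubit register therefore spans a 2^n-dimensional state space, which is why each additional qubit, in principle, doubles the machine’s computational room.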

In all fairness, this is a modest target on a modest budget. To put the 5-qubit target into perspective, by late last year Google claimed that its quantum computer had achieved 53-qubit computing power. It could perform a task in 200 seconds that would take Summit, one of IBM’s supercomputers, 2.5 days, by IBM’s own admission; 2.5 days is 216,000 seconds, a speed-up of roughly a thousand times. At the time of writing, VTT has not responded to Telecoms.com’s question on the project’s ultimate target.

When it comes to budget, the VTT amount is easily dwarfed by the more ambitious projects. The most advanced quantum computers in the world are developed and run by the leading American technology companies and academic institutions, for example MIT, IBM, and Google, but other parts of the world are quickly building their own facilities, including businesses and universities in Japan, India, China, and Europe. One recent high-profile case is IBM’s decision to build Europe’s first commercial quantum computer with Fraunhofer, Germany’s state-backed research institute, near Stuttgart.

In addition to getting closer to, and better serving, the European markets in the future, IBM’s decision to build a quantum computer in Europe also has to do with GDPR requirements. While European businesses can use IBM’s quantum computers located in the US through the cloud, they may hesitate to send user data outside of the EU. The Fraunhofer project has been personally endorsed by Angela Merkel, the German Chancellor, and the federal government has pledged €650 million of investment for quantum computing, though not in the Fraunhofer project alone.

When it comes to quantum computing applications in the communications industry, there are at least two areas where it can have a strong impact. The first is security: quantum computing will enable new modes of cryptography. The second is new materials. Daimler, the carmaker, has already used IBM’s quantum computers to design new batteries for its electric cars by simulating the complex molecule-level chemistry inside the battery cells. Beyond batteries, another research topic in new materials for the communications industry is finding a replacement for silicon as a semiconductor in extremely high radio spectrum bands.

Despite its modest scope, the VTT undertaking is significant. Not only does it give Finland the right to boast of being the first Nordic country to build its own quantum computer, but the success of the project would also “provide Finland with an exceptional level of capabilities in both research and technology”. Faced with the worst economic crisis since the collapse of the Soviet Union, the Nordic nation is looking to technology breakthroughs for sustainable revival and long-term competitiveness. The quantum computing capability of this project, limited by its scope, may not challenge for supremacy, but it should at least give Finland the table stakes.

COVID-19 forces Alphabet to pull plug on Page’s pet project

Alphabet has decided to terminate a smart city project on Toronto’s waterfront, a 2.5-year-old project undertaken by its subsidiary Sidewalk Labs and a favourite of Google co-founder Larry Page.

Daniel L. Doctoroff, CEO of Sidewalk Labs, announced the decision to abandon the Quayside project in an article on Medium, blaming it primarily on economic difficulties caused by the ongoing COVID-19 pandemic. “As unprecedented economic uncertainty has set in around the world and in the Toronto real estate market, it has become too difficult to make the 12-acre project financially viable without sacrificing core parts of the plan we had developed together with Waterfront Toronto to build a truly inclusive, sustainable community”, Doctoroff said.

The company has “invested time, people, and resources in Toronto, including opening a 30-person office on the waterfront”, according to Doctoroff, though so far it has not built anything. It had not even received final development approval by the time of its demise, despite having been endorsed by governments at the city, provincial, and national level, including being championed by Prime Minister Justin Trudeau. The Wall Street Journal cited people familiar with the Toronto situation saying that “Alphabet had poured hundreds of millions of dollars into Sidewalk, with most of that earmarked for the Toronto project, and yet had little to show for it.”

Launched by Larry Page, Sidewalk Labs, a graduate of Alphabet’s moonshot incubation organisation and now a subsidiary within its Other Bets unit, is one of those wild and weird ideas that cost Alphabet over $26 billion last year with little in the way of returns. In the good old days the company could afford such luxuries, but with the costs of a slumping global economy biting in all directions, even Alphabet has had to calculate the return on its investments. Since last month the company has practically imposed a hiring freeze on non-essential functions.

Cost aside, there has long been data security and privacy concerns from the community. Sidewalk Labs promised to revolutionise city life with “ubiquitous connectivity, social networks, sensing, machine learning and artificial intelligence, and new design and fabrication technologies”, but these are precisely what concern the very people the Quayside project was meant to appeal to.

Google’s reputation preceded Alphabet’s participation in the project, leading Adrian Aoun, Sidewalk’s founder, to comment that “sometimes it’s easier not to be Google when going after bold ideas.” Prominent opponents of the project include Jim Balsillie, the former CEO of BlackBerry and an Ontario resident. “This is a major victory for the responsible citizens who fought to protect Canada’s democracy, civil and digital rights,” Balsillie said on hearing of Alphabet’s decision. “Sidewalk Toronto will go down in history as one of the more disturbing planned experiments in surveillance capitalism.”

In a way this could be seen as a missed opportunity to showcase what smart cities have to offer, which would have been encouraging to the flagging fortunes of IoT. Alphabet would have had the muscle to pull together the end-to-end smart city solutions where so many similar trials have failed.

5G enabling a new data-driven business model

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece Alex Gledhill, Global Account Director, Intel UK, looks at how 5G could transform the commercial use of data.

The coming of 5G will prove transformative for global enterprise. Through 5G network adoption, long-awaited solutions to a range of shortcomings in key communications technologies will emerge, and the limited ability of technology to contribute to business development and performance will be turned on its head. Reflecting this expectation, a recent telecoms report predicts that a third of mobile operators will deploy 5G standalone within two years. But significantly, it also indicates that half of operators intend to migrate to a common data layer for their network functions as they roll out their 5G offering.

New data model

The adoption of a common data model by operators is indicative of where the fifth generation of wireless communications technologies will prove truly transformative. The unprecedented connectivity inherent in 5G will serve to generate, activate and integrate business data to a previously impossible extent. This is apparent in the direction of travel for network architecture. The common data model will consolidate essential business data across areas including device engagement, network services, subscriptions and connectivity. It will also facilitate integration for data storage and access like never before. And this new data-driven model will represent an essential business enabler through access to new revenue streams across the telecoms space.

Next generation mobile

First generation mobile technology was all about connecting people and had little data-generating capability. This has transformed over the past decades, with mobile technology evolving into a data conduit. The sector’s essential priorities have shifted to include the provision of constant streams of diverse information and content to users. In turn, consumers themselves have become generators of unprecedented quantities and new forms of data. This dynamic is set to be supercharged across mobile with the rollout of 5G. As the promise of 5G takes hold, our customers are demanding the increased performance and flexibility they need to rapidly deliver services with lower latency where it is needed most. To help, Intel has created a portfolio for 5G network infrastructure development, including critical components for early 5G network deployment, which are enabling businesses to future-proof their offering in the face of 5G-driven transformation. The resulting availability of enhanced mobile broadband (eMBB) will be among the key results.

In this context, early use cases will include the harnessing of 5G’s ground-breaking connectivity to stream even higher quality video across expanding markets. And in terms of addressing the limitations of existing infrastructure technology, eMBB will expand service coverage across wide areas and address perennial problem points such as stadiums, housing complexes and shopping centres. A direct implication of this infrastructure improvement will be a significant increase in the amount of data used and generated by consumers. People will be empowered to generate and share whatever content they want, anywhere, and at any time.

The operational implications of this development will be seismic for mobile providers servicing the TikTok generation. Verizon, the US network provider and Intel partner, was early in recognising the transformative potential of 5G. The company is rapidly rolling out 5G Ultra Wideband services in the US. It was the first operator to offer Intel-enabled 5G home services and achieved a global industry first with 5G network edge computing. Directing the power of the cloud closer to mobile, Verizon is anticipating an array of new and previously unimagined use cases, connecting ever more devices at the edge of its Ultra Wideband network.

Data-rich customers  

For mobile operators and entrepreneurs across the telecoms space, the coming of 5G will bring a diverse range of operational improvements, which will serve to enhance their offering to customers. Such advancements will include faster network and data speeds, greater energy efficiency, lower latency, and increased bandwidth. In the broader sense, the improvements to network infrastructure will mean fundamental changes in consumer behaviour, with traditional broadband practices increasingly happening across mobile networks. Fundamental to this shift will be the capacity of 5G to significantly improve the efficiency of data transmission. Commercially, this will represent a game-changing advantage for operators. A more efficient network means cheaper data by the bit. And passing this benefit on to consumers will open a new era of data-generated business opportunities and trends across multiple sectors.

Data-fuelled business

Exploring the application potential of 5G in entertainment, a report from Intel predicts a radical redefinition of business models and the emergence of multiple new immersive, interactive and data-generating customer experiences. For instance, 5G is predicted to generate more than $140 billion in revenue from augmented reality (AR) and virtual reality (VR) applications between 2021 and 2028. And the data-generating potential from new use cases across this spectrum is multifaceted. In the case of AR, it will create a new way for consumers to connect with media through virtual tools, scenarios and characters. Users will also have unprecedented engagement with augmented contextual information. And AR will facilitate previously unimagined communications channels between content creators and their audiences. This amounts to a new 5G-powered, data-generating enterprise paradigm.

Now commonly referred to as Industry 4.0, this new business epoch will generate, and be fuelled by, previously unimaginable levels of smart data. In this context, 5G represents a virtual data network, enabling a fully connected and intelligent global society. Moreover, the essence of 5G is intelligent connectivity. Mobile networks in the coming decade will connect ever-increasing numbers of smart devices, helping to make the much-mooted internet of things (IoT) a realised fact.

From 5G to the Edge

A business landscape redefined by 5G will present myriad opportunities for operators and enterprises across the telecoms space. And the integration accessible through supercharged connectivity will result in the most powerful unified communications platform seen to date. It will also supercharge the growing smart digital services space as digitised communications reach new sectors and markets.

Rakuten Mobile, the Japanese operator and Intel partner, recognised the need for a fundamental redesign of its network platform in anticipation of the opportunities 5G will bring. This encompassed the development of a fully virtualised end-to-end cloud network architecture. With separate, built-in control and user planes, its network is now ready to embrace multiple new use cases, with further confidence drawn from the benefit of its future-proofed edge architecture. And through the adoption of Intel’s data centre processors for cross-network functionality, Rakuten gains the agility, efficiency, flexibility and capacity to pass cost benefits on to its customers in Japan.

As in the case of Rakuten, the ability of mobile operators and businesses to capitalise on this rapidly evolving world of 5G-powered smart data will ultimately depend on their ability to think ahead and adapt. The rate of adoption of a common data model by operators is encouraging in this regard. But in the wider sense, businesses will have to go further and faster to develop software-defined networks and cloud-based ecosystems for the essential scalability and flexibility needed in the face of a 5G-driven, data-defined world.