Consolidation in the B2B CSP market with Ribbon and ECI set to merge

Two companies involved in providing communications services to other companies are merging to make a single bigger one.

Ribbon Communications focuses on cloud communications as a platform while ECI provides network solutions to companies like operators and cloud providers, so there’s a distinctly cloudy theme to this merger. Rather confusingly Ribbon says it’s acquiring ECI through a merger. Ribbon is handing over 32.5 million shares, which equates to around $130 million, as well as $324 million in cash to mergaquire ECI.

“The ECI acquisition will extend Ribbon’s reach into the networking market and propel us into the global 5G market,” said Daryl Raiford, CFO of Ribbon. “ECI brings world class networking technology and a proven track record of success in winning top customers in direct competition with major industry players.

“Ribbon has long-standing, deep customer relationships in North America and Japan, which will provide immediate access to ECI solutions into these substantial markets. We believe this combination will create new revenue opportunities to drive growth, provide our customers and partners with a broader solutions portfolio, and generate significant long-term value for our stockholders.”

“We are excited to join forces with Ribbon, bringing together Ribbon’s and ECI’s rich portfolios of communications solutions,” said Darryl Edwards, CEO of ECI. “Both companies enjoy a distinguished operating history and are trusted suppliers to the world’s leading telecommunication service providers and enterprises. We aim to create a powerhouse company that offers world-class products for an enhanced customer experience, benefiting our combined global customer base.”

“With ECI’s solid position and long history in the packet-optical transport markets, this acquisition makes sense for Ribbon on multiple fronts, giving Ribbon an entry into the early and growing 5G xHaul transport market while providing its combined customers with a full stack of solutions,” said Don (couldn’t they have found a Darrell?) Frey, Analyst at Ovum. “In addition to cross-selling opportunities, this proposed acquisition will give Ribbon a broad product line and enhance scale as a communications solutions vendor to service providers and enterprises.”

The market doesn’t seem to have such a rosy view of this move, however, with Ribbon’s shares down a whopping 20% at time of writing. Maybe it has something to do with Ribbon CEO Franklin Hobbs bailing on the day of the announcement. Presumably he didn’t think this was such a great idea either, so you can see why investors might be feeling a bit twitchy. They could also have tired of Ribbon’s apparent addiction to M&A in general.

Public cloud gathering momentum in India – Gartner

Few countries are speeding towards the digital economy as quickly as India, and it seems the bug is catching as enterprise organizations start to surge spending on the public cloud.

Today’s India is almost unrecognisable from bygone years. With a renewed focus on digital from Government and regulatory agencies, telcos finally spending on networks and consumers demonstrating an incredible appetite for data, India is quickly closing the divide. An increase in public cloud spending only adds further confidence in progress.

“Moving to the cloud and investing in public cloud services have become imperative to the success of digital business initiatives,” said Gartner Analyst Sid Nag.

“It’s no longer a question of ‘why’, but a matter of ‘when’ organizations can shift to the cloud. We have entered the cloud 2.0 era, where organizations are adopting a cloud-first or a cloud-only strategy.”

Those who are of a certain age will remember the excitement which was drummed up around the ‘BRIC’ nations. The acronym described the economic potential of slumbering giants (Brazil, Russia, India and China), four countries with large populations that were supposed to be the growth engines for international businesses after growth in domestic markets slowed.

China certainly offered fortunes for those who were strategically savvy enough, while there has been some promise in Russia and Brazil. India was always the nation which undermined the BRICs theory, though it is quickly entering its own digital era.

According to Gartner estimates, public cloud investment from enterprise organisations will increase by 25% over the next 12 months. Software-as-a-Service (SaaS) remains the largest segment, representing 42% of all investments, though this is the same journey many ‘developed’ nations took in bygone years. The team estimates SaaS cloud application services will total $1.4 billion over the next 12 months, an increase of 21%.

Segment                                   2018    2019    2020
Platform-as-a-Service (PaaS)               284     363     461
Software-as-a-Service (SaaS)               900   1,105   1,364
Business-Process-as-a-Service (BPaaS)      172     189     212
Cloud management and security              187     228     274
Infrastructure-as-a-Service (IaaS)         558     744     996

Figures in millions (US$)

As you can see from the figures above, spending has been steadily increasing year-on-year, though considering the size of India as a country, the potential is significant. However, there might be a challenge on the horizon unless all the cogs click into place.
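As a rough sanity check, the growth rates implied by the table can be computed directly. This is a quick back-of-envelope sketch using the figures as quoted; the segment labels are shortened for convenience:

```python
# Gartner India public cloud forecast, US$ millions (2019 and 2020 columns)
segments = {
    "PaaS": (363, 461),
    "SaaS": (1105, 1364),
    "BPaaS": (189, 212),
    "Cloud mgmt & security": (228, 274),
    "IaaS": (744, 996),
}

def yoy_growth(prev, curr):
    """Year-on-year growth as a percentage."""
    return (curr - prev) / prev * 100

for name, (y2019, y2020) in segments.items():
    print(f"{name}: {yoy_growth(y2019, y2020):.1f}%")

total_2019 = sum(v[0] for v in segments.values())
total_2020 = sum(v[1] for v in segments.values())
# Total growth comes out around 26%, in line with Gartner's ~25% estimate
print(f"Total: {yoy_growth(total_2019, total_2020):.1f}%")
```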

CIOs across the market are suggesting there could be consolidation as smaller players are replaced by the global powerhouses of the cloud economy. However, with such potential on offer, money will have to be spent to ensure the digital infrastructure is in place.

This is where India has traditionally struggled. It was a ‘chicken and egg’ situation, with low ROI discouraging infrastructure investment, while inadequate infrastructure hobbled potential profits. This conundrum does seem to be in the past, though there is still plenty of work to do to increase the data centre footprint, as well as ‘fibering up’ the nation to take advantage of future applications, both consumer and enterprise.

Amazon Web Services announced in May it would open a new Availability Zone in the AWS Asia Pacific (Mumbai) region due to customer demand. Microsoft Azure currently has three Availability Zones in the country and has partnered with Reliance Jio to boost its presence. Google is hiring very aggressively in the country, while IBM recently said it was focusing more acutely on SMEs to gain traction.

India still does not compete with the top nations around the world when it comes to digital readiness, but all the pieces do seem to be falling into place. Increased investment in public cloud services and infrastructure is more evidence this country is flying towards the digital economy.

Nokia and Microsoft bundle their cloud offerings

A strategic collaboration between Nokia and Microsoft is banking on companies wanting to buy their cloud hardware and software together.

Here’s the official pitch: “By bringing together Microsoft cloud solutions and Nokia’s expertise in mission-critical networking, the companies are uniquely positioned to help enterprises and communications service providers transform their businesses.” This seems to be mainly about things like private networks, SD-WAN and private cloud, but specific commercial use-cases are thin on the ground at this stage.

“We are thrilled to unite Nokia’s mission-critical networks with Microsoft’s cloud solutions,” said Kathrin Buvac, President of Nokia Enterprise and Chief Strategy Officer. “Together, we will accelerate the digital transformation journey towards Industry 4.0, driving economic growth and productivity for both enterprises and service providers.”

“Bringing together Microsoft’s expertise in intelligent cloud solutions and Nokia’s strength in building business and mission-critical networks will unlock new connectivity and automation scenarios,” said Jason Zander, EVP of Microsoft Azure. “We’re excited about the opportunities this will create for our joint customers across industries.”

This initiative is more than just good PowerPoint and canned quotes, however, with BT announced as its first paying punter. Apparently it’s already offering a managed service that integrates Microsoft Azure cloud and Nokia SD-WAN stuff. Specifically this means Azure vWAN and Nuage SD-WAN 2.0.

Apart from that the joint announcement mainly just bangs on about how great both companies are at this sort of thing – in other words a thinly-veiled sales pitch. The market will decide if it needs this kind of complete virtual WAN package and whether or not Nokia and Microsoft are the best companies to provide it. But there’s no denying BT is a strong first customer win.

China switches on 5G

The world’s largest mobile market has gone live with its 5G networks, and is poised to become the largest 5G market in the world.

Very much in the same way as the South Korean operators did back in April, all three of China’s incumbent telecom operators switched their 5G networks on the same day. The simultaneous inauguration was held in Beijing on Thursday and was graced by the presence of officials of the Ministry of Industry and Information Technology (MIIT), the government body overseeing telecoms.

“The commercial launch of 5G is an ideal opportunity to quicken the steps to infrastructure development, including AI and IoT,” said Chen Zhaoxiong, the vice minister of MIIT, as was reported by Xinhua, one of China’s major official propaganda outlets. Chen also highlighted the importance of 5G working in collaboration with other vertical industries, including manufacturing, transport, energy, and agriculture. He also sees 5G playing a key role in promoting innovations in education, health care, government service, and smart cities.

Since the four licences were awarded in June, over 80,000 5G base stations have been built, half of which belong to China Mobile. The world’s largest mobile operator by subscribers is going to offer 5G services in 50 cities, including all the major cities across the country, according to its press release. The company claimed it has already launched 44 5G devices since it received the licence, 13 of them already in the market (10 smartphones and 3 CPEs).

China Telecom, the world’s largest integrated operator by subscribers, will also start offering 5G services in 50 cities. The names were not spelled out in the press release, but it would be a surprise if they were the same 50 as China Mobile. In addition to talking about new experience offered by 5G in cloud-based gaming and HD video, the operator also stressed 5G+Cloud+AI offers to industry customers, including industrial internet, smart cities, smart medical care, smart education, transport and logistics, and smart energy.

China Unicom did not give a specific number of cities its 5G service will cover, though the company earlier announced it will share RAN with China Telecom in 15 cities and build its own 5G networks in 14 additional cities, as well as extending to 8 other provinces. So the total number of cities covered by Unicom 5G should be comparable with the other two.

Absent from the launch is China Broadcasting Network Corporation Ltd, the licenced greenfield operator.

South Korea is the largest 5G subscriber market so far, but thanks to the sheer size of the Chinese market, even with lower penetration China is expected to overtake South Korea and the US to become the world’s largest 5G market. GSMA, the industry lobby group, estimates China will have 600 million 5G subscribers by 2025, about 40% of the global 5G market.

Amdocs launches ‘future ready’ RevenueONE billing system

Telecoms software vendor Amdocs has unveiled its bid to bring billing into the 5G and cloud era, in the form of RevenueONE.

The Amdocs marketing team saw fit to describe it as ‘game-changing’ in the headline of the press release. What game that is, whether it needs changing and whether or not this launch does so, we’ll leave to others to establish. The top line is that this is a billing system designed to help operators exploit all the new revenue opportunities we’re constantly being told have been generated by 5G and the move to the cloud.

To flesh out the press release we spoke to Ron Porter, Product Marketing Manager at Amdocs. He explained that 5G, IoT, connected environments, etc create all sorts of new billing opportunities for operators, but legacy billing systems aren’t geared to exploit them. A lot of this comes down to the kind of speed and flexibility that comes with having virtualized functions in the cloud, especially the edge. He concluded that the ultimate aim was to offer a billing system that is ‘future ready’.

“Amdocs RevenueONE brings together proven scalability and cloud-native architecture to accelerate the launch of new 5G services, while supporting existing products and offers,” said Anthony Goonetilleke, Amdocs President of Media, Network and Technology. “At its core, the RevenueONE blueprint was built to scale, and was proven to support 200 million subscribers on a single platform. This robust architecture allows CSPs to handle the velocity of new service launches, and the variety of new business models, that will come with 5G, while cutting time to market from days to minutes.

“Our goal was to continue to significantly reduce our hardware footprint while scaling to support the influx of new connected devices and services. Utilizing edge-based architecture to reduce network traffic, we believe RevenueONE will grow with our customers as consumers embrace new business models and services.”

The BSS/billing/digital transformation space is pretty competitive at the moment, with various vendors queueing up to give operators the tools to capitalize on their 5G investments. If products like RevenueONE enable even half of what they promise the onus, as ever, is on operators to adapt the way they do business. 5G is still at an early stage, but the winners of it will surely be those operators that use it as a platform for genuine innovation.

Amazon profits fall and its share price follows

Internet giant Amazon announced strong sales growth but that didn’t translate into profit after it invested heavily in one-day shipping.

The consequent significant year-on-year rise in operating expenses, combined with shrinking margin at AWS, where most of Amazon’s profit comes from, resulted in quarterly operating income declining for the first time in a while. While investors had been warned about the increased overheads, they were apparently even greater than expected, because Amazon’s share price declined 6% on the news.

“We are ramping up to make our 25th holiday season the best ever for Prime customers — with millions of products available for free one-day delivery,” said Jeff Bezos, Amazon founder and CEO. “Customers love the transition of Prime from two days to one day — they’ve already ordered billions of items with free one-day delivery this year.

“It’s a big investment, and it’s the right long-term decision for customers. And although it’s counterintuitive, the fastest delivery speeds generate the least carbon emissions because these products ship from fulfillment centers very close to the customer — it simply becomes impractical to use air or long ground routes. Huge thanks to all the teams helping deliver for customers this holiday.”

As you can see from the table below, Amazon’s total overheads were 14 billion bucks higher in the most recent quarter than they were a year ago. North America is still where most of its sales are and thus where most of the overheads come from too. Profits disproportionately come from its AWS cloud services division, but even there margins are significantly reduced year-on-year.

Amazon has spent its entire history sacrificing profit on the altar of investment, and that seems to have paid off. So it’s hard to read too much into the share price fall other than a realisation among investors that Amazon is serious about this one-day delivery stuff. That will probably pay off in the long term too, and we expect Bezos isn’t very bothered about the short term reaction to his grand plan.

Microsoft revenues surge once again thanks to the cloud

This will officially be the last time we talk about Microsoft’s recovery, as it is unfair to keep framing the firm’s continued progress and dominance of the digital economy as a comeback.

Everyone in the TMT industry knows the trouble Microsoft faced in bygone years, and everyone understands why the firm found itself in that position. But there is no need to discuss this aspect of the business anymore. CEO Satya Nadella has redefined the organization, leaving the troubles in the past. This business is a new beast and the latest financials prove it is one of the dominant forces in the digital economy.

“We are off to a strong start in fiscal 2020, delivering $33 billion in revenue this quarter,” Nadella said. “Our Commercial Cloud business continues to grow at scale as we work alongside the world’s leading companies to help them build their own digital capability.”

Fundamentally, Microsoft is a different business. In the 90s and 00s, Microsoft was defined by its dominance of the PC operating system world. Although this presence still exists today, the focus is now on enterprise customers, and the prospects of the business are perhaps most acutely focused on Azure, the cloud computing unit. The Microsoft of today and the troubled Microsoft of yesteryear are chalk and cheese.

Looking at the financials for the first quarter which were announced following the close of the market yesterday [23 October], they are once again pretty impressive. Total revenues increased 14% year-on-year to $33 billion, while operating income stood at $12.7 billion, up 27% from the same three-month period in 2018.

Revenues at Microsoft Azure increased 59% year-on-year, while the team has stated there has been a ‘material increase’ in the number of $10 million-plus contracts. The cloud is driving Microsoft forward, while the excitement around edge computing opens up new prospects for the business.

This quarter also saw the team open two new datacentre regions, in Germany and Switzerland, taking the total up to 54 worldwide. Microsoft Azure is now available in 140 countries around the world, with the geographical footprint focused in Europe and North America.

Asia is one market where the team could grow further, and this might be a development worth keeping an eye on as more countries travel through the digital transformation journey. Nadella paid homage to a partnership with Indian telco Reliance Jio as a green-shoot of growth in the market.

Alongside the progress which is being made to expand the datacentre footprint of the business, the team is also pointing towards strategic partnerships with the likes of VMWare, Oracle and SAP for the added momentum in the cloud business.

“You’d actually see it in a couple of places [impact of partnerships], not just in Azure, which may in fact be the most logical extension,” said CFO Amy Hood during the earnings call.

“But, at the heart of this is making it easier, faster and more reliable for us to help customers move their estate to the cloud and to migrate that with confidence.”

This is perhaps one of the most exciting aspects of the cloud segment and will not just be limited to the success at Microsoft. There is still a huge amount of growth left to realise.

Many companies around the world will claim to be cultivating a cloud-first mentality, and many of these companies are migrating workloads across to the cloud. However, what has been achieved to date is only a fraction of the total. The cloud has matured, availability is increasing, and prices are decreasing. The likes of Microsoft, Amazon and Google might be hoovering up the profits, but there is still huge potential for growth.

                                          Value           Growth
Total revenue                             $33.1 billion   14%
Operating income                          $12.7 billion   27%
Net income                                $10.7 billion   21%
Productivity and Business Processes unit  $11.1 billion   13%
Intelligent Cloud unit                    $10.8 billion   27%
More Personal Computing unit              $11.1 billion   4%


Google claims quantum computing breakthrough, IBM disagrees

Google says it has achieved ‘quantum supremacy’, as its Sycamore chip performed a calculation, which would have taken the world’s fastest supercomputer 10,000 years, in 200 seconds.

It seems quite a remarkable upgrade, but this is the potential of quantum computing. This is not a step-change in technology, but a revolution on the horizon.

Here, Google is claiming its 53-qubit computer performed a task in 200 seconds which would have taken Summit, a supercomputer IBM built for the Department of Energy, 10,000 years. That said, IBM is disputing the claim, suggesting Google is massively exaggerating how long it would take Summit to complete the same task. After some tweaks, IBM says it would take Summit 2.5 days.

Despite the potential for exaggeration, this is still a breakthrough for Google.

For the moment, it seems to be nothing more than a humble brag. Like concept cars at the Tokyo Motor Show, the purpose is to inflate the ego of Google and create a perception of market leadership in the quantum computing world. Although this is an area which could be critically important for the digital economy in years to come, the technology is years away from being commercially viable.

Nonetheless, this is an impressive feat performed by the team. It demonstrates the value of persisting with quantum computing and will have forward-thinking, innovative data scientists around the world dreaming up possible applications of such power.

At the most basic level, quantum computing is a new model of how to build a computer. The original concept is generally attributed to David Deutsch of Oxford University, who at a conference in 1984, pondered the possibility of designing a computer that was based exclusively on quantum rules. After publishing a paper a few months later, which you can see here if you are brave enough, the race to create a quantum computer began.

Today’s ‘classical’ computers store information in binary, where each bit is either on or off. Quantum computers use qubits, which can be on, off, or both at the same time. This might sound incredibly complicated, but the best way to explain it is to imagine a sphere.

In classical computing, a bit can be represented by the poles of the sphere, with the south pole representing zero and the north pole representing one. In quantum computing, any point on the sphere can represent the state of a qubit. This is achieved through a concept called superposition, which means qubits can represent a one or a zero, or both at the same time. For example, two qubits in a single superposition could represent four different scenarios.
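The ‘four scenarios’ point can be made concrete with a minimal state-vector sketch in NumPy. This is illustrative only, and the variable names are our own, but it shows how two qubits in equal superposition carry one amplitude for each of the four classical outcomes:

```python
import numpy as np

# Single-qubit basis states |0> and |1> as 2-dimensional vectors
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# An equal superposition: both outcomes carry amplitude 1/sqrt(2)
plus = (zero + one) / np.sqrt(2)

# Two qubits combine via the tensor (Kronecker) product, giving
# one amplitude per classical scenario: 00, 01, 10 and 11
state = np.kron(plus, plus)
print(state.shape)          # four amplitudes for two qubits

probs = np.abs(state) ** 2  # measurement probabilities
print(probs)                # each of the four outcomes is equally likely
```

Measuring the pair collapses the superposition to just one of those four outcomes, which is why n qubits can explore 2^n scenarios at once but still only yield one classical answer per run.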

Regardless of whether you understand the theoretical science behind quantum computing, the important takeaway is that it will allow computers to store, analyse and transfer information much more efficiently. As you can see from the claim Google has made, completing a calculation in 200 seconds as opposed to 10,000 years is a considerable upgrade.

This achievement can be described as ‘quantum supremacy’, in that the chip has enabled a calculation which is realistically impossible on classical computing platforms. From IBM’s perspective, this is a step forward, but not ‘quantum supremacy’, if its computer can complete the same task in 2.5 days.
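For a sense of scale, the two competing claims imply very different speedup factors. A quick back-of-envelope calculation, assuming 365-day years:

```python
SECONDS_PER_DAY = 24 * 60 * 60
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY

sycamore = 200  # seconds, Google's claimed Sycamore runtime

# Google's claim: Summit would need 10,000 years for the same task
google_estimate = 10_000 * SECONDS_PER_YEAR
print(f"Speedup vs Google's estimate: {google_estimate / sycamore:,.0f}x")

# IBM's rebuttal: Summit would need 2.5 days
ibm_estimate = 2.5 * SECONDS_PER_DAY
print(f"Speedup vs IBM's estimate: {ibm_estimate / sycamore:,.0f}x")
```

Even on IBM’s more modest numbers, Sycamore was still around a thousand times faster than the world’s most powerful supercomputer; the dispute is over whether the gap is a thousandfold or a billionfold.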

If this still sounds baffling and overly complex, this is because quantum computing is a field of technology only the tiniest of fractions of the world’s population understand. This is cutting-edge science.

“In many ways, the exercise of building a quantum computer is one long lesson in everything we don’t yet understand about the world around us,” said Google CEO Sundar Pichai.

“While the universe operates fundamentally at a quantum level, human beings don’t experience it that way. In fact, many principles of quantum mechanics directly contradict our surface level observations about nature. Yet the properties of quantum mechanics hold enormous potential for computing.”

What is worth taking away here is that understanding the science is not at all important once it has been figured out by people far more intelligent. All normal people need to understand is that this is a technology that will enable significant breakthroughs in the future.

This might sound patronising, but it is not supposed to. Your correspondent does not understand the mechanics of the combustion engine but does understand the journey between London and South Wales is significantly faster by car than on horse.

But what could these breakthroughs actually be?

On the security side, although quantum computing could crack the end-to-end encryption software which is considered unbreakable today, it could theoretically enable the creation of hack-proof replacements.

In artificial intelligence, machine learning is a perfect area for quantum computing to be applied. The idea of machine learning is to collect data, analyse said data and provide incremental improvements to the algorithms which are being integrated into software. Analysing the data and applying the lessons learned takes time, which could be dramatically decreased with the introduction of quantum computing.

Looking at the pharmaceutical industry, in order to create new drugs, chemists need to understand the interactions between various molecules, proteins and chemicals to see if medicines will cure diseases or introduce dangerous side-effects. Due to the eye-watering number of combinations, this takes an extraordinary amount of time. Quantum computing could significantly reduce the time it takes to understand the interactions, but could also be combined with analysing an individual’s genetic make-up to create personalised medicines.

These are three examples of how quantum computing could be applied, but there are dozens more. Weather forecasting could be improved, climate change models could be more accurate, or traffic could be better managed in city centres. As soon as the tools are available, innovators will come up with the ideas of how to best use the technology, probably coming up with solutions to challenges that do not exist today.

Leading this revolutionary approach to computing is incredibly important for any company which wants to dominate the cloud industry in the futuristic digital economy, which is perhaps the reason IBM felt it was necessary to dampen Google’s celebrations.

“Building quantum systems is a feat of science and engineering and benchmarking them is a formidable challenge,” IBM said on its own blog.

“Google’s experiment is an excellent demonstration of the progress in superconducting-based quantum computing, showing state-of-the-art gate fidelities on a 53-qubit device, but it should not be viewed as proof that quantum computers are “supreme” over classical computers.”

Google measured the success of its own quantum computer against IBM’s Summit, a supercomputer which is believed to be the most powerful in the world. By altering the way Summit approaches the same calculation Google used, IBM suggests Summit could come to the same conclusion in 2.5 days rather than 10,000 years.

Google still has the fastest machine, but according to IBM the speed increase does not deserve the title of ‘quantum supremacy’. It might not be practical to ask a computer to process a calculation for 2.5 days, but it is not impossible, therefore the milestone has not been reached.

What is worth noting is that a pinch of salt should be taken with both the Google and IBM claims. These are companies who are attempting to gain the edge and undermine a direct rival. There is probably some truth and exaggeration to both statements made.

And despite this being a remarkable breakthrough for Google, it is of course way too early to get excited about the applications.

Not only is quantum computing still completely unaffordable for almost every application data scientists are dreaming about today, but the calculation itself was very simple. Drug synthesis, or traffic management where every traffic signal is attempting to understand the route of every car in a major city, are much more complicated problems.

Scaling these technologies so they are affordable and feasible for commercial applications is still likely to be years away, but as Bill Gates famously stated: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”

Microsoft might be toying with European data protection compliance

The European Data Protection Supervisor has raised ‘serious concerns’ over whether Microsoft is compliant with data protection regulations.

The contracts in question are between the software giant and various European Union institutions which are making use of said products. The central issue is whether contractual terms are compliant with data protection laws intended to protect individual rights across the region from foreign bodies which do not hold data protection to the same standards.

“Though the investigation is still ongoing, preliminary results reveal serious concerns over the compliance of the relevant contractual terms with data protection rules and the role of Microsoft as a processor for EU institutions using its products and services,” a statement reads.

“Similar risk assessments carried out by the Dutch Ministry of Justice and Security confirmed that public authorities in the Member States face similar issues.”

The preliminary findings from the European Data Protection Supervisor follow on from investigations taking place in the Netherlands and also changes to the Microsoft privacy policies for its VoIP product Skype and AI assistant Cortana. The changes were seemingly a knee-jerk reaction to reports contractors were listening to audio clips to improve translations and the accuracy of inferences.

What is worth noting is that Microsoft is not the only company which has been bending the definition of privacy with regard to contractors and audio clips. Amazon and Google have also been dragged into the hazy definition of privacy and consent.

The issue which seems to be at the heart of this investigation is one of arm’s length. While government authorities and agencies might hand-over responsibility of data protection and privacy compliance to the cloud companies, the European Data Protection Supervisor is suggesting more scrutiny and oversight should be applied by said government parties.

Once again, the definition and extent of privacy principles are causing problems. Europe takes a much more stringent stance on the depth of privacy, as well as the rights which are afforded to individuals, than other regions around the world. Ensuring the rights of European citizens are extended elsewhere was one of the primary objectives of the GDPR, though it seems there are still teething problems.

“When using the products and services of IT service providers, EU institutions outsource the processing of large amounts of personal data,” the statement continues.

“Nevertheless, they remain accountable for any processing activities carried out on their behalf. They must assess the risks and have appropriate contractual and technical safeguards in place to mitigate those risks. The same applies to all controllers operating within the EEA.”

One development which could result in additional scrutiny is The Hague Forum, an initiative to create standardised contracts for European member states which meet the baseline data protection and privacy conditions set forward. The European Data Protection Supervisor has encouraged all European institutions to join the Forum.

Although GDPR was seen as a headache for many companies around the world, such statements from the European Data Protection Supervisor prove this is not an area which can simply be addressed once and then forgotten. GDPR was supposed to set a baseline, and there will be more regulation to build further protections. Perhaps the fact that Microsoft is seemingly non-compliant with current regulations justifies the introduction of more rules and red-tape.

Nvidia takes 5G to the edge with help from Ericsson and Red Hat

Graphics chip maker Nvidia has unveiled its EGX Edge Supercomputing Platform that is designed to boost 5G, IoT and AI processing at the edge of the network.

Nvidia has long been the market leader in GPUs (graphics processing units), which has enabled it to get a strong position in supercomputing, where the parallel processing qualities of GPUs come in especially handy. This EGX initiative seems to be Nvidia’s attempt to translate that position from datacentres to edge computing.

“We’ve entered a new era, where billions of always-on IoT sensors will be connected by 5G and processed by AI,” said Jensen Huang, Nvidia CEO. “Its foundation requires a new class of highly secure, networked computers operated with ease from far away. We’ve created the Nvidia EGX Edge Supercomputing Platform for this world, where computing moves beyond personal and beyond the cloud to operate at planetary scale.”

There seems to be a fair bit of support for this new platform, with a bunch of companies and even a couple of US cities saying they’re already involved. “Samsung has been an early adopter of both GPU computing and AI from the beginning,” said Charlie Bae, EVP of foundry sales and marketing at Samsung Electronics. “NVIDIA’s EGX platform helps us to extend these manufacturing and design applications smoothly onto our factory floors.”

“At Walmart, we’re using AI to define the future of retail and re-think how technology can further enhance how we operate our stores,” said Mike Hanrahan, CEO of Walmart Intelligent Retail Lab. “With NVIDIA’s EGX edge computing platform, Walmart’s Intelligent Retail Lab is able to bring real-time AI compute to our store, automate processes and free up our associates to create a better and more convenient shopping experience for our customers.”

On the mobile side Ericsson is getting involved to build virtualized 5G RANs on EGX. As you would expect the reason is all about being able to introduce new functions and services more easily and flexibly. More specifically Ericsson hopes the platform will make virtualizing the complete RAN solution cheaper and easier.

“5G is set to turbocharge the intelligent edge revolution,” said Huang. “Fusing 5G, supercomputing, and AI has enabled us to create a revolutionary communications platform supporting, someday, trillions of always-on, AI-enabled smart devices. Combining our world-leading capabilities, Nvidia and Ericsson are helping to invent this exciting future.”

On the software side a key partner for all this virtualized 5G fun will be Red Hat, which is getting its OpenShift Kubernetes container platform involved. It will combine with Nvidia’s own Aerial software developer kit to help operators to make the kind of software-defined RAN tech that can run on EGX.

“The industry is ramping 5G and the ‘smart everything’ revolution is beginning,” said Huang. “Billions of sensors and devices will be sprinkled all over the world enabling new applications and services. We’re working with Red Hat to build a cloud-native, massively scalable, high-performance GPU computing infrastructure for this new 5G world. Powered by the Nvidia EGX Edge Supercomputing Platform, a new wave of applications will emerge, just as with the smartphone revolution.”

Things seem to have gone a bit quiet on the virtualization front, with NFV, SDN, etc having apparently entered the trough of disillusionment. Nvidia is a substantial cloud player these days, however, and judging by the level of support this new initiative has, EGX could be a key factor in moving the telecoms cloud onto the slope of enlightenment.