The reality of mobile SD-WAN – the missing link for enterprise 5G?

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. This is the second of a two-part series in which Simon Pamplin, EMEA Technical Sales Director for Silver Peak, looks at some of the enterprise benefits of the latest generations of wireless networking technology.

This is the second of a two-part article series that explores SD-WAN and the future of networking in the 5G era. The first looked at how established 4G LTE connectivity, partnered with software-defined WANs (SD-WANs), has contributed to changing the way users connect to applications, particularly for ‘on-the-go’ requirements and in hard-to-reach locations.

5G: forging the new ultra-high speed, hyperconnected world

Following the 5G spectrum auction in the UK back in April of 2018, telecommunications providers are now racing to roll out the fifth generation of mobile wireless technology to meet today’s explosive bandwidth and network connectivity demands. 5G is poised to revolutionise several industries by bringing significantly faster connections, shorter delays and increased connectivity for users. It will aid the expansion of IoT, creating a virtual network of ultra-high-speed connections across multiple devices.

According to Gartner, two-thirds of large organisations have plans to deploy 5G by 2020. However, a full end-to-end deployment will take many more years. The firm also projects that communications service providers’ lack of readiness to meet enterprise demands in time will be a major issue.

Looking at the near term, service providers have to provide the means to enable use cases – such as IoT communications, enhanced mobile broadband, fixed wireless access and high-performance edge analytics – without the benefit of an end-to-end 5G network. This is all while contending with sky-high user expectations.

SD-WAN is one of the enabling technologies that will help service providers deliver a higher quality of network experience, tailored to the customer’s needs, while managing the transition to a complete end-to-end 5G infrastructure.

5G: on the edge

According to Gartner, the number of IoT devices is set to rise to more than 20 billion by 2020. IoT connectivity across more and more devices will drive the processing of high volumes of data at high speed – one of the core promises of 5G. This influx of data must be ingested and processed both in real time, and as close to the source as possible, ultimately driving the need for edge computing.

While 5G provides higher bandwidth, it is more limited in range. It is anticipated that 5G networks will be powered by hundreds of thousands of small cells. Denser networks of cells will be more difficult for operators to operate, manage and maintain. As such, the optimisation of these networks will be key to delivering the best possible network performance and maintaining the highest quality of experience for end users.

The emergence of 5G will not only change end users’ expectations when it comes to always-on connectivity and low latency, it will also transform the way enterprises manage their networks. Strong demands for real-time network monitoring across transport connectivity, together with traffic management optimisation, will drive the need for automation.

SD-WAN can hold the 5G ends together but what features will be most valuable?

SD-WAN connects users to applications securely and directly using any combination of underlying transport, including MPLS, 4G LTE and internet. As companies roll out 5G in the UK, businesses with SD-WANs deployed on the network will have the ability to transition key parts or locations over to the latest high-speed, high-performance connectivity.

An SD-WAN platform that enables automation will help service providers to easily connect to and integrate across all the different compute edges required to optimise the traffic and management of 5G cells. This will enable a seamless transition towards a full 5G infrastructure by managing whatever transport is available across the edge, and leveraging 5G transport for those critical applications that require ultra-low latency and higher speeds.

To guarantee the highest quality of experience for users, service providers need to evaluate SD-WAN vendors. The best vendors and solutions will be those able to offer advanced features. One of these is granular, intelligent, application-driven routing. In layman’s terms, this means the SD-WAN can automatically prioritise high-bandwidth or business-critical traffic (like video streaming) onto a 5G cell and manage failovers, while lower-bandwidth traffic is routed over another available transport (such as LTE or broadband internet).
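To make that routing behaviour a little more concrete, the sketch below shows how such a policy might look in Python; the application names, link labels and thresholds are illustrative assumptions rather than any vendor’s actual configuration.

```python
# Minimal sketch of application-driven routing with failover.
# Link names, application classes and thresholds are hypothetical,
# not any specific SD-WAN vendor's API.

BUSINESS_CRITICAL = {"video_conferencing", "pos_transactions", "voip"}

LINKS = {
    "5g":        {"up": True, "capacity_mbps": 900},
    "lte":       {"up": True, "capacity_mbps": 80},
    "broadband": {"up": True, "capacity_mbps": 150},
}

def pick_transport(app, bandwidth_mbps):
    """Steer business-critical or high-bandwidth flows to 5G,
    falling back to LTE or broadband when 5G is unavailable."""
    prefer_5g = app in BUSINESS_CRITICAL or bandwidth_mbps > 50
    if prefer_5g and LINKS["5g"]["up"]:
        return "5g"
    # Failover: choose the healthiest remaining transport by capacity.
    candidates = [name for name, link in LINKS.items() if link["up"] and name != "5g"]
    if not candidates:
        raise RuntimeError("no transport available")
    return max(candidates, key=lambda name: LINKS[name]["capacity_mbps"])

print(pick_transport("video_conferencing", 25))   # -> "5g"
LINKS["5g"]["up"] = False                          # simulate a 5G cell outage
print(pick_transport("video_conferencing", 25))   # -> "broadband"
```

In a real deployment the application classification would typically come from deep packet inspection or first-packet identification rather than a static table.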

Moreover, centralised orchestration and management capabilities can facilitate easier operation, management and maintenance of edges and 5G cells by intelligently rerouting traffic during cell provisioning or upgrades. It also enables faster policy-based provisioning of WAN services to support any device – a must for IoT.

With businesses around the world ramping up cyber defences, an SD-WAN that also unifies security features with business intent networking is more favourable. Such platforms enable the centralised enforcement of granular, application-driven security policies by identifying, classifying and automatically steering traffic to the right security services without compromising either performance or cost.

Enterprises expect the quality of service (QoS) their applications demand, and SD-WAN solutions with virtual WAN overlays allow for a more efficient and flexible allocation of network resources. Similarly, 5G networks rely on network slicing, where each slice receives a unique set of optimised resources and network topology. By using both technologies together, service providers can steer mission-critical traffic to the 5G network, where it can be isolated to a particular slice depending on the specific application requirements.
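As a rough illustration of how overlay traffic might be mapped onto slices, the following sketch tags each hypothetical SD-WAN overlay with a slice profile; the slice identifiers, latency figures and application groupings are invented for the example and do not reflect a 3GPP or vendor schema.

```python
# Hypothetical mapping of SD-WAN overlays onto 5G network slices.
# Slice IDs, latency targets and application groups are illustrative only.

SLICE_PROFILES = {
    "urllc": {"latency_ms": 5,   "isolation": "dedicated"},   # mission-critical
    "embb":  {"latency_ms": 30,  "isolation": "shared"},      # high-bandwidth
    "miot":  {"latency_ms": 100, "isolation": "shared"},      # massive IoT
}

OVERLAY_TO_SLICE = {
    "factory_control":  "urllc",
    "branch_video":     "embb",
    "sensor_telemetry": "miot",
}

def slice_for_overlay(overlay):
    """Return the slice profile an overlay's traffic should be steered into."""
    slice_id = OVERLAY_TO_SLICE.get(overlay, "embb")  # default to the best-effort slice
    return {"slice": slice_id, **SLICE_PROFILES[slice_id]}

print(slice_for_overlay("factory_control"))
# {'slice': 'urllc', 'latency_ms': 5, 'isolation': 'dedicated'}
```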

Lastly, as SD-WAN continues to evolve, emerging technologies will be incorporated to further enhance the user experience. Today, SD-WAN solutions that utilise machine learning separate themselves from the crowd. These can automatically adapt to varying network conditions in real time and provide optimal routing to the edges and the 5G small cells.

5G & SD-WAN: powering forward-thinking businesses

The adoption of 5G and edge computing will drive higher expectations from end users and enterprises for always-on, high-performing networks and applications. The initial success of 5G deployments will demand an automated, self-driving wide area network foundation with underlay intelligence – such as that offered by SD-WAN – that delivers the highest quality of experience to users. Additionally, advanced business-driven SD-WAN platforms will give service providers ways to accelerate new revenue streams from 5G-enabled managed services, rather than treating 5G as just a transport connection.

 

Simon Pamplin is the EMEA Technical Sales Director for Silver Peak and a regular speaker at events on topics ranging from the latest storage technologies and server virtualisation to the current shift in data networking towards SD-WAN, as well as the latest developments in the technology. With over 20 years’ experience in enterprise IT, Simon has worked for IP, SAN and hyper-convergent companies and is driven by new technology and the business benefits it can bring.

The reality of mobile SD-WAN – what 4G LTE made possible

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. This is the first of a two-part series in which Simon Pamplin, EMEA Technical Sales Director for Silver Peak, looks at some of the enterprise benefits of the latest generations of wireless networking technology.

5G, also known as the fifth generation of mobile wireless technology, is one of the hottest topics in wireless circles today. Indeed, you can’t throw a stone without hitting a plethora of titles, potential use cases and detailed explanations about 5G. While telecommunications providers are in a heated competition to roll out 5G, it is important to reflect on current 4G LTE (long term evolution) business solutions as a preview of what has been learned and what’s possible.

At the same time, the enterprise has experienced its own networking revolution. As cloud computing has become the norm, and more applications and services have migrated for enterprise convenience and flexibility, IT departments have realised that traditional wide area network (WAN) architectures – utilising multiprotocol label switching (MPLS) circuits and conventional routers – cannot keep up. As such, to achieve the highest levels of application performance and security, many organisations have turned to software-defined WAN (SD-WAN), the networking technology that connects users directly and securely to applications using any underlying transport, including 4G and broadband internet.

SD-WAN enables enterprises to shift to a business-first networking model, where the network enables the business, rather than the business conforming to the constraints of existing WAN approaches. Instead of being a constraint, the WAN becomes a business accelerant that is fully automated and continuous, giving every application the resources it truly needs, while delivering 10 times the bandwidth for the same budget – ultimately achieving the highest quality of experience to users and IT alike.

This is part one of a two-part article series that will explore the effect of 4G and 5G on enterprise networking, as well as the SD-WAN journey through the evolution of these wireless technologies.

Mobile SD-WAN is a reality

4G LTE commercialisation is continuing to expand. According to the GSM (Groupe Spéciale Mobile) Association, 710 operators have rolled out 4G LTE in 217 countries, reaching 83 percent of the world’s population. The evolution of 4G is transforming the mobile industry and is setting the stage for the advent of 5G.

Mobile connectivity is increasingly integrated with SD-WAN, alongside MPLS and broadband WAN services. This is because 4G LTE represents a very attractive transport alternative, as a backup or even an active member of the WAN transport mix for connecting users to critical business applications. In some cases, 4G LTE might be the only choice in locations where fixed lines are not available or reachable. Furthermore, an SD-WAN can optimise 4G LTE connectivity, and bring new levels of performance and availability to mobile-based business use cases by bonding multiple 4G LTE connections to deliver the highest levels of network and application performance.

Increasing application performance and availability with 4G LTE

Best-in-class SD-WAN solutions enable customers to incorporate one or more low-cost 4G LTE services into the WAN transport mix. Indeed, all the capabilities of the SD-WAN platform – including packet-based link bonding, dynamic path control and path conditioning – can be supported across multiple LTE links. This ensures always-consistent, always-available application performance even in the event of an outage or degraded service.
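The sketch below illustrates the general idea behind packet-based link bonding – spreading packets across two LTE links in proportion to their measured quality. The link names and loss figures are assumptions made for the example; it is not Silver Peak’s actual algorithm.

```python
import random

# Simplified illustration of packet-based link bonding across two LTE links.
# Link names and loss figures are hypothetical; this is not any vendor's
# actual bonding algorithm.

links = {
    "lte_carrier_a": {"loss_pct": 1.0},
    "lte_carrier_b": {"loss_pct": 4.0},
}

def bonding_weights(links):
    """Weight each link by its estimated delivery probability."""
    scores = {name: max(0.0, 100.0 - link["loss_pct"]) for name, link in links.items()}
    total = sum(scores.values())
    return {name: score / total for name, score in scores.items()}

def schedule_packets(n_packets):
    """Distribute packets across the bonded links in proportion to the weights."""
    weights = bonding_weights(links)
    names = list(weights)
    counts = {name: 0 for name in names}
    for _ in range(n_packets):
        chosen = random.choices(names, weights=[weights[n] for n in names])[0]
        counts[chosen] += 1
    return counts

print(schedule_packets(1000))  # split roughly in proportion to link quality
```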

With an advanced business-driven SD-WAN edge platform, IT can also incorporate sophisticated network address translator (NAT) traversal technology to eliminate the requirement for provisioning the LTE service with extra-cost static IP addresses. Holistic solutions offer management software that enables the prioritisation of LTE bandwidth usage based on branch and application requirements – active-active or backup-only. This kind of solution is ideally suited toward retail point-of-sale and other deployment use cases where always-available WAN connectivity is critical for the business.

Mobile SD-WAN: innovative connectivity solutions to real world problems

An example of an innovative mobile SD-WAN service is swyMed’s DOT Telemedicine Backpack, powered by an SD-WAN hardware platform. This integrated telemedicine solution enables emergency services first responders to connect to doctors and communicate patient vital statistics using real-time video anywhere, anytime, greatly improving and expediting care for emergency patients. Using a lifesaving backpack provisioned with two LTE services from different carriers, the SD-WAN can continuously monitor the underlying 4G LTE services for packet loss, latency and jitter. In the case of transport failure or brownout, the SD-WAN automatically initiates a sub-second failover so that voice, video and data connections continue without interruption over the remaining active 4G service. By bonding the two LTE links together with the SD-WAN, swyMed can achieve an aggregate signal quality in excess of 90 percent, bringing mobile telemedicine to areas where poor signal strength would previously have made it impossible.
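The failover behaviour described above can be pictured as a small monitoring loop that polls loss, latency and jitter on the active link and switches to the standby link as soon as thresholds are breached. The thresholds, polling interval and simulated probe below are assumptions for illustration, not swyMed’s or Silver Peak’s real parameters.

```python
import random
import time

# Illustrative health-check and sub-second failover loop across two LTE links.
# Thresholds, poll interval and the simulated probe are assumptions for the
# example, not swyMed's or Silver Peak's actual parameters.

THRESHOLDS = {"loss_pct": 5.0, "latency_ms": 300.0, "jitter_ms": 50.0}
POLL_INTERVAL_S = 0.2  # probe several times a second to allow sub-second failover

def probe(link):
    """Simulated measurement; a real SD-WAN would inject test packets on the link."""
    degraded = link == "lte_carrier_a" and random.random() < 0.3
    return {
        "loss_pct": 20.0 if degraded else 0.5,
        "latency_ms": 600.0 if degraded else 80.0,
        "jitter_ms": 90.0 if degraded else 10.0,
    }

def healthy(metrics):
    return all(metrics[key] <= limit for key, limit in THRESHOLDS.items())

def failover_loop(links, cycles=20):
    active = links[0]
    for _ in range(cycles):
        metrics = probe(active)
        if not healthy(metrics):
            standby = next((l for l in links if l != active and healthy(probe(l))), active)
            print(f"brownout on {active}, failing over to {standby}: {metrics}")
            active = standby
        time.sleep(POLL_INTERVAL_S)

failover_loop(["lte_carrier_a", "lte_carrier_b"])
```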

Prepare for the 5G future

The adoption of 4G LTE is already a reality. As well as evangelising SD-WAN to end-users, service providers have a vital and value-added role in the design, installation, deployment, repair, and ongoing monitoring of managed SD-WAN services. Indeed, service providers are already taking advantage of the distinct benefits of SD-WAN to offer managed SD-WAN services that leverage flexible and mobile 4G LTE to their customers.

As the race for 5G gains momentum in the UK – with it expected to be available in multiple cities this year – service providers will no doubt look for ways to drive new revenue streams to capitalise on their initial investments. The next article of this two-part series will discuss the rise of 5G, and how SD-WAN will help service providers to transition from 4G to 5G, as well as enable the monetisation of a new wave of managed 5G services.

 

Simon Pamplin is the EMEA Technical Sales Director for Silver Peak and a regular speaker at events on topics ranging from the latest storage technologies and server virtualisation to the current shift in data networking towards SD-WAN, as well as the latest developments in the technology. With over 20 years’ experience in enterprise IT, Simon has worked for IP, SAN and hyper-convergent companies and is driven by new technology and the business benefits it can bring.

Q&A with Sunil Lingayat, Chief of Cyber Strategy and Technology at T-Mobile

Sunil Lingayat leads the cybersecurity strategy, architecture and technology functions for T-Mobile, and is responsible for driving next-generation cyber strategies and capabilities and positioning products and services for an effective cyber resilience posture. The Big 5G Event team interviewed Sunil ahead of the event to gain a sneak peek of what we can expect at our upcoming conference.

What are the unique security requirements for 5G networks?

At a high level, there are two primary drivers of unique security requirements for 5G networks. First is the use of COTS technologies and open architectures: distributed architectures, disaggregation at various layers of the stack, an open service-based architecture (HTTP/2 and JSON), and so on. Second is the exponential growth in the number of devices (e.g. IoT), higher-business-value use cases, the need for privacy-by-design and safety, low latency, and an order of magnitude higher throughput. Both of these aspects lead to (1) an increased attack surface, (2) susceptibility to a broader set of established attacks and exploits, and to higher-tier threat actors, and, more importantly, (3) traditional security architectures and controls that will not work. All of this contributes to unique security requirements for 5G in comparison with earlier networks – such as the need for use-case-driven security, layered security, security automation and cyber resilience.

What will be the unique security considerations for specialized 5G use cases?

As per the ITU, 5G is expected to support three different families of use cases with somewhat conflicting requirements on dimensions such as latency and integrity. These are driving the need for slicing, or a network-of-networks architecture. Within each use case type, there is also the dimension of privacy: some use cases require very stringent privacy (e.g. HIPAA), whereas for others integrity and latency are critical. It is important that security controls are geared towards each use case. One size fits all will not work, as it would make services very expensive, fragile and, in effect, non-operational.

How can a dynamic security architecture be ensured for each network slice?

Software-defined security (SDS) becomes very important for achieving dynamic security. Security orchestration integrated with service orchestration is essential. Security function virtualization is another approach, aligned with the VNF and NFVi architectures. All of this needs automation at scale, built into the architecture from the very beginning. Machine learning and AI have to be incorporated and fine-tuned for a “whitelist” security model and behaviour monitoring.
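As a loose illustration of the “whitelist” model described above, the sketch below permits only flows that match an explicitly learned baseline and steers everything else towards inspection; the baseline entries and flow fields are hypothetical, not drawn from any T-Mobile or vendor implementation.

```python
# Loose illustration of an allowlist ("whitelist") security model for a slice.
# Baseline entries and flow fields are hypothetical examples.

ALLOWED_BASELINE = {
    ("telemetry_agent", "mgmt.example.internal:8883"),
    ("firmware_update", "updates.example.internal:443"),
}

def evaluate_flow(app, destination):
    """Permit only flows matching the learned baseline; flag everything else."""
    if (app, destination) in ALLOWED_BASELINE:
        return "permit"
    return "steer_to_inspection"  # e.g. hand off to a virtual security function

print(evaluate_flow("telemetry_agent", "mgmt.example.internal:8883"))  # permit
print(evaluate_flow("telemetry_agent", "203.0.113.7:9999"))            # steer_to_inspection
```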

How can service providers adequately support NFV/SDN security requirements?

Virtualization is not new as a technology; much innovation and many lessons have already come from the cloud industry. Cyber 2.0 cyber resilience design principles – such as autonomic security, least privilege, privilege escalation, dynamic alignment and dynamic positioning – have to be designed in. Adoption of a de-perimeterized security strategy and architecture is crucial, so security is not tied to the perimeter or zone. Security has to be dynamic. In fact, SDN/NFV can be used effectively to enhance traditional static (host- and network-based) security and make 5G services cyber-resilient.

 

You can come face to face with Sunil Lingayat, Chief of Cyber Strategy and Technology at T-Mobile, at this year’s Big 5G Event, May 6-8, 2019, in Denver, CO.

Ericsson and Intel partner for 5G cloud platform

Ericsson and Intel have announced a new partnership aimed at aligning the Swedish vendor’s efforts in software-defined infrastructure with Intel’s Rack Scale Design.

The resulting hardware management platform will be designed for telcos targeting 5G, NFV, and distributed cloud. In theory, the pair aims to create a common managed hardware pool for all workloads that dynamically scales. It’s the scalable and affordable dream telcos have been promised for years.

The duo has said the new tie-up will allow telcos to take advantage of multi-vendor hardware options, Ericsson’s end-to-end software solutions, and Intel’s latest architectural solutions.

“We have a long history of successful collaboration with Intel,” said Lars Mårtensson, Head of Cloud & NFV Infrastructure for Digital Services at Ericsson. “This new collaboration will focus on software in addition to hardware, and we see it as truly transformative for service providers’ ability to successfully deploy open cloud and NFV infrastructure, from centralised data centres to the edge. Intel’s and Ericsson’s joint efforts significantly strengthen the competitiveness and roadmap of the Ericsson Software Defined Infrastructure offering.”

“5G will be transformative, accelerating today’s applications and triggering a wave of new usages and edge-based innovation,” said Sandra Rivera, SVP of the Network Platform Group at Intel. “Our infrastructure manageability collaboration with Ericsson will help communications service providers remove deployment barriers, reduce costs, and deliver new 5G and edge services with cloudlike speed on a flexible, programmable and intelligent network.”

As part of the tie-up, the Ericsson SDI Manager software and Intel RSD reference software will be converged, though the pair reiterated that full backward compatibility would be maintained for existing customers. Any new solutions developed going forward will be delivered on subsequent Ericsson hardware platforms, as well as on Intel server products sold through third parties and in other industry segments.

Are you ready to look at 6G?

We can hear the groans already, but we’re going to do it anyway. Let’s have a look at what 6G could possibly contribute to the connected economy.

Such is our desire for progress, we haven’t even launched 5G but the best and brightest around are already considering what 6G will bring to the world. It does kind of make sense though, to avoid the dreaded staggering of download speeds and the horrific appearance of buffering symbols, the industry has to look far beyond the horizon.

If you consider the uphill struggle it has been to get 5G to this point, and we haven’t even launched glorious ‘G’ properly, how long will it take before we get to 6G? Or perhaps a better question is how long before we actually need it?

“5G will not be able to handle the number of ‘things’ which are connected to the network in a couple of years’ time,” said Scott Petty, CTO of Vodafone UK. “We need to start thinking about 6G now and we have people who are participating in the standards groups already.”

This is perhaps the issue which we are facing in the future: the sheer volume of ‘things’ which will be connected to the internet. As Petty points out, 5G is about being bigger, badder and leaner. Download speeds will be faster, reliability will be better and latency will be almost non-existent, but the weight of ‘things’ will almost certainly have an impact. Today’s networks haven’t been built with this in mind.

Trying to find consensus on the growth of IoT is somewhat of a difficult task, such is the variety of predictions. Everyone predicts the same thing – the number of devices will grow in extraordinary fashion – but the figures vary by billions.

Using Ericsson’s latest mobility report, the team estimates cellular IoT connections will reach 4.1 billion in 2024, of which 2.7 billion will be in North East Asia. This is a huge number, and growth will only accelerate year on year. But here is the thing: we’re basing these judgements on what we know today; the number of IoT devices will depend on new products, services and business models which will appear when the right people have the 5G tools to play around with. Who knows what the growth could actually be?

IoT growth

Another aspect to consider is the emergence of new devices. As it stands, current IoT devices deliver such a minor slice of the total cellular traffic around the world that it’s not much of a consideration; however, with new use cases and products for areas such as traffic safety, automated vehicles, drones and industrial automation, the status quo will change. As IoT becomes more commonplace and complicated, data demands might well increase, adding to network strain.

Petty suggests this will be the massive game changer for the communications industry over the next few years and will define the case for 6G. But who knows what the killer use case will be for 5G, or what needs will actually push the case for the next evolution of networks. That said, more efficient use of spectrum is almost certainly going to be one of the parameters. According to Petty, this will help with the tsunami of things, but there is a lot of new science which will have to be considered.

Then again, 6G might not be measured under the same requirements as today…

Sooner or later the industry will have to stop selling itself under the ‘bigger, badder, faster’ mantra, as speeds will become irrelevant. If you have a strong and stable 4G connection today, there isn’t much you can’t do. Few applications or videos that are available to the consumer require 5G to function properly, something which telco marketers will have to adapt to in the coming years as they try to convince customers to upgrade to 5G contracts.

4G, and arguably today’s vision of 5G, has always been about making the pipe bigger and faster, because those were the demands of the telcos trying to meet the demands of the consumer. 6G might be measured under different KPIs – for example, energy efficiency.

According to Alan Carlton, Managing Director of InterDigital’s European business, the drive towards more speed and more data is mainly self-imposed. The next ‘G’ can be defined as what the industry wants it to be. The telcos would have to think of other ways to sell connectivity services to the consumer, but they will have to do that sooner or later.

The great thing about 5G is that we are barely scratching the surface of what it is capable of. “We’re not even at 5.0G yet,” said Carlton. “And this is part of the confusion.”

What 5G is nowadays is essentially LTE-A Pro. We’re talking about 256-QAM and Massive MIMO, but that is not really a different conversation. With Release 16 on the horizon and future standards groups working on topics such as virtualisation, mmWave and total cost of ownership, future phases of 5G will promise so much more.

The next step for Carlton is not necessarily making everything faster, or more reliable or lower latency, but the next ‘G’ could be all about ditching the wires. Fibre is an inflexible commodity, and while it might be fantastic, why do we need it? Why shouldn’t the next vision of connectivity be one where we don’t have any wires at all?

Carlton’s approach to the future of connectivity is somewhat different to the norm. This is an industry which is fascinated by the pipes themselves and delivering services faster, but these working groups and standards bodies are driving change for the benefit of the industry. It doesn’t necessarily have to be about making something faster so you can charge more; a change to the status quo that benefits the wider ecosystem is enough.

Coming back to the energy efficiency idea, this is certainly something which has been suggested elsewhere. IEEE has been running a series of conferences in California addressing this very issue, as delivering 1000X more data is naturally going to consume more energy to start with. It probably won’t be 1000X more expensive, but it is incredibly difficult to predict what future energy consumption needs will be. Small cells do not consume as much energy as traditional sites, but there will need to be a lot more of them to meet demand. There are a lot of different elements to consider here (for example environment or spectrum frequency), but again, this is a bit of an unknown.

Perhaps this is an area where governments will start to wade in? Especially in the European and North American markets which are more sensitive to environmental impacts (excluding the seemingly blind Trump).

Echoing Petty’s point from earlier, we don’t necessarily know the specifics of how the telco industry is going to be stressed and strained in six or seven years’ time. These pressures will form the catalyst for the evolution from 5G to 6G, and it might well be a desire for more energy-efficient solutions, or it might well be a world free of wires.

Moving across the North Sea, 6G has already captured the attention of those in the Nordics.

Back in April 2018, the Academy of Finland announced the launch of ‘6Genesis’, an eight-year research programme to drive the industry towards 6G. Here, the study groups will start to explore technologies and services which are impossible to deliver in today’s world, and much of this will revolve around artificial intelligence.

Just across the border in Sweden, these new technologies are capturing the attention of Ericsson. According to Magnus Frodigh, Head of Ericsson Research, areas like quantum computing, artificial intelligence and edge computing are all making huge leaps forward, progress which will only accelerate with improved connectivity. These are the areas which will define the next generation, and what can be achieved in the long run.

“One of the new things to think about is the combination of unlimited connectivity as a resource, combined with low latency, more powerful computing,” said Frodigh. “No-one really knows how this is going to play out, but this might help define the next generation of mobile.”

Of course, predicting 6G might be pretty simple. In a couple of years’ time, perhaps we will all be walking around with augmented reality glasses on while holographic pods replace our TVs. If such use cases exist, perhaps the old ‘bigger, badder, faster’ mantra of the telco industry will be called upon once again. One group which is counting on this is the EU-funded Terranova project, which is currently working on solutions to allow network connections in the terahertz range, providing speeds of up to 400 Gbps.

Another area to consider is the idea of edge computing and the pervasiveness of artificial intelligence. According to Carlton (InterDigital), AI will be everywhere in the future, with intelligence embedded in almost every device. This is the vision of the intelligent economy, but for AI to work as promised, latency will have to be so much lower than we can even consider delivering today. This is another demand of future connectivity, but without it the intelligent economy will be nothing more than a shade of what has been promised.

And of course, the more intelligence you put on or in devices, the greater the strain on the components. Eventually more processing power will be moved off the devices and into the cloud, building the case for distributed computing and self-learning algorithms hosted on the edge. It is another aspect which will have to be considered, and arguably 5G could satisfy some of these demands, but who knows how quickly and broadly this field will accelerate.

Artificial intelligence and the intelligent economy have the potential to become a catalyst for change, forcing us to completely rethink how networks are designed, built and upgraded. We don’t know for sure yet, but most would assume the AI demands of the next couple of years will strain the network in the same way video has stressed 4G.

Who knows what 6G has in store for us, but here’s to hoping 5G isn’t an over-hyped dud.

Why open source is the backbone enabling 5G for telcos

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Alla Goldner looks at ONAP and its contribution to virtualization and preparing the way for 5G for telcos.

5G is a technology revolution – paving the way for new revenue streams, partnerships and innovative business models. More than a single technology, 5G is about the integration of an entire ecosystem of technologies. Indeed, a recent Amdocs survey found that nearly 80% of European communications service providers (CSPs) expect the introduction of 5G to expand revenue opportunities with enterprise customers. It also found that 34% of operators plan to offer 5G services commercially to this sector by the end of 2019, a figure that will more than double to 84% by the end of 2020.

As with every revolution, extracting its full potential value will require a set of enablers or tools to connect the new technology with the telco network. For CSPs in particular, the need for new and enhanced network management systems is an established fact, with more than half of European operators saying they would need to enhance their service orchestration capabilities. But they want to do this in a flexible, agile and open manner, and not be burdened with the constraints and limitations of traditional tooling approaches.

ONAP: the de-facto automation platform

This is where ONAP (Open Network Automation Platform) enters the picture. Developed by a community of open source network evangelists from across the industry, it has become the de-facto automation platform for carrier grade service provider networks. Since its inception in February 2017, the community has expanded beyond the pure technical realm to include collaboration with other open source projects such as OPNFV, CNCF, and PNDA, as well as standards communities such as ETSI, MEF, 3GPP and TM Forum. We also anticipate collaboration with the Acumos Project to feed ONAP analytics with AI/ML data and parameters. Such collaboration is essential when it comes to delivering revolutionary use cases, such as 5G and edge automation, as its implementation requires alignment with evolving industry standards.

ONAP and 5G

CSPs consider 5G to be more than just a radio and core network overhaul, but rather a significant architecture and network transformation. And ONAP has a key role to play in this change. As an orchestration platform, ONAP enables the instantiation, lifecycle management and assurance of 5G network services. As part of the roadmap, ONAP will eventually have the ability to implement resource management and orchestration of 5G physical network functions (PNFs) and virtual network functions (VNFs). It will also have the ability to provide definition and implementation of closed-loop automation for live deployments.

The 5G blueprint is a multi-release effort, with Casablanca, ONAP’s latest release, introducing some key capabilities around PNF integration and network optimization. Given that the operators involved with ONAP represent more than 60% of mobile subscribers and the fact that they are directly able to influence the roadmap, this paves the way for ONAP, over time, to become a compelling management and orchestration platform for 5G use cases, including hybrid VNF/PNF support.

Another capability in high demand is support for 5G network slicing, which is aggregated from access network (RAN), transport and 5G core network slice subnet services. These, in turn, are composed of a combination of other services, virtual network functions (VNFs) and physical network functions (PNFs). To support this, ONAP is working on the ability to model complex network services as part of the upcoming Dublin release.
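To picture that composition, here is a rough, hypothetical sketch of an end-to-end slice aggregated from RAN, transport and core slice subnets, each built from VNFs and PNFs. The structure and names are illustrative only and are not ONAP’s actual SDC/TOSCA service model.

```python
# Rough, hypothetical sketch of an end-to-end 5G network slice composed of
# RAN, transport and core slice-subnet services. Illustrative only; not
# ONAP's actual service model.

slice_service = {
    "name": "enterprise-urllc-slice",
    "subnets": {
        "ran": {
            "pnfs": ["gnb-du-site-17", "gnb-cu-site-17"],  # physical radio functions
            "vnfs": ["ran-slice-controller"],
        },
        "transport": {
            "pnfs": ["metro-router-03"],
            "vnfs": ["transport-slice-vpn"],
        },
        "core": {
            "pnfs": [],
            "vnfs": ["amf", "smf", "upf-edge"],            # virtualised 5G core functions
        },
    },
}

def inventory(service):
    """Flatten the slice into the VNFs and PNFs an orchestrator must manage."""
    vnfs, pnfs = [], []
    for subnet in service["subnets"].values():
        vnfs.extend(subnet["vnfs"])
        pnfs.extend(subnet["pnfs"])
    return {"vnfs": vnfs, "pnfs": pnfs}

print(inventory(slice_service))
```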

To summarize the above, 5G and ONAP are together two critical pieces of the same puzzle:

  • ONAP is the de-facto standard for end-to-end network management systems, a crucial enabler of 5G
  • ONAP enables support of existing and future networking use cases, and provides a comprehensive solution to enable network slicing as a key embedded capability of 5G
  • By leveraging a distributed and virtualized architecture, ONAP is active in the development of network management enhancements and distributed analytics capabilities, which are required for edge automation – a 5G technology enabler

The importance of vendor involvement: Amdocs case study

Amdocs has been involved in ONAP since its genesis as ECOMP (Enhanced Control, Orchestration, Management and Policy), the orchestration and network management platform developed at AT&T. Today, Amdocs is one of the top vendors participating in ONAP developments, and has supported proven deployments with leading service providers.

Amdocs supports both platform enhancements and use case development activities including:

  • SDC (Service Design and Creation)
  • A&AI (Active and Available Inventory)
  • Logging and OOM (ONAP Operations Manager) projects
  • Modeling and orchestration of complex 5G services, such as network slicing

Amdocs’ and other vendors’ participation in ONAP enables the ecosystem to benefit from a best-in-class NFV orchestration platform, supporting the full lifecycle of 5G services in an open, multi-vendor environment – from service ideation and modeling, through instantiation, commissioning, modification, automatic closed-loop operations and analytics, to, finally, decommissioning.

The result is a win-win for CSPs, Amdocs, other vendors, as well as the ONAP community as a whole.

The benefit of collaboration for CSPs is that it provides them with comprehensive monetization capabilities that enable them to capture every 5G revenue opportunity. The benefit for vendors such as Amdocs is to further their knowledge of best practices, which then flows back to the ONAP community.

 

About the author: Since ONAP’s inception, Alla Goldner has been a member of the ONAP Technical Steering Committee (TSC) and Use Case subcommittee chair. She also leads all ONAP activities at Amdocs.

Alla Goldner is on the advisory board of Network Virtualization & SDN Europe. Find out what’s on the agenda and why you should be in Berlin this May.

Vodafone bags Big Blue as $550 million partner

Vodafone Business and IBM have signed off on a new joint venture which will aim to develop systems to help data and applications flow freely around an organization.

The joint venture, which will be operational in the first half of 2019, will aim to bring together the expertise of both parties to solve one of the industry’s biggest challenges: multi-cloud interoperability and the removal of organizational silos. On one side of the coin you have IBM’s cloud know-how, while Vodafone will bring the IoT, 5G and edge computing smarts. A match made in digital transformation heaven.

“IBM has built industry-leading hybrid cloud, AI and security capabilities underpinned by deep industry expertise,” said IBM CEO Ginni Rometty. “Together, IBM and Vodafone will use the power of the hybrid cloud to securely integrate critical business applications, driving business innovation – from agriculture to next-generation retail.”

“Vodafone has successfully established its cloud business to help our customers succeed in a digital world,” said Vodafone CEO Nick Read. “This strategic venture with IBM allows us to focus on our strengths in fixed and mobile technologies, whilst leveraging IBM’s expertise in multi-cloud, AI and services. Through this new venture we’ll accelerate our growth and deepen engagement with our customers while driving radical simplification and efficiency in our business.”

The issue which many organizations are facing today, according to Vodafone, is the complexity of the digital business model. On average, 70% of organizations are operating in as many as 15 different cloud environments, leaning on the individual USPs of each, but marrying these environments is a complex, though not new, issue.

Back in September, we had the chance to speak to Sachin Sony of Equinix about the emerging Data Transfer Project, an initiative to create interoperability and commonalities between the different cloud environments. The project is currently working to build a common framework with open-source code that can connect any two online service providers, enabling seamless, direct, user-initiated portability of data between the two platforms. This seems to be the same idea the new IBM/Vodafone partnership is looking to tackle.

With this new joint venture it’ll be interesting to see whether the team can build a proposition which will be any good. Vodafone has promised the new business will operate under one roof with a ‘start-up’ mentality, whatever that means once you take away the PR stench. Hopefully that roof will be far enough away from each of the parent companies’ offices to ensure the neutral ground can foster genuine innovation.

This is a partnership which has potential. The pair have identified a genuine issue in the industry and are not attempting to solve it alone. Many people will bemoan the number of partnerships in the segment which seem to be nothing more than a feeble attempt to score PR points, but this is an example where expertise is being married to split the spoils.

Huawei launches Kunpeng 920 chip to bag big data and edge computing

Huawei has unveiled a new ARM-based CPU called Kunpeng 920, designed to capitalise on the growing euphoria building around big data, artificial intelligence and edge-computing.

The CPU was independently designed by Huawei based on an ARMv8 architecture license, with the team claiming it improves processor performance by optimizing branch prediction algorithms, increasing the number of OP units and improving the memory subsystem architecture. Another bold claim is that the CPU scores over 930 in the SPECint benchmark test, 25% higher than the industry benchmark.

“Huawei has continuously innovated in the computing domain in order to create customer value,” said William Xu, Chief Strategy Marketing Officer of Huawei.

“We believe that, with the advent of the intelligent society, the computing market will see continuous growth in the future. Currently, the diversity of applications and data is driving heterogeneous computing requirements. Huawei has long partnered with Intel to make great achievements. Together we have contributed to the development of the ICT industry. Huawei and Intel will continue our long-term strategic partnerships and continue to innovate together.”

The launch itself is firmly focused on the developing intelligence economy. With 5G on the horizon and a host of new connected services promised, the tsunami of data and focus on edge-computing technologies is certain to increase. These are segments which are increasingly featuring on the industry’s radar and Huawei might have stolen a couple of yards on the buzzword chasers ahead of the annual get-together in Barcelona.

“With Kirin 980, Huawei has taken smartphones to a new level of intelligence,” said Xu. “With products and services (e.g. Huawei Cloud) designed based on Ascend 310, Huawei enables inclusive AI for industries. Today, with Kunpeng 920, we are entering an era of diversified computing embodied by multiple cores and heterogeneity. Huawei has invested patiently and intensively in computing innovation to continuously make breakthroughs.”

Another interesting angle to this launch is the slight shuffle further away from the US. With every new product Huawei launches, more of its own technology will feature. In years gone by, should Huawei have wanted to launch any new servers or edge computing products, it would have had to look externally for CPUs. Considering Intel and AMD have a strong position in these segments, supply may have come from the US.

For any other company, this would not be a problem. However, considering the escalating trade war between the US and China, and the fact Huawei’s CFO is currently awaiting trial for violating US trade sanctions with Iran, this is a precarious position to be in.

Cast your mind back to April. ZTE had just been caught red-handed violating US trade sanctions with Iran and was subsequently banned from using any US components or IP within its supply chain. Should the courts find Huawei guilty of the same offence, it is perfectly logical to assume it would also face the same punishment.

This is the suspect position Huawei finds itself in and is currently trying to correct. Just before Christmas, Huawei’s Rotating CEO Ken Hu promised its supply chain was in a better position than ZTE’s and that the firm wouldn’t go down the same route, while in the company’s New Year’s message, Rotating Chairman Guo Ping said the focus of 2019 would be creating a more resilient business. These messages are backed up by efforts in the R&D team, such as building an alternative to the Android operating system which would power its smartphones should it be banned from using US products.

Perhaps the Kunpeng 920 could be seen as another sign Huawei is distancing itself from the US, while also capitalising on a growing market which is about to blossom.

The cloud is booming but no-one seems to have told Oracle

Revenues in the cloud computing world are growing fast with no end in sight just yet, but Oracle can’t seem to cash in on the bonanza.

This week brought joint-CEOs Safra Catz and Mark Hurd in front of analysts and investors to tell everyone nothing has really changed. Every cloud business seems to be hoovering up the fortunes brought with the digital era, demonstrating strong year-on-year growth, but Oracle only managed to bag a 2% increase, 1% for the cloud business units.

It doesn’t matter how you phrase it, what creative accounting processes you use or how you fix the currency exchange: Oracle is missing out on the cash grab.

Total revenues were unchanged at $9.6 billion, but up 2% in constant currency compared with the same three months of 2017. Cloud Services and License Support plus Cloud License and On-Premise License revenues were up 1% to $7.9 billion. Cloud Services and License Support revenues were $6.6 billion, while Cloud License and On-Premise License revenues were $1.2 billion. Cloud now accounts for nearly 70% of total company revenues, most of it recurring.

Some might point to the evident growth. More money than last year is of course better, but you have to compare the fortunes of Oracle to those who are also trying to capture the cash.

First, let’s look at the cloud market as a whole. Microsoft’s commercial cloud services have an annual run rate of $21.2 billion, AWS stands at $20.4 billion, IBM at $10.3 billion, Google Cloud Platform at $4 billion and Alibaba at $2.2 billion. Oracle’s annual run rate is larger than Google’s and Alibaba’s, though these two businesses are growing very quickly.

Using the RightScale State of the Cloud report, 19% of enterprises are now running applications on Google’s public cloud, 15% on IBM’s, 58% on Microsoft’s and 68% on AWS. Alibaba is very low, though considering the scale potential it has in China, there is great opportunity for a catapult into international markets. Oracle’s applications are running in only 10% of the enterprise organizations that responded to the research.

Looking at market share gains for the last quarter, AWS unsurprisingly sits at the top of the pile, collecting 34% over the last three months, with Microsoft in second at around 15%, while Google, IBM and Alibaba also outgrew the rest of the market. Oracle sits in the group of ten providers which collectively accounted for 15% of cloud spending in the last quarter. These are not the most attractive numbers.

Oracle is not a company which is going to disappear from the technology landscape; it is too important a service provider to numerous businesses around the world. However, a once dominant and influential brand is losing its position. Oracle didn’t react quickly enough to the cloud euphoria, and it looks like it’s being punished for that now.