Open source and collaboration stack up in telco virtualization in the cloud

OpenStack Ecosystem Technical Lead Ildikó Vancsa drives NFV-related feature development activities in projects like OpenStack’s Nova and Cinder, as well as onboarding and training. The Network Virtualization event team caught up with her ahead of Network Virtualization & SDN Europe in May.

“I’m an open source advocate both personally and professionally and when the movement towards virtualisation and the cloud started, telcos started looking into OpenStack and we began to explore how they could use us,” she explained.

“It has been a very interesting journey, even if it hasn’t always been easy. Virtualisation really is a transformation both from a mindset perspective as well as a technology perspective,” she said.

The concept sounds simple enough – lift functions from a physical hardware stack and put them into virtual machines in the cloud. The reality is quite different. In an ideal world, Vancsa suggested, you would just rewrite everything from scratch, but this approach is not possible.

“It’s a short sentence to summarize it but it’s a really hard thing to do; especially because those functions are often tightly coupled with specialised hardware,” she said.

This hardware traditionally represented the whole functions stack, all the way out to software. As Vancsa put it: “We needed to support this journey while at the same time looking for where network functions can go. We need to be able to support the legacy network functions as well as providing an environment for new applications, written with this new mindset”.

“We do not have to reinvent the wheel and we [OpenStack] didn’t try that. We worked with companies and vendors in the NFV and networking space to be able to plug the components that they prefer to use into OpenStack and provide the functionality that they need,” she said.

OpenStack has now moved away from being one huge code stack to become more modular, offering standalone components such as load balancing as a service.
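To make that modularity concrete, here is a minimal sketch (not from the article) of how an operator might drive the standalone load-balancing service via OpenStack’s Python SDK; the cloud name and subnet ID below are placeholders for illustration.

```python
# Minimal sketch: create a load balancer with OpenStack's standalone
# load-balancing-as-a-service component (Octavia) through openstacksdk.
import openstack

# Credentials are read from clouds.yaml or OS_* environment variables;
# "my-telco-cloud" is a hypothetical cloud name.
conn = openstack.connect(cloud="my-telco-cloud")

# Ask the load-balancing service for a new load balancer on an existing subnet.
lb = conn.load_balancer.create_load_balancer(
    name="vnf-frontend-lb",
    vip_subnet_id="11111111-2222-3333-4444-555555555555",  # placeholder ID
)

# Wait until the service reports the load balancer as ready, then print its VIP.
conn.load_balancer.wait_for_load_balancer(lb.id)
print(lb.vip_address)
```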

One crucial aspect of getting OpenStack right is collaboration across the telco industry as open source becomes more and more widespread. In Vancsa’s words: “Back in the old days you had one vendor and they supplied your hardware and software and it was all tightly integrated and, no questions asked, it was supposed to work.”

“As open source has become more popular in telecom environments, we see more operators picking up commodity hardware and have a couple of vendors supplying software. So they might have one vendor supplying the MANO components and another for the orchestration layer,” she explained. “This means it is critical to keep an eye on interfaces and interoperability.”

For Vancsa, collaboration, open infrastructure and open source software are vital for virtualization to succeed, especially as telcos move further into the cloud, and events such as Network Virtualization & SDN Europe play a key part in enabling that.

“You get the chance to talk to people and basically make the first connection, after which it is just so much easier to collaborate on other forums,” she enthused.

 

Ildikó Vancsa was also a panellist on a recent webinar from Network Virtualization & SDN Europe on Virtualization and the Cloud. You can listen to the webinar on-demand here. Network Virtualization & SDN Europe 2019 takes place 21-23 May at the Palace Hotel in Berlin. Find out more here.

F5 makes agile move with $670 million NGINX acquisition

App security outfit F5 is buying open-source application platform specialist NGINX to augment its multi-cloud offering.

F5 is hardly the first to notice the importance of the cloud in the evolution of the entire tech industry, nor is it unique in realising that open-source is a great way of making a multi-cloud environment work. But for a company of its size (revenues of $563 million in 2018) this certainly qualifies as putting your money where your mouth is.

“F5’s acquisition of NGINX strengthens our growth trajectory by accelerating our software and multi-cloud transformation,” said François Locoh-Donou, CEO of F5. “By bringing F5’s world-class application security and rich application services portfolio for improving performance, availability, and management together with NGINX’s leading software application delivery and API management solutions, unparalleled credibility and brand recognition in the DevOps community, and massive open source user base, we bridge the divide between NetOps and DevOps with consistent application services across an enterprise’s multi-cloud environment.”

“NGINX and F5 share the same mission and vision,” said Gus Robertson, CEO of NGINX. “We both believe applications are at the heart of driving digital transformation. And we both believe that an end-to-end application infrastructure – one that spans from code to customer – is needed to deliver apps across a multi-cloud environment. I’m excited to continue this journey by adding the power of NGINX’s open source innovation to F5’s ADC leadership and enterprise reach. F5 gains depth with solutions designed for DevOps, while NGINX gains breadth with access to tens of thousands of customers and partners.”

Open source and DevOps are often referred to in the same breath as part of a broader narrative around ‘agility’. One of the main benefits of the move to the cloud is the far greater choice, efficiency and flexibility it promises, but without a culture geared towards exploiting those opportunities they’re likely to be wasted. With this acquisition F5 is positioning itself as a partner for telcos heading in an agile direction.

[Diagram: F5+NGINX – the rationale of the move]

Red Hat gives thanks for Turkcell virtualization win

Turkish operator Turkcell has launched a virtualization platform called Unified Telco Cloud that’s based on Red Hat’s OpenStack Platform.

As the name implies, this new platform is all about centralising all of Turkcell’s services onto a single virtualized infrastructure. This NFVI then allows easy selection and implementation of virtual network functions, or so the story goes. Examples of operators going all-in on this stuff are still sufficiently rare for this to be noteworthy.

As a consequence this win is also a big deal for Red Hat, which has invested heavily in attacking the telco virtualization market from an open source direction, as is its wont. Red Hat OpenStack Platform is its carrier-grade distribution of the open source hybrid cloud platform. Turkcell is also using Red Hat Ceph Storage, a software-defined storage technology designed for this sort of thing.

“Our goal is to remake Turkcell as a digital services provider, and our business ambitions are global,” said Gediz Sezgin, CTO of Turkcell. “While planning for upcoming 5G and edge computing evolution in our network, we need to increase vendor independence and horizontal scalability to help maximise the efficiency and effectiveness of our CAPEX investment.

“With the Unified Telco Cloud, we want to lower the barrier to entry of our own network to make it a breeding ground for innovation and competition. In parallel, we want to unify infrastructure and decrease operational costs. Red Hat seemed a natural choice of partner given its leadership in the OpenStack community, its interoperability and collaboration with the vendor ecosystem and its business transformation work with other operators.”

Another key partner for Turkcell in this was Affirmed Networks, which specialises in virtualized mobile networks. “We initially selected Affirmed Networks based on their innovation in the area of network transformation and virtualization and their work with some of the world’s largest operators,” said Sezgin.

It’s good to see some of the endlessly hyped promise of NFV actually being put into effect and it will be interesting to see what kind of ROI Turkcell claims to have got from its Unified Telco Cloud. With organisations such as the O-RAN Alliance apparently gathering momentum, open source could be a major theme of this year’s MWC too.

Seven success factors for partnering in the age of open source

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Susan James, Senior Director of Telecommunications Strategy at Red Hat, explores the ideal balance of in-house and outsourced talent in order to make the most of open source opportunities.

The breadth of technology knowledge that service providers now require has increased exponentially, while headcount at most service providers is stable or in decline and the pool of specialist digital talent hasn’t been able to keep pace.

With a more diverse and collaborative ecosystem than ever before, choices are available – but what’s the right balance of making the most of your own talent versus looking externally? Here are several considerations when deciding how, when, and with whom to partner.

  1. Focus on your core business

For most companies, success comes from the core business. You have to closely examine your business and ask what your core business can provide that no one else can.

It’s easy to look at other companies and be tempted to duplicate the way they do things as a template for your own business. However, you need to recognise what’s good about your business, and focus on maximizing those aspects with innovation, rather than trying to replicate what others are doing.

When you know your core business, you can make informed decisions on what to build yourself, where to use external people to solve a problem, and what areas need third party software to do the job.

  2. Diversify to STEAM ahead

With a growing shortfall of specialized digital skills, we need to encourage people to enter emerging technology from outside of the expected STEM fields (science, technology, engineering and maths). Hence the A for ‘arts’ in the emerging term STEAM. We must be open to diverse types of skills and perspectives coming into the industry; they can help us understand the diverse customer base we all have.

In a recent internal meeting, I did a quick straw poll of how many people were engineers – less than half had an engineering background. I myself have an economics degree and ended up as a software product manager. You don’t need an engineering degree to use software, and you don’t need one to understand how to build software. I encourage you to look at your business as a whole, and what different perspectives can bring, when developing software. This applies to hiring as well as team building and partnering.

  3. Choose thought leaders

You have your core strategy and your internal teams. What are your criteria for choosing who to team up with inside and outside your organization? Thought leadership is a priority, surely. But what makes a thought leader?

I would argue that true thought leadership today is not just someone who can solve today’s technology problems; nor is it painting a technical vision of the future. It’s recognizing the need to shape your organization with the capabilities that enable you to seize unknown opportunities that lie ahead. Red Hat’s CEO Jim Whitehurst calls this ‘organizing for innovation’. He argues that in today’s dynamic environment, planning as we know it is dead. Instead, you must build the mindset and the mechanisms that enable you to move faster and change as needed.

  4. Seek shared philosophy

Ask yourself what kind of partners you want to be associated with, and how you want to be perceived in the market. Move forward with partners whose core values align with yours. For example, as the world moves increasingly towards open source, do you want to be perceived as a leader in upstream* innovation? Do you want to be an active contributor to open source communities and help influence new features that are developed for real world needs? Or, do you desire to be a leader in technology adoption for bringing new capabilities to market? Look for a partner that displays or complements what you want to see in your business.

*Upstream communities are the projects and people that participate in open source software development and are also known as innovation engines.

  5. Prioritize honesty

Some say the measure of a true friend is someone brave enough to tell you the truth, and this can apply in business as well. Some companies aren’t used to openness and the honesty it brings – they are accustomed to being told what they want to hear by partners. So when they do hear the truth, it can come as a shock. They may see it as confrontational or they may look for a second meaning.

But being honest and upfront enables companies to grow together. It provides the opportunity to quickly identify threats and know what you’re dealing with, both good and bad. Working in open source communities makes it easier to be honest about deficiencies because everyone can see them. All code is fully exposed and everyone works on the same information and code base, which allows everyone to more easily unite around common causes and problems.

When evaluating potential partners, determine their participation in open source communities, and be clear about the importance of open, honest communication in all business dealings.

  6. Establish clear partner engagement models

A relationship with a partner can be multifaceted. Collaborating in open source development is a separate engagement from working with that same partner on the business side of things, and likely looks quite different.

Upstream communities are all about rapid iteration, creativity, and innovation. Going to market with a product must be about reliability, security, and making sure the product works in practice.

It’s possible for these two areas to overlap: business needs can influence communities in a certain direction, and upstream collaboration between partners can solidify a business relationship. Or, you might work harmoniously with a company upstream, yet go out and compete with each other fiercely on the sales side.

Therefore, be clear from the outset about your engagement models and what constitutes success in each area. If you are not able to see a “win” for both parties, then the long-term success of the partnership is questionable.

  7. Understand open source

The proliferation of open source across industries is allowing new players to enter markets, and existing players to work more closely together. There are different ways to leverage open source technologies, and it’s important to understand how different uses impact your business.

Downloading open source software for free doesn’t mean that it doesn’t cost you anything. If you make customizations to that open source software, you need to be aware of what you’re taking on internally. Ensuring that the software remains secure over time and understanding how to manage its lifecycle takes resources and competence. Making changes that are not delivered upstream requires you to manage monitoring, maintenance, support, updates (including upstream changes), and the full software lifecycle yourself.

Choosing the supported software route (enterprise version of open source community software) requires you to pay a subscription fee to a software vendor, but it’s the vendor’s job to stabilize the software, certify it works with an ecosystem of other hardware and software, ensure it is safe to use over its lifetime, and provide guidance on the best way to integrate it with the existing environment for the desired results.

Vendors differ in their level of open source community participation. Those that do not contribute all changes back to upstream projects become out of sync with the community version and can no longer take advantage of community innovation.

It’s also important to note that as software development in an area becomes less cutting-edge, people can be less likely to stick around. And unless you’re recognized as a leader in that particular area, it can be hard to attract people. Such might be the challenge for a communications service provider trying to recruit talent for working on containers and competing with companies recognized for container innovation.

This relates back to your core business – what would you rather have your people working on? Where can your internal innovation and differentiation bring you the most value?

Final word

This isn’t a ‘one and done’ process – it’s cyclical. You should continually evaluate your business objectives and results in the context of what’s happening around you, and adjust your approach as you go. Remember too that it’s OK to make mistakes – it’s the mistakes that help you learn for the future. A company that is strategic about the projects it gets involved in, and that is not afraid to change course and drop projects that aren’t successful, is going to be more agile and adaptable to change.

Be open to diverse skill sets, and be honest within your organization and with your partners about what’s working and what’s not. These are all long term strategies that will best position you for success.

 

Susan joined Red Hat in May 2018, after 27 years at Ericsson, where she was head of Product Line NFV infrastructure. While at Ericsson, she worked in Enterprise, Wireline, Network and Cloud organizations. She worked extensively with the IP Multimedia Subsystem (IMS), and was responsible for a number of the network functions in the Ericsson portfolio. A product management veteran, her career has focused on developing products to address technology transitions, and the establishment of new business areas.

Culture is holding back operator adoption of open source

If open source is the holy grail for telcos, more than a few of them are getting lost trying to uncover the treasure; but why?

At a panel session featuring STC and Vodafone at Light Reading’s Software Defined Operations and the Autonomous Network event, operational culture was suggested as a significant roadblock, as well as the threat to ROI from shortened lifecycles and disappearing support.

Starting with the culture side, this is a simple one to explain. The current workforce has not been configured to work with an open source mentality. This is a different way of working, a notable shift away from the status quo of proprietary technologies. Sometimes the process of incorporating open source is an arduous task, where it can be difficult to see the benefits.

When a vendor puts a working product in front of you, as well as a framework for long-term support, it can be tempting to remain in the clutches of the vendor and the dreaded lock-in situation. You can almost guarantee the code has been hardened and is scalable. It makes the concept of change seem unappealing. Human nature will largely maintain the status quo, even if the alternative might be healthier in the long run.

The second scary aspect of open source is the idea of ROI. The sheer breadth and depth of open source groups can be overwhelming at times, though open source is only as strong as its ongoing support. If code is written, supported for a couple of months and then discarded in favour of something a bit more trendy, telcos will be fearful of investment due to the ROI being difficult to realise.

Open source is a trend which is being embraced on the surface, but we suspect there are still some stubborn employees who are more charmed by the status quo than by the advantages of change.

Broadband Forum unveils first Open Broadband release

The Broadband Forum has announced the release of code and supporting documentation for Open Broadband – Broadband Access Abstraction (OB-BAA), the first code release for the Open Broadband project.

The code and documentation offer an alternative approach for telcos looking to upgrade networks ahead of the anticipated stress caused by the introduction of more accessible and faster connectivity. The aim is to facilitate coexistence, seamless migration and the agility to adapt to an increasingly wide variety of software defined access models.

“OB-BAA enables operators to optimize their decision-making process for introducing new infrastructure based on user demand and acceptance instead of being forced into a total replacement strategy,” said Robin Mersh, Broadband Forum CEO. “By reducing planning, risks and execution time, investment in new systems and services can be incremental.”

The Forum’s Open Broadband initiative has been designed to provide an open community for the integration and testing of new open source, standards-based and vendor provided implementations. The group already counts support from the likes of BT, China Telecom, CenturyLink and Telecom Italia, as well as companies such as Broadcom and Nokia on the vendor side.

OB-BAA specifies northbound interfaces, core components and southbound interfaces for functions associated with access devices that have been virtualized. The standardized approach, specifically designed for SDN automation, is what the Broadband Forum claims differentiates the launch from other approaches, with the benefit of removing the hardware/software decoupling process.
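For illustration only, and assuming a NETCONF/YANG-style management interface of the kind such standardized access abstractions are built around, a southbound interaction with a virtualized access device might look roughly like the following sketch using the ncclient library; the host and credentials are placeholders, not details from the OB-BAA release.

```python
# Sketch: query a (hypothetical) virtualized access device over NETCONF.
from ncclient import manager

with manager.connect(
    host="olt-emulator.example.net",   # placeholder address
    port=830,
    username="admin",
    password="admin",                  # placeholder credentials
    hostkey_verify=False,
) as m:
    # List the YANG modules and capabilities the device advertises.
    for capability in m.server_capabilities:
        print(capability)

    # Fetch the device's running configuration.
    running = m.get_config(source="running")
    print(running)
```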

“The first release of OB-BAA marks a major milestone for the industry,” said Tim Carey, Lead Technology Strategist at Nokia. “It delivers an open reference implementation based on standards-compliant interfaces that operators and vendors worldwide can use to develop and deploy interoperable cloud-based access networks more easily and quickly.”

While this is the group’s first release, the Broadband Forum has promised several more planned releases, each consisting of code and supporting documentation.

Airship launched by AT&T and SK Telecom

AT&T and SK Telecom have jointly announced the launch of a new open infrastructure project called Airship, intended to simplify the process of deploying cloud infrastructure.

Airship uses the OpenStack-Helm project as a foundation, building a collection of open source tools to allow operators, IT service providers or enterprise organizations to more easily deploy and manage OpenStack, focusing more specifically on container technologies like Kubernetes and Helm. The mission statement is a simple one; make it easier to more predictably build and manage cloud infrastructure.

“Airship gives cloud operators a capability to manage sites at every stage from creation through all the updates, including baremetal installation, OpenStack creation, configuration changes and OpenStack upgrades,” SK Telecom said in a statement. “It does all this through a unified, declarative, fully containerized, and cloud-native platform.”

The initial focus of this project is the implementation of a declarative platform to introduce OpenStack on Kubernetes (OOK) and the lifecycle management of the resulting cloud, with the scale, speed, resiliency, flexibility, and operational predictability demanded of network clouds. The idea of a declarative platform is that every aspect of the cloud is defined in standardized documents: the user manages the documents themselves, submits them and lets the platform take care of the rest.
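As a rough illustration of that document-driven model (the schema below is a made-up example, not necessarily Airship’s actual document format), an operator-side tool might load and sanity-check site documents along these lines:

```python
# Sketch: parse and minimally validate declarative "site" documents.
import yaml

SITE_DOCUMENT = """
schema: example/SiteDefinition/v1        # hypothetical schema name
metadata:
  name: network-cloud-site-1
data:
  openstack_release: queens
  control_plane_nodes: 3
  storage_backend: ceph
"""

def load_documents(raw: str) -> list[dict]:
    """Parse a multi-document YAML stream and check the expected top-level keys."""
    docs = [d for d in yaml.safe_load_all(raw) if d]
    for doc in docs:
        for key in ("schema", "metadata", "data"):
            if key not in doc:
                raise ValueError(f"document missing required key: {key}")
    return docs

# The operator edits and submits documents; the platform reconciles the site
# towards whatever state the documents describe.
for doc in load_documents(SITE_DOCUMENT):
    print(doc["metadata"]["name"], "->", doc["data"])
```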

The Airship initiative will initially consist of eight sub-projects:

  • Armada – An orchestrator for deploying and upgrading a collection of Helm charts
  • Berth – A mechanism for managing VMs on top of Kubernetes via Helm
  • Deckhand – A configuration management service with features to support managing large cluster configurations
  • Diving Bell – A lightweight solution for bare metal configuration management
  • Drydock – A declarative host provisioning system built initially to leverage MaaS for baremetal host deployment
  • Pegleg – A tool to organize configuration of multiple Airship deployments
  • Promenade – A deployment system for resilient, self-hosted Kubernetes
  • Shipyard – A cluster lifecycle orchestrator for Airship

“Airship is going to allow AT&T and other operators to deliver cloud infrastructure predictably that is 100% declarative, where Day Zero is managed the same as future updates via a single unified workflow, and where absolutely everything is a container from the bare metal up,” said Ryan van Wyk, Assistant VP of Cloud Platform Development at AT&T Labs.

While the emergence of another open source project is nothing too revolutionary, AT&T has stated Airship will act as the foundation of its network cloud, which will power the 5G core supporting the 2018 launch of 5G service in 12 cities. Airship will also be used by Akraino Edge Stack, another project which intends to create an open source software stack supporting high-availability cloud services optimized for edge computing systems and applications. Two early use-cases certainly add an element of credibility.

Linux Foundation looks to open source growth for AI

The Linux Foundation has announced the launch of LF Deep Learning Foundation, an umbrella organization which will support open source projects in artificial intelligence, machine learning, and deep learning.

Amdocs, AT&T, B.Yond, Baidu, Huawei, Nokia, Tech Mahindra, Tencent, Univa, and ZTE will form the foundations of the group, which was announced at the Open Networking Summit in Los Angeles. As with all projects at the Linux Foundation the aim here is to harmonise developments across the industry to make the technology more accessible to developers and data scientists.

“We are excited to offer a deep learning foundation that can drive long-term strategy and support for a host of projects in the AI, machine learning, and deep learning ecosystems,” said Jim Zemlin, Executive Director of The Linux Foundation.

“With LF Deep Learning, we are launching the Acumos AI Project, a comprehensive platform for AI model discovery, development and sharing. In addition, we are pleased to announce that Baidu and Tencent each intend to contribute projects to LF Deep Learning. LF Deep Learning enables the open source community to support entire ecosystems of projects in these spaces, and we invite the open source community to join us in this effort.”

The Acumos AI Project is one of these initiatives to improve the accessibility of AI by attempting to standardise the infrastructure stack and components required to run an out-of-the-box general AI environment. The project will package toolkits such as TensorFlow and scikit-learn together with models behind a common API that allows them to seamlessly connect, allowing for simpler onboarding and training of models and tools.
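To illustrate the general idea of a common API over heterogeneous toolkits (a conceptual sketch only, not Acumos’s actual onboarding interface), a model from any toolkit could be wrapped so that consumers only ever see one predict() call:

```python
# Sketch: wrap a toolkit-specific model behind a uniform interface.
import numpy as np
from sklearn.linear_model import LogisticRegression

class CommonModel:
    """Uniform wrapper exposing a single predict() method, whatever the toolkit."""
    def __init__(self, predict_fn):
        self._predict_fn = predict_fn

    def predict(self, features: np.ndarray) -> np.ndarray:
        return self._predict_fn(features)

# Train a toolkit-specific model (scikit-learn here; the same pattern would
# apply to a TensorFlow model)...
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
sk_model = LogisticRegression().fit(X, y)

# ...then expose it behind the common interface.
model = CommonModel(sk_model.predict)
print(model.predict(np.array([[1.5]])))
```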

“Acumos will benefit developers and data scientists across numerous industries and fields, from network and video analytics to content curation, threat prediction, and more,” said Zemlin.

Elsewhere at the Open Networking Summit, AT&T announced plans to install 60,000 white box devices in cell towers over the next several years to power the 5G revolution. The white boxes will run on AT&T’s ‘Disaggregated Network Operating System’, which was built using the expertise acquired from Vyatta, and powered by ONAP. The team hopes to see the system adopted as open source software across the industry.

Finally, Pensa, a provider of automation software, has announced the open integration of the Pensa Maestro NFV platform with the ONAP project. The integration will allow interoperability with ONAP using Pensa Maestro NFV, which it claims is the first full stack automation platform for intelligent planning, design, validation, and delivery of carrier-grade Network Functions Virtualization (NFV) services.