Culture is holding back operator adoption of open source

If open source is the holy grail for telcos, more than a few of them are getting lost trying to uncover the treasure. But why?

At a panel session featuring STC and Vodafone at Light Reading’s Software Defined Operations and the Autonomous Network event, operational culture was suggested as a significant roadblock, along with the threat to ROI posed by shortened lifecycles and disappearing support.

Starting with the culture side, this is a simple one to explain. The current workforce has not been conditioned to work with an open source mentality. It is a different way of working, a notable shift away from the status quo of proprietary technologies. Incorporating open source can be an arduous process, and it can be difficult to see the benefits up front.

When a vendor puts a working product in front of you, as well as a framework for long-term support, it can be tempting to remain in the clutches of the vendor and the dreaded lock-in situation. You can almost guarantee the code has been hardened and is scalable, which makes the concept of change seem unappealing. Human nature will largely maintain the status quo, even if the alternative might be healthier in the long run.

The second scary aspect of open source is the idea of ROI. The sheer breadth and depth of open source groups can be overwhelming at times, though open source is only as strong as the on-going support. If code is written, supported for a couple of months and then discarded in favour of something a bit more trendy, telcos will be fearful of investment due to the ROI being difficult to realise.

Open source is a trend which is being embraced on the surface, but we suspect there are still some stubborn employees who are more charmed by the status quo than by the advantages of change.

Broadband Forum unveils first Open Broadband release

The Broadband Forum has announced the release of code and supporting documentation for Broadband Access Abstraction (OB-BAA), the first code release for the Open Broadband project.

The code and documentation offer an alternative approach for telcos looking to upgrade networks ahead of the anticipated stress caused by the introduction of more accessible and faster connectivity. The aim is to facilitate coexistence, seamless migration and the agility to adapt to an increasingly wide variety of software defined access models.

“OB-BAA enables operators to optimize their decision-making process for introducing new infrastructure based on user demand and acceptance instead of being forced into a total replacement strategy,” said Robin Mersh, Broadband Forum CEO. “By reducing planning, risks and execution time, investment in new systems and services can be incremental.”

The Forum’s Open Broadband initiative has been designed to provide an open community for the integration and testing of new open source, standards-based and vendor provided implementations. The group already counts support from the likes of BT, China Telecom, CenturyLink and Telecom Italia, as well as companies such as Broadcom and Nokia on the vendor side.

OB-BAA specifies northbound interfaces, core components and southbound interfaces for functions associated with access devices that have been virtualized. The standardized approach, specifically designed for SDN automation, is what Broadband Forum claims differentiates the launch from other approaches with the benefit of removing the hardware/software decoupling process.

“The first release of OB-BAA marks a major milestone for the industry,” said Tim Carey, Lead Technology Strategist at Nokia. “It delivers an open reference implementation based on standards-compliant interfaces, that operators and vendors worldwide can use to develop and deploy interoperable cloud-based access networks more easily and quickly.”

While this is the group’s first release, the Broadband Forum has promised several more, each consisting of code and supporting documentation.

Airship launched by AT&T and SK Telecom

AT&T and SK Telecom have jointly announced the launch of a new open infrastructure project called Airship, intended to simplify the process of deploying cloud infrastructure.

Airship uses the OpenStack-Helm project as a foundation, building a collection of open source tools to allow operators, IT service providers or enterprise organizations to more easily deploy and manage OpenStack, with a particular focus on container technologies such as Kubernetes and Helm. The mission statement is a simple one: make it easier to more predictably build and manage cloud infrastructure.

“Airship gives cloud operators a capability to manage sites at every stage from creation through all the updates, including baremetal installation, OpenStack creation, configuration changes and OpenStack upgrades,” SK Telecom said in a statement. “It does all this through a unified, declarative, fully containerized, and cloud-native platform.”

The initial focus of this project is the implementation of a declarative platform to introduce OpenStack on Kubernetes (OOK) and the lifecycle management of the resulting cloud, with the scale, speed, resiliency, flexibility and operational predictability demanded of network clouds. The idea of a declarative platform is that every aspect of the cloud is defined in standardized documents; the user manages the documents, submits them and lets the platform take care of the rest.
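The declarative model described above can be sketched in a few lines: the user describes the desired end state in a document, and the platform works out what needs to change. The sketch below is purely illustrative; the document fields and function names are hypothetical and do not reflect Airship’s actual (YAML-based) schemas.

```python
# Illustrative sketch of a declarative workflow: the operator edits a
# desired-state document and submits it; the platform diffs it against
# the current state and applies only the changes needed.
# All field names here are hypothetical, not Airship's real schema.

desired = {
    "schema": "example/SiteDefinition/v1",
    "site": "demo-site",
    "openstack_version": "queens",
    "compute_nodes": 3,
}

current = {
    "schema": "example/SiteDefinition/v1",
    "site": "demo-site",
    "openstack_version": "pike",
    "compute_nodes": 2,
}

def plan_changes(current, desired):
    """Return the actions needed to move the current state to the desired one."""
    actions = []
    for key, want in desired.items():
        have = current.get(key)
        if have != want:
            actions.append(f"update {key}: {have} -> {want}")
    return actions

print(plan_changes(current, desired))
```

The operator never issues imperative upgrade commands; changing `openstack_version` in the document and resubmitting it is the upgrade, which is what makes Day Zero and later updates a single unified workflow.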

The Airship initiative will initially consist of eight sub-projects:

  • Armada – An orchestrator for deploying and upgrading a collection of Helm charts
  • Berth – A mechanism for managing VMs on top of Kubernetes via Helm
  • Deckhand – A configuration management service with features to support managing large cluster configurations
  • Diving Bell – A lightweight solution for bare metal configuration management
  • Drydock – A declarative host provisioning system built initially to leverage MaaS for baremetal host deployment
  • Pegleg – A tool to organize configuration of multiple Airship deployments
  • Promenade – A deployment system for resilient, self-hosted Kubernetes
  • Shipyard – A cluster lifecycle orchestrator for Airship

“Airship is going to allow AT&T and other operators to deliver cloud infrastructure predictably that is 100% declarative, where Day Zero is managed the same as future updates via a single unified workflow, and where absolutely everything is a container from the bare metal up,” said Ryan van Wyk, Assistant VP of Cloud Platform Development at AT&T Labs.

While the emergence of another open source project is nothing too revolutionary, AT&T has stated Airship will act as the foundation of the network cloud that will power the 5G core supporting its 2018 launch of 5G service in 12 cities. Airship will also be used by Akraino Edge Stack, another project which intends to create an open source software stack supporting high-availability cloud services optimized for edge computing systems and applications. Two early use-cases certainly add an element of credibility.

Linux Foundation looks to open source growth for AI

The Linux Foundation has announced the launch of LF Deep Learning Foundation, an umbrella organization which will support open source projects in artificial intelligence, machine learning, and deep learning.

Amdocs, AT&T, B.Yond, Baidu, Huawei, Nokia, Tech Mahindra, Tencent, Univa, and ZTE will form the foundations of the group, which was announced at the Open Networking Summit in Los Angeles. As with all projects at the Linux Foundation, the aim here is to harmonise developments across the industry to make the technology more accessible to developers and data scientists.

“We are excited to offer a deep learning foundation that can drive long-term strategy and support for a host of projects in the AI, machine learning, and deep learning ecosystems,” said Jim Zemlin, Executive Director of The Linux Foundation.

“With LF Deep Learning, we are launching the Acumos AI Project, a comprehensive platform for AI model discovery, development and sharing. In addition, we are pleased to announce that Baidu and Tencent each intend to contribute projects to LF Deep Learning. LF Deep Learning enables the open source community to support entire ecosystems of projects in these spaces, and we invite the open source community to join us in this effort.”

The Acumos AI Project is one of these initiatives to improve the accessibility of AI by attempting to standardise the infrastructure stack and components required to run an out-of-the-box general AI environment. The project will package tool kits such as TensorFlow and SciKit Learn and models with a common API that allows them to seamlessly connect, allowing for simpler onboarding and training of models and tools.
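The “common API” idea can be illustrated with a short sketch: models built with different toolkits are wrapped behind one uniform interface so they can be chained together. The class and method names below are hypothetical stand-ins, not Acumos’s actual API.

```python
# Hypothetical sketch of a "common API" for AI models: each toolkit's
# model is wrapped to expose the same predict() interface, so models
# can be connected into a pipeline without toolkit-specific glue code.
# Names are illustrative only, not Acumos's real API.

class ModelWrapper:
    """Uniform interface any onboarded model must expose."""
    def predict(self, inputs):
        raise NotImplementedError

class DoublerModel(ModelWrapper):
    # Stand-in for, say, a scikit-learn regressor.
    def predict(self, inputs):
        return [x * 2 for x in inputs]

class ThresholdModel(ModelWrapper):
    # Stand-in for, say, a TensorFlow classifier.
    def predict(self, inputs):
        return [x > 3 for x in inputs]

def pipeline(models, inputs):
    """Chain models: each model's output feeds the next one's input."""
    for model in models:
        inputs = model.predict(inputs)
    return inputs

print(pipeline([DoublerModel(), ThresholdModel()], [1, 2, 3]))
# [False, True, True]
```

The point of the shared interface is that `pipeline` never needs to know which toolkit produced a given model, which is what makes marketplace-style stitching of components feasible.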

“Acumos will benefit developers and data scientists across numerous industries and fields, from network and video analytics to content curation, threat prediction, and more,” said Zemlin.

Elsewhere at the Open Networking Summit, AT&T announced plans to install 60,000 white box hardware units in cell towers over the next several years to power the 5G revolution. The white boxes will run AT&T’s ‘Disaggregated Network Operating System’, which was built using expertise acquired from Vyatta, and powered by ONAP. The team hopes to see the system adopted as open source software across the industry.

Finally, Pensa, a provider of automation software, has announced the open integration of the Pensa Maestro NFV platform with the ONAP project. The integration will allow interoperability with ONAP using Pensa Maestro NFV, which it claims is the first full stack automation platform for intelligent planning, design, validation, and delivery of carrier-grade Network Functions Virtualization (NFV) services.

TM Forum brings Open APIs to the Linux party

Some might bring chips and dips, others potato salad, while the rogue ones bring Tequila, but the TM Forum has pitched up to the Linux Foundation’s LA party with a bag full of Open APIs.

The pair were already in partnership prior to this announcement, but the Open Networking Summit was deemed the perfect shindig to announce things are starting to get serious. As part of the new partnership, the TM Forum will allow its APIs to be shared across the entire Linux Foundation community for use in any of its open source projects.

“Together with TM Forum, we can shift the global industry one step closer to harmonization of open source and open standards,” said Arpit Joshipura, GM of Networking at The Linux Foundation. “Our joint efforts will help accelerate deployment and adoption for end users.  We look forward to this continued and intensified collaboration and how it will advance future networks.”

“Open Source and open standards – including TM Forum’s Open APIs and Open Digital Architecture – have a pivotal role to play in transforming the agility of our industry, ensuring it is fit for the next decade,” said Nik Willetts, CEO of TM Forum. “We’re delighted to be working with The Linux Foundation to bring together our joint expertise, and look forward to partnering with a range of open source projects over the coming months.”

As it stands, the TM Forum has a suite of 50 REST-based Open APIs which are used by 4,000 software developers in 700 companies worldwide. Open source projects are becoming increasingly popular within the telco space, and this partnership introduces platform-agnostic industry standards to the mix. This could mean different things to different organizations, but standardization is generally a bit quicker and cheaper, and removes the threat of customization.
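For a flavour of what a standardized REST API buys a developer, the sketch below shows a client building an order payload against a stable, documented shape rather than a vendor-specific one. The endpoint name and payload fields are hypothetical, not taken from any specific TM Forum Open API.

```python
# Illustrative only: consuming a standardized REST API means every
# provider accepts the same payload shape. The fields and endpoint
# below are hypothetical, not a real TM Forum API definition.
import json

order = {
    "externalId": "order-42",
    "orderItem": [
        {"action": "add", "product": {"name": "broadband-100"}},
    ],
}

body = json.dumps(order)
# In a real client this serialized body would be POSTed to the
# provider's endpoint, e.g. POST {base_url}/productOrder.
decoded = json.loads(body)
print(decoded["orderItem"][0]["action"])
```

Because the payload shape is an industry standard rather than per-vendor, the same client code works against any compliant provider, which is the “platform-agnostic” benefit described above.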

For a more in-depth look at the partnership, Carol Wilson from our sister site Light Reading is at the event. While there are certainly technological benefits to the partnership, it is also worth noting the administrative benefits of the tie-up.

Firstly, a closer relationship means the two can communicate directly as opposed to using members as the middlemen. Secondly, moving to Apache 2.0 license terms removes the potential threat of contributors claiming IP rights on any breakthroughs.

Mycom OSI makes telco cloud move with Red Hat collaboration

Network assurance vendor Mycom OSI has moved to improve its cloud credentials through a partnership with Red Hat.

The specific point of the partnership is to offer automated assurance across hybrid NFV networks. In the virtualized telco utopia most of the functions will exist in a massive, fluid cloud but, as we are continually reminded, the road to the promised land is a convoluted one. One of the many complexities to be contended with is how you monitor, maintain and optimize all this.

Red Hat has been heavily invested in the telco cloud from an open source perspective for some time, so it makes sense for Mycom to collaborate with it in order to stay relevant in the cloud era. Mycom’s Experience Assurance and Analytics solution will be deployed on the Red Hat OpenStack Platform and the Red Hat OpenShift Container Platform to make the telco cloud magic happen.

“Telco clouds are a key enabler to unlocking on-demand, real time consumer and enterprise digital services such as SD-WAN and IoT opportunities for CSPs, and our recent innovations in telco cloud assurance have resulted in rapid growth in customer projects,” said Mounir Ladki, President and CTO at Mycom. “Red Hat not only helps us to deliver agility, speed and cost benefits to our customers, but also a rich stream of essential telco cloud data that feeds our analytics engine. We are delighted to have such a strong collaboration with Red Hat.”

“As the telco industry moves towards cloudification of networks to increase innovation, agility and scalability, service quality and performance are top of mind for telco leaders,” said Darrell Jordan-Smith, VP of Global Telecommunications and ICT at Red Hat. “We are pleased to underpin MYCOM OSI’s assurance and analytics solution with Red Hat’s highly scalable hybrid cloud and container-based technologies. Together, we are setting out to help operators better understand and act on the performance of their networks as they deliver on their network virtualization strategies.”

Red Hat has made its name in the telco sphere by tailoring open source software to make it ‘telco grade’ i.e. commercially useful and robust. Network assurance is all about making sure the network is commercially useful and robust so this partnership seems to make sense. Furthermore it might set a precedent for further such collaborations as the telco cloud matures.

AT&T comes up with sensible idea to cash in on AI craze

It isn’t often we actually say this, but a telco has come up with a good idea as opposed to ripping someone else off, reinventing the wheel or using Kevin Bacon to make the brand seem cool (in someone’s head this made sense).

AT&T has paired up with Tech Mahindra to build an AI marketplace called Acumos, which will be hosted by The Linux Foundation. It is a relatively simple idea as well. But all the best ones are.

With the craze of AI building every day, there are countless applications which do very specific things, whatever that might be. Incorporating these applications into a business function or creating a new product with AI as the foundation is complicated, as there are so many different moving parts. Acumos creates a single marketplace where you can stitch together various different applications to make a complete solution, without having to hunt around extensively.

Assuming there is traction with the developers, it could turn into a very useful little marketplace. All the AI ideas in one place. The complicated bit, which might decide whether it is a success or failure, will be making it accessible. AT&T and Tech Mahindra have said this isn’t just for data scientists, but for any developer who has a focus on creating apps and microservices. Translating this into an effective user-centric proposition for those who have limited (or no) AI experience will not be a simple task.

Just to demonstrate how it works, we’ve copied AT&T’s example below.

[Image: AT&T’s ‘AI example’ diagram]

“Our goal with open sourcing the Acumos platform is to make building and deploying AI applications as easy as creating a website,” said Mazin Gilbert, VP of Advanced Technology at AT&T Labs.

“We’re collaborating with Tech Mahindra to establish an industry standard for AI in the networking space. We invite others to join us to create a global harmonization in AI and set the stage for all future AI network applications and services.”

This is not AT&T’s first venture into such open source initiatives. The same model was used to launch the Open Network Automation Platform (ONAP), though in that case the team spent a while developing an internal platform, called ECOMP, before releasing it into open source. AT&T and Tech Mahindra are still working on the governance side of things, so a 2018 release does seem more realistic.

ONF caterpillar metamorphoses into open source butterfly

The Open Networking Foundation has proudly announced the completion of its restructuring into an open source-centric organization following the integration of ON.Lab.

This process took almost a year, but if it has been done properly then fair enough, as business is littered with the debris of botched M&A. Having pupated for nine months, the ONF gave its beautiful new wings their first flutter this summer with the announcement that DT is joining the party as a partner.

Despite the name, the resulting butterfly is largely driven by what was ON.Lab, with its focus on SDN and specifically projects such as ON.Lab creation CORD (Central Office Re-architected as a Datacenter). The chances are that a lot of the work done by ONF was already of the open source variety, so this announcement seems to be a statement of intent as much as anything else.

“We are at the forefront of a massive transformation of the networking ecosystem,” said Guru Parulkar, executive director of ONF. “Recognizing that open source is transforming our industry at an unprecedented pace, we set out to optimize the ONF to embrace and further drive this revolution. We’re very pleased to have now completed our metamorphosis. We are also humbled by the success and impact our projects are having worldwide, and excited to embrace the work ahead.”

Roz Roseboro, of peerless analyst house Heavy Reading, reckons CORD is the future. “CORD, unlike many other projects that only address parts of the puzzle, offers a holistically-designed, fully-integrated, end-to-end solution ready for deployment. It’s a way for service providers to prove that this whole new architecture and process can work.

“Heavy Reading forecasts that the majority of CSPs will use CORD by 2020 to at least some degree, and nearly 40% of all end-customers (residential, wireless and enterprise, collectively) will have service provided by COs or their equivalents using CORD by mid-2021. We expect CORD will enable service providers to more effectively compete with web-scale operators while speeding network automation and innovation.”

The ONF has also taken this opportunity to make a few other organizational announcements. Andre Fuetsch, President of AT&T Labs and CTO, has been elected as chair of the ONF board, AT&T having been a founding member of ON.Lab. In addition, Turk Telekom is joining as a partner member, and lastly more membership layers have been added to reflect the ONF’s renewed focus on embracing the open source community.