Italians clearly aren’t that suspicious of Huawei

Despite governments around the world turning against Chinese vendors, Telecom Italia has agreed a new partnership with Huawei based on Software Defined Wide Area Network (SD-WAN) technology.

As part of a strategy aimed at evolving TIM’s network solutions for business customers, Huawei’s SD-WAN technology will be incorporated to create a new TIM service model which will allow customer companies to manage their networks through a single console.

“Today, more than ever, companies need networks that can adapt to different business needs over time, in particular to enable Cloud and VoIP services,” said Luigi Zabatta, Head of Fixed Offer for TIM Chief Business & Top Clients Office. “Thanks to the most advanced technologies available, these networks can be managed both jointly and by customers themselves through simple tools.

“The partnership with Huawei allows us to expand our value proposition for companies and to enrich our offer through the adoption of a technological model that is increasingly and rapidly emerging in the ICT industry.”

The partnership is a major win for Huawei considering the pressure the firm must be feeling, with suspicions being piqued around the world. Just as more countries are clamping down on Huawei’s ability to do business, TIM has offered a windfall.

Aside from the ongoing Chinese witch hunt over in the US, the Australians have banned Huawei from participating in the 5G bonanza and Korean telcos have left the vendor off preferred supplier lists. Just to add more misery, the UK is seemingly joining in on the trend.

In recent weeks, a letter was sent out from the Department of Digital, Culture, Media and Sport, and the National Cyber Security Centre, warning telcos of potential impacts to the 5G supply chain from the Future Telecom Infrastructure Review. China was not mentioned specifically, and neither was Huawei, but sceptical individuals might suggest China would be most squeezed by a security and resilience review.

The rest of the world might be tip-toeing around the big question of China, but this partnership suggests TIM doesn’t have the same reservations.

Nokia gets a bunch more cash from Chinese operators

Nokia is so keen for everyone to know how well it’s doing in China that it makes an announcement every time it wins some business.

Earlier this year we heard all about a ‘framework agreement’ signed with China Mobile that was worth around €1 billion. Today Nokia has announced some more ‘frame agreements’, which are presumably the same thing and refer to a kind of pre-contract that amounts to a formal commitment to do a bunch of business in future.

This time we’re talking €2 billion, but split between all three Chinese MNOs – China Mobile, China Telecom and China Unicom. Presumably the China Mobile bit is fresh cash, not just a recycling of the previous bil. The agreements cover delivery for the next year or so of radio, fixed access, IP routing and optical transport equipment, as well as some SDN and NFV goodness. Nokia is excited by all this transitioning and leveraging.

“We are excited to continue our close collaboration with these important customers in China, to drive new levels of network performance as they transition toward 5G,” said Mike Wang, president of Nokia Shanghai Bell. “Leveraging the breadth of our end-to-end network and services capabilities, we will work closely with China Mobile, China Telecom and China Unicom to deploy technologies that meet their specific business needs.”

It wouldn’t be surprising to see some kind of equivalent announcement by Ericsson before long as the two Nordic kit vendors clearly like to compete over this sort of thing. Not long after its first China Mobile announcement Nokia said it was getting $3.5 billion from T-Mobile US to help out with 5G. Within a few weeks Ericsson had countered with an almost identical announcement of its own.

What does cloud-native really mean for operators?

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Dominik Pacewicz, Chief Product Manager for BSS at Comarch examines the term ‘cloud-native’ and asks what it signifies.

Cloud-native services are disrupting many industries. The telecoms industry, however, has long been outstripped by other sectors in the adoption of new technology. At the same time, service providers see a great opportunity to catapult themselves into the digital age through a spirited combination of cloud-nativeness, 5G networks and virtualization.

The term “cloud-native” is two-faceted. It entails both the technology used as well as the more strategic design aspect, signifying the direction many enterprises want to take with their applications. This strategy would require a broader look at the meaning of cloud-nativeness, going beyond the usual cloud-native “triad” of microservices, containers, and PaaS (Platform as a Service) to include 5G and network virtualization.

Focus on microservices for consistent quality

Microservices are a set of autonomous and loosely-coupled services. The approach is often contrasted with rigid, siloed architectures because microservices are self-contained: they have their own data models, repository and functions, which can be accessed only through their own API. Microservices essentially break down applications into their core functions. In the case of a hypothetical cloud-based streaming platform, these microservices could fulfil separate functions such as search, customer rating, recommendations and product catalogue.

The practice of using microservices comes from the realization that today’s users expect flexible yet consistent experience across all devices, which entails high demand for modular and scalable cloud-based architecture.
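To make the self-containment concrete, here is a minimal sketch of one such service: a hypothetical ‘product catalogue’ microservice from the streaming-platform example above. The class, data and API shape are all invented for illustration.

```python
# A toy "product catalogue" microservice. It owns its data model and
# exposes it only through its own API; sibling services (search,
# recommendations) would hold their own stores and APIs.

import json

class CatalogueService:
    """Owns its repository; callers never touch the store directly."""

    def __init__(self):
        # Private data model, reachable only via the service's API.
        self._items = {
            "tt001": {"title": "Example Film", "genre": "drama"},
            "tt002": {"title": "Example Series", "genre": "comedy"},
        }

    def get_item(self, item_id):
        """The service's public API: a JSON body plus a status code,
        mimicking an HTTP endpoint."""
        item = self._items.get(item_id)
        if item is None:
            return json.dumps({"error": "not found"}), 404
        return json.dumps(item), 200

# A "recommendations" service would call this API rather than the database:
catalogue = CatalogueService()
body, status = catalogue.get_item("tt001")
```

That isolation is the point: because other services can only go through `get_item`, the catalogue can be scaled, replaced or redeployed on its own without breaking the rest of the platform.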

Use containers for service peaks and troughs

Containers are the frameworks used to run individual microservices. They can hold different types of software code, allowing it to run simultaneously over different runtime environments such as production, testing, and integration. Containers make microservice-based applications portable, since they can be created or deleted dynamically. Performance can be scaled up or down with precision to treat bottlenecks – for instance, during Black Friday a CSP can predict the increased demand on its online and offline sales domain and scale it up accordingly, with negligible impact on all the others.

Containers are an essential part of cloud-native architecture because the same container, managed with exactly the same open source tools, can be deployed on any cloud. It will not impact the operator’s virtual servers or computing systems.
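As a rough illustration of the Black Friday scenario, the sketch below shows the kind of per-domain scaling decision an orchestrator might make. The load figures, domain names and capacity numbers are all hypothetical.

```python
# Illustrative only: a toy autoscaling rule. Each domain scales its own
# container count to its own demand, leaving the other domains untouched.

import math

def replicas_needed(requests_per_sec, per_container_capacity=100,
                    min_replicas=2, max_replicas=50):
    """Return how many container replicas one domain needs for its load."""
    needed = math.ceil(requests_per_sec / per_container_capacity)
    return max(min_replicas, min(max_replicas, needed))

# Only the 'sales' domain sees the Black Friday spike.
domains = {"sales": 4200, "billing": 150, "catalogue": 90}
plan = {name: replicas_needed(load) for name, load in domains.items()}
```

Here the spiking sales domain is scaled to dozens of replicas while billing and catalogue stay at their floor of two, which is the “negligible impact on all the others” property described above.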

Utilize PaaS for different capabilities

PaaS provides the foundation for software to be developed or deployed – somewhat similar to the operating system for a server or an entire network. All of this happens online, and PaaS provides an abstraction layer – for networking, storage and computing – that allows the network infrastructure to grow and scale. PaaS creates an environment in which the software, the operating system and the underlying hardware and network infrastructure are all taken care of. The user only has to focus on application development and deployment.

Using PaaS enables the harmonization of all elements of the cloud environment by integrating various cloud services. This in turn leads to virtualized processes of web application development, while developers still retain access to the same tools and standards.

5G is the cloud on steroids

The traditional “triad” of cloud-nativeness is not enough for the perfect, uninterrupted cloud application experience. There’s one asset missing – the 5G network. One reason why 5G is important for cloud-native environments, particularly for mobile cloud app development, is that striking the right balance between efficiency and the number of functionalities is a tough nut to crack. This is due to the high latency and the unreliable connectivity of some mobile devices.

Apart from LAN-like speeds for mobile devices, 5G can deliver lower battery consumption, broader bandwidth, greater capacity (1,000 times that of 4G), and a substantial reduction in latency (as much as 50-fold) – latency being the main limiting factor when working with client-server architectures. What could follow is improved wireless range, increased capacity per cell tower and greater consistency.

The ‘cloud experience’ for mobile devices will be completely reshaped by the adoption of 5G, and mobile cloud applications will rival – or even surpass – their counterparts that rely on corporate LAN connectivity to the desktop, in terms of the number of functionalities they offer.

Bridging the Gap with Network Virtualization

A key innovative element of NFV is the concept of VNF (Virtual Network Functions) forwarding graphs which enable the creation of service topologies that are not dependent on the physical topology. Network virtualization allows operators to allocate resources according to traffic demand. Operators can exert control over the network while managing “slices” of the network, without having to spend on infrastructure upkeep.

For this reason, NFV is leading the evolution of the 5G network ecosystem. Virtualizing the Evolved Packet Core (EPC) has emerged as a leading use case and one of the most tangible examples of the advantages of virtualization. The vEPC abstracts and decomposes the EPC functions, allowing them to run in combinations as COTS software instances. This approach allows CSPs to design networks in new ways that drastically reduce costs and simplify operations. Perfect conditions for 5G.
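The forwarding-graph idea described above can be reduced to a toy sketch: the service topology is just an ordered chain of functions, decoupled from whichever physical hosts happen to run them. The VNF names and the placement map are invented for illustration.

```python
# A toy VNF forwarding graph. The service topology (the ordered chain)
# is defined independently of the physical topology (the placement map).

forwarding_graph = ["firewall", "nat", "deep_packet_inspection", "load_balancer"]

# Physical placement can change without touching the service topology:
# moving a VNF to another rack edits this map, not the graph above.
placement = {
    "firewall": "server-rack-1",
    "nat": "server-rack-1",
    "deep_packet_inspection": "server-rack-3",
    "load_balancer": "server-rack-2",
}

def route(packet, graph):
    """Pass a packet label through each VNF in the graph, in order."""
    for vnf in graph:
        packet = f"{packet}->{vnf}"
    return packet

path = route("pkt", forwarding_graph)
```

This is the property that lets an operator resize or relocate a network “slice” on demand: the service chain its customers see stays the same while the resources beneath it move.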

On the access side, the Cloud Radio Access Network (C-RAN) is a highly complementary technology to vEPC. C-RAN deployment, virtualizing many of the RAN functionalities on standard CPUs, is seen as an important technology enabler for reducing the total cost of ownership (TCO) associated with RAN. The amount of investment and the operational costs are expected to decrease fast thanks to maturing cloud technologies and deployment experience. The C-RAN approach facilitates faster radio deployment, drastically reducing the time needed for conventional deployments.

In the race to 5G, telcos are steadily introducing function virtualization to gain software control over their networks. C-RAN and vEPC both help to create bespoke data pathways that meet highly specified network requirements of applications – staying true to 5G‘s vision.

The power of now

So, what does ‘cloud-native’ mean for operators? All the interdependencies between the cloud and the enabling technologies mean that the true cloud-native experience involves more than just the traditional “triad” of microservices, containers, and PaaS. Network virtualization and 5G are key elements in the search for efficient, uninterrupted and feature-rich cloud-based services and applications, and will make previously impossible cloud-native use cases feasible.

Thanks to operators who experimented with virtualization and conducted early 5G trials, telcos will be the first to have all the necessary technology in place to succeed in the cloud. Will operators take full advantage of this head start – or will they once again be beaten to the finish line and fail to capitalize on the technology they championed?

 

Dominik Pacewicz is the head of BSS product management at Comarch. He has been with Comarch for over six years and works with a number of mobile operators, helping them to simplify and automate their networks.

Nokia launches some actual applications for SDN

All the hype surrounding software-defined networking is finally starting to yield some tangible results in the form of three apps from Nokia.

Deciding to kill two buzzwords with one stone, Nokia is claiming its new WaveSuite open applications will jump-start optical network digital transformation. It consists of three apps: Service Enablement, Node Automation and Network Insight. The point of these apps is apparently to offer businesses a new degree of access to networks that is expected to yield novel commercial opportunities.

To help us get our heads around this new piece of networking arcana we spoke to Kyle Hollasch, Director of Marketing for Optical Networking at Nokia. He was most keen to focus on the service enablement app, which he said is “the first software that tackles the issue of resell hierarchy.”

Specifically we’re talking about the reselling of fixed line capacity. This app is designed to massively speed up the capacity reselling process, with the aim of turning it into a billable service. The slide below visualises the concept, in which we have the actual network owner at the base and then several levels of capacity reselling, allowing greater degrees of specialisation and use-case specific solutions.

Nokia WaveSuite slide 1

The node automation app allows network nodes to be controlled via an app on a smartphone, thanks to the magic of SDN. In fact this would appear to be the epitome of SDN as it’s only made possible by that technology. The slide below shows how it is, at least in theory, possible to interact with a network element via a smartphone, which also opens up the ability to use other smartphone tools such as the GPS and camera.

Nokia WaveSuite slide 2

The network insight app seems to do what it says on the tin, so there doesn’t seem to be the need for further explanation at this stage. “These innovations are the result of years of working closely with our customers to address all aspects of optical networking with open applications enhancing not just operations, but opening up new services and business models,” said Sam Bucci, Head of Optical Networks for Nokia.

As a milestone in the process of virtualizing networks and all the great stuff that’s supposed to come with that, the launch of actual SDN apps seems significant. Whether or not the market agrees and makes tangible business use of these is another matter, however, and only time will tell if good PowerPoint translates into business reality.

BBWF 2018: Consumers don’t care about tech, just connectivity – BT

Today’s consumer is demanding but indifferent. They don’t care about mobile or broadband or wifi, just top-line connectivity. To meet these demands, BT has pointed to network convergence.

Speaking at Broadband World Forum, Howard Watson, BT’s CTIO, outlined the bigger picture. It’s all about convergence: the dividing lines between wireless and fixed, or hardware and software, are blurred, and connectivity is viewed as a single concept – bringing together network design, technology convergence and customer insight to create a single software-orientated network for device-neutral connectivity.

“For the consumer, it’s not about their wifi, or their mobile connection, or their fixed broadband, or even their landline,” said Watson. “It’s about connectivity as a whole. And I’m pleased to say we’re already making strong progress here.”

Of course, it wouldn’t be a telco conference without mentioning 5G, and this is a critical component of the BT story. Trials have already begun in East London, though over the next couple of days 10 additional nodes will be added to expand the test. Plans are already underway to launch a converged hardware portfolio, introduce IP voice for customers and create a seamless wifi experience. All of this will be built on a single core network.

But what does this mean for the consumer? Simplicity in the simplest of terms.

The overall objective is to create a seamless connectivity experience which caters to the consumer’s disinterest in anything but being connected. Soon enough, devices will be able to automatically detect and select the best connectivity option, whether wifi or cellular for example, essentially meaning consumers will not have to check anything on their devices. Gone will be the days where you have to worry about your device clinging onto a weak wifi signal or being disrupted by a network reaching out to your device, according to Watson. Signing in will become a distant memory as the consumer seamlessly shifts from wifi to mobile.
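A toy version of that selection logic might look like the sketch below. The bearer names and quality scores are invented, and a real implementation would weigh far more signals (cost, policy, predicted throughput) than raw link quality.

```python
# Illustrative only: a device picks whichever available bearer currently
# offers the strongest link, with no user action or sign-in required.

def pick_bearer(candidates):
    """candidates: {bearer_name: signal_quality in 0..1}.
    Return the name of the strongest available bearer."""
    return max(candidates, key=candidates.get)

# The "weak wifi" scenario: the device quietly prefers cellular.
choice = pick_bearer({"wifi": 0.35, "cellular": 0.8})
```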

This is of course a grand idea, and there is still a considerable amount of work to be done. Public wifi is pretty woeful as a general rule, and mobile connectivity is patchy in some of the busiest and remotest regions in the UK, but in fairness to BT, it does look like a sensible and well thought out plan.

With telcos increasingly being treated as utilities, these organizations need to start adding value to the lives of the consumer. Connectivity is not enough anymore, as it has become a basic expectation rather than a luxury in today’s digitally-defined society; providing the seamless experience might just be one way BT can prove its value. Fortunately, with its broadband footprint, EE’s mobile network and 5000 public wifi spots throughout the UK, BT is in a strong position to make the converged network dream a reality.

US Senators hit out at India’s pro-privacy and localisation laws

Two US Senators have signed a letter addressed to India’s Prime Minister Narendra Modi suggesting new rules to tighten up data practices in the country could lead to a weakened trade relationship with the US.

The US Government has already shown the damage which can be done when it starts throwing around economic sanctions and hurdles, almost sending ZTE to join the Dodo on the extinction list, and it appears to be using the same tactics here. However, instead of punishing an organization which broke trade laws, it is attempting to bully a country into its own line of thinking and away from a pro-privacy stance.

“We see this (data localization) as a fundamental issue to the further development of digital trade and one that is crucial to our economic partnership,” the letter signed by Senators John Cornyn and Mark Warner states. The Senators serve as co-chairs of the Senate’s India caucus.

The letter, seen by Reuters, relates to new data protection, privacy and localisation rules which are set to come into play this week (October 15). The rules have been in the making for some time, and while there are some very suspect clauses, this is an attempt to tame the wild-west internet in the country, applying regulations which should be deemed more acceptable for the digital economy.

Back in July, the Indian Government unveiled a report which detailed its new approach to data regulations in the country. Included in the rules are restrictions on how data can be collected and utilised, setting out a similar stance to GDPR in Europe, while also including new approaches such as the right to be forgotten, explicit opt-in consent for certain categories of data (that which is deemed sensitive), and also data localisation. It is a much more stringent approach to the data economy, taking India closer to the European stance on privacy than the US’ views.

Aside from the data protection and privacy benefits of localisation, and not to mention greater influence for the Indian Government, such a strategy also stimulates the economy. Local jobs will have to be created and new data centres will have to be constructed to meet the rising demands of the increasingly digital Indian economy and society. These are clearly benefits for the country, though the threat of an impact on US trade will certainly be a worry for India.

Over the course of 2017, India exported $34.83 billion worth of goods and services to the US. This figure accounted for 16% of the country’s total exports, making the US its largest trading partner. The US Government certainly does have leverage to coerce India into its own way of thinking.

The letter from the Senators also happens to coincide with some pretty heavy lobbying from the likes of Visa, Mastercard and American Express. All would certainly find life simpler if there was no such thing as localisation, though it seems lobbying Senators to fight the cause has been more effective than efforts to persuade Indian officials to head a different direction. The new rules seem to have been influenced by Europe’s GDPR, though the US, both the Government and companies, have a different approach to data than the pro-privacy Europeans.

With India’s economy fast evolving from analogue to digital, there certainly will be profits to be made. Many US companies, most notably those in Silicon Valley, will be looking greedily at the country though such rules would make life more difficult. Not impossible, but not as simple. Perhaps the economic weight of the US Government can bully India into believing the ‘American Dream’.

Virtualization and the deployment and operation of 5G networks

Telecoms.com periodically invites expert third parties to share their views on the industry’s most pressing issues. In this piece John English, Director of Marketing, Service Providers Solutions at Netscout offers a quick overview of the need for virtualization with 5G.

While 5G undoubtedly holds enormous potential, meeting its demands for increased speed, performance, scalability, and flexible service deployment is likely to result in untenable complexity and OpEx for service operators.

The only truly affordable and practical way to operate a 5G network is to virtualise and automate, change network design, and increasingly manage the network and services from the edge. The application of virtualisation technologies such as NFV and SDN to 5G networks is therefore essential if 5G is to be deployed at a reasonable cost.

Many service providers are already adopting NFV and SDN as a means of boosting efficiencies, launching services faster, and supporting a wider range of applications. Indeed, McKinsey has estimated that the newest technologies in NFV and SDN would let operators lower their capital expenditures by up to 40%, and their network operating expenses by a similar amount.

Are any deployments under way?

Yes. Major service providers in the US, Europe, and South Korea, for example, are well advanced in their testing and initial network deployments. Verizon, AT&T, and Korea Telecom are all utilising the 1Gbps capability of 5G to provide services such as fixed mobile broadband. In addition to enabling valuable enterprise applications, this supports their SD-WAN deployments which, due to their ability to flexibly instantiate new services, are increasingly being seen by service providers as a way of monetising 5G, even at this early stage.

Is that the full extent of it?

Right now, both virtualisation and 5G are being deployed on a crawl, walk, and run basis. We’re still very much in the crawl phase, with most service providers tentatively deploying both technologies in contained parts of their networks and businesses so that they can understand how they work, learn how to manage them and understand how to deploy them most effectively.

With a myriad of new use cases and technologies, and fragmented standards around how VNFs are introduced and orchestrated in the network, 5G is filled with unknowns. The relative immaturity of both 5G and virtualisation is therefore currently serving as a barrier to full-scale adoption.

How can we overcome this barrier?

5G and virtualisation each rely on a series of other technologies for successful deployment. Both require tools that enable intelligent visibility into the network in order to generate smart data with clear, actionable insights that will feed automated systems, manage performance, and control automation. Power is nothing without control, after all, and accurate control is fundamental to the management of a virtualised network.

Fortunately, these tools exist in the form of virtualised software designed to gather data, analyse it, and present it in a way that it can be actioned by a service provider’s other systems. Critically, the days of using expensive hardware probes to reactively report on network performance are over. They may still be applicable in trial phases during which a new network is established, but only software-based network assurance will be capable of scaling up to handle the volume of data involved as the scale and scope of a 5G deployment heads towards the running phase.

What will happen when 5G is up and running?

The exact nature of the services of the future is still unclear. Despite all the talk about instantaneously downloading 4K video, or enabling connected cars and supporting a plethora of new devices and services with the Internet of Things, the killer apps for 5G are yet to emerge.

What is apparent, though, is that there are universal requirements that will remain the same regardless of what the future holds. The effective running of services and applications on 5G will rely on visibility into the network, and gaining insights that will enable proactive, as opposed to reactive, responses to any network issues.

 

John is Director of Marketing, Service Providers Solutions at Netscout, focusing on NFV/SDN, IoT/DX, Big Data Analytics and 5G. Before joining Netscout John held product management and marketing positions at Empirix and Tekelec. John has over 25 years of telecom experience covering 4G/3G/2G mobile, cable and fixed line technologies.

Nokia plugs openness ahead of Broadband World Forum

Open is one of 2018’s buzzwords and Nokia is cashing in on the bonanza ahead of Broadband World Forum in a couple of weeks.

This is only the first of several announcements from the Finns, but it builds on the fibre connectivity and virtualisation foundations set last year. The first installment is focused on fixed access network slicing and multi-vendor optical network units (ONU).

Starting with the network slicing piece, the team plan to launch a fully open and programmable network slicing solution for fixed access networks. While the buzz for network slicing has been primarily focused on the mobile side of telecommunications, Nokia’s Head of Fixed Networks Marketing Stefaan Vanhastel told Telecoms.com the solution can be just as effective in fixed access.

“Yes network slicing is a hot topic for 5G, but we are now starting to see the benefit for slicing in a fixed network,” said Vanhastel. “Operators can use residential network for 5G transport – why not, you already have a network and can save up to 50% of deployment costs. Why not use the same infrastructure for residential broadband, enterprise customers and 5G transport.”

In the same way network slicing can be used to create several virtual networks in the wireless business, why not do it in fixed access? Not only does it allow telcos to more efficiently plan for the world of 5G transport, while simultaneously serving a variety of customers, it opens up a host of new deployment models.

Vanhastel highlighted that there are several non-traditional players building their own networks – individual cities or national governments, for example – though these are not the people you would want running telco services. Local authorities have plenty of experience from a civil engineering perspective, digging the trenches and deploying the networks, but with network slicing capabilities several virtual networks can be created to bring in the right expertise to deliver the services.

This is one idea which will aid the deployment of future-proof networks, though network slicing could also help co-operative efforts and co-investment from competitors. The physical deployment of the network can be shared between any number of telcos, with each then claiming their own ‘slice’ which can be managed and configured independently. Openness and collaboration seem like a nice idea, but few competitors can play nice unfailingly; with network slicing they only have to for a set period of time (in theory) before turning their attention back to their own business.
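As a back-of-the-envelope sketch of that co-investment model, one physical access network could be carved into independently managed slices. The operator names, shares and capacity figure below are all invented for illustration.

```python
# Illustrative only: one shared physical fixed-access network, several
# independently managed slices, each with its own operator and share.

physical_network = {"capacity_gbps": 100}

slices = {
    "telco_a_residential": {"share": 0.4, "managed_by": "Telco A"},
    "telco_b_enterprise":  {"share": 0.3, "managed_by": "Telco B"},
    "city_5g_transport":   {"share": 0.3, "managed_by": "City authority"},
}

def slice_capacity(name):
    """Capacity available to one slice of the shared network."""
    return physical_network["capacity_gbps"] * slices[name]["share"]

cap = slice_capacity("telco_a_residential")  # Telco A's share of the fibre
```

Each party configures and operates only its own slice, so the co-operation that matters is confined to the shared physical build.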

Secondly, Nokia has launched Multivendor ONU connect, which it claims is the first fully open, virtualised solution that allows telcos to connect any optical network unit (ONU). The solution takes a ‘driver’ approach to how telcos deploy and manage ONUs, allowing for ‘plug and play’ functionality. As part of Nokia’s Altiplano open programmable framework, software is decoupled to allow the ONU management to be virtualised. An open-API framework allows third-party stacks to be on-boarded in a more time-efficient manner.

The approach will offer telcos the opportunity to realise the benefits of interoperability, connecting any modem to an access platform and potentially removing the painstaking task of integration. Vanhastel said that once the whole management infrastructure is virtualized, it would be possible to connect any fibre modem to access networks without the hassle, while updates or new ONUs can be quickly introduced through software upgrades.

Broadband World Forum might still be a couple of weeks away, but the Nokia marketing message is clear: simplicity and openness.

Microsoft looks to take Xbox experience onto mobile

Microsoft has announced the launch of Project xCloud to take the world of Xbox gaming onto mobile.

The idea is a relatively simple one. Gaming is traditionally a better experience on consoles which are specifically designed for gaming, but Microsoft wants to take this experience into the mobile world of tablets and smartphones. Trials will start next year and will allow gamers to take the same content from games built for the Xbox console and PC onto their smartphones, using a Bluetooth enabled handset or an under-development touch overlay.

“The future of gaming is a world where you are empowered to play the games you want, with the people you want, whenever you want, wherever you are, and on any device of your choosing,” said Kareem Choudhry, Corporate VP of Gaming Cloud at Microsoft. “Our vision for the evolution of gaming is similar to music and movies – entertainment should be available on demand and accessible from any screen. Today, I’m excited to share with you one of our key projects that will take us on an accelerated journey to that future world: Project xCloud.”

Compatibility with existing and future Xbox games has been enabled by building out custom hardware in Microsoft data centres. The team have architected a new customizable blade that can host the component parts of multiple Xbox One consoles, as well as the associated infrastructure supporting it. The custom blades will be scaled out through the Azure cloud regions over time.

Currently, the test experience is running at 10 Mbps, though the team are keen to bring this down while still maintaining the same experience for gamers through advances in networking topology, and video encoding and decoding. The idea is to ensure these games can be played on 4G networks, though getting the bitrate down might be a tough ask considering the depth and interactivity of the content on consoles such as Xbox.
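Some quick arithmetic shows why that 10 Mbps figure matters on mobile: sustained for an hour of play, it chews through a sizeable chunk of a typical 4G data allowance.

```python
# Back-of-the-envelope: data consumed by a sustained game stream.

def gigabytes_per_hour(mbps):
    """Convert a sustained bitrate in megabits/s to gigabytes per hour."""
    bits_per_hour = mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1e9  # bits -> bytes -> gigabytes

usage = gigabytes_per_hour(10)  # the current xCloud test bitrate
```

At 10 Mbps that works out to 4.5 GB for a single hour of play, which is why bringing the bitrate down matters so much for a 4G audience.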

One thing is very clear; gaming is just another aspect of the mobile world which is pressing the case for 5G.

Microsoft has an ambition to ensure this content will be able to meet consumer experience demands on 4G networks, though this is a selfish view on networking. These games are incredibly immersive and will place additional strain on the network. For the telcos, the issue is not the singular demands of browsing, video or gaming, but the sum of all the parts. Gaming is just another item which has been thrown on top of the teetering pile of network strain. The efficiency gains of 5G will soon become a necessity, not the buffering-free cat video gains of today.

Looking at the gaming industry, growth is gaining momentum fast. Research from Newzoo suggests mobile gaming will generate $70.3 billion across 2018, accounting for roughly 51% of the industry total. This equates to 25% year-on-year growth, compared to 4.1% growth on consoles, such as Xbox, which is expected to account for $34.6 billion. Mobile’s share of gaming is expected to increase to 59% by 2021, taking $106.4 billion. Asia will account for the majority of this spend, though the gains will be experienced in every region.

An excellent example of the surge of mobile gaming is Fortnite. While this might be a title most play through consoles or on PC, the most recent update for the game saw a 60% surge in data traffic over normal peak levels on Verizon’s broadband network, as well as a 5-8% jump on mobile.

The tsunami of mobile gaming titles over the last 4-5 years has improved the accessibility of gaming for the general public, though the complexity of these games is also growing. While this segment of mobile content might have been simplistic to start with – think of Candy Crush – more in-depth games are becoming increasingly popular with the general public. The proportion of games which require constant connectivity is also increasing. Should the Microsoft project prove successful, both in terms of operation and adoption, these trends will only be accelerated.

Gaming is no longer a niche, and pretty soon it will start to weigh heavily on the network.