Italians clearly aren’t that suspicious of Huawei

Despite governments around the world turning against Chinese vendors, Telecom Italia has agreed a new partnership with Huawei based on Software Defined Wide Area Network (SD-WAN) technology.

As part of a strategy aimed at evolving TIM’s network solutions for business customers, Huawei’s SD-WAN technology will be incorporated to create a new TIM service model which will allow customer companies to manage their networks through a single console.

“Today, more than ever, companies need networks that can adapt to different business needs over time, in particular to enable Cloud and VoIP services,” said Luigi Zabatta, Head of Fixed Offer for TIM Chief Business & Top Clients Office. “Thanks to the most advanced technologies available, these networks can be managed both jointly and by customers themselves through simple tools.

“The partnership with Huawei allows us to expand our value proposition for companies and to enrich our offer through the adoption of a technological model that is increasingly and rapidly emerging in the ICT industry.”

The partnership is a major win for Huawei considering the pressure the firm must be feeling as suspicions are piqued around the world. Just as more countries are clamping down on Huawei’s ability to do business, TIM has offered it a windfall.

Aside from the on-going Chinese witch hunt over in the US, the Australians have banned Huawei from participating in the 5G bonanza and Korean telcos have left the vendor off preferred supplier lists. Just to add more misery, the UK is seemingly joining in on the trend.

In recent weeks, a letter was sent out from the Department for Digital, Culture, Media and Sport and the National Cyber Security Centre, warning telcos of potential impacts on the 5G supply chain from the Future Telecoms Infrastructure Review. China was not mentioned specifically, and neither was Huawei, but sceptical individuals might suggest China would be most squeezed by a security and resilience review.

The rest of the world might be tip-toeing around the big question of China, but this partnership suggests TIM doesn’t have the same reservations.

What to bin and what to keep; the big data conundrum

Figuring out what data is valuable and binning the rest has been a challenge for the telco industry, but here’s an interesting dilemma: how do you know the unknown value of data for the use cases of tomorrow?

This was broadly one of the topics of conversation at Light Reading’s Software Defined Operations & the Autonomous Network event in London. Everyone in the industry knows that data is going to be a big thing, but the influx of questions is almost as overwhelming as the number of data sets available.

“90% of the data we collect is useless 90% of the time,” said Tom Griffin of SevOne.

This opens the floodgates of questions. Why do you want to collect certain data sets? How frequently are you going to collect the data? Where is it going to be stored? What are the regulatory requirements? How in-depth does the data need to be for the desired use? What do you do with the redundant data? Will it still be redundant in the future? What is the consequence of binning data which might become valuable in the future? How long do you keep information in the hope it will one day become useful?

For all the promise of data analytics and artificial intelligence in the industry, the telcos have barely stepped off the starting block. For Griffin and John Clowers of Cisco, identifying the specific use case is key. While this might sound very obvious, it’s amazing how many people are still floundering; but once the use case has been identified, machine learning and artificial intelligence become critically important.

As Clowers pointed out, with ML and AI, data can be analysed in near real-time as it is collected, assigned to the right storage environment (public, private or traditional, depending on regulatory requirements) and then sent on to the right data lake or pond (depending on the purpose for collecting the data in the first place). With the right algorithms in place, the process of classifying and storing information can be automated, freeing up engineers’ time to add value while also keeping an eye on costs. With the sheer volume of information being collected increasing very quickly, storage costs could rise rapidly.
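
As a rough illustration of what that automated triage could look like (a minimal sketch only; the record fields, storage tiers and lake names are entirely hypothetical), a routing rule can combine regulatory constraints with the use case the data was collected for:

```python
# Minimal sketch of automated data triage: all names are illustrative.
from dataclasses import dataclass

@dataclass
class Record:
    source: str
    contains_pii: bool   # regulatory flag decides the storage environment
    use_case: str        # why the data was collected in the first place

def route(record: Record) -> tuple:
    # Regulated data stays on private infrastructure; the rest can go public.
    environment = "private" if record.contains_pii else "public"
    # Data with a known purpose goes to its lake or pond; everything else is
    # parked in cheap cold storage in case it becomes valuable later.
    lake = {
        "billing": "billing-lake",
        "network-telemetry": "ops-pond",
    }.get(record.use_case, "cold-archive")
    return environment, lake

print(route(Record("probe-7", contains_pii=False, use_case="network-telemetry")))
# ('public', 'ops-pond')
```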

And this is before the 5G and IoT trends have really kicked in. If telcos are struggling with the data demands of today, how are they going to cope with the tsunami of information which is almost guaranteed in tomorrow’s digital economy?

Which brings us back to the original point. If you have to be selective with the information you keep, how do you know what information will be valuable for the use cases of tomorrow? And what will be the cost of not having this data?

What does cloud-native really mean for operators?

Telecoms.com periodically invites third parties to share their views on the industry’s most pressing issues. In this piece Dominik Pacewicz, Chief Product Manager for BSS at Comarch examines the term ‘cloud-native’ and asks what it signifies.

Cloud-native services are disrupting many industries. The telecoms industry, however, has long been outstripped by other sectors in the adoption of new technology. At the same time, service providers see a great opportunity to catapult themselves into the digital age through a spirited combination of cloud-nativeness, 5G networks and virtualization.

The term “cloud-native” is two-faceted. It entails both the technology used and the more strategic design aspect, signifying the direction many enterprises want to take with their applications. This strategy requires a broader look at the meaning of cloud-nativeness, going beyond the usual cloud-native “triad” of microservices, containers and PaaS (Platform as a Service) to include 5G and network virtualization.

Focus on microservices for consistent quality

Microservices are autonomous, loosely coupled services, often contrasted with rigid, siloed architectures. Each microservice is self-contained: it has its own data model, repository and functions, which can be accessed only through its own API. Microservices essentially break down applications into their core functions. In the case of a hypothetical cloud-based streaming platform, these microservices could fulfil separate functions such as search, customer ratings, recommendations and the product catalogue.
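
To make the “own API” point concrete, here is a minimal sketch of one such self-contained microservice, written in Python with Flask; the service name, data and endpoint are invented for illustration:

```python
# A toy "recommendations" microservice: it owns its data and exposes a
# single API; other services (search, ratings, catalogue) never touch
# its repository directly.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the service's own data model and repository.
RECOMMENDATIONS = {"user42": ["The IT Crowd", "Halt and Catch Fire"]}

@app.route("/recommendations/<user_id>")
def recommendations(user_id):
    return jsonify(RECOMMENDATIONS.get(user_id, []))

if __name__ == "__main__":
    app.run(port=5001)  # each microservice runs and scales independently
```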

The practice of using microservices comes from the realization that today’s users expect a flexible yet consistent experience across all devices, which creates high demand for modular and scalable cloud-based architecture.

Use containers for service peaks and troughs

Containers are the frameworks used to run individual microservices. They can hold different types of software code, allowing it to run simultaneously across different runtime environments such as production, testing and integration. Containers make microservice-based applications portable, since they can be created or deleted dynamically. Performance can be scaled up or down with precision to treat bottlenecks – for instance, during Black Friday a CSP can predict the increased demand for its online and offline sales and scale up the affected domain alone, with negligible impact on all the others.
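
In a container orchestrator such as Kubernetes, that kind of targeted scaling is a one-call operation. Here is a minimal sketch using the official Kubernetes Python client; the deployment name, namespace and replica count are hypothetical:

```python
# Scale up only the containerised sales service ahead of a Black Friday
# peak; deployments for every other domain are left untouched.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() inside a cluster
apps = client.AppsV1Api()

apps.patch_namespaced_deployment_scale(
    name="online-sales",    # hypothetical deployment name
    namespace="prod",       # hypothetical namespace
    body={"spec": {"replicas": 12}},
)
```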

Containers are an essential part of cloud-native architecture because the same container, managed with exactly the same open source tools, can be deployed on any cloud without impacting the operator’s virtual servers or computing systems.

Utilize PaaS for different capabilities

PaaS provides the foundation for software to be developed or deployed – somewhat like an operating system for a server or an entire network. All of this happens online: PaaS provides an abstraction layer for networking, storage and compute that allows the network infrastructure to grow and scale. PaaS creates an environment in which the software, the operating system and the underlying hardware and network infrastructure are all taken care of. The user only has to focus on application development and deployment.

Using PaaS enables the harmonization of all elements of the cloud environment by integrating various cloud services. This in turn leads to virtualized processes of web application development, while developers still retain access to the same tools and standards.

5G is the cloud on steroids

The traditional “triad” of cloud-nativeness is not enough for the perfect, uninterrupted cloud application experience. There’s one asset missing – the 5G network. One reason why 5G is important for cloud-native environments, particularly for mobile cloud app development, is that striking the right balance between efficiency and the number of functionalities is a tough nut to crack, due to the high latency and unreliable connectivity of some mobile devices.

Apart from LAN-like speeds for mobile devices, 5G can deliver lower battery consumption, broader bandwidth, greater capacity (1,000 times that of 4G) and a substantial reduction in latency (as much as 50-fold). Latency is the main limiting factor when working with client-server architectures. What could follow is improved wireless range, increased capacity per cell tower and greater consistency.

The ‘cloud experience’ for mobile devices will be completely reshaped by the adoption of 5G, and mobile cloud applications will rival – or even surpass – their equivalents that rely on corporate LAN connectivity to the desktop in terms of the functionality they can offer.

Bridging the gap with network virtualization

A key innovative element of NFV is the concept of VNF (Virtual Network Function) forwarding graphs, which enable the creation of service topologies that are not dependent on the physical topology. Network virtualization allows operators to allocate resources according to traffic demand. Operators can exert control over the network while managing “slices” of it, without having to spend on infrastructure upkeep.
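
A toy model helps show why forwarding graphs decouple the service from the physical network: in the sketch below the service chain is fixed, while the VNF-to-host placement (all names hypothetical) can change underneath it without touching the topology the service sees:

```python
# Toy VNF forwarding graph: the service chain is defined independently
# of which physical host each function happens to run on.
SERVICE_CHAIN = ["firewall", "nat", "vepc-gateway"]

# Placement can be rewritten at runtime (e.g. to follow traffic demand)
# without changing the service topology above.
PLACEMENT = {"firewall": "host-a", "nat": "host-a", "vepc-gateway": "host-c"}

def forward(packet: dict) -> dict:
    for vnf in SERVICE_CHAIN:
        print(f"{vnf:>12} handling packet on {PLACEMENT[vnf]}")
        # ... each VNF would transform the packet here ...
    return packet

forward({"src": "10.0.0.1", "dst": "8.8.8.8"})
```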

For this reason, NFV is leading the evolution of the 5G network ecosystem. Virtualizing the Evolved Packet Core (EPC) has emerged as a leading use case and one of the most tangible examples of the advantages of virtualization. The vEPC abstracts and decomposes the EPC functions, allowing them to run in various combinations as software instances on COTS hardware. This approach allows CSPs to design networks in new ways that drastically reduce costs and simplify operations – perfect conditions for 5G.

On the access side, the Cloud Radio Access Network (C-RAN) is a highly complementary technology to vEPC. C-RAN deployment, which virtualizes many of the RAN functions on standard CPUs, is seen as an important enabler for reducing the total cost of ownership (TCO) associated with the RAN. Investment and operational costs are expected to fall quickly thanks to maturing cloud technologies and deployment experience. The C-RAN approach also facilitates faster radio deployment, drastically reducing the time needed for conventional rollouts.

In the race to 5G, telcos are steadily introducing function virtualization to gain software control over their networks. C-RAN and vEPC both help to create bespoke data pathways that meet the highly specific network requirements of applications – staying true to 5G’s vision.

The power of now

So, what does ‘cloud-native’ mean for operators? All the interdependencies between the cloud and its enabling technologies mean that the true cloud-native experience involves not just the traditional “triad” of microservices, containers and PaaS. Network virtualization and 5G are key elements in the search for efficient, uninterrupted and feature-rich cloud-based services and applications, and they will make previously impossible cloud-native use cases feasible.

Thanks to operators who experimented with virtualization and conducted early 5G trials, telcos will be the first to have all the necessary technology in place to succeed in the cloud. Will operators take full advantage of this head start – or will they once again be beaten to the finish line and fail to capitalize on the technology they championed?

 

Dominik Pacewicz is the head of BSS product management at Comarch. He has been with Comarch for over six years and works with a number of mobile operators, helping them to simplify and automate their networks.

Chip division continues to carry Samsung

Samsung has released its quarterly numbers, and while it is an improvement on the last quarter, the business is seemingly being propped up by a surging semiconductor unit.

Total revenues for the three months stood at roughly $57 billion, a 5.5% increase from the same period in 2017, while operating profit came in at roughly $15.5 billion, a year-on-year increase of 20.9%. The earnings were largely in line with the expectations the management team floated a few weeks back.

“In the third quarter, operating profit reached a new quarterly high for the company driven mainly by the continued strength of the Memory Business,” the team said in a statement. “Total revenue increased YoY and QoQ on the back of strong sales of memory products and OLED panels.

“The Korean won remained weak against the US dollar, resulting in a positive QoQ effect of approximately KRW 800 billion, experienced mainly in the components businesses. However the Korean won rose against major emerging currencies, which weighed slightly on the set businesses.”

Looking at the individual business units, the chip team rose to the top of the rankings once again. Revenues came in at roughly $22 billion for the quarter, with profit standing at $12 billion. Although demand is set to be weaker next quarter, the team anticipates slight increases over the next twelve months as demand from the public cloud market and for mobile storage expands.

With fingers pointing to increased competition, revenues fell in the IT & Mobile Communications division, with overall smartphone shipments remaining flat due to a decrease in sales of mid- to low-end products. High promotional costs and fluctuating currencies have been blamed for a dip in profitability, with the division contributing only $1.9 billion despite claiming pretty much the same revenues as the chip boys.

Another unit worth keeping an eye on is the Networks unit. While revenues were down year-on-year, owing to decreased investment in 4G and the 5G euphoria yet to kick in, Samsung does seem to be benefiting from the increased scrutiny placed on Huawei in recent months. With many telcos snubbing Huawei, or at least decreasing their dependence on the vendor, Samsung could certainly take advantage.

With Huawei and Xiaomi posing a more sustained threat in markets where Samsung traditionally dominates, this might not be the end of the woes for the star-studded division of Samsung.

How easy will it be for IBM to digest Red Hat?

Imagine our surprise the day before a lunch scheduled with the CTO of Red Hat when IBM announced it was buying his company.

Sometimes the journalism gods, those that are left, smile on even the humblest of hacks and today it was our turn. Lunch with Chris Wright (pictured below, with said hack) had already been arranged with the promise of delivering the kind of light Linux chit-chat over a glass of red that we all secretly crave. But then, out of the blue (pun intended) IBM announced it’s going to buy Red Hat for $34 billion and things suddenly got a bit more spicy.

Now, you don’t get to be the CTO of a major company by speaking injudiciously to the press, so we didn’t expect Wright to have much to say on the relative merits of the acquisition itself. Instead we wanted to know more about what Red Hat brings to the table, such that a venerable tech giant would want to drop such a serious chunk of change on it.

The core of Red Hat’s product strategy for the past few years has been the hybrid cloud. In its simplest terms this refers to the use of both private, on-premise server capacity and the public cloud as found in colossal data centers provided by the likes of AWS, Microsoft and Google. Increasingly this applies to pretty much all larger enterprises so it’s a pretty important place to be if you’re serious about the B2B tech space.

Sharing this writer’s love of a pun, Wright conceded that the cloud is a nebulous term, but that’s why you need companies that have made it their business to get their heads around it, such as Red Hat. IBM is, and always has been, a B2B tech company, so it’s easy to see why it would want to buy a company that specialises in one of the most important and arcane manifestations of that.

Everyone in tech has probably had to puzzle over one of those baffling software architecture slides that attempt to explain how everything fits together via the use of endless rectangles piled on top of each other like some geeky game of Jenga. Throw hundreds of those into a virtualised environment spanning any number of actual physical locations and you get somewhere close to the kind of challenge faced by today’s CTO.

Between the cloud and the cloud user lies an extended value chain of technologies and services dedicated to making that relationship as useful and intuitive as possible. One good example of this is the banking app, through which anyone can now whizz thousands of pounds around the world in an instant. For this to be made possible a hell of a lot of robust technologies have to exist between the bank’s servers and the client device.

According to Wright, Red Hat plays across that whole value chain, so for that reason alone it’s easy to see its appeal to IBM. But Red Hat is also deeply rooted in the Linux, open-source culture, which isn’t necessarily an obvious fit with IBM’s notoriously rigid corporate philosophy. As with so much M&A, how effectively the cultures of the two organisations are reconciled will be the single most important factor in determining whether this deal goes down smoothly or results in corporate indigestion.

Chris Wright Red Hat Telecoms

The edge will make us cash, but we need to be quick – DT

With telcos searching for elusive return on investment in a 5G world, edge computing could offer some relief.

Speaking at Total Telecom Congress in London, Arash Ashouriha, SVP of Technology Architecture & Innovation at Deutsche Telekom, pointed towards the edge as a way to recapture the lost fortunes of yesteryear – but telcos had better move quickly or those crafty OTTs will swoop in again.

“With 5G, of course the consumer will benefit, but the challenges are mainly with enterprises; how you build specific solutions on one network, using network slicing,” said Ashouriha.

“What is the real opportunity for operators in avoiding becoming a dumb pipe? It’s going to be the edge. Whatever happens, only the operator can provide the next POP on the mobile phone, never forget that. Are we able to monetize this? If you want to achieve low latency, you have to process the traffic where it is being generated, not 100km away. From an operator perspective, this gives a huge opportunity to leverage the network.”

The theory here is simple. For those use cases which require near real-time transactions – autonomous driving or robotic surgery, for example – low latency is critical. Unfortunately, the speed at which data can be moved has a limit (that is just physics), so in order to guarantee low latency, processing power has to be moved closer to the event. This is an excellent opportunity for the telcos to make money.
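
A back-of-envelope calculation shows why distance matters: light in optical fibre travels at roughly two-thirds of its vacuum speed, around 200,000 km per second, so geography alone puts a hard floor under round-trip time no matter how fast the servers are. The distances below are illustrative:

```python
# Propagation delay floor: no amount of server horsepower beats physics.
SPEED_IN_FIBRE_M_PER_S = 2e8  # roughly two-thirds of c, approximate

def rtt_floor_ms(distance_km: float) -> float:
    """Best-case round-trip time over fibre, before any processing."""
    return 2 * (distance_km * 1000) / SPEED_IN_FIBRE_M_PER_S * 1000

print(rtt_floor_ms(100))  # ~1.0 ms to a data centre 100 km away
print(rtt_floor_ms(10))   # ~0.1 ms to an edge site 10 km away
```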

However, there is only a small window of opportunity; Ashouriha thinks it might only be two or three years. If the telcos do not take advantage and create a business to capitalise on the edge, the OTTs will swoop in and reap the rewards. All the likes of Google or AWS need in order to build such a business is a partnership with one MNO in a market; then the cloud players can leverage the power of their cloud assets to build the case for low latency.

This is the challenge for the telcos; the opportunity is there and very apparent, but are they swift enough to capitalise on it? It certainly wasn’t the case for value added services and it seems the battle for control of the smart home has been lost, with Google and Amazon successfully positioning the smart speaker (not the router) as the centre of the ecosystem. The telcos need to react quickly, as you can guarantee the cloud players are eyeing up the opportunity.

One of the challenges, as Ashouriha points out, is industry collaboration. It doesn’t matter if you are the biggest, baddest telco around; no-one has 100% geographical coverage. To make this edge-oriented service work, the telcos will have to develop some sort of framework whereby holes in connectivity can be plugged by competitors.

To tackle this challenge, DT has spun off a business in Silicon Valley called MobiledgeX, which is creating a platform into which telcos can plug to provide an always-connected experience for an ecosystem to build products and services on top of. It’s an interesting idea, and certainly a step in the right direction towards capitalising on the edge opportunity.

With the billions being spent to develop 5G networks there is no single silver bullet to realise the ROI, but building a portfolio of services and business models will certainly get the telcos across the line. They just have to get better at capitalising on the opportunity when it presents itself.

Nokia launches some actual applications for SDN

All the hype surrounding software-defined networking is finally starting to yield some tangible results in the form of three apps from Nokia.

Deciding to kill two buzzwords with one stone, Nokia is claiming its new WaveSuite open applications will jump-start optical network digital transformation. It consists of three apps: Service Enablement, Node Automation and Network Insight. The point of these apps is apparently to offer businesses a new degree of access to networks that is expected to yield novel commercial opportunities.

To help us get our heads around this new piece of networking arcana we spoke to Kyle Hollasch, Director of Marketing for Optical Networking at Nokia. He was most keen to focus on the service enablement app, which he said is “the first software that tackles the issue of resell hierarchy.”

Specifically, we’re talking about the reselling of fixed-line capacity. This app is designed to massively speed up the capacity reselling process, with the aim of turning it into a billable service. The slide below visualises the concept, in which we have the actual network owner at the base and then several levels of capacity reselling, allowing greater degrees of specialisation and use-case-specific solutions.

Nokia WaveSuite slide 1
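
One way to picture the hierarchy (purely a conceptual sketch, not Nokia’s actual API) is a chain in which each reseller buys a slice of capacity from its parent and resells portions of it, with a billable record available at every level:

```python
# Conceptual model of a capacity resell hierarchy; not Nokia's API.
class Reseller:
    def __init__(self, name: str, capacity_gbps: float, parent=None):
        self.name, self.capacity_gbps, self.parent = name, capacity_gbps, parent
        self.sold_gbps = 0.0

    def sell(self, buyer: str, gbps: float) -> "Reseller":
        if self.sold_gbps + gbps > self.capacity_gbps:
            raise ValueError(f"{self.name} has no spare capacity")
        self.sold_gbps += gbps
        # A billing record could be emitted here, at every level of the chain.
        return Reseller(buyer, gbps, parent=self)

owner = Reseller("network-owner", 100)               # base of the hierarchy
wholesaler = owner.sell("wholesaler", 40)            # first resell level
specialist = wholesaler.sell("iot-specialist", 10)   # use-case-specific tier
```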

The node automation app allows network nodes to be controlled via an app on a smartphone, thanks to the magic of SDN. In fact this would appear to be the epitome of SDN, as it’s only made possible by that technology. The slide below shows how it is, at least in theory, possible to interact with a network element via a smartphone, which also opens up the ability to use other smartphone tools such as GPS and the camera.

Nokia WaveSuite slide 2

The network insight app seems to do what it says on the tin, so there doesn’t seem to be the need for further explanation at this stage. “These innovations are the result of years of working closely with our customers to address all aspects of optical networking with open applications enhancing not just operations, but opening up new services and business models,” said Sam Bucci, Head of Optical Networks for Nokia.

As a milestone in the process of virtualizing networks and all the great stuff that’s supposed to come with that, the launch of actual SDN apps seems significant. Whether or not the market agrees and makes tangible business use of these is another matter, however, and only time will tell if good PowerPoint translates into business reality.

IBM aims to boost its strategic imperatives with $34 billion acquisition of Red Hat

IBM has announced by far the largest acquisition in its history: the purchase of cloud and open source software vendor Red Hat.

$34 billion is several times more than IBM has previously spent on an acquisition, which indicates just how important it thinks this is to its future prosperity. Red Hat has expanded from a developer of Linux-based business software to being involved in most places you might find B2B open source software, including the cloud and telecoms.

While most venerable tech companies seem to be in a constant state of so-called transformation, this has especially been the case with IBM as it seeks to replace its declining legacy businesses with shiny new ones. As a consequence it has four clear strategic imperatives in the form of cloud, security, analytics and mobile, revenue from which recently overtook legacy stuff for the first time.

But IBM has apparently decided this organic transformation isn’t happening quickly enough and has decided a nice, juicy bit of M&A is required to hasten the process. Most reports are focusing on how Red Hat will contribute to IBM’s hybrid cloud efforts, and thus give it a boost in competing with the likes of Amazon, but Red Hat’s activities in the telco cloud specifically shouldn’t be underplayed.

“The acquisition of Red Hat is a game-changer,” hyperbolised IBM Dictator (Chairman, President and CEO) Ginni Rometty. “It changes everything about the cloud market. IBM will become the world’s number one hybrid cloud provider, offering companies the only open cloud solution that will unlock the full value of the cloud for their businesses.

“Most companies today are only 20 percent along their cloud journey, renting compute power to cut costs,” she said. “The next 80 percent is about unlocking real business value and driving growth. This is the next chapter of the cloud. It requires shifting business applications to hybrid cloud, extracting more data and optimizing every part of the business, from supply chains to sales.”

IBM Red Hat Rometty Whitehurst cropped

“Open source is the default choice for modern IT solutions, and I’m incredibly proud of the role Red Hat has played in making that a reality in the enterprise,” said Jim Whitehurst, President and CEO, Red Hat (pictured, with Rometty). “Joining forces with IBM will provide us with a greater level of scale, resources and capabilities to accelerate the impact of open source as the basis for digital transformation and bring Red Hat to an even wider audience – all while preserving our unique culture and unwavering commitment to open source innovation.”

Cloud and open source have been major themes on the tech M&A scene recently. Microsoft continued its transition from closed-software box-shifter with the recent $7.5 billion acquisition of code-sharing platform GitHub. Meanwhile open source big data vendors Cloudera and Hortonworks have decided to merge, and earlier this year Salesforce dropped $6.5 billion on MuleSoft to power its Integration Cloud.

In M&A, the party line from the company being acquired is usually something along the lines of it enabling them to take the next step in its evolution thanks to the greater resources of its new parent, and this was no exception. “Powered by IBM, we can dramatically scale and accelerate what we are doing today,” said Whitehurst in his email to staff announcing the deal. “Imagine Red Hat with greater resources to grow into the opportunity ahead of us. Imagine Red Hat with the ability to invest even more and faster to accelerate open source innovation in emerging areas.” And so on.

He went on to explain that, while he will report directly to Rometty, Red Hat will continue to operate as a ‘distinct unit’, whatever that means. Usually this sort of talk is designed to sell the concept that it will remain the same company it was before the acquisition, but with loads more cash to play with. Let’s see.

IBM would be mad to mess around with Red Hat too much as it seems to be doing just fine, having reported 14% revenue growth in its last quarterlies. Then again, you don’t pay a 60% premium for a company just to accrue its revenue, and how IBM integrates Red Hat into the rest of its offerings will be what determines the success of this bold move. There are, sadly, no signs the company plans to change its name to Big Blue Hat, which is a worrying early missed opportunity.

Share price drops for both Amazon and Google after quarterlies

Despite reporting quarterly numbers most companies would kill for, Amazon and Alphabet saw their share prices drop by 8.6% and 5% respectively due to investor disappointment.

More than anything else it shows the high demands of investors, but also the confidence being placed in the internet giants. With Amazon reporting a revenue increase of 29% to $56.6 billion for the quarter and Google parent company Alphabet reporting $33.7 billion, up 21%, expectations are certainly high.

Starting with Amazon, the revenue increase of 29% paled in comparison to the more than 10X lift in net income to $2.9 billion. While this would be a regular cash bonanza for most companies around the world, sales guidance of between $66.5 billion and $72.5 billion for the final quarter was lower than the market wanted to hear. The coy guidance for Amazon’s busiest quarter resulted in the 8.6% drop, after confidence during the day had sent the stock up 7%.

At Google HQ the story was slightly different: revenues of $33.7 billion, up 21%, and net income of $9.1 billion, compared to $6.7 billion in 2017. Shares were down 5%, following a 4.4% rise across the day, after sales figures did not hit the expected heights. The last three months have been a tough period for investors to swallow, with various scandals dropping the share price by 8.8%.

Of course, it wasn’t all bad news. The cloud units of both businesses are continuing to rack up revenues, with AWS up 45% to $6.7 billion for the quarter and Google’s ‘other revenues’ segment, which includes cloud, up 29% to $4.6 billion. Encouragingly for both, Gartner estimates the worldwide public cloud services market will grow 17% in 2019 to total $206.2 billion, up from $175.8 billion in 2018. IaaS is set to get the largest boost, forecast to grow 27.6% in 2019 to reach $39.5 billion. With so many businesses around the world citing a cloud-first approach, it’s amazing to think only 10% of workloads have been moved into the cloud.

The relatively new venture into the world of smart speakers and virtual assistants is proving to be a continued success story as well. For Amazon, the number of Alexa-compatible smart home devices has quintupled from 3,500 to more than 20,000, while the team has also started to launch new products such as a smart home security solution (Alexa Guard), and Alexa is expanding what it can give updates on, such as sports (with predictions), live streams, cooking instructions and maths homework. For Google, the Assistant has expanded to 20 languages and 76 countries, while devices with screens will help the YouTube business, which is attempting to blend more direct-response adverts, as well as branding, into its proposition.

There will of course be short-term wins for the pair in this space, but this is a long-term bet. Once the idea has been adopted by the mass market, the opportunities to make money through third-party relationships will be quite remarkable. Search revenues can be moved into the voice domain (effectively anywhere) and look how profitable search has been for Google. This is only one way to make money, but both Amazon and Google are putting themselves in a remarkably strong position for the future.

Both businesses might have suffered in the last 24 hours, but they are still in incredibly dominant positions. The cloud units still have huge growth potential, while the smart speaker ecosystem is starting to become a reality. Google is delivering amazing profitability, but sales growth does seem to be slowing slightly. Amazon is delivering in the North American market, but the business is not as effective on the international scene, where it posted a loss of $385 million.

There are issues, but these are nothing compared to the billions being raked in and the growth potential in new, lucrative markets.