Google claims quantum computing breakthrough, IBM disagrees

Google says it has achieved ‘quantum supremacy’: its Sycamore chip performed a calculation in 200 seconds that would have taken the world’s fastest supercomputer 10,000 years.

It seems a remarkable upgrade, but that is the potential of quantum computing. This is not merely a step-change in technology; it is a revolution on the horizon.

Here, Google is claiming its 53-qubit computer performed a task in 200 seconds which would have taken Summit, a supercomputer IBM built for the Department of Energy, 10,000 years. That said, IBM is disputing the claim, suggesting Google is massively exaggerating how long it would take Summit to complete the same task. After some tweaks, IBM says it would take Summit 2.5 days.

Despite the potential for exaggeration, this is still a breakthrough for Google.

For the moment, it seems to be nothing more than a humble brag. Like concept cars at the Tokyo Motor Show, the purpose is to inflate the ego of Google and create a perception of market leadership in the quantum computing world. Although this is an area which could be critically important for the digital economy in years to come, the technology is years away from being commercially viable.

Nonetheless, this is an impressive feat performed by the team. It demonstrates the value of persisting with quantum computing and will have forward-thinking, innovative data scientists around the world dreaming up possible applications of such power.

At the most basic level, quantum computing is a new model of how to build a computer. The original concept is generally attributed to David Deutsch of Oxford University, who, at a conference in 1984, pondered the possibility of designing a computer based exclusively on quantum rules. He published a paper a few months later, which you can see here if you are brave enough, and the race to create a quantum computer began.

Today’s ‘classical’ computers store information in binary, where each bit is either on or off. Quantum computers use qubits, which can be on or off, as well as both on and off at the same time. This might sound incredibly complicated, but the best way to explain it is to imagine a sphere.

In classical computing, a bit can be represented by the poles of the sphere, with zero at the south pole and one at the north. In quantum computing, any point on the sphere can represent the state of a qubit. This is achieved through a concept called superposition, which means qubits can be represented by a one or a zero, or both at the same time. For example, two qubits in superposition can represent four different scenarios at once.
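To make the bookkeeping concrete, here is a minimal sketch in plain Python (our own illustration, not anything Google or IBM published) of the difference described above: a classical two-bit register holds exactly one of four values, whereas two qubits in superposition are described by four amplitudes at once, and the amplitude count doubles with every extra qubit.

```python
import math

# A classical 2-bit register holds exactly one of four values at any moment.
classical_register = 0b10  # simply the value 2; nothing else to track

# Two qubits in an equal superposition are described by FOUR amplitudes,
# one for each basis state |00>, |01>, |10> and |11>.
amplitudes = {"00": 0.5, "01": 0.5, "10": 0.5, "11": 0.5}

# The squared magnitudes are the probabilities of measuring each outcome,
# so they must sum to one.
assert math.isclose(sum(abs(a) ** 2 for a in amplitudes.values()), 1.0)

for state, amp in amplitudes.items():
    print(f"|{state}>: amplitude {amp:+.2f}, probability {abs(amp) ** 2:.2f}")

# In general, n qubits need 2**n amplitudes, which is why classically
# simulating a 53-qubit chip such as Sycamore is so expensive.
print("Amplitudes needed for 53 qubits:", 2 ** 53)
```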

Regardless of whether you understand the theoretical science behind quantum computing, the important takeaway is that it will allow computers to store, analyse and transfer information much more efficiently. As you can see from the claim Google has made, completing a calculation in 200 seconds as opposed to 10,000 years is a considerable upgrade.

This achievement can be described as ‘quantum supremacy’, in that the chip has enabled a calculation which is realistically impossible on classical computing platforms. From IBM’s perspective, this is a step forward, but not ‘quantum supremacy’, if its computer can complete the same task in 2.5 days.

If this still sounds baffling and overly complex, that is because quantum computing is a field of technology only the tiniest fraction of the world’s population understands. This is cutting-edge science.

“In many ways, the exercise of building a quantum computer is one long lesson in everything we don’t yet understand about the world around us,” said Google CEO Sundar Pichai.

“While the universe operates fundamentally at a quantum level, human beings don’t experience it that way. In fact, many principles of quantum mechanics directly contradict our surface level observations about nature. Yet the properties of quantum mechanics hold enormous potential for computing.”

What is worth taking away here is that understanding the science is not at all important once it has been figured out by people far more intelligent than the rest of us. All normal people need to understand is that this is a technology that will enable significant breakthroughs in the future.

This might sound patronising, but it is not supposed to be. Your correspondent does not understand the mechanics of the combustion engine but does understand that the journey between London and South Wales is significantly faster by car than on horseback.

But what could these breakthroughs actually be?

On the security side, although quantum computing could crack the end-to-end encryption software which is considered unbreakable today, it could theoretically enable the creation of hack-proof replacements.

In artificial intelligence, machine learning is a perfect area for quantum computing to be applied. The idea of machine learning is to collect data, analyse said data and provide incremental improvements to the algorithms which are being integrated into software. Analysing the data and applying the lessons learned takes time, which could be dramatically decreased with the introduction of quantum computing.

Looking at the pharmaceutical industry, in order to create new drugs, chemists need to understand the interactions between various molecules, proteins and chemicals to see whether medicines will cure diseases or introduce dangerous side-effects. Due to the eye-watering number of combinations, this takes an extraordinary amount of time. Quantum computing could significantly reduce the time it takes to understand these interactions, and could also be combined with analysis of an individual’s genetic make-up to create personalised medicines.

These are three examples of how quantum computing could be applied, but there are dozens more. Weather forecasting could be improved, climate change models could be more accurate, or traffic could be better managed in city centres. As soon as the tools are available, innovators will come up with the ideas of how to best use the technology, probably coming up with solutions to challenges that do not exist today.

Leading this revolutionary approach to computing is incredibly important for any company which wants to dominate the cloud industry in the futuristic digital economy, which is perhaps the reason IBM felt it was necessary to dampen Google’s celebrations.

“Building quantum systems is a feat of science and engineering and benchmarking them is a formidable challenge,” IBM said on its own blog.

“Google’s experiment is an excellent demonstration of the progress in superconducting-based quantum computing, showing state-of-the-art gate fidelities on a 53-qubit device, but it should not be viewed as proof that quantum computers are ‘supreme’ over classical computers.”

Google measured the success of its own quantum computer against IBM’s Summit, a supercomputer which is believed to be the most powerful in the world. By altering the way Summit approaches the same calculation Google used, IBM suggests Summit could come to the same conclusion in 2.5 days rather than 10,000 years.

Google still has the fastest machine, but according to IBM the speed increase does not deserve the title of ‘quantum supremacy’. It might not be practical to ask a computer to process a calculation for 2.5 days, but it is not impossible, therefore the milestone has not been reached.

What is worth noting is that a pinch of salt should be taken with both the Google and IBM claims. These are companies attempting to gain an edge and undermine a direct rival; there is probably some truth and some exaggeration in both statements.

And despite this being a remarkable breakthrough for Google, it is of course way too early to get excited about the applications.

Not only is quantum computing still completely unaffordable for almost every application data scientists are dreaming about today, but the calculation itself was very simple. Drug synthesis, or traffic management where every traffic signal is attempting to understand the route of every car in a major city, are much more complicated problems.

Scaling these technologies so they are affordable and feasible for commercial applications is still likely to be years away, but as Bill Gates famously stated: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”

Age-discrimination lawsuit rumbles on as IBM accused of ‘trendy’ objectives

In an on-going age-discrimination lawsuit, a former Big Blue executive has suggested the firm fired old-timers in pursuit of a ‘trendy’ and ‘cool’ image, so it could compete with the likes of Google and Amazon.

This saga has been quietly building in the background for quite some time. The first complaints were presented to the courts in 2018 by Shannon Liss-Riordan, with the ex-IBM employees suggesting they were culled for no reason aside from the fact they were too old. This is of course a big no-no.

The latest revelation from the courts comes from Alan Wild, the former VP for Human Resources, Employee Relations and Engagement. According to Bloomberg, Wild is suggesting Big Blue has fired between 50,000 and 100,000 employees over the last five years in the pursuit of creating an image which will attract the younger generations.

This is where IBM has suffered over the last couple of years. It might have been one of the most successful technology companies of the 20th century, but it certainly doesn’t maintain that perception in the 21st. When graduates leave MIT or Stanford, their first port of call is likely to be firms like Google, Facebook, Tesla, Twitter, Amazon, or a host of other forward-looking technology giants who have dominated headlines.

Big Blue might be on the rise currently, but this comes only after 23 consecutive quarters of year-on-year revenue decline. The company went through a very painful process of realigning attention from its unpopular, unprofitable and declining legacy operations towards its future-proof ‘Strategic Imperatives’ unit.

According to Wild’s evidence, presented in documents which remain sealed by the court, IBM began to correct its ‘seniority mix’ in 2014, firing older employees and hiring millennials. Perhaps part of this was to ensure its employees were suitably tooled to tackle the new challenges presented by the digital economy, but if you believe Wild’s alleged comments, the purpose was to have a compounding effect, attracting more fresh graduates into the fold.

According to an extensive investigation launched by Pro Publica last March, IBM had culled 20,000 US employees aged at least 40, a figure which represents 60% of its total cuts across the period. Although the claims are unconfirmed to date, IBM allegedly told some staff their skills were no longer needed before hiring them back as contractors for less cash, and encouraged older employees to apply for new roles internally while secretly telling hiring managers not to hire them.

These are very serious accusations from both Pro Publica and Wild, and there are plenty of other testimonies which back up the claims.

In January, Catherine Rodgers presented her own evidence to a court in New York. Rodgers was previously a VP in the Global Engagement Office, while also serving as IBM’s most senior executive in Nevada. Rodgers was dismissed from IBM, after almost 30 years of service, in 2017, aged 62.

In Rodgers’ affidavit, she claimed she raised concerns to Steve Welsh, IBM’s GM of Global Technology Services, that the firm was ‘engaging in systematic age discrimination by utilizing several methods to eliminate thousands of its employees’. As part of her role, Rodgers had access to lists of individuals who were being cut as part of ‘Resource Action’ initiatives in her business group, noting that all were over the age of 50 while younger employees were not impacted whatsoever.

When Rodgers spoke to managers in other groups, many of whom had workforces made up of younger employees, she found the layoffs were not as severe. Considering Rodgers’ division was exceeding targets and running under budget, the request from more senior executives caught her by surprise.

“IBM’s upper management encouraged us to inform the employees who were being laid off that they should use IBM’s internal hiring platform to apply for other jobs within the company, but at the same time, IBM implemented barriers to those employees actually being hired for openings,” Rodgers said in her affidavit.

“For example, if a manager wanted to hire someone who had been laid off, there was a multi-layer approval process. (This multi-layer approval process would not apply if a manager wanted to hire someone who had applied from outside IBM.) Even as Vice President, I could not make such a hire without approval by the Global Resource Board and IBM’s upper management.”

IBM has consistently denied reports it targeted older employees for lay-offs, pointing towards the number of people it hires each year. This statement does not prove innocence by any stretch of the imagination; it simply confirms the firm has been hiring new employees to replace the culled ones. In fairness to IBM, it has not been proven that the firm targeted older employees and embarked on a campaign of age discrimination.

What is worth noting is that the number of IBM employees globally has been decreasing steadily over the last few years. Statista estimates IBM had 350,600 employees at the end of 2018, down from 366,600 in 2017, 380,300 in 2016 and 377,700 in 2015. In 2012, IBM employed 434,250 people across the world. These numbers do not prove or disprove the allegations but are good to bear in mind.

IBM has managed a significant turnaround over the last few years, from irrelevance to a cloud business which is almost dining at the top table. It is threatening to compete with the likes of Amazon, Microsoft, Google and Alibaba, though it still has a lot of work to do. That said, it is in a much more comfortable position than in previous years. The question is whether it has managed the process legitimately.

206 days: IBM’s estimate on how long it takes to find a security breach

A new study from IBM suggests it takes 206 days on average for companies to discover a breach and another 73 to fix it.

With cybercriminals becoming savvier and assaults becoming much more complex, it seems many companies will have been exposed for months without even realising it. The average cost to the business could be as much as $3.92 million, with the firm feeling the impact over a three-year period.

“Cybercrime represents big money for cybercriminals, and unfortunately that equates to significant losses for businesses,” said Wendi Whitmore, Global Lead for IBM X-Force Incident Response and Intelligence Services.

“With organizations facing the loss or theft of over 11.7 billion records in the past 3 years alone, companies need to be aware of the full financial impact that a data breach can have on their bottom line – and focus on how they can reduce these costs.”

On average, 67% of the financial impact of a security breach is felt within the first 12 months, 22% in the second year and 11% in the third year after the incident. The long-tail costs are felt more painfully in highly-regulated industries such as healthcare, financial services, energy and pharmaceuticals. Telecoms was not mentioned specifically, but we suspect it will also be among the more impacted industries.
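As a rough back-of-the-envelope illustration (our own arithmetic, not figures quoted in the report), applying that 67/22/11 split to the $3.92 million average gives a sense of how the cost lands year by year:

```python
# Rough illustration only: splitting the reported $3.92m average breach cost
# using the 67% / 22% / 11% year-by-year distribution quoted above.
average_cost = 3.92e6
split = {"year 1": 0.67, "year 2": 0.22, "year 3": 0.11}

for year, share in split.items():
    print(f"{year}: ~${average_cost * share / 1e6:.2f}m")
# year 1: ~$2.63m, year 2: ~$0.86m, year 3: ~$0.43m
```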

What you have to bear in mind is that this is a security vendor stoking the fire. The dangers of inadequate security in the digital era are very well known, but you have to take the estimates with a pinch of salt; it is in IBM’s interest for companies to be in heightened states of fear and paranoia.

Looking at the time it takes to detect a breach, this is quite a remarkable number and perhaps demonstrates the retrospective approach many firms have taken to security over the last few years. These attitudes are slowly changing, and security is moving up the agenda, though this does not compensate for the years of inadequacy.

The IBM report suggests the lifecycle of a breach is 279 days, not accounting for all the regulatory headaches which would follow. That said, those who are able to detect and contain a breach within 200 days are $1.2 million better off when it comes to the financial impact.

Here are a few of the more interesting stats from the report:

  • Data breaches cost companies around $150 per record that was lost or stolen (a rough combination of these figures follows the list)
  • Security automation technologies could potentially halve the financial impact of a breach
  • Extensive use of encryption can reduce the total cost of a breach by $360,000
  • Breaches originating from a third party cost companies $370,000 more than average
  • The average cost of a breach in the US is $8.19 million, double the worldwide average
  • Breaches in the healthcare industry are the most expensive
  • Companies with fewer than 500 employees suffered losses of more than $2.5 million on average
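Combining a couple of those headline figures (again, our own back-of-the-envelope arithmetic rather than anything stated in the study), the $150 per-record cost implies the average $3.92 million breach involves roughly 26,000 records, and halving the impact through automation would bring it back under $2 million:

```python
# Back-of-the-envelope sketch combining the headline figures above.
cost_per_record = 150          # reported average cost per lost or stolen record
average_breach_cost = 3.92e6   # reported worldwide average breach cost

implied_records = average_breach_cost / cost_per_record
print(f"Implied records in an average breach: ~{implied_records:,.0f}")  # ~26,133

# The report suggests security automation could roughly halve the impact.
print(f"Average cost with automation: ~${average_breach_cost / 2 / 1e6:.2f}m")  # ~$1.96m
```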

AT&T gets Microsoft and IBM to help with its cloud homework

US telco AT&T has decided it’s time to raise its cloud game and so has entered into strategic partnerships with Microsoft and IBM.

The Microsoft deal focuses on non-network applications and enables AT&T’s broader strategy of migrating most non-network workloads to the public cloud by 2024. The rationale for this is fairly standard: by moving a bunch of stuff to the public cloud AT&T will be able to better focus on its core competences, but let’s see how that plays out.

IBM will be helping AT&T Business Solutions to better provide solutions to businesses. The consulting side will modernize its software and bring it into the IBM cloud, where Red Hat’s platform will be used to manage it all. In return, IBM will make AT&T Business its main SDN partner and general networking best mate.

“AT&T and Microsoft are among the most committed companies to fostering technology that serves people,” said John Donovan, CEO of AT&T. “By working together on common efforts around 5G, the cloud, and AI, we will accelerate the speed of innovation and impact for our customers and our communities.”

“AT&T is at the forefront of defining how advances in technology, including 5G and edge computing, will transform every aspect of work and life,” said Satya Nadella, CEO of Microsoft. “The world’s leading companies run on our cloud, and we are delighted that AT&T chose Microsoft to accelerate its innovation. Together, we will apply the power of Azure and Microsoft 365 to transform the way AT&T’s workforce collaborates and to shape the future of media and communications for people everywhere.”

“In AT&T Business, we’re constantly evolving to better serve business customers around the globe by securely connecting them to the digital capabilities they need,” said Thaddeus Arroyo, CEO of AT&T Business. “This includes optimizing our core operations and modernizing our internal business applications to accelerate innovation. Through our collaboration with IBM, we’re adopting open, flexible, cloud technologies, that will ultimately help accelerate our business leadership.”

“Building on IBM’s 20-year relationship with AT&T, today’s agreement is another major step forward in delivering flexibility to AT&T Business so it can provide IBM and its customers with innovative services at a faster pace than ever before,” said Arvind Krishna, SVP, Cloud and Cognitive Software at IBM. “We are proud to collaborate with AT&T Business, provide the scale and performance of our global footprint of cloud data centers, and deliver a common environment on which they can build once and deploy in any one of the appropriate footprints to be faster and more agile.”

Talking of the US cloud scene, the Department of Defense is reportedly looking for someone to provide some kind of Skynet-style ‘war cloud’ in return for chucking them $10 billion of public cash. Formally known as the Joint Enterprise Defense Infrastructure (yes, JEDI), this is designed to secure military and classified information in the event of some kind of catastrophic attack, contribute to cyber warfare efforts and enable the dissemination of military intelligence to the field.

It looks like the gig will be awarded to just one provider, which has led to much jostling for position among the US cloud players. The latest word on the street is that either AWS or Microsoft will get the work, which has prompted considerable moaning from IBM and Oracle and reported concern from President Trump, prompted by politicians apparently repaying their lobbying cash. Here’s a good summary of all that from Subverse.

IBM and Google reportedly swap morals for cash in Chinese surveillance JV

IBM and Google executives should be bracing for impact as the comet of controversy heads directly towards their offices.

Reports have emerged, via the Intercept, suggesting two of the US’ most influential and powerful technology giants have indirectly been assisting the Chinese Government with its campaign of mass-surveillance and censorship. Both will try to distance themselves from the controversy, but this could have a significant impact on both firms.

The drama here is focused on a joint venture, the OpenPower Foundation, founded in 2013 by Google and IBM and featuring members such as Red Hat, Broadcom, Mellanox, Xilinx and Rackspace. The aim of the open-ecosystem organization is to facilitate and share advances in networking, server, data storage, and processing technology.

To date, the group has been little more than another relatively uninteresting NPO serving a niche in the industry, though one initiative is causing a stir. The OpenPower Foundation has been working with Xilinx and Chinese firm Semptian to create a new breed of chips capable of enabling computers to process incredible amounts of data. This might not seem extraordinary, though the application is where the issue has been found.

On the surface, Semptian is a relatively ordinary Chinese semiconductor business, but when you look at its most profitable division, iNext, the story becomes a lot more sinister. iNext specialises in selling equipment to the Chinese Government to enable the mass-surveillance and censorship projects which have become so infamous.

It will come as little surprise that a Chinese firm is aiding the Government with its nefarious objectives, but a link to IBM and Google, as well as a host of other US firms, will have some twitching with discomfort. We can imagine the only people pleased at this news are the politicians looking to get their faces on TV by theatrically condemning the whole saga.

Let’s start with what iNext actually does before moving on to the US firms involved in the controversy. iNext works with Chinese Government agencies by providing a product called Aegis. Aegis is an interception and analysis system which has been embedded into various phone and internet networks throughout the country. This is one of the products which enables the Chinese Government to keep such a close eye on the activities of its citizens.

Documentation acquired by The Intercept outlines the proposition in more detail.

“Aegis is not only the standard interception system but also the powerful analysis system with early warning and timely action capabilities. Aegis can work with all kinds of networks and 3rd party systems, from recovering, analysing, exploring, warning, early warning, locating to capturing. Aegis provides LEA with an end to end solution described as Deep Insight, Early Warning and Timely Action.”

Although the majority of this statement is corporate fluff, it does provide some insight into the way in which the technology actually works. This is an incredibly powerful surveillance system, which is capable of locating individuals through application usernames, IP addresses or phone numbers, as well as accurately tracking the location of said individuals on a real-time basis.

Perhaps one of the most worrying aspects of this system is the ‘pre-crime’ element. Although the idea of predictive analytics has been met with controversy and considerable resistance in some societies, we suspect the Chinese Government does not have the same reservations.

iNext promises this feature can help prevent crime through the introduction of an early warning system. This raises all sorts of ethical questions: even if the data estimates are accurate to five nines, can you arrest someone who hasn’t actually committed a crime? This is the sticky position Google and IBM might have found themselves in.

OpenPower has said that it was not aware of the commercial applications of the projects it manages, while its charter prevents it from getting involved. The objective of the foundation is to facilitate the progress of technology, not to act as judge and jury over its application. It’s a nice little way to keep controversy at arm’s length; inaction and negligence are presented as an appropriate defence.

For IBM and Google, who are noted as founding members of the OpenPower Foundation, a stance of ignorance might be enough to satisfy institutions of their innocence, but the court of public opinion could swing heavily in the other direction. An indirect tie to such nefarious activities is enough for many to pass judgment.

When it comes to IBM, the pursuit of innocence becomes a little bit trickier. IBM is directly mentioned on the Semptian website, suggesting Big Blue has been working closely with the Chinese firm for some time, though the details of this relationship are unknown for the moment.

For any of the US firms which have been mentioned here, it is not a comfortable situation to be in. Although they might be able to plead ignorance, it is quite difficult to believe. These are monstrous multi-national billion-dollar corporations, with hordes of lawyers, some of whom will be tasked with making sure the technology is not being utilised in situations which would get the firm in trouble.

Of course, this is not the first time US technology firms have found themselves on the wrong side of right. There have been numerous protests from employees of the technology giants as to how the technology is being applied in the real-world. Google is a prime example.

In April 2018, Google employees revolted over an initiative the firm was participating in with the US Government. Known as Project Maven, Google’s AI technology was used to improve the accuracy of drone strikes. As you can imagine, the Googlers were not happy at the thought of helping the US Government blow people up. Project Dragonfly was another which brought internal uproar; this time the Googlers were helping to create a censored version of Google search for China which would filter out results the Government deemed undesirable.

Most of the internet giants will plead their case, suggesting their intentions are only to advance society, but there are numerous examples of contracts and initiatives which contradict this position.

Most developers or engineers, especially the ones who work for a Silicon Valley giant, work for the highest bidder, but there is a moral line few will cross. As we’ve seen before, employees are not happy to aid governments in the business of death, surveillance or censorship, and we suspect the same storyline will play out here.

Google and IBM should be preparing themselves for significant internal and external backlash.

IBM and Red Hat seal the deal

The $34 billion acquisition of open-source enterprise software vendor Red Hat by venerable tech giant IBM has finally been completed.

The mega M&A was first announced last October and, given the size of it, seems to have gone through relatively quickly. Now begins the significant undertaking of integrating two such massive organisations that may well have quite distinct cultures.

IBM was founded in 1911 and has undergone several transformations to become the enterprise software and services company it is today. Red Hat only came into existence in 1993 and has always focused on the decidedly un-corporate open-source software community. IBM will be hoping some of its youthful vigour and flexibility will rub off, but that remains to be seen.

The official line is that the acquisition makes IBM one of the leading hybrid cloud providers as well as augmenting its software offering. There’s much talk of Red Hat’s independence being preserved but, of course, it will now be taking orders from IBM.

“Businesses are starting the next chapter of their digital reinventions, modernizing infrastructure and moving mission-critical workloads across private clouds and multiple clouds from multiple vendors,” said Ginni Rometty, IBM chairman, president and CEO. “They need open, flexible technology to manage these hybrid multicloud environments. And they need partners they can trust to manage and secure these systems.”

“When we talk to customers, their challenges are clear: They need to move faster and differentiate through technology,” said Jim Whitehurst, president and CEO of Red Hat (what’s the difference?). “They want to build more collaborative cultures, and they need solutions that give them the flexibility to build and deploy any app or workload, anywhere.

“We think open source has become the de facto standard in technology because it enables these solutions. Joining forces with IBM gives Red Hat the opportunity to bring more open source innovation to an even broader range of organizations and will enable us to scale to meet the need for hybrid cloud solutions that deliver true choice and agility.”

That’s it really. There’s lots of aspirational talk and general banging on in the press release, but you get the gist of it. Whitehurst will join the senior management team and report to Rometty, who seems to possess every senior management position worth having. IBM has been steadily increasing cloud as a proportion of total revenues and the pressure is now on to take that growth to the next level.

It’s Red Hat, but not as we know it

Software vendor Red Hat is celebrating the launch of Enterprise Linux 8 and the approval of its acquisition by IBM with a change of wardrobe.

Red Hat is arguably the best-known company to be named after an item of clothing, so the hat itself is central to its brand and image, and any decision to muck about with it is not to be taken lightly. But when incoming CMO Tim Yeaton chatted to people about the logo he was distressed to hear they found the dude wearing the hat to be sinister and even evil.

Showing some of the qualities that presumably led to his promotion, Yeaton quickly concluded that having an ‘evil’ logo was a potential marketing liability and dedicated the next year and a half to resolving the matter in an appropriately open source way. This exhaustive process apparently came to a simple conclusion: ditch the dude, resulting in the dude-less logo you see above.

The evolution of the Red Hat logo coincides with a couple of other pretty significant milestones for the company. IBM was recently advised that the US Department of Justice has concluded its review of the Red Hat acquisition and has no problem with it, meaning that as far as the US is concerned this is an unconditional green light. IBM apparently reckons the whole thing will be wrapped up later this year.

Lastly Red Hat recently announced the first major new version of its Enterprise Linux platform – RHEL 8. As a platform designed with datacenters in mind, RHEL is of increasing relevance to telcos as they move ever more of their stuff into the cloud and the edge. Red Hat is positioning RHEL 8 as the platform for the hybrid cloud era and name-dropped lots of other associated buzzwords like containers and devops. We wouldn’t even know which end of the box to open with this stuff, so hopefully this vid as well as some canned quotes will help you understand what the big deal is.

 

Stefanie Chiras, vice president and general manager, Red Hat Enterprise Linux, Red Hat

“Red Hat Enterprise Linux 8 embraces the role of Linux as IT’s innovation engine, crystallizing it into an accessible, trusted and more secure platform. Spanning the entirety of the hybrid cloud, the world’s leading enterprise Linux platform provides a catalyst for IT organizations to do more than simply meet today’s challenges; it gives them the foundation and tools to launch their own future, wherever they want it to be.”

Tibor Incze, technical lead, Red Hat Enterprise Linux, Datacom Systems

“The capacity for Red Hat Enterprise Linux 8 to not only run multiple versions of the same application or database on a specific operating system but to also have a clear and efficient way to manage them is a significant benefit to Datacom and our customers. As we continue to execute on our internal DevOps strategy, we’re also pleased to see improved container capabilities in the operating system and extensive automation, all factors that will help us bring differentiated services to our end users.”

John Gossman, distinguished engineer, Microsoft Azure

“We have seen growth in applications being deployed using Red Hat Enterprise Linux on Azure, including Microsoft SQL Server, for cloud-native, hybrid, and cloud migration scenarios. We’re excited to see what customers will create with Red Hat Enterprise Linux 8 on Azure with continued integrated support from Microsoft and Red Hat, as well as the operating system’s new capabilities to build applications for workloads like AI.”

Arlen Shenkman, executive vice president, Global Business Development and Ecosystems, SAP

“Red Hat Enterprise Linux 8 for SAP Solutions offers high availability capabilities, which are important for SAP workloads, and downtime is unacceptable for business critical applications such as S/4HANA. For more than two decades, we’ve worked with Red Hat on maintaining a stable, open foundation for SAP applications, helping our customers make smarter decisions, faster, across the hybrid cloud.”

IBM Vodafone partnership wins its first clients

IBM and Vodafone announced during Mobile World Congress 2019 that their $550 million cloud and AI partnership has signed its first heavyweight clients.

SEAT, a Spanish sub-brand of the Volkswagen group, and KONE, a world-leading lift and escalator supplier from Finland, have become the first customers of the open cloud and AI technologies offered by the IBM and Vodafone Business partnership.

SEAT is going to use the cloud, AI and 5G technologies to facilitate its transformation into a “mobility services provider”. KONE’s main interest is in the IoT domain. With the new technologies it aims to move its customer service from a reactive to a proactive and then a predictive mode, as well as to improve the efficiency of its monitoring and repair operations.

The partnership between IBM and Vodafone Business was announced last month. Although billed as a “joint venture”, Michael Valocchi, IBM’s General Manager of the new venture, clarified to Telecoms.com that it is not a formal joint venture or a separate organization, but an 8-year strategic commercial partnership and $550 million managed services agreement. IBM and Vodafone Business are going to put in equal amounts of investment.

“IBM’s partnerships with global telco companies like Vodafone will help speed up the deployment of 5G and provide easier access to new technologies such as AI, blockchain, edge computing and IoT,” said Valocchi in a statement. “This is because the promise of 5G doesn’t just depend on fiber, spectrum and gadgets, but on advanced levels of integration, automation, optimization and security across the ever more complex IT systems that companies are building in a bid to transform.”

“By providing the open cloud, connectivity and portable AI technologies that companies need to manage data, workloads and processes across the breadth of their IT systems, Vodafone and IBM are helping to drive innovation and transform user experiences across multiple industries – from retail to agriculture,” added Greg Hyttenrauch, Co-leader of the new venture for Vodafone Business.

The partnership will become operational in Q2 this year. IBM told Telecoms.com that by that time Vodafone Business customers will immediately have access to IBM’s entire hybrid cloud portfolio to optimise and enhance their current solutions. These solutions and services are not dependent on 5G. In the future, clients will benefit from new solutions and services that the new venture will develop, combining IBM’s multi-cloud, AI, analytics and blockchain with IoT, 5G, and edge computing from Vodafone.

Considering that Vodafone is going to start with a non-standalone approach to 5G, the use cases for verticals that demand extremely low latency will be hard to realise in the near future. The engineers at IBM’s stand also conceded that although Watson can be deployed and trained to support many scenarios, the implementation of mission-critical cases will have to wait until an end-to-end 5G network is in place.

Vodafone bags Big Blue as $550 million partner

Vodafone Business and IBM have signed off on a new joint venture which will aim to develop systems to help data and applications flow freely around an organization.

The joint venture, which will be operational in the first half of 2019, will aim to bring together the expertise of both parties to solve one of the industry’s biggest challenges: multi-cloud interoperability and the removal of organizational silos. On one side of the coin you have IBM’s cloud know-how, while Vodafone brings the IoT, 5G and edge computing smarts. A match made in digital transformation heaven.

“IBM has built industry-leading hybrid cloud, AI and security capabilities underpinned by deep industry expertise,” said IBM CEO Ginni Rometty. “Together, IBM and Vodafone will use the power of the hybrid cloud to securely integrate critical business applications, driving business innovation – from agriculture to next-generation retail.”

“Vodafone has successfully established its cloud business to help our customers succeed in a digital world,” said Vodafone CEO Nick Read. “This strategic venture with IBM allows us to focus on our strengths in fixed and mobile technologies, whilst leveraging IBM’s expertise in multi-cloud, AI and services. Through this new venture we’ll accelerate our growth and deepen engagement with our customers while driving radical simplification and efficiency in our business.”

The issue which many organizations are facing today, according to Vodafone, is the complexity of the digital business model. On average, 70% of organizations are operating in as many as 15 different cloud environments, leaning on the individual USPs of each, but marrying these environments is a complex, though not new, issue.

Back in September, we had the chance to speak to Sachin Sony of Equinix about the emerging Data Transfer Project, an initiative to create interoperability and commonalities between the different cloud environments. The project is currently working to build a common framework with open-source code that can connect any two online service providers, enabling seamless, direct, user-initiated portability of data between the two platforms. This seems to be the same idea the new IBM/Vodafone partnership is looking to tackle.
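For the curious, here is a toy sketch of the adapter idea behind that kind of framework: each provider implements an exporter and an importer against a shared, neutral data model, so any two services can be chained without knowing anything about each other. This is illustrative Python of our own devising, not the Data Transfer Project’s actual API and not the IBM/Vodafone design.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CommonRecord:
    """A provider-neutral representation of one item of user data."""
    kind: str      # e.g. "photo" or "contact"
    payload: bytes

class Exporter(Protocol):
    def export(self) -> list[CommonRecord]: ...

class Importer(Protocol):
    def ingest(self, records: list[CommonRecord]) -> None: ...

def transfer(source: Exporter, destination: Importer) -> int:
    """Move data between any two providers via the common data model."""
    records = source.export()
    destination.ingest(records)
    return len(records)

# Hypothetical providers, named purely for illustration.
class ToyCloudA:
    def export(self) -> list[CommonRecord]:
        return [CommonRecord("contact", b"Alice"), CommonRecord("contact", b"Bob")]

class ToyCloudB:
    def __init__(self) -> None:
        self.stored: list[CommonRecord] = []

    def ingest(self, records: list[CommonRecord]) -> None:
        self.stored.extend(records)

moved = transfer(ToyCloudA(), ToyCloudB())
print(f"Moved {moved} records between providers with no direct integration")
```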

With this new joint venture it’ll be interesting to see whether the team can build a proposition which is any good. Vodafone has promised the new business will operate with a ‘start-up’ mentality, whatever that means once you take away the PR stench, under one roof. Hopefully that roof will be far enough away from each of the parent companies’ offices to ensure the neutral ground can foster genuine innovation.

This is a partnership which has potential. The pair have identified a genuine issue in the industry and are not attempting to solve it alone. Many people will bemoan the number of partnerships in the segment which seem to be nothing more than a feeble attempt to score PR points, but this is an example where expertise is being married to split the spoils.

German regulator effectively confirms IBM/T-Systems talks

As it does from time to time, German regulator Bundeskartellamt has published a list of mergers and acquisitions which it is evaluating. IBM and T-Systems are lucky enough to make the list.

Reports of the discussions emerged over the weekend, with IBM rumoured to be considering taking the mainframe service business unit off the hands of the struggling T-Systems. Although the specifics of the deal are not completely clear right now, it would hardly be a surprise to learn T-Systems is attempting to slim the business down.

On the Bundeskartellamt website, there is a page which lists some of the main transactions the regulator is considering in its role as merger overseer. These are mainly deals in the ‘first phase’, which are usually passed unless there are competition concerns. Although the description is not detailed, it indicates IBM will be acquiring certain assets from T-Systems.

The news was initially broken by German-language newspaper Handelsblatt, quoting an internal email which suggested 400 employees would be transferred to the IBM business in May. Subsequently IT-Zoom has suggested IBM will be paying €860 million for the business unit.

The origins of such a deal can only lead back to one place; the office of T-Systems CEO Adel Al-Saleh. Al-Saleh was initially brought to the firm, having previously worked at IBM for almost two decades, to trim costs and salvage a business unit which, recently, has been nothing but bad news for parent company Deutsche Telekom. Aside from this saga, job cuts of roughly 10,000 have been announced since Al-Saleh’s appointment.

Confirmed back in June, the 10,000 job cuts were the result of a long-running losing battle against more agile and innovative players such as AWS and Microsoft. Al-Saleh’s objective was to trim the fat, focusing on the more lucrative contracts, as well as more profitable, emerging segments of the IT and telco world.

While T-Systems and IBM already have an established relationship, it seems options are running thin to make this business work effectively. With headcount going down from 37,000 to 27,000, its footprint dropping from 100 cities to 10 and this deal working through the cogs as we speak, Deutsche Telekom employees will hope this is the last of the bad news. Whether Al-Saleh feels this is enough restructuring to make the business work remains to be seen.