US Senators hit out at India’s pro-privacy and localisation laws

Two US Senators have signed a letter addressed to India’s Prime Minister Narendra Modi suggesting new rules to tighten up data practices in the country could lead to a weakened trade relationship with the US.

The US Government has already shown the damage it can do when it starts throwing around economic sanctions and hurdles, almost sending ZTE to join the dodo on the extinction list, and it appears to be using the same tactics here. However, instead of punishing an organization which broke trade laws, it is attempting to bully a country out of its pro-privacy stance and into its own line of thinking.

“We see this (data localization) as a fundamental issue to the further development of digital trade and one that is crucial to our economic partnership,” the letter signed by Senators John Cornyn and Mark Warner states. The Senators serve as co-chairs of the Senate’s India caucus.

The letter, seen by Reuters, relates to new data protection, privacy and localisation rules which are set to come into play this week (October 15). The rules have been in the making for some time, and while there are some very suspect clauses, this is an attempt to tame the wild-west internet in the country, applying regulations which should be deemed more acceptable for the digital economy.

Back in July, the Indian Government unveiled a report which detailed its new approach to data regulations in the country. Included in the rules are restrictions on how data can be collected and utilised, setting out a similar stance to GDPR in Europe, while also including new approaches such as the right to be forgotten, explicit opt-in consent for certain categories of data (that which is deemed sensitive), and also data localisation. It is a much more stringent approach to the data economy, taking India closer to the European stance on privacy than the US’ views.

Aside from the data protection and privacy benefits of localisation, not to mention greater influence for the Indian Government, such a strategy also stimulates the economy. Local jobs will have to be created and new data centres constructed to meet the rising demands of an increasingly digital Indian economy and society. These are clear benefits for the country, though the threat of an impact on US trade will certainly be a worry for India.

Over the course of 2017, India exported $34.83 billion worth of goods and services to the US. That figure accounted for 16% of the country's total exports, making the US its largest trading partner. The US Government certainly has leverage to coerce India into its own way of thinking.

The letter from the Senators also happens to coincide with some pretty heavy lobbying from the likes of Visa, Mastercard and American Express. All would certainly find life simpler if there was no such thing as localisation, though it seems lobbying Senators to fight their cause has been more effective than efforts to persuade Indian officials to head in a different direction. The new rules seem to have been influenced by Europe's GDPR, though the US, both the Government and companies, has a different approach to data than the pro-privacy Europeans.

With India’s economy fast evolving from analogue to digital, there certainly will be profits to be made. Many US companies, most notably those in Silicon Valley, will be looking greedily at the country though such rules would make life more difficult. Not impossible, but not as simple. Perhaps the economic weight of the US Government can bully India into believing the ‘American Dream’.

Samsung looks to reverse mid-year dip with Q3 numbers

Q2 wasn’t exactly a party for Samsung, though it seems ready to correct the dip at the first possible opportunity.

Samsung Electronics has announced its earnings guidance for the third quarter of 2018, with consolidated sales coming in at 65 trillion Korean won (roughly $57.3 billion), up 5% year-on-year, and profits of 17.5 trillion won ($15.8 billion). The profits would represent a 20% uplift compared to the same period of 2017.

This is a much more positive return for the business compared to the second quarter of 2018. Three months ago the business reported sales of $52.1 billion, a decline of 4% year-on-year, though this was in-line with guidance offered in the weeks leading up to the earnings call.

During the earnings call, the downward-dog of a graph was blamed on sluggish S9 sales, though this might be partly down to 5G euphoria. With the promise of 5G-compatible phones in 2019, some users might be wary of purchasing a device which will become old news in a matter of months.

That said, the semiconductor business has been a very strong performer for Samsung through the last couple of quarters, and some might suspect this is the case once again. Full results are due to be released on October 31.

Brexit might not be all it's bigged up to be – Equinix

With Brexit chat flying back and forth across the English Channel, London might be a dirty word, but that hasn't stopped the city charging ahead on interconnection trends.

Interconnection bandwidth, the movement of data without traversing the public internet, is an aspect of the connectivity industry which has been growing steadily over the last few years without attracting headlines. As more organizations progress in their digital transformation journeys, it could be used as a good measure of digital economic activity in a region. According to Equinix, there is little evidence Brexit is having any material impact on business just yet.

Quoting its second annual Global Interconnection Index (GXI), interconnection bandwidth is forecast to grow to 8,200 Terabits per second (Tbps) of capacity by 2021, the equivalent of 33 Zettabytes (ZB) of data exchange per year, projected to be ten times the capacity of internet traffic. This represents a five-year compound annual growth rate (CAGR) of 48%, compared to a 26% CAGR for global IP traffic. Despite many fearing Brexit would isolate the UK from Europe, if the flow of information is taken as one measure of economic productivity, things don't look too bad right now.
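Those headline figures hang together, as a quick arithmetic check shows; the constants below are simply the GXI numbers quoted above.

```python
# Sanity-check the GXI figures: 8,200 Tbps of interconnection
# capacity, if utilised year-round, moves roughly 33 ZB per year.
SECONDS_PER_YEAR = 365 * 24 * 3600       # 31,536,000 seconds
capacity_bps = 8_200e12                  # 8,200 Tbps in bits per second
bytes_per_year = capacity_bps * SECONDS_PER_YEAR / 8
zettabytes = bytes_per_year / 1e21
print(f"{zettabytes:.1f} ZB per year")   # ~32.3 ZB, i.e. the quoted 33 ZB

# A 48% CAGR sustained over five years multiplies capacity roughly sevenfold
growth = 1.48 ** 5
print(f"{growth:.1f}x over five years")
```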

Across Europe, London, Amsterdam, Frankfurt and Paris are the fastest growing markets for interconnection bandwidth, with only Amsterdam (57% CAGR) and Frankfurt (58% CAGR) outperforming London (52% CAGR) for projected growth. And while London sits in third for projected growth, it is still the most important market for Equinix on the continent, accounting for more than 35% of Europe's private data exchange growth. It is simply that much bigger than any other market.

“Despite Brexit and political uncertainty in the UK, the GXI reveals that London is projected to show strong growth, accounting for more than 35% of Europe’s Interconnection Bandwidth growth,” said Russell Poole, MD of Equinix’s UK business. “London’s digital acceleration shows that post-Brexit, Interconnection Bandwidth continues to be driven by the secular growth of global data traffic and the massive shift in IT to support this data explosion.”

The growth of interconnection bandwidth is driven by digital transformation. As more business-critical processes become digitized, the need to ensure secure, real-time interactions between people, things, locations, clouds and data becomes more important. Using private data traffic exchanges to bypass the public internet and mitigate digital threats, reducing the number of vulnerability points, is becoming increasingly popular. Despite Brexit looming on the horizon, the GXI claims 64% of decision makers believe the UK is still the best place in Europe to interconnect with partners, customers, supply chains and cloud service providers, thanks to its growing data centre industry.

While these statistics are encouraging, at least for those who live in the UK, they ultimately mean very little. No-one really knows what the impact of Brexit will be until the UK has actually left the union. That said, it is nice to see not everyone is jumping on the fear-mongering bandwagon.

AI is a $200bn opportunity, and Intel has already grabbed $1bn of it

The newly emerging artificial intelligence market will be a money-maker, that is no secret, though Intel appears to have stolen an early march on the prospering technology.

The challenge which the technology industry, and all the technology teams in the industrial verticals, are facing is the astronomical amount of data which is flowing around the digital society. To realise revenues associated with the data economy, companies have to make use of this information. That means collecting, storing and processing data with a completely different mindset.

“I find it astounding that 90% of the world’s data was generated in the past two years,” said Navin Shenoy, GM of Intel’s data centre business unit. “And analysts forecast that by 2025 data will exponentially grow by 10 times and reach 163 zettabytes. But we have a long way to go in harnessing the power of this data. A safe guess is that only about 1% of it is utilized, processed and acted upon. Imagine what could happen if we were able to effectively leverage more of this data at scale.”

As with every challenge which is faced by the industry, there is an opportunity to make money. Shenoy believes this challenge is worth more than $200 billion to the data centre industry, the largest prize Intel has ever set its eyes on. Having already claimed $1 billion of this fortune, the team has targeted $10 billion by 2022. And as you can imagine, Intel has developed a new product roadmap to capture the fortune.

This is of course part of the overall shift in the Intel business as it moves towards the data-centric dream. It wasn't too long ago that the data centre business accounted for less than a third of total revenues; now it is almost half and continuing to gather momentum. The AI portfolio is receiving a lot of attention now, and this will only fuel the shift across to the data-centric dreamland. At Intel's Data-Centric Innovation Summit, Shenoy provided some much needed depth to the Intel product roadmap.

First and foremost comes the need to move faster in the data centre. Silicon photonics is one product breakthrough which will help, an innovation which has been incorporated into Xeon processor-based systems, with new products such as SmartNICs set to launch onto the market next year. Speed is of course one aspect, but storage is another. Be prepared for an assault of new products from Intel in this space, as the NAND-based products have already been performing very well, while elsewhere the first Optane DC persistent memory products have been shipped to Google.

The final aspect of the new data-defined era is the need to process everything. This is immensely complicated and requires a huge amount of processing power, but this is the point of the data economy; process seemingly redundant information into useful insight. Last year, Intel unveiled its Xeon Scalable platform, shipping more than 2 million units in 2018’s second quarter, and now it has revealed the next stages of the product roadmap.

Cascade Lake is a future Intel Xeon Scalable processor based on 14nm technology that will introduce Intel Optane DC persistent memory and a set of new AI features called Intel DL Boost. The next, Cooper Lake, will introduce a new-generation platform with significant performance improvements, new I/O features, new DL Boost capabilities that improve AI/deep learning training performance, and additional Optane DC persistent memory innovations. Ice Lake is another Xeon Scalable processor, based on 10nm technology, that shares a common platform with Cooper Lake and is planned as a fast follow-on targeted for 2020 shipments.

While all of this might seem incredibly obvious, it is always worth restating. Intel is right in the sense that the data centre and intelligence businesses will require a completely different approach. Before too long, today's data processing techniques and depth will look very simplistic, though it is not going to be cheap. $200 billion might sound like a wondrous prize for Intel, but customers need to be tempted to sign the cheques. This might be the difference between tomorrow's winners and losers; the winners will be those prepared to spend big today to get a jump start on the intelligence era.

Intel is a company which has had its fair share of negative press, scandals and share price plummets in recent memory, but arguably there are few organizations who could consider themselves in the same league. In claiming $1 billion of the data-centre prize already, Intel is establishing itself at the forefront of tomorrow’s booming industry. Companies like AMD and Nvidia are racing to catch-up with the market leader, but Intel seems to be sitting very comfortably at the moment.

The cost of data storage, transportation and processing is lower than ever and still falling. Use cases such as autonomous vehicles and personalised digital experiences are becoming clearer. The opportunity is right in front of us; now it just needs to be made into reality.

Google looks for help scaling the Great Firewall amid political posturing

Google’s cloud business unit is reportedly in talks with Tencent, Inspur and other domestic companies in an effort to launch services in the lucrative Chinese market.

Many Silicon Valley companies have found it difficult to establish a presence in the world's second largest economy, and Google is no exception. The absence of its core search advertising business helped spur the rise of Baidu, though it seems the Googlers are keen to avoid missing out on the enterprise digital transformation journey which has been so lucrative for giants such as Amazon Web Services and Microsoft Azure.

According to Bloomberg, Google has been searching for local partners since early 2018, though the list has now been narrowed. As one of the most powerful internet companies in the country, Tencent naturally sits at the top of that list, though there are others. Making moves on such a lucrative market might not surprise anyone in the tech world, as all of the major players have been trying to smooth the edges of Chinese government demands for years, but Google seems to be attracting some attention.

The goal here would be to run Google services via local data centres, at least partly owned by domestic companies. This is not unusual; every Western business entering the country has to be grounded by some sort of local partnership, irrespective of industry, vertical or service. Other rules dictate that data has to reside within the country's physical borders, subjecting it to local laws.

What is unknown is whether the global brand can gain traction locally. As is the case with many aspects of the internet, in the absence of the Silicon Valley players, local companies have taken the opportunity to create domestic versions. Baidu is the search engine, though Alibaba and the telcos have stepped up to fill the gap in cloud services. Google might have a very reputable business worldwide, but it is the challenger in China; shifting the status quo is a very difficult task.

Aside from this venture, it was revealed last week that Google is developing an app which would bow to censorship demands, allowing results to be filtered and some removed, in order to penetrate the Great Firewall. The business has also opened an AI research centre in Beijing and plans to open a Hong Kong cloud region in the future. It should come as little surprise that US politicians are finding issue with these efforts.

In a letter to Google CEO Sundar Pichai, a collection of Senators, including Mark Warner of Virginia, Marco Rubio of Florida and Robert Menendez of New Jersey, condemned Google's submission to the government.

“If true, this reported plan is deeply troubling and risks making Google complicit in human rights abuses related to China’s rigorous censorship regime,” the letter reads. “After a cyberattack that compromised the Gmail accounts of dozens of Chinese human rights activists, Google’s March 2010 decision to stop censoring results on was widely praised.

“It is a coup for the Chinese government and Communist Party to force Google – the biggest search engine in the world – to comply with their onerous censorship requirements, and sets a worrying precedent for other companies seeking to do business in China without compromising their core values.”

With tensions continuing to rise between the two nations, Google’s plans to modify its values and principles in an effort to gain entry to the country could be viewed as somewhat of a PR win for the Chinese. Google is one of the companies which could be viewed as a poster boy for the US way of life and ideals, a mix of ruthless capitalism and excellent brand management. It’s a scalp in the war of words and tariffs.

Of course, US political posturing could have much more of an impact than the Googlers would want; intervention from PR-chasing politicians is a very realistic outcome.

Microsoft casually drops data centre into the North Sea

Microsoft’s Project Natick is entering into its second phase of development, and actually sinking a data centre 117 feet into the North Sea, a couple of miles off the coast of Scotland’s Orkney Islands.

It certainly is a novel idea, and while it does sound like a bit of a PR gambit, there is some sound logic: it is small enough to manufacture and deploy quickly, low temperatures at the seabed reduce cooling bills, around half the world's population lives within 120 miles of the coast, and when connected to renewable energy sources on the surface it becomes notably cheaper to run.

The data centre itself has been designed by Naval Group, a French business with expertise in engineering, manufacturing and maintaining military-grade ships and submarines. It is container-sized, loaded with 12 racks containing a total of 864 servers and the associated cooling infrastructure. The systems have been designed as simply as possible to remove the need for human intervention, as once the data centre has been lowered to the seafloor it is impossible for humans to gain entry.

This is another aspect of Project Natick which is both exciting and worrying for the data centre workforce; the asset is designed as a 'Lights Out' operation, essentially meaning automation plays a big role here, though remote control and maintenance are possible. If something breaks, it breaks, but the proof-of-concept vessel operated in the same Lights Out manner for 105 days. This is a good start, but for the idea to be commercially viable, Microsoft will have to prove the data centre can remain operational for five years.

“The most joyful moment of the day was when the data centre finally slipped beneath the surface on its slow, carefully scripted journey,” said Project Leader Ben Cutler.


The idea is a good one, but now the team has to prove the theory is one which will be economically, environmentally and operationally sound. First and foremost is the sturdiness of the data centre. The test site, the European Marine Energy Centre, is one of the rougher locations in the North Sea, with tidal currents travelling up to nine miles per hour at peak intensity, and a sea surface that regularly roils with 10-foot waves which whip up to more than 60 feet in stormy conditions.

“We know if we can put something in here and it survives, we are good for just about any place we want to go,” said Cutler.

Another test will be making sure the theory of energy self-sufficient data centres is a sound one. Colocation with marine renewable energy is an important step here, and could allow Microsoft to bring services to regions of the world with unreliable electricity, and eliminate the need for costly backup generators in case of power grid failures. In this test, a cable from the Orkney Island grid sends electricity to the data centre, which requires just under a quarter of a megawatt of power when operating at full capacity. The Orkney Islands are actually 100% powered by renewable energy, a mix of solar, wind and marine.
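For a sense of scale, the quoted figures imply a modest per-server power draw; a rough back-of-the-envelope, assuming the 864-server count and the quarter-megawatt full-load figure given above:

```python
# Rough power budget for the Natick vessel, using the figures
# quoted in the article: 864 servers drawing just under 250 kW
# at full capacity, cooling and networking overheads included.
full_load_watts = 0.25e6               # "just under a quarter of a megawatt"
servers = 864                          # 12 racks of 72 servers each
watts_per_server = full_load_watts / servers
print(f"~{watts_per_server:.0f} W per server, overheads included")  # ~289 W
```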


The cooling system is another way in which Microsoft can potentially make the data centre more sustainable and efficient to run, as the water surrounding the submerged asset will be pumped through the system to cool the servers. Add in the consistently low temperature of the water at the seabed and the cooling problem of data centres becomes a lot more manageable.

Aside from proving it can be powered and operated, the team also has to demonstrate the logistical side. Building the data centre in the shape of a shipping container means a variety of sizes are available and it can be easily transported from the factory. This is a positive, as was the successful deployment, using 10 winches, a crane, a gantry barge and a remotely operated vehicle which accompanied the data centre on its journey, but at the end of the experiment the team needs to get the asset back out. Recycling the asset is key to the business model here, and despite it being powered by renewable energy, we suspect Microsoft would have few green fans if it simply left it at the bottom of the ocean.

The promise of the project is very attractive. With the rapid growth of internet usage and the increasing data-intensity of applications, infrastructure needs to scale quickly. The idea offers speedy and flexible deployment, low latency, environmental friendliness and cost-effectiveness, but there are still a lot of unknowns. Whether Microsoft can operate a data centre with zero human intervention for five years, in a harsh environment, remains to be seen, but it could be onto a winner.

Cloud fuels Amazon, Microsoft and Intel, but best is yet to come

The quarterly results from Amazon, Microsoft and Intel show profits are certainly in the cloud, but with 5G just around the corner this might only be the start of a cloud renaissance.

Amazon's AWS business unit grew 49% year-on-year, Microsoft's Azure revenues shot up 93%, while Intel's data centre group saw a 24% jump compared to the same period 12 months ago. These are all very promising numbers, but when you consider the promise of 5G and the prospect of a more influential distributed cloud, they might be chump change.

5G promises increased and faster connectivity, as well as more cost efficient delivery, however a conversation which doesn’t seem to get much attention at the moment is the cloud. The more content users are downloading or accessing on the move, the greater the need for a more efficient and influential distributed cloud model. It might not be the most riveting aspect of the telecommunications world, but anyone associated with the cloud segment will clean up, if they aren’t making enough already.

That said, focusing on the here and now, Amazon is maintaining its position as one of the most influential companies on the planet. Net sales increased 43% to $51 billion in the first quarter, with operating income at $1.927 billion. Looking specifically at the cloud business, net sales stood at $5.442 billion, an increase of 49% year-on-year. This figure now accounts for roughly 11% of total revenues at Amazon, while operating income stood at $1.4 billion. Amazingly, 73% of all operating income at the technology mammoth came from AWS.
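That 73% figure checks out against the income numbers quoted above:

```python
# Verify AWS's share of Amazon's Q1 operating income using the
# figures in the article (in billions of dollars).
total_operating_income = 1.927   # Amazon group operating income
aws_operating_income = 1.4       # AWS operating income
share = aws_operating_income / total_operating_income
print(f"AWS share of operating income: {share:.0%}")   # 73%
```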

“AWS had the unusual advantage of a seven-year head start before facing like-minded competition, and the team has never slowed down,” said Jeff Bezos, Amazon CEO. “As a result, the AWS services are by far the most evolved and most functionality-rich.”

AWS might have had a running start at the industry, but it is not alone in fighting for profits in the cloud. Sitting in the number two spot is Microsoft, with its Azure platform making some very promising headway in the space.

Total revenues across the Microsoft business were $26.8 billion, a 16% year-on-year increase, with the Intelligent Cloud unit contributing $7.9 billion, up 17%. In the cloud business, Azure revenue growth was 93%. AWS is still the leader in the cloud world, but Microsoft must be chipping away at that lead with consistent quarterly growth in the 90s.

“Our results this quarter reflect the trust people and organizations are placing in the Microsoft Cloud,” said Satya Nadella, CEO of Microsoft. “We are innovating across key growth categories of infrastructure, AI, productivity, and business applications to deliver differentiated value to customers.”

Over the course of the quarter, Microsoft invested around $3.5 billion in capital expenditure, a notable chunk of which will certainly have gone on enhancing the cloud business. While the cloud can be a very profitable game, it isn't cheap to set up. Google confirmed as much after spending $7.67 billion; both companies demonstrated substantial increases on the year before, with companies like Intel claiming the rewards.

The cloud is incredibly influential now, and will only become more so. Intel's dominant position when it comes to chips in the data centre makes it look like a good bet for any market speculators. Total revenues across the company were up 13% year-on-year to $16.1 billion, while the Data Centre Group brought in $5.2 billion, a 24% boost. The share price is up 7% since the announcement, with the future looking very rosy.

These are all companies making quiet waves in the technology world, but with 5G, bigger and better things could be on the horizon. Our insatiable appetite for data and connectivity will continue to fuel momentum for the cloud players. They might not be the sexiest part of the telecommunications segment, but they are certainly among those with the most interesting prospects moving forward.

Apple flexes its green muscles

Apple has claimed it is now globally powered by 100% clean energy, as it brings another nine of its suppliers onto the green mission.

The firm claims all retail stores, offices, data centres and co-located facilities across 43 countries are now powered 100% by clean energy, with the number of its suppliers committing to the same mission now up to 23. While this is a commendable achievement, it is worth noting that 'clean' does not necessarily mean renewable.

“We’re committed to leaving the world better than we found it,” said Apple CEO Tim Cook. “After years of hard work we’re proud to have reached this significant milestone. We’re going to keep pushing the boundaries of what is possible with the materials in our products, the way we recycle them, our facilities and our work with suppliers to establish new creative and forward-looking sources of renewable energy because we know the future depends on it.”

The definition of clean energy is a source which does not pollute the atmosphere when used. This does not necessarily mean renewable sources such as wind or solar, but it does exclude oil and coal. There are also some grey areas when it comes to retail stores, which are powered by the public grid and therefore out of the control of the iLeader, though it does purchase Renewable Energy Certificates (RECs) as a means to compensate.

Once electricity is produced it is simply dumped onto the grid, where it 'mixes' with output from all the other sources, whether coal, wind or anything else. Because of this, it is impossible to tell which 'electricity' is clean and which is not. When a renewable energy generator produces a megawatt-hour (MWh) of power it receives one REC, a certificate saying it generated one MWh of electricity from clean sources, which it can sell. The electricity Apple consumes might not itself be clean, but through the RECs Apple can say it is funding, and therefore sustaining, the production of clean energy.
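The mechanism can be sketched as a simple ledger; the consumption figures below are invented purely for illustration and have nothing to do with Apple's actual numbers.

```python
# Minimal sketch of REC accounting: one REC certifies one MWh of
# renewable generation, and a buyer retires RECs against its own
# grid consumption to back a "100% clean energy" claim.
consumption_mwh = 10_000       # drawn from the mixed public grid (illustrative)
recs_retired_mwh = 10_000      # certificates bought from renewable generators
covered = min(recs_retired_mwh, consumption_mwh) / consumption_mwh
print(f"Clean energy claim: {covered:.0%}")
```

Note that the claim is about matched generation somewhere on the grid, not about the physical electrons the buyer consumed, which is exactly the "PR haziness" described below.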

While this is a bit of PR haziness, it should not take away from Apple's achievement, which is impressive. These, of course, are examples which are out of Apple's control; where it does have control, the situation is a lot clearer.

Apple currently has 25 operational renewable energy projects around the world, totalling 626 megawatts of generation capacity. It also has 15 more projects under construction which, once built, will spread over 1.4 gigawatts of clean renewable energy generation across 11 countries. Some of the new projects include Apple's new headquarters in Cupertino, which will be powered 100% by renewable energy, and its new data centre in Waukee, Iowa, which will run entirely on renewable energy, as will two new data centres in Denmark. This is on top of all the partnerships with local authorities and utilities.

Alongside its own commitments, Apple has been on a mission to get its suppliers to commit to a greener lifestyle. Some of the newly committed suppliers include DSM Engineering Plastics, Luxshare-ICT and Quanta Computer. This is only a fraction of the Apple supply chain, but it is still a good achievement.

Committing to clean energy is of course a good PR exercise for the iChief and may also help it secure lucrative contracts with public sector organizations which have commitments to greener operations. But it is also a good news story, and those are starting to get rarer.

Intel's Krzanich does his best duck impression

Intel might be fighting back the flames of the Spectre and Meltdown security flaws, but CEO Brian Krzanich has done his best to put on a calm front for investors during the latest earnings call.

We can only imagine Krzanich, much like a duck, is calm on the surface but paddling frantically below. The CEO managed to avoid any backlash when offloading as many Intel shares as possible prior to the Spectre and Meltdown announcements, even finding time to organize some patches for the vulnerabilities. These patches did not live up to the promise, but Krzanich is confident the flaws will not weigh heavily on Intel's financial performance.

“From a cost standpoint, we’ve baked in and we’ve talked about that we don’t expect any material impact of this security exploit on our spending or product cost or any of that,” said Krzanich.

“From a fourth half standpoint, we actually made our forecast and we’ve checked it as we go through this the first two weeks here of the year against our prior forecast to make sure that the forecasting incorporated any changes or any signs we’re seeing up or down.”

Intel has absorbed the majority of the backlash from the industry, though this quarterly call seemed to be focused on calming investors. Intel might be under pressure right now, Krzanich seems to be saying, but don’t worry your pretty head about it. Intel will carry on and doesn’t expect there to be any ‘material’ impact on the business.

What 'material' means in this case is anyone's guess. Krzanich is unlikely to reveal any details about the reallocation of resources or the financial cost of the vulnerabilities, using vague and dismissive language instead. Of course, Krzanich was never going to do anything other than present a positive front, that is what a good CEO does after all, but we are struggling to believe everything is groovy at Intel HQ.

Looking at the financials, Intel has good reason to be happy. Fourth-quarter revenue was $17.1 billion, while full-year revenue stood at $62.8 billion. These figures are 8% and 9% up year-on-year respectively, with data-centric revenue up 21% compared to Q4 in 2016.

This is the challenge Intel faces right now: can it transform itself from a PC-centric business to a data-centric one? IBM has seemingly navigated the choppy digital transformation waters successfully with its first quarter of growth in five years, and Microsoft made the transition to a cloud business years ago, so it can be done. Intel is certainly growing the relevant business units, with data-centric revenues now accounting for 47% of the total, so it is clearly heading in the right direction.

Looking specifically at the data centre business, the core business grew 18% while adjacencies took a 35% uplift. Breaking down the market segments, cloud was up 35%, enterprise and government up 11%, and CSPs up 16%. Elsewhere in the world of Intel, the IoT Group was up 21% with operating income up 43%, the Non-Volatile Memory Solutions Group was up 9%, and the Programmable Solutions Group grew 35%.

The one group which has had little or no attention in the earnings call is security, or McAfee, as the team has now reverted to its previous name. Intel has ushered the majority of this brand to the exit, divesting 51% of the business to TPG Capital last year, though it hasn’t managed to get clear of this headache just yet.

Hats off to Intel for trying to cash in on the security craze before it hit euphoric levels back in 2011, though let’s call this acquisition what it is: a failure. Intel paid $7.68 billion to acquire McAfee, and while it has recouped a healthy proportion of this, once inflation is taken into account the figures make less favourable reading. This failure to diversify is evident in the sale of a majority holding in the business to TPG Capital.

Sweeping this car crash to the curb, Intel is looking like a business which is starting to get ready for the connected economy. The team expect the positive momentum to continue throughout 2018, forecasting full-year revenues of $65 billion, which would be 4% growth, as well as 5% year-on-year growth for the first quarter to $15 billion. Data-centric revenues are expected to be up mid-teens, led by strong memory growth, though the PC-centric business will continue to decline in the low single-digits.

The fact that data-centric revenues are growing faster than PC-centric revenues are declining is a good sign. Krzanich has shown us he knows when to offload shares; perhaps this is evidence he knows how to transform a company into a mean, digital-ready machine.

Soon there will be another cable populating the subsea superhighway

China Telecom, China Unicom, Facebook, Tata Communications, and Telstra have all teamed up to sign a turnkey contract for the deployment of the Hong Kong-Americas (HKA) submarine cable network.

The Hong Kong-Americas (HKA) consortium, as it is officially known, has signed the agreement with Alcatel Submarine Networks to deliver a submarine cable network which will span more than 13,000 km. The new asset will increase connectivity between Hong Kong and the US.

“We are committed to continually investing in our capabilities to meet our customers and partners’ increasing data demands,” said Tata Communications’ CTO, Genius Wong. “Joining the HKA consortium and connecting the new next-generation subsea cable system to our global network means that we are able to offer our customers and partners enhanced speed, diversity and reliability of connectivity between the business hubs of Asia and the US.

“With our growing network – and the cloud, mobility, security and collaboration services which it underpins – as the foundation, our customers and partners are better placed than ever to transform how they operate through new disruptive digital services and expand to new markets with agility.”

“The trust placed upon us by the HKA consortium validates our position as a key player for submarine network infrastructures in the Asia-Pacific region and the reinforcement of our local presence,” said Philippe Piron, President of Alcatel Submarine Networks.

“It also provides a strong platform to further demonstrate our commitment in project management and in the development of local relationships to support operators and content providers for their network and capacity expansion strategies.”

The new cable promises to deliver greater diversity of connections, enhanced reliability and network efficiency, as well as improved connectivity between data centres in Asia and the US. In terms of the kit being used, Alcatel Submarine Networks has promised it will be top of the line, delivering 80 Tbps of transmission capacity.

While it might not be the most glamorous part of the telco space, subsea cables are a crucial one. Google is another company joining the subsea party, recently announcing investment in three new cables, one of which will be privately owned by the internet search giant.