It’s been a year in the making, but Microsoft is going through the final preparations to launch its game-streaming service, Project xCloud.
The project itself will allow Xbox gamers to play their favourite games by streaming the content onto their mobile devices. Although the technology giant has had to fit out its data centres with specialist servers to run the games, the extensive geographical footprint of its data centre network could make Microsoft a force to be reckoned with in the emerging cloud gaming segment.
“Our vision for Project xCloud is to empower the gamers of the world to play the games they want, with the people they want, anywhere they want,” said Kareem Choudhry, Corporate VP for Project xCloud at Microsoft.
“We’re building this technology so gamers can decide when and how they play. Customers around the world love the immersive content from Xbox in their homes and we want to bring that experience to all of your mobile devices.”
Next month, the public trial will be launched. The US, UK and Korea have been selected as the initial testing grounds, with consumers able to sign up here. All you’ll need is a wireless controller with Bluetooth and a stable mobile internet connection of at least 10 Mbps.
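That 10 Mbps figure has a practical cost on a metered mobile plan. A rough back-of-envelope sketch, assuming (hypothetically, since Microsoft has not published typical bitrates) the stream sustains the full 10 Mbps minimum:

```python
# Rough estimate of mobile data consumed by one hour of game streaming,
# assuming (hypothetically) the stream runs at the full 10 Mbps minimum.
MBPS = 10                       # megabits per second
SECONDS_PER_HOUR = 3600

bits_per_hour = MBPS * 1_000_000 * SECONDS_PER_HOUR
gigabytes_per_hour = bits_per_hour / 8 / 1_000_000_000  # decimal GB

print(f"~{gigabytes_per_hour:.1f} GB per hour of play")  # ~4.5 GB
```

Actual consumption will vary with resolution and compression, but it illustrates why a Wi-Fi connection, rather than a capped mobile allowance, may be the realistic home for xCloud sessions.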
Intel has reported its revenues for 2018, and while executives will be preaching their success, the market has reacted very differently.
With fourth quarter revenue of $18.7 billion, a 9% year-on-year increase, and full-year revenues of $70.8 billion, up 13% compared to 2017, it looks pretty good. Executives will feel they have done a fair job, though a 6.1% decline in after-hours trading tells a different story.
“2018 was a truly remarkable year for Intel with record revenue in every business segment and record profits as we transform the company to pursue our biggest market opportunity ever,” said Bob Swan, Intel CFO and Interim CEO.
“In the fourth quarter, we grew revenue, expanded earnings and previewed new 10nm-based products that position Intel to compete and win going forward. Looking ahead, we are forecasting another record year and raising the dividend based on our view that the explosive growth of data will drive continued demand for Intel products.”
Despite a 9% increase in sales during the final period, Intel missed expectations while the markets are reacting to what is perceived as a weak forecast. The team has forecast revenues of $16 billion over the first quarter and a 27% operating margin, while full-year revenues are expected to grow to $71.5 billion with an operating margin of 32%.
Looking at the specific segments, the PC-centric business brought in $9.8 billion, a 10% year-on-year boost, in the final quarter, and $37 billion across the year, up 9%. Intel suggested the higher performance products and strength in commercial and gaming segments brought about the growth.
Collectively, Intel’s data-centric businesses grew 20% year-on-year to $32.9 billion for the full year, with the data centre unit collecting the lion’s share, growing 21% year-on-year. Interestingly, mobile revenues have not been specifically listed here, though this could be a very successful venture in the future.
It increasingly looks like Intel will be supplying modems for all Apple devices moving forward, as the iLeader looks to ditch Qualcomm. Intel has a chequered past when it comes to mobile, but this could certainly be a more profitable venture as Apple bleeds its customers dry.
As it does from time to time, German regulator Bundeskartellamt has published a list of mergers and acquisitions which it is evaluating. IBM and T-Systems are lucky enough to make the list.
Reports of the discussions emerged over the weekend, with IBM rumoured to be considering taking the mainframe service business unit off the hands of the struggling T-Systems. Although the specifics of the deal are not completely clear right now, it would hardly be a surprise to learn T-Systems is attempting to slim the business down.
On the Bundeskartellamt website, there is a page which lists some of the main transactions the regulator is considering in its role as merger overseer. These are mainly deals in the ‘first phase’, which usually pass unless there are competition concerns. Although the description is not detailed, it states IBM will be acquiring certain assets from T-Systems.
The origins of such a deal can only lead back to one place; the office of T-Systems CEO Adel Al-Saleh. Al-Saleh was initially brought to the firm, having previously worked at IBM for almost two decades, to trim costs and salvage a business unit which, recently, has been nothing but bad news for parent company Deutsche Telekom. Aside from this saga, job cuts of roughly 10,000 have been announced since Al-Saleh’s appointment.
Confirmed back in June, the 10,000 job cuts were a result of a long-time losing battle to the more agile and innovative players such as AWS and Microsoft. Al-Saleh’s objective was to trim the fat, focusing on the more lucrative contracts, as well as more profitable, emerging segments of the IT and telco world.
While T-Systems and IBM do already have an established relationship, it seems options are running thin to make this business work effectively. With headcount going down from 37,000 to 27,000, its footprint dropping from 100 cities to 10 and this deal working through the cogs as we speak, Deutsche Telekom employees will hope this is the last of the bad news. Whether Al-Saleh feels this is enough restructuring to make the business work remains to be seen.
Huawei has unveiled a new ARM-based CPU called Kunpeng 920, designed to capitalise on the growing euphoria building around big data, artificial intelligence and edge-computing.
The CPU was independently designed by Huawei based on an ARMv8 architecture license, with the team claiming it improves processor performance by optimizing branch prediction algorithms, increasing the number of OP units, and improving the memory subsystem architecture. Another bold claim is that the CPU scores over 930 in the SPECint benchmark, 25% higher than the industry benchmark.
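Huawei’s claim can be inverted to see what industry baseline it implies. A quick sketch, taking the quoted figures at face value (Huawei has not published the comparison methodology):

```python
# Back out the implied industry baseline from Huawei's claim that the
# Kunpeng 920's SPECint score of 930+ is 25% above the benchmark.
kunpeng_score = 930
claimed_uplift = 0.25   # "25% higher than the industry benchmark"

implied_baseline = kunpeng_score / (1 + claimed_uplift)
print(f"implied industry baseline ≈ {implied_baseline:.0f}")  # ≈ 744
```

Without knowing which SPECint variant and which competing parts were used, the absolute numbers are hard to verify, but the arithmetic at least shows the scale of the advantage being claimed.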
“Huawei has continuously innovated in the computing domain in order to create customer value,” said William Xu, Chief Strategy Marketing Officer of Huawei.
“We believe that, with the advent of the intelligent society, the computing market will see continuous growth in the future. Currently, the diversity of applications and data is driving heterogeneous computing requirements. Huawei has long partnered with Intel to make great achievements. Together we have contributed to the development of the ICT industry. Huawei and Intel will continue our long-term strategic partnerships and continue to innovate together.”
The launch itself is firmly focused on the developing intelligence economy. With 5G on the horizon and a host of new connected services promised, the tsunami of data and focus on edge-computing technologies is certain to increase. These are segments which are increasingly featuring on the industry’s radar and Huawei might have stolen a couple of yards on the buzzword chasers ahead of the annual get-together in Barcelona.
“With Kirin 980, Huawei has taken smartphones to a new level of intelligence,” said Xu. “With products and services (e.g. Huawei Cloud) designed based on Ascend 310, Huawei enables inclusive AI for industries. Today, with Kunpeng 920, we are entering an era of diversified computing embodied by multiple cores and heterogeneity. Huawei has invested patiently and intensively in computing innovation to continuously make breakthroughs.”
Another interesting angle to this launch is the slight shuffle further away from the US. With every new product Huawei launches, more of its own technology will feature. In years gone by, should Huawei have wanted to launch any new servers or edge computing products, it would have had to look externally for CPUs. Considering Intel and AMD have a strong position in these segments, supply may have come from the US.
For any other company, this would not be a problem. However, considering the escalating trade war between the US and China, and the fact Huawei’s CFO is currently awaiting trial for violating US trade sanctions with Iran, this is a precarious position to be in.
Two US Senators have signed a letter addressed to India’s Prime Minister Narendra Modi suggesting new rules to tighten up data practices in the country could lead to a weakened trade relationship with the US.
The US Government has already shown the damage which can be done when it starts throwing around economic sanctions and hurdles, almost sending ZTE to join the Dodo on the extinction list, and it appears to be using the same tactics here. However, instead of punishing an organization which broke trade laws, it is attempting to bully a country into its own line of thinking and away from a pro-privacy stance.
“We see this (data localization) as a fundamental issue to the further development of digital trade and one that is crucial to our economic partnership,” the letter signed by Senators John Cornyn and Mark Warner states. The Senators serve as co-chairs of the Senate’s India caucus.
The letter, seen by Reuters, relates to new data protection, privacy and localisation rules which are set to come into play this week (October 15). The rules have been in the making for some time, and while there are some very suspect clauses, this is an attempt to tame the wild-west internet in the country, applying regulations which should be deemed more acceptable for the digital economy.
Back in July, the Indian Government unveiled a report which detailed its new approach to data regulations in the country. Included in the rules are restrictions on how data can be collected and utilised, setting out a similar stance to GDPR in Europe, while also including new approaches such as the right to be forgotten, explicit opt-in consent for certain categories of data (that which is deemed sensitive), and also data localisation. It is a much more stringent approach to the data economy, taking India closer to the European stance on privacy than the US’ views.
Aside from the data protection and privacy benefits of localisation, and not to mention greater influence for the Indian Government, such a strategy also stimulates the economy. Local jobs will have to be created and new data centres will have to be constructed to meet the rising demands of the increasingly digital Indian economy and society. These are clearly benefits for the country, though the threat of an impact on US trade will certainly be a worry for India.
Over the course of 2017, India exported $34.83 billion worth of goods and services to the US. This figure accounted for 16% of the total exports for the country, making the US the largest trading partner. The US Government certainly does have leverage to coerce India into its own way of thinking.
The letter from the Senators also happens to coincide with some pretty heavy lobbying from the likes of Visa, Mastercard and American Express. All would certainly find life simpler if there was no such thing as localisation, though it seems lobbying Senators to fight the cause has been more effective than efforts to persuade Indian officials to head in a different direction. The new rules seem to have been influenced by Europe’s GDPR, though the US, both the Government and companies, has a different approach to data than the pro-privacy Europeans.
With India’s economy fast evolving from analogue to digital, there certainly will be profits to be made. Many US companies, most notably those in Silicon Valley, will be looking greedily at the country though such rules would make life more difficult. Not impossible, but not as simple. Perhaps the economic weight of the US Government can bully India into believing the ‘American Dream’.
Q2 wasn’t exactly a party for Samsung, though it seems ready to correct the dip at the first possible opportunity.
Samsung Electronics has announced its earnings guidance for the third quarter of 2018, with consolidated sales coming in at 65 trillion Korean won (roughly $57.3 billion), up 5% year-on-year, and profits of 17.5 trillion won ($15.8 billion). The profits would represent a 20% uplift compared to the same period of 2017.
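Since Samsung’s guidance gives only the current figures and the growth rates, the year-ago comparables can be backed out. A quick sketch, assuming the quoted growth rates are exact rather than rounded:

```python
# Back-calculate Samsung's Q3 2017 figures from the Q3 2018 guidance
# (65tn won sales, +5% YoY; 17.5tn won profit, +20% YoY).
sales_q3_2018, sales_growth = 65.0, 0.05     # trillion won
profit_q3_2018, profit_growth = 17.5, 0.20   # trillion won

sales_q3_2017 = sales_q3_2018 / (1 + sales_growth)
profit_q3_2017 = profit_q3_2018 / (1 + profit_growth)

print(f"Q3 2017: sales ≈ {sales_q3_2017:.1f}tn won, "
      f"profit ≈ {profit_q3_2017:.1f}tn won")
```

That implies roughly 61.9 trillion won of sales and 14.6 trillion won of profit a year earlier, which squares with the 20% profit uplift the article describes.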
This is a much more positive return for the business compared to the second quarter of 2018. Three months ago the business reported sales of $52.1 billion, a decline of 4% year-on-year, though this was in-line with guidance offered in the weeks leading up to the earnings call.
During the earnings call, the downward-dog-resembling graph was blamed on a sluggish S9, though this might be partly due to the 5G euphoria. With the promise of 5G-compatible phones in 2019, some users might be wary of purchasing a device which will become old news in a matter of months.
That said, the semiconductor business has been a very strong performer for Samsung through the last couple of quarters, and some might suspect this is the case once again. Full results are due to be released on October 31.
With Brexit chat flying back and forth across the English Channel, London might be a dirty word, but that hasn’t stopped it from charging ahead of interconnection trends.
Interconnection bandwidth, the movement of data without traversing the public internet, is an aspect of the connectivity industry which has been growing steadily over the last few years without attracting headlines. As more organizations progress in their digital transformation journeys, it could be used as a good measure of digital economic activity in a region. According to Equinix, there is little evidence Brexit is having any material impact on business just yet.
Quoting its second annual Global Interconnection Index (GXI), interconnection bandwidth is forecast to grow to 8,200 Terabits per second (Tbps) of capacity by 2021, the equivalent of 33 Zettabytes (ZB) of data exchange per year, which is projected to be ten times the capacity of internet traffic. This represents a five-year compound annual growth rate (CAGR) of 48%, compared to a 26% CAGR for global IP traffic. Despite many fearing Brexit would isolate the UK from Europe, if the flow of information is taken as one of the measures of economic productivity, things don’t look too bad right now.
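The Tbps-to-Zettabytes conversion is easy to sanity-check. A sketch assuming decimal units and fully sustained capacity (a simplification, since Equinix does not publish its exact methodology):

```python
# Sanity-check the GXI conversion: does 8,200 Tbps of sustained capacity
# roughly equal 33 ZB of data exchanged per year? (Decimal units assumed.)
TBPS = 8_200
SECONDS_PER_YEAR = 365 * 24 * 3600

bits_per_year = TBPS * 1e12 * SECONDS_PER_YEAR
zettabytes_per_year = bits_per_year / 8 / 1e21   # 1 ZB = 1e21 bytes

print(f"~{zettabytes_per_year:.0f} ZB/year")  # ~32 ZB
```

The back-of-envelope result of roughly 32 ZB lines up with the ~33 ZB figure quoted, which suggests Equinix is treating the capacity as fully utilised year-round.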
Across Europe, London, Amsterdam, Frankfurt and Paris are the fastest growing markets for interconnection bandwidth, with only Amsterdam (57% CAGR) and Frankfurt (58% CAGR) outperforming London (52% CAGR) for projected growth. And while London sits in third for projected growth, it is still the most important market for Equinix on the continent accounting for more than 35% of Europe’s private data exchange growth. It is simply that much bigger than any other region.
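For context, those CAGR figures compound multiplicatively, so even London’s “third-place” 52% implies a dramatic expansion over the forecast window. A quick illustration using the figures quoted above:

```python
# What a 52% compound annual growth rate means over the GXI's
# five-year forecast window: capacity multiplies by (1 + rate) each year.
cagr = 0.52
years = 5

growth_multiple = (1 + cagr) ** years
print(f"~{growth_multiple:.1f}x over {years} years")  # ~8.1x
```

In other words, London’s interconnection bandwidth is projected to grow roughly eightfold, which is why a few percentage points of CAGR difference versus Amsterdam and Frankfurt matter less than London’s much larger starting base.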
“Despite Brexit and political uncertainty in the UK, the GXI reveals that London is projected to show strong growth, accounting for more than 35% of Europe’s Interconnection Bandwidth growth,” said Russell Poole, MD of Equinix’s UK business. “London’s digital acceleration shows that post-Brexit, Interconnection Bandwidth continues to be driven by the secular growth of global data traffic and the massive shift in IT to support this data explosion.”
The growth of interconnection bandwidth is driven by digital transformation. As more business-critical processes become digitized, the need to ensure secure, real-time interactions between people, things, locations, clouds and data becomes more important. Using private data traffic exchanges to bypass the public internet and mitigate digital threats, reducing the number of vulnerability points, is becoming increasingly popular. Despite Brexit looming on the horizon, the GXI claims 64% of decision makers believe the UK is still the best place in Europe to interconnect with partners, customers, supply chains and cloud service providers, due to the growing data centre industry.
While these statistics are encouraging, at least for those who live in the UK, they ultimately mean very little. No-one really knows what the impact of Brexit will be until we’ve actually left the union. That said, it is nice to see not everyone is jumping on the fear-mongering bandwagon.
The newly emerging artificial intelligence market will be a money-maker, that is no secret, though Intel appears to have stolen an early march on the prospering technology.
The challenge which the technology industry, and all the technology teams in the industrial verticals, are facing is the astronomical amount of data which is flowing around the digital society. To realise revenues associated with the data economy, companies have to make use of this information. That means collecting, storing and processing data with a completely different mindset.
“I find it astounding that 90% of the world’s data was generated in the past two years,” said Navin Shenoy, GM of Intel’s data centre business unit. “And analysts forecast that by 2025 data will exponentially grow by 10 times and reach 163 zettabytes. But we have a long way to go in harnessing the power of this data. A safe guess is that only about 1% of it is utilized, processed and acted upon. Imagine what could happen if we were able to effectively leverage more of this data at scale.”
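Shenoy’s “10 times by 2025” claim implies a steep but not implausible annual growth rate. A rough sketch, assuming (since the article does not say) a 2018 baseline of roughly one tenth of the 163 ZB forecast:

```python
# Implied annual growth rate behind "data will grow by 10 times by 2025",
# assuming (hypothetically) a 2018 baseline of ~16.3 ZB.
baseline_zb, target_zb = 16.3, 163.0
years = 2025 - 2018

cagr = (target_zb / baseline_zb) ** (1 / years) - 1
print(f"~{cagr:.0%} per year")  # ~39% per year
```

A sustained ~39% annual growth rate is aggressive, but it is in the same ballpark as the interconnection bandwidth growth figures other industry forecasts were quoting at the time.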
As with every challenge which is faced by the industry, there is an opportunity to make money. Shenoy believes this challenge is worth more than $200 billion to the data centre industry, the largest prize Intel has ever set its eyes on. Having already claimed $1 billion of this fortune, the team has targeted $10 billion by 2022. And as you can imagine, Intel has developed a new product roadmap to capture the fortunes.
This is of course part of the overall transformation of the Intel business as it shifts more towards the data-centric dream. It wasn’t too long ago the data centre business accounted for less than a third of total revenues; now it is almost half and continuing to gather momentum. The AI portfolio is receiving a lot of attention now, and this will only fuel the shift across to the data-centric dreamland. At Intel’s Data-Centric Innovation Summit, Shenoy provided some much-needed depth to the Intel product roadmap.
First and foremost comes the need to move faster in the data centre. Silicon photonics is one product breakthrough which will help, an innovation which has been incorporated into Xeon processor-based systems, with new products such as SmartNIC set to launch onto the market next year. Speed is of course one aspect, but storage is another. Be prepared for an assault of new products from Intel in this space, as the NAND-based products have already been performing very well, while elsewhere, the first Optane DC persistent memory products have been shipped to Google.
The final aspect of the new data-defined era is the need to process everything. This is immensely complicated and requires a huge amount of processing power, but this is the point of the data economy; process seemingly redundant information into useful insight. Last year, Intel unveiled its Xeon Scalable platform, shipping more than 2 million units in 2018’s second quarter, and now it has revealed the next stages of the product roadmap.
Cascade Lake is a future Intel Xeon Scalable processor based on 14nm technology that will introduce Intel Optane DC persistent memory and a set of new AI features called Intel DL Boost. The next, Cooper Lake, will introduce a new generation platform with significant performance improvements, new I/O features, new DL Boost capabilities that improve AI/deep learning training performance, and additional Optane DC persistent memory innovations. Ice Lake is another Xeon Scalable processor based on 10nm technology that shares a common platform with Cooper Lake and is planned as a fast follow-on targeted for 2020 shipments.
While all of this might seem incredibly obvious, it is always worth restating the fact. Intel is right in the sense it will require a completely different approach to how data centres and the intelligence business are approached. Before too long, today’s data processing techniques and depth will look very simplistic, though it is not going to be cheap. $200 billion might sound like a wondrous prize for Intel, but customers need to be tempted to sign the cheques. This might be the difference between tomorrow’s winners and losers, those who are prepared to spend big today to get a jump start on the intelligence era.
Intel is a company which has had its fair share of negative press, scandals and share price plummets in recent memory, but arguably there are few organizations who could consider themselves in the same league. In claiming $1 billion of the data-centre prize already, Intel is establishing itself at the forefront of tomorrow’s booming industry. Companies like AMD and Nvidia are racing to catch-up with the market leader, but Intel seems to be sitting very comfortably at the moment.
The price of data storage, transportation and processing is cheaper than ever and still going down. Use cases such as autonomous vehicles and personalised digital experiences are becoming clearer. The opportunity is right in front of us; now it just needs to be made a reality.
Google’s cloud business unit is reportedly in talks with Tencent, Inspur and other domestic companies in an effort to launch services in the lucrative Chinese market.
Many Silicon Valley companies have found difficulty in establishing a presence in the world’s second largest economy, and Google is no exception. The absence of its core search advertising business has spurred on the establishment of Baidu, though it seems the Googlers are keen to avoid missing out on the digital transformation journey in enterprise which has been so lucrative to Google as well as other giants such as Amazon Web Services and Microsoft Azure.
According to Bloomberg, Google has been searching for local partners since early 2018, though the list has since been narrowed. As one of the most powerful internet companies in the country, Tencent naturally sits at the top, though there are more. Making moves on such a lucrative market might not surprise anyone in the tech world, as all of the major players have been trying to smooth the edges of Chinese government demands for years, but Google seems to be attracting some attention.
The goal here would be to run Google services via local data centres, at least partly owned by domestic companies. This is not unusual, every Western business entering the country has to be grounded by some sort of local partnership, irrelevant of industry, vertical or service. Other rules in the country dictate data has to reside within the country’s physical borders, subjecting it to local laws.
What is unknown is whether the global brand can gain traction locally. As is the case with many aspects of the internet, in the absence of Silicon Valley players, local companies have taken the opportunity to create domestic versions. Baidu is the search engine, though Alibaba and the telcos have stepped up to fill the gap in cloud services. Google might have a very reputable business worldwide, but it is the challenger in China; shifting the status quo is a very difficult task.
Aside from this venture, it was revealed last week Google is in the process of developing an app which would bow to censorship demands, allowing results to be filtered and some removed, to penetrate the Great Firewall. The business has also opened an AI research centre in Beijing and plans to open a Hong Kong cloud region in the future. It should also come as little surprise US politicians are finding issue with the efforts.
In a letter to Google CEO Sundar Pichai, a collection of Senators including Mark Warner of Virginia, Marco Rubio of Florida and Robert Menendez of New Jersey condemned Google’s submission to the government.
“If true, this reported plan is deeply troubling and risks making Google complicit in human rights abuses related to China’s rigorous censorship regime,” the letter reads. “After a cyberattack that compromised the Gmail accounts of dozens of Chinese human rights activists, Google’s March 2010 decision to stop censoring results on Google.cn was widely praised.
“It is a coup for the Chinese government and Communist Party to force Google – the biggest search engine in the world – to comply with their onerous censorship requirements, and sets a worrying precedent for other companies seeking to do business in China without compromising their core values.”
With tensions continuing to rise between the two nations, Google’s plans to modify its values and principles in an effort to gain entry to the country could be viewed as somewhat of a PR win for the Chinese. Google is one of the companies which could be viewed as a poster boy for the US way of life and ideals, a mix of ruthless capitalism and excellent brand management. It’s a scalp in the war of words and tariffs.
Of course, US political posturing could have much more of an impact than the Googlers would want; intervention from PR chasing politicians is a very realistic outcome.
Microsoft’s Project Natick is entering into its second phase of development, and actually sinking a data centre 117 feet into the North Sea, a couple of miles off the coast of Scotland’s Orkney Islands.
It certainly is a novel idea, and while it does sound like a bit of a PR gambit, there is some sound logic. It’s small enough to manufacture and deploy pretty quickly, low temperatures at the seabed reduce cooling bills, around half the world’s population lives within 120 miles of the coast, and when connected to renewable energy sources on the surface, it becomes notably cheaper to run.
The data centre itself, designed by Naval Group, a French business with expertise in engineering, manufacturing and maintaining military-grade ships and submarines, is container-sized and loaded with 12 racks containing a total of 864 servers and the associated cooling system infrastructure. The systems have been designed as simply as possible to remove the need for human intervention, as once the data centre has been lowered to the seafloor, it is impossible for humans to gain entry.
This is another aspect of Project Natick which is both exciting and worrying for the data centre workforce; the asset is designed as a ‘Lights Out’ operation, essentially meaning automation plays a big role here, though remote control and maintenance is possible. If something breaks, it breaks, but the PoC vessel operated in the same Lights Out manner for 105 days. This is a good start, but for the idea to be commercially viable, Microsoft will have to prove the data centre can remain operational for five years.
“The most joyful moment of the day was when the data centre finally slipped beneath the surface on its slow, carefully scripted journey,” said Project Leader Ben Cutler.
The idea is a good one, but now the team has to prove the theory is one which will be economically, environmentally and operationally sound. First and foremost is the sturdiness of the data centre. The test site, the European Marine Energy Centre, is one of the rougher locations in the North Sea, with tidal currents travelling up to nine miles per hour at peak intensity and the sea surface regularly roils with 10-foot waves that whip up to more than 60 feet in stormy conditions.
“We know if we can put something in here and it survives, we are good for just about any place we want to go,” said Cutler.
Another test will be making sure the theory of energy self-sufficient data centres is a sound one. Colocation with marine renewable energy is an important step here, and could allow Microsoft to bring services to regions of the world with unreliable electricity, and eliminate the need for costly backup generators in case of power grid failures. In this test, a cable from the Orkney Island grid sends electricity to the data centre, which requires just under a quarter of a megawatt of power when operating at full capacity. The Orkney Islands are actually 100% powered by renewable energy, a mix of solar, wind and marine.
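That “just under a quarter of a megawatt” figure also gives a feel for the per-server budget. A back-of-envelope sketch, assuming (hypothetically) an even split across all 864 servers, with cooling and other overheads included:

```python
# Rough per-server power draw for the Natick vessel, assuming the
# "just under a quarter of a megawatt" full-load figure is spread
# evenly across the 864 servers (cooling overhead included).
TOTAL_WATTS = 250_000   # ~0.25 MW at full capacity
SERVERS = 864

watts_per_server = TOTAL_WATTS / SERVERS
print(f"~{watts_per_server:.0f} W per server")  # ~289 W
```

At under 300 W per server including cooling, the seawater-cooled design looks efficient next to a conventional facility, where chillers alone can add a significant fraction on top of the IT load.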
The cooling system is another way in which Microsoft can potentially make the data centre more sustainable and efficient to run, as the water surrounding the submerged asset will be pumped through the system to cool the servers. Add in the consistently low temperature of the water at the seabed and the cooling problem of data centres becomes a lot more feasible.
Aside from proving it can be powered and operated, the team also have to demonstrate the logistical side. Building the data centre in the shape of a container means there are a variety of sizes available and it can be easily transported from the factory. This is a positive, as is the successful deployment, using 10 winches, a crane and a gantry barge, as well as a remotely operated vehicle that accompanied the data centre on its journey, but at the end of the experiment, the team need to get the asset back out. Recycling the asset is key for the business model here, and despite it being powered by renewable energy, we suspect Microsoft would have few green fans if it just left it at the bottom of the ocean.
The promise of the project is very attractive. With the rapid growth of internet usage and the increasing data-intensity of applications, infrastructure needs to scale quickly. The idea offers speedy and flexible deployment, low latency, environmental friendliness and cost-effectiveness, but there are still a lot of unknowns. Whether Microsoft can operate a data centre with zero human intervention for five years, in a harsh environment, remains to be seen, but it could be onto a winner.