Orange goes submarine with Google cable partnership

Orange has been announced as the latest partner to join Google on its monstrous mission to bulk out its sprawling connectivity infrastructure.

The telco will act as the French ‘landing partner’ for Google’s Dunant transatlantic submarine cable, which is set to come into operation in 2020. As part of the partnership, Orange will provide backhaul services to Paris, while also benefiting from fibre-pairs with a capacity of more than 30 Tbps per pair.

“I am extremely proud to announce this collaboration with Google to build a new, cutting-edge cable between the USA and France,” said Stéphane Richard, CEO of Orange. “The role of submarine cables is often overlooked, despite their central role at the heart of our digital world. I am proud that Orange continues to be a global leader in investing, deploying, maintaining and managing such key infrastructure. Google is a major partner for Orange and this project reflects the spirit of our relationship.”

Announced back in July, Dunant (named after Nobel Peace Prize winner Henri Dunant) is Google’s first private investment across the Atlantic and supplements one of the busiest routes on the internet. The 6,600km cable will connect the Atlantic coast of France to Virginia Beach in the US, and is set to be the first new cable to link the two countries in 15 years.

While many organizations invest in infrastructure through consortiums (Orange alone has invested in more than 40 submarine cables), few have taken Google’s approach of being the sole investor. It might be a more expensive approach, but Google will have more control over the capacity and route of the cable, perhaps giving it a bit of an edge over competitors. The weight of such investments has been putting some dents in the spreadsheets, with the CAPEX column doubling in the latest quarterly earnings call, though it does put Google in a solid position.

From Orange’s perspective, the partnership will strengthen the telco’s position to support the development of new uses for its consumer and enterprise customers in Europe and America. It will also be in a stronger position to provide services to the wholesale market, such as content providers and third-party operators.

Google continues infrastructure investment with trans-Atlantic cable

Google has continued its cloud infrastructure quest with its first private trans-Atlantic subsea cable connecting Virginia Beach in the US to the French Atlantic coast.

The cable, named Dunant, is expected to become available in late 2020, adding network capacity across the Atlantic and supplementing one of the busiest routes on the internet. This will be the 13th cable Google has invested in, though it is the first Atlantic asset which will be privately owned in its entirety.

“Today, we’re announcing our newest private subsea cable project: Dunant,” said Jayne Stowell, Strategic Negotiator at Google. “This cable crosses the Atlantic Ocean from Virginia Beach in the U.S. to the French Atlantic coast, and will expand our network–already the world’s largest–to help us better serve our users and customers. The Dunant cable is expected to become available in late 2020.”

The cable itself has been named after Henri Dunant, the first Nobel Peace Prize winner and founder of the Red Cross, and unveils another little quirk of the Google business. The Dunant cable follows Curie, a cable connecting Chile to Los Angeles which was announced in January, and Google has confirmed the trend will continue. Like the Android updates, which follow the alphabet (funny that) and are named after sweets, the cable investments will also be named alphabetically, but after famous scientists.

There are a few to pick from when it comes to E. ‘America’s greatest inventor’ Thomas Edison could be an option, as could astronomer Arthur Eddington or Greek scientist Eratosthenes, the first person to produce a reliable, logical method to discover prime numbers. With AI as 2018’s buzzword, 1963 Nobel Prize winning neurophysiologist John Eccles could also be a good outside bet, but the safe money would have to go on Albert Einstein.
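For the technically curious, the method in question is the sieve of Eratosthenes, which is still how small primes are generated today; a minimal sketch (purely illustrative, and nothing to do with cables) looks like this:

```python
# Sieve of Eratosthenes: cross off multiples of each prime, leaving only primes.
def primes_up_to(n):
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, flag in enumerate(is_prime) if flag]

print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```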

Northern Virginia has been receiving plenty of attention in recent months; aside from the cables landing on its beaches, Google has also been focusing some data centre investments in the region. In the escalating cloud battle, Google has some catching up to do, as AWS is largely believed to dominate this part of the States. Whether Google has the financial clout to compete against the market leader remains to be seen, but the search giant has not been scared to sign cheques in recent months.

Dunant in the Atlantic and Curie in the Pacific are two serious investments being made by Google. Competitors are also involved in consortiums to build and maintain other cables around the world, though few head down the route of privately funding their own. Here, Google will have more control over the capacity and route of the cable, perhaps giving it a bit of an edge over competitors such as Microsoft and AWS, who will have to compromise with partners when crossing the Atlantic.

The major cloud players are not short of cash to invest, though whether Google veering towards private investment in subsea cables kicks off a new trend remains to be seen.


GSMA points finger at greedy governments for poor connectivity in developing nations

Spectrum pricing is one of the biggest bugbears of the GSMA, and its latest report is another opportunity for the lobby-group to wag the finger of disapproval at governments trying to replenish bank accounts at the expense of connectivity.

The ‘Spectrum Pricing in Developing Countries’ report, released by the GSMA Intelligence unit, puts forward familiar arguments from the association, which of course represents the interests of the telcos. Whenever the GSMA has a bee in its bonnet you have to take any criticism with a pinch of salt, but it is difficult to argue with some of the evidence and conclusions which have been put forward here.

“Connecting everyone becomes impossible without better policy decisions on spectrum,” said Brett Tarnutzer, Head of Spectrum at the GSMA.

“For far too long, the success of spectrum auctions has been judged on how much revenue can be raised rather than the economic and social benefits of connecting people. Spectrum policies that inflate prices and focus on short-term gains are incompatible with our shared goals of delivering better and more affordable mobile broadband services. These pricing policies will only limit the growth of the digital economy and make it harder to eradicate poverty, deliver better healthcare and education, and achieve financial inclusion and gender equality.”

The view must be exceptional atop that incredibly high horse… but there is a point to the self-righteous proclamation.

Graph: GDP-adjusted spectrum prices, developing vs developed countries

As the graph above shows, the trends are quite clear. When the GDP of the country is taken into account, the adjusted cost of spectrum licenses is three times higher in developing nations than in developed ones. There will of course be several reasons for this, not just greedy governments trying to bolster spreadsheets as the GSMA seems to be implying, but it is not a correlation which makes much sense.

These are countries where mobile infrastructure is under-developed, so significant investments need to be made to kickstart the digital evolution. Spending more on spectrum licenses reduces the amount (in theory) an operator can spend on that infrastructure, while it is also fair to assume lower national GDP means lower ARPU for the operators. In a perfect world, every government would be able to reclaim the same amount from spectrum sales at each of its auctions, but life is rarely that fair. Reserve prices and annual license fees have to be adapted to the market to make an attractive business case for the operators.
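To make the comparison concrete, spectrum prices are usually normalised per MHz and per head of population, then set against income levels. A rough sketch of that calculation, using entirely made-up figures rather than anything from the GSMA report:

```python
# Illustrative GDP-adjusted spectrum price comparison (all numbers invented).

def adjusted_price(price_usd, mhz, population, gdp_per_capita_usd):
    """Price per MHz per head of population, as a share of GDP per capita."""
    per_mhz_per_pop = price_usd / (mhz * population)
    return per_mhz_per_pop / gdp_per_capita_usd

markets = {
    # market: (licence price $, MHz awarded, population, GDP per capita $)
    "developed_example":  (1_000_000_000, 100, 60_000_000, 45_000),
    "developing_example": (150_000_000,    50, 30_000_000,  1_500),
}

for name, params in markets.items():
    print(f"{name}: {adjusted_price(*params):.2e}")

# The developing-market licence has a far smaller headline price, yet once
# bandwidth, population and income are factored in it works out many times
# more expensive, which is the pattern the GSMA report highlights.
```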

It is worth noting that not every developing nation is getting these policy decisions wrong, and in some cases the lessons have been learned. But where they have not, there are consequences.

Jamaica is an example of where high reserve prices caused chaos. With a reserve price of $40-45 million set in 2013 for 700 MHz spectrum, the auction was ignored by the operators. A year later the government corrected its mistake, though the launch of 4G networks was delayed until 2016 as a result, and Jamaica’s 4G penetration today is less than half the average across the Caribbean. Mozambique is another example; it offered a total of 50 MHz in the 800 MHz band for a reserve price of $150 million. Operators would have had to invest a third of their annual mobile service revenues just to meet the starting bid, which was concluded to be completely unfeasible. This is a country where the situation has not been rectified and 4G is yet to be launched, though the National Communications Institute of Mozambique plans to hold another auction during 2018.
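A quick back-of-envelope reading of the Mozambique figure: if the $150 million reserve equates to a third of annual mobile service revenues, the implied revenue base is roughly

$$\$150\text{M} \times 3 \approx \$450\text{M per year},$$

which puts the scale of the asking price for 50 MHz into perspective.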

The case for cheaper spectrum licenses has been made many times by the GSMA. The argument from operators and the GSMA that more would be spent on infrastructure if license fees were lower is theoretically sound, though sceptics might suggest the penny-pinching telcos are simply looking for another way to spend less. This time, however, the boy who cried wolf might actually have spotted one; it’s hard to argue with some of the figures presented by the GSMA here.

Regionalised monopolies or all-out competition? That is the question

It is always better to be a leader than a chaser, but sometimes you have to play the hand you are dealt. The UK is playing catch-up, but what is the right way forward: regionalised monopolies or a bitter battle for the consumer?

This is a question which was addressed by Minister for Digital and the Creative Industries Margot James at the Connected Britain event this morning, and which is currently being investigated by those running the Future Telecoms Infrastructure Review. The idea is to strike a balance between the two, but that is much easier said than done. Sitting on the fence, appeasing all contributors, is never a healthy situation.

As you can imagine, there are pros and cons on both sides. On the monopolistic side of things, the telco would be confident of capturing customers and therefore generating ROI. The rollout would be notably faster thanks to the guarantee of customers, either through direct-to-consumer services or wholesale, but then there is the risk of another Openreach saga.

Looking at all-out competition, the consumer wins as there would be a heated battle on price and performance to capture revenues, but deployment would be much slower. Telcos would have to spread investment across different regions, while not having the same level of confidence in securing revenues.

There are of course other options; collaborative investment in networks is one which springs to mind, though we are not entirely confident in the telcos’ ability to play nice with each other. The Future Telecoms Infrastructure Review will provide more insight, but the government is staying suitably tight-lipped for now.

“We are a matter of weeks away from delivering the review, I have the unenviable task of telling you the government’s view without telling you the government’s view,” said Raj Kalia, CEO of BDUK, the government department responsible for overseeing the deployment of future-proofed infrastructure.

While very little of substance was said at the conference, pertaining to the review that is, we got the impression the government will be leaning towards the network-sharing model. This would be the telecoms version of herding cats, but it is perhaps one of the most sensible options. Investment risk for the telcos is reduced, while there is still the opportunity to monetize. Depending on how the government intends to ‘stimulate’ deployment, there might still be an element of regionalised monopolies, but with multiple telcos contributing to the rollout, risk is also reduced for the consumer on the other side of the coin.

One comment was directed towards the UK’s fantastic ability to conduct investigations and reviews as a substitute for genuine action, but all eyes will be on the Department for Digital, Culture, Media and Sport, as well as the much-lauded Future Telecoms Infrastructure Review, over the next couple of weeks.

Will the UK ever agree on the internet?

This week Telecoms.com has 16-year-old Shannon O’Connor joining the team for work experience, and today’s ‘thrilling’ task is to join Jamie at the Connected Britain event in London. Here are her thoughts.

With the Connected Britain event bringing together executives from TalkTalk, CityFibre and Openreach, as well as government representatives, the question remains whether they will be able to work collaboratively to make progress.

As the speakers continue to roll out their plans for accelerated investment in high-capacity networking across the UK, there still seems to be a lot of busywork.

“If you could roll out connectivity through reports and investigations, Britain would have faster broadband than Japan and Korea,” said Matthew Howett of Assembly, the chair of this year’s event.

But is there any action?

Minister for Digital and the Creative Industries Margot James highlighted the inequality of connectivity in the rural areas of the UK. As major towns and cities continue to prosper and develop, those living on the outskirts face difficulties in sustaining access to even basic broadband, something which clearly interested attendees as plans begin to emerge for infrastructure collaboration.

However, in the following panel it was clear that collaboration would not only create conflicting ideas between competitors, but also raise the question of whether proper competition can ever exist while rivals work hand in hand.

What emerged from the speakers at the conference was, quite simply, uncertainty. There has been too much discussion and not enough action in developing fibre broadband within the public sector and beyond in the UK. There doesn’t seem to be any consistency or coherence; it seems asking adults to be mature and agree on a logical path is too much (and that’s coming from a teenager – Ed.).

As our European counterparts continue to prosper both economically and industrially, the UK falls further behind because of an inability to agree.

Microsoft casually drops data centre into the North Sea

Microsoft’s Project Natick is entering into its second phase of development, and actually sinking a data centre 117 feet into the North Sea, a couple of miles off the coast of Scotland’s Orkney Islands.

It certainly is a novel idea, and while it does sound like a bit of a PR gambit, there is some sound logic behind it. The unit is small enough to manufacture and deploy pretty quickly; low temperatures at the seabed reduce cooling bills; around half the world’s population lives within 120 miles of the coast; and when connected to renewable energy sources on the surface, the data centre becomes notably cheaper to run.

The data centre itself, designed by Naval Group, a French business with expertise in engineering, manufacturing and maintaining military-grade ships and submarines, is container-sized and loaded with 12 racks containing a total of 864 servers and the associated cooling infrastructure. The systems have been designed as simply as possible to remove the need for human intervention, since once the data centre has been lowered to the seafloor it is impossible for humans to gain entry.

This is another aspect of Project Natick which is both exciting and worrying for the data centre workforce; the asset is designed to be a ‘Lights Out’ operation, essentially meaning automation plays a big role here, though remote control and maintenance are possible. If something breaks, it breaks, but the proof-of-concept vessel operated in the same Lights Out manner for 105 days. This is a good start, but for the idea to be commercially viable, Microsoft will have to prove the data centre can remain operational for five years.
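To get a feel for why five years of zero-touch operation is ambitious, a simple failure model is enough; the annual failure rate below is our assumption for illustration, not a Microsoft figure:

```python
# Expected hardware failures during a lights-out deployment (illustrative).

servers = 864    # servers in the Natick vessel, per Microsoft
afr = 0.02       # assumed 2% annualised failure rate per server
years = 5        # target unattended lifetime

expected_failures = servers * afr * years
print(f"Expected failed servers after {years} years: {expected_failures:.0f}")
# Roughly 86 machines, around 10% of capacity, with no way to swap hardware,
# so the software has to route around dead servers rather than rely on repairs.
```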

“The most joyful moment of the day was when the data centre finally slipped beneath the surface on its slow, carefully scripted journey,” said Project Leader Ben Cutler.


The idea is a good one, but now the team has to prove the theory is economically, environmentally and operationally sound. First and foremost is the sturdiness of the data centre. The test site, the European Marine Energy Centre, is one of the rougher locations in the North Sea, with tidal currents travelling up to nine miles per hour at peak intensity and a sea surface that regularly roils with 10-foot waves, which whip up to more than 60 feet in stormy conditions.

“We know if we can put something in here and it survives, we are good for just about any place we want to go,” said Cutler.

Another test will be making sure the theory of energy self-sufficient data centres is a sound one. Colocation with marine renewable energy is an important step here, and could allow Microsoft to bring services to regions of the world with unreliable electricity, eliminating the need for costly backup generators in case of power grid failures. In this test, a cable from the Orkney Islands grid sends electricity to the data centre, which requires just under a quarter of a megawatt of power when operating at full capacity. The Orkney Islands are in fact 100% powered by renewable energy, a mix of solar, wind and marine sources.
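Taking the ‘just under a quarter of a megawatt’ figure at face value, the per-server draw works out at a fairly ordinary

$$\frac{\sim 240\ \text{kW}}{864\ \text{servers}} \approx 280\ \text{W per server (cooling included)}.$$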


The cooling system is another way in which Microsoft can potentially make the data centre more sustainable and efficient to run, as the water surrounding the submerged asset will be pumped through the system to cool the servers. Add in the consistently low temperature of the water at the seabed and the cooling problem of data centres becomes a lot more manageable.
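As a rough sense-check of that claim, rejecting the full electrical load into seawater needs only a modest flow. Assuming the water is allowed to warm by around 5°C on its way through (our assumption, not Microsoft’s published spec):

$$\dot{m} = \frac{Q}{c_p\,\Delta T} \approx \frac{240\ \text{kW}}{4.0\ \text{kJ/(kg·K)} \times 5\ \text{K}} \approx 12\ \text{kg of seawater per second}.$$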

Aside from proving the data centre can be powered and operated, the team also have to demonstrate the logistical side. Building it in the shape of a shipping container means a variety of sizes are available and it can be easily transported from the factory. That is a positive, as was the successful deployment, which used 10 winches, a crane, a gantry barge and a remotely operated vehicle that accompanied the data centre on its journey. But at the end of the experiment the team need to get the asset back out; recycling it is key to the business model here, and despite it being powered by renewable energy, we suspect Microsoft would have few green fans if it just left the thing at the bottom of the ocean.

The promise of the project is very attractive. With the rapid growth of internet usage and the increasing data-intensity of applications, infrastructure needs to scale quickly. The idea offers speedy and flexible deployment, low latency, environmental friendliness and cost-effectiveness, but there are still a lot of unknowns. Whether Microsoft can operate a data centre with zero human intervention for five years, in a harsh environment, remains to be seen, but it could be onto a winner.

Smart signalling, intelligent kerbs and dynamic sat-nav; the roads of tomorrow

The National Infrastructure Commission has announced the shortlist of finalists for its competition to design roads fit for autonomous vehicles and the digital economy.

The aim of the Roads for the Future competition, launched in partnership with Highways England and Innovate UK, was to encourage the development of ideas to prepare physical infrastructure for autonomous vehicles. Smart traffic lights, flexible use of kerbsides, segregated driverless zones and dynamic sat-navs were among the entries, which brought forward some pretty interesting ideas.

“We can see for ourselves the progress in developing cars for the future, with trials of driverless cars taking place across the country – we now need to make sure the technology on our roads keeps up,” said John Armitt, Chairman of the National Infrastructure Commission. “These five entries clearly stood out and I look forward to seeing how their ideas develop further over the coming months.”

The finalists will now have three months to develop their ideas, before the overall winner is announced in the autumn. So who are the five finalists?

AECOM: An American engineering firm which has come up with the idea of moderating the speeds of cars around junctions to ensure vehicles approach traffic lights just as they are turning green (a simple sketch of the calculation follows this list). It’s an interesting idea which would maintain a consistent flow of traffic, reducing congestion and pollution. The concept will be tested using a simulation model of the A59 in York.

Arup: Using a high street in London, Arup will test out its FlexKerbs idea, which can alter the use of kerbsides dependent on the time of day and requirements. Everyday features such as double yellow lines, parking bays or cycle lanes would no longer have to be permanent, and could be adapted dynamically according to the specific demands in real-time.

City Science: A small firm based in Exeter which will test out the idea of sectioning off existing roads for the exclusive use of driverless vehicles. The process will mitigate risk, as well as creating distinctions between autonomous and manually-driven vehicles, perhaps aiding the normalisation process.

Immense Solutions: A spin-out from the Transport Systems Catapult which will aim to aid the development of satellite navigation systems. The aim here is a simple one: use data from sensors on street furniture and other vehicles to optimise travel routes in real-time. Google Maps sort of does this at the moment, but it is a rudimentary approach based on most-likely conditions rather than real-time data. The team will be working with Oxfordshire County Council, using simulations of four busy local roads.

Leeds City Council: Here the local authority will examine how data generated from digitally connected cars could be used to improve traffic light systems, allowing highway authorities to better manage traffic on their roads and reduce tailbacks.
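As flagged above, here is a minimal sketch of the kind of calculation behind the AECOM entry: given the distance to the next set of lights and its signal timing, pick a speed within legal limits that lands the vehicle in a green window. The function and its parameters are illustrative assumptions, not AECOM’s actual system.

```python
# Minimal green-light speed advisory sketch (illustrative only).

def advisory_speed_mps(distance_m, cycle_s, green_start_s, green_end_s,
                       v_min=5.0, v_max=13.4):  # roughly 18-48 km/h
    """Return a speed that puts the car at the stop line during a green phase,
    or None if no legal speed works within the next few signal cycles."""
    for n in range(4):  # consider the next four green windows
        window_start = green_start_s + n * cycle_s
        window_end = green_end_s + n * cycle_s
        fastest = distance_m / max(window_start, 1e-6)  # arrive as green starts
        slowest = distance_m / window_end               # arrive as green ends
        lo, hi = max(slowest, v_min), min(fastest, v_max)
        if lo <= hi:
            return (lo + hi) / 2  # aim for the middle of the feasible band
    return None  # no feasible speed: stop and wait instead

# Example: lights 300 m ahead, 60 s cycle, green from 20 s to 40 s into the cycle.
print(advisory_speed_mps(300, 60, 20, 40))  # ~10.5 m/s, about 38 km/h
```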

The focus on autonomous vehicles so far has centred on a small aspect of the technology, but as with the deployment of fibre, the physical elements will be one of the biggest challenges. As it stands, physical infrastructure such as roads and street furniture is not up to scratch when it comes to managing autonomous vehicles.

“With 81 entries received, our Roads for the Future competition has demonstrated the keen interest there is across industry to be at the forefront of the technologies supporting the introduction of driverless cars,” said Bridget Rosewell, Chair of the Judging Panel for the Roads for the Future competition.

“We wanted to see how the rules of the road, road design and traffic management could all be adapted to accommodate these new vehicles – and these five entries particularly demonstrated the exciting potential there is to make the best use of those we already have.”

American Tower expands in Africa with acquisition of 723 towers

Telkom Kenya has announced it has reached a definitive agreement to sell up to 723 towers to American Tower, expanding the latter’s footprint to a fifth country in Africa.

The transaction, which will be completed in the second half of 2018, will give the telco a bit of breathing room and cash to invest in its 4G network. Telkom Kenya has been performing adequately to date, though it has struggled to get anywhere near market leader Safaricom. It is hoped the funds will give the telco a boost to become more competitive.

“We are excited to announce the launch of operations in Kenya through our agreement to acquire TKL’s towers,” said William Hess, American Tower’s President of EMEA and Latin America. “This represents American Tower’s 17th market globally, and our fifth in Africa, and we look forward to helping expand the reach of mobile broadband throughout the country. Kenya is a very attractive market, and we have high expectations for its long-term growth potential.”

“Telkom will now focus on its core function – the provision of quality telecommunications services to our customers,” said Aldo Mareuse, CEO of Telkom Kenya. “In addition, the sale will release capital for further investment in our 4G network and a number of state of the art IT platforms, all of which will further enhance services for our customers as they demand higher quality and speed from our mobile data networks as well as a richer range of services.”

American Tower is one company which has been on the acquisition trail recently, seemingly capitalizing on strong financial performance. Aside from this deal, American Tower has also acquired the tower business units of both Idea and Vodafone in India, as the pair gear up to tackle the disruption caused by Reliance Jio in that market. Preying on struggling telcos’ assets seems to be a successful strategy here, as it spent $673 million to acquire nearly 10,600 sites over the first three months of 2018.

Looking at the financial side of the business, American Tower recently reported its first-quarter figures, showing a 7.8% year-on-year increase in revenues to $1.742 billion, though it has lowered its forecasts for the year, citing the troublesome Indian market and other factors.

The team will also be keeping an eye on developments at Sprint and T-Mobile, with the pair accounting for 4% and 3% respectively of American Tower’s consolidated property revenues. While there are still three to four years left on non-cancellable lease agreements, and the team anticipates aggressive spending as the merged business tries to catch up to the top tier, the consolidation of two major customers generally doesn’t bode well for the supplier.

Whether there are any more acquisitions left in American Tower remains to be seen, though the team has stated it is continuing to review and act on expansion opportunities in international markets. It does not seem to be shy about living by the ‘speculate to accumulate’ mantra.

Community Fibre gets £18 million from public/private UK infrastructure fund

A London-based full-fibre ISP has secured £25 million in new financing led by the National Digital Infrastructure Fund (NDIF).

The NDIF is a fund initiated by the UK government as part of its almost identically-named Digital Infrastructure Investment Fund, which was announced last year as a £400 million pot designed to kick-start full-fibre rollout and expected to be more than doubled by private sector investment. So this is actual investment, as opposed to a state handout, which is good.

It would seem to be a reasonably sound investment too, since Community Fibre intends to use the fresh cash to connect 100,000 homes in private and social London housing estates. The company’s longer-term objective is to connect 500,000 London homes with full fibre by 2022.
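A back-of-envelope figure, assuming the whole £25 million raise goes towards that 100,000-home build (the announcement does not break this down):

$$\frac{£25{,}000{,}000}{100{,}000\ \text{homes}} = £250\ \text{per home},$$

which looks cheap by typical full-fibre build standards and presumably reflects the density of London’s housing estates.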

As the headline says, the NDIF is fronting up £18 million of the £25 million total in this round, with the rest coming from current Community Fibre investor Railpen. The NDIF is run by infrastructure specialist Amber, which was identified as the main private sector partner when the fund was announced.

“This investment shows full-fibre is the only way forward for ultra-fast connectivity,” said Jeremy Chelot, Chief Exec of Community Fibre. “Community Fibre is the partner of choice to deliver high-quality, fast pure fibre connection, to deliver sustainable investment and to support Government policy across the UK. This funding takes us a step closer to have our full-fibre network available to social housing or private landlords in every borough in London.”

“We want to see full fibre broadband rolled out across the UK as quickly as possible and to support a competitive private sector in delivering that objective,” said Robert Jenrick, Exchequer Secretary to the Treasury. “That’s why we created the Digital Infrastructure Investment Fund, which is boosting the rollout of full fibre to enable people to live and work flexibly and productively, without connections failing.”

“As long-term, responsible partners to the public sector, our aim is to mobilise capital effectively to scale up the digital infrastructure that is so critical to the UK’s economic future,” said Khalid Naqib, Senior Investment Director at Amber. “NDIF’s investment will help deliver gigabit connectivity to London’s local communities and secure appropriate returns for our investors, one of which is the UK taxpayer.”

£25 million isn’t that much in the great scheme of things, but 100,000 more fibre connections are still a lot better than nothing. It looks like this is how fibre rollout will progress, at least in the UK: small, piecemeal investments over time eventually resulting in most of the country being connected to full fibre. The most intriguing part is how much of this is being done without the involvement of BT or even Virgin Media.

ZTE fears for its very survival following US export ban

Following the decision from the US government to activate the Suspended Denial Order, ZTE has hit back with the threat of a lawsuit, claiming the order not only threatens its own survival but that of its suppliers.

While the order might have had the objective of knee-capping a specific Chinese company, the fallout has also sent shockwaves through the US technology scene. Companies like Acacia Communications, Oclaro and Lumentum Holdings, all US businesses reliant on ZTE as a major customer, have seen their share prices plunge. The US government might have hit the bullseye when it comes to tackling ZTE, but the friendly fire has been spraying all over the country.

“The Denial Order will not only severely impact the survival and development of ZTE, but will also cause damages to all partners of ZTE including a large number of US companies,” ZTE said in a statement. “In any case, ZTE will not give up its efforts to resolve the issue through communication, and we are also determined, if necessary, to take judicial measures to protect the legal rights and interests of our Company, our employees and our shareholders, and to fulfil obligations and take responsibilities to our global customers, end-users, partners and suppliers.”

It is difficult to get a handle on the damage which has been done to ZTE, primarily because there are so many moving parts and a huge number of possible scenarios. The loss of customers in the US is only the tip of the iceberg; ZTE is hugely reliant on US technology and intellectual property. Some estimates say 80-90% of ZTE technology relies on some form of US input, while Qualcomm supplies around 70% of the chips used in its smartphones.

This is devastating for ZTE. Who knows how long recrafting the supply chain to remove all US components would take; it is a task which has probably never been undertaken before.

That said, the lobbyists in Washington must be hammering on the front door of the White House. Anti-China sentiment has been a long-standing tradition of US governments, but this order takes the stakes up who knows how many levels. In trying to cripple a Chinese beast, the White House has possibly consigned hundreds, if not thousands, of workers within its own borders to the dole queue. Acacia Communications, Oclaro and Lumentum Holdings are the three companies hit hardest, but there will of course be dozens of firms which are less reliant on ZTE yet for whom the order will still have a material impact. ZTE is, after all, one of the world’s largest telecommunications vendors.

Legal action will potentially follow, though ZTE might be able to negotiate its way out. Judicial action does not necessarily mean a lawsuit; there will be steps to take before that point is reached, including an appeal. Before too long, expect the Chinese government to wade into the mediation mess, while ZTE will be calling on its US suppliers for backup as well. That said, don’t expect the US to have a sympathetic ear. The US government is going to try to make an example of ZTE as it flexes its muscles over China.