How smart city investments will reach $135 billion in 2021

Around the globe, metropolitan and suburban municipalities continue to explore digital transformation projects. Emerging technology spending for smart city applications is forecast to reach $80 billion in 2018, according to the latest worldwide market study by International Data Corporation (IDC).

IDC has provided a detailed look at the technology investments associated with a range of smart city priorities and use cases. As these initiatives gain traction, IDC expects spending to accelerate over the 2016-2021 forecast period, growing to $135 billion in 2021.

Smart city market development

"Smart cities have recently evolved from a collection of discrete flagship projects to a sizeable market opportunity that will drive significant technology investments in 2018 and beyond," said Serena Da Rold, program manager at IDC.

IDC believes that strategic priorities will drive digital transformation across cities of all sizes, but the firm's research demonstrates there can be significant differences in the focus of investments across regions.

Smart cities pursue digital transformation of an urban ecosystem to meet environmental, financial, and social outcomes. A smart city emerges when multiple initiatives are coordinated to leverage investments, use common IT platforms to decrease service times or costs, and share data across systems.

Smart city programs, enabled by emerging technologies, are accelerating across city ecosystems to deliver innovative solutions. The strategic priorities that will see the most investment in 2018 are intelligent transportation, data-driven public safety, and resilient energy.

Intelligent traffic and transit and fixed visual surveillance are the two most significant use cases in terms of worldwide spending, followed by smart outdoor lighting and environmental monitoring.

While these use cases attract considerable investments in most geographies, the focus shifts across different regions. Intelligent traffic and transit will be the top priority in investment terms in the United States, Japan, and Western Europe.

Fixed visual surveillance will be the leading use case in China and the second largest in the United States, while environmental monitoring will be relatively more important in Japan.

Outlook for smart city regional growth

On a geographic basis, the United States will be the largest market for smart city technologies with spending forecast to reach $22 billion in 2018. China will be a close second with 2018 spending expected to be nearly $21 billion.

The two countries will share a similar growth trajectory with five-year compound annual growth rates (CAGRs) of 19 percent and 19.3 percent, respectively. The regions that will see the fastest spending growth are Latin America (28.7 percent CAGR) and Canada (22.5 percent CAGR).
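Figures like these can be sanity-checked with the standard compound annual growth rate formula. As a quick illustration (our own, not from the IDC study), the worldwide numbers quoted above imply roughly 19 percent annual growth between 2018 and 2021:

```python
# Sanity-check the implied compound annual growth rate (CAGR):
# CAGR = (end_value / start_value) ** (1 / years) - 1

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over `years` growth periods."""
    return (end / start) ** (1 / years) - 1

# $80bn in 2018 -> $135bn in 2021 is three growth periods.
implied = cagr(80, 135, 3)
print(f"Implied worldwide CAGR 2018-2021: {implied:.1%}")  # roughly 19%
```

This lines up neatly with the 19 percent and 19.3 percent five-year CAGRs IDC quotes for the US and China.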

Is Silicon Valley’s culture of creativity slowly dying?

Silicon Valley is currently known as the centre of the world for innovation, but perhaps the philosophy of acquisition is starting to kill off the creative ambitions of this generation’s dreamers.

This idea stems from the endless quest of the technology industry’s giants to find the next big thing. Companies like Google, Amazon, IBM and Intel are constantly on the search to purchase new businesses, people and IP to fuel growth and capture new revenues, but is this acquisition trail limiting the big ideas of the blue-sky thinkers?

It is important to note that there is no practical way to prove this theory, but it might prove to be an interesting talking point. The basic premise dates back to 1971, when AT&T was considering the purchase of ARPANet, the predecessor to the internet. What seems incredible now is that AT&T turned down the opportunity to buy ARPANet from the US Government and essentially develop a monopoly on the internet.

Who knows what would have happened if AT&T had completed the transaction, but it would be fair to assume the internet would not have grown into the open ecosystem it is today. There wouldn’t have been an open revolution which drove the likes of Amazon, Netflix or Google to be masters of the world, as acquired technologies are usually put to task serving the corporate need of the parent company.

The beauty of the internet is that there are no walls and no limitations, but if its development had been controlled by a private organization, with two eyes on profitability, would the internet have developed into the beast it is today? Impossible to answer this question, but we don’t think so.

Perhaps this quirky little story can be likened to some of the situations which we are seeing today. Let’s start with Google and its acquisition of Deepmind.

Deepmind has been and, for some, continues to be the centre of excellence for artificial intelligence. There have been significant breakthroughs since Google acquired Deepmind – winning a game of Go, which requires intuition, is one – but how much could be achieved without the greater corporate ambition as an objective? Google is, after all, a corporate machine; everything it does is geared towards making money.

There is now a purpose to what Deepmind is doing. The breakthroughs, which are of course very impressive, feed into other Google products. Google made a bet on AI and it is paying off by making its products more relevant, sophisticated and intelligent. Deepmind is feeding into the Google machine.

But, if these incredibly intelligent innovators did not have a corporate goal, what would they do? Maybe their financial ambitions would take them down the same route or maybe the academic curiosity would take over and they would start playing around with new ideas that have no immediate financial gain or corporate objective. Perhaps they would investigate an idea just because they want to and it could lead to the greatest breakthrough of this generation.

Corporate aims laser-focus the attention of brilliant people, who can get distracted easily, to speed up progress. Sometimes this focus is needed, but sometimes those distractions are too. That’s the beauty of the unknown: you don’t know what’s out there, and it could be exactly what the world needs.

Another interesting example to think about is Netflix. This is a company which has faced numerous acquisition rumours and stayed the steady course of independence. Blockbuster started the rumour trail, Verizon walked it for a while, Disney is a long time traveller and Apple is a newcomer. If the organization had been acquired at some point, would it have found itself in the same position as it is today?

We are not financial experts and would have no idea how much debt Netflix is harbouring, but it is certainly not shy about spending on content in pursuit of creating arguably the world’s finest entertainment platform. Had the company been purchased, would executives have been given the freedom to sign checks in the way they have done and continue to do? We very much doubt it.

So this is the conundrum we are facing today. The giants of the technology world are acquiring companies in their own pursuit of profitability and relevance, but it might just be holding back innovation. Acquisition should, in theory, offer a greater R&D budget for these thinkers to play around with, but this financial contribution comes with caveats and corporate responsibility. They might be able to achieve certain things in a quicker timeframe, but that might come at the expense of new ideas.

This acquisition trail is never going to stop. No-one wants to be remembered as the next John Antioco, the CEO of Blockbuster who passed on the opportunity to buy Netflix. Corporate funding might save some ideas, but is it a good substitute for exploring the unknown at a slower pace and finding the next big thing? We’re not too sure.

It’s an interesting thought to consider, and while it does sound very doom and gloom, there is at least one saving grace. You can’t miss what you’ve never had.

Ericsson continues to bang the IoT drum

Ericsson looked to capture the headlines over MWC with a big focus on IoT, and the trends have continued with two new announcements in Saudi Arabia and Greece.

Starting in the Middle East, Ericsson has bagged a new customer in STC as part of a nationwide expansion of its 4G network in Saudi Arabia, including the deployment of LTE Advanced and Narrowband IoT (NB-IoT) in the radio access network (RAN).

“At STC, our main goal is to ensure that the country’s Saudi Vision 2030 ambitions are met,” said Nasser Al Nasser, Group CEO at STC. “We do that by always making sure we offer our subscribers the latest innovative technologies. LTE Advanced and NB-IoT are exactly what we need to pave the way to 5G.”

On the LTE-A side of things, the pair will deploy new 5G-ready hardware into the network to make use of STC’s newly acquired spectrum in the 700 MHz band. The claim here is that the project will increase STC’s 4G network throughput by up to 50% for smartphones, though that crafty ‘up to’ metric is still there. It seems advertisers aren’t the only ones making use of the grey areas. Over on the IoT side, STC will be deploying NB-IoT in the RAN across its expanded network to support smart city ambitions.

NB-IoT is also the focus of the announcement with Greek telco Cosmote. Here the pair have completed the deployment of the first cellular NB-IoT clusters in Cosmote’s network, across eleven cities in Greece. The aim is to support and develop massive machine-type communications (mMTC) use cases.

“Cosmote, fully recognizing the potential of massive IoT technology, is the first in Greece and one of the firsts in Europe, to trial NB-IoT and evolve its network,” said George Tsonis, OTE Group Executive Director of Network Planning & Development. “We’re poised to evolve beyond merely providing mobile broadband connectivity, to play a leading role in the rapidly developing IoT market and create through technology and innovation a better world for all.”

The IoT stance does seem to be working for Ericsson, so why change it? We noticed the Ericsson stand was certainly very busy across MWC, perhaps even busier than its main competitors’, possibly owing to this IoT messaging. That said, it might be down to the stringent access controls at the Huawei Village and the free lunch the Swedes were offering. Anything to avoid the dreaded and dreary beigeness of the MWC cafes.

Mobile network experience in Scotland – in order to get it right we need to understand what is wrong

Third parties are periodically invited to share their views on the industry’s most pressing issues. In this piece Brendan Gill, CEO of OpenSignal, looks at the complexities of accurately measuring mobile signals.

Scotland has been struggling to keep up with the rest of the UK in terms of digital connectivity for many years. Official figures from Ofcom show that a mere 17% of Scotland has 4G mobile coverage, compared to 60% in England. And although there have been several initiatives launched by the Scottish Government to address this – such as The Mobile Action Plan, or the R100 programme pledging to deliver superfast broadband access to 100% of premises in Scotland by 2021 – the gap remains visible in key connectivity metrics.

In most cases the debate has centred on how Scotland’s consumers are being left behind. But erratic connectivity, especially in mobile services, is hurting businesses as well.

It’s being measured wrong

“Fixing” Scotland’s connectivity issue can only start once there is a clear understanding of what the problem is. Taking it a step further, if problems are not identified and quantified correctly, they serve merely as a distraction, delaying the process of finding a solution. In the wireless world, the distractors are the measurements used to quantify 4G connectivity.

The way that coverage is currently measured has a lot of limitations. Talking about 99% population coverage paints a very rosy picture of a rather grim tale.  It’s vital that the right metrics are in place to truly assess – and address – the problem. Only then can the discussions begin on how it can be fixed.

So how does the industry get it right? One approach, from the likes of OpenSignal, is to look at time: specifically, the percentage of time users are able to connect to an LTE signal. This draws a much more realistic picture of the everyday mobile network experience by collecting and analysing data from smartphones wherever their owners are: indoors or out, in the city or the countryside, day or night. The metric, called 4G availability, shows Scotland (as well as the UK in general) still has a long way to go before reaching that coveted 99%.

By relying on real-world data, the availability approach eliminates the pitfalls that other coverage metrics so easily fall into: such as disregarding population density (i.e. geographic coverage) or failing to measure coverage indoors or at any location other than your home (i.e. population coverage).
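To illustrate how such a metric might be computed (the data model and field names here are our own invention for the example, not OpenSignal’s actual methodology), availability reduces to the fraction of periodic on-device samples that report an LTE connection:

```python
# Sketch of a "4G availability" metric: the share of time users spend
# connected to LTE, estimated from periodic on-device samples.
# Field names and sample data are illustrative only.

from dataclasses import dataclass

@dataclass
class Sample:
    device_id: str
    network_type: str  # e.g. "LTE", "HSPA", "EDGE", "NO_SIGNAL"

def lte_availability(samples: list[Sample]) -> float:
    """Fraction of samples in which the device saw an LTE signal."""
    if not samples:
        return 0.0
    on_lte = sum(1 for s in samples if s.network_type == "LTE")
    return on_lte / len(samples)

samples = [
    Sample("a", "LTE"), Sample("a", "LTE"), Sample("a", "HSPA"),
    Sample("b", "LTE"), Sample("b", "NO_SIGNAL"),
]
print(f"4G availability: {lte_availability(samples):.0%}")  # 3 of 5 samples -> 60%
```

Because the samples follow the device wherever it goes, indoor and rural experience is weighted in automatically, which is exactly what doorstep-style coverage measures miss.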

Indoor coverage needs to be part of the equation

In fact, the majority of service fluctuations and black spots occur inside homes and offices. And many operators currently lack the means to track, let alone address the issue.

Population coverage is often talked about, and typically asks: is there a signal at your doorstep? That would be great if all mobile devices were only used on doorsteps. But the reality is mobile users go inside, and there is a huge differential between the signal received outside and inside homes.

Searching for a signal at home, in the supermarket or in the Palace of Westminster itself should not be a source of frustration.  But as we look at the coverage map below (built based on real-world user data), it’s clear that Parliament’s indoor coverage is close to non-existent, while the surrounding areas all indicate strong signals.

[Coverage map: Westminster and surrounding area]

Two days less connectivity

The reality is that national 4G availability statistics are troubling – UK-wide we are talking about between 58% and 78%, depending on the provider. There is a big difference between this and the 99% population coverage figure that so many in the industry like to use.

As for Scotland, we were seeing around 7% less time connected to 4G compared with the UK nationwide average. It might not seem like a lot, yet 7% of the time is potentially two days per month when a mobile user in Scotland is not connected to 4G while the average UK user is.
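The arithmetic behind that "two days" claim is straightforward: 7% of a 30-day month is about 2.1 days:

```python
# 7% less time on 4G, expressed in days per 30-day month.
gap = 0.07            # Scotland's availability shortfall vs the UK average
days_per_month = 30
print(f"{gap * days_per_month:.1f} days")  # about 2.1 days
```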

There is clearly a long list of unique challenges hindering 4G connectivity in Scotland, ranging from population spread to difficult terrain; but before the industry can start working on solving the issue, it first needs to make sure that the metrics are right.


Brendan Gill is the CEO of OpenSignal, a company he co-founded in 2010. He has spent over 10 years providing solutions to help people understand and improve mobile service and experience. Prior to OpenSignal, Brendan was part of the team that launched RepeaterStore in 2007, which provides signal boosting solutions to improve wireless cell and data reception in buildings, homes and vehicles. Brendan is listed in the Global Telecoms Business Power100 2017 as one of the most powerful names behind the telecoms sector. He is an accomplished speaker and has presented at leading industry events including: CTIA, Mobile World Congress, TechCrunch Mobile and the Qualcomm CEO Summit.

Passionate about empowering and supporting entrepreneurship, Brendan founded BetaFoundry, an accelerator programme offering mentors and advisors for students to encourage them to choose an alternative to the standard career path. In addition, he is one of the TechStars London mentors, and has taken the FoundersPledge to encourage tech entrepreneurs to donate to charitable causes. Brendan holds a degree in Physics and Philosophy from the University of Oxford.

Pressure mounts on European spectrum allocation

Now that 5G is within reach the urgency to make enough spectrum available is increasing significantly, especially in Europe.

Everyone seems to be doing a pretty decent job of R&D, collaboration, testing and so on, but you can have all the base stations, virtualized cores and whizzy devices you want and they won’t be much use without the spectrum to carry the signals. For 5G to work we need a lot more spectrum than we currently have, and that requires a lot of effort across the board to pull off.

Here in the UK, Ofcom has confirmed the announcement it made last week concerning the six companies that will participate in the imminent auction of 40 MHz of 2.3 GHz spectrum and 150 MHz of 3.4 GHz spectrum. The latter will be the first tranche of spectrum specifically set aside for 5G.

The usual suspects are all there: EE, Vodafone, Telefónica and Hutchison. The two left-field ones are Connexin, which has an interest in fixed wireless access, and Airspan Spectrum Holdings, which is a subsidiary of small cell specialist Airspan. EE won’t be allowed to bid for the 2.3 GHz and will only be able to win a maximum of 85 MHz of the 3.4 GHz, but it’s unlikely to get close to that anyway.

Meanwhile ETNO, one of those organisations that represents European operators, is continuing to rattle the cage of the many pan-European bureaucracies to urge them to free up more spectrum more quickly.

“5G is too important for Europe to accept a compromise falling short of the original ambition,” said Lise Fuhr, ETNO Director General. “Future licences need to deliver increased certainty with respect to the status quo, and a truly effective peer review system is essential to ensure the credibility of spectrum policy.”

The specific gripe seems to be the Electronic Communications Code negotiations, which ETNO has been moaning about for a while and which are apparently still lacking urgency. ‘We ask legislators to ensure that the final text delivers far more ambition, more certainty, less complexity and a credible governance system,’ pleads the ETNO announcement.

Lack of transparency could cause problems for digital economy – Here Technologies

New research from mapping and location platform provider Here Technologies claims there is a lack of trust between consumers and technology companies when it comes to sharing and handling of personal information.

The focus of the research was location data, as you might expect from Here, but the results were perhaps more pessimistic than some would have expected. Only 20% feel they have full control over their personal location data, while 76% of the respondents are left feeling stressed or vulnerable about sharing their location data.

“People share location data with app providers because of the many benefits, whether it’s food delivery, hailing a ride, or getting the most out of social media,” said Peter Kürpick, Chief Platform Officer at Here Technologies.

“But, for many, it can be a trade with which they’re uneasy. While the lack of trust is problematic today, we believe that there could be greater challenges down the road if privacy practices continue to be dominated by a click-to-consent approach.”

Transparency is a key term here, as there is very little of it in the technology industry nowadays. A couple of weeks back we commented that a lack of education regarding the use of AI will eventually become a massive concern, and this research supports our view that technology companies are ignoring their responsibility to educate the consumer as the digital economy matures.

Many app developers and internet companies seemingly feel they need to hide what data they are collecting from the consumer for fear of retaliation or rejection. This is largely down to the fact that the consumer has not been taken on the digital journey with the industry. Most consumers don’t understand the trade-off between free services and the release of personal information, and now it seems the point of no return has been passed. You can’t pull back the curtain because the digital machines powering the internet revolution have gotten so big and scary.

A good example of this is opening up one of the gaming apps which you have downloaded on your phone; you might be quite surprised how much information you are actually giving away. No one reads the terms of service nowadays and this is a massive problem.

New regulation in Europe, GDPR, will give some control back to the consumer but the consumer will have to be more proactive in managing his/her digital footprint. The big problem here is knowing where your data actually is. How many apps have you downloaded over the years and then deleted? Do you remember the names of the developers? It is likely you have given personal information to these people, but having the right to request deletion is irrelevant if you don’t know where or who to ask.

The research from Here claims consumers would be more open to giving away personal information if there was more transparency and control over how location data is collected and used. People don’t like giving things away blindly, but are usually quite reasonable when the why is explained. Everyone, or at least we hope the vast majority, knows there is no such thing as a free lunch; there is always a trade-off, but the technology industry has to trust that the consumer is mature enough to accept it.

There are of course examples where the consumer would be happy to share location information, with autonomous driving platforms for example or drones which search for missing people, but these are only a small part of the digital economy. The rest of the industry needs to be more honest with the consumer or the whole idea will come crashing down.

Right now the relationship between the technology industry and the consumer is broken. If the information age is going to flourish more education is needed and the industry needs to trust the consumer to make decisions, not enforce itself upon individuals with a consent or leave policy.

How alternative online payment technology continues to gain momentum

Across the globe, eRetail is now the mainstream -- with well over half of all adults making purchases for products and services online, increasingly from their mobile devices.

Meanwhile, traditional bricks and mortar retailers have had to incorporate digital channels into their offerings. Unfortunately for the retailer, there are numerous steps during the online shopping experience where the shopper may abandon their journey to payment.

Online payments market development

The value of spending on remote payments for digital and physical goods will surpass $3.3 trillion this year -- that's up 10 percent on the 2017 total of $3 trillion, according to the latest worldwide market study by Juniper Research.

The new research findings uncovered that alternative payment mechanisms would comprise an ever increasing proportion of online spending.

PayPal already accounts for 20 percent of mobile and online physical goods transactions made outside China, while the success of Alipay and Weixin Pay within China means that these two players combined now account for 45 percent of global payment volumes.

It also claimed that there was a significant opportunity for more nascent options, such as the various OEM-Pay solutions and carrier billing, recently adopted by Amazon in Japan for physical goods purchases.

The market study findings also highlighted the major pain points for merchants and their customers. It argued that European merchants needed to be aware of the implications of PSD2 on card-on-file, meaning they would need to be ‘white-listed’ by consumers for payment details to be stored.

Indeed, it claimed Secure Customer Authentication (SCA) obligations could potentially adversely impact on conversion rates by increasing friction at checkout.
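A rough sketch of the PSD2 dynamic described above can make the friction point concrete. This is our own simplification for illustration, not a compliant implementation (the real SCA rules include several further exemptions): a stored "card-on-file" payment only skips the authentication challenge when the consumer has white-listed the merchant.

```python
# Illustrative PSD2 checkout decision (simplified): stored payment
# details avoid a Strong Customer Authentication (SCA) challenge only
# if the consumer has white-listed the merchant with their bank.
# Real SCA rules have more exemptions; this is a sketch only.

def requires_sca(merchant_id: str, whitelisted_merchants: set[str],
                 card_on_file: bool) -> bool:
    """Return True if the checkout must add an SCA challenge step."""
    if card_on_file and merchant_id in whitelisted_merchants:
        return False  # trusted-merchant exemption applies
    return True       # extra friction at checkout

whitelist = {"merchant-123"}
print(requires_sca("merchant-123", whitelist, card_on_file=True))  # False
print(requires_sca("merchant-456", whitelist, card_on_file=True))  # True
```

Every `True` here is an extra step at checkout, which is exactly the conversion-rate friction the study warns merchants about.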

Meanwhile, the research noted that retailers were struggling to resolve issues around customer identification within the broader commerce framework.

Outlook for online payment applications

"Payment processors and other key stakeholders need to work closely with merchants to ensure they can recognize individual consumers, regardless of device and whether they are purchasing online or offline, to deliver the optimal experience across the retail lifecycle," said Dr Windsor Holden, head of forecasting & consultancy at Juniper Research.

Furthermore, the research stressed that not only should merchants localize supported payment mechanisms, but also the entire payment flow. Therefore Juniper analysts recommended they work with payment processors to test optimal flows for target markets.

CityFibre hits out at UK fibre advertising rules

CityFibre has rubbished research from the Advertising Standards Authority (ASA) which implied consumers aren’t that bothered by fibre, and claimed other providers are directly misleading potential customers.

While such posturing should always be taken with a pinch of salt, as any research which doesn’t say CityFibre is the most important company on the planet will always be rubbished by CityFibre, the loud Northerner does have a point. Just because the consumers aren’t that bothered about whether a service is fibre-based or not doesn’t mean the ASA shouldn’t create stringent rules around it.

CityFibre has now filed for a judicial review of the ASA’s continued use of the term ‘fibre’ to describe some services which are only part-fibre. The firm argues the research and logic leading to the decision was fundamentally flawed, and that only full-fibre offerings should be able to use the term.

“You could hardly expect an automotive manufacturer to get away with advertising an ‘electric car’ when the most electric part of the car was its windows. The time has come to do away with ‘fake fibre’,” said CityFibre CEO Greg Mesch.

“The ASA’s short-sighted decision to allow yesterday’s copper-based infrastructure to masquerade as the future-proof full fibre networks of tomorrow is a clear failure in its duty. It has failed to ensure honest and truthful broadband advertising, it has failed to enable consumers to make informed choices and it has failed to support a national infrastructure project critical to our success in a digital age.”

CityFibre has earned itself a reputation for moaning very vocally and being incredibly combative when it comes to press announcements or any rules from the ASA and Ofcom which don’t give it an advantage as a challenger brand, so you have to be careful here.

The firm is referring to rule changes made by the ASA back in November 2017 which looked at advertised broadband speed claims. In the same consultation, the ASA said consumers weren’t that bothered about whether the service was fibre or copper, just as long as it was quick enough, therefore there wouldn’t be any wholesale changes or challenges made to the language used in advertising. This is what CityFibre has taken issue with.

Just to be clear, broadband providers cannot claim their service is fibre if it is not. They also cannot claim it is full-fibre unless it is. Part-copper services cannot be claimed to be best in the market. And advertisers have to claim appropriate speeds for the technology being used in delivering the service. Taking CityFibre’s comments alone you would assume it was the wild west, but despite being blown out of proportion, it does have a point.
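Those rules can be read as a short checklist. The sketch below is our own encoding of the summary above, not ASA wording, and flags which claims a part-fibre service could make:

```python
# Our own encoding of the ASA rules summarised above (illustrative only):
# a part-copper service may currently be advertised as "fibre" (the ASA
# position CityFibre objects to), but not as "full-fibre", and cannot be
# claimed to be best in the market.

def claim_allowed(full_fibre: bool, claim: str) -> bool:
    rules = {
        "fibre": True,                 # permitted even for part-fibre services
        "full-fibre": full_fibre,      # only genuinely end-to-end fibre
        "best in market": full_fibre,  # part-copper services cannot claim this
    }
    return rules.get(claim, False)

print(claim_allowed(full_fibre=False, claim="fibre"))       # True
print(claim_allowed(full_fibre=False, claim="full-fibre"))  # False
```

CityFibre’s judicial review is effectively an attempt to flip that first rule to `full_fibre` only.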

Rules should be tightened up in this area of advertising: consumers may not care about fibre now, but they will in the future, when fibre connections will be critical to meeting the demands of the digital economy. By that point, misconceptions and half-truths might be ingrained in the mind of the consumer because the rules are lax now. This is not to say the rules as they stand are inaccurate, but there are too many grey areas for the ‘creative’ marketers to exploit. We generally don’t trust those in advertising, and they have done little to prove this mistrust inappropriate, especially in the telco space.

Perhaps advertisers should be held accountable to state explicitly what type of network is available. Maybe they should have to describe the service as one of two offerings: part-fibre or full-fibre. No other options; just to the point and informative. This would be fair to the consumer, but since when has that been a concern for the advertisers?

The BariMatera5G project seeks to exemplify the potential of 5G

A collaboration between TIM, Huawei and Fastweb to create one of the first 5G antennas is designed to show why it’s worth the effort.

The BariMatera5G project is a high-profile piece of 5G virtue-signaling by these three tech players. It has left the lab and officially hit the airwaves today with some kind of symbolic switch having been flicked, no doubt. As a result the Italian cities of Bari and Matera will be among the first in Europe to live the 5G dream, or at least be ready for it once devices turn up.

The precise aim of the project is to use the 3.7-3.8 GHz band to achieve 75% coverage of the two cities’ testing area by 2018. The testing has already hit 3 Gbps in the field, we’re told, but it’s about a lot more than just enhanced mobile broadband, which is just as well as merely a step up in speeds-and-feeds is unlikely to be enough to make 5G a success.

The slide below, from the latest presentation about the project, is a good summary of the various moving parts that full-fat 5G will consist of. The aforementioned tests also achieved 2 ms latency, which corresponds with the low-latency network slice (uRLLC), and the third cardinal slice is massive machine-type communication (mMTC), which is geek-speak for IoT and will be represented at a technological level primarily by NB-IoT.
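For reference, the three cardinal slices and the trial figures quoted in this piece can be summarised as follows (our own shorthand, not the project’s slide):

```python
# The three 5G service categories discussed above, with the figures
# reported for the BariMatera5G trials where available.
slices = {
    "eMBB":  {"name": "enhanced mobile broadband",          "trial_result": "3 Gbps in the field"},
    "uRLLC": {"name": "ultra-reliable low-latency comms",   "trial_result": "2 ms latency"},
    "mMTC":  {"name": "massive machine-type communication", "trial_result": "served via NB-IoT"},
}
for abbr, info in slices.items():
    print(f"{abbr}: {info['name']} -> {info['trial_result']}")
```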

This project seems to have established itself sufficiently that it is reasonable to expect it to serve as an exemplar of early 5G and what it promises for a while yet. It’s also good to see at least some parts of Europe having a good go at keeping up with China and the US in the 5G race.

[Slide: BariMatera5G network overview]

France leads Europe’s tax charge against Silicon Valley

The European Commission is on the verge of kicking off a tax raid on Silicon Valley, unveiling a directive within weeks which would set the tax rate on tech companies between 2% and 6% of revenues.

French Economy Minister Bruno Le Maire told Journal du Dimanche the rumoured tax reforms are just around the corner, with the directive focusing on revenues derived from specific countries as opposed to profits. Taxing the likes of Facebook or Amazon has always been a complicated job but a draft document released a few weeks back looked to set the tone.

“A European directive will be unveiled in the coming weeks,” said Le Maire. “And it will mark a considerable step forward. The range is 2% and 6%, we (France) will be closer to two than six.”

In the 12-page draft document (initially uncovered by Politico) the Commission (hereafter known as the Gaggle of Red-tapers) sought to create interim and long-term rules which will stem the flow of money leaving the bloc. Of course this is all about tax and trying to figure out how the technology giants can contribute a bit more to the economies from which they are so handsomely profiting.

“The starting point is the internationally accepted premise that taxation should take place where value is created,” the document reads. “Currently there is a mismatch between where taxation of the profit takes place and where value is created for certain digital activities.”

In short, European citizens are providing the value on the internet giants’ spreadsheets, but these companies are taking advantage of tax havens around the world. There is of course nothing illegal about what the technology firms are doing, but you have to question whether it is ethically sound to bleed these economies of their cash while contributing very little back to public services.

With Le Maire’s comments, there is seemingly confidence Europe will be able to reverse the on-going trends and hold the technology firms accountable.

This is of course not the first time the Gaggle has taken aim at Silicon Valley, and it is unlikely to go unnoticed by a political administration which is far more combative in its narrative than many before it. Europe has already punished many of the Silicon favourites with penalties for competition violations, it has waged war with Apple over its tax haven in Ireland, and its pro-privacy stance is proving to be a thorn in the side of both the internet giants and US intelligence agencies. Such a move could drive a wedge further between Europe and the US.

This is not a new story, but the draft document is perhaps the most significant step forward thus far. The attack has a distinctly French feel to it, and this is not the first time France has led the charge against the Silicon Valley profit trail. French President Emmanuel Macron made strides forward last year, but felt resistance from some countries, Ireland being one of them, which profit considerably from alternative tax rules.

The Irish government might not be rolling in riches, but a favourable corporate tax environment has seemingly done wonders to reinvigorate the tech scene. Apple reportedly employs 5,000 people in the country, Dropbox has its European HQ in Dublin, Intel has a significant presence, Google uses the country to re-route taxes, as did Facebook until recently, while about a quarter of Synopsys’ employees worldwide are based in Ireland. Why would Ireland want any change to the status quo? It certainly won’t be alone, as Luxembourg also benefits from an alternative tax set-up, though Le Maire believes resistance in these countries is weakening.

While this has been a long-standing narrative for the Gaggle of Red-tapers, this is not going to be an easy change to push through. Changes on this scale would have to be voted in unanimously, and you can expect some pretty aggressive lobbying from the technology firms; these are some pretty big companies with extensive legal teams.