Google challenges France’s first swing of the GDPR stick

Google has stated it will appeal the French regulator’s decision to dish out a €50 million fine for not being forthright enough about how it collects, stores and processes users’ personal data.

For Google, this is not about the money. €50 million is nothing to a company which generated $33.7 billion over the final quarter of 2018; it would take a matter of minutes to pay off this fine. However, should this ruling be allowed to stand, Google would have to alter its business model, as would the rest of the data-sharing economy, causing a very unwelcome, and potentially costly, disruption.

“The 50 million euro fine issued by the CNIL on 21 January 2019 significantly impacts Google as it directly challenges its business model based on the processing of personal data,” said Sonia Cissé, Head of TMT Practice of law firm Linklaters in Paris.

“Considering the seriousness of the CNIL’s findings and the broad publicity of this case, a potential appeal by Google is no surprise and makes perfect sense from a legal-strategy perspective.”

On Monday, France’s National Data Protection Commission (CNIL) dished out the fine for two violations of Europe’s General Data Protection Regulation (GDPR). Firstly, the search giant was not specific enough when requesting consent from users. Secondly, for users who wanted to dig deeper into Google’s data practices, the company made it unnecessarily difficult to see the entire picture. Google was being too vague and not accessible enough.

“Users are not able to fully understand the extent of the processing operations carried out by Google,” the CNIL said in a statement.

This is the first time a regulator has used GDPR to hold one of the internet giants accountable, but there are plenty of other cases in the pipeline. Google is of course not the only target, as various privacy advocates across the bloc lodge complaints against the likes of Spotify, Amazon and Apple, to name just a few.

In appealing this case, Google is making itself the tip of the spear for the entire internet ecosystem. There will be multiple appeals against the various rulings over the coming months because of how important precedent is in this saga. If Google were simply to let this ruling stand, it would effectively be validating the CNIL’s opinion, potentially undermining its own business model. If similar rulings start to appear across the continent, the disruption to the data-sharing economy would be massive.

“In all likelihood, Google will challenge the CNIL’s decision on two main grounds: (i) procedural aspects (i.e., the competence of the CNIL); and (ii) the content of the case (i.e., challenging the facts),” said Cissé.

“Should Google be able to demonstrate that Google Ireland Limited was its main establishment in the European Union (EU) at the time of the CNIL’s investigations, then the competence of the CNIL could be validly challenged.

“Second, the content of the decision is another ground for action, and it will be up to the French administrative judges to determine, in light of the circumstances at stake, whether the transparency requirements under GDPR were met or not.”

GDPR is an incredibly complicated set of rules mainly because there are so many different definitions and clauses, but also certain exemptions. In most cases, companies would have to obtain consent from users to use data for explicit purposes, retaining the data only until these purposes have been satisfied. However, companies do not have to obtain consent when it is necessary to comply with another law, or there are ‘legitimate interests’. It paints a complicated picture.

Of course, for those who are more privacy-sensitive, such rules and grey areas are a bounty of riches. The rules have created ample opportunity to challenge the internet giants’ business models, as well as the influence they have over the world. One of those challengers is privacy campaigner Max Schrems.

“We are very pleased that for the first time a European data protection authority is using the possibilities of GDPR to punish clear violations of the law,” Schrems said following the CNIL ruling.

“Following the introduction of GDPR, we have found that large corporations such as Google simply ‘interpret the law differently’ and have often only superficially adapted their products. It is important that the authorities make it clear that simply claiming to be compliant is not enough.”

Schrems’ firm, None Of Your Business (NOYB), has filed several complaints against other internet businesses on the grounds of accessibility. Those who will come under the scrutiny of Austrian courts include Apple, DAZN, Flimmit, Netflix and Amazon. More specifically, these complaints suggest the companies violated GDPR’s ‘right to access’, enshrined in Article 15 GDPR and Article 8(2) of the Charter of Fundamental Rights.

All of these cases will dictate how the internet economy will function over the coming years, but this battle between the CNIL and Google could prove to be a critical one, such is the power of precedent in the legal world.

“In a nutshell, it is highly difficult to identify certainties regarding the outcome of Google’s appeal,” said Cissé.

“Since data protection is a field of law particularly subject to interpretation and grey areas, one cannot exclude the possibility that Google could be successful in appealing the CNIL’s decision before the French Administrative Supreme Court. In any event, the ruling of the French administrative judges will be closely monitored by all the tech companies.”

France fines Google for being vague

The French regulator has swung the GDPR stick for the first time and landed it firmly on Google’s rump, costing the firm €50 million for transparency and consent violations.

The National Data Protection Commission (CNIL) has been investigating the search engine giant since May, when None Of Your Business (NOYB) and La Quadrature du Net (LQDN) filed complaints suggesting GDPR violations. The claims specifically suggested Google was not providing adequate information to the user on how data would be used or how long it would be retained, while also suggesting Google made the process of finding more information unnecessarily complex.

“Users are not able to fully understand the extent of the processing operations carried out by Google,” the CNIL said in a statement.

“But the processing operations are particularly massive and intrusive because of the number of services offered (about twenty), the amount and the nature of the data processed and combined. The restricted committee observes in particular that the purposes of processing are described in a too generic and vague manner, and so are the categories of data processed for these various purposes.”

This seems to be the most prominent issue raised by the CNIL. Google was being too vague when obtaining consent in the first instance, and for those who dug deeper, the rabbit hole became too convoluted.

Information on data processing purposes, data storage periods and the categories of personal data used for ad personalisation was spread across several pages and documents. This was deemed too complicated for any reasonable member of the general public to make sense of, and therefore a violation of GDPR.

When first obtaining consent, Google did not offer enough clarity on how data would be used, and therefore had no legal grounds to offer personalised ads. Secondly, the firm wove too vexing a maze of red tape for those who wanted to understand the implications further.

It’ll now be interesting to see how many other firms are brought to the chopping block. Terms of Service have been over-complicated documents for a long time now, with excessive jargon almost becoming best practice in the industry. Perhaps this ruling will ensure internet companies make the legal necessities more accessible; otherwise they might be facing the same swinging GDPR stick as Google has done here.

For those who are finding the NOYB acronym slightly familiar it might be because the non-profit recently filed complaints against eight of the internet giants, including Google subsidiary YouTube. These complaints focus on ‘right to access’ clauses in GDPR, with none of the parties responding to requests with enough information on how data is sourced, how long it would be retained for or how it has been used.

As GDPR is still a relatively new set of regulations for the courts to ponder (the complaints from NOYB and LQDN were filed almost as soon as the new rules came into force), this case gives some insight into how sharp the CNIL’s teeth are. €50 million might not be a monstrous amount for Google, but this is only a single ruling. There are more complaints in the pipeline, meaning the next couple of months could prove to be very expensive for the Silicon Valley slicker.

EU Advisor tells France to forget about global ‘right to be forgotten’

The Advocate General of the European Court of Justice has given his opinion on the ‘right to be forgotten’ conflict between France and Google, and it’s good news for the ‘do no evilers’.

Advocate General Maciej Szpunar has been pondering the implications of the ‘right to be forgotten’ saga for some months now, and his opinion is relatively simple: France does not have the right to impose its own considerations on a company’s operations outside its jurisdiction.

The French regulator can force Google to de-list search results on the grounds of privacy in France, and generally across the EU, though it does not have the authority to impose itself on the company’s worldwide footprint. As the Advocate General notes, the repercussions of such a ruling would have too much potential to cause damage in various other scenarios.

The case is somewhat of a tricky one, as it does have implications in the contentious world of privacy/free speech/accountability. And while the European Court of Justice does not have to follow the opinion of the Advocate General, it generally does.

“This is a really important case pitting fundamental rights to privacy against freedom of expression,” said Richard Cumbley, Partner and Global Head of Technology at law firm Linklaters. “The case highlights the continuing conflict between national laws and the Internet which does not respect national boundaries.

“The opinion contains a clear recommendation that the right to remove search results from Google should not have global effect. There are a number of good reasons for this, including the risk other states would also try and suppress search results on a global basis. This would seriously affect people’s right to access information.”

The case dates back to the early months of 2018, with the CNIL, France’s data protection watchdog, suggesting the search giant should have to enforce any ‘right to be forgotten’ rulings to all of its domains instead of just that of the home nation of the challenging regulator. Google, and various other free speech advocacy groups, have been suggesting France and the European Union are attempting to impose their own data privacy position on the rest of the world.

Looking at the ramifications, those of us with more long-term considerations would certainly be thankful for Szpunar’s opinion. As Cumbley points out above, this case could be used as evidence by other nations to suppress free speech or opinions which are not in line with the political climate. Precedent is everything in the legal community, and while it hopefully does not intend to, by trying to impose its privacy demands on Google, France may be aiding more authoritarian governments.

What is worth noting is that this opinion is not an official ruling from the European Court of Justice, though the court does generally follow the direction set by the Advocate General.

US contemplates its own version of GDPR

The U.S. National Telecommunications and Information Administration has started a 30-day public hearing process to gather comments on its policy options towards consumer privacy protection.

Shortly after Europe’s General Data Protection Regulation (GDPR) came into force in late May, “a global tidal wave of new and updated privacy regulations”, as it was called at the recent Digital Futures conference, followed hot on its heels. Regulations and laws passed in jurisdictions from India to California, with other markets in between, have largely been modelled after the European legislation.

In the latest move, on Tuesday September 25, the US federal government, through the National Telecommunications and Information Administration (NTIA), kick-started a month-long process to hear from the public on the approach towards privacy protection.

“The United States has a long history of protecting individual privacy, but our challenges are growing as technology becomes more complex, interconnected, and integrated into our daily lives,” said David Redl, NTIA Administrator and Assistant Secretary of Commerce for Communications and Information. “The Trump Administration is beginning this conversation to solicit ideas on a path for adapting privacy to today’s data-driven world.”

The feedback requested is two-fold. The first part is on the desired outcomes of any future privacy legislation. These include:

  • Organizations should be transparent about how they collect, use, share, and store users’ personal information.
  • Users should be able to exercise control over the personal information they provide to organizations.
  • The collection, use, storage and sharing of personal data should be reasonably minimized in a manner proportional to the scope of privacy risks.
  • Organizations should employ security safeguards to protect the data that they collect, store, use, or share.
  • Users should be able to reasonably access and correct personal data they have provided.
  • Organizations should take steps to manage the risk of disclosure or harmful uses of personal data.
  • Organizations should be accountable for the use of personal data that has been collected, maintained or used by its systems.

All of these are rather similar to what GDPR and the upcoming e-Privacy regulation are designed to achieve.

Meanwhile the NTIA is also requesting comments on the overall “High-Level Goals for Federal Action”, the key points including:

  • “Harmonize the regulatory landscape” between existing and future legislation;
  • “Legal clarity while maintaining the flexibility to innovate”, enabling new business models and technologies while privacy is protected;
  • “Comprehensive application” to “all private sector organizations that collect, store, use, or share personal data in activities that are not covered by sectoral laws”;
  • “Incentivize privacy research” in technologies and services that improve privacy protections;
  • The FTC should be the enforcement agency.

However, a few other points stand out that deserve a closer look. One probably deserves quoting in full:

Employ a risk and outcome-based approach.  Instead of creating a compliance model that creates cumbersome red tape—without necessarily achieving measurable privacy protections—the approach to privacy regulations should be based on risk modeling and focused on creating user-centric outcomes.  Risk-based approaches allow organizations the flexibility to balance business needs, consumer expectations, legal obligations, and potential privacy harms, among other inputs, when making decisions about how to adopt various privacy practices.  Outcome-based approaches also enable innovation in the methods used to achieve privacy goals.  Risk and outcome-based approaches have been successfully used in cybersecurity, and can be enforced in a way that balances the needs of organizations to be agile in developing new products, services, and business models with the need to provide privacy protections to their customers, while also ensuring clarity in legal compliance.

The NTIA’s focus is clearly to avoid heavy-handed measures dictating what can be done, instead giving businesses the flexibility to make their own judgement on what measures to take. This is in the same spirit as the first part of the consultation, which “focuses on the desired outcomes of organizational practices, rather than dictating what those practices should be.”

Another point that draws our attention relates to “Scalability”, which stresses that small companies operating in good faith, and third parties processing data on behalf of other organisations, should be treated differently from big companies that own and control personal data.

The two points above combine to send a balanced message to the internet giants, which are not necessarily the biggest fans of privacy regulations. While they are afforded more flexibility, they will also be treated more strictly if they contravene the rules. However, as we wrote earlier, because of their size, the Googles and Facebooks of the world are much quicker at ticking the compliance boxes.

One more point worth highlighting, probably more for entertainment than anything else, relates to “Interoperability” with other major global legislation. Here, for whatever reason, the document pointedly does not refer to GDPR but uses the example of the “APEC Cross-Border Privacy Rules System”.

In general, the NTIA’s approach is balanced and measured, which is largely in line with our attitude towards privacy protection. On one hand, we deplore the blatant abuse of privacy by companies like Facebook and Cambridge Analytica. On the other hand, we sympathise with the small and medium-sized businesses operating in Europe, most of which had to scramble together policies at the eleventh hour and may still fall foul of consumers. France’s data protection agency CNIL (Commission nationale de l’informatique et des libertés) registered a 64% increase in consumer complaints in the four months after GDPR came into force, compared with the same period last year.

As Mary Meeker highlighted, draconian laws could limit the exploratory nature of tech innovators. That many countries model their privacy legislation after GDPR confirms that Europe’s policymakers are “world-class in setting standards”, as a recent article in The Economist put it. But in the same article the newspaper also highlighted the gap between Europe and the AI leaders, China and the US, neither of which is a role model in guarding individual privacy, though for entirely different reasons.

In a recent Telecoms.com online poll, a third of the respondents agreed with the statement that there should be “flexible rules to allow users to trade privacy for benefits”. An optimal regulatory environment should give this minority the freedom to do so while providing the other two-thirds of consumers with strict privacy protection.

Google fights back against EU plans to impose its regulations on rest of world

Today the European Court of Justice will make a decision which will impact the global digital economy. Does the European Union have the right to impose its own data protection and privacy standards on everyone else?

The one-day hearing has been brought about by France’s data protection watchdog, CNIL, pressing for Google to extend ‘right to be forgotten’ rulings to all of its domains. When such a request is made and accepted, Google will remove content from search results in the relevant domain (e.g. .fr in France), and also from other domains (e.g. .com or .co.uk) when users from that country are searching through them. CNIL argues the content should be removed from all domains, irrespective of where the user is based.

“This case could see the right to be forgotten threatening global free speech,” said Thomas Hughes, Executive Director of free speech advocacy group Article 19. “European data regulators should not be allowed to decide what Internet users around the world find when they use a search engine. The CJEU (European Court of Justice) must limit the scope of the right to be forgotten in order to protect the right of Internet users around the world to access information online.”

While it might not seem like the most damning of cases, the ripples from this ruling could quickly become turbulent waves. Google and numerous other free speech advocacy groups argue this is simply France, and the European Union, pursuing their own form of censorship, imposing their own standards on other nations around the world. Should the judges rule in favour of CNIL, precedent would be set, and precedent can be very dangerous.

If the European Union can force other countries into complying with its regulations, why shouldn’t others?

“If European regulators can tell Google to remove all references to a website, then it will be only a matter of time before countries like China, Russia and Saudi Arabia start to do the same,” said Hughes. “The CJEU should protect freedom of expression not set a global precedent for censorship.”

The question these judges have to answer is a relatively simple one on the surface: should governments and regulators have influence only over those who live in their jurisdiction, or should they be afforded power over everyone else as well? For us, the answer is incredibly simple too: no, they shouldn’t.

The whole concept of the CNIL argument is contradictory and patronising; it’s a form of digital colonialism, with France assuming it is the moral, ethical and political authority on such matters. If China or Russia were pressing for their rules to be imposed on the international stage, there would be uproar. Of course, the rules in these countries are backwards, though the principle remains the same. France should not be allowed to dictate to other countries around the world.

This is another example of globalisation trends working against the consumer. Companies like Google make use of the grey areas and cracks between the legislative and regulatory regimes of different countries. They take advantage of lighter-touch regulation in some countries, remaining out of reach of more hands-on regulators. The absence of an international code or ruling authority offers the internet players a blank rule book and encourages lawyers to look for loopholes to sidestep regulations in more privacy-sensitive countries. That said, the will of one nation, or a dozen, or 28, should not be imposed on the rest of the world.

For Telecoms.com, the decision is a simple one: France should be told to govern its own country and not get involved in jurisdictions which do not concern it. The precedent set would be far too dangerous.