Researchers point to 1,300 apps which circumvent Android’s opt-in

Research from a coalition of academics has suggested Android location permissions mean little, as more than 1,300 apps have found ways around Google’s protections.

A team of researchers from the International Computer Science Institute (ICSI) has been working to identify shortcomings in the data privacy protections offered to users through Android permissions, and the outcome might worry a few. Through the use of side and covert channels, 1,300 popular applications around the world extracted sensitive information about the user, including location, irrespective of the permissions sought by or granted to the app.

The team has informed Google of the oversight, which will be addressed in the upcoming Android Q release, and received a ‘bug bounty’ for its efforts.

“In the US, privacy practices are governed by the ‘notice and consent’ framework: companies can give notice to consumers about their privacy practices (often in the form of a privacy policy), and consumers can consent to those practices by using the company’s services,” the research paper states.

This framework is a relatively simple one to understand. App providers give ‘notice’ to inform the user and provide transparency, while ‘consent’ ensures both parties have entered into the digital contract with open eyes.

“That apps can and do circumvent the notice and consent framework is further evidence of the framework’s failure. In practical terms, though, these app behaviours may directly lead to privacy violations because they are likely to defy consumers’ expectations.”

What is worth noting is that, while this sounds incredibly nefarious, it is nowhere near the majority. Most applications and app providers act in accordance with the rules and consumer expectations, assuming consumers have read the detailed terms and conditions. This is a small percentage of the apps which are installed en masse, but it is certainly an oversight worth drawing attention to.

Looking at the depth and breadth of the study, it is pretty comprehensive. Using a Google Play Store scraper, the team downloaded the most popular apps in each category; in total, more than 88,000 apps were downloaded thanks to the long tail of popularity. To cover all bases, however, the scraper also kept an eye on app updates, meaning 252,864 different versions of 88,113 Android apps were analysed during the study.

The behaviour of each of these apps was measured at the kernel, Android-framework and network-traffic levels, reaching scale using a tool called Android Automator Monkey. All of the OS execution logs and network traffic were stored in a database for offline analysis.
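To give a flavour of what that kind of automated exercising might look like in practice, the sketch below drives an installed app with pseudo-random UI events and dumps the device log for offline analysis. It is a minimal sketch, assuming adb and the stock UI/Application Exerciser Monkey are available; the package name, event count and output path are illustrative, not taken from the study.

    import subprocess

    # Minimal sketch: exercise an app with pseudo-random UI events and
    # capture the device log for offline analysis. Assumes adb is on the
    # PATH; the package name and output file are hypothetical.
    PACKAGE = "com.example.weatherapp"   # hypothetical app under test
    EVENT_COUNT = 500                    # pseudo-random UI events to fire

    def exercise_app(package: str, events: int) -> None:
        """Drive the app with pseudo-random UI events via the monkey tool."""
        subprocess.run(
            ["adb", "shell", "monkey", "-p", package, "-v", str(events)],
            check=True,
        )

    def dump_device_log(out_path: str) -> None:
        """Dump the device log to a file for later offline analysis."""
        with open(out_path, "w") as log_file:
            subprocess.run(["adb", "logcat", "-d"], stdout=log_file, check=True)

    if __name__ == "__main__":
        exercise_app(PACKAGE, EVENT_COUNT)
        dump_device_log("weatherapp_logcat.txt")

The real pipeline obviously goes much further, instrumenting the kernel and Android framework and capturing network traffic, but automated exercising of this sort is what lets the analysis reach 88,000-plus apps without a human tapping on each one.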

Now onto how app developers can circumvent the protections put in place by Google. With a ‘side channel’, the developer has discovered a path to a resource which sits outside the security perimeter, perhaps due to a mistake during the design stages or a flaw in applying the design. ‘Covert channels’ are more nefarious.

“A covert channel is a more deliberate and intentional effort between two cooperating entities so that one with access to some data provides it to the other entity without access to the data in violation of the security mechanism,” the paper states. “As an example, someone could execute an algorithm that alternates between high and low CPU load to pass a binary message to another party observing the CPU load.”
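To make the paper’s CPU-load example a little more concrete, a toy version of the sending side might look like the sketch below; the half-second bit period and busy-loop mechanics are our own assumptions rather than anything taken from the paper.

    import time

    # Toy sketch of the CPU-load covert channel described in the paper:
    # the sender encodes bits as alternating periods of high and low CPU
    # load. The bit period and spin/sleep mechanics are assumptions.
    BIT_PERIOD = 0.5  # seconds spent signalling each bit (assumed)

    def send_bits(bits: str) -> None:
        """Encode each bit as a period of high (spin) or low (sleep) CPU load."""
        for bit in bits:
            deadline = time.time() + BIT_PERIOD
            if bit == "1":
                while time.time() < deadline:
                    pass                    # busy-loop: high CPU load signals a 1
            else:
                time.sleep(BIT_PERIOD)      # idle: low CPU load signals a 0

    send_bits("1011")  # an observer sampling CPU load recovers 1, 0, 1, 1

A cooperating app sampling overall CPU load at the same interval could decode the message without ever holding the permission that guards the underlying data, which is precisely why covert channels are so awkward for a permission-based model.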

Ultimately, this is further evidence that the light-touch regulatory environment which has governed the technology industry over the last few years can no longer be allowed to persist. The technology industry has protested and quietly lobbied against any material regulatory or legislative changes, though the bad apples are spoiling the harvest for everyone else.

As it stands, under Section 5 of the Federal Trade Commission (FTC) Act, such activities would be deemed non-compliant, and we suspect the European Commission would have something to say with its GDPR stick as well. There are protections in place, though it seems there are elements of the technology industry who consider these more guidelines than rules.

Wholesale changes should be expected in the regulatory environment, and it seems there is little which can be done to prevent them. Politicians might be chasing PR points as various elections loom on the horizon, but the evolution of rules in this segment should be considered a necessity nowadays.

There have simply been too many scandals, too much abuse of grey areas and too many examples of oversight (or negligence, whichever you choose) to continue on this path. Of course, there are negative consequences to increased regulation, but the right to privacy is too important a principle for rule-makers to ignore; the technology industry has consistently shown it does not respect these values and will therefore have to be forced to do so.

This will be an incredibly difficult equation to balance, however. The technology industry is leading the growth statistics for many economies around the world, but changes are needed to protect consumer rights.

EE feels the sharp end of the opt-in stick

EE is the latest firm to feel the rising wrath of the Information Commissioner’s Office as it is forced to cough up £100,000 for opt-in violations during 2018.

The messages, which were sent back in early 2018, encouraged customers to use a new feature but also suggested device upgrades. EE claimed the communications were sent as ‘service messages’, but due to the presence of direct marketing, they fell foul of the guidance on electronic marketing put forward by the ICO.

“These were marketing messages which promoted the company’s products and services,” said Andy White, ICO Director of Investigations. “The direct marketing guidance is clear: if a message that contains customer service information also includes promotional material to buy extra products or services, it is no longer a service message and electronic marketing rules apply.

“EE Limited were aware of the law and should have known that they needed customers’ consent to send them in line with the direct marketing rules. Companies should be aware that texts and emails providing service information which also include a marketing or promotional element must comply with the relevant legislation or could face a fine up to £500,000.”

EE might feel a little hard done by here, though it is a pretty clear violation of the rules.

As these messages contained prompts to earn EE a few extra quid each month, they clearly fall into the marketing category. EE would have had to secure opt-in from these customers in the past, or, in the case of the ‘soft opt-in’, existing customers would have had to buy relevant products and been given the opportunity to opt out.

In this instance, the ICO accepted EE had not knowingly broken the rules, though as it did intentionally send out the messages it did not escape a fine. A second batch of messages was sent out to those who didn’t engage with the first, which probably didn’t help EE’s case.

Although this is a relatively minor fine, we expect to see a lot more of these investigations over the coming months. Rules around privacy and data protection are being toughened up, and the regulators need to be seen enforcing them. This fine might not be significant when you compare it to total revenues at the BT Group, but it is symbolic; we expect a few more of these ‘gestures’ sooner rather than later.

Maine gets tough on telcos over data economy

Maine Governor Janet Mills has signed new privacy rules into law, demanding more proactive engagement from broadband providers in the data-sharing economy.

While the rules are tightening up an area of the digital world which is under-appreciated at the moment, they will have their critics. The law itself targets the companies delivering connectivity to customers, the telcos, not the biggest culprits on data protection and privacy rights, the OTTs and app developers.

The rules are applicable to broadband providers in the state, both mobile and fixed, and force a more proactive approach in seeking consent. Telcos will now be compelled to seek affirmative consent from customers before being allowed to use, disclose, sell or permit access to customer personal information, except in a few circumstances.

As is on-trend with privacy rules, the ‘opt-out’ route, used by many to ensure the lazy and negligent are caught in the data net, has been ruled out.

There are also two clauses included in the legislation which block off any potentially coercive behaviour from the telcos:

  • Providers will not be allowed to refuse service to a customer who does not provide consent
  • Customers cannot be penalised or offered a discount based on their decision to provide or not provide consent

This is quite an interesting inclusion in the legislation. Other states, California for example, are building rules which will offer freedoms to those participating in the data-sharing economy if the spoils are shared with those providing the data (i.e. the customer), though the second clause removes the opportunity to offer financial incentives or penalties based on consent.

This is not to say rewards will not be offered however. There is wiggle room here, zero-rating offers on in-house services or third-party products for example, which does undermine the rules somewhat.

It is also worth noting that these rules only pertain to what the State deems personal data. Telcos can continue to monetize data which is not considered personal without seeking affirmative consent, unless the customer has written to the telco to deny it this luxury. Personal data covers the following categories:

  • Web browsing history
  • Application usage history
  • Geolocation
  • Financial
  • Health
  • Device identifiers
  • IP Address
  • Origin and destination of internet access service
  • Content of customer’s communications

What is worth noting is this is a solution to a problem, but perhaps not the problem which many were hoping would be addressed.

Firstly, the telcos are already heavily regulated, with some suggesting too heavily already. There are areas which need to be tightened up, but this is not necessarily the problem child of the digital era. The second point is the issue which we are finding hard to look past: what about the OTTs, social media giants and the app community?

The communications providers do need to be addressed, though the biggest gulf in regulation concerns the OTTs and app developers. These are companies which operate in a relatively light-touch regulatory environment and benefit considerably from it. There are also numerous examples of incidents which suggest they cannot be left to operate in such a regulatory landscape.

Although it is certainly a lot more challenging to put more constraints on these slippery digital gurus, these companies are perhaps the biggest problem with the data-sharing economy. Maine might grab the headlines here with new privacy rules, which are suitably strict in fairness, but the rule-makers seem to have completely overlooked the biggest problem.

These rules do not add any legislative or regulatory restraints on the OTTs or app developers, therefore anyone who believes Maine is taking a notable step in addressing the challenges of the data-sharing economy is fooling themselves. This is a solution, but not to the question which many are asking.

If 52% don’t understand data-sharing economy, is opt-in redundant?

Nieman Lab has unveiled the results of research suggesting more than half of adults do not realise Google is collecting and storing personal data through usage of its platforms.

The research itself is quite shocking and outlines a serious issue as we stride deeper into the digital economy. If the general population does not understand the basic principles behind the data-sharing economy, how are they possibly going to protect themselves against the nefarious intentions from the darker corners of the virtual world?

You also have to question whether there is any point in the internet players seeking consent if the user does not understand what he/she is signing up for.

According to the research, 52% of the survey respondents do not expect Google to collect data about a person’s activities when using its platforms, such as search engines or YouTube, while 57% do not believe Google is tracking their web activity in order to create more tailored advertisements.

While most working in the TMT industry would assume the business models of Google and the other internet players are common knowledge, the data here suggests otherwise.

Some 66% also do not realise Google will have access to personal data when they use non-Google apps, while 64% are unaware third-party information will be used to enhance the accuracy of adverts served on Google platforms. Surprisingly, only 57% of the survey respondents realise Google will merge the data collected across its own platforms to create profiles of users.

Although this survey has been focused on Google, it would be fair to assume the same respondents do not appreciate this is how many newly emerging companies are fuelling their spreadsheets. The data-sharing economy is the very reason many of the services we enjoy today are free, though if users are not aware of how this segment functions, you have to question whether Google and the other internet giants are doing their jobs.

The ideas of opt-in and consent are critically important nowadays. New rules in the European Union, the GDPR, set out significant changes to how companies collect, store and use the personal information gathered by service providers. These rules were supposed to enforce transparency and put the user in control of their personal information, though this research does not offer much encouragement.

If the research suggests more than half of adults do not understand how Google collects personal information or uses it to enhance its own advertising capabilities, what is the point of the opt-in process in the first place?

Reports like this suggest the opt-in process is largely meaningless as users do not understand what they are giving the likes of Google permission to do. The blame for this lack of education is split between the internet giants, who have become experts at muddying the waters, and the users themselves.

Those who use the services for free but do not question the continued existence of ‘free’ platforms should forgo the right to be annoyed when scandals emerge. Not taking the time to understand, or at least attempt to understand, the intricacies of the data-sharing economy is the reason many of these scandals emerge in the first place; users have been blindly handing power to the internet giants.

The internet players need to do more to educate the world on their business models, but the user does have to take some of the responsibility. We’re not suggesting everyone become an internet-economy expert, but gaining a basic understanding is not incredibly difficult. However, it does seem ignorance is bliss.

Google just about manages to avoid another massive fine

The High Court in the UK has quashed an attempted class-action lawsuit against Google over the illegal collection of iPhone users’ data during 2011-2012.

The case, which was first heard in May, was brought forward by a group called ‘Google You Owe Us’, headed up by Richard Lloyd, a former Executive Director at Which?. Lloyd believes that, having acted illegally, Google should financially compensate the iPhone users who were affected. The number of affected users has been estimated at 4.4 million, meaning Google would have been liable for damages of between £1 billion and £3 billion. As it stands, the class-action suit can no longer proceed, as it would be impossible to accurately calculate the number of iPhone users who were sufficiently impacted.

Google, which has already admitted to wrongdoing, used a practice now known as the ‘Safari Workaround’ to obtain sensitive information without the user’s permission. As a data controller in this instance, Google breached its responsibilities under the Data Protection Act, though this case was not about punishing the illegal activity but about seeking compensation for users affected as a result of it.

For those who are familiar with legal jargon, Justice Mark Warby concluded:

“In my judgment the facts alleged in the Particulars of Claim do not support the contention that the Representative Claimant or any of those whom he represents have suffered ‘damage’ within the meaning of DPA s 13. If that was wrong, the Court would inevitably refuse to allow the claim to continue as a representative action because members of the Class do not have the ‘same interest’ within the meaning of CPR 19.6(1) and/or it is impossible reliably to ascertain the members of the represented Class.”

Section 13 of the Data Protection Act states individuals who suffer ‘damage’ by reason of any contravention by a data controller (in this case, Google) are entitled to compensation. Civil Procedure Rules (CPR) rule 19.6(1) states each individual in the class would have to have suffered the same ‘damage’, though as only one case has been brought forward in the last six years, the damage cannot be logically concluded or attributed.

The case was brought to the courts in 2017. Lloyd claims Google profited from illegally collecting and processing information on iPhone users, which was then used in the ‘DoubleClick’ advertising business to create a hyper-targeted advertising service.

In the simplest of terms, Google managed to find a way of collecting information about users without going through the accepted opt-in route. Google wrote code which bypassed the opt-in on the Safari browser and placed a third-party cookie onto the iPhones. This practice, now known as the ‘Safari Workaround’, essentially allowed Google to track iPhone users, collecting information without seeking the appropriate opt-in.

Through the collection of sensitive information including race, social class, location data and interests, Google was able to build a detailed profile of individuals and organize the users into categories such as ‘football fans’. This categorization is critical to advertisers, who want to make sure ROI is as high as possible for every pound spent. At the time of the incident, 2011-12, such hyper-targeted advertising would have been a relatively new concept, with Google pushing the boundaries of what would be considered acceptable.

Google has already admitted to wrongdoing and has been punished by the relevant authorities in the US, paying out multi-million-dollar sums. In the UK, such investigations would fall under the jurisdiction of the Information Commissioner’s Office, though it has not taken any action to date. What is worth noting is that this ruling against Lloyd should not restrict any action from the ICO, as it relates to a class-action suit seeking compensation for victims of the wrongdoing, not the wrongdoing itself. It’s a nuance, but worth noting, as the ICO could in theory still take action.

For Google, this ruling will certainly come as a relief. Not only does it save the accountants from having to sign another multi-billion pound cheque, but it sets precedent. In stating it would not be possible for Lloyd to understand and identify the ‘damage’ done to each of the individual users, Justice Warby has made it more difficult for consumer groups to organize class action suits against major organizations.

The ripples of this ruling go further than the technology world. Consumer groups throughout the UK will have been watching this saga with interest, as the ruling sets a precedent for whether class-action suits are a realistic possibility in the UK. Justice Warby has not ruled out class-action suits altogether; he has simply stated that Lloyd was not able to attribute an appropriate amount to the ‘damage’ column. More work on the foundations will be needed for such class-action suits to progress in the future.

Feel like you’re being watched? Probably Google violating privacy rights

More often than not we’re writing positively about Google, but the ‘do no evil’ company has been caught tracking smartphone locations even when the user has opted out.

An investigation by the Associated Press, confirmed by researchers at Princeton University, found several Google services on Android and iOS devices have been storing users’ location data, even if the individual has set privacy settings to remain invisible. As privacy and the right to access personal data increasingly become hot topics, Google might have stepped on a bit of a PR and legal landmine.

Generally, Google is quite upfront about discussing privacy and location enablement. It has faced various fines over the years for data-dodginess and is even facing a European Commission investigation over its allegedly suspect coercion of users into opting in to various services, though this is potentially either an example of extreme negligence or of illegally misleading the consumer. Neither explanation is something Google execs would want to be associated with.

One of the issues here is the complexity of getting off the grid. Although turning location tracking off stops Google from adding location data to your account’s timeline, leaving ‘Web & App Activity’ on allows Google to collect other location markers.

We mentioned before this is either negligence or illegal activity, but perhaps it is just another example of an internet giant taking advantage of the fact that not everyone is a lawyer. The small print is often the best friend of Silicon Valley. Few would know about this little trick from the Googlers, which allows them to appear to be the data privacy hero while simply sneaking in through the slightly ajar window in your kitchen.

“When Google builds a control into Android and then does not honour it, there is a strong potential for abuse,” said Jesse Victors, Software Security Consultant at Synopsys.

“It is sometimes extremely important to keep one’s location history private; such as visiting a domestic violence shelter, for example. Other times you may simply wish to opt out of data collection. It’s disingenuous and misleading to have a toggle switch that does not completely work. This, and other examples before it, are one of the reasons why my phone runs LineageOS, a Google-free fork of Android.”

On the company support page, Google states users can switch off location services for any of its services at any time, though this would obviously impact the performance of some. The Maps application, for example, cannot function without it, and does track user movements by the minute once switched on. With such opportunity for abuse, Google introduced pause features for some of its apps, allowing the user to become invisible for an undefined period of time.

The relationship with the user and the concept of trust are critical to Google. Revenues are generated by creating free services and building advertising platforms into them, though to remain relevant Google needs consumer data to improve its applications. Without constant upgrades and fine-tuning, Google could not maintain the dominant position it enjoys today.

Collecting this data requires trust. The user must trust that Google is not mishandling the data it acquires, and that it respects the user’s right to privacy. Without this element of trust between the user and Google, it would not be able to acquire the critically important insight. With this revelation, Google has put a dent in its own credibility and damaged the relationship with the user.

The impact on Google overall will of course be limited. There are too many good stories to drown out the negative, and ultimately the user needs Google. Such is the importance of Google’s services to the digital economy, or perhaps the lack of effective enough alternatives, that we suspect few users will allow this invasion of privacy to impact their daily routines.

This is not supposed to be any form of validation for the contradictory ‘do no evil’ business, but more a sad truth of today.

Should privacy be treated as a right to protect stringently, or a commodity for users to trade for benefits?
