MediaTek defends itself after benchmark cheating accusations

After reports emerged suggesting MediaTek has been cheating the benchmarking system, the chipset manufacturer has vehemently defended its position.

It has been alleged by AnandTech that MediaTek has been misleading mobile enthusiasts with some clever code. In the firmware files, references were found tying benchmark apps to a so-called ‘sports mode’. When triggered (i.e. when a benchmark app has been initiated), performance on the phone was ramped up to give the impression of a faster chipset.

AnandTech claims the cheating was brought to light thanks to testing two different OPPO Reno 3 devices. The Reno 3 Pro (the European version) beat the Reno 3 (the Chinese version) in the PCMark benchmark utility, despite its Helio P95’s Cortex-A75 CPU cores being two generations older than the Dimensity 1000L’s Cortex-A77 CPU cores. And not only did the Reno 3 Pro have older CPU cores than the Reno 3, it had half as many.

The difference in the test results was slightly unusual, and when a ‘stealth’ benchmark app was used, the lower results were confirmed.

Why those in the industry feel it is necessary to cheat benchmarking tests is anybody’s guess. The negatives of being caught far outweigh the gains of impressing a few hyper-geeks, and the cheaters eventually do get caught. It is embarrassing, and some might question whether they are a reliable partner. The chipsets in question have been used in OPPO, Vivo, Xiaomi and Sony devices.

Following the original statement, which you can see at the foot of the article, an expanded blog post was offered to the industry.

“We do find it interesting that AnandTech has called into question the benchmarking optimizations on MediaTek powered devices, when these types of configurations are widely practiced across the industry,” MediaTek said. “If they were to review other devices, they would see, as we have, that our key competitor has chipsets that operate in the exact same way – what AnandTech has deemed cheating on device benchmarking tests.”

Although this is a reasonable explanation, it is still a bit fishy. It is perfectly understandable for performance to be ramped up for some applications, but the fact that the ‘sports mode’ is triggered by the initiation of a benchmarking app, as well as by other functions (gaming, for instance), suggests the aim is to fool the tests. Most reasonable individuals would assume these tests are performed in ‘normal’ mode.

Whether this is an adequate explanation we’ll let the court of public opinion decide, but it does seem somewhat of a flimsy excuse.

Original MediaTek statement:

MediaTek follows accepted industry standards and is confident that benchmarking tests accurately represent the capabilities of our chipsets. We work closely with global device makers when it comes to testing and benchmarking devices powered by our chipsets, but ultimately brands have the flexibility to configure their own devices as they see fit. Many companies design devices to run on the highest possible performance levels when benchmarking tests are running in order to show the full capabilities of the chipset. This reveals what the upper end of performance capabilities are on any given chipset.

Of course, in real world scenarios there are a multitude of factors that will determine how chipsets perform. MediaTek’s chipsets are designed to optimize power and performance to provide the best user experience possible while maximizing battery life. If someone is running a compute-intensive program like a demanding game, the chipset will intelligently adapt to computing patterns to deliver sustained performance. This means that a user will see different levels of performance from different apps as the chipset dynamically manages the CPU, GPU and memory resources according to the power and performance that is required for a great user experience. Additionally, some brands have different types of modes turned on in different regions so device performance can vary based on regional market requirements.

We believe that showcasing the full capabilities of a chipset in benchmarking tests is in line with the practices of other companies and gives consumers an accurate picture of device performance.

TomTom to fill the Google mapping void for Huawei

Dutch navigation specialist TomTom has been announced as Huawei’s replacement for Google’s mapping expertise, following the firm’s entry on the US Entity List.

While there were no doomsday sirens sounding when the US banned suppliers from working with Huawei, the trickle-down effects are starting to become much more prominent, especially in the consumer business unit which has fuelled so much growth over the last few years.

“We can confirm that developers can now use TomTom Maps APIs, Map content and traffic services via Huawei’s developer portal,” a TomTom spokesperson said.

Details are thin on the ground for the moment, though TomTom has confirmed it has entered into a multi-year agreement to act as the powerhouse behind navigation, mapping and traffic applications which will feature on Huawei devices.

Huawei’s friction with the White House has been well-documented over the last 12-18 months, though the impact seems to be more of a slow-burner than apocalyptic. When similar sanctions were placed on ZTE in 2017, the disruption to the vendor’s supply chain was almost an extinction-level event. Some US politicians might have hoped the same would be true for Huawei, though the damage is much more nuanced.

Thanks to the ‘Made in China 2025’ initiative, and perhaps more foresight from the management team, Huawei has a much more diverse supply chain and less of a reliance on the US than ZTE. When President Trump signed the executive order banning US suppliers from working with Huawei, it was certainly notable, but the impact was muted, evidenced by the fact Huawei’s revenues have continued to grow through the period.

But the consumer division, and Huawei’s smartphones in particular, presents some difficult questions. And almost all of them centre on Google.

No new Huawei devices will feature any of the Google applications. The immediate challenge is replacing the operating software, Android, but this is only the tip of the iceberg. For Huawei’s OS to be competitive, it needs to have a developer ecosystem, and for many of the applications to work properly, mapping data needs to be plugged into the applications.

While it might not have the reputation of Google, TomTom is certainly no stranger to the mapping and navigation game. Those who are a bit longer in the tooth might remember TomTom being a mapping innovator in the noughties, though it seemingly lost the battle for supremacy with Google. Few get the better of the Googlers, so there is little shame in that, though this could act as a springboard into a brighter future for TomTom.

TomTom claims to travel more than three million kilometres a year to collect mapping data, augmenting this information with satellite imagery, data from government and private sources, aerial imagery and field analysts. The business already has numerous partnerships in place with the likes of Subaru and Alfa Romeo (for the Stelvio) for driving navigation, as well as 5G initiatives with Verizon.

This is a critical step in validating the Huawei OS and developer ecosystem, as location-based data is now vital to the performance of many apps and security features. TomTom fills a noticeable hole.

What is worth noting is that while TomTom will offer mapping data to Huawei and the developer community, this should not be seen as a direct replacement for the Google Maps application. TomTom is providing basic navigation, which will be simple enough to replicate, though the embedded features will take time. Through Google Maps you can book tables at restaurants, see how busy trains are, and access reviews of local businesses, amongst other benefits. This will take a significant amount of time to replace.

US bolsters AI ambitions with Open Government Data Act

President Trump has signed the Open Government Data Act into law, potentially unleashing a tsunami of data for AI applications to be trained with.

The bill itself has been bouncing around Washington for some time, though it has now officially been signed into law. Within one year, all government agencies will have to ensure data sets are open and accessible to the general public and businesses, as well as presented in a format that can be processed by a computer without human intervention. The act also aims to make the data more accessible through smartphones.

“The government-wide law will transform the way the government collects, publishes, and uses non-sensitive public information,” said Sarah Joy Hays, Acting Executive Director of the Data Coalition, a public interest group which promotes transparency in government and business.

“Title II, the Open Government Data Act, which our organization has been working on for over three and a half years, sets a presumption that all government information should be open data by default: machine-readable and freely-reusable.”

For the digital ecosystem, such a bill should be welcomed with open arms. For any AI application to work effectively it needs to be trained. For years, many have claimed data is the new oil, although we suspect they did not mean it in this manner. If the US is to create a leadership position in the developing AI ecosystem, its applications will need to be the best around, and will therefore need the appropriate data sets to improve performance and accuracy.

Open data is, of course, not a new idea. Back in September, during Broadband World Forum in Berlin, we sat through several entertaining presentations from individual cities laying out their smart city ambitions. There was one common theme throughout the session: open data. These local governments realise the potential of empowering local digital ecosystems through open data, and the initiatives are proving to be successful.

This new law will force all federal agencies to make all non-sensitive data public in a machine-readable format and catalogue it online. Chief Data Officers must be appointed to oversee the process, and new procedures will be introduced. While it seems incredibly obvious, agencies proposing new laws or regulations will now have to justify the changes with supporting data. As it stands, only a handful of agencies are required to do this (the FCC is one of them), though this law ensures the validation and justification of new rules through data is rolled out across the board.

As with everything to do with data, there are of course privacy concerns. The text of the bill does seem to take this into account, with one clause stating that any data released to the public will have to adhere to the Privacy Act of 1974, though there are bound to be a few blunders. Such a risk should underline the importance of hiring a Chief Data Officer and a team of appropriately trained individuals. We suspect there are few current employees in the agencies who could ensure compliance here.

Of course, this is not a law which will make an immediate impact. As with any fundamental change such as this, procedures and systems will have to be updated. The procurement process is most likely (or at least we hope) underway, and there will certainly be growing pains.

That said, if the US wants to make a meaningful dent on the AI world, the right tools and data need to be put in the hands of the right people. This is a step in the right direction.

Feel like you’re being watched? Probably Google violating privacy rights

More often than not we’re writing positively about Google, but the ‘don’t be evil’ company has been caught tracking smartphone locations even when the user has opted out.

An investigation by the Associated Press, corroborated by researchers at Princeton University, found several Google services on Android and iOS devices have been storing users’ location data, even if the individual has set privacy settings to remain invisible. As privacy and the right to access personal data increasingly become hot topics, Google might have stepped on a bit of a PR and legal landmine.

Generally Google is quite upfront about discussing privacy and location enablement. It has faced various fines over the years for data-dodginess, and is even facing a European Commission investigation over allegedly coercing users into opting in to various services, though this latest episode is potentially either an example of extreme negligence, or of illegally misleading the consumer. Neither explanation is something Google execs would want to be associated with.

One of the issues here is the complexity of getting off the grid. Although turning location tracking off stops Google from adding location data to your account’s timeline, leaving ‘Web & App Activity’ on allows Google to collect other location markers.

We mentioned before this is either negligence or illegal activity, but perhaps it is just another example of an internet giant taking advantage of the fact that not everyone is a lawyer. The small print is often the best friend of Silicon Valley. Few would know about this little trick from the Googlers, which allows them to appear to be data privacy heroes while sneaking in through the slightly ajar window in your kitchen.

“When Google builds a control into Android and then does not honour it, there is a strong potential for abuse,” said Jesse Victors, Software Security Consultant at Synopsys.

“It is sometimes extremely important to keep one’s location history private; such as visiting a domestic violence shelter, for example. Other times you may simply wish to opt out of data collection. It’s disingenuous and misleading to have a toggle switch that does not completely work. This, and other examples before it, are one of the reasons why my phone runs LineageOS, a Google-free fork of Android.”

On its support page, Google states users can switch off location services for any of its services at any time, though this would obviously impact the performance of some. The Maps application, for example, cannot function without it, and does track user movements by the minute once switched on. With such opportunity for abuse, Google introduced pause features for some of its apps, allowing the user to become invisible for an undefined period of time.

The relationship with the user, and the concept of trust, is critical to Google. Revenues are generated by creating free services and embedding advertising platforms into them, though to remain relevant Google needs consumer data to improve its applications. Without constant upgrades and fine-tuning, Google could not maintain the dominant position it enjoys today.

Collecting this data requires trust. The user must trust that Google is not mishandling the data it acquires, and that it respects the user’s right to privacy. Without this element of trust between the user and Google, it would not be able to acquire this critically important insight. With this revelation, Google has put a dent in its own credibility and damaged the relationship with the user.

The impact on Google overall will of course be limited. There are too many good stories drowning out the negative, and ultimately the user needs Google. Such is the importance of Google’s services to the digital economy (or perhaps it should be worded as a lack of effective enough alternatives), we suspect few users will allow this invasion of privacy to impact their daily routines.

This is not supposed to be any form of validation for the contradictory ‘don’t be evil’ business, but more a sad truth of today.

Should privacy be treated as a right to protect stringently, or a commodity for users to trade for benefits?


Streaming approaches half of all music industry revenues

It might not be anywhere near as data-intensive as video, but the growing influence of music streaming is another part of the network congestion question which needs to be factored in.

In days of yesteryear, queues would form around the corner for the latest release from the next chart-topper, but those days are gone. Those of more advanced years might look back nostalgically at the anticipation of getting their hands on the latest Beatles banger (ask Scott, Ray or Iain for more details), but now gratification is all about a simple click of the mouse.

Music is an aspect of the digital economy which is rarely discussed from a telco perspective, but it will start to have an impact before too long. Video is of course the big headache when it comes to traffic management and network congestion, but there are many more moving cogs which will collectively have an impact; it is worth remembering them every now and then.

According to findings from MIDiA Research, music streaming is fuelling growth in the $17.4 billion global music industry, with a 39% year-on-year uplift in revenues to now represent 43% of the total revenues across the industry. An increase in revenues is the most obvious way to measure usage in the industry, but when popular streaming services like Spotify offer ‘all you can eat’ music, revenues do not perhaps tell the entire story.

Looking at the bitrates used by some of the more popular services, Apple Music uses a 256 kbps bitrate, which suggests an hour of music streaming would eat up 115 MB, while other apps offer multiple options. Google’s music offering has three categories: the bitrate on low ranges from 96 kbps to 128 kbps, medium is 256 kbps and high is 320 kbps. On low quality you could use between 43 MB and 58 MB in an hour, while on high it would be 144 MB. On Spotify, the default mobile bitrate is 96 kbps and the desktop default is 160 kbps, while users have the option of 320 kbps.
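For those wanting to sanity-check these figures, the arithmetic is simple: bitrate (kilobits per second) multiplied by seconds in an hour, divided by eight to convert bits to bytes. A quick sketch (the service labels are just illustrative groupings of the bitrates quoted above):

```python
# Estimate data consumed per hour of music streaming from a bitrate in kbps.
# Uses decimal megabytes (1 MB = 1,000,000 bytes), which is the convention
# that reproduces the figures quoted in the article.

def mb_per_hour(kbps: float) -> float:
    bits_per_hour = kbps * 1000 * 3600      # kilobits/sec -> bits/hour
    return bits_per_hour / 8 / 1_000_000    # bits -> bytes -> megabytes

# Bitrates quoted above, with illustrative labels
for label, kbps in [("Spotify mobile default", 96),
                    ("Spotify desktop default", 160),
                    ("Apple Music / Google medium", 256),
                    ("Google high / Spotify max", 320)]:
    print(f"{label}: {mb_per_hour(kbps):.1f} MB per hour")
```

Running this confirms the numbers in the text: 96 kbps works out to 43.2 MB an hour, 256 kbps to 115.2 MB, and 320 kbps to 144 MB.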

In terms of users, Spotify now has at least 71 million subscribers (as of December 2017) and 157 million active users worldwide. Apple has said it has 36 million subscribers, while Google has around 7 million (as of 2017), including its YouTube subscriptions. These are not mind-blowing numbers, but growth continues to be very healthy.

Over the course of 2017, users in the US spent more than 32 hours a week listening to music, up from 26.6 hours, according to Nielsen Music research, with on-demand streaming up 12.5% year-on-year. This might not sound massive, but streaming music has been normalised for years; the accelerated growth you usually see at the beginning was a long time ago, and 12.5% is still a significant number to bear in mind. These numbers will have a notable impact on the information highway.

Video is continuing to grow, but less data-intensive trends are also playing a role in the connected era. Music streaming might not be a game changer when it comes to network congestion, but when you add up all the minor impacts of music, gaming, navigation, messaging and so on, the headaches start to become a bit more varied. Always worth noting every now and then.