Intel looks to maps to fuel autonomous vehicles race

Whoever wins the autonomous car race will make a fortune, so Intel is doubling down on its efforts. Millions are being directed towards R&D, and building its own mapping database is another shrewd move.

Speaking at CES in Las Vegas, Mobileye (an Intel business) CEO Amnon Shashua said two million cars from BMW, Nissan and Volkswagen will be fitted with a front-facing camera that collects location and environment data, which is then aggregated in the cloud to build high-definition maps under Mobileye’s Road Experience Management (REM) program. It might not be the most exciting aspect of autonomous vehicles, but an extensive mapping database is a good tool to have.
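To make the idea concrete, here is a minimal sketch of how crowdsourced map-building can work in principle: noisy roadside observations from many cars are binned into tiles and averaged into map features. The tile size, data model and aggregation rule are illustrative assumptions, not details of Mobileye’s actual REM pipeline.

```python
# A minimal sketch of crowdsourced map-building. None of this reflects
# Mobileye's actual REM implementation; the tile size, data model and
# aggregation rule are illustrative assumptions.
from collections import defaultdict
from statistics import mean

TILE_DEG = 0.001  # assumed tile size: roughly 100m squares

def tile_key(lat, lon):
    """Quantize a GPS fix into a map tile identifier."""
    return (round(lat / TILE_DEG), round(lon / TILE_DEG))

def aggregate_observations(observations):
    """Fuse per-vehicle landmark sightings into averaged map features.

    observations: iterable of (lat, lon, landmark_type) tuples, as a
    fleet of camera-equipped cars might upload them.
    """
    buckets = defaultdict(list)
    for lat, lon, landmark in observations:
        buckets[(tile_key(lat, lon), landmark)].append((lat, lon))
    # Averaging many noisy sightings gives a position far more precise
    # than any single consumer-grade GPS fix.
    return {key: (mean(p[0] for p in fixes), mean(p[1] for p in fixes))
            for key, fixes in buckets.items()}

# Three cars report the same stop sign with slightly different GPS noise.
uploads = [(48.13761, 11.57539, "stop_sign"),
           (48.13763, 11.57541, "stop_sign"),
           (48.13759, 11.57540, "stop_sign")]
print(aggregate_observations(uploads))
```

The point of the averaging step is that a fleet of cheap cameras and consumer-grade GPS units can, collectively, produce a map more precise than any individual sensor could.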

The boring parts of a technology are usually among the most important, and this is no different. For an autonomous vehicle to work, it has to know what is going on around it. This isn’t simply a case of collecting visual data from the immediate surroundings, but also being able to plan a journey at the outset, or adjust that plan using data collected from other vehicles further ahead. It doesn’t have the bouncing excitement of AI processors or super-sensitive cameras, but it is just as (or arguably more) important.

Aside from the BMW, Nissan and Volkswagen cars, relationships have also been announced with NavInfo and SAIC Motor, allowing the team to collect data in China. Considering most companies with extensive mapping databases (Google, Uber, etc.) have had difficulties operating in the Chinese market, these could be very significant partnerships.

As it stands, there are very few organizations which could answer the industry’s calls for suitably detailed mapping data, but Intel could soon be one of them. Assuming enough vehicles are sold, the database could be extensively populated without dedicated data-collection projects of the kind Uber is running now and Google has been running for years. Such a database could make Intel a very attractive company for car brands to work with.

For the moment, this push is seemingly all about mapping data, but assuming the cameras make it onto enough vehicles, and the team is able to nail real-time data analytics, the database will only become more useful. Intel has said the cameras on ADAS-equipped vehicles are intelligent agents that can also collect dynamic data. Data on hazards, construction, traffic density and weather could be routed to other vehicles to allow for more efficient driving.
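As a rough illustration of that routing idea, the sketch below picks out which vehicles are close enough to a reported hazard to be worth notifying. The message content, notification radius and routing rule are invented for the example; Intel has not published how its system would handle this.

```python
# A toy sketch of routing dynamic hazard data between vehicles. The
# message content, notification radius and routing rule are invented
# for illustration; Intel has not published its actual mechanism.
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def vehicles_to_notify(hazard_pos, fleet, radius_km=5.0):
    """Pick the vehicles close enough for a hazard report to matter."""
    return [vid for vid, pos in fleet.items()
            if haversine_km(hazard_pos, pos) <= radius_km]

fleet = {"car_a": (52.5200, 13.4050),   # near the hazard, in Berlin
         "car_b": (52.5205, 13.4060),
         "car_c": (53.5511, 9.9937)}    # far away, in Hamburg
print(vehicles_to_notify((52.5201, 13.4052), fleet))  # ['car_a', 'car_b']
```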

Mobileye recently signed a next-step agreement with Volkswagen to formalize the collection and marketing of this data, while there are also relationships with the city of Dusseldorf, Spain’s Directorate-General of Traffic, Gett Taxi, Berkshire Hathaway GUARD Insurance, and Buggy TLC Leasing for the use of this information.

Data is the new oil in the digital economy, and should Intel be able to hoard enough through activities like this, it could turn out to be a very useful revenue stream. All of a sudden, Intel is no longer just a manufacturer of processors, but a lean, mean data machine with a business model suited to the connected era.

Right now, the team has been shouting about Level 2/3 autonomous driving (some self-driving capabilities, but humans still need to stay alert), as well as Level 2+ breakthroughs. Level 2+ is a new one for us, but these sorts of claims generally appear when the engineering team is struggling for a breakthrough and the marketing team needs something new to say. Intel isn’t a multi-national for no reason; it has some of the finest PR minds and spin gurus in the business.

Intel won’t be able to make money from this information, or at least any serious cash, for some time, but as long as it keeps collecting, the data will be an incredibly valuable resource once society is ready for the mass-market penetration of autonomous vehicles.

Nvidia claims autonomous driving breakthrough, but let’s see

Nvidia has attempted to jump-start the CES PR euphoria, claiming it can achieve Level 5 autonomous driving right now with its Xavier processors.

The chip itself was initially announced 12 months ago, but this quarter has seen the processor delivered to customers. Testing has begun, and Nvidia has been stoking the fire with a very bold claim.

“Delivering the performance of a trunk full of PCs in an auto-grade form factor the size of a license plate, it’s the world’s first AI car supercomputer designed for fully autonomous Level 5 robotaxis,” Nvidia said on its blog.

Hyping up a product to almost undeliverable heights is of course nothing new in the tech world, and Nvidia has learned from the tried-and-tested playbook: make an extraordinary claim for a technology which is unlikely to be delivered to the real world for decades.

Xavier, which will form part of Nvidia’s Drive software stack, contains 9 billion transistors. It is the product of a four-year project that soaked up $2 billion in research and development funds, with contributions from 2,000 engineers. It is built around an 8-core CPU, a 512-core Volta GPU, a deep learning accelerator, computer vision accelerators and 8K HDR video processors. All to deliver Level 5 autonomous driving.

Just as a recap, Level 5 autonomous driving is the holy grail; at that level, humans are not needed to interact with the car at all. The full taxonomy, also sketched in code after this list, runs as follows:

  • Level 0: Full-time performance by the human driver.
  • Level 1: Driving assistance with either steering or acceleration/deceleration, using information about the driving environment. The human drives the rest of the time.
  • Level 2: The system can be responsible for both steering and acceleration/deceleration, using information about the driving environment. This could be described as hands-off automation.
  • Level 3: Officially known as conditional automation. The autonomous driving system is responsible for almost all aspects of the dynamic driving task, but humans still need to stay aware and intervene in certain circumstances. This could be described as eyes-off automation.
  • Level 4: The car is almost fully autonomous, though there might be rare circumstances where a human would have to intervene. This could be described as mind-off automation.
  • Level 5: Full autonomy. You don’t even have to be awake.
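For those who prefer code, the same taxonomy maps naturally onto a simple enumeration. The comments below paraphrase the list above rather than the official SAE J3016 wording, and the helper function is purely illustrative.

```python
# The SAE-style taxonomy above, expressed as an enumeration. The
# comments paraphrase this article's list, not the J3016 standard text.
from enum import IntEnum

class AutonomyLevel(IntEnum):
    L0 = 0  # full-time performance by the human driver
    L1 = 1  # assistance with steering OR speed; human does the rest
    L2 = 2  # system handles steering AND speed ("hands off")
    L3 = 3  # conditional automation; human must stay aware ("eyes off")
    L4 = 4  # near-full autonomy, rare interventions ("mind off")
    L5 = 5  # full autonomy; no human interaction needed

def driver_must_stay_alert(level: AutonomyLevel) -> bool:
    """Below Level 4, a human must remain ready to take over."""
    return level < AutonomyLevel.L4

print(driver_must_stay_alert(AutonomyLevel.L2))  # True
print(driver_must_stay_alert(AutonomyLevel.L5))  # False
```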

During the same pre-CES event, the team also announced AR products, new partnerships and solutions in the gaming space, but Level 5 autonomy is the headline maker. Reaching this level is all well and good, but the claim does not have a foothold in reality just yet. Nvidia might be there in terms of technological development, or so it claims, but that does not mean autonomous cars will be hitting the roads any time soon. Not by a long way.

Firstly, while the processors might be there, the information is not. Companies like Google have been doing a fantastic job of creating mapping solutions, but the detail is still not there for every single location on the planet. Until you can accurately map every scenario and location a car may end up in, it is impossible to state with 100% confidence that Level 5 autonomous vehicles are achievable.

Secondly, to live the autonomous dream, a smart city is necessary. To optimize driving conditions, the car will need to receive data from traffic lights to understand the flow of vehicles, as well as any unusual circumstances. To ensure safety and performance, connectivity will have to be ubiquitous. The smart city dream is miles away, and therefore the autonomous vehicle dream is even further.
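To see why that infrastructure matters, consider a hypothetical vehicle-to-infrastructure (V2I) exchange: a traffic light broadcasts its phase, and an approaching car computes a speed that lets it arrive just as the light turns green. The message schema and numbers here are invented for illustration, not drawn from any real deployment.

```python
# A hypothetical V2I exchange. Real standards (e.g. the SAE J2735 SPaT
# message) are far richer; this schema is invented for illustration.
from dataclasses import dataclass

@dataclass
class SignalPhase:
    intersection_id: str
    state: str                # "red", "amber" or "green"
    seconds_remaining: float  # time until the state changes

def advisory_speed(distance_m: float, phase: SignalPhase,
                   limit_kmh: float = 50.0) -> float:
    """Suggest a speed that reaches the light as it turns green,
    so the car never has to stop (the 'green wave' idea)."""
    if phase.state == "green":
        return limit_kmh
    # Speed needed to cover the remaining distance as the light changes.
    required_kmh = (distance_m / phase.seconds_remaining) * 3.6
    return min(limit_kmh, required_kmh)

# A car 200m from a red light with 20s left should cruise at 36 km/h.
phase = SignalPhase("junction_42", "red", seconds_remaining=20.0)
print(round(advisory_speed(200.0, phase), 1))  # 36.0
```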

Thirdly, even if the technology is there, everything else isn’t. Regulations are not set up to support autonomous vehicles, and neither are the insurance industry or the judicial system. If an autonomous vehicle is involved in a fatal incident, who gets prosecuted? Do individuals need to be insured if they are asleep in the car? There are many unanswered questions.

Finally, when will we accept autonomous vehicles? Some people are incapable of sitting in the passenger seat while a loved one drives; how will these individuals react to a computer taking charge? Culturally, it might be a long time before the drivers of the world are comfortable handing control over to a faceless piece of software.

Nvidia might be shouting the loudest in the race to autonomous vehicles right now, but let’s put things in perspective: shouting loudest doesn’t actually mean anything yet.