Mobile chip-maker Qualcomm reckons all the stuff it has learned about processing AI in smartphones will come in handy in datacentres too.
The Qualcomm Cloud AI 100 Accelerator is a dedicated chip designed to process artificial intelligence workloads in the cloud. Specifically, Qualcomm seems to think it has an advantage when it comes to ‘AI inference’ processing – i.e. running algorithms that have already been trained on loads of data. This stands to reason, as its chips are in millions of smart devices, all of which will have been asked to do some inference processing of their own from time to time.
“Today, Qualcomm Snapdragon mobile platforms bring leading AI acceleration to over a billion client devices,” said Qualcomm Product Management SVP Keith Kressin. “Our all new Qualcomm Cloud AI 100 accelerator will significantly raise the bar for the AI inference processing relative to any combination of CPUs, GPUs, and/or FPGAs used in today’s datacentres. Furthermore, Qualcomm Technologies is now well positioned to support complete cloud-to-edge AI solutions all connected with high-speed and low-latency 5G connectivity.”
The incumbent datacentre chips in question are largely provided by Intel, although Nvidia has done a great job of converting its struggling mobile chip efforts into a successful AI processing operation. Qualcomm claims a 10x performance-per-watt advantage over incumbent AI inference chips and, while it didn’t call out any competitors by name in its press release, the prominence of Intel and Nvidia in the headlines of other stories covering this launch makes it likely that comparison has been the angle pushed behind the scenes.