UK mobile chip design giant Arm has created specialised chip designs specifically for machine learning and object detection.
Arm, which at some stage in the past few months seems to have decided its name is no longer an abbreviation of Advanced RISC Machines and is instead a type of limb, is best known for providing the designs for mobile chips such as applications and baseband processors. As the tech world gets increasingly keen on artificial intelligence and mobile edge computing, it makes sense for Arm to get involved at a silicon level.
This has taken the form of Project Trillium, which is described as ‘a suite of Arm IP including new highly scalable processors that will deliver enhanced machine learning (ML) and neural network (NN) functionality.’ The point of it seems to be to equip mobile devices with a degree of autonomous (as opposed to cloud-based) machine learning capability that they currently lack.
“The rapid acceleration of artificial intelligence into edge devices is placing increased requirements for innovation to address compute while maintaining a power efficient footprint,” said Rene Haas, President of the IP Products Group at Arm. “To meet this demand, Arm is announcing its new ML platform, Project Trillium. New devices will require the high-performance ML and AI capabilities these new processors deliver.”
The main chip design is the ML processor, which puts a premium on scalability – presumably meaning more chips equals more ML power. On top of that Arm has launched a distinct design for object detection, which covers things like facial recognition and the identification of other objects via the device's camera. The two apparently perform even better in combination, and better still when you throw Arm's special neural network software into the mix.
Jem Davies, Arm’s GM of ML, has blogged on the launch and unsurprisingly thinks ML is the biggest thing since sliced bread. “In my opinion, the growth of machine learning represents the biggest inflection point in computing for more than a generation,” he blogged. “It will have a massive effect on just about every segment I can think of. People ask me which segments will be affected by ML, and I respond that I can’t think of one that won’t be.”
As a scuba diver, Davies chose a diving illustration to show how cool life could be when everything has ML chips embedded in it. You could have a heads-up display in your mask that provides real-time augmented reality information and even automated action, such as defensive counter-measures should a shark suddenly turn up unannounced.
AI, and the various other bits of computer cleverness generally associated with it, is very much in vogue in the mobile space these days. We've been broken in gently by cloud-driven smart assistants like Siri, but enabling much of that processing to be done locally offers clear advantages in latency, privacy and offline operation. On the back of Project Trillium, expect chip vendors, and consequently device vendors, to be offering novel AI features before long.