Paul Krill
Editor at Large

New Arm partnerships extend AI performance from edge to cloud

news
19 Sep 2024 | 2 mins
Generative AI | Machine Learning | Python

Integration with PyTorch and ExecuTorch brings Arm computing performance to the machine learning stack for developers.


Looking to bring AI and machine learning workloads to Arm-based hardware, Arm is integrating its Arm Kleidi AI acceleration technology with PyTorch and ExecuTorch, the new on-device inference runtime from PyTorch.

The arrangement, announced September 16, is intended to extend AI performance benefits from the edge to the cloud. With support for PyTorch and ExecuTorch, the next generation of apps can run large language models (LLMs) on Arm CPUs. Arm has partnered with PyTorch and TensorFlow to integrate Arm Kleidi libraries directly into these frameworks. KleidiAI, a library for AI/ML framework developers, is also slated to be integrated into ExecuTorch in October 2024.
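For developers targeting on-device inference, the ExecuTorch path begins with a standard PyTorch model that is exported ahead of time. The sketch below is illustrative rather than drawn from Arm's announcement: it uses ExecuTorch's published export flow, the module, input shape, and output filename are placeholders, and any Kleidi acceleration would apply transparently inside the runtime rather than through a Kleidi-specific call.

```python
# Illustrative sketch: exporting a small PyTorch module to an ExecuTorch
# program (.pte) for on-device inference. The module, shapes, and filename
# are placeholders; no Kleidi-specific API is involved.
import torch
from executorch.exir import to_edge


class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(8, 8)

    def forward(self, x):
        return torch.nn.functional.relu(self.linear(x))


example_inputs = (torch.randn(1, 8),)

# Capture the model as an exported program, lower it to the Edge dialect,
# then serialize it to the format consumed by the ExecuTorch runtime.
exported_program = torch.export.export(TinyModel(), example_inputs)
executorch_program = to_edge(exported_program).to_executorch()

with open("tiny_model.pte", "wb") as f:
    f.write(executorch_program.buffer)
```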

Since its launch four months ago, Kleidi has accelerated development and unlocked major AI performance uplifts on Arm CPUs, said Alex Spinelli, vice president of developer technology at Arm. In the cloud, Kleidi builds on Arm's existing work enhancing PyTorch with the Arm Compute Library (ACL), which offers a blueprint for optimizing AI on Arm. With Kleidi, applications benefit from performance improvements as new framework versions are released, without developers having to take extra steps to build on Arm today.
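In practice, "no extra steps" means the code looks like ordinary PyTorch. The following sketch is not taken from the announcement: it runs CPU inference with a Hugging Face Transformers causal LLM on an Arm machine, the model identifier is a placeholder, and any Kleidi-backed kernels would be used internally by the framework without appearing in the code.

```python
# Ordinary PyTorch CPU inference on an Arm-based machine. The model name is
# a placeholder; no Arm- or Kleidi-specific calls are needed, since any
# acceleration is integrated inside the framework's CPU kernels.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "example-org/tiny-llm"  # hypothetical model identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Summarize why on-device inference matters, in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

with torch.inference_mode():
    output_ids = model.generate(**inputs, max_new_tokens=64)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```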

Kleidi is built on three pillars:

  • Open Arm technology directly integrated into key frameworks, enabling LLMs to access the performance of Arm CPUs.
  • Developer empowerment through resources including usage guidance, learning paths, and demonstrations.
  • An ecosystem of machine learning software providers, frameworks, and open source projects with access to the latest AI features.

Spinelli said Arm is working closely with key parts of the machine learning stack for developers, including cloud service providers such as AWS and Google, as well as the growing community of machine learning ISVs such as Databricks. Arm is also building demonstration software stacks to show developers how to build AI workloads on Arm CPUs.
