Arm has unveiled its first in-house data centre CPU. The chip, called the Arm AGI CPU, has been designed for Agentic AI – AI systems that reason, plan and act autonomously rather than simply responding to individual queries. The UK-based company has also released a list of technology companies that will be using the new processor.
Why this is important for Arm
Historically, Arm’s business model has been licensing its chip architecture to other companies, which then design and manufacture their own processors based on that technology. It is a model that has put Arm's architecture inside hundreds of billions of devices – from smartphones to servers. The latest announcement changes this business model in a fundamental way. For the first time, Arm is offering a complete chip of its own.
“AI has fundamentally redefined how computing is built and deployed. Agentic computing is accelerating that change. Today marks the next phase of the Arm compute platform and a defining moment for our company,” said Rene Haas, CEO of Arm.
The move gives partners a new option: rather than choosing between licensing Arm's intellectual property or adopting its Compute Subsystems, they can now deploy Arm-designed silicon directly.
What Arm’s new chip actually delivers
The Arm AGI CPU's technical specifications are tailored to the demands of Agentic AI workloads.
The processor packs up to 136 Arm Neoverse V3 cores per CPU, delivering leading performance per core with 6GB/s memory bandwidth per core at sub-100 nanosecond latency. It operates at a 300-watt thermal design power (TDP), with a dedicated core per program thread — meaning it can sustain consistent performance under heavy load without throttling or wasting idle threads.
In terms of density, the chip supports high-density 1U server chassis for air-cooled deployments with up to 8,160 cores per rack, and liquid-cooled systems capable of delivering more than 45,000 cores per rack. Compared to x86 CPUs, Arm claims the AGI CPU delivers more than 2x performance per rack — a gain it says could translate into up to $10 billion in capital expenditure savings per gigawatt of AI data centre capacity.
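As a quick sanity check of the figures above: 136 cores at 6GB/s each implies roughly 816GB/s of memory bandwidth per CPU, and the quoted per-rack core counts imply about 60 CPUs per air-cooled rack and around 330 per liquid-cooled rack. The short Python sketch below works through that arithmetic; all inputs are the article's figures, while the derived CPU counts and rack power are our own back-of-the-envelope estimates, not Arm specifications.

```python
# Back-of-the-envelope check of the Arm AGI CPU figures quoted above.
# Inputs are taken from the article; derived values are estimates only.

CORES_PER_CPU = 136            # Neoverse V3 cores per Arm AGI CPU
BW_PER_CORE_GBPS = 6           # memory bandwidth per core, GB/s
TDP_WATTS = 300                # per-CPU thermal design power

AIR_COOLED_CORES_PER_RACK = 8_160
LIQUID_COOLED_CORES_PER_RACK = 45_000   # quoted as "more than 45,000"

# Aggregate memory bandwidth per CPU
cpu_bandwidth_gbps = CORES_PER_CPU * BW_PER_CORE_GBPS   # 816 GB/s

# Implied CPU count per rack (integer division)
air_cpus_per_rack = AIR_COOLED_CORES_PER_RACK // CORES_PER_CPU       # 60
liquid_cpus_per_rack = LIQUID_COOLED_CORES_PER_RACK // CORES_PER_CPU # ~330

# Implied rack power budget for the CPUs alone (excludes memory, network, etc.)
air_rack_cpu_power_kw = air_cpus_per_rack * TDP_WATTS / 1000         # 18 kW

print(f"Per-CPU memory bandwidth: {cpu_bandwidth_gbps} GB/s")
print(f"Air-cooled rack: {air_cpus_per_rack} CPUs, "
      f"~{air_rack_cpu_power_kw:.0f} kW of CPU power")
print(f"Liquid-cooled rack: ~{liquid_cpus_per_rack} CPUs")
```

The 18kW CPU-only figure for an air-cooled rack sits comfortably within typical air-cooled rack power envelopes, which is consistent with Arm positioning the 1U air-cooled configuration as the standard deployment and liquid cooling as the high-density option.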
Meta is lead partner and co-developer
The most significant partnership in the launch is with Meta – the parent company of Facebook, Instagram and WhatsApp – which is not simply a customer but a co-developer of the chip. Meta worked alongside Arm to build the AGI CPU, and will deploy it to optimise the infrastructure underpinning its family of apps. The chip will work alongside Meta's own custom silicon — the Meta Training and Inference Accelerator (MTIA) — enabling more efficient orchestration in large-scale AI systems.
"We worked alongside Arm to develop the Arm AGI CPU to deploy an efficient compute platform that significantly improves our data center performance density," said Santosh Janardhan, head of infrastructure at Meta.
The full partner list
Beyond Meta, confirmed partners deploying the chip include Cerebras, Cloudflare, F5, OpenAI, Positron, Rebellions, SAP, and SK Telecom.
On the manufacturing and systems side, Arm has partnered with lead OEMs and ODMs including ASRock Rack, Lenovo, Quanta Computer and Supermicro.
The broader ecosystem supporting the platform's expansion into silicon now spans more than 50 leading companies across hyperscale computing, cloud, silicon design, memory, networking, software, and manufacturing. That list includes some of the most important names in the industry:
Amazon Web Services (AWS), Google Cloud, Microsoft Azure, and TSMC, which is manufacturing the Arm AGI CPU using its advanced 3nm process technology. Other major names lending their support include
Broadcom, Marvell, Micron, Samsung, SK hynix, Hugging Face, Databricks, Oracle Cloud, Red Hat, Snowflake, Cisco, Arista, MediaTek, GitHub, and many others.
Nvidia CEO Jensen Huang pointed to a partnership spanning nearly two decades: “Arm's adaptability has made it possible for us to integrate Arm across all of our platforms and for all different phases of AI. Together we're creating one seamless platform, from cloud to edge to AI factories.”