Anthropic in early talks to buy DRAM-less AI inference chips from UK startup — Fractile's SRAM architecture reduces need for pricey memory during extreme pricing

The Claude developer is exploring a fourth chip supplier alongside Nvidia, Google, and Amazon.


Fractile's chips aren't expected to reach commercial readiness until around 2027, placing any deployment well outside Anthropic's near-term procurement plans and roughly within the same window as its Google-Broadcom TPU partnership.

Founded in 2022 by Oxford PhD Walter Goodwin, Fractile is developing an inference chip that co-locates memory and compute on the same die using SRAM rather than shuttling data to separate DRAM chips. That data movement between the GPU and off-chip DRAM is one of the main bottlenecks in running large AI models at speed.
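To see why that data movement matters, a rough roofline-style estimate helps: during autoregressive decoding, each generated token must stream essentially every model weight through the compute units, so memory bandwidth sets a hard floor on per-token latency. The sketch below uses illustrative numbers only (a 70B-parameter FP16 model, HBM bandwidth in the range of a current datacenter GPU, and a purely hypothetical aggregate on-die SRAM bandwidth); it is not based on Fractile's actual figures, and it ignores whether the weights would physically fit in SRAM.

```python
# Back-of-envelope sketch of the memory-bandwidth floor on LLM decoding.
# All numbers are illustrative assumptions, not vendor specifications.

def decode_time_lower_bound(params_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Each decoded token must read (roughly) every weight once,
    so weight bytes / memory bandwidth bounds per-token latency from below."""
    return params_bytes / bandwidth_bytes_per_s

weights = 70e9 * 2   # 70B parameters at FP16 (2 bytes each): ~140 GB of weights
hbm_bw = 3.35e12     # ~3.35 TB/s off-chip HBM, roughly an H100-class figure
sram_bw = 100e12     # hypothetical aggregate on-die SRAM bandwidth

t_dram = decode_time_lower_bound(weights, hbm_bw)
t_sram = decode_time_lower_bound(weights, sram_bw)
print(f"DRAM-bound floor: {t_dram * 1e3:.1f} ms/token")
print(f"SRAM-bound floor: {t_sram * 1e3:.1f} ms/token")
```

Under these assumed numbers the off-chip-DRAM floor works out to tens of milliseconds per token, while the hypothetical on-die figure is an order of magnitude or more lower — which is the basic argument for in-memory compute architectures like the one Fractile describes.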

Goodwin told Fortune in July 2024 that Fractile's design stores data needed for computations directly next to the transistors that perform the arithmetic, rather than relying on off-chip DRAM. Based on simulations at the time, Goodwin said Fractile could run a large language model 100 times faster and 10 times cheaper than Nvidia's GPUs, though the company had not yet manufactured test chips.

The company raised $15 million in seed funding, co-led by Kindred Capital, the NATO Innovation Fund, and Oxford Science Enterprises. Fractile is now in talks to raise $200 million at a $1 billion-plus valuation, with Founders Fund, 8VC, and Accel among the potential investors. The Fractile team reportedly includes engineers from Graphcore, Nvidia, and Imagination Technologies, and the company is building its own software stack alongside the hardware.

Anthropic has deliberately avoided dependence on any single chip vendor, running Claude on Nvidia GPUs, Amazon's Trainium processors through Project Rainier, and Google's TPUs under a deal announced in October that provided over 1GW of compute capacity. In early April, that expanded to 3.5GW of TPU capacity from 2027 through 2031.

The interest in Fractile coincides with surging demand on Anthropic's existing infrastructure. The company's annualized revenue run rate passed $30 billion in March, up from around $9 billion at the end of 2025, and its inference costs have been a drag on gross margins. Unlike OpenAI and xAI, which are building or expanding their own massive data center footprints, Anthropic has opted to rent capacity from multiple providers and negotiate leverage through diversified chip supply.

