
The deal pulls one of the biggest custom ASIC designers into Nvidia's proprietary interconnect ecosystem.
Nvidia announced today that it has invested $2 billion in Marvell Technology and entered a partnership connecting Marvell to Nvidia's AI factory and AI-RAN ecosystem through NVLink Fusion, the tech that allows third-party silicon to plug directly into Nvidia's proprietary interconnect fabric.
Marvell is one of the two dominant custom ASIC design houses alongside Broadcom. Its clients include AWS, for which it has helped develop the Trainium series of AI accelerators, as well as Microsoft and Google. These custom chips exist, in large part, to give hyperscalers an alternative to buying Nvidia GPUs, making Nvidia's investment in the company somewhat noteworthy.
Per the deal, Marvell will provide custom XPUs and NVLink Fusion-compatible scale-up networking, while Nvidia will supply Vera CPUs, ConnectX NICs, Bluefield DPUs, NVLink interconnect, and Spectrum-X switches. The two companies will also collaborate on silicon photonics and AI-RAN infrastructure for 5G and 6G networks.
"The inference inflection has arrived. Token generation demand is surging, and the world is racing to build AI factories," said Jensen Huang, founder and CEO of Nvidia. "Together with Marvell, we are enabling customers to leverage Nvidia's AI infrastructure ecosystem and scale to build specialized AI compute."
NVLink Fusion, first announced in May last year, enables heterogeneous AI infrastructure where non-Nvidia accelerators can communicate with Nvidia GPUs, CPUs, and networking hardware over NVLink's high-bandwidth, low-latency fabric. Platforms built through the program must include at least one Nvidia product, whether a CPU, GPU, or switch.
Marvell's contribution, meanwhile, focuses on custom XPUs and high-speed optical interconnects. The company reported $8.2 billion in revenue for its fiscal year 2026 (ended January 2026), with data center revenue accounting for more than 74% of the total. Marvell's Celestial AI acquisition late last year added photonic fabric technology to its portfolio, and this deal now places that capability inside Nvidia's ecosystem.
"By connecting Marvell's leadership in high-performance analog, optical DSP, silicon photonics and custom silicon to Nvidia's expanding AI ecosystem through NVLink Fusion, we are enabling customers to build scalable, efficient AI infrastructure," said Matt Murphy, chairman and CEO of Marvell.