
It's easy to assume that Nvidia and AMD are the only players in the AI accelerator space, at least for the time being. Other marques want a slice of that pie, though, and Qualcomm is among them. The Snapdragon maker has finally scored a big deployment, installing 1,024 AI100 chips at Saudi Arabia's Humain, the outfit's CEO announced. There's only one slight issue: the AI100 was unveiled in 2019, and it's looking pretty old by today's standards.
The AI100 has been available as a drop-in card since mid-2023, but its architecture is now about six years old. At launch it was a promising design that banked on power efficiency for inference tasks, but it's a tough sell today: its small memory capacity (only 128 GB in the Ultra variant) limits the size of the models it can run, reportedly to those with up to 32B parameters. In 2026 terms, that's peanuts, as contemporary reasoning models can be tens of times larger.
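As a rough sanity check on that memory ceiling, a back-of-envelope sketch (assuming FP16 weights and ignoring KV-cache and runtime overhead; the figures are illustrative, not Qualcomm specifications):

```python
# Back-of-envelope: weight-storage footprint for an N-parameter model.
# Assumes FP16 (2 bytes per parameter) and 1 GB = 1e9 bytes.
def weight_memory_gb(num_params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate GB needed just to hold the model weights."""
    return num_params_billion * 1e9 * bytes_per_param / 1e9

# A 32B-parameter model needs ~64 GB for weights alone, leaving roughly
# half of a 128 GB card for KV cache and other runtime state.
print(weight_memory_gb(32))   # 64.0
# A 70B-class model already blows past the 128 GB Ultra variant.
print(weight_memory_gb(70))   # 140.0
```

This is why a 128 GB ceiling roughly squares with the reported ~32B-parameter limit once inference overhead is accounted for.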
As far as we can tell, Humain's deployment is the very first at scale for Qualcomm's wares, possibly signaling that the U.S. company is exceedingly late to the party and more than a few dollars short. Even so, the AI100 racks must have some redeeming qualities for Humain to have bought them. With latest-gen chips on back order for months if not years, and every slab of silicon being hoovered up by the likes of OpenAI and Oracle, perhaps Humain opted to go with what it could actually get its hands on, even if that means older hardware.
The Saudi outfit announced partnerships with Nvidia, AMD, and Qualcomm in May 2025. Said announcements earmarked 18,000 of Nvidia's GB300 Grace Blackwell accelerators, and 500 MW worth of compute capacity from AMD wares. Adobe is reportedly Humain's first AI datacenter customer, so one might hypothesize that the Qualcomm AI100 accelerators are fine for basic image-fill and generation tasks. For its part, Qualcomm has already announced its AI200 chip for late 2026 and AI250 for 2027. Let's hope the timeline actually sticks this time around.