
It's tempting to assume that Nvidia and AMD are the only players in the AI accelerator space, at least for the time being. Other marques want a slice of that pie, though, and Qualcomm is among them. The Snapdragon maker has finally scored a big deployment, installing 1,024 AI100 chips at Saudi Arabia's Humain outfit, its CEO announced. There's only one slight issue: the AI100 was unveiled in 2019, and it's looking pretty old by today's standards.
The AI100 has been available as a drop-in card since mid-2023, but its architecture is now about six years old. It was a promising design at the time, banking on power efficiency for inference tasks, but it's a tough sell today: its small memory capacity (128 GB even in the Ultra variant) reportedly limits it to models of up to 32B parameters. In 2026 terms, that's peanuts, as contemporary reasoning models can be tens of times larger.
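The memory ceiling is straightforward back-of-envelope arithmetic: storing a model's weights at 16-bit precision takes two bytes per parameter, before counting KV cache or activations. A rough sketch of that math (the 128 GB and 32B figures come from the article; the precision choice and the assumption that roughly half the memory should stay free for cache and activations are illustrative, not from Qualcomm's specs):

```python
def weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Weight-only memory footprint in GB, ignoring KV cache and activations.

    1e9 parameters * bytes_per_param bytes, divided by 1e9 bytes per GB,
    simplifies to params_billions * bytes_per_param.
    """
    return params_billions * bytes_per_param

# A 32B-parameter model at FP16 (2 bytes per parameter):
fp16_gb = weight_memory_gb(32, 2.0)
print(fp16_gb)  # 64.0 GB of weights alone

# That leaves about half of the AI100 Ultra's 128 GB for KV cache
# and activations -- while a 70B model at FP16 (140 GB of weights)
# would already exceed the card's total capacity.
print(weight_memory_gb(70, 2.0))  # 140.0 GB
```

Quantizing to 8-bit would halve the weight footprint, but the same arithmetic shows why frontier-scale models remain out of reach on a single 128 GB card.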
As far as we can tell, Humain's deployment is the first at-scale installation of Qualcomm's AI wares, which suggests the U.S. company is exceedingly late to the party and more than a few dollars short. Even so, the AI100 racks must have some redeeming qualities for Humain to have bought them. With latest-gen chips on back order for months if not years, and every slab of silicon being hoovered up by the likes of OpenAI, Oracle, et al., perhaps Humain simply opted for what it could actually get its hands on, even if that means old hardware.
The Saudi outfit announced partnerships with Nvidia, AMD, and Qualcomm in May 2025. Those announcements earmarked 18,000 of Nvidia's GB300 Grace Blackwell accelerators and 500 MW worth of compute capacity from AMD hardware. Adobe is reportedly Humain's first AI datacenter customer, so one might hypothesize that the Qualcomm AI100 accelerators are adequate for basic image-fill and generation tasks. For its part, Qualcomm has already announced the AI200 chip for late 2026 and the AI250 for 2027. Let's hope the timeline actually sticks this time around.