Nvidia and partners to build seven AI supercomputers for the U.S. gov't with over 100,000 Blackwell GPUs and a combined performance of 2,200 ExaFLOPS of compute
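As a quick sanity check on the headline figures, dividing the quoted aggregate compute by the quoted GPU count gives the implied per-GPU throughput. Note this says nothing about precision: whether the number is plausible depends on whether the 2,200 ExaFLOPS figure refers to low-precision AI FLOPS (FP4/FP8) or to FP64.

```python
# Back-of-envelope check of the headline figures.
total_exaflops = 2_200   # headline aggregate (likely low-precision AI FLOPS)
gpu_count = 100_000      # headline GPU count

# 1 ExaFLOPS = 1,000 PetaFLOPS
per_gpu_petaflops = total_exaflops * 1_000 / gpu_count
print(per_gpu_petaflops)  # → 22.0 PFLOPS per GPU
```

A figure of roughly 22 PFLOPS per GPU is only reachable at very low precision, which supports reading the headline number as AI (not FP64) compute.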

Anton Shilov, Contributing Writer: Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

usertests: The systems will be used to build three-trillion-parameter AI simulation models as well as for classic scientific computing. Before anyone complains, these are dual-use systems with FP64 capability. And machine learning has been used effectively for scientific discoveries for years; for example, AlphaFold, which earned its creators a Nobel Prize in Chemistry.
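To put the three-trillion-parameter figure in perspective, a rough footprint estimate follows from parameters × bytes per parameter. The 2 bytes (FP16/BF16) and 192 GB of HBM per GPU below are illustrative assumptions, not figures from the article; optimizer state and activations would multiply the total further during training.

```python
# Rough memory footprint of a three-trillion-parameter model (weights only).
params = 3e12            # 3 trillion parameters
bytes_per_param = 2      # FP16/BF16 weights (assumption)
hbm_per_gpu_gb = 192     # Blackwell-class HBM capacity (assumption)

weights_tb = params * bytes_per_param / 1e12               # terabytes of weights
gpus_for_weights = params * bytes_per_param / (hbm_per_gpu_gb * 1e9)

print(weights_tb)        # → 6.0 (TB of weights)
print(gpus_for_weights)  # → 31.25 (GPUs just to hold the weights)
```

Even under these generous assumptions, dozens of GPUs are needed just to hold the weights, which is why such models are trained and served across large clusters.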

JRStern: Yeah, but you know they're just doing this because everybody's doing this and it sounds good. If you ask me, we're already at about 300% overcapacity and moving fast toward 2,500% overcapacity. We need a lot more R&D, which will do two things: first, it will fix some of the major shortcomings of current LLMs, and second, it will find a way to make current technology work about 1,000x cheaper. Then we'll be able to build 100x current capacity for less than 10% of what we've spent so far.

jp7189 (replying to JRStern): Let's hope these aren't used for LLMs. That's been done, and there is plenty of commercial competition to push that concept forward. Massive simulations should be the focus for these: nuclear power (hopefully), weapons (unfortunately), signals (generation, propagation, interference), medicine, just to name a few. All of these AI models benefit from more parameters and more precision, are important to our future, and don't have immediate ROI (meaning they're in the realm of government research rather than commercial interest).

JRStern (replying to jp7189): 100,000 B200s are optimized for LLMs and only LLMs. I suppose they're also good for keeping your coffee warm, but perhaps not optimal for that purpose.
