
Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.
usertests: The systems will be used to build three-trillion-parameter AI simulation models, as well as for classic scientific computing. Before anyone complains: these are dual-use systems with FP64 capability, and machine learning has been used effectively for scientific discovery for years. For example, AlphaFold earned its creators a Nobel Prize in Chemistry.
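For a sense of scale on the three-trillion-parameter figure, here is a rough back-of-envelope sketch. The 2 bytes per parameter (FP16/BF16 weights) and roughly 192 GB of HBM per B200-class GPU are assumptions for illustration, not figures from the thread:

```python
# Back-of-envelope: memory footprint of a 3-trillion-parameter model.
# Assumed figures (not from the thread): 2 bytes per parameter for
# FP16/BF16 weights, and ~192 GB of HBM per B200-class GPU.
params = 3e12                      # 3 trillion parameters
bytes_per_param = 2                # FP16/BF16 storage
weights_tb = params * bytes_per_param / 1e12
gpus_needed = weights_tb * 1e12 / 192e9
print(f"Weights alone: {weights_tb:.0f} TB")
print(f"GPUs just to hold the weights: {gpus_needed:.1f}")
```

Note that this counts only the weights; training also needs optimizer state, gradients, and activations, so a real deployment would spread the model across far more GPUs than this minimum.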
JRStern: Yeah, but you know they're just doing this because everybody's doing it and it sounds good. If you ask me, we're already at about 300% overcapacity and moving fast toward 2,500% overcapacity. We need a lot more R&D, which will do two things: first, it will fix some of the major shortcomings of current LLMs, and second, it will find a way to make current technology about 1,000x cheaper. Then we'll be able to build 100x the current capacity for less than 10% of what we've spent so far.
jp7189, replying to JRStern: Let's hope these aren't used for LLMs. That's been done, and there is plenty of commercial competition to push that concept forward. Massive simulations should be the focus for these: nuclear power (hopefully), weapons (unfortunately), signals (generation, propagation, interference), medicine, just to name a few. All of these AI models benefit from more parameters and more precision, are important to our future, and don't have immediate ROI (meaning they're in the realm of government research rather than commercial interest).
JRStern, replying to jp7189 ("Let's hope these aren't used for LLMs"): 100,000 B200s are optimized for LLMs and only LLMs. I suppose they're also good for keeping your coffee warm, but perhaps not optimal for that purpose.