
Moving to modern games, our intrepid tester tried Arc Raiders, Cyberpunk 2077, and CS2. Despite all three titles lacking multi-GPU support, Arc Raiders and CS2 ran at playable frame rates, albeit at extremely low graphics settings. At the lowest settings, with 70% resolution scaling at 1080p, Arc Raiders ran at around 40-45 FPS. CS2 did better, managing 120-160 FPS depending on the scene. Cyberpunk 2077 fared the worst on this card, producing just 20-30 FPS at the lowest settings with FSR set to its Ultra Performance profile.
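To put those settings in perspective, here is a quick back-of-the-envelope sketch of the actual render resolutions involved. It assumes FSR's Ultra Performance mode uses its usual 3x per-axis scale factor; the numbers are illustrative, not measured.

```cpp
// Back-of-the-envelope render resolutions for the settings quoted above.
// Assumption: FSR's Ultra Performance mode uses its usual 3x per-axis scale
// factor. These are illustrative figures, not measured values.
#include <cstdio>

int main() {
    const int w = 1920, h = 1080; // native 1080p

    // Arc Raiders / CS2: 70% resolution scaling at 1080p.
    std::printf("70%% scale: %dx%d\n", w * 7 / 10, h * 7 / 10); // 1344x756

    // Cyberpunk 2077: FSR Ultra Performance renders at 1/3 per axis.
    std::printf("FSR UP:    %dx%d\n", w / 3, h / 3);            // 640x360
    return 0;
}
```

In other words, the card was pushing well under half of 1080p's pixel count in Arc Raiders and CS2, and roughly a ninth of it in Cyberpunk 2077.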
The S10000 ranked among the most powerful workstation cards of 2012, as Nvidia had shied away from dual-GPU professional cards at the time. It sported two Tahiti GPUs, each with 1,792 shader cores, 112 TMUs, 32 ROPs, and 28 CUs. Each GPU also had its own 384-bit memory interface connected to 3GB of GDDR5.
RandomInGamingHD's work to resurrect his particular S10000 shows how potent the dual-GPU graphics card was in its day. It was essentially an HD 7990 consumer graphics card, but with disabled cores on each die. If the second GPU worked in today's titles, the S10000 would likely be capable of running Arc Raiders near 60 FPS and Cyberpunk 2077 at a playable frame rate, but with CrossFire and SLI support long gone from modern games, this is the best performance we'll ever see from this graphics card in those titles.
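To put the "disabled cores" claim in numbers, here is a rough sketch of how cut down each S10000 die is relative to a fully enabled Tahiti chip, as used per die in the HD 7990. The full-chip figures (2,048 shaders across 32 CUs) are standard Tahiti specs, not taken from the article itself.

```cpp
// Rough sketch: how cut down each S10000 die is versus a fully enabled
// Tahiti chip. Assumption: full Tahiti has 2,048 shaders across 32 CUs
// (standard spec, not from the article), with 64 shaders per GCN CU.
#include <cstdio>

int main() {
    const int shaders_per_cu = 64; // GCN: 64 stream processors per CU
    const int s10000_cus = 28;     // per die, from the article
    const int full_cus = 32;       // assumed fully enabled Tahiti

    std::printf("S10000 per die: %d shaders (%d CUs)\n",
                s10000_cus * shaders_per_cu, s10000_cus); // 1792
    std::printf("Full Tahiti:    %d shaders (%d CUs)\n",
                full_cus * shaders_per_cu, full_cus);     // 2048
    std::printf("Enabled:        %.1f%% of the full die\n",
                100.0 * s10000_cus / full_cus);           // 87.5%
    return 0;
}
```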
Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.
teeejay94 Confusing as to why AMD would drop support with cards like this still in the wild. They just want dual GPUs to go away so bad, and you know why? It has nothing to do with a lack of technology; both Nvidia and AMD just don't want to put the money forth to make this tech good. They're literally just being cheap AF. Record profits instead of record customer satisfaction. I think this is probably the lowest satisfaction rate we've ever seen, especially from Nvidia. That's why Nvidia also relies on people who don't know what they're talking about; they'll just go on and on all day about how great something is without a care at all for objectivity. That's the one thing Nvidia especially hates: objective opinions. Jensen is basically going, "won't you just like this product already?" No, I won't. Why would I like a 5090 just because you finally upped the VRAM after we begged for more for 10 years? HA!
usertests teeejay94 said: It has nothing to do with a lack of technology; both Nvidia and AMD just don't want to put the money forth to make this tech good. They're literally just being cheap AF. Dual cards are obviously difficult to support because of latency and other issues. The vendors might end up converging on something similar once GPUs start using multiple GCDs, but those will operate differently and be seen as single units by games. Despite RDNA 3 using chiplets, we still haven't seen consumer GPUs adopt multiple GCDs, and everything consumer-oriented from Nvidia has been monolithic. High-end VR might work with dual (two separate) GPUs targeting each eye independently, and there are ways to use more than one GPU, like Lossless Scaling on a second GPU, or the PhysX trick that has been made temporarily obsolete. Otherwise, it's dead.
User of Computers teeejay94 said: Confusing as to why AMD would drop support with cards like this still in the wild. ...alternatively, the card is over a decade old, and driver support for even the single-GPU cards has been gone for two or three years. Something tells me this isn't a grand conspiracy; I think the card is just ancient. I'm not sure why you're foaming at the mouth over this.
blppt Remember when we were promised that dual GPUs would survive, and possibly even thrive, with DX12 technologies, even being able to mix GPU brands?
Moores_Ghost blppt said: Remember when we were promised that dual GPUs would survive, and possibly even thrive, with DX12 technologies, even being able to mix GPU brands? This. Multi-GPU is a real, viable solution for gamers on budgets, and DX12 has had it since day one, with one title supporting it so far. Microsoft built it, and no one uses it.
spongiemaster blppt said: Remember when we were promised that dual GPUs would survive, and possibly even thrive, with DX12 technologies, even being able to mix GPU brands? And you believed any of that? DX12 was the death of multi-GPU support because it shifted the burden of development to the game developers rather than the hardware makers. There was zero chance game studios were going to put in any effort to support such a niche market that promised zero return on that investment. AMD and Nvidia at least had the incentive of selling additional cards, and even they dumped support because of how complex the driver development was for so small a market.
blppt spongiemaster said: And you believed any of that? To some extent, yes, but that was before it became apparent that the devs would struggle just to get a handle on shader compilation, which immediately erased any chance of mGPU support on their end. Also, I doubt AMD and Nvidia like the idea of cheaper graphics cards in parallel matching their flagship GPUs (see Nvidia and its $3K 5090). It wasn't a huge issue before because AFR rendering delivered inconsistent performance, but DX12 introduced a mode that wouldn't necessarily require that hack job of an mGPU solution.
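For context on the DX12 feature this thread is debating: explicit multi-adapter hands every GPU to the game as a separate device and leaves all cross-GPU scheduling to the engine. Below is a minimal enumeration sketch (Windows, C++, error handling omitted; link against d3d12.lib and dxgi.lib). It illustrates why the per-title burden was so heavy, not how any shipped game actually did it.

```cpp
// Minimal sketch of D3D12 explicit multi-adapter enumeration. The API exposes
// each GPU as an independent device; all cross-GPU work splitting, copying,
// and synchronization is the game's job, with nothing hidden in the driver.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device); // one standalone device per physical GPU
    }

    // From here, an engine must choose its own division of labor (AFR,
    // split-frame, offloading post-processing, etc.) and manually fence and
    // copy results between devices every frame. That per-title work is
    // exactly what studios never took on.
    return 0;
}
```

Contrast that with DX11-era CrossFire and SLI, where the driver transparently duplicated work across GPUs using its own application profiles; DX12 removed that layer rather than improving it.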
call101010 With the power demand of high-end cards today, dual GPU coming back is a dream, regardless of any other reason. Gone are the days of dual GPUs…