
Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.
qxp:
So I disagree with the description of Opterons as the "worst ever CPU design". At the time they were solid 64-bit chips, with better price/performance than comparable Intel offerings.
Nice to hear about open-source firmware. Control over such firmware can be an advantage in real-time control applications, as closed-source commercial firmware can cause periodic latency spikes due to firmware code running in competition with the OS (not sure if all motherboards do it).
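As a rough illustration of the latency spikes qxp mentions (my own sketch, not from the thread): a user-space jitter probe spins on the clock and records the worst gap between consecutive reads. On an otherwise idle, isolated core, large outliers are commonly attributed to SMM/firmware activity the OS cannot see. The run length and reporting here are arbitrary choices.

```c
/* Hypothetical jitter probe: spin on the monotonic clock and report the
 * largest gap between consecutive reads. On an idle, isolated core, big
 * outliers hint at time stolen from the OS (e.g. by SMM firmware code). */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

static uint64_t now_ns(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000000ull + (uint64_t)ts.tv_nsec;
}

int main(void)
{
    const uint64_t run_ns = 10ull * 1000000000ull;   /* spin for ~10 s */
    uint64_t start = now_ns(), prev = start, worst = 0;

    while (prev - start < run_ns) {
        uint64_t t = now_ns();
        if (t - prev > worst)
            worst = t - prev;
        prev = t;
    }
    printf("worst observed gap: %llu ns\n", (unsigned long long)worst);
    return 0;
}
```

Pinning the probe to one core (for example with taskset) and comparing runs under the vendor firmware versus coreboot would be one way to check whether the open-source firmware actually reduces those spikes.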
usertests:
> qxp said: So I disagree with the description of Opterons as the "worst ever CPU design". At the time they were solid 64-bit chips, with better price/performance than comparable Intel offerings.
The article doesn't say Opteron; it says Bulldozer/Piledriver: "Bulldozer and Piledriver are renowned for their terrible single-core performance and earned a reputation as some of AMD's worst ever CPU designs."
Technically, there are Opterons using older cores, Jaguar cores, and even ARM Cortex-A57. The Bulldozer family has a horrible reputation from having low single-thread performance during a time when 8 "cores" were hard to utilize, terrible efficiency exacerbated by being pushed towards 5 GHz in the enthusiast parts, and effectively fake core counts using modules with shared resources, which AMD was sued over. The fastest FX-9590 is steamrolled by the i7-4770K and later parts using half the energy. Maybe the MCM 16-"core" Opterons were a great value compared to Intel Xeon offerings, but they were less efficient.
palladin9479:
> qxp said: So I disagree with the description of Opterons as the "worst ever CPU design". At the time they were solid 64-bit chips, with better price/performance than comparable Intel offerings. Nice to hear about open-source firmware. Control over such firmware can be an advantage in real-time control applications, as closed-source commercial firmware can cause periodic latency spikes due to firmware code running in competition with the OS (not sure if all motherboards do it).
Much of the Bulldozer/Piledriver hate isn't justified. AMD had moved over to a new modular chip design methodology and was very much still developing and refining it. Much of BD/PD was built using automated chip design software that essentially assembled the design from Lego-like modules. After much refining and development, that software is now responsible for Ryzen and is why AMD can build custom CPUs so cheaply. By contrast, Intel has to do all the tuning and optimization work by hand for whatever process node it is using for that CPU. This advantage has allowed AMD to go through design cycles much faster than Intel with a smaller staff. Or to put it another way, if AMD had never attempted Bulldozer, we wouldn't have Ryzen today.
qxp:
> usertests said: The article doesn't say Opteron; it says Bulldozer/Piledriver. Technically, there are Opterons using older cores, Jaguar cores, and even ARM Cortex-A57. The Bulldozer family has a horrible reputation from having low single-thread performance during a time when 8 "cores" were hard to utilize, terrible efficiency exacerbated by being pushed towards 5 GHz in the enthusiast parts, and effectively fake core counts using modules with shared resources, which AMD was sued over. The fastest FX-9590 is steamrolled by the i7-4770K and later parts using half the energy. Maybe the MCM 16-"core" Opterons were a great value compared to Intel Xeon offerings, but they were less efficient.
Actually, at the very top the article says "..Opteron CPUs based on.." and I was remembering the 8-core ones. For me they were priced right and ran great: just put Linux on them and start 8 separate processes. I really don't see what is so hard about utilizing 8 cores.
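What qxp is describing is process-level rather than thread-level parallelism: eight fully independent jobs with no shared state keep eight cores busy without any threading work. A minimal, illustrative POSIX sketch (the worker body is just a placeholder):

```c
/* Minimal sketch of the "just start 8 independent processes" approach:
 * each child handles its own slice of work, so no shared state and no
 * threading are needed to keep 8 cores busy. Worker logic is a placeholder. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

static void do_work(int slice)
{
    /* Placeholder workload: stand-in for one independent job. */
    printf("worker %d running as pid %d\n", slice, (int)getpid());
}

int main(void)
{
    const int workers = 8;
    for (int i = 0; i < workers; i++) {
        pid_t pid = fork();
        if (pid == 0) {            /* child: run one slice, then exit */
            do_work(i);
            _exit(EXIT_SUCCESS);
        } else if (pid < 0) {
            perror("fork");
            return EXIT_FAILURE;
        }
    }
    while (wait(NULL) > 0)         /* parent: reap all children */
        ;
    return 0;
}
```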
palladin9479:
> qxp said: Actually, at the very top the article says "..Opteron CPUs based on.." and I was remembering the 8-core ones. For me they were priced right and ran great: just put Linux on them and start 8 separate processes. I really don't see what is so hard about utilizing 8 cores.
I believe it's more that the consumer workloads of that time were not heavily parallel. Your application can spawn 20 threads, but if 90~95% of the work is being done in the main logic loop, then extra cores are not doing much for you. That was the landscape of that era and how all processors were evaluated.
BD wasn't so great as a server CPU architecture due to how new and unoptimized the process was at that time. PD (which I still have some of) was much better, but by then people had a bad taste in their mouths. The Opterons based on PD were pretty good workstation and server CPUs, but the CPU is one of the less important components of those systems. Intel had a near monopoly on the server/workstation platforms of that era; these are the motherboard + NIC + storage + management components. Enthusiasts could definitely "build" a cheap workstation/server from AMD components, but most customers just purchased these items as complete units from vendors who provided full support and validation for the product. Most of those AMD offerings were packaged as "cheap" and frequently were sold with cheaper, lower-performing components or with less capacity. AMD realized how important those vendors were, and has since formed very strong relationships with them to ensure good premium products are made available to the market.
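To put rough numbers on that point (my own back-of-the-envelope, not from the post): Amdahl's law, speedup = 1 / (s + (1 - s)/n) for serial fraction s on n cores, says a workload that spends ~90% of its time in one serial loop gains almost nothing from 8 cores, while qxp's case of 8 independent processes is effectively s ≈ 0 and scales nearly linearly.

```c
/* Back-of-the-envelope Amdahl's law: speedup = 1 / (s + (1 - s) / n),
 * where s is the serial fraction and n the number of cores. */
#include <stdio.h>

int main(void)
{
    const double n = 8.0;                          /* cores */
    const double fractions[] = { 0.90, 0.50, 0.10, 0.0 };

    for (int i = 0; i < 4; i++) {
        double s = fractions[i];
        double speedup = 1.0 / (s + (1.0 - s) / n);
        printf("serial %3.0f%% -> speedup on 8 cores: %.2fx\n",
               s * 100.0, speedup);
    }
    return 0;
}
```

With a 90% serial fraction the result is only about 1.1x; at 0% serial it is the full 8x, which matches the "8 separate processes" experience above.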
qxp:
> palladin9479 said: I believe it's more that the consumer workloads of that time were not heavily parallel. Your application can spawn 20 threads, but if 90~95% of the work is being done in the main logic loop, then extra cores are not doing much for you. That was the landscape of that era and how all processors were evaluated. BD wasn't so great as a server CPU architecture due to how new and unoptimized the process was at that time. PD (which I still have some of) was much better, but by then people had a bad taste in their mouths. The Opterons based on PD were pretty good workstation and server CPUs, but the CPU is one of the less important components of those systems. Intel had a near monopoly on the server/workstation platforms of that era; these are the motherboard + NIC + storage + management components. Enthusiasts could definitely "build" a cheap workstation/server from AMD components, but most customers just purchased these items as complete units from vendors who provided full support and validation for the product. Most of those AMD offerings were packaged as "cheap" and frequently were sold with cheaper, lower-performing components or with less capacity. AMD realized how important those vendors were, and has since formed very strong relationships with them to ensure good premium products are made available to the market.
In my case, I got a Dell server first, with an Intel CPU. It was a power hog and ran very slowly. Next I got an Opteron system, which was faster and cheaper. Needless to say, I was buying only AMD for a year or two. All were prebuilt, with onsite warranty.
Intel had a near monopoly with vendors like Dell because of the contracts. Dell still lists mostly Intel systems on its website. Thankfully, Supermicro was making perfectly good motherboards and cases, and if you bought systems from a good vendor they were rock-solid. Intel later released the E5s, and those were nice.
20-thread systems came later, and if your application is mostly single-threaded, you would want to configure the system to reflect that: higher frequency and a CPU with lower latency, which I think would have implied AMD at the time as well (certainly in the Athlon 64 era).
hwertz:
I wonder what that RAM costs? As we well know, DDR4 pricing has been apocalyptically high recently, so picking one of these up and shoving 256-512GB of RAM into it could be useful, and possibly much less expensive than just the DDR4 or DDR5 alone (unless others already did this and demand has driven prices up already; I read about someone buying a Xeon of about that age to run something like DeepSeek on, since they could cram 512GB or maybe even 1TB of RAM into it).
rluker5:
Those "quad cores": https://www.tomshardware.com/news/amd-fx-bulldozer-false-advertising-class-action-lawsuit-eight-cores-settlement,40256.html
Bulldozers and Piledrivers are still good for low-end use like old games and video streaming, like my Dell Venue 11 Pro 5130 with a 2 W Atom chip that performed on par with the quad-core Phenom II laptop I had. So it is nice that they can do modern boot times now, but I wouldn't waste money on extra RAM for such a system; just get something better with that money, like a used Zen 3 or Haswell-or-newer build.
Haswell is slower, but it is so cheap on eBay: 4770K + mobo + RAM < 100 delivered, and it is still fairly nice for everything but workstation use, new AAA games at > 60 fps all the time, and official Windows 11 support. Zen 3 has two of those covered but runs about 3x as much for a 5600X + mobo + RAM.
Edit: Also, isn't CSM a thing for getting around UEFI signing?
artk2219:
> palladin9479 said: Much of the Bulldozer/Piledriver hate isn't justified. AMD had moved over to a new modular chip design methodology and was very much still developing and refining it. Much of BD/PD was built using automated chip design software that essentially assembled the design from Lego-like modules. After much refining and development, that software is now responsible for Ryzen and is why AMD can build custom CPUs so cheaply. By contrast, Intel has to do all the tuning and optimization work by hand for whatever process node it is using for that CPU. This advantage has allowed AMD to go through design cycles much faster than Intel with a smaller staff. Or to put it another way, if AMD had never attempted Bulldozer, we wouldn't have Ryzen today.
It's a similar story for Intel and the Pentium 4, at least in terms of what they gained from it. Intel learned how to deal with heat, high motherboard power requirements, what it takes to build a CPU that runs at high frequencies, hyper-threading, multi-core processors, and how to tune a power-hungry architecture for lower-power mobile requirements. Intel learned a ton from the design, even if it was a flawed product. Core 2, and especially Nehalem, benefited greatly from what they learned during this period, even if they were a bunch of cheaters :confused:
qxp:
> artk2219 said: It's a similar story for Intel and the Pentium 4, at least in terms of what they gained from it. Intel learned how to deal with heat, high motherboard power requirements, what it takes to build a CPU that runs at high frequencies, hyper-threading, multi-core processors, and how to tune a power-hungry architecture for lower-power mobile requirements. Intel learned a ton from the design, even if it was a flawed product. Core 2, and especially Nehalem, benefited greatly from what they learned during this period, even if they were a bunch of cheaters :confused:
That's not my recollection. The Pentium 4 was much slower than the Athlon 64 because Intel tried to ramp up the frequency with a brand-new design whose extra-deep pipeline suffered horribly on mispredicted branches. From what I heard, Core 2 was an evolved Pentium III design that was developed by a separate team in Israel and aimed at low-power applications. That design was brought forward when the Pentium 4 flopped. I don't know of any further designs that were based on the Pentium 4.
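For anyone curious about the mechanism qxp describes, a classic microbenchmark (my own illustrative sketch, not tied to any specific CPU) times the same data-dependent branch over unsorted and then sorted data; the unsorted pass defeats the branch predictor, and the relative penalty grows with pipeline depth, which is exactly what hurt the deeply pipelined NetBurst design.

```c
/* Sorted vs. unsorted data through the same data-dependent branch.
 * Mispredicted branches flush the pipeline, so the unsorted pass is
 * slower; the deeper the pipeline, the larger the relative penalty. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 22)

static int cmp_int(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;
}

static double time_pass(const int *data)
{
    struct timespec t0, t1;
    long long sum = 0;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++)
        if (data[i] >= 128)            /* data-dependent branch */
            sum += data[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    /* Print the sum so the loop is not optimized away. */
    fprintf(stderr, "checksum %lld\n", sum);
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void)
{
    int *data = malloc(N * sizeof *data);
    if (!data)
        return EXIT_FAILURE;
    for (int i = 0; i < N; i++)
        data[i] = rand() % 256;

    double unsorted = time_pass(data);    /* predictor guesses ~50/50 */
    qsort(data, N, sizeof *data, cmp_int);
    double sorted = time_pass(data);      /* predictor nearly perfect */

    printf("unsorted: %.3fs  sorted: %.3fs\n", unsorted, sorted);
    free(data);
    return 0;
}
```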