CPU upgrades aren't dead from the DDR5 apocalypse — here are in-place DDR4 upgrades you can still make

DRAM apocalypse or not, this is the best upgrade you can make on an LGA1700 platform, and it's even faster at gaming than Intel's newer platform: the Core Ultra 7 265K is 7% slower than the Core i7-14700K in games, on average. The Arrow Lake chip does hold a 6% advantage in multi-threaded performance and an 8% lead in single-threaded workloads, but shelling out for a new kit of DDR5 and an LGA1851 board doesn't add up for such marginal differences.

The Intel Core i7-14700KF is a 20-core (8P+12E)/28-thread processor with a max turbo frequency of 5.6GHz. It's slightly discounted for Black Friday, filling the gap left by the Core i7-14700K, which is currently sold out.

Although the Core i7-14700K is the best upgrade you can make, it's not available for the best price – the Core Ultra 7 265K has seen drops to around $50 less than what the Core i7-14700K is selling for now. If you want the best value, you should go after the Core i5-13400F, provided you can find it in stock. It's available for $170 at Newegg at the time of writing, though only from a third-party seller.

The newer Core i5-14400 (or its F-series variant) is within a point or two of the Core i5-13400F, depending on the benchmark you look at, and that's true across both games and productivity applications. The Raptor Lake Refresh chip is available for around $180 to $220, and that extra cost buys you… basically nothing. Go for the Core i5-13400F if you can find it in stock, but worst-case scenario, you'll spend an extra $20 or so for the Core i5-14400.

Although both CPUs offer a great value proposition, they benefit more from faster DDR5 kits than Intel’s higher-end offerings, likely due to a significant reduction in L2 cache. You’re looking at around an 8% jump going from DDR4-3600 to DDR5-6800. At stock JEDEC speeds, however, DDR4 and DDR5 are in lockstep.
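
For a sense of scale, here's a rough back-of-the-envelope sketch of the theoretical peak bandwidth difference, assuming a typical dual-channel, 64-bits-per-channel desktop setup. Games are latency-sensitive as well, which is why the real-world gain lands closer to 8% than to the raw bandwidth gap:

```python
# Theoretical peak bandwidth for a dual-channel desktop memory setup.
# Each channel is 64 bits (8 bytes) wide; treat these as ceilings only --
# real-world throughput and latency behave very differently.

def peak_bandwidth_gb_s(transfers_mt_s: int, channels: int = 2,
                        bytes_per_transfer: int = 8) -> float:
    """Peak bandwidth in GB/s for a given memory transfer rate (MT/s)."""
    return transfers_mt_s * channels * bytes_per_transfer / 1000

ddr4 = peak_bandwidth_gb_s(3600)  # DDR4-3600 -> 57.6 GB/s
ddr5 = peak_bandwidth_gb_s(6800)  # DDR5-6800 -> 108.8 GB/s
print(f"DDR4-3600: {ddr4:.1f} GB/s")
print(f"DDR5-6800: {ddr5:.1f} GB/s ({ddr5 / ddr4 - 1:.0%} more raw bandwidth)")
```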

The Intel Core i5-13400F is a 10-core (6+4)/16-thread processor from the Raptor Lake range. It features a maximum boost clock speed of 4.6GHz on the P-cores, and although it's a generation older than the Core i5-14400, it offers largely similar performance.

Upgrading an AMD platform is significantly more difficult, not only because AMD chose a hard cut-off for DDR4 support with AM4, but also because popular options like the Ryzen 7 5800X3D have been sold out for quite some time. From resellers, you'll spend about $500 on the Ryzen 7 5800X3D and about $350 on the Ryzen 7 5700X3D, which kind of undermines the whole point of sticking with a DDR4 platform.

AMD launched Zen 3, its last architecture with DDR4 support, half a decade ago. Particularly for high-end chips like the Ryzen 9 5950X, you’re not going to find any stock at reasonable prices. Thankfully, AMD refreshed some chips from its Zen 3 lineup last year with the forgotten Ryzen 5000XT range.

The Ryzen 9 5900XT from AMD is a 16-core/32-thread processor that can boost up to 4.8GHz. It's on sale for $81 off for Black Friday, but we've seen the price drop lower in the past.

First, there's the Ryzen 9 5900XT, which has to be one of the biggest naming blunders I've ever seen out of AMD (and that's saying something). You'd assume this is a souped-up 12-core part that builds on the Ryzen 9 5900X, but no. It comes with the same sixteen Zen 3 cores as the Ryzen 9 5950X. It's the same silicon, cache and all, just clocked 100MHz lower than the original 16-core Zen 3 chip.

It's backordered on Newegg at the time of writing for around $270, or you can get it faster by spending around $290 at B&H. These prices are fine – you're getting a 16-core chip – but we've seen the Ryzen 9 5900XT drop to around $220 earlier this year. We're in a precarious situation with these XT chips. Although AMD says AM4 chips still make up a significant portion of sales, I suspect these XT models don't have a ton of shelf life left in them.

If you don't need 16 cores, you have two options. The Ryzen 7 5700X is the best bang for your buck right now at around $166, though, like the majority of the original Zen 3 lineup, you might run into stock issues. The Ryzen 7 5800XT is more readily available at $200, but that extra $40 or so buys very little in performance. And, like the Ryzen 9 5900XT, we saw the Ryzen 7 5800XT at a much better price of $133 earlier this year. Shame that the sale isn't running for Black Friday.

These chips share the same silicon with identical cache amounts and the same eight Zen 3 cores, but the XT model has a 105W TDP while the original X model is capped at a 65W TDP. As we've seen across several AMD generations now, Zen is extremely efficient around that 65W mark, and if you want to make up the gap, PBO (or manual overclocking) is always an option with one of the best CPU coolers.

AMD's Ryzen 7 5700X is an 8-core/16-thread CPU sporting the Zen 3 architecture. It tops out at a 4.6GHz boost clock speed, but it's otherwise identical to the more expensive Ryzen 7 5800X.

Regardless of the CPU you go with, it doesn't make sense to upgrade if you're already using a Zen 3 chip. Unless you need the extra core count for heavily threaded applications, you shouldn't replace, say, a Ryzen 5 5600X with a Ryzen 9 5900XT for better gaming performance. You'll be disappointed.

An in-place upgrade like this makes sense on older Zen platforms. Zen 3 is significantly faster than Zen 2 in games across the board, and leaps and bounds ahead of Zen and Zen+. Although AMD is now largely considered the performance king of CPUs, it only really hit its stride with Zen 3. The Ryzen 9 5900XT and Ryzen 7 5700X are your best options based on what's available, but if you miraculously stumble upon a Zen 3 X3D chip at your local Micro Center or Best Buy, jump at that opportunity.

As is always the case with AM4, make sure to check your motherboard's compatibility before buying anything. AMD's 400- and 500-series chipsets should be safe, but 300-series chipsets are iffy. Double-checking is especially important if you're picking up an XT chip, since you want to make sure your existing board has the power delivery circuitry to fully exploit the newer, high-core-count options.
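
If you're not sure exactly which board you're running before hunting through a CPU support list, you can pull the model from the system's DMI data. A minimal sketch, assuming a standard Windows (wmic) or Linux (sysfs) install; the example output is hypothetical:

```python
# Read the motherboard vendor/model from DMI data so you can look it up
# on AMD's and your board vendor's CPU support lists.
import platform
import subprocess
from pathlib import Path

def board_model() -> str:
    if platform.system() == "Windows":
        # wmic is deprecated but still ships with most Windows installs.
        out = subprocess.run(
            ["wmic", "baseboard", "get", "manufacturer,product"],
            capture_output=True, text=True, check=True,
        ).stdout
        return " ".join(out.split()[2:])  # drop the two header words
    # The Linux kernel exposes the same DMI fields under sysfs.
    dmi = Path("/sys/devices/virtual/dmi/id")
    vendor = (dmi / "board_vendor").read_text().strip()
    name = (dmi / "board_name").read_text().strip()
    return f"{vendor} {name}"

print(board_model())  # hypothetical output: "ASUSTeK COMPUTER INC. PRIME B450-PLUS"
```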

With current memory prices, there aren't any slam-dunk CPU upgrades unless you already have a kit you can reuse. You're effectively locked out of X3D CPUs, which are what you should go for if you want the highest gaming performance – even the $230 Ryzen 7 7600X3D can keep pace with non-X3D chips that cost twice as much.

For my money, Raptor Lake Refresh is looking very attractive right now. Arrow Lake was a disappointment on the gaming front, so you're not giving up much by picking up a last-gen Intel chip, and the LGA1700 platform supports DDR4, so you can keep using the memory you already have instead of shelling out a ton for a new kit. Supporting DDR4 beyond Alder Lake seemed silly at the time, but the decision is paying off for Intel now.


Jake Roach is the Senior CPU Analyst at Tom's Hardware, writing reviews, news, and features about the latest consumer and workstation processors.

TEAMSWITCHER "Although AMD is now largely considered the performance king of CPUs" If you still game at 1080p. Reply

bill001g: The problem with this whole article is it assumes you're restricted by the CPU, thinking that just upgrading the GPU automatically causes a CPU bottleneck. If you are not currently limited by the CPU, upgrading only the CPU will make no difference. This seems to be someone who is reading the benchmarks without understanding why they test at 1080p. Most people with expensive GPUs are running at higher resolutions, which shifts even more of the burden to the GPU. It would be a very narrow use case where swapping out the CPU for a more powerful one would make any difference. There are very few games that are CPU-limited, and it would have to be someone playing at 1080p. It would more likely be a case of someone wanting to upgrade their machine where going to a DDR5 platform is too costly. They could first upgrade the GPU and then the CPU. There are a number of people who have a 5090 in older machines, and they still are not CPU-limited at 4K resolutions.

Crazyy8, replying to TEAMSWITCHER: https://tpucdn.com/review/amd-ryzen-9-9950x3d/images/average-fps-3840-2160.png

Gururu, replying to bill001g: This^ is why CPU testing, and even more so recommendations, seem so out of touch. Hardware should be tested the way it's meant to be used.

stuff and nonesense, replying to Gururu: Testing of CPUs requires a system capable of feeding the processor with all the data it can eat, with minimal restrictions and a path that gives as little friction as possible, to allow the device to show its limits. If you match a current CPU with, say, an R9 390 at 1080p, there will likely be little difference in the frame rate seen, and at 1440p even less. For testing purposes, to show differences, the supporting hardware needs to hinder the CPU as little as possible. Going back to the R9 390, a good GPU in its day: testing any CPU since the 4790K and the Ryzen 7 2700X will show diminishing returns with respect to frame rate improvements. Similarly, testing with it as a reference GPU today will show little improvement. Every PC component has a point where it can flow no more data: memory, GPU, CPU.

The idea behind testing at 1080p is to ensure that, for testing CPUs, the GPU is running flat out like an open tap/faucet, thus allowing the CPU to shine. Using the quickest available GPU of the day gives an indication of how the processor/system will perform at the next level: 1440p in a few years' time, when the GPU no longer provides friction to the data flow at that resolution for a given game at similar quality settings. Testing at 1440p for some games is meaningless, shown where, with different processors and a fixed GPU, the output frame rate is the same within a margin of error. Testing at 2160p is largely meaningless for the same reason.

Testing GPUs requires the throughput to be able to feed the GPU to its processing capacity. A 4790K saturated the R9 390; a 7800X3D would show little to no difference in fps output using that card. It is easier to take the fastest CPU of the day, or the standardised test rig, and pair it with the equipment under test. This will give you the absolute max performance for the system on that day, or, with a standardised system, a set of results that can be reliably compared with other GPUs. Testing a modern GPU at 1080p doesn't really load the GPU; it needs to be stressed, it needs to do work. 1440p gives greater load; 2160p provides even greater GPU load. Testing different GPUs in the standardised system will show fps differences for the equipment under test at the applied settings and resolution. As the resolution/settings become higher/more complex, the GPU output fps decreases and the CPU becomes less important. Look at the graphs where the current Intel and AMD systems show near parity at 2160p with a given GPU.

Should you buy a 14900K or 7800X3D for 1080p gaming? Probably not, though it's your choice. The same goes for an xx80 card or a 7900 XT or 9070 XT, but it's your choice. People need to understand why 1080p is relevant and why 2160p isn't the be-all and end-all. It is still a huge part of the market; it shows what a CPU/system can provide a GPU. 2160p shows the other side of the equation, and somewhere within lies the best balance for a given budget.
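
The reasoning in that comment boils down to a simple min() model: delivered frame rate is roughly the lower of what the CPU can simulate and what the GPU can render at a given resolution. A toy sketch with invented numbers, purely for illustration:

```python
# Toy bottleneck model: delivered FPS ~ min(CPU rate, GPU rate at resolution).
# All figures are invented for illustration, not benchmark results.

gpu_fps = {"1080p": 240, "1440p": 160, "2160p": 90}  # hypothetical GPU limits

for cpu_name, cpu_fps in [("older CPU", 110), ("faster CPU", 200)]:
    for res, gpu_limit in gpu_fps.items():
        print(f"{cpu_name} @ {res}: {min(cpu_fps, gpu_limit)} fps")

# At 1080p the CPUs separate (110 vs 200 fps); at 2160p both land on the
# same 90 fps GPU wall -- which is why CPU reviews test at 1080p, and why
# a CPU swap can do nothing for a GPU-bound 4K system.
```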

Gururu: ^ I agree with all of that. However, I think we need to accept that when people argue against 1080p testing, it is not always because they don't understand its value in a testing cycle. In my case, I am asking for testing that reflects likely consumer usage. When people review cars, do they take them to a NASCAR racetrack, see how fast they are, and call it a day? Not really, because that says nothing about how the car would handle for a consumer bound by speed limits and other laws. If a mechanic or race car driver insisted that the Lamborghini was the only car to buy because it was the fastest, I'd feel like I was missing something, since I wouldn't benefit from it in little old suburbia.

stuff and nonesense, replying to Gururu: Perhaps reviews of games should be done with respect to the recommended specifications for that game, the minimum specs, and a fully specced PC, with components from AMD, Intel, and Nvidia? "Should run well," "should run, but we don't recommend it," and a sports car… What would your solution be?

Gururu, replying to stuff and nonesense: Just about every site has recommended tier builds (ultimate, mid, budget, etc.). Whatever video card or CPU they wish to test should be tested with the recommended build that has the closest equivalent GPU/CPU (e.g., by price). Then test ALL the GPUs/CPUs (top to bottom) in that build. That will let us know when the GPU/CPU becomes pointless.

Mattzun: The article states it's for CPU swaps while keeping DDR4, but were the charts for the 14700K even based on a 14700K with DDR4? There were reviews for 12th gen that used both DDR4 and DDR5, but those DDR4 numbers were largely gone by 14th gen. There was a 15 percent performance difference between DDR4 and DDR5 in the reviews I've seen.

vanadiel007, replying to bill001g: Not as narrow as you think: play Path of Exile 2 at UW QHD resolution with a 5900X, go to the town in Act 4, and walk around for 30 seconds. Now do the same thing on a 9800X3D with the same video card (7900 XTX) and the same amount of memory, and watch how much smoother your framerate is walking around the exact same area. I recently upgraded from that exact scenario, and the difference is large.
