
Anton Shilov is a contributing writer at Tom's Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.
hotaru251 AMD has never been against making Arm chips (in fact, it has done so in the past). They just haven't focused on it, because x86 is still king. Reply
usertests hotaru251 said: AMD has never been against making Arm chips (in fact, it has done so in the past). They just haven't focused on it, because x86 is still king. I've wondered if they'll ever try designing an Arm-x86 hybrid chip for some purpose. Now IBM is trying an Arm-Power hybrid. Reply
hotaru251 usertests said: I've wondered if they'll ever try designing an Arm-x86 hybrid chip for some purpose. Now IBM is trying an Arm-Power hybrid. Hybrid? Probably not. If Arm gets adopted enough, they'd likely just go all-in on Arm. Apple has shown that translation layers work well enough to run x86 code on Arm. Reply
Notton https://www.notebookcheck.net/Zen-architecture-pioneer-Jim-Keller-feels-AMD-was-stupid-to-cancel-the-K12-Core-ARM-processor.629843.0.html "Jim's plan with the K12 was to work on a new decode unit, since the cache and execution unit design for ARM and x86 were almost similar, but AMD had other plans after he left." The impression I get from that is that an Arm/x86 hybrid doesn't make much sense if every translation can be done in software. At least when it comes to the Zen architecture. Reply
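The "translation in software" idea above is the Rosetta 2-style approach: statically or dynamically rewriting instructions from one ISA into another. A toy sketch of the core rewriting step is below; the "ISAs" (a two-operand x86-like form and a three-operand Arm-like form with a zero register) are invented for illustration and are not real encodings.

```python
# Toy binary-translation sketch: rewrite two-operand (x86-style) ops into
# three-operand (Arm-style) ops. Instructions are plain tuples, not real
# machine code; op names and the "zr" zero register are illustrative only.

def translate(insn):
    """Translate one (op, dst, src) instruction into a list of
    (op, dst, src1, src2) instructions."""
    op, dst, src = insn
    if op == "mov":
        # Arm-style ISAs often synthesize a register move as an OR with
        # the zero register: dst = src | 0.
        return [("orr", dst, src, "zr")]
    # Generic two-operand form dst = dst OP src becomes dst = dst OP src.
    return [(op, dst, dst, src)]

# Example "x86-like" input stream: a = a + b; c = a
source = [("add", "a", "b"), ("mov", "c", "a")]
translated = [out for insn in source for out in translate(insn)]
```

A real translator also has to handle flags, memory ordering, and self-modifying code, which is where most of the engineering effort goes; the per-instruction rewriting itself is the easy part.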
thestryker This really doesn't seem like an x86 vs. Arm thing so much as general-purpose versus semi-custom. Most of the designs we've seen have been modified Arm cores rather than ground-up designs. That probably explains the shift toward Arm rather than RISC-V. It seems unlikely that Nvidia would dump money into Intel and license NVLink for use in Xeons if they thought that market wouldn't be relevant a couple of years after first productization. Reply
bit_user usertests said: I've wondered if they'll ever try designing an Arm-x86 hybrid chip for some purpose. Now IBM is trying an Arm-Power hybrid. No, those aren't POWER. They're Telum mainframe CPUs, a totally different ISA. Yes, IBM has two proprietary ISAs. hotaru251 said: Hybrid? Probably not. If Arm gets adopted enough, they'd likely just go all-in on Arm. You don't know the mainframe world. These folks care about legacy compatibility in ways that put x86 to shame. Reply
bit_user Notton said: The impression I get from that is that an Arm/x86 hybrid doesn't make much sense if every translation can be done in software. At least when it comes to the Zen architecture. Then why does Zen 5 still use 4-wide decoders, while the latest Arm CPUs are 10+ wide? Even Intel's ginormous P-cores are only 8-wide, and not all of those 8 are fully general. In fact, the overhead of decoding ARM64 instructions is so low that the Arm cores which dropped 32-bit support no longer even have micro-op caches, which are functionally redundant with the L1i cache. Instead, they can spend that silicon and power budget on wider decoders and more pipelines. Reply
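The decode-width point above comes down to instruction boundaries: ARM64 instructions are a fixed 4 bytes, so every boundary in a fetch block is known up front, while x86 instructions are variable-length, so the start of instruction N+1 depends on decoding instruction N. A minimal sketch of that asymmetry (using an invented toy length rule, not real x86 encodings):

```python
# Why fixed-length ISAs decode in parallel more easily than variable-length
# ones. The length rule in decode_variable() is a made-up toy, not x86.

def decode_fixed(blob, width=4):
    """Fixed-length (Arm-style): all instruction boundaries are known
    immediately, so every slot can be carved out (decoded) independently."""
    return [blob[i:i + width] for i in range(0, len(blob), width)]

def decode_variable(blob):
    """Variable-length (x86-style): the start of the next instruction is
    only known after the current one's length is determined, forcing a
    serial scan (or speculative length-finding hardware)."""
    insns, i = [], 0
    while i < len(blob):
        length = 1 + (blob[i] & 0x03)  # toy rule: low 2 bits give extra bytes
        insns.append(blob[i:i + length])
        i += length
    return insns

code = bytes(range(16))
fixed = decode_fixed(code)       # 4 slots, boundaries known up front
variable = decode_variable(code) # boundaries discovered one at a time
```

Real x86 decoders hide this serial dependency with predecode bits, length speculation, and micro-op caches, which is exactly the extra silicon and power the comment above is referring to.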
bit_user thestryker said: This really doesn't seem like an x86 vs. Arm thing so much as general-purpose versus semi-custom. Most of the designs we've seen have been modified Arm cores rather than ground-up designs. That's not true. Amazon, Google, Microsoft, and Nvidia (prior to Vera) all used off-the-shelf cores for their server CPUs. Yes, they packaged them up on their own, but they had no viable alternative. Only Nvidia did any real value-add, by integrating NVLink into its own silicon. Going forward, with Arm providing its own silicon, you'll be able to read much more into companies' decisions either to use it or to continue doing their own chip-making. However, once Ampere's Altra fell into obsolescence, you could no longer read anything into a decision not to use it. thestryker said: It seems unlikely that Nvidia would dump money into Intel and license NVLink for use in Xeons if they thought that market wouldn't be relevant a couple of years after first productization. I think the two are separate. Nvidia invested in Intel for its fabs alone. The NVLink partnership was probably done for two reasons: to give Intel a vital lifeline, complementing the monetary investment, and to deny AMD ownership of the x86 segment of the AI hardware stack. Reply
Reference reading
- https://www.tomshardware.com/pc-components/cpus/report-claims-arm-chips-will-power-90-percent-of-ai-servers-based-on-custom-processors-in-2029-x86-and-risc-v-on-the-outside-looking-in