
Aaron Klotz, Contributing Writer — Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.
PEnns "can run a local AI LLM" "All we know officially is the CPU inside, which is AMD's flagship Strix Halo APU sporting 16 Zen 5 CPU cores that can clock up to 5.1GHz, a Radeon 8060S iGPU with 40 CUs, XDNA 2 NPU, and 64MB of L3 cache. The 395+ can be configured with 32GB to 128GB of system memory;" Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly???
CelicaGT PEnns said: "Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly???" So, I've been around the sun a fair few times and this may date me. Anyway, when the Compact Disc finally became popular, every single audio product seemed to gain some kind of stupid DIGITAL branding overnight. DIGITAL everything, even on products where it was completely irrelevant and misapplied. I think perhaps this is like that: AI everything, even when it's not really. I mean, yes/maybe in this case, but it can also be…a NAS, or a workstation? LOCAL AGENTIC AI!! Maybe just wipe it and install SteamOS. Yes, that one. (Based on what I've read, agentic storage solutions could just maybe erase everything, but maybe they won't. It's a chance they're willing to take with your data, I guess.)
alan.campbell99 Uh, nope. I just want a bunch of disks to store my stuff when I look at a NAS.
timsSOFTWARE PEnns said: "Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly???" It doesn't need AI, but one of the benefits of an APU like the Strix Halo chip AMD is using in this machine – or the Apple silicon chips – is that, as APUs, their graphics processor can use system RAM.

The problem current regular PCs have for local AI hosting is that their architecture was built for traditional gaming and other graphics workloads that require relatively little VRAM compared to system memory. But for AI inference – where you really want the whole model loaded into VRAM, and decently powerful LLMs range in size from tens of GB to terabytes – that leaves you with the conundrum of choosing between CPU-only inference (comparatively slow), GPU inference (fast, but you can only run small models), or relying on things like RAM or even disk offloading with MoE models (where weights are transferred back and forth between RAM and VRAM depending on which "experts" will be active for the next token – also relatively slow).

The graphics processor on shared-memory APUs like AMD's Strix Halo and Apple silicon can access the entirety of the shared memory. So while they typically have less memory bandwidth than a traditional graphics card, they're a relatively affordable way to get accelerated inference with models that would be too large to fit on a consumer graphics card.
For example, you could run a ~70GB model on this machine, fitting comfortably in the 128GB shared memory, and get the benefit of APU-accelerated inference, whereas if you wanted to run something of the same size in GPU memory, you'd need at least a 6000 Pro Blackwell or several other cards (even though the 6000 Pro would be quite a bit faster).
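The sizing argument above can be sketched as a back-of-the-envelope calculation. This is a rough sketch, not a precise tool: the per-weight bit widths (16 for FP16, 8 or 4 for common quantizations) are standard, but the 10% overhead factor for KV cache and runtime buffers and the 8GB OS reserve are assumptions for illustration.

```python
def model_footprint_gb(params_b: float, bits_per_weight: float,
                       overhead: float = 1.1) -> float:
    """Rough memory footprint of an LLM's weights in GB.

    params_b: parameter count in billions (e.g. 70 for a 70B model)
    bits_per_weight: 16 for FP16, 8 or 4 for common quantizations
    overhead: fudge factor for KV cache and runtime buffers (assumption)
    """
    bytes_total = params_b * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 1e9


def fits_in_shared_memory(params_b: float, bits_per_weight: float,
                          memory_gb: float = 128,
                          os_reserve_gb: float = 8) -> bool:
    """Check whether the model fits in the APU's shared memory,
    leaving some headroom for the OS (reserve size is an assumption)."""
    return model_footprint_gb(params_b, bits_per_weight) <= memory_gb - os_reserve_gb


# A 70B model at FP16 is ~140 GB of raw weights: too big even for 128 GB.
# Quantized to 8 bits it's ~70 GB and fits comfortably, as the post notes.
print(model_footprint_gb(70, 16))      # ~154 GB with overhead -> doesn't fit
print(fits_in_shared_memory(70, 16))   # False
print(fits_in_shared_memory(70, 8))    # True
```

The same arithmetic shows why a single consumer card is out: a 24GB GPU tops out around a 40B model at 4-bit, while the shared-memory APU trades bandwidth for sheer capacity.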
USAFRet Strike one brand off my short list to replace my 8 year old QNAP.