
Aaron Klotz
Contributing Writer
Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.
PEnns "can run a local AI LLM" "All we know officially is the CPU inside, which is AMD's flagship Strix Halo APU sporting 16 Zen 5 CPU cores that can clock up to 5.1GHz, a Radeon 8060S iGPU with 40 CUs, XDNA 2 NPU, and 64MB of L3 cache. The 395+ can be configured with 32GB to 128GB of system memory;" Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly??? Reply
CelicaGT
PEnns said: Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly???
So, I've been around the sun a fair few times, and this may date me. Anyway, when the Compact Disc finally became popular, every single audio product seemed to gain some kind of stupid DIGITAL branding overnight. DIGITAL everything, even on products where it was completely irrelevant and misapplied. I think this is like that: AI everything, even when it's not really. I mean, yes/maybe in this case, but it can also be... a NAS, or a workstation? LOCAL AGENTIC AI!! Maybe just wipe it and install SteamOS. Yes, that one. (Based on what I've read, agentic storage solutions could just maybe erase everything, but maybe they won't. It's a chance they're willing to take with your data, I guess.)
alan.campbell99
Uh, nope. I just want a bunch of disks to store my stuff when I look at a NAS.
timsSOFTWARE
PEnns said: Why does it need AI?? Are we talking about a NAS or a workstation, or what exactly???
It doesn't need AI, but one of the benefits of an APU like the Strix Halo chip AMD is using in this machine (or Apple silicon chips) is that its graphics processor can use system RAM. The problem current regular PCs have for local AI hosting is that their architecture was built for traditional gaming and other graphics workloads, which require relatively little VRAM compared to system memory. But for AI inference, where you really want the whole model loaded into VRAM and decently powerful LLMs range in size from tens of GB to terabytes, that leaves you with the conundrum of choosing between CPU-only inference (comparatively slow), GPU inference (fast, but you can only run small models), or things like RAM or even disk offloading with MoE models (where weights are transferred back and forth between RAM and VRAM depending on which "experts" will be active for the next token; also relatively slow).
The graphics processor on APUs with shared memory, like AMD's Strix Halo and Apple silicon, can access the entirety of the shared memory. So while they are typically slower than a traditional graphics card and have less memory bandwidth, they are a relatively affordable way to get accelerated inference with models that would be too large to fit on a consumer graphics card. For example, you could run a ~70GB model on this machine, fitting comfortably in the 128GB shared memory, and get the benefit of APU-accelerated inference, whereas if you wanted to run something of the same size in GPU memory, you'd need at least a 6000 Pro Blackwell or several other cards (even though the 6000 Pro would be quite a bit faster).
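The sizing argument in the comment above comes down to simple arithmetic: parameter count times bytes per weight, plus some runtime overhead, compared against the memory pool the graphics processor can actually address. Here is a minimal Python sketch of that calculation; the model sizes, quantization widths, overhead factor, and memory pools are illustrative assumptions, not specifications or benchmarks of this machine.

```python
# Rough back-of-the-envelope sizing for local LLM inference.
# All figures below are illustrative assumptions (hypothetical models,
# quantization widths, and memory pools), not specs of the N5 Max.

def model_footprint_gb(params_billion: float, bytes_per_weight: float,
                       overhead_factor: float = 1.2) -> float:
    """Approximate memory needed for the weights plus KV-cache/runtime overhead."""
    weights_gb = params_billion * bytes_per_weight  # ~1 GB per billion params at 1 byte/weight
    return weights_gb * overhead_factor

# Hypothetical memory pools to compare against.
POOLS_GB = {
    "typical 24 GB consumer GPU (VRAM)": 24,
    "Strix Halo box with 128 GB shared memory": 128,
}

# Hypothetical models at common quantization widths (bytes per weight).
CANDIDATES = [
    ("8B model, 8-bit quant", 8, 1.0),
    ("70B model, 8-bit quant", 70, 1.0),
    ("70B model, 4-bit quant", 70, 0.5),
]

for name, params_b, bpw in CANDIDATES:
    need = model_footprint_gb(params_b, bpw)
    print(f"{name}: ~{need:.0f} GB needed")
    for pool_name, pool_gb in POOLS_GB.items():
        verdict = "fits" if need <= pool_gb else "does not fit"
        print(f"  -> {verdict} in {pool_name} ({pool_gb} GB)")
```

The point is only that a ~70B-parameter model at 8-bit precision is out of reach for a typical consumer card's VRAM but fits comfortably in a 128GB unified-memory pool, which is the trade-off the comment describes.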
USAFRet
Strike one brand off my short list to replace my 8 year old QNAP.