AMD announces MI350P PCIe AI accelerator card with 144GB of HBM3E — roughly 40% faster in FP16 and FP8 theoretical compute compared to Nvidia's H200 NVL competitor


Aaron Klotz is a contributing writer for Tom's Hardware, covering news related to computer hardware such as CPUs and graphics cards.

Kindaian
Can I have one of these with "slow" memory and a cheap price point? I'm asking as a hobbyist who would like to be able to run high-memory LLM models locally on the cheap. Asking $20k (imaginary number, but probably not far from reality) for one card like this is way too expensive for a hobby consumer AI computer.

User of Computers
Kindaian said: Can I have one of these with "slow" memory and a cheap price point? I'm asking as a hobbyist…
AMD: No, you're too poor.

Kindaian
Me: As a hobbyist, I just don't throw silly money at my projects. If you (AMD) don't do it, someone else will; and if not, as a hobbyist, I can afford to work around the lack of offerings at a reasonable price point.

GenericUser2001
Kindaian said: Can I have one of these with "slow" memory and a cheap price point? I'm asking as a hobbyist who would like to be able to run high-memory LLM models locally on the cheap. Asking $20k (imaginary number, but probably not far from reality) for one card like this is way too expensive for a hobby consumer AI computer.
Maybe look at some of the Ryzen AI Max machines? If you look around, you can find those with 128 GB of RAM for under $3k. I think that is what AMD intends to be the hobbyist local AI option.

User of Computers
Kindaian said: Me: As a hobbyist, I just don't throw silly money at my projects. If you (AMD) don't do it, someone else will; and if not, as a hobbyist, I can afford to work around the lack of offerings at a reasonable price point.
AMD: So buy the MI350P.

bit_user
Kindaian said: Can I have one of these with "slow" memory and a cheap price point? I'm asking as a hobbyist who would like to be able to run high-memory LLM models locally on the cheap. Asking $20k (imaginary number, but probably not far from reality) for one card like this is way too expensive for a hobby consumer AI computer.
Sad to say (at least for me, since I'm not a Mac-head), I think the best thing for this was the high-memory Mac Studio machines with the M3 Ultra and 512 GB of RAM (although 256 GB is now the only configuration you can buy). They can do 819 GB/s of memory bandwidth. Their NPU is only capable of 36 TOPS, but I'm not sure how much more you could get by also harnessing the GPU and CPU cores (which have matrix units).
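For context on why memory bandwidth matters so much here: local LLM decoding is usually bandwidth-bound, since a dense model must stream all of its weights once per generated token. A minimal back-of-envelope sketch, using the 819 GB/s figure from the comment above and an assumed (illustrative, not benchmarked) 70 GB quantized model:

```python
# Rough upper bound on local LLM decode speed for a dense model:
# tokens/s <= memory bandwidth / bytes read per token (~= model size).
# The model size below is an illustrative assumption, not a benchmark.

def est_tokens_per_sec(bandwidth_gbs: float, model_size_gb: float) -> float:
    """Bandwidth-bound ceiling on decode rate: one full weight
    read from memory per generated token."""
    return bandwidth_gbs / model_size_gb

# M3 Ultra Mac Studio: ~819 GB/s (figure cited in the comment above).
# A 70B-parameter model at 8-bit quantization is roughly 70 GB of weights.
print(round(est_tokens_per_sec(819, 70), 1))  # ≈ 11.7 tokens/s ceiling
```

Real throughput lands below this ceiling (attention-cache reads, compute limits, software overhead), but the estimate explains why high-bandwidth unified-memory machines are attractive for hobbyist inference.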

bit_user
User of Computers said: AMD: No, you're too poor.
Eh, if you look at the actual silicon inside these cards, the price isn't as unreasonable as some cards out there. Compare, for instance, cards like the RTX Pro 6000 Blackwell, which are basically just gaming GPUs with more memory and slightly lower clock speeds. This MI350P has 2.23x the bandwidth of even that card, as well as 50% more memory and 14.1% more TOPS.
Kindaian said: Me: As a hobbyist, I just don't throw silly money at my projects. If you (AMD) don't do it, someone else will; and if not, as a hobbyist, I can afford to work around the lack of offerings at a reasonable price point.
There's always a nonlinear relationship between cost and time. If you want faster hardware, you either pay an exorbitant price for it now, or you just wait a few years.
GenericUser2001 said: Maybe look at some of the Ryzen AI Max machines? If you look around, you can find those with 128 GB of RAM for under $3k. I think that is what AMD intends to be the hobbyist local AI option.
They sort of backed into this one, almost by accident. Their original goal was just to build a Mac Pro competitor; they didn't initially set out to make an edge AI machine that could go toe-to-toe with competitors like Nvidia's GB10. Lead times on CPUs are 3-4 years, so we'll have to see what they bring to market in 2027 or 2028 to know what their most competitive edge AI offering looks like.
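The ratios quoted in that comment can be sanity-checked against the headline's 144 GB figure. A small sketch, where the RTX Pro 6000 Blackwell numbers (96 GB, 1792 GB/s) are commonly reported datasheet values taken here as assumptions:

```python
# Sanity-check the spec ratios from the comment above.
# RTX Pro 6000 Blackwell figures are assumed datasheet values,
# not verified benchmarks.

mi350p_mem_gb = 144      # from the article headline
rtx6000_mem_gb = 96      # RTX Pro 6000 Blackwell (assumption)
rtx6000_bw_gbs = 1792    # RTX Pro 6000 Blackwell (assumption)

# Memory advantage: 144 / 96 - 1 = 0.5, i.e. "50% more memory".
extra_memory = mi350p_mem_gb / rtx6000_mem_gb - 1
print(f"{extra_memory:.0%}")  # 50%

# The quoted 2.23x bandwidth ratio would imply roughly this for the MI350P:
implied_mi350p_bw_gbs = 2.23 * rtx6000_bw_gbs
print(round(implied_mi350p_bw_gbs))  # ≈ 3996 GB/s
```

The memory ratio checks out exactly against the headline; the bandwidth line only shows what the commenter's 2.23x multiplier implies rather than asserting an official MI350P figure.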
