Intel's upcoming Arc B770 discrete GPU leaks out on GitHub, launch appears imminent — Reportedly featuring the BMG-G31 GPU, 16GB+ VRAM, 32 Xe2 cores, and 300W TBP

Hassam Nasir, Contributing Writer

Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he's not working, you'll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.

Notton
B580 vs B770 (leaked specs):
  • 20 → 32 Xe cores: 1.6x
  • 456 GB/s → 608 GB/s memory bandwidth: 1.33x
  • 190 W → 300 W max power draw: 1.58x
I eagerly await benchmark numbers and price, but if it's in the same performance range as a 9060 XT 16GB to 5060 Ti 16GB, and is cheaper than either, that would be an easy choice to make. So that would be $350 USD, or the MSRP AMD never achieved for the 9060 XT 16GB.
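For anyone who wants to sanity-check those multipliers, here is a minimal Python sketch that recomputes them from the figures quoted above (the B770 numbers are leaked, not confirmed):

```python
# Recompute the B580 -> B770 (leaked) scaling factors quoted above.
specs = {
    "Xe cores":                 (20, 32),
    "memory bandwidth (GB/s)":  (456, 608),
    "max power draw (W)":       (190, 300),
}

for name, (b580, b770) in specs.items():
    print(f"{name}: {b580} -> {b770} = {b770 / b580:.2f}x")
# Xe cores: 20 -> 32 = 1.60x
# memory bandwidth (GB/s): 456 -> 608 = 1.33x
# max power draw (W): 190 -> 300 = 1.58x
```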

Gururu
Interesting timing, and probably an ideal launch window given memory prices. This will crush all the 8GB cards out there and probably undercut the mid-range 16GB cards.

hotaru251
Notton said: So that would be $350 USD, or the MSRP AMD never achieved for the 9060 XT 16GB.
A 16GB GPU in 2026, during a RAM "shortage", isn't selling for under $400.

Gururu
hotaru251 said: A 16GB GPU in 2026, during a RAM "shortage", isn't selling for under $400.
It will be interesting to see. My bet is ~$450, likely based on RAM they procured well in advance of the shortage.

Notton
hotaru251 said: A 16GB GPU in 2026, during a RAM "shortage", isn't selling for under $400.
Lighten up, Francis.

thestryker
I'm mostly curious what the client-level strategy will be with this card. It's possible that Intel has already acquired the memory needed for whatever volume they're planning; if so, they could be price-competitive with the 9060 XT/5060 Ti. It also has the potential to be a 4070-level card, which would put it a bit above those. I mostly just hope Intel doesn't abandon the client dGPU space, but the lack of discussion and the release cadence aren't encouraging.

usertests
Notton said: So that would be $350 USD, or the MSRP AMD never achieved for the 9060 XT 16GB.
Not true; they went down to $340-350 around Nov/Dec, not counting Micro Center, where one model was $330. Now they are $390, so the good times are over, but they did briefly achieve their MSRP.

palladin9479
As long as it's cheaper than the xx60+ class cards, it'll be a win; it's really down to price nowadays. Intel is using an older TSMC process, which is cheaper, but they'll use more die area. Having a full 256-bit memory bus is definitely an advantage over those 128-bit cards.
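To see where that bus-width advantage comes from, here is a short sketch of the standard peak-bandwidth formula. The 19 Gbps GDDR6 data rate is an assumption, chosen because it is what the B580 ships with, and it reproduces both the B580's official 456 GB/s and the leaked 608 GB/s figure on a 256-bit bus:

```python
# Peak memory bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(192, 19.0))  # B580, official spec:        456.0 GB/s
print(peak_bandwidth_gbs(256, 19.0))  # B770, matches leaked figure: 608.0 GB/s
print(peak_bandwidth_gbs(128, 19.0))  # a 128-bit card at the same data rate: 304.0 GB/s
```

The caveat is that 128-bit competitors can partly close the gap with faster memory (the 5060 Ti's GDDR7 runs well above 19 Gbps), so raw bus width is only part of the picture.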

beyondlogic
Pricing in the UK has been up and down, more down than anything, but cards are selling out, probably because of the RAM shortage; I've seen an uptick over the holidays. I've been waiting for the B770. I believe it exists; in what form and at what price, we'll have to wait and see. It needs to be £300, or £335 at most. If they go over that, I don't see it selling well.

das_stig
No way is Intel going to sell this for under £/$400, even if they bought enough RAM for a year. They will gouge the living life out of the price for max profit, but maybe keep it shy of AMD/NV prices, especially with the issues the company is facing.
