
Luke James, Contributor
Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.
IntelUser2000
GPU based on open source? I'm all for it. But don't expect a 12nm chip to be significantly faster if it moves to, say, 5nm. When people say Moore's Law is dying, they mean that the era of simply porting a design to a new node and reaping the benefits is gone. It takes significant work to get the full advantage of a new node, and Nvidia and AMD have already done that hard work and know this. It's very unlikely a startup can exploit newer nodes the way established players do.

Even if you have a potential winner, the actual winner is determined by the details, such as optimizing and fine-tuning the design. And of course timing is important: fine-tuning and optimization are potentially half the work, and can throw off your schedule. Oftentimes the "traditional" players carry on because they keep shipping new products like clockwork, whereas a startup's advantages are lost to delays, which Bolt Graphics is already suffering with the slip to 2027.
bit_user
The article seems to lack a source link. I'm guessing it's based on this press release: https://www.prnewswire.com/news-releases/bolt-graphics-completes-tape-out-of-test-chip-for-its-high-performance-zeus-gpu-a-major-milestone-in-reducing-computing-costs-by-17x-302750442.html

IntelUser2000 said: GPU based on open source? I'm all for it.
Huh? Specifically, what is the "open source" claim and where did you see it?

IntelUser2000 said: But don't expect a 12nm chip to be significantly faster if it moves to say 5nm.
Nvidia's RTX 2000 series was built on a 12nm node; the RTX 4000 series was built on an N5-family node. Want to guess which one is faster? Obviously, the chip they're taping out on 12 nm isn't going to be the exact same chip on N5; the 5 nm version will certainly be scaled up and run at higher clock speeds.

IntelUser2000 said: Even if you have a potential winner, the actual winner is determined by the details ... which Bolt graphics is already suffering with the delay to 2027.
Yeah, time-to-market is a serial killer of promising chip startups. By the time Bolt gets a chip to market on a remotely competitive node, it will have missed its market window. Even if their current tech is competitive with an RTX 5090, their final production silicon will have to face an RTX 6090 (or later). The same story repeats itself over and over; that's why it's so hard to displace the big CPU and GPU makers.
thestryker
bit_user said: Yeah, time-to-market is a serial killer of promising chip startups. By the time Bolt gets a chip to market that's on a remotely competitive node, they'll have missed their market window. Even if their current tech is competitive against a RTX 5090, their final production silicon will have to face a RTX 6090 (or later).
Honestly, they're fine (with this timeline) for the market they're aiming at. There's no chance Nvidia is going to design anything to compete as long as the AI money is flowing. I think the biggest problem they're facing, at least in the non-HPC markets, is going to be the software side of things. With rendering, for example, it wouldn't matter if they were double the speed of whatever is out at the time if they don't have rock-solid software. None of this is to say that sooner isn't better, because it absolutely is, just that their niche is pretty safe as long as they don't stumble.
bit_user
thestryker said: Honestly they're fine for the market they're aiming at. There's no chance nvidia is going to design anything to compete as long as the ai money is flowing.
Nvidia will eventually release an RTX 6090 (and the corresponding workstation cards). Contrary to popular belief, Nvidia hasn't stopped development on non-AI software. Last year, they updated their non-realtime ray-tracing library to take advantage of Shader Execution Reordering (SER) on the RTX 5090: https://developer.nvidia.com/rtx/ray-tracing/optix
thestryker
bit_user said: Nvidia will eventually release RTX 6090 (and the corresponding workstation cards).
Do you really think the 6090 is going to have, at minimum, double the RT capability of the 5090? I sure don't. There are too many other things Nvidia needs its parts to do that Bolt isn't even looking at.
Reference reading
- https://www.tomshardware.com/tech-industry/semiconductors/bolt-graphics-tapes-out-its-first-zeus-gpu-test-chip-on-tsmc-12nm