
The announcement characterizes the deal as a non-exclusive agreement, meaning Groq will remain an independent entity, and GroqCloud, the company's platform through which it rents out access to its LPUs, will continue to operate as before. Before this deal, Groq was valued at $6.9 billion in September of this year and was on pace to report $500 million in fiscal revenue.
Hassam Nasir (Contributing Writer) is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he’s not working, you’ll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.
jp7189: I've never quite understood Groq's magic sauce, but they use on-board SRAM measured in MB rather than stacks of expensive HBM measured in GB, and still knock it out of the park for LLM-specific tasks. I'm kinda sad to see Nvidia gobble them up, because they seemed to have something truly different, and I'm actually a fan of competition.
bit_user (replying to jp7189): It's basically what all the dataflow ASICs do. It was great with convolutional neural networks, but kinda broke with transformers. I think you have to scale up to a large number of chips for it to work with LLMs, which is another aspect they focused on. Unlike Tenstorrent, they went with a more tightly coupled chip-to-chip interface. BTW, they don't say exactly how much SRAM they've got, but they say "hundreds of MBs", which is on the order of what Nvidia now has. I think Groq might still have at least an order of magnitude higher SRAM per TOPS, though. As for the competition angle: Nvidia had seemed to be moving in this direction with NVDLA, but then they reversed course and excluded it from their latest SoCs. More info on their tech is here: https://groq.com/lpu-architecture See also: https://www.nextplatform.com/2022/03/01/groq-buys-maxeler-for-its-hpc-and-ai-dataflow-expertise/
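The SRAM-versus-HBM point in these comments lends itself to a quick back-of-envelope check. The sketch below is not from the article or the commenters; it simply estimates how many SRAM-only chips would be needed to hold a model's weights and what the bandwidth-bound decode rate looks like versus a single HBM-based GPU. The model size, per-chip SRAM capacity, per-chip SRAM bandwidth, and HBM bandwidth figures are all illustrative assumptions.

```python
import math

# Back-of-envelope sketch: why an SRAM-only inference chip needs many devices
# but can still be very fast at LLM decoding. Generating one token is roughly
# bound by how fast all the weights can be streamed from memory, so
# tokens/s <= aggregate memory bandwidth / model size (a loose upper bound
# that ignores compute, KV cache, and interconnect overhead).
# All numbers below are illustrative assumptions, not published specs.

MODEL_BYTES = 70e9          # assumed ~70B-parameter model at 1 byte per weight
SRAM_PER_CHIP = 230e6       # assumed "hundreds of MBs" of on-chip SRAM per chip
SRAM_BW_PER_CHIP = 80e12    # assumed per-chip on-die SRAM bandwidth (~80 TB/s)
HBM_BW_PER_GPU = 3.35e12    # assumed HBM bandwidth of a single high-end GPU

# An SRAM-only design must shard the weights across enough chips to hold them.
chips_needed = math.ceil(MODEL_BYTES / SRAM_PER_CHIP)
aggregate_sram_bw = chips_needed * SRAM_BW_PER_CHIP

# Bandwidth-bound upper limits on single-stream decode rate.
sram_bound_tps = aggregate_sram_bw / MODEL_BYTES
hbm_bound_tps = HBM_BW_PER_GPU / MODEL_BYTES

print(f"Chips needed to hold the weights in SRAM: {chips_needed}")
print(f"SRAM cluster bandwidth bound:   ~{sram_bound_tps:,.0f} tokens/s")
print(f"Single HBM GPU bandwidth bound: ~{hbm_bound_tps:,.0f} tokens/s")
```

Under these assumptions the sketch lands on roughly 300 chips just to hold a 70B-parameter model, which lines up with the point about scaling to a large number of chips, while the large aggregate on-chip bandwidth is what keeps per-token latency low despite the modest capacity of each device.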
LordVile: More money shovelled onto the pyre.