
Internal reorganization officially refocuses cloud efforts on R&D less than three years after DGX Cloud launched.
(Image credit: Getty Images / NurPhoto)
Nvidia has reorganized its cloud computing group, further scaling back its ambitions to operate a public cloud service that would compete directly with Amazon Web Services, according to reporting by The Information. The changes include folding the DGX Cloud business into Nvidia’s core engineering organization under SVP Dwight Diercks, who oversees software engineering.
The decision to scale back comes as Nvidia continues to post record revenue from data center GPUs while facing growing pressure to balance its own platform ambitions against the interests of its largest customers. It follows the announcement in September that Nvidia would stop trying to compete with AWS and Azure.
Nvidia is not exiting cloud infrastructure altogether, but it is narrowing its scope. Instead of selling GPU compute as a service under its own brand, the company is repositioning DGX Cloud as an internal platform for its engineers, with a focus on chip and AI model development.
DGX Cloud was introduced in early 2023 as Nvidia’s attempt to package its flagship DGX systems into a managed service. Hosted initially on infrastructure provided by AWS, Google Cloud, Oracle Cloud, and Microsoft Azure, the service offered dedicated H100-based clusters with Nvidia’s full software stack preinstalled. The pitch to enterprise customers was straightforward and attractive: rent Nvidia’s preferred AI platform without building your own data centers.
In practice, the model proved difficult to scale. Pricing was high compared to commodity GPU instances, integration with existing cloud tooling was uneven, and support responsibilities were split between Nvidia and its hosting partners. Customers running DGX Cloud across multiple providers faced operational complexity, while hyperscalers themselves were rapidly cutting prices on H100 capacity and rolling out their own managed AI services.
Against that backdrop, Nvidia has now folded the DGX Cloud team into its broader engineering organization under Diercks. The group’s remit has shifted toward internal use, including AI model development, software validation, and pre-silicon and post-silicon testing of new GPU platforms.
This makes a lot of sense given that Nvidia’s largest customers are the very companies it would have been competing against had DGX Cloud continued operating as it was. AWS, Microsoft, Google, and other cloud providers account for a significant share of Nvidia’s data center revenue. Running a first-party cloud service risked creating channel conflict at a time when those customers are committing billions of dollars to Nvidia hardware.
Operating a competitive cloud platform also requires sustained capital expenditure on facilities, networking, power, and operations. Nvidia’s advantage lies in silicon, systems, and software, not in running global data center fleets. While DGX Cloud leveraged partner infrastructure, Nvidia still bore the cost of support, platform engineering, and customer acquisition without controlling the underlying economics.
Considering that hyperscalers have strong incentives to differentiate above Nvidia’s hardware layer, DGX Cloud as a first-party provider fundamentally didn’t make sense. AWS continues to invest in Trainium and Inferentia accelerators; Google is pushing TPUs; and Microsoft is expanding its Maia program, recently partnering with Intel Foundry to build chips. Even so, all of them remain deeply dependent on Nvidia GPUs for leading-edge AI workloads. For Nvidia, preserving that dependency is understandably worth more than capturing cloud service revenue.
There were also signs of strain in GPU supply dynamics. Nvidia has increasingly acted as both supplier and customer, leasing back large volumes of GPU capacity from cloud providers and specialized operators — in one case, 18,000 GPUs to the tune of $1.5 billion over four years.
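For a sense of scale, a quick back-of-envelope calculation puts that deal in per-GPU-hour terms. The sketch below assumes the reported $1.5 billion is spread evenly across all 18,000 GPUs running around the clock for the full four-year term, a simplification rather than the actual contract structure.

```python
# Back-of-envelope: implied cost of the reported 18,000-GPU lease-back deal.
# Assumes even spend across all GPUs over the full four-year term (a simplification).
total_cost_usd = 1_500_000_000   # reported deal value
gpu_count = 18_000               # reported GPU count
term_hours = 4 * 365 * 24        # roughly 35,000 hours in four years

cost_per_gpu = total_cost_usd / gpu_count       # ~$83,000 per GPU over the term
cost_per_gpu_hour = cost_per_gpu / term_hours   # ~$2.40 per GPU-hour

print(f"Per GPU over the term: ${cost_per_gpu:,.0f}")
print(f"Implied rate: ${cost_per_gpu_hour:.2f} per GPU-hour")
```

On those assumptions, the commitment works out to roughly $2.40 per GPU-hour, though the real terms of the agreement almost certainly differ.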
Long-term commitments to buy unused capacity ensure Nvidia has access to compute for internal development while helping partners justify aggressive data center expansion. This arrangement becomes harder to justify if Nvidia is simultaneously selling competing cloud services.
By stepping back and restructuring DGX Cloud under its engineering organization, Nvidia is simplifying those relationships. DGX Cloud becomes a tool that helps partners deploy Nvidia hardware more effectively, rather than a product that competes with their own offerings. That alignment is particularly important as Nvidia prepares successive GPU generations that will demand even tighter hardware and software coordination.
The reorganization does not necessarily mean reduced investment in cloud-adjacent technologies. On the contrary, Nvidia continues to expand its software stack across inference, orchestration, networking, and systems management. Platforms such as CUDA, TensorRT, and Nvidia’s inference frameworks are designed to run everywhere hyperscalers operate, and DGX Cloud now functions as a proving ground for those technologies.
Ultimately, everyone’s a winner here. Hyperscalers benefit from the neutering of what could have become a major source of friction, and Nvidia benefits from maintaining its position as the go-to partner of those hyperscalers rather than a competitor, even as cloud providers continue exploring in-house silicon to manage costs and supply risk. It also demonstrates to enterprise customers that Nvidia’s primary role is to enable and power AI infrastructure rather than operate it.
There’s no ignoring how quickly Nvidia’s priorities have evolved. What looked, two years ago, like an effort to move up the stack into cloud services now looks more like an experiment that informed a more focused strategy. DGX Cloud is still kicking, but as infrastructure for Nvidia’s own engineers and a bridge to its partners, not as an effort to become the next big cloud platform.
Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.