New Commerce Department AI export rules could be seismic change for CSPs and data center operators — buying American GPUs at scale means committing to building

After pushing foreign firms to build in the U.S., the Commerce Dept. is now targeting entire countries with tit-for-tat export controls.

If countries want to buy American-made AI chips and high-end semiconductors in quantities of more than 200,000, they may need to invest in U.S. AI data centers or security guarantees.

The Trump administration is considering a new series of export rules governing access to the most cutting-edge AI accelerators and networking hardware, the administration confirmed last week, mirroring demands made of foreign companies earlier this year. As with several other major trade policies in recent months, the finer details have yet to be confirmed and are almost certainly subject to change. But this represents the first major attempt by President Trump to replace the AI diffusion rules instituted by the Biden Administration and subsequently thrown out by the Trump Administration following the 2024 election.

The U.S. Department of Commerce has tried to make it clear that this isn't the Trump administration adopting similar AI diffusion rules. This is something new, if profoundly similar … with a few extra steps.

Breaking it down

Under the Biden-era AI diffusion rules, countries were separated into tiers. Tier 1 countries, including important strategic and historic allies like the United Kingdom and Canada, would receive little in the way of export controls. Tier 2 countries faced restrictions on the quantities of chips they could import without specific licenses. Tier 3 countries were banned from importing them altogether.

The rules being floated by the Trump Administration are more restrictive and more complex, but seem to follow a similar pattern, leaving even traditional allies with a convoluted route to import.

Shipments to any country of 1,000 chips or fewer (of Nvidia GB300 or equivalent hardware) would be given a simplified export process with some limited exemptions, according to the administration. Chip orders of more than 1,000 but fewer than 200,000 would require pre-approval from the U.S. Department of Commerce, as well as an export license. Operational transparency would also be expected, though it's not clear what level of detail the White House wants.

For the largest orders of 200,000 GB300-class chips or more, companies and individual entities would be required to make a direct investment in American AI data centers, as well as submit to potential on-site inspections. Final approval may also require national security assurances. This impacts companies such as Amazon's AWS, Microsoft, Oracle, and OpenAI.

Although this system doesn't employ tiering, there are countries that are still outright banned from importing the latest American-made chips. Those include China, Russia, Iran, and North Korea, among others.
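The reported thresholds above can be summarized as a simple lookup. The sketch below is purely illustrative: the tier labels, function name, and banned-country list are hypothetical simplifications, and the rules themselves are not finalized.

```python
# Illustrative sketch of the reported export thresholds for GB300-class chips.
# Tier labels and the banned-country list are simplified assumptions;
# the actual rules have not been finalized.

BANNED = {"China", "Russia", "Iran", "North Korea"}

def export_tier(country: str, chip_count: int) -> str:
    """Return the reported approval path for an order of GB300-class chips."""
    if country in BANNED:
        return "banned"              # outright import ban
    if chip_count <= 1_000:
        return "simplified process"  # simplified export process, limited exemptions
    if chip_count < 200_000:
        return "license required"    # Commerce pre-approval plus export license
    return "investment required"     # U.S. data center investment, possible inspections
```

Note how the burden scales with order size: the jump at 200,000 chips is where direct investment in American infrastructure kicks in, which is the change most relevant to hyperscale CSPs.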

While the details of this new framework are still being worked out, the bones of it were used for a recent Nvidia and Cerebras deal with the United Arab Emirates. There, the Middle Eastern country was required to invest one dollar in U.S. infrastructure for each dollar it spent on its own domestic buildout, effectively doubling the cost of the hardware while securing direct investment in American infrastructure.

American reshoring efforts in manufacturing and deployment were the narrative throughline for much of 2025, with a multitude of chipmakers and other cloud service providers (CSPs) building either new sites or expanding upon existing infrastructure .

In criticizing the Biden Administration's AI diffusion rules, the U.S. Commerce Department described them as "burdensome, overreaching, and disastrous," and stated that the new framework would be much better.

Although the new arrangement is different, it still places greater burdens on America's longtime allies, suggesting it may be more effective at wrong-footing those accustomed to a favorable relationship with American companies than at constraining the country's traditional adversaries.

But whether countries are allied with or against America in a geopolitical sense, they will all need to pay for the privilege of the latest hardware, at least in the near term. With no clear alternative to Nvidia graphics , any company or country looking to train new AI models will need to buy American. For inferencing workloads, alternatives exist , but Nvidia remains the gold standard.

This kind of tactic does not encourage dependency and international cooperation, however; it actively discourages them. If trading status, customer loyalty, or long-standing business interests do little to affect the eventual price (and buying more is arguably worse for customers), the incentive is to find an alternative.

Nvidia wants to make CUDA the backbone of AI development, because that will cement its hardware as the primary driver of AI. But why would other countries want to do that if they have to pay over the odds for the privilege, and the rules might change any time, anyway?

Smuggling is an ever-present problem, so even strict export controls may not halt access in the near term. Alternative inferencing hardware is on the rise in China and elsewhere, but Nvidia still rules the roost, earning most of its revenue supplying CSPs and data center operators. So long as demand for frontier-level AI models remains strong, deploying cutting-edge AI accelerators at scale will require American investment for the foreseeable future.

Jon Martindale is a contributing writer for Tom's Hardware. For the past 20 years, he's been writing about PC components, emerging technologies, and the latest software advances. His deep and broad journalistic experience gives him unique insights into the most exciting technology trends of today and tomorrow.
