
Following several rumors, dating as far back as 2024, that touted a 2025 release, we eventually learned that the N1/N1X chips have been pushed back to 2026. The upcoming Nvidia GTC, planned for March 16-19, is the likely stage where these chips will be unveiled. Pricing will remain a key factor in their adoption; Jason Tsai of DigiTimes said that "it may remain a niche luxury product" unless it lands around the $1,500 range.
Hassam Nasir, Contributing Writer — Hassam Nasir is a die-hard hardware enthusiast with years of experience as a tech editor and writer, focusing on detailed CPU comparisons and general hardware news. When he's not working, you'll find him bending tubes for his ever-evolving custom water-loop gaming rig or benchmarking the latest CPUs and GPUs just for fun.
bit_user The article said: Nvidia, of course, is a GPU manufacturer, so irrespective of the Arm CPU cores, we can expect the N1/N1X chips to be targeted at gaming. The company already supplies the chips for the Nintendo Switch 2, but the last time we saw it release a chip to the public was back in 2015 with the Tegra X1 (which also powered the original Switch). Still got the facts jumbled on this point:

| Console | SoC | Launch Date |
| --- | --- | --- |
| Nintendo Switch | Nvidia Tegra X1 | 2017-03 |
| Nintendo Switch 2 | Nvidia Orin NX-derived | 2025-06 |

Seriously, you can look this stuff up on Wikipedia: https://en.wikipedia.org/wiki/Tegra#Devices_9 Reply
Notton Here's hoping N1X isn't another one-off product like the Shield and Shield TV based on said Tegra. The specs and leaked CPU benchmark numbers look promising, but Nvidia drivers have been on a downward trend ever since they released the RTX 50xx series, and the iGPU portion is based on that. Oh, and let's not forget Nvidia partnered with Mediatek on this project. Mediatek drivers are good enough on Android, but on Windows? lol Likewise, Windows-on-ARM leaves a lot to be desired, though I'm not entirely sure if it's Qualcomm that's dropping the ball or Microsoft… I assume it's the latter, but ya never know. If it's supported, I'd much rather see how N1X performs on Linux, or Aluminum. Reply
bit_user Notton said: Here's hoping N1X isn't another one-off product like the Shield and TV based on said Tegra. The specs and leaked CPU benchmark numbers look promising, but Nvidia drivers have been on a downward trend ever since they released the RTX 50xx series, and the iGPU portion is based off of that. Well, I presume the iGPU interface spec that Nvidia is going to use with Intel CPUs is the same. So, if Nvidia only needs to make one series of iGPUs to sell into both ARM and x86 markets, then it's very likely they'll continue. As for Mediatek, what other option does it have, besides Nvidia or ARM's Mali graphics? Maybe they could go back to Imagination Technologies, but that would just put them in the same lot as probably a bunch of Chinese SoCs. Notton said: Likewise, Windows-on-ARM leaves a lot to be desired, though I'm not entirely sure if it's Qualcomm that's dropping the ball or Microsoft… I assume it's the latter, but ya never know. I think Windows/ARM is going to show a lot more maturity than it did when Snapdragon X first launched. Qualcomm is releasing their second gen of those products, so they'll be making another big push to get support shored up. Notton said: If it's supported, I'd much rather see how N1X performs on Linux, or Aluminum. If it's the same silicon as GB10, then you already can. Jeff Geerling tested GB10 on some generic application benchmarks + Linux gaming: https://www.jeffgeerling.com/blog/2025/dells-version-dgx-spark-fixes-pain-points/ Phoronix did a more comprehensive suite of CPU tests: https://www.phoronix.com/review/nvidia-gb10-cpu And GPU tests: https://www.phoronix.com/review/dell-pro-max-gb10-llama-cpp Reply
thestryker bit_user said: Still got the facts jumbled, on this point. The T210, which is the X1, launched with the Shield TV in 2015. Reply
bit_user thestryker said: The T210, which is the X1, launched with the Shield TV in 2015. Yes, my table was listing the launch dates of the consoles. If you want to talk about the SoCs, Orin NX launched in 2022, so the Switch 2's SoC was even older at that console's launch than the Switch 1's SoC was at its launch. As I've said before, I think Switch 2 probably would've used Atlan, had it not been cancelled. I think that would've been based on the Ada architecture and would've made Switch 2 a fair bit more powerful. The first of the Orin SoCs was announced back in 2018, though I don't think it shipped until 2021. It has already been succeeded by Thor, which is Blackwell-based and launched last year. Reply
thestryker bit_user said: As I've said before, I think Switch 2 probably would've used Atlan, had it not been cancelled. I think that would've been based on the Ada architecture and would've made Switch 2 a fair bit more powerful. I still don't know why a more advanced node wasn't used for the T239, given that the Ada version didn't launch. While the SoC isn't great, even dropping to one of Samsung's 5nm nodes ought to have been a huge efficiency improvement. Reply
bit_user thestryker said: I still don't know why a more advanced node wasn't used for the T239 given that the Ada version didn't launch. Maybe because that would cost money and Nintendo is cheap? AFAICT all Orin SoCs use Samsung 8 nm, which makes sense if they're based on Ampere (it also used that node). thestryker said: While the SoC isn't great even dropping to one of the Samsung 5nm nodes ought to have been a huge efficiency improvement. If it's not an optical shrink, then forget it. Reply
thestryker bit_user said: If it's not an optical shrink, then forget it. All of Samsung's 5nm-based nodes are quite a bit better than anything based on their 10nm technology, so there's no question it'd be a lot better. bit_user said: Maybe because that would cost money and Nintendo is cheap? I suspect this is it entirely, as Nvidia doesn't need the better node for anything they're making based on Orin. Reply
bit_user thestryker said: All of Samsung's 5nm based nodes are quite a bit better than anything based on their 10nm technology so there's no question it'd be a lot better. Mine was a point about cost & effort, not the relative merits of the nodes. Reply