Nvidia's Pascal GPU architecture debuted 10 years ago today with the Tesla P100, but is best known for the GTX 1060 and GTX 1080 Ti
But Nvidia could already sense success in its transition from a gaming hardware leader to an AI data center goliath.
Nvidia delivered “five architectural breakthroughs” with the launch of the Tesla P100 accelerator 10 years ago today. Behind the “most advanced hyperscale data center accelerator ever built” was the new Pascal GPU architecture, probably best known among Tom’s Hardware readers for the GeForce GTX 10 series consumer graphics cards. The architecture spawned legendary GPUs like the GTX 1080 Ti. It was also behind the entry/mid-range-dominating GTX 1060, which still meets the minimum system requirements for 2026’s Crimson Desert.
At the P100 accelerator launch, Nvidia spilled the beans on the Pascal architecture. Naturally, though, it came from the perspective of the hyperscale data center operator. Rather than building the foundations for AI slop, Jensen Huang predicted that unleashing this AI processing power would help scientists address “our greatest scientific and technical challenges -- finding cures for cancer, understanding climate change, building intelligent machines.”
The Pascal architecture was undeniably impressive, and in the context of data centers, the P100 was claimed to deliver “over a 12x increase in neural network training performance compared with a previous-generation NVIDIA Maxwell-based solution.”
These new 16nm FinFET Pascal GPUs, featuring 15.3 billion transistors each, paired HBM2 memory via CoWoS packaging for 720GB/s of memory bandwidth. Moreover, up to eight Tesla P100 GPUs could be scaled together using the new NVLink interconnect.
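That 720GB/s figure can be roughly sanity-checked. A minimal sketch, assuming the P100's publicly documented 4096-bit HBM2 interface (four 1024-bit stacks) running at about 1.4 Gb/s per pin; neither of those parameters comes from the article itself:

```python
# Rough bandwidth estimate for the Tesla P100's HBM2 subsystem.
# Assumed figures (not from the article): four HBM2 stacks with a
# 1024-bit interface each, at ~1.406 Gb/s effective per pin.
bus_width_bits = 4 * 1024   # total HBM2 interface width in bits
pin_rate_gbps = 1.406       # effective data rate per pin, Gb/s

# Multiply width by per-pin rate, divide by 8 to convert bits to bytes.
bandwidth_gbs = bus_width_bits * pin_rate_gbps / 8
print(f"~{bandwidth_gbs:.0f} GB/s")  # ~720 GB/s
```

The arithmetic lands on roughly 720 GB/s, matching the headline spec.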
For PC gamers, the first taste of Pascal came later in 2016, with the debut of the GeForce GTX 1080. You can check out our review of the Pascal GP104-based graphics card at that link. In short, the GTX 1080 was the first consumer graphics card to “deliver next-gen gaming —playable frame rates at 4K or in VR with quality settings cranked up,” noted our graphics card review team.
However, the majority of GTX 10 Pascal users probably experienced the architecture via the classic everyman card, the GTX 1060. This 6GB card was released with pricing starting at $250 in the summer of 2016. It proved to be a remarkable upgrade for users of previous architectures, and at 120W it came within striking distance of the prior-gen top-tier GTX 980 (4GB) graphics card for hundreds of dollars less. That's what a nice gen-on-gen upgrade looks like.


A year later, the king of Pascal GPUs was released: the fabled GTX 1080 Ti with 11GB of VRAM. At the time we noted that it “extended Titan X-class performance to gamers for $500 less.”
When the Pascal GPU architecture was revealed, Nvidia was clearly driving forward with its plans to whittle down its reliance on PC gamers for its revenue. It is indeed good for a business to have a wide user base, and not overly rely on any particular fickle market. However, PC DIYers may feel, 10 years later, that the pivot has gone too far. Now when we watch Nvidia keynotes, we expect hours of droning on about AI and data centers, with only a few occasional morsels thrown to PC gamers and creators.
Pascal GPUs were excluded from the stream of new GeForce Game Ready drivers last October. However, users will continue to get quarterly security updates through to October 2028.

Mark Tyson is a news editor at Tom's Hardware. He enjoys covering the full breadth of PC tech; from business and semiconductor design to products approaching the edge of reason.
Penzi: The 1080 Ti was the first time I experienced overkill in computing. It was driving a 1440p display and never broke a sweat. Hear, hear!
abufrejoval: It said "4K" on my R9 290X. It also said it on pretty near every other GPU I bought later.
That did include the GTX 1080ti, as well as every successor since, although I drew the line past the RTX 4090 so far.
It's still not 4k, nor VR, certainly not "with quality settings cranked up".
Not even putting a 7950X3D underneath helped.
Sure some games work quite well even with an 8 core APU and a mobile RTX 4060 at 1080p.
But the ones I wanted to play with full fluidity and acceptable realism refused to deliver, above all every Flight Simulator ever published by M$. There is always stutter, when that can absolutely never happen in thin air, and it kills all realism.
The optimized games prove it's the software's fault. But I sure wish I could have returned every piece of hardware that said "4k" and "gaming", when all it could guarantee was a stable static image at that resolution.
Just like when they say AI computers are "intelligent".
I still run my GTX 1080ti on a 22-core Xeon. Because one of my sons preferred the 2080ti instead, the next runs the 3090 while two others make do with a 4070 and a 5070. It even does some games ok at 4k. Mostly it supports functional GPU pass-through testing and those Xeons don't have an iGPU, either.
My kids all avoid the one big mistake I made: running a 4k screen, and also wanting to game on it after hours.
Stick with 1440p, people, life is so much better there!
Except that realism requires something better than the tunnel vision a screen restricts you to (still no VR).
I didn't have a choice, I use mine for work, and work is at least 4k on no less than 43".
Or the equivalent in a headset you can tolerate wearing for a working day, but that seems further off than ever.