Nvidia GeForce GTX 1630 Review: Lobotomized Turing

Rose-tinted glasses won't make this old GPU look any better

Colorful GeForce GTX 1630
(Image: © Tom's Hardware)

Tom's Hardware Verdict

The GeForce GTX 1630 has arrived woefully late to the party, at a price that's double what it should be. With half the shader cores and memory channels disabled, it turns a budget GPU into a halfwit, with the lowest performance of any relatively current graphics card.

Pros

  • + Faster than GTX 1050
  • + Hardware video encoding
  • + Manages 1080p medium at 30 fps
  • + 4GB VRAM (not that it matters much)

Cons

  • - Costs more than GTX 1650
  • - Slower than every other modern GPU
  • - Still needs a 6-pin PCIe power connector
  • - About three years late to the GTX 16-series party


The Nvidia GeForce GTX 1630 started popping up in rumors and leaks a few months back. Part of me thought, "Surely Nvidia won't release a new and pathetically slow Turing variant this late in the game." But the realist in me knew it was only a matter of time — the GT 1030 and GT 730 cards that started shipping again last year were all the evidence we needed.

Let's be blunt: The GTX 1630 isn't anywhere close to being one of the best graphics cards, and in fact it lands near the very bottom of our GPU benchmarks hierarchy. The only slower GPUs that we've tested are the GTX 1050, RX 560, RX 550, and the aforementioned GT 1030. None of those are worth your time or money either, but at least they're not being released in mid-2022.

The only direct contender for the GTX 1630 is AMD's recently launched Radeon RX 6400, but this new Nvidia card actually makes the lackluster 6400 look good. The real competition, and the reason no one should give the GTX 1630 the time of day, comes from the existing GTX 1650 and GTX 1650 Super. The latter basically doubles the specs of the 1630, and pretty much doubles performance as well.

Colorful sent us this sample for review, but there doesn't seem to be an official price from Nvidia. What we can find online suggests that the soft MSRP has been set at $199, which is just silly. EVGA lists its own GTX 1630 for $199, or you can buy the far superior GTX 1650 Super for the same $199. The 1630 also nominally replaces the GT 1030, which had a launch price of $79 ($70 for the faster GDDR5 variant). It feels as though the GTX 1630 was priced for the mid-2021 market, but today it's laughably expensive.

Here's how the specifications for Nvidia's old-timer Turing TU117 GPUs stack up, with the RX 6400 and RX 6500 XT for comparison. 

GPU Specifications

Graphics Card         | GTX 1630   | RX 6500 XT | RX 6400   | GTX 1650 Super | GTX 1650
Architecture          | TU117      | Navi 24    | Navi 24   | TU116          | TU117
Process Technology    | TSMC 12FFN | TSMC N6    | TSMC N6   | TSMC 12FFN     | TSMC 12FFN
Transistors (Billion) | 4.7        | 5.4        | 5.4       | 6.6            | 4.7
Die Size (mm^2)       | 200        | 107        | 107       | 284            | 200
SMs / CUs             | 10         | 16         | 12        | 20             | 14
GPU Cores             | 512        | 1024       | 768       | 1280           | 896
RT/RA Cores           | N/A        | 16         | 12        | N/A            | N/A
Boost Clock (MHz)     | 1775       | 2815       | 2815      | 1725           | 1665
VRAM Speed (Gbps)     | 12         | 18         | 16        | 12             | 8
VRAM (GB)             | 4          | 4          | 4         | 4              | 4
VRAM Bus Width        | 64         | 64         | 64        | 128            | 128
ROPs                  | 32         | 32         | 32        | 48             | 32
TMUs                  | 32         | 64         | 48        | 80             | 56
TFLOPS FP32 (Boost)   | 1.8        | 5.8        | 4.3       | 4.4            | 3.0
Bandwidth (GB/s)      | 96         | 144        | 128       | 192            | 128
PCIe Link             | Gen3 x16   | Gen4 x4    | Gen4 x4   | Gen3 x16       | Gen3 x16
TDP (watts)           | 75         | 107        | 53        | 100            | 75
Launch Date           | Jun-22     | Jan-22     | Jan-22    | Nov-19         | Apr-19
Launch Price          | $199       | $199       | $159      | $159           | $149
Online Price          | $199       | $168       | $149      | ~$125 used     | ~$90 used
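The TFLOPS and bandwidth rows in the table follow directly from the raw specs. Here's a quick sketch of the arithmetic using the standard formulas (two FP32 operations per shader core per clock, i.e. one fused multiply-add, and per-pin data rate times bus width for memory); this is our illustration, not something Nvidia publishes in this form:

```python
# Derive the table's theoretical figures from the raw specs.

def fp32_tflops(shader_cores: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: each core does 2 ops/clock (one FMA)."""
    return shader_cores * 2 * boost_mhz * 1e6 / 1e12

def bandwidth_gb_s(vram_gbps: float, bus_width_bits: int) -> float:
    """Memory bandwidth in GB/s: per-pin rate times bus width in bytes."""
    return vram_gbps * bus_width_bits / 8

# GTX 1630: 512 cores at 1775 MHz, 12 Gbps GDDR6 on a 64-bit bus
print(f"{fp32_tflops(512, 1775):.1f} TFLOPS")   # 1.8
print(f"{bandwidth_gb_s(12, 64):.0f} GB/s")     # 96

# GTX 1650 Super: 1280 cores at 1725 MHz, 12 Gbps on a 128-bit bus
print(f"{fp32_tflops(1280, 1725):.1f} TFLOPS")  # 4.4
print(f"{bandwidth_gb_s(12, 128):.0f} GB/s")    # 192
```

Run the numbers and the "half of everything" story is plain: the 1650 Super has exactly twice the memory bandwidth and roughly 2.4x the compute of the 1630.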

AMD's RX 6400 basically tied the GTX 1650, so there's little question it will easily beat the lower-spec GTX 1630. About the only advantage the GTX 1630 has is the presence of video encoding hardware — the TU117 has Pascal-era NVENC hardware rather than the improved Turing encoder, but it's still better than nothing.

What's particularly odd with the GTX 1630 is that Nvidia has been shipping the same TU117 GPU in laptops as the MX450 (and more recently MX550) for a couple of years, though granted NVENC is disabled on those parts. Apparently, there were enough chips that couldn't reach the required 14 SMs for the MX450 or GTX 1650, and Nvidia and its partners figured a cut-down 10 SM variant might still sell on desktops — to the uninformed, anyway.

You basically get all the same features as a GTX 1650, just with less performance. You don't even necessarily get a lower power card, as the GTX 1630 models we've seen still come with a 6-pin power connector. That might be because these were less desirable chips with defects, or maybe it's because Nvidia tried to make up for the lack of GPU cores with slightly higher clocks. At least the RX 6400 can be found in half-height models and doesn't require additional power.

The 4GB of GDDR6 is clocked at 12Gbps on a 64-bit interface. As noted already, that's exactly half of what the GTX 1650 Super provides. Also note that, unlike AMD's RDNA 2 GPUs, there's no Infinity Cache to make up the difference, though the 512 GPU cores are already going to be a limiting factor.

With most modern graphics cards selling at close to MSRP, the GTX 1630 feels like far too little, far too late. Last year we saw GTX 1650 cards going for $300 or more, and a $200 GTX 1630 might have made some kind of warped sense then. Today, you can get the RX 6500 XT, RX 6400, GTX 1650, and even GTX 1650 Super for $200 or less. Maybe this was supposed to be for big OEMs, so they could toss in a weak GPU and claim to still offer dedicated graphics, but that's ultimately just going to lead to disappointed customers.

Jarred Walton is a senior editor at Tom's Hardware focusing on everything GPU. He has been working as a tech journalist since 2004, writing for AnandTech, Maximum PC, and PC Gamer. From the first S3 Virge '3D decelerators' to today's GPUs, Jarred keeps up with all the latest graphics trends and is the one to ask about game performance.

  • King_V
    Upon reading this review, I feel that I owe you an apology for being such a proponent for the GPU Battle of Meh...

    Uh, but, I guess it's better than the 1050 non-Ti, so, uh, that's . . no, I can't even get myself to say it's some kind of victory.

    My sympathies for your suffering on this one, @JarredWaltonGPU
    Reply
  • Eximo
    OEMs are still selling GTX1650 and 1650 Super in low end gaming desktops. Just don't see this happening, practically e-waste unless they cut the price in half.
    Reply
  • PiranhaTech
    King_V said:
    Upon reading this review, I feel that I owe you an apology for being such a proponent for the GPU Battle of Meh...

    Uh, but, I guess it's better than the 1050 non-Ti, so, uh, that's . . no, I can't even get myself to say it's some kind of victory.

    My sympathies for your suffering on this one, @JarredWaltonGPU
    I don't mind weak GPUs from Nvidia or AMD as long as they are priced okay. There's been a few times where an integrated GPU doesn't work for whatever reason (for a PC not used for gaming), or my good GPU goes out, I'm short on money, and I need a working PC.

    They definitely have their uses. However, come on! With that kind of performance, it should be $120 at most, probably $80-100. It's 75W on top of all of this. I could be more forgiving if it ran at 50W or lower. Give us something, Nvidia! The GT 1030 is a 30W GPU.
    Reply
  • King_V
    PiranhaTech said:
    I don't mind weak GPUs from Nvidia or AMD as long as they are priced okay. There's been a few times where an integrated GPU doesn't work for whatever reason (for a PC not used for gaming), or my good GPU goes out, I'm short on money, and I need a working PC.

    They definitely have their uses. However, come on! With that kind of performance, it should be $120 at most, probably $80-100. It's 75W on top of all of this. I could be more forgiving if it ran at 50W or lower. Give us something, Nvidia! The GT 1030 is a 30W GPU.
    Well, as predicted when the specs were announced and when the card first came out, Nvidia released a GPU that makes the RX 6400 look like a hero.
    Reply
  • waffleinc
    When I first heard about this GPU, I was somewhat excited. At the time, I needed a cheap and cheerful GPU for nothing more than basic home use and video encoding. I figured that as long as it's only $100, that would be fine, even with the low core count and slow memory. Then I saw that it would be $200 and 75w. Yeah, no thanks.
    Reply
  • Giroro
    What version of NVENC does it have?
    I assume the older/worse version of the GTX 1650... If so, I wouldn't even want this for $100... Maybe $80.

    With the updated encoders, I could see myself paying nearly $125 as an entry point to throw my old 3700x into a dedicated streaming PC.

    But today Newegg has a 1660 Super for $210 ($30 MIR), and you can get an RTX 2060 for $250 (which also completely embarrasses the pricing of the RTX 3050).

    Nvidia needs to step back and get real with how they're letting their board partners price their crappy new cards.
    Reply
  • JarredWaltonGPU
    King_V said:
    Upon reading this review, I feel that I owe you an apology for being such a proponent for the GPU Battle of Meh...

    Uh, but, I guess it's better than the 1050 non-Ti, so, uh, that's . . no, I can't even get myself to say it's some kind of victory.

    My sympathies for your suffering on this one, @JarredWaltonGPU
    I'm still waiting for the "joy" of testing Arc A380. I mean, technically it should perform pretty decently based on specs. But I am not at all looking forward to the driver shenanigans I'll likely have to deal with. On the bright side, it's taken so long to get the GPU shipped from China that drivers will hopefully be quite a bit better by the time the card arrives! Also, AV1 encoding should be interesting, assuming it works properly. I'll try to encode and upload the A380 video using its AV1 hardware. 🙃
    Reply
  • Thunder64
    The article basically says "Don't buy this crap" and yet 2/5 stars. What is a one star product? This costs the same as much better cards and still requires external power.

    Giroro said:
    What version of NVENC does it have?
    I assume the older/worse version of the GTX 1650... If so, I wouldn't even want this for $100... Maybe $80.

    With the updated encoders, I could see myself paying nearly $125 as an entry point to throw my old 3700x into a dedicated streaming PC.

    But today Newegg has a 1660 Super for $210 ($30 MIR), and you can get an RTX 2060 for $250 (which also completely embarrasses the pricing of the RTX 3050).

    Nvidia needs to step back and get real with how they're letting their board partners price their crappy new cards.

    Or for $10 more on Amazon you could get an RX 6600. That would have a bit more performance but more importantly 8GB of RAM.

    JarredWaltonGPU said:
    I'm still waiting for the "joy" of testing Arc A380. I mean, technically it should perform pretty decently based on specs. But I am not at all looking forward to the driver shenanigans I'll likely have to deal with. On the bright side, it's taken so long to get the GPU shipped from China that drivers will hopefully be quite a bit better by the time the card arrives! Also, AV1 encoding should be interesting, assuming it works properly. I'll try to encode and upload the A380 video using its AV1 hardware. 🙃

    How are you even going to review it? Intel should be getting more flak about having to put games into "tiers". Do you separate your review by tier? The conclusion could be quite different depending on what you intend to play.
    Reply
  • missingxtension
    This is just a slap in the face, so disrespectful. The RX 6400 was disrespectful enough, then Nvidia comes along with "hold my beer."
    I am waiting to see if Intel can at least do something reasonable for the price, though I just can't see Intel being good at pricing. But these guys here make them look like the value leader. Wait till the recession and shrinking computer market arrive — they are trying to cause it.
    Reply
  • hannibal
    Nvidia knows exactly how fast (slow) this GPU is! When this is $199, just guess how much the faster GPUs are gonna cost!
    The 4050 should cost double this GPU, so $400, and that still makes the 4050 look good compared to this one.
    The GPU price hike is not over, at least according to Nvidia!
    Reply