Nvidia May Prep GeForce GTX 1630 to Rival Radeon RX 6400

GeForce RTX Graphics Card
(Image credit: Nvidia)

Out of nowhere, rumors have emerged that Nvidia is planning to put together a GeForce GTX 1630 GPU to compete in the entry-level market, as reported by VideoCardz. Specifications are unknown, but the name does suggest it will leverage Nvidia's older Turing architecture rather than Ampere. If this information is accurate, the GTX 1630's sole purpose is probably to compete against AMD's Radeon RX 6400.

VideoCardz admits that all its data surrounding the GTX 1630 is nothing but rumor; however, the outlet has double-confirmed with its sources that Nvidia is planning to make this GPU, so that part of the story appears genuine. Still, we have to take the details with a grain of salt since we don't have direct access to the source material.

Nvidia's decision to make a GTX 1630 is both surprising and unusual. If the rumor is true, it will mark the first time Nvidia has broken its decade-long GTX naming convention by making an xx30-series GPU a GTX product. In the past, the xx10, xx20, xx30, and xx40 products fell under Nvidia's entry-level "GT" branding, with the xx50 series serving as the barrier to entry for the GTX lineup.

According to VideoCardz's source material, the GTX 1630 supposedly replaces the aging GTX 1050 Ti at a lower price point than the GTX 1650. That's bizarre, considering replacing the GTX 1050 Ti was precisely the GTX 1650's mission years ago. There's little point in making a GTX 1630 unless Nvidia wants a redundant GPU in its 16-series lineup.

We don't have any leaked GPU specifications, but it's not hard to guess what they could be. Nvidia's current GTX 1650 is equipped with the smallest Turing die, TU117, cut down to 14 SMs and 896 CUDA cores. A potential GTX 1630 would presumably reduce the SM count further to just 12 or even 10 SMs, cutting TU117's maximum of 16 SMs by up to almost half.
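As a back-of-the-envelope sketch, the core counts work out as follows, assuming Turing's 64 FP32 CUDA cores per SM (the TU117 and GTX 1650 figures are public; the 12- and 10-SM GTX 1630 configurations are pure speculation):

```python
# Estimate CUDA core counts from SM counts, assuming Turing's
# 64 FP32 cores per SM. The "GTX 1630?" rows are speculative.
CORES_PER_SM = 64  # Turing SM width

def cuda_cores(sm_count: int) -> int:
    """Estimate total FP32 CUDA cores from an SM count."""
    return sm_count * CORES_PER_SM

configs = {
    "TU117 (full die)":   16,  # 1024 cores
    "GTX 1650":           14,  # 896 cores
    "GTX 1630? (12 SMs)": 12,  # speculative
    "GTX 1630? (10 SMs)": 10,  # speculative
}

for name, sms in configs.items():
    print(f"{name}: {sms} SMs -> {cuda_cores(sms)} CUDA cores")
```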

Memory would likely come in GDDR5 or GDDR6 flavors and top out at 4GB. Nvidia already uses both GDDR5 and GDDR6 across the GTX 1650's four different variants (not including the Super counterparts), so it's easy to guess Nvidia will use either of the two on a GTX 1630.

If Nvidia decides to keep using GDDR5 and only cuts the SM count, this theoretical GTX 1630 could deliver roughly 80% to 90% of a GTX 1650's performance.
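That estimate lines up with a simple SM-ratio calculation, treating shader count as a crude proxy for relative performance (real results would also depend on clocks and memory bandwidth, and the GTX 1630's SM counts are speculative):

```python
# Crude performance proxy: ratio of a hypothetical GTX 1630's SM
# count to the GTX 1650's 14 SMs. Ignores clock and memory differences.
GTX_1650_SMS = 14

def sm_ratio(sms: int) -> float:
    """Fraction of the GTX 1650's shader resources."""
    return sms / GTX_1650_SMS

for sms in (12, 10):
    # 12 SMs lands near the article's 80-90% range; 10 SMs falls below it.
    print(f"{sms} SMs: {sm_ratio(sms):.0%} of the GTX 1650's shaders")
```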

Pricing will probably be similar to AMD's RX 6400, since the current GTX 1650 and RX 6500 XT share the same $200 price bracket. A price reduction of $20 to $30 or more for a GTX 1630 would certainly make sense.

However, a GPU like this makes little sense in a market where GPU prices plummet every month. If this GPU had appeared when AMD launched its RX 6500 XT, it would have made sense as a desperate attempt to get GPUs to gamers. But that situation is quickly fading now that many modern GPUs are available at MSRP.

Super-aggressive pricing is the only hope for a viable GTX 1630 in the current marketplace. The last remaining price bracket that lacks any modern GPU is the sub-$150 market, where the most modern GPUs you'll find are the GT 1030 and RX 550. If Nvidia can undercut the GT 1030's current price with a faster, more modern GPU, the GTX 1630 could become a massive success as a sub-$150 gaming card.

Aaron Klotz
Freelance News Writer

Aaron Klotz is a freelance writer for Tom’s Hardware US, covering news topics related to computer hardware such as CPUs and graphics cards.

  • InvalidError
    It is a sad day when companies can relaunch 4-5 years old entry-level products to "compete" with the competition's newest entry-level products. Shows just how uncompetitive the entry-level market has become.
  • Alvar "Miles" Udell
    Could be worse, could be nVidia reusing the 9800 alphabet soup for years while AMD flails.
  • CoD611
    If it's cheap enough, I honestly wouldn't mind having it as a backup GPU (since there's no integrated graphics on the Ryzen 5900X) and/or for extra display outputs.
  • King_V
    When they first mentioned the RX 6400 being released for the desktop, I idly wondered/speculated if Nvidia would bring out a sub-1650 GPU . . . not at the level where I'd say "I called it" but I did wonder.

    But given that the 6400 performs at about 1650 level, I didn't think there'd be much of a reason. Possible, but not definite.
  • InvalidError
    King_V said:
    When they first mentioned the RX 6400 being released for the desktop, I idly wondered/speculated if Nvidia would bring out a sub-1650 GPU . . . not at the level where I'd say "I called it" but I did wonder.
    Remember when Jensen used to say people would be stupid to buy non-RTX GPUs? If he was true to his words, Nvidia should be launching a GA107 RTX3030.
  • renz496
    This is no RX 6400 rival. Nvidia saw interest in the GT 1030 going up because of mining, and its decision to make a GTX 1630 probably stems from that. By the time the next mining wave arrives, Pascal will most likely already be at the end of its main driver support. Pascal is 6 years old at this point, and Nvidia typically supports its cards with main drivers for 8 years, so they simply need the replacement.
  • edzieba
    The reason for existence of "why would a consumer ever buy one of these at retail" cards is simple: they are for OEMs, and a retail launch with token or no marketing is nearly free money.
    e.g. OEM wants an Nvidia GPU to offer as an option in their drop-down system configurator alongside an AMD GPU. An AMD GPU (regardless of efficacy) is available for $x, so the OEM wants an Nvidia GPU for $x, and goes to Nvidia saying "we want a GPU for $x, we don't care about its performance, and we intend to buy 100,000 per month for the next 10 months". Nvidia thinks that x million dollars is x million dollars, and worth pumping out a tiny chip design - based on an existing chip design (using lowest binned almost-reject dies) slapped onto a lowest-possible-cost PCIe board - for a contractually guaranteed revenue stream and continued mindshare from that OEM's customers.
  • InvalidError
    edzieba said:
    Nvidia thinks that x million dollars is x million dollars, and worth pumping out a tiny chip design - based on an existing chip design (using lowest binned almost-reject dies) slapped onto a lowest-possible-cost PCIe board
    TU117, the smallest Turing die, is still comparably huge at 200sqmm next to Navi24's 107sqmm.
  • King_V
    InvalidError said:
    Remember when Jensen used to say people would be stupid to buy non-RTX GPUs? If he was true to his words, Nvidia should be launching a GA107 RTX3030.
    LOL, actually, I had forgotten about that particular line from him!

    So, I guess we're all on equal footing now, and Nvidia is as stupid as everyone else :LOL: