
GeForce GTS 250: Nvidia's G92 Strikes Again

Last year was full of ups and downs in the graphics market. First, Nvidia unveiled its GT200 graphics processor and a pair of boards centered on that chip. They wiped the floor with everything else out there, which wasn't exactly difficult given that AMD's mid-range Radeon HD 3800-series had already been trumped.

Then AMD pulled a rabbit out of its hat, launching the RV770 GPU and two boards based on that piece of silicon. The fastest Radeon HD 4870 wasn't quite quick enough to best the fastest Nvidia chip, but it was fast enough that everyone knew the dual-processor Radeon HD 4870 X2, which AMD had pre-announced during the launch, would put the underdog on top.

Since then, AMD has been busy populating its lineup with mainstream and entry-level boards based on derivative architectures. The Radeon HD 4830 has turned into its least-expensive performance offering. The Radeon HD 4670 and 4650 form the meat of AMD's mid-range lineup. And the Radeon HD 4500-/4300-series boards make up the entry level.

Nvidia has responded to AMD's challenge in a number of ways. At the high end, it launched its own dual-GPU card, the GeForce GTX 295. In the middle of its performance line, a less-handicapped GeForce GTX 260 with 216 shader processors gets the jump on AMD's Radeon HD 4850 (and indeed on the 512 MB Radeon HD 4870, as you'll see in the benchmarks here). And a 55 nm die shrink of the GT200 yields the company's latest board, the GeForce GTX 285.

Of course, then there's Nvidia's emphasis on its value-adds: CUDA, PhysX, and 3D Vision, all enabled through the company's software drivers. While we'd consider the trio of technologies to be in the early stages of mainstream adoption, they're all still advantages, technically speaking. AMD is working out the kinks in its Stream video encoder, doesn't offer any sort of physics acceleration, and has been oddly quiet about its partnership with 3D monitor-maker iZ3D, which, as we revealed at this year's CES, gives you the same experience on AMD or Nvidia graphics hardware.

In Need Of A Mainstream Answer

While Nvidia would seem to have all of its bases covered, we have to imagine that the massive 55 nm GT200 GPU is still far too large (read: expensive) to work into a card any cheaper than the GeForce GTX 260, leaving the company without a suitable successor to the aging G92, a chip that's nearly a year and a half old.

Fortunately for Nvidia, that relatively geriatric architecture was designed and executed well enough to carry over from a 65 nm process down to 55 nm. Even today, it's able to do more than just compete against the RV770-based lineup from AMD—a fact proven by today's GeForce GTS 250 launch.

But while the new board's name might sound like something new wedged in between the GTX 260/285 and the older GeForce 9800-series boards, the truth of the matter is that it's the G92 reborn. More specifically, it's the GeForce 9800 GTX+, a die-shrunk version of the GeForce 9800 GTX, which was itself a slightly-overclocked re-introduction of the GeForce 8800 GTS.

Comments
  • 10
    Anonymous, March 3, 2009 6:17 AM
    I wonder where the 4850 and 4870 would stand with a 1 GB frame buffer.
  • 9
    thepinkpanther, March 3, 2009 6:53 AM
    When the GTX 4xx series arrives, I guess Nvidia will launch the G92 refresh yet again, this time as an entry-level graphics card.
  • 3
    xx12amanxx, March 3, 2009 7:00 AM
    Hmm, no mention of the slower models Nvidia is going to push instead of these cherry-picked OCed models. I heard these OCed models were just for reviewers and that most of these cards will actually be slower models with even less performance.
  • 18
    johnbilicki, March 3, 2009 7:16 AM
    "so long as performance goes up or sideways as price goes down, we don’t see an issue with the reintroduction of proven technology"

    ...which (in the context it has been applied) is the same as saying we don't mind nVidia renaming an 8800GT to a 9800GT and then a 9800GT to a whatever 2xx series...and so on and so forth. My point is simple: nVidia is pulling an extremely sleazy marketing scheme on consumers by renaming existing models. If you goof admit it and get on with life; that's why I appreciated the fact that when the first generation of Phenoms were botched AMD gracefully renamed unaffected quads with a 50 (IE 9650 instead of 9600). Trying to remember all the different names of the exact same model is like dealing with someone who IM's you from five different screen names, eventually you just end up blocking them out.
  • 0
    Anonymous, March 3, 2009 7:20 AM
    Good review, but I missed the noise and heat comparison.
  • 8
    cangelini, March 3, 2009 7:31 AM
    xx12amanxx: Hmm, no mention of the slower models Nvidia is going to push instead of these cherry-picked OCed models. I heard these OCed models were just for reviewers and that most of these cards will actually be slower models with even less performance.

    Cherry picked? It's a retail product.
  • 13
    curnel_D, March 3, 2009 8:13 AM
    Chris, it's a decent article, but why in the world would you use 512 MB models in everything aside from the 250 and 260? If you had shown the 1 GB 4870 along with a 1 GB 9800+, it would have given a clearer picture of how the 250 is identical to the 9800+/9800/8800GT.

    Meh.

    And there are MASSIVE rumours, even confirmed by HardOCP, saying that Nvidia is hand-picking the review models sent to reviewers. Addressing that in this article would have been great.
  • -8
    sohei, March 3, 2009 10:24 AM
    I think Nvidia wants to marry us to this card... but love by force is not possible. We need a new "woman" from Nvidia, not the same one in different clothes. Nvidia has enough experience with clothes; they should enter the fashion business, like Microsoft.
  • 2
    vaskodogama, March 3, 2009 10:34 AM
    Huh, anyway, I don't like the naming of the GT200 cards! AMD's got a better price and a better naming scheme!

    thepinkpanther: When the GTX 4xx series arrives, I guess Nvidia will launch the G92 refresh yet again, this time as an entry-level graphics card.

    I agree!
  • 10
    nerrawg, March 3, 2009 11:01 AM
    The conclusion in this article finally gets to the point, after having compared OCed cards against vanilla ones. Good article, yet "The Real Story" might be missing a few more valid points.

    1. All of the AMD 4800 cards can be easily overclocked, especially the cheap 4830, which often OCs over 700 MHz on its GPU clock. This will affect the value evaluation, because the 9800+/250 is going to have to OC pretty well to match it bang for buck, and seeing as the tested cards are already OCed, I really wonder if it has that headroom.

    2. 4850s and particularly 4870s come in much hotter versions than the vanilla flavors (ex. Sapphire Toxic, etc.). The prices of these models will be important to consider.

    3. The G92 architecture is, from what I have seen, sketchy performance-wise in SLI compared to the 4800 series in CrossFire. I am not sure of this, but I would be cautious about using a G92 card if you were planning on a multi-card setup, at least based on the tests I have seen. It would be interesting to see direct tests between a GTS 250 SLI setup and a 4830/4850 CF setup. I'd put my money on the CF solution, and I'd love to be proven wrong for Nvidia's sake.
  • -4
    Proximon, March 3, 2009 11:34 AM
    Well, caveat emptor. If the average consumer can't be bothered to google up a benchmark and just assumes a new name equals a better GPU, then they get what they deserve.

    A dual-slot-cooled video card that is just slower than a 4850 could be a good thing if they work the price low enough.
  • 2
    trinix, March 3, 2009 11:50 AM
    Not everyone is a tech person; they'll ask friends and family about performance, and the tech person in the family might prefer the Nvidia or the ATI card and recommend that one over the better one.

    Also, the knowledge at shops isn't always better. I've seen people behind the counter who don't know the difference between DDR1 and DDR2 memory and will just tell you they don't have it.

    Rebranding is evil, but if that's the way Nvidia can keep making money and stay alive, I'd rather have that than the solo reign of ATI.
  • 12
    curnel_D, March 3, 2009 11:59 AM
    Proximon: Well, caveat emptor. If the average consumer can't be bothered to google up a benchmark and just assumes a new name equals a better GPU, then they get what they deserve. A dual-slot-cooled video card that is just slower than a 4850 could be a good thing if they work the price low enough.

    Yes, I agree with that totally if we're talking about the demographic these forums target. But that's absolutely absurd if you count everyone.

    The "average joe" is usually a hobby gamer who has a full-time job, if not two, a wife, kids, generally lower pay compared to the white-collar IT job market, and just doesn't have the time for all of the 'homework'. And even then, a lot of people still wouldn't know what those benchmarks mean, or even where to find them on Google, if they know what the word benchmark means at all.

    It'd be the same as if Ford released a new Mustang called the Mustang GTX250 that was, in reality, identical to the Mustang GT with a different name and better tires. Ford would catch all kinds of hell for it, which is exactly why they don't do it.

    But Nvidia apparently thinks it's above the average consumer, and hopes to get one up on them to get rid of all of its oversupplied chips.

    Don't sell yourself short, but don't give these companies credibility for doing what they're doing. Nvidia has been very anti-consumer lately, and they shouldn't be given any excuse for it.
  • 3
    jeverson, March 3, 2009 12:03 PM
    I'm just curious... when nVidia launched the 9800+, you were able to SLI it with the regular 9800 series. Does this mean you will be able to SLI GTS 250 cards with either a 9800 or a 9800+? It would be nice to know.
  • 5
    Pei-chen, March 3, 2009 12:30 PM
    Chris, you should really consider sending Kevin, Tuan, and Jane to training. Kevin and Tuan can't keep facts straight, and Jane is simply blogging.

    Using the GTS 250 as an example, Kevin and Tuan reported that the 250 uses a 512-bit memory bus. I almost went over to Anand to check the spec before clicking on this article.
  • 3
    nerrawg, March 3, 2009 12:43 PM
    http://www.tomshardware.com/news/Nvidia-GTS250-Twintech,7150.html

    As stated above, I found it at the bottom of the article: a 512-bit bus is listed as the only difference between the 9800+ and the 250. But it's not in this article; I trust Chris on this one.

  • -2
    Pei-chen, March 3, 2009 1:08 PM
    xx12amanxx: Hmm, no mention of the slower models Nvidia is going to push instead of these cherry-picked OCed models. I heard these OCed models were just for reviewers and that most of these cards will actually be slower models with even less performance.

    Curnel_D: ...And there are MASSIVE rumours, even confirmed by HardOCP, saying that Nvidia is hand-picking the review models sent to reviewers. Addressing that in this article would have been great.

    Are you two idiots? The GTS 250 is the same as the 9800 GTX+. If Nvidia can sell the retail 9800+ at 738 MHz, why would they need to cherry-pick GTS 250s at the same clock?

    If every GTS 250 were running at 850 MHz vs. 738 MHz on the 9800+, you could say Nvidia is binning better chips for the 250, but they are running at the same speed.
  • 0
    curnel_D, March 3, 2009 1:10 PM
    Pei-chen: Chris, you should really consider sending Kevin, Tuan, and Jane to training. Kevin and Tuan can't keep facts straight, and Jane is simply blogging. Using the GTS 250 as an example, Kevin and Tuan reported that the 250 uses a 512-bit memory bus. I almost went over to Anand to check the spec before clicking on this article.

    I actually prefer Jane over both Kevin and Tuan. She might be blogging, but most of the time it's interesting, and never misleading or downright untrue.

    Both Kevin and Tuan are total morons, IMO. Are they college kids doing a practicum or something? Because there's no way they actually have any journalism credibility. They're even the laughingstock of other forums on a consistent basis.