Mainstream Graphics Card Roundup

Zotac GTS 250 1 GB (GeForce GTS 250 1,024 MB)


Zotac sent us a GeForce GTS 250 with 1,024 MB of graphics RAM and standard clock rates for testing. Above all, the extra RAM makes itself felt in Far Cry 2: at 1920 x 1200 with 8X AA, frame rates double compared with a standard 9800 GTX+ carrying 512 MB. At 4X AA you’ll see an increase of 10 FPS over the reference card, although there isn’t much difference in the other benchmarks.

The circuit board is large at 10.62" (27 cm). It uses an eight-pin PCIe power connector, so a six- to eight-pin adapter is included in the retail box. The graphics chip supports DirectX 10 with Shader Model 4.0. In terms of overall performance, this card falls squarely in the middle of the pack. Those who want to game cheaply on 1680x1050 monitors should look for G92 chips, which may be found in the GeForce 8800 GTS 512, the GeForce 9800 GTX, and now the GeForce GTS 250.

Zotac adopts the double-slot Nvidia reference cooler for this card. Because the GPU is fabricated using 55 nm technology, cooling performance is very good. In a 2D environment, idle temperatures measure around 44° C (111.2° F); under heavy load, those readings climb to 77° C (170.6° F). But noise output leaves something to be desired. While 39.6 dB(A) at 2D idle is acceptable, the 50.2 dB(A) we measured under heavy load is louder than most G92 cards. Even the 65 nm GeForce 8800 GTS 512 MB and GeForce 9800 GTX models generate noise levels under 45 dB(A).

The retail box includes the aforementioned six- to eight-pin power adapter, plus a DVI-to-HDMI adapter; an internal SPDIF cable handles the sound. You’ll also find optical media with 3DMark Vantage Advanced Edition and the game XIII Century: Death of Glory.

  • Comments
  • ColMirage
    Great article! Good to see a large variety of old and new.
    0
  • Anonymous
    Why do you keep on including The Last Remnant test when it's obvious that there is a problem with the ATI cards? The overall results are therefore biased, and it's unfair to ATI and to the folks who jump directly to the conclusion.

    Also, when you say *quote* "DirectX 10 crashed at 8x AA and the game and screen went black. Switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher" *unquote* for HAWX, didn't you mean "ATI cards were a lot faster than Nvidia ones using DirectX 10 thanks to DirectX 10.1, and that was unacceptable. Hence the switch to DirectX 9 instead, and the game works at 8x AA and offers frame rates up to 50% higher for Nvidia, and ATI is fcked again, close one guys"?

    I am not an ATI fanboy, but I think TH has got its tongue stuck up a juicy green @ss.
    34
  • cinergy
    Tino is putting up another big GeForce ad. No mention of the recent HUGE Radeon price cuts (e.g., the Radeon HD 4890 goes for $199.99 with a $10 mail-in rebate on Newegg, and the 4850 should go for $99). And HAWX is again benchmarked without the DX10.1 setting because of the crappy results for ATI. And not even a mention that such technology exists in the game!
    28
  • NuclearShadow
    I think it's highly unfair that you would put The Last Remnant in as a benchmark. The game simply hates ATI cards, and if you included that game when it came to making a conclusion, then I think you're intentionally being biased.

    Also, I'm not sure why you're holding the 260 up as the best choice. The Zotac one you even picture is priced at $175 at Newegg, while the HIS 4850 1GB is like $115 at Newegg. Sure, the 260 outperforms it, but when you take that price difference and look at the performance, the 4850 1GB is certainly attractive. The 250 1GB's lowest price on Newegg is $140 and Zotac's costs $154.99, and if you compared the 4850 1GB to it using your own charts, you would see that the major killer, the sum of FPS, is largely affected by The Last Remnant.

    Speaking of Zotac, I noticed that for some reason whenever they are mentioned they get a major ass kissing. While they make good products, it's clear that there is a bias here. You even picture the Zotac 260 and even gave it the ability to be selected on your own little comparison charts, and look what you get when you compare it to a normal 260 216sp: http://www.tomshardware.com/charts/gaming-graphics-charts-2009/compare,1173.html?prod[2447]=on&prod[2463]=on The exact same results, and for some reason you deemed it necessary to list it individually as if it were special.

    Next time, how about giving a real conclusion instead of an advertisement? Comparing a $115 card to a $175 one and pushing Zotac down our throats makes it damn obvious what you're doing.
    35
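
NuclearShadow's point about the summed-FPS chart is easy to demonstrate: a total-FPS aggregate lets one outlier title decide the ranking. A minimal sketch in Python, using invented frame rates rather than the article's actual chart data:

```python
# Hypothetical per-game FPS for two cards. The numbers are invented for
# illustration only; they are not taken from the article's charts.
games  = ["Far Cry 2", "Left 4 Dead", "HAWX", "The Last Remnant"]
card_a = [52, 88, 60, 70]   # stand-in for a GeForce-style result
card_b = [54, 92, 63, 25]   # stand-in for a Radeon that tanks in one title

# Summing FPS across games lets the single outlier dominate the total.
print(sum(card_a), "vs", sum(card_b))            # 270 vs 234: A leads by ~15%

# Drop the one problem title and the ranking flips.
print(sum(card_a[:-1]), "vs", sum(card_b[:-1]))  # 200 vs 209: B leads
```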
  • scrumworks
    Wow! Tom's just can't let Nvidia go. ATI clearly has the price-performance advantage now. No Last Remnant benches are gonna change it.
    28
  • Summer Leigh Castle
    I'm not an expert, but the article felt like Tom's was trying too hard... just a "little" bias here.
    21
  • d0gr0ck
    That's o
    -13
  • da bahstid
    Tino must have missed that whole thing a few months back where educated readers decided they weren't going to tolerate such ridiculously biased conclusions. There's only a $5 difference in price between the HIS 4870 and the Zotac 260 on Newegg as I write this (nothing like Tino's claim that the Zotac has a >$35 advantage), and the performance of the two came within 0.5% of each other...DESPITE two extremely pro-Nvidia slanted tests (Last Remnant and HAWX).

    I actually encourage keeping the Last Remnant test, because ATI shouldn't get breaks for poor drivers (or inadequate collaboration with developers), but by that same token, if Nvidia loses out on lack of 10.1 support, that result absolutely needs to be included. TH was actually starting to look credible again, and it must have taken you guys months of seriously attentive work and comprehensive benchmarking to regain that...what in the world are you guys thinking, starting up this tripe again?
    27
  • Ramar
    I agree, The Last Remnant is stupid. Everyone with an ATI card knows they probably can't play it. Funny, considering it was developed for a console with an ATI chip.

    Let's do some simple math here to prove whether ATI really has the price-to-performance advantage.

    I'll use Far Cry 2 because I think it's a very fair measure of DirectX 10 power, "TWIMTBP" be damned.

    Top range, GTX 295 vs. 4870 X2: there's a performance difference on par with their respective prices, especially at the highest resolution and AA/AF setting.

    Upper mid-range, 4890 vs. GTX 275: again, the performance percentage is very close to the twenty-dollar difference between the cards, and exceeded on Nvidia's side at the highest resolution.

    High mid-range, 4870 vs. GTX 260 216: even in Left 4 Dead, a Source engine game favoring ATI, the 260 comes out on par. This is a tie, really. But don't kid yourself into thinking the 4850 is any kind of match for the 260.

    Mid-range, 4850 vs...well, if you take a GTS 250, they're very evenly matched. If you REALLY want a 9800 GT for the same price, well, sucks to have an IQ of 50.

    Also factor in the growing use of PhysX, and ATI doesn't make a very compelling argument. Prices are matched frustratingly well, and the only real "killer deal" is a 4850 X2 for slightly over $200.

    Know that I like ATI and I'm not saying they're bad cards; I'm just saying they're only on par with Nvidia's offerings, not above them.

    Just remember that DirectX 11 will be in full swing in around six months, and none of this will matter anyway.
    3
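
For anyone who wants to redo Ramar's math with current prices, the calculation is just average FPS divided by street price. A minimal sketch, using the Newegg prices quoted in this thread and invented average frame rates (so the output illustrates the method, not a verdict):

```python
# FPS per dollar for the cards discussed in this thread. Prices are the
# Newegg figures quoted above; the average FPS values are invented
# placeholders, not benchmark results.
cards = {
    "HIS 4850 1GB":  {"price": 115.00, "avg_fps": 55.0},
    "GTS 250 1GB":   {"price": 140.00, "avg_fps": 58.0},
    "Zotac GTX 260": {"price": 175.00, "avg_fps": 68.0},
}

for name, data in cards.items():
    ratio = data["avg_fps"] / data["price"]
    print(f"{name:14s} {ratio:.3f} FPS per dollar")
```

By this yardstick a cheaper card can win on value even while losing every individual benchmark, which is exactly the trade-off the thread is arguing about.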
  • d0gr0ck
    I don't know what you did to get those TLR benches on the ATI cards. On a single HD 4870 (512 MB, reference style) card, I easily got a playable 60 FPS at 1680x1050 at high settings + medium shadows. Once I upgraded to CrossFire, the frame rate blows past 60 FPS on all high settings. Slowdown only occurs when the game effects churn out an inordinate amount of lighting/shadow effects. I use an old X38/E8500 combo to game on, so by all means you should be getting better results than I do.
    11
  • JeanLuc
    Who in all honesty plays The Last Remnant? It's hardly the biggest and best PC game on the market right now, and it just makes me wonder why Tom's Hardware would pick such an obscure console port to benchmark video cards on.

    Once again, TH tests HAWX on DX10, not on DX10.1. Can someone please explain why you wouldn't test a game using all the available technology? The excuse that 'it's not fair to compare a video card that can only utilise DX10 against video cards that can use 10.1' is unacceptable. If you own an ATI card or plan on buying one, you should be told clearly that the Radeon card has the advantage, as anyone who has DX10 can get DX10.1.
    23
  • d0gr0ck
    JeanLuc: "It's hardly the biggest and best PC game on the market right now, and it just makes me wonder why Tom's Hardware would pick such an obscure console port to benchmark video cards on."

    It is one of the more recent games to use Unreal Engine 3 on the PC, so it's probably serving the place of what used to be held by Unreal Tournament 3 in the benches. Just my guess.
    -2
  • chyll2
    Another Nvidia ad.
    16
  • neiroatopelcc
    Page 15 says, "Current Nvidia graphics chips usually produce a CPU score between 6300 and 6400, whereas Nvidia chips come in around 6600." A bit much Nvidia here, perhaps! You might want to make that last one an ATI name.
    14
  • gkay09
    I have been reading good articles here, but this one certainly is a letdown...
    We all know for a fact that even Nvidia has acknowledged that DirectX 10.1 has a lot more to offer than DirectX 10, and the newer mobile GPUs from Nvidia are DirectX 10.1 compliant...
    http://www.xbitlabs.com/news/video/display/20090616133250_Nvidia_s_New_Graphics_Chips_Support_DirectX_10_1.html

    And there are XFX 4870 1GB cards for $150 on Newegg right now, and their performance is very much comparable with the GTX 260... So for about $150, would you still suggest the GTS 250 1GB?
    15
  • Anonymous
    Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias Bias .........
    8
  • Ramar
    gkay09: "Have been reading good articles here, but this one certainly is a letdown... We all know for a fact that even Nvidia has acknowledged that DirectX 10.1 has a lot more to offer than DirectX 10, and the newer mobile GPUs from Nvidia are DirectX 10.1 compliant... http://www.xbitlabs.com/news/video [...] _10_1.html And there are XFX 4870 1GB cards for $150 on Newegg right now, and their performance is very much comparable with the GTX 260... So for about $150, would you still suggest the GTS 250 1GB?"

    DirectX 10.1 makes an almost unnoticeable difference, and Nvidia is simply smart enough to wait for DX11. It's only on those mobile processors because it's a selling point in a very hostile environment.

    I just checked Newegg, and the difference between buying the cheapest 4870 1GB and the cheapest GTX 260 216 is ten dollars, with a ten-dollar discount on the 4870. They are almost identically performing cards, but I'd personally rather spend the extra ten dollars on PhysX than on DX10.1.
    -12
  • rhys216
    Lol! I'm glad it's not just me!
    I'm hoping Tino is simply biased and unable to put together an honest comparison, rather than Tom's Hardware being in the pocket of Intel & Nvidia!
    All I know is something stinks, as you don't get it this wrong by accident.
    13
  • rhys216
    @ Ramar!
    DX10.1 was originally going to be DX10.
    Nvidia couldn't cut the mustard in time, and M$ watered down DX10 to help out Nvidia!
    It had nothing to do with Nvidia being smart; in fact, it's the complete opposite, lol!

    Perhaps you haven't had enough sleep? How about you go back to bed and cosy on up with Tino some more?
    18
  • neiroatopelcc
    Ramar: "DirectX 10.1 makes an almost unnoticeable difference, and Nvidia is simply smart enough to wait for DX11. It's only on those mobile processors because it's a selling point in a very hostile environment. I just checked Newegg, and the difference between buying the cheapest 4870 1GB and the cheapest GTX 260 216 is ten dollars, with a ten-dollar discount on the 4870. They are almost identically performing cards, but I'd personally rather spend the extra ten dollars on PhysX than on DX10.1."

    I've got a 4870, and I can honestly say 10.1 makes NO DIFFERENCE in the real world. But then PhysX doesn't either (unless you play that Mirror's Edge thing - which is EA, thus you don't). Buy what is cheapest.

    rhys216: "Lol! I'm glad it's not just me! I'm hoping Tino is simply biased and unable to put together an honest comparison, rather than Tom's Hardware being in the pocket of Intel & Nvidia! All I know is something stinks, as you don't get it this wrong by accident."

    I don't necessarily agree with you. They did include one game that runs exceptionally poorly on ATI hardware, and perhaps should've excluded it from the combined results, but apart from that it might well be honest.
    I believe Tino is from the German office, right? And according to idealo.de (a German price-comparison database), the cheapest reasonable 1 GB card is a 4830 at €87. The cheapest GTX 260 costs €2 less than a 4870 1GB and performs slightly better on average. It's not like the 4870 is really poor or anything, but the 260 simply is a bit better.
    That said, I've just recommended that a colleague of mine buy a 4670 for her kids' computer, and I expect to buy two 4770s for my parents' system.
    Anyway, why do you accuse them of being paid by Intel? Intel wasn't even mentioned, except for the system they were testing on. And given that X58 is THE ONLY choice for both CrossFire and SLI at once, that's a no-brainer.
    5