After every new graphics card is introduced, a flurry of clones from different manufacturers hits the market. The reason is simple: in order to be competitive, and to get products out at the same time as the other vendors, there is little room for customization. Regardless of the manufacturer, the first version of any new graphics card is almost always based closely on the reference design that either AMD or Nvidia provides. As vendors launch carbon copies of the PCB and coolers, there is little reason to compare one manufacturer’s version against the others, unless you’re looking at warranty and bundle details.
But after a while, some companies get creative, providing a truly unique spin on the product. Sometimes a unique cooler is used, sometimes the buyer is given more control over things like voltages and overclocks, and sometimes the entire PCB is redesigned. Reviewing these non-reference solutions is usually a much more interesting proposition, which brings us to our comparison here.
By now, the mid-range battle between the Radeon HD 4850 and the GeForce GTS 250 is no secret, and it’s common knowledge that either of these cards offers competitive and compelling performance at very reasonable prices. But how do the unique implementations of these products compare?

Enter the Gigabyte GV-N250ZL-1GI and Asus EAH4850 MT. We put these two cards through their paces to really see if either of these unique offerings can break the stalemate between the garden-variety Radeon HD 4850 and the vanilla GeForce GTS 250.
Let’s start alphabetically with Asus' card.
- Introduction
- The Asus EAH4850 MT: MT Stands For Matrix
- The Asus EAH4850 MT: Software
- The Gigabyte GV-N250ZL-1GI: ZL Stands For Zalman
- The Gigabyte GV-N250ZL-1GI: Software And Cooling
- Overclocking The Asus EAH4850 Matrix Using iTracker
- Overclocking the GV-N250ZL-1GI Using Gigabyte’s Gamer HUD Lite
- Test System Setup And Benchmarks
- Synthetic Benchmarks: 3DMark Vantage
- Game Benchmarks: Crysis
- Game Benchmarks: Left 4 Dead
- Game Benchmarks: Fallout 3
- Game Benchmarks: World in Conflict
- Game Benchmarks: Burnout Paradise: The Ultimate Box
- Power, Temperature, And Noise Benchmarks
- Conclusion
Hi rags_20 -
Actually, the appearance of the card in that picture is caused by barrel or pincushion distortion of the lens used to take the photo. The card itself isn't bent.
/ Tuan
Looks bad... and erratic. And it makes the forums/comments system more cluttered than it needs to be.
PS: You're not running the same benchmarks as Tom's, so your results aren't really comparable.
Yes, same game and engine, but in Crysis, for example, the frame rates are completely different from the start through to the snowy bit at the end.
PPS: Are you comparing your card to their card at the same resolution?
I've been looking for a comparison like this for several weeks. Thank you, although it didn't help me much with my decision. I also missed some discussion of PhysX, CUDA, DirectX 10/10.1, and Havok.
I would be very happy to read a review of the Gainward HD4850 Golden Sample "Goes Like Hell" with the faster GDDR5 memory. If it then CLEARLY takes the lead over the GTS 250 and gets even closer to the HD4870, my decision will be easy. Less heat, less consumption, and almost the same performance as a stock 4870. Enough for me.
BTW, the resolutions I'm most interested in are 1440x900 and 1680x1050 for a 20" monitor.
Thank you
No, it's classified as a C2Q. The E6600 is classified as a C2D.
Directly from the article on page 11:
Let’s move on to a game where we can crank up the eye candy, even at 1920x1200. At maximum detail, can we see any advantage to either card?
Nothing to see here, though given the results in our original GeForce GTS 250 review, this is likely a result of our Core 2 Quad processor holding back performance.
Clearly this is not an ideal setup for eliminating the processor's effect on the benchmark results of the two cards. Most games are not multithreaded, so the 2.4 GHz clock of the Q6600 will undoubtedly hold back a lot of games, since they will not be able to utilize all four cores.
To all,
Stop triple posting!
Later in the article you write,
Your math is wrong. A claim of a 20% overclock on the GV-N250ZL-1GI would equal 885.6 MHz. 10% of 738 MHz = 73.8 MHz, so a 10% overclock would equal 811.8 MHz. 815 MHz is nowhere near 20%. In fact, according to your numbers, the GV-N250ZL-1GI barely lives up to its 10% minimum capability.
No, what he is saying is this: Gigabyte claims that the extra copper in the PCB allows for a 10%-30% further increase compared to how much a standard card's speed can be raised by overclocking. So say a standard card OCs to 800 MHz, which is a 62 MHz increase; Gigabyte is claiming a 6.2 MHz (10%) to 18.6 MHz (30%) further increase on top of that. So "technically" a 20% further increase would put it at 812.4 MHz, and the 815 MHz he achieved works out to roughly a 24% further increase, comfortably within Gigabyte's claimed range.
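To make the two competing readings of Gigabyte's claim concrete, here is a quick sketch of the arithmetic. The clock figures (738 MHz stock, an assumed 800 MHz overclock for a reference card, and the 815 MHz achieved in the review) are taken from the comments above; the 800 MHz reference-card figure is an illustrative assumption, not a measured value.

```python
# Two readings of Gigabyte's "10%-30% better overclocking" claim,
# using the clock figures quoted in this thread.

STOCK = 738.0       # GTS 250 stock core clock, MHz
TYPICAL_OC = 800.0  # assumed overclock for a reference card, MHz (illustrative)
ACHIEVED = 815.0    # overclock reached in the review, MHz

# Reading 1: "10%-30% above the stock clock"
reading1_low = STOCK * 1.10    # 811.8 MHz
reading1_high = STOCK * 1.30   # 959.4 MHz

# Reading 2: "10%-30% more overclocking headroom than a reference card"
headroom = TYPICAL_OC - STOCK                  # 62 MHz
reading2_low = TYPICAL_OC + 0.10 * headroom    # 806.2 MHz
reading2_high = TYPICAL_OC + 0.30 * headroom   # 818.6 MHz

print(f"Reading 1 (above stock):    {reading1_low:.1f} - {reading1_high:.1f} MHz")
print(f"Reading 2 (extra headroom): {reading2_low:.1f} - {reading2_high:.1f} MHz")
print(f"Achieved:                   {ACHIEVED:.1f} MHz")
```

Under reading 1, 815 MHz barely clears the 10% floor; under reading 2, it lands comfortably inside the claimed 806.2-818.6 MHz band, which is the interpretation the comment above argues for.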
The GTS 250 is a 9800 GTX+, which is a 9800 GTX, which is also an 8800 GTS 512. So this three-year-old card is still running strong.
Also, Gigabyte's Ultra Durable serves two functions: overclocking and, obviously, durability. Yes, it will overclock better. But it also will probably never stop functioning.
From someone who's gone through numerous motherboards and graphics cards with minimal overclocking on either, that means a lot more than performance.
It's in the specs, but I should have stressed the point: I overclocked the Q6600 to 2.7 GHz, and it was plenty quick for these cards.
Not exactly. The 8800 GTS at least sported different clock speeds. I also believe it was on a larger die, if memory serves.
Is it? If so, please provide some proof of that statement as I haven't seen evidence of that.
You misunderstand Gigabyte's claim. As universalremonster points out, they're claiming a 10% increase in overclocks over other GTS 250s, not claiming that all of their cards will overclock 10% over stock clocks.