Radeon HD 4850 Vs. GeForce GTS 250: Non-Reference Battle

The Asus EAH4850 MT: MT Stands For Matrix

Yes, MT stands for Matrix. But what is Matrix supposed to mean? Based on the card’s unique capabilities, it stands for a redesign that allows better power handling, a great cooler, and, certainly the most compelling feature for overclockers, GPU voltage control. There is also Asus’ iTracker software, which gives the user a great deal of control and customization over how the card operates.

Let’s start with the PCB redesign. Components have been moved around compared to AMD’s reference design, likely to clean things up and accommodate the unique heatpipe cooler Asus uses on the EAH4850 MT. The card is about as long as most Radeon HD 4850 cards, with the PCB just under 9.5 inches. As far as tangible changes go, Asus has upgraded the card to four-phase power versus the reference design's two-phase system, which should allow for more efficient operation and more stable power delivery under load. Asus has also added hardware to enable its Super Hybrid Engine, a proprietary chip that works in conjunction with Asus’ iTracker utility to provide some of the card’s unique abilities.

It is noteworthy that the card still requires the same six-pin PCI Express (PCIe) power connector as a garden-variety Radeon HD 4850. Asus also made the decision to include only 512 MB on this special-edition Radeon HD 4850 card. While this likely keeps costs down, it’s not going to help the card in its battle with Gigabyte’s GV-N250ZL-1GI and its full gigabyte of RAM. Then again, it might not hurt it much either, as more RAM doesn't always equal more performance. We’ll see how this decision stacks up in the benchmarks.

Default clock speeds on our EAH4850 MT test sample are 625 MHz for the GPU and 1,986 MHz for the memory, matching the reference card. With the iTracker utility installed, users have access to a mild overclock profile that sets the GPU to 660 MHz, but much more radical settings are available for those who want to experiment with higher clocks.

At first glance, you would assume that the Asus 4850 Matrix sports the same two dual-link DVI outputs and analog video output as the reference card. However, the yellow DVI output is single-link only. This might be a little disappointing for those of you with two 30” 2560x1600 monitors, but the rest of us can breathe easy.

Physically, the most obvious improvement is the EAH4850 MT's heatpipe cooler. It’s a large cooling system that utilizes three heat pipes; most of it is made of aluminum, with a copper block covering the GPU. While the cooler occupies two slots (with a vent at the back), it doesn't use the rear vent in the traditional manner, since there is no closed fan shroud forcing hot air out the back. Instead, much of the heated air is free to vent out of the sides of the cooler and back into the case.

It might seem like a strange call to not channel heated air out of the back of the case, until you consider that the cooler is designed as a hybrid piece with a passive mode. If the heatpipe cooler is to work with the fan turned off (minimizing noise in 2D mode), then there has to be airflow around the cooler. A closed system built to channel air out the back of the case would make that virtually impossible, so the price of absolute silence is more heat in your case.

  • rags_20
    In the second picture of the 4850, the card can be seen bent due to the weight.
  • rags_20
    The Gigabyte would be more effective with 2 fans.
  • tuannguyen
    rags_20: In the second picture of the 4850, the card can be seen bent due to the weight.
    Hi rags_20 -

    Actually, the appearance of the card in that picture is caused by barrel or pincushion distortion of the lens used to take the photo. The card itself isn't bent.

    / Tuan
  • jebusv20
    demonhorde665... try not to triple post.
    It looks bad and erratic, and makes the forums/comments system
    more cluttered than it needs to be.

    ps. You're not running the same benchmarks as Tom's, so you're not really comparable.
    Yes, same game and engine, but in Crysis, for example, the frame rates are completely different from the start through to the snowy bit at the end.

    pps. Are you comparing your card to their card at the same resolution?
  • alexcuria

    I've been looking for a comparison like this for several weeks. Thank you, although it didn't help me much in my decision. I also missed some comments regarding the PhysX, CUDA, DirectX 10/10.1, and Havok discussion.

    I would be very happy to read a review of the Gainward HD4850 Golden Sample "Goes Like Hell" with the faster GDDR5 memory. If it then CLEARLY takes the lead over the GTS 250 and gets even closer to the HD4870, then my decision will be easy. Less heat, less consumption, and almost the same performance as a stock 4870. Enough for me.

    btw. Resolutions I'm most interested in: 1440x900 and 1680x1050 for a 20" monitor.

    Thank you
  • spanner_razor
    Under the test setup section the cpu is listed as core 2 duo q6600, should it not be listed as a quad? Feel free to delete this comment if it is wrong or when you fix the erratum.
  • KyleSTL
    Why a Q6600/750i setup? That is certainly less than ideal. A Q9550/P45 or 920/X58 would have been a better choice in my opinion (and may have exhibited a greater difference between the cards).
  • B-Unit
    zipzoomflyhigh: and no the Q6600 is classified as a C2D. It's two E6600's crammed on one die.
    No, it's classified as a C2Q. The E6600 is classified as a C2D.
  • KyleSTL

    Directly from the article on page 11:
    Game Benchmarks: Left 4 Dead
    Let’s move on to a game where we can crank up the eye candy, even at 1920x1200. At maximum detail, can we see any advantage to either card?

    Nothing to see here, though given the results in our original GeForce GTS 250 review, this is likely a result of our Core 2 Quad processor holding back performance.
    Clearly this is not an ideal setup for eliminating the processor's effect on the benchmark results of the two cards. Most games are not multithreaded, so the 2.4 GHz clock of the Q6600 will undoubtedly hold back a lot of games, since they will not be able to utilize all four cores.

    To all,

    Stop triple posting!

  • weakerthans4
    The default clock speeds for the Gigabyte GV-N250ZL-1GI are 738 MHz on the GPU, 1,836 MHz on the shaders, and 2,200 MHz on the memory. Once again, these are exactly the same as the reference GeForce GTS 250 speeds.

    Later in the article you write,
    For the sake of argument, let’s say most cards can make it to 800 MHz, which is a 62 MHz overclock. So, for Gigabyte’s claim of a 10% overclocking increase, we’ll say that most GV-N250ZL-1GI cards should be able to get to at least 806.2 MHz on the GPU. Hey, let’s round it up to 807 MHz to keep things clean. Did the GV-N250ZL-1GI beat the spread? It sure did. With absolutely no modifications except to raw clock speeds, our sample GV-N250ZL-1GI made it to 815 MHz rock-solid stable. That’s a 20% increase over an "expected" overclock according to our unscientific calculation.

    Your math is wrong. A claim of a 20% overclock on the GV-N250ZL-1GI would equal 885.6 MHz. 10% of 738 MHz = 73.8 MHz, so a 10% overclock would equal 811.8 MHz. 815 MHz is nowhere near 20%. In fact, according to your numbers, the GV-N250ZL-1GI barely lives up to its 10% minimum capability.
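    The disagreement above hinges on which baseline the percentages are measured against. Here is a minimal sketch of both readings, assuming the article's own hypothetical 800 MHz "typical" overclock figure; the numbers are from the article and comment, not vendor specifications:

    ```python
    # Two readings of Gigabyte's "10% overclocking increase" claim for the
    # GV-N250ZL-1GI. The 800 MHz "typical" overclock is the article's own
    # assumption, used here only for illustration.

    stock = 738.0        # MHz, stock GPU clock
    typical_oc = 800.0   # MHz, article's assumed typical GTS 250 overclock
    achieved = 815.0     # MHz, clock the review sample reached

    # Reading 1 (the article's): 10% more *headroom* on top of a typical overclock.
    headroom = typical_oc - stock                 # 62 MHz
    expected_article = stock + 1.10 * headroom    # about 806.2 MHz

    # Reading 2 (the commenter's): 10% above the *stock* clock.
    expected_comment = 1.10 * stock               # about 811.8 MHz

    # How the achieved 815 MHz looks under each baseline:
    gain_vs_headroom = (achieved - stock) / headroom - 1.0   # extra headroom fraction
    gain_vs_stock = achieved / stock - 1.0                   # fraction over stock

    print(f"{expected_article=:.1f} {expected_comment=:.1f}")
    print(f"{gain_vs_headroom=:.3f} {gain_vs_stock=:.3f}")
    ```

    Under the headroom reading, 815 MHz is roughly 24% more overclock than "expected"; against the stock clock it is only about 10.4%, which is why the two sides reach such different conclusions from the same numbers.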