
The Asus EAH4850 MT: Software

Radeon HD 4850 Vs. GeForce GTS 250: Non-Reference Battle
As far as software bundling goes, the EAH4850 MT sample we tested did not arrive with all of the retail trimmings. Promotional shots of the bundle that will ship with the card show a video cable, a Molex-to-PCIe adapter cable, a DVI-to-HDMI dongle, a DVI-to-VGA dongle, and a CrossFire connector. It looks like Asus will not include any games with the EAH4850 MT, but there will be a couple of CDs with drivers and proprietary utilities for the card.

Speaking of software utilities, Asus has made a little program for the Matrix card series that it calls "iTracker." This little application is really the heart and soul of what sets the Asus 4850 Matrix apart from the pack.

The iTracker app offers fantastic utility, assuming you can wrap your head around its confusing user interface. I can’t understand why any software developer would assume users should automatically know to click a tiny (and almost invisible) grey triangle to access the most important settings. Maybe it’s just me and I was having an off day, but I will admit that only after I made it over that hump did things get a lot easier.

The iTracker has three main categories: Profile, Information, and Configuration. Of the three, Profile is the most important and powerful, as it lets you enable a pre-defined profile covering 2D and 3D GPU and memory clock speeds, GPU and memory voltages, and fan modes, including four tiers of cooling-fan speeds that kick in at different GPU temperatures. Of course, the most interesting profile is "User Defined,” which gives you direct control over these goodies and the flexibility of defining up to three unique sub-profiles.
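The four-tier fan behavior described above amounts to a simple temperature-to-duty-cycle lookup. Here's a minimal sketch of that idea; the thresholds and fan speeds below are hypothetical examples, since the article doesn't list iTracker's actual values:

```python
# Hypothetical four-tier fan profile: each tier pairs a GPU temperature
# threshold (deg C) with a fan duty cycle (%). These numbers are made-up
# examples, not iTracker's real defaults.
TIERS = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_speed(temp_c, tiers=TIERS):
    """Return the duty cycle of the first tier whose temperature
    threshold the GPU has not yet exceeded."""
    for threshold, duty in tiers:
        if temp_c <= threshold:
            return duty
    return 100  # past the hottest tier, run the fan flat out
```

A "User Defined" profile, in these terms, would simply let you edit the tier table yourself.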

The Information category is interesting: it displays everything from basic clock speeds and temperatures to fan speeds, voltages, and even power draw. Frankly, I wish this program worked with every video card out there.

The Configuration category is where you can set alarms if temperatures, voltages, or even fan speed goes into territory with which you’re not comfortable. It also shows information about the card, such as which BIOS it’s using. Good stuff.

Of course, the real fun with iTracker comes when you overclock with it. But we’ll dig into overclocking a little later, so for now let’s take a closer look at the Gigabyte GV-N250ZL-1GI.

Comments
  • 0 Hide
    rags_20 , April 20, 2009 7:02 AM
    In the second picture of the 4850, the card can be seen bent due to the weight.
  • 0 Hide
    rags_20 , April 20, 2009 7:44 AM
    The Gigabyte would be more effective with 2 fans.
  • 3 Hide
    tuannguyen , April 20, 2009 8:02 AM
    rags_20: In the second picture of the 4850, the card can be seen bent due to the weight.


    Hi rags_20 -

    Actually, the appearance of the card in that picture is caused by barrel or pincushion distortion of the lens used to take the photo. The card itself isn't bent.

    / Tuan
  • 5 Hide
    jebusv20 , April 20, 2009 11:00 AM
    demonhorde665... try not to triple post.
    It looks bad and erratic, and makes the forum/comments system more cluttered than it need be.

    PS: you're not running the same benchmarks as Tom's, so your results aren't really comparable.
    Yes, same game and engine, but in Crysis, for example, the frame rates are completely different from the start through to the snowy bit at the end.

    PPS: are you comparing your card to their card at the same resolution?
  • 2 Hide
    alexcuria , April 20, 2009 11:37 AM
    Hi,

    I've been looking for a comparison like this for several weeks. Thank you, although it didn't help me much with my decision. I also missed some discussion of PhysX, CUDA, DirectX 10 and 10.1, and Havok.

    I would be very happy to read a review of the Gainward HD4850 Golden Sample "Goes Like Hell" with the faster GDDR5 memory. If it then CLEARLY takes the lead over the GTS 250 and gets even closer to the HD4870, my decision will be easy. Less heat, less power consumption, and almost the same performance as a stock 4870. Enough for me.

    btw. Resolutions I'm most interested in: 1440x900 and 1680x1050 for a 20" monitor.

    Thank you
  • -1 Hide
    spanner_razor , April 20, 2009 1:55 PM
    Under the test setup section, the CPU is listed as a Core 2 Duo Q6600; shouldn't it be listed as a quad? Feel free to delete this comment if it's wrong or when you fix the erratum.
  • 4 Hide
    KyleSTL , April 20, 2009 3:11 PM
    Why a Q6600/750i setup? That is certainly less than ideal. A Q9550/P45 or 920/X58 would have been a better choice in my opinion (and may have exhibited a greater difference between the cards).
  • 6 Hide
    B-Unit , April 20, 2009 3:21 PM
    zipzoomflyhigh: and no the Q6600 is classified as a C2D. Its two E6600's crammed on one die.


    No, it's classified as a C2Q. The E6600 is classified as a C2D.
  • 3 Hide
    KyleSTL , April 20, 2009 3:34 PM
    ZZFhigh,

    Directly from the article on page 11:
    Quote:
    Game Benchmarks: Left 4 Dead
    Let’s move on to a game where we can crank up the eye candy, even at 1920x1200. At maximum detail, can we see any advantage to either card?

    Nothing to see here, though given the results in our original GeForce GTS 250 review, this is likely a result of our Core 2 Quad processor holding back performance.

    Clearly this is not an ideal setup for eliminating the processor's effect on the benchmark results of the two cards. Most games are not multithreaded, so the 2.4 GHz clock of the Q6600 will undoubtedly hold back a lot of games, since they will not be able to utilize all four cores.

    To all,

    Stop triple posting!

  • 3 Hide
    weakerthans4 , April 20, 2009 3:36 PM
    Quote:
    The default clock speeds for the Gigabyte GV-N250ZL-1GI are 738 MHz on the GPU, 1,836 MHz on the shaders, and 2,200 MHz on the memory. Once again, these are exactly the same as the reference GeForce GTS 250 speeds.


    Later in the article you write,
    Quote:
    For the sake of argument, let’s say most cards can make it to 800 MHz, which is a 62 MHz overclock. So, for Gigabyte’s claim of a 10% overclocking increase, we’ll say that most GV-N250ZL-1GI cards should be able to get to at least 806.2 MHz on the GPU. Hey, let’s round it up to 807 MHz to keep things clean. Did the GV-N250ZL-1GI beat the spread? It sure did. With absolutely no modifications except to raw clock speeds, our sample GV-N250ZL-1GI made it to 815 MHz rock-solid stable. That’s a 20% increase over an "expected" overclock according to our unscientific calculation.


    Your math is wrong. A claim of a 20% overclock on the GV-N250ZL-1GI would equal 885.6 MHz. 10% of 738 MHz = 73.8 MHz. So a 10% overclock would equal 811.8 MHz. 815 MHz is nowhere near 20%. In fact, according to your numbers, the GV-N250ZL-1GI barely lives up to its 10% minimal capability.
  • 1 Hide
    dimaf1985 , April 20, 2009 4:53 PM
    This whole article is completely invalid and the results are skewed because, as was documented on tweaktown, Catalyst 9.3 performance is much lower compared to 9.2. Catalyst 9.4 reclaims some of those performance losses, but 9.2 is still a bit better, if you compare the two analyses. Redo these tests with 9.2 drivers.
  • 1 Hide
    universalremonster , April 20, 2009 5:07 PM
    weakerthans4: Later in the article you write... Your math is wrong. A claim of a 20% overclock on the GV-N250ZL-1GI would equal 885.6 MHz. 10% of 738 MHz = 73.8 MHz. So a 10% overclock would equal 811.8 MHz. 815 MHz is nowhere near 20%. In fact, according to your numbers, the GV-N250ZL-1GI barely lives up to its 10% minimal capability.


    No, what he is saying is this: Gigabyte claims that the extra copper in the PCB will allow a 10%-30% further increase over how much a standard card's speed can be raised by overclocking. So, saying that a standard card OCs to 800 MHz, which is a 62 MHz increase, Gigabyte is claiming a 6.2 MHz (10%) to 18.6 MHz (30%) further increase on top of that. So "technically" a 20% increase would have put it at 812.4 MHz, 2.6 MHz less than the 815 MHz he achieved.
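The disagreement here comes down to two readings of Gigabyte's claim, and both are easy to check with the article's own figures. A quick sketch (the 800 MHz "typical" overclock is the article's assumption, not a measurement):

```python
# Sanity check of the two readings of Gigabyte's "10%-30%" overclocking
# claim, using the figures from the article and the comments above.
stock = 738.0        # reference GTS 250 GPU clock, MHz
typical_oc = 800.0   # the article's assumed "most cards" overclock, MHz
headroom = typical_oc - stock          # 62 MHz of typical headroom

# Reading 1 (the article's): 10%-30% MORE headroom than a typical card.
low = typical_oc + 0.10 * headroom     # 806.2 MHz
high = typical_oc + 0.30 * headroom    # 818.6 MHz

# Reading 2 (weakerthans4's): 10% above the stock clock itself.
naive_10pct = stock * 1.10             # 811.8 MHz

# The sample hit 815 MHz; under reading 1 that is about 24% extra headroom.
extra = (815.0 - typical_oc) / headroom
```

Under reading 1, the 815 MHz result lands comfortably inside the claimed 806.2–818.6 MHz band; under reading 2 it would only just clear the 10% mark, which is exactly the dispute.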
  • 17 Hide
    Dax corrin , April 20, 2009 5:12 PM
    Time to ban DemonHorde 665, the abuse of the English language is making all dead spelling teachers spin in their graves.
  • 3 Hide
    universalremonster , April 20, 2009 5:12 PM
    Personally, I think it's the Zalman accounting for the bulk of the 20% extra, not the couple ounces of copper. That cooler rocks.
  • -5 Hide
    Ramar , April 20, 2009 5:44 PM
    To the reviewer: Good article, but you forgot two things:

    The GTS 250 is a 9800GTX+ is a 9800GTX is -also- an 8800GTS 512. So this... 3-year-old card is still running strong.

    Also, Gigabyte's Ultra Durable is for two functions, overclocking and obviously, durability. Yes, it will overclock better. But it also will probably never stop functioning.

    From someone who's gone through numerous motherboards and graphics cards with minimal overclocking on either, that means a lot more than performance.
  • -6 Hide
    tacoslave , April 20, 2009 7:06 PM
    It is known that Nvidia cards tax the CPU less. So if a title is CPU-bound, then the Nvidia card will usually come out on top. That's why you see them performing similarly when resolutions increase and you move away from CPU dependency.
  • 0 Hide
    cleeve , April 20, 2009 8:56 PM
    KyleSTL: Why a Q6600/750i setup? That is certainly less than ideal. A Q9550/P45 or 920/X58 would have been a better choice in my opinion (and may have exhibited a greater difference between the cards).


    It's in the specs, but I should have stressed the point: I overclocked the Q6600 to 2.7 GHz, and it was plenty quick for these cards.
  • 0 Hide
    cleeve , April 20, 2009 8:59 PM
    Ramar: To the reviewer: Good article, but you forgot two things: The GTS 250 is a 9800GTX+ is a 9800GTX is -also- an 8800GTS 512.


    Not exactly. The 8800 GTS at least sported different clock speeds. I also believe it was on a larger die, if memory serves.
  • 1 Hide
    cleeve , April 20, 2009 9:00 PM
    tacoslave: It is known that Nvidia cards tax the CPU less.


    Is it? If so, please provide some proof of that statement as I haven't seen evidence of that.
  • 0 Hide
    cleeve , April 20, 2009 9:01 PM
    weakerthans4: Later in the article you write... Your math is wrong. A claim of a 20% overclock on the GV-N250ZL-1GI would equal 885.6 MHz. 10% of 738 MHz = 73.8 MHz. So a 10% overclock would equal 811.8 MHz. 815 MHz is nowhere near 20%. In fact, according to your numbers, the GV-N250ZL-1GI barely lives up to its 10% minimal capability.


    You misunderstand Gigabyte's claim. As universalremonster points out, they're claiming a 10% increase in overclocks over other GTS 250s, not claiming that all of their cards will overclock 10% over stock clocks.