Best upgrade from 9800GT for under $250 (w/ tech feedback plz)

December 22, 2010 5:20:35 PM

Hello, I'm looking to upgrade my current graphics card and hoping to get some opinions on the best way to spend $250. I play mainly FPS games as well as World of Warcraft (which can get pretty graphics-intensive at times, you'd be surprised). I play at 1680x1050 at all times.
My current setup:
EVGA 790i MOBO
EVGA 9800 GT CARD
Intel Core 2 Duo E8500, OC'd ~5% to 3.28GHz (see the quick sanity check below)
610W Power Supply

I figured the best bang for my buck gaming-wise would be the graphics card. I've been out of the loop for a while, though. Any suggestions?

I'm also hoping to get some tech speak back about the merits of one card over another. It would help refresh my memory of what all the specifications mean; I spent a month or two researching a couple of years ago and find it very interesting, but things seem to change fast in the industry.

Thanks!

*EDIT: I'm not too familiar with card manufacturers outside of the NVIDIA chipset (I stick with EVGA), so if you have suggestions for the most reliable brand names (ATI, AMD) I'd appreciate that too.
December 22, 2010 5:27:45 PM

You could upgrade to a GTX460 for ~$200.

shadow187
December 22, 2010 5:27:55 PM

The absolute best performance would probably come from two GTX460 768MBs in SLI.

The best single-card performance at that price, however, is an HD5870.

*Edit:
The graph above uses an overclocked version of the GTX460, so that performance applies to that specific card only; other cards don't come anywhere near those numbers.
December 22, 2010 5:38:18 PM

shadow187 said:
The absolute best performance would probably come from two GTX460 768MBs in SLI.

The best single-card performance at that price, however, is an HD5870.

*Edit:
The graph above uses an overclocked version of the GTX460, so that performance applies to that specific card only; other cards don't come anywhere near those numbers.

What manufacturer would you suggest for the HD5870?
December 22, 2010 5:40:12 PM

texusmcash said:

I'm also hoping to get some tech speak back about the merits of one card over another. It would help refresh my memory of what all the specifications mean; I spent a month or two researching a couple of years ago and find it very interesting, but things seem to change fast in the industry.

Thanks!

GTX460s almost always come factory overclocked, so the graph above is a good indication of the performance you're likely to see in the real world. The big advantages of the GTX460 over the HD5870 are that it does a much better job at DirectX 11 tessellation (2011 should be a big year for DirectX 11 game releases) and that it has PhysX.
MM
December 22, 2010 5:57:12 PM

shadow187 said:
The absolute best performance would probably come from two GTX460 768MBs in SLI.

The best single-card performance at that price, however, is an HD5870.

*Edit:
The graph above uses an overclocked version of the GTX460, so that performance applies to that specific card only; other cards don't come anywhere near those numbers.

When the 5850 and 5870 were all the rage, you were forever on about how overclockable the cards were and how everybody who got one would OC it, since they all OC'd so well. Now you seem to be talking down OC'd cards; why is that?
shadow187
December 22, 2010 5:59:28 PM

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

This HD5870 is $229 after MIR and a promo code (order quickly!). That's probably the best deal you're going to get, and Sapphire is a very reputable brand.

In response to the specific DX11 feature of tessellation: so far, the only way to really show the difference is with a dedicated tessellation stress tool like the Unigine Heaven benchmark. It stresses tessellation to the max, whereas normal games never apply that kind of load.

If you are worried about DX11 performance, there have been a few improvements in ATI's next-gen cards. The HD6800 series performs just a bit under the HD5870, but the HD6850 can easily be overclocked to HD6870 levels (provided you're willing to).
http://www.newegg.com/Product/Product.aspx?Item=N82E168...
That card will allow you to achieve performance beyond an HD5870, with a guarantee of running cooler and quieter. I have the ASUS CuCore HD6850 (very similar to the HD6870) and I've been very pleased with it. During gaming I hit 63°C, but that's because my card is overvolted and overclocked; at stock I imagine it would never go above 60°C. My case (from 2004 or so) has two 80mm rear exhausts, one 80mm top exhaust, one 80mm side fan, and one 120mm intake fan.

@MM:
Simple. When you have a stock card that can achieve 10% more performance, nearly guaranteed, by adjusting two sliders (memory + core), it's very easy to recommend that. I was especially keen on the ASUS CuCore variety, as their cards seemed to be much better (though with Sapphire's new TriXX utility maybe they've caught up). If your memory is as keen as it seems to be, you might remember that probably 85%+ of my HD5850 recommendations were for the CuCore HD5850.

Keep in mind I was not talking down OC'ing at all; if that's how you interpreted my post, I'm sorry for the confusion. I was merely reminding the OP that the HAWK edition of the GTX460 is overclocked, so if he purchases a different card and wonders why his framerates are lower, that would be the reason. It's also the case that a card as heavily OC'd out of the box as the HAWK has less total headroom left for further overclocking (should the OP decide to do so).
December 22, 2010 6:15:29 PM

The Hawk is only moderately overclocked and is very representative of the performance of most GTX460s that are likely to be purchased.

If tessellation is such a non-issue, why did AMD make it one of the main improvements of the 6900 series? Tessellation is clearly the centerpiece of DirectX 11 and holds the most promise for delivering the photorealism that gaming graphics is striving for. Despite this renewed focus by AMD, their tessellation performance continues to lag significantly behind Nvidia's.

The problem with the 6800 series is that it uses an image-quality optimization that inflates benchmark scores by 6-10% at the cost of image quality.

Tom's Hardware had a nice write up on the issue:
http://www.tomshardware.com/reviews/geforce-gtx-570-gf1...
Quote: "For the time being, we're going to have to leave everything at default and point out the observed and confirmed image quality issue currently affecting Radeon HD 6800-series cards. This may or may not become a factor in your buying decision, but right now, the bottom line is that Nvidia offers better texture filtering, regardless of whether you’re one of the folks who can appreciate it."
shadow187
December 22, 2010 6:24:46 PM

Actually, the 6900 series is still a bit worse at tessellation, even though it has a second dedicated tessellator. We'll see what happens; everything but the benchmarks points to better performance.

Also, I'm sure it's a nice write-up, seeing as it doesn't treat this debacle as a possible optimization; it treats it as a cheat. Please consider the following:
Quote:
We have a hard time spotting differences, just as much as you do, and while making the screenshots we increased gamma settings to 50% and applied a resolution of 2560x1600 to try to make it more visible.

Do you spot the difference? Probably not, and that is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat. And sure, we know... this game title is not a perfect example; it is, however, a good real-world example.


Quote:
We seriously had a hard time finding an application where the optimizations show up well. So mind you, the above example was specifically chosen to show the image quality anomaly.


Quote:
For weeks now here at Guru3D we have been discussing and debating whether or not to disable the ATI optimization manually. But that one question remains: is it a valid optimization or an unacceptable cheat? Well, we conclude that it's not a cheat... but it remains a trivial optimization that can be seen if you know how to look, seek and find it. And that does not sit right with us.


http://www.guru3d.com/article/exploring-ati-image-quali...

Make of it what you will.
MM
December 22, 2010 6:25:17 PM

@Shadow, even a cheap 460 can achieve quite a good OC, in excess of the HAWK's; I know this because I've tried it.
shadow187
December 22, 2010 6:34:29 PM

Your point would hold more ground, except that the HAWK is currently the cheapest GTX460 on Newegg. :lol:

I'm not discrediting the card or its ability to be overclocked at all; I just haven't had any experience with it, nor have I read in-depth reviews (as I have with the 6850, considering I was receiving one).

Moreover, the OP's budget is $250. There is a better card that overclocks to even better performance and still comes in quite a ways under that budget, hence the recommendation over the 460. If I have $1,000 to build a new computer, I'm not going to build a machine centered around a Core 2 Duo E5300 and an HD5770.
MM
December 22, 2010 6:37:44 PM

shadow187 said:
Your point would hold more ground, except that the HAWK is currently the cheapest GTX460 on Newegg. :lol:

I'm not discrediting the card or its ability to be overclocked at all; I just haven't had any experience with it, nor have I read in-depth reviews (as I have with the 6850, considering I was receiving one).

Moreover, the OP's budget is $250. There is a better card that overclocks to even better performance and still comes in quite a ways under that budget, hence the recommendation over the 460. If I have $1,000 to build a new computer, I'm not going to build a machine centered around a Core 2 Duo E5300 and an HD5770.

As I'm not in the U.S., Newegg is not where I look for the cheapest cards. :kaola:
shadow187
December 22, 2010 6:39:14 PM

Oh, that's my bad; sorry for the assumption. However, as most forum-goers here are indeed in the U.S., I tend to use USD for pricing. Newegg is also my preferred site due to reliability and pricing, though if you have a few sites I could cross-reference, don't be afraid to post! :)
MM
December 22, 2010 6:49:46 PM

shadow187 said:
Oh, that's my bad; sorry for the assumption. However, as most forum-goers here are indeed in the U.S., I tend to use USD for pricing. Newegg is also my preferred site due to reliability and pricing, though if you have a few sites I could cross-reference, don't be afraid to post! :)

I've seen quite a few posters who are not from the US ask about cards using dollars as a price reference, because they know people will assume they are in the US. I tend to use Scan.co.uk for my online purchases, and there the HAWK is not the cheapest.
December 22, 2010 8:15:26 PM

shadow187 said:

http://www.guru3d.com/article/exploring-ati-image-quali...

Make of it what you will.

I have always enjoyed that Guru3D article; Guru3D is basically my home website for tech stuff. It's where I got the figures for how much the optimization bloats performance benchmarks (6% for the 6850 and 10% for the 6870):
"The optimization however allows ATI to gain 6% up to 10% performance at very little image quality cost. And I choose the words 'very little' here very wisely. The bigger issue and topic at hand here is that, no matter what, image quality should be 100% identical between the graphics card vendors, for the sake of objectivity."

Also, that part about "And that does not sit right with us" is basically the main thesis of the article.

I particularly like the parts that frame the "optimizations" as a moral/ethical issue, one that casts a negative light on AMD and raises questions about its practices.

"We urge and recommend AMD/ATI to disable the optimization at default in future driver releases and deal with the performance loss, as in the end everything is about objectivity and when you loose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm then good. The drop in 3-4 FPS on average is much more acceptable then getting a reputation of being a company that compromises on image quality. And sure it raises other questions, does ATI compromise on other things as well ? See, the cost already outweigh the benefits.

So the moral right thing to do for AMD/ATI is to make the High Quality setting the standard default. But again here we have to acknowledge that it remains a hard to recognize and detect series of optimizations, but it is there and it can be detected."
shadow187
December 22, 2010 8:56:59 PM

The big issue here isn't what ATI did. The fact that they get a 6-10% performance increase for a difference the user can't realistically notice is commendable.

The real issue is how nVidia spun it. They told the public, "Hey, they're cheating."
They told a few sites specifically what was wrong and said, "Hey, post these reviews. ATI's cheating."

Tell me this: do you honestly believe that, if nVidia hadn't mentioned this issue, it would ever have been noticed?
December 22, 2010 10:29:15 PM

shadow187 said:
The big issue here isn't what ATI did. The fact that they get a 6-10% performance increase for a difference the user can't realistically notice is commendable.

The real issue is how nVidia spun it. They told the public, "Hey, they're cheating."
They told a few sites specifically what was wrong and said, "Hey, post these reviews. ATI's cheating."

Tell me this: do you honestly believe that, if nVidia hadn't mentioned this issue, it would ever have been noticed?

What you are saying amounts to the old adage, "It's only wrong if I get caught." But in addition to Nvidia, several German websites also pointed out the problem. In the high-end graphics world, enthusiasts are going to notice, and AMD got caught. As Guru3D says: "The optimization can be seen, it really is visible image quality degradation."

In fact, there has been a clear consensus on this issue in every article I have read. Every site that has reported on it sends a consistent message: it is a fact that AMD did this, and even if it is hard to detect, it sets a bad precedent.

Nvidia had two options: advertise the problem, or do the same thing themselves to keep pace. If Nvidia started fudging optimizations in the same way, we would all lose out. Instead of fudging to match AMD's results, Nvidia did the right thing and advertised the problem rather than caving to the pressure to follow suit.

If someone doesn't feel this should factor into their buying decision, great for them, but it is there, and people should be aware of it when we give advice.