I currently own a small-ish Acer Aspire One low-power desktop computer. According to dxdiag, it has a GeForce 9200 GPU by NVIDIA on the motherboard. I recently met its limitations when I purchased StarCraft II and found that it often lags, even with every other program closed and the graphics on the lowest available setting.
I've been entertaining the idea of getting a new computer, and I'd like to build it from scratch (well, piece by piece at least). My current budget is only ~500 dollars (counting taxes), but I plan on waiting and saving up more paychecks. What I'd like to know is: how does one measure the strength of a given graphics card? The numbering system they use confuses me greatly. For example, the official Blizzard site tells me the system requirements for running SC2 are a "128 MB PCIe NVIDIA® GeForce® 6600 GT" or better... I have a 9200... And then I look at the NVIDIA website and see that their highest graphics cards are in the 400-500 numbers. Does NVIDIA decrease their numbers as their cards get more powerful?
Also, Radeon seems to use a completely different system... Basically, I just want a ball-park estimate of how much I might have to spend in order to get decent gameplay. For example: "Oh, to play on medium graphics you'll want to spend at least $XXX," "For high-end stuff you're going to have to put in around $XXX," or "Oh, and if you want to be able to handle Diablo III when it comes out, make sure you get [insert technical term here], which will usually cost you around $XXX."
There are different series for each manufacturer.
For example, Radeon's most recent lines at the moment are the 5000 and 6000 series.
A Radeon 5830 is roughly equal to a Radeon 6850 in terms of performance,
but the 6850 is newer.
It's a similar thing with Nvidia. Each series will have some cards that are more powerful than ones from the previous generation and some that are weaker. This is so they can address the different levels of price/performance consumers will want.
It's always best to get something in the current series (for Nvidia that's the 500 series, for AMD/ATI the 6000 series), because it's the latest technology. You don't want to be buying something too old.
One of the most important things to consider before buying a graphics card is what resolution you will be playing at (i.e., your monitor's resolution). The higher the resolution, the more you will have to spend to get decent gameplay. Higher resolutions provide sharper images, though. So tell us what resolution your monitor is (or what resolution you would be happy playing at).
Well, Nvidia works like this:
They come out with a new line of cards about once a year. Your current card is in the 9000 series. The 9200 is a cheap card for people who do little to no gaming. The 9000 series also includes the 9300, 9400, 9500, 9600, and 9800. Each time the number increases, so does the performance (and the cost). So a 9800 is made for heavy gaming, and everything else falls in between. An 8800 (even though it was made earlier) is better than a 9200: it was made for gaming and cost big $$$, while the 9200 was made cheap.

After the 9000 series, Nvidia decided to knock a digit off and count up again. The next series was the 200 series, and they have been going up from there; they are now at the 500 series. The 520 is a cheap card like the 9200, and the 580 is awesome like the 9800.

So you probably want to get a card in the 500 series. A good card for the cash is the 550; it will play all games at medium to high settings and costs about $120. A 560 will run most games maxed out with decent frame rates, for about $180-$200. A 570 will run pretty much everything on max with very good frame rates, for about $300. And finally, the 580: overkill. It will run Crysis 2 maxed out on a 50" HD TV, for $500.

I have a 450 and it's awesome; I can play all my games on high with no problem. If I were you, I would go for a 550 or 560, depending on how much you want to spend. Keep in mind that other hardware can bottleneck a card, so don't spend all your money on it.
@AMD X6850
I'm using a relatively recent 16:9 24-inch monitor; the pixel count is 1920×1080. Of course I'd rather get a 1080p-capable card, but if it means spending $300+, I'm willing to cut back and drop the resolution, so long as the blur factor doesn't get too bad. But really, it's the lag I can't stand, when the whole system spends several seconds skipping.
Hmm… my 9200 is a chip on the mobo, though; I'm not sure it can be removed/replaced. I also have to worry about the weak PSU: the sticker says 220 W. I actually bought myself a low-profile card not long ago for around 40 bucks. The box said 1 GB, whereas Blizzard's suggested system requirements said 128 MB… so I figured, great, plenty of power to spare. Apparently being low-profile wasn't enough. I'm still not sure what happened, but my entire computer locked up, the screen went black, and even after I pulled out the card no image would go to any monitor I plugged in. I managed to fix it with a CMOS reset that was suggested on this site. I suspect the card's power requirements overloaded what the PSU could offer and tripped something up.
So, long story short, I think it's safe to say this bargain comp cannot have its graphics capabilities upgraded, at least not through PCIe slots. It's probably smarter for me to wait until I get my new comp before I seriously get back into gaming. But while I save my money, I do want to read up on my options. For example, thank you for mentioning the NVIDIA 550 and its price point; that helps.
I have searched, mostly on the NVIDIA and Radeon sites, but there's a lot of technobabble there I don't fully understand. I mean… there's more to it than just the MB/GB number mentioned on the card, I take it?
Just to point out an example of the technobabble I was referring to:
This card sports the GF116 GPU from its GeForce GTX 550 Ti, but with one of its four Streaming Multiprocessors disabled. As a result, it has 144 shader cores, three PolyMorph Engines, and 24 texture units.
PolyMorph Engines sound like something right out of science fiction, lol.
I remember hearing somewhere that either NVIDIA or Radeon cards tend to run very well on Blizzard games, but I can't remember which. I don't play just Blizzard games, but I've been a big fan ever since Warcraft 2.
Hmmm, thanks for that. That chart is close to exactly what I was looking for.
$240 is probably pushing my limits (especially after I buy the case, the PSU, the mobo, the CPU, etc.), but it's probably worth waiting to save up more money. I suspect that if I went for a lower card I'd be able to handle StarCraft II well enough, but when Diablo III comes out, along with many more games to follow, I'd have to bring down the resolution. I don't want to optimize my build to play only last year's games.