Do I really, truly need more than my old 8800gt?

futari

Distinguished
Nov 11, 2010
First post here, and first attempt at build from parts.

I just bought (but haven't yet received) a DIY kit including a Phenom II X4 955BE, Gigabyte AM3 MB and 4GB DDR3 1600.

I didn't buy a graphics card yet, but I was thinking, if I just pull the 8800gt from my ancient (and current) Athlon 64 3200+ box, will I actually suffer noticeably?

The most taxing thing I plan to use the computer for is 1680x1050 gaming with medium-high settings (minimal AA or filtering, but other effects high); I don't care about benchmarks or huge framerate numbers as long as I have smooth, stutter-free gameplay and the ability to play 1080i/p video files. I don't plan to play bleeding-edge graphics games (like Crysis series).

I know the bottleneck may be laughable in this case, but how much longer could I get by with the 8800gt?
 
Solution
^ The 8800GT is still a decent card, and how long it would last depends on the type of games you want to play...
For current FPS games, though, I doubt it would suffice at that resolution and those settings...

blazer9131

Distinguished
Aug 30, 2010
I have an 8800GT by PNY, and it's still serving me beautifully, even for Black Ops. Right now I'm running medium settings at 1440x900 and pulling in a good 30 FPS, excluding the lag spikes from whatever issue they're having with the game. I'm on an even worse setup than your current PC: an E2200 @ 3.05GHz with 4GB of RAM.

I am upgrading my PC, however, because I know the card is getting old, literally old. Mine has been overclocked for 90% of its life, and that's taken quite a toll on the poor thing. I had to switch to a better cooler last year because my temps hit 100C and I was afraid I would fry it. Now they're better, maxing out at about 85-90C. I don't know if it's just my imagination, but my frame rates have started to drop. I'm thinking the GPU might be slowly giving way, if that's even possible. That, or I'm just seeing things.
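If you want to keep an eye on temps the way I do, here's a minimal logging sketch (my own, not anything official). It assumes a card and driver new enough that nvidia-smi supports the --query-gpu flag; a card as old as the 8800GT may not report through it, so treat this as a general-purpose example:

# Minimal GPU temperature logger (sketch, assumes nvidia-smi --query-gpu support).
import subprocess
import time

def gpu_temp_c():
    # Ask the NVIDIA driver for the current GPU core temperature in Celsius.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    # Print the temperature once a second while a game runs in the background.
    while True:
        print(f"GPU temp: {gpu_temp_c()} C")
        time.sleep(1)

On older hardware, a tool like GPU-Z or the driver's own monitoring panel does the same job without any scripting.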
 

blazer9131

Distinguished
Aug 30, 2010


I don't know how much of a bottleneck my CPU is (E2200 @ 3.05GHz), but at medium settings in Fallout: New Vegas I'm barely pulling 40 FPS, and that's only on a 1440x900 screen. I'm not sure how well medium would do at 1680x1050.

In Black Ops I'm on medium and pulling about 30-40 FPS as well.
 

futari

Distinguished
Nov 11, 2010
I think I was fooling myself a bit: on my old (current) box, I actually played games like Company of Heroes at 1680x1050 on low-medium settings and just forced myself to tolerate sub-25 fps.

My main source of confusion was that on the old machine the processor was clearly the bottleneck for my graphics card, whereas now it's the exact reverse.

In any case, I'm hoping I won't have to upgrade my core components again until I can make a jump as big as the Athlon 64 3200+ to the 955BE.
 
There is a compromise to this:

The Game Rundown: Finding CPU/GPU Bottlenecks, Part 2
http://www.tomshardware.com/reviews/game-performance-bottleneck,2738-16.html
Conclusion: A Trend Toward 3+ Cores

The average optimal number of CPU cores suggested by the test results is 2.75, showing a clear trend towards at least three CPU cores. The question of whether the CPU or GPU is most important is easily answered. If you don't have a multi-core CPU, then upgrade it. If you have a dual-core CPU at around 3 GHz, then invest your money into a graphics card, as most games are GPU-limited. This is not something that will change with new DirectX 11 games.
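A quick way to see which side is limiting you on your own machine is to benchmark the same game section at two resolutions: if fps barely changes when you drop the resolution, you're CPU-limited; if it climbs a lot, you're GPU-limited. Here's a rough sketch of that rule of thumb (my own numbers and threshold, purely illustrative, not from the article):

# Back-of-the-envelope CPU vs GPU bottleneck check (sketch with made-up numbers).
def likely_bottleneck(fps_high_res, fps_low_res, threshold=0.15):
    """Compare average fps from the same benchmark run at two resolutions."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    # Little gain from lowering resolution -> the CPU is holding things back.
    return "CPU-limited" if gain < threshold else "GPU-limited"

# Example with hypothetical figures for an 8800GT-class card:
print(likely_bottleneck(fps_high_res=28.0, fps_low_res=45.0))  # GPU-limited
print(likely_bottleneck(fps_high_res=38.0, fps_low_res=41.0))  # CPU-limited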


So just get a Rana X3 plus a cheap 770/785 board, and do a GPU upgrade if your games are screaming for it, hehe.