After upgrading my main machine, I now find myself with a spare Nvidia 8800 GTS 512MB video card. So I think to myself: it should be around four times the speed of my spare computer's video card, an ATI HD 2600 Pro 512MB. Right? I tested both in the main machine and the 8800 killed the 2600 (though I'm now running dual HD 5770s in the main).
However, when I stick either card in the spare machine, the 8800 actually runs slower than the 2600. How can that be?
SPARE MACHINE
-----------------------
* 3.0GHz AMD Athlon 64 X2 Dual Core 6000+ Windsor
* ASUS V3-M2A690G motherboard
* 2GB RAM (Kingston Unbuffered DIMM, KVR800D2N5K2/2G)
* 1GB RAM (Kingston KTC1G-UDIMM) (all 3GB running at 800MHz, not 1066)
* Samsung 320GB HD321KJ HDD - SATA II 7200rpm
* Silverstone SST-ST60F PSU, 600W
* ASUS 23" monitor (2ms), set to 1680x1050
* Windows 7 64-bit
* Keyboard, mouse and headphones (the only other devices)
I used this site http://support.asus.com/PowerSupplyCalculator/PSCalculator.aspx?SLanguage=en-us to calculate that I would need a maximum of 500W to run the 8800 GTS and only 400W for the 2600, so I figured both cards should get enough power. (Other calculators I tried quoted even lower wattages.)
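For what it's worth, the headroom arithmetic I'm going by boils down to this quick Python sketch (the wattage figures are just the calculator's estimates, not anything I've measured):

```python
# Rough PSU headroom check. The draw figures below are the ASUS
# calculator's estimated worst-case system draw with each card
# installed, not measured values.
PSU_WATTS = 600  # Silverstone SST-ST60F rating

estimated_max_draw = {
    "8800 GTS 512MB": 500,     # calculator estimate
    "HD 2600 Pro 512MB": 400,  # calculator estimate
}

for card, draw in estimated_max_draw.items():
    headroom = PSU_WATTS - draw
    print(f"{card}: est. max system draw {draw}W, "
          f"headroom {headroom}W ({headroom / PSU_WATTS:.0%} of the PSU)")
```

So even in the worst case the 8800 GTS should have about 100W of headroom, which is why I don't think the PSU is the problem.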
Does anyone know why the 8800 GTS is being outperformed by the 2600? It feels like a waste to put such a nice video card in the cupboard and use an older, theoretically slower one.