ATI vs. Nvidia.
I prefer Nvidia because they support Linux, though I can work with generic ATI drivers if I have to. This card will go into a dual-boot system - Ubuntu/Windows XP Home. The reason for upgrading is primarily to play World of Warcraft and EverQuest (the only reason Windows is on this machine) at reasonable speeds, which has not been possible with the on-board Intel graphics.
The Nvidia PCI card has a core clock of 300MHz; ATI's PCIe 1x card is 550MHz.
The Nvidia PCI card has a memory clock of 533MHz; ATI's PCIe 1x card is 800MHz.
Price - the Nvidia PCI card is $45.99 and the ATI PCIe 1x is $99.99.
Max resolution for both cards is much higher than anything that will be needed.
So how much do core clock and memory clock speeds impact video performance? Is there a clear advantage in using PCIe 1x over PCI?
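For the memory clock part of the question, a rough back-of-the-envelope comparison is possible: peak memory bandwidth is effective memory clock times bus width. The bus widths below are my assumptions (64-bit is common on budget cards of this era, though some X1550 variants ship with 128-bit), not figures from the listings above:

```python
def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    """Theoretical peak memory bandwidth in GB/s:
    clock (MHz) x bus width (bytes)."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Assumed 64-bit memory bus on both cards -- check the actual spec sheets.
gf6200 = mem_bandwidth_gb_s(533, 64)   # Nvidia PCI card
x1550  = mem_bandwidth_gb_s(800, 64)   # ATI PCIe 1x card

print(f"GF6200: {gf6200:.1f} GB/s, X1550: {x1550:.1f} GB/s")
```

On those assumptions the ATI card has about 50% more memory bandwidth, which matters more in games than the core clock difference suggests on paper.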
(Got it right after I hit submit: on-board graphics.)
In my experience and in my opinion - I have never had good luck with Intel's on-board graphics in ANY 3D app. Add to that the fact that they do not support OpenGL (which is one of the backbones of KDE), and I am not likely to ever use it as a permanent solution unless those practices change in the future.
Besides, I'm just trying to get some life into an older computer, not rebuild the silly thing.
Even then they are buggy and only work with some cards, and for the price, might as well buy a new MoBo.
And the bandwidth for PCIe 1X is effectively 4 times as much as PCI (and bi-directional), and the difference is pretty significant. It probably wouldn't matter as much for the GF6200 as for the X1550, but it does make a difference.
Here's a quick easy representation of the bandwidths involved:
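As a sketch, using the standard theoretical figures for 32-bit/33MHz PCI and PCIe 1.0 (illustrative numbers, not measurements from these cards):

```python
# Theoretical bus bandwidth, in MB/s
pci_mb_s = 133                     # 32-bit / 33 MHz PCI, shared, half-duplex
pcie_x1_mb_s = 250                 # PCIe 1.0 x1, per direction
pcie_x1_total = 2 * pcie_x1_mb_s   # full duplex: 500 MB/s aggregate

# Roughly the "4 times as much" mentioned above
print(f"PCIe x1 vs PCI: {pcie_x1_total / pci_mb_s:.1f}x")
```

And unlike PCI, a PCIe lane is point-to-point, so the card isn't sharing that bandwidth with every other device on the bus.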
Anything above a low-end GF8600/HD2600 will be choked by PCIe 1X, and PCI would turn it into something comparable to the 780G integrated chipset.
Either get an updated mobo with a solid GF8200/9100 integrated VPU (since you use Linux, an Intel or AMD IGP is of little benefit) or else get the X1550.