I am searching for a new graphics card and I'm torn between NVIDIA and ATI. I've seen various benchmarks and, as tricky as benchmarks can be, they've given me a sense of which card outperforms the other. Numbers are easy to compare.
But perhaps there is more to a video card than just numbers. I've read in some places that there are non-trivial differences between ATI and NVIDIA. For instance: ATI drivers are buggier; NVIDIA's image quality is superior to ATI's (and vice versa); ATI usually embraces the latest technology while NVIDIA often rebrands its old video cards; ATI is designed for gaming while NVIDIA is designed for general computing; one is faster than the other in certain kinds of applications.
NVIDIA GeForce and ATI Radeon are really very different beasts, beginning with their architectures. Comparing raw numbers is a simplistic approach and, at least to me, it doesn't seem to reveal the whole truth about these cards.
I am particularly interested in these non-trivial differences. I haven't owned a decent video card for quite a while now (some years, in fact), so I can't really speak to the differences between the latest ATI and NVIDIA series.
I am looking for a high-end video card for general purposes, not just gaming (though some gaming too). I intend to use two 1920x1080 monitors for quite a few demanding multimedia tasks. So, I would appreciate it if someone could tell me the main differences between these two brands, apart from the ones I can easily see in traditional benchmarks. Which one has better image quality? For which kinds of applications is NVIDIA or ATI better? Which is likely to have better GPGPU support? Which offers better performance with dual monitors and high resolutions? And so on...