I apologize if this comes off as an obvious question, but I have done some reading through the forums about the differences between video cards, specifically the whole ATI vs. Nvidia debate. Both seem to have their pros and cons, and each performs better in certain games, but Nvidia generally seems to have the edge in raw performance. ATI, on the other hand, performs just as well in most games, yet the benchmark everyone points to comes down to FPS.
There is no argument that Nvidia does better at higher levels of AA and such, but what I was wondering is whether the FPS difference really makes a difference. I realize that at low FPS there can be a clear, visible distinction between gaining 1 extra fps and gaining 10. Where is the line of visible performance drawn?
For example, say an ATI card puts out 50 fps in a given game and an Nvidia card does 60. That is a 20% difference. On paper that may draw a lot of oohs and ahhs, but is it really a visible difference? I realize it would be different if it were 10 fps versus 15 fps; there, I believe, it would be pretty significant. Yet as the FPS climbs, does it reach a point where the game simply looks smooth, or does the extra fps really add that much more? Perhaps there is a range of fps where the difference can actually be noticed.
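To put rough numbers on my own example (just a quick sketch using the made-up figures above, not real benchmark results), converting FPS into per-frame time shows how much each "extra" frame actually buys you:

```python
# Quick sketch: convert FPS to per-frame time to see how much each
# extra frame actually shortens the wait between frames.
# The FPS pairs are just the hypothetical numbers from my post.
def frame_time_ms(fps):
    return 1000.0 / fps

for low, high in [(10, 15), (50, 60)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: each frame arrives {saved:.1f} ms sooner")

# Output:
# 10 -> 15 fps: each frame arrives 33.3 ms sooner
# 50 -> 60 fps: each frame arrives 3.3 ms sooner
```

So the same-sounding gains shrink a lot in real time terms at the high end, which is partly why I suspect the difference becomes hard to see.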
I only ask because I do not have the capability to test this and see for myself. It just seems to me that an extra 10-15 fps at the higher end, though it may be a large percentage change on paper, wouldn't really be noticed. Of course, this does not take into account the performance hits from resolution or added settings such as AA, or price for that matter.
Just my thoughts.