I have a question about system requirements and frame rate. I recently got Half-Life 2 and compared the frame rate between two machines.
Graphics Card - ATI 1900XT PCI-Express
AMD 64x2 Dual Core 4400 - 2.2GHz
Windows XP Media Center Edition
Frame Rate: 74fps HL2 Episode One test
Graphics Card - ATI 1900XT PCI-Express (same exact physical card)
AMD 64x2 Dual Core 4800 - 2.4GHz
Windows XP Home Edition
Frame Rate: 125fps HL2 Episode One test
Now the only difference I see is the CPU (2.2GHz vs. 2.4GHz), but does that really make this much of a difference between the two systems, or am I missing something else? I didn't do any tuning, either in the game or on the card. Maybe there's a setting in the game I can change, like lowering the resolution, that my friend changed on machine B (and isn't telling me) and I didn't change on machine A. Are there any ideas someone could share that might make sense, other than buying a new CPU and finding out I get the same results? If the power supply were the issue, I'd assume the machine just wouldn't work at all.
Before running each test, make sure the graphics settings are equal (for simplicity, just set them both to Default). Also, are both machines using the same driver version?
The RAM might also be a different spec; that's something you need to check, but I don't see RAM making such a large difference.
The most obvious difference I can see between these setups would be the resolution the test is running at. Try it again, ensure the graphics settings, including resolution, are set to the same values, then run the test 3 times and average the results for a more accurate comparison.
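The run-and-average step above can be sketched in a few lines; the fps numbers below are made-up placeholders, not results from either machine:

```python
def average_fps(runs):
    """Return the mean frame rate across repeated benchmark runs.

    Averaging several runs smooths out run-to-run noise from background
    processes, disk caching, and so on.
    """
    return sum(runs) / len(runs)

# Hypothetical fps readings from three benchmark runs on each machine.
machine_a_runs = [73.0, 74.5, 74.0]
machine_b_runs = [124.0, 126.0, 125.0]

print(round(average_fps(machine_a_runs), 1))  # 73.8
print(round(average_fps(machine_b_runs), 1))  # 125.0
```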
You can put most of your money on the processor, really. I have an X1950 on a Conroe 6600 and I'm doing 150fps in the Lost Coast test. I'm also running 2GB of RAM. Don't be fooled: anything that speeds the system up can easily change the performance.
Are you running the EXACT same test, such as a pre-recorded in-game test like the one in F.E.A.R.? If you simply play the game on one machine and play it differently on the other, you will almost certainly see a difference in performance, especially if one run has many more effects on screen than the other.
Apples and oranges, man, that's what this comparison is.
The systems most likely have totally different configurations, along with a lot of other things, like those already stated, that could be causing this. It's highly unlikely the CPU alone is changing things by more than 5%. Do you even know the resolution, video settings, and running software on both machines? IMO there's no way you're going to get an equal comparison from those two machines, especially since they're OEM builds. There's a lot more different about those systems than just the video card. 74 frames isn't bad, though; it's more than your eyes can see.
I re-checked the graphics settings. The resolution was different, but I found it didn't make a huge difference. I originally thought I was running the same test, but it turns out I wasn't:
Half-Life 2: Lost Coast video test (menu option before the game starts)
Counter-Strike video test (menu option before the game starts)
These give very different frame rate results.
After a (pseudo) apples-to-apples test, the difference was smaller:
Machine A - HL2 Lost Coast menu video test
Frame Rate: 83fps
4:3 aspect ratio
Machine B - HL2 Lost Coast menu video test
Frame Rate: 98fps
4:3 aspect ratio
Resolution didn't make too much of a difference, 2-3 frames. Someone mentioned you shouldn't be able to tell since both are above 30 frames. In theory that should be absolutely true, but I do notice the 98fps run is more fluid and the 83fps run seems to skip frames. This may not be a result of the frame rate itself; it could be Steam or something else burning up CPU, but the frame skip every 5-15 seconds was noticeable at the lower frame rate. I appreciate all the good input; next time I'll do more diligent testing up front instead of testing the wrong thing.
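A quick back-of-the-envelope on why 98fps can look smoother than 83fps even though both are well above 30: what the eye reacts to is per-frame time and its consistency, not just the average rate. A minimal sketch (the hitch duration is illustrative, not measured):

```python
def frame_time_ms(fps):
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_time_ms(83), 2))  # 12.05 ms per frame on average
print(round(frame_time_ms(98), 2))  # 10.2 ms per frame on average

# The averages are close, but a single slow frame (say, a hypothetical
# 50 ms hitch every few seconds while Steam does background work) is
# visible as a stutter even when the reported average fps stays high.
```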
I have F.E.A.R. loaded on the slow machine. I'll try it on the fast machine with the good card, but are the graphics really that good? They kind of suck compared to HL2, even on my slow PC. I guess that's a discussion for a different forum.
Technically, people can see the difference in frame rates up to about 60fps. For me, when it starts to go under 60 I start to notice frame problems; at 30 it's just a snapshot slideshow I can't play. Of course, resolution does make a big difference in frame rate; rendering all those extra pixels takes some power.
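To put rough numbers on the "extra pixels" point, here's the arithmetic for two common 4:3 resolutions of that era (chosen for illustration; the thread doesn't say which resolutions were actually used):

```python
def pixels(width, height):
    """Total pixels the GPU must shade per frame at a given resolution."""
    return width * height

low = pixels(800, 600)     # 480,000 pixels per frame
high = pixels(1024, 768)   # 786,432 pixels per frame

# Going from 800x600 to 1024x768 means shading ~64% more pixels
# every frame, so fill-rate-bound scenes slow down accordingly.
print(round(high / low, 2))  # 1.64
```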
Do you think going from 1GB of RAM to 2GB or 3GB would help the frame rate at all? Or is the bottleneck in the GPU and video card memory? What does the video card use system RAM for? Is there a good place to look at video card pipeline/video path information?