I've got 2 questions about how my new card is performing in games.
1. I tested 8X MSAA vs. 4X SSAA performance in Mass Effect 1&2 and Portal 1&2, forcing AA through the driver for each... For some reason I'm getting higher fps with supersampling than with multisampling. Granted, I'm taking twice as many samples with MSAA, but according to Tom's Antialiasing article I should be getting twice the framerate with 8X MSAA!
Example: I pop two portals beside each other in a wall, and my framerate dips to 45 with 8X MSAA - I switch to 4X SSAA and the same scene gives me 57 fps. Also walking around the Citadel in ME1 I get about 40 fps with 4X SSAA and a little over 30 with 8X MSAA. I'm fine with it, because supersampling looks way better, but I wish I knew the cause.
I know draw distance is CPU-dependent, but is MSAA, too? (I know my CPU is rather weak, but it can still play these games well.) SSAA always destroyed my framerate on my old HD 5670 512MB, yet 8X MSAA on that card was just as playable in these games as it is on my new HD 6870.
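A rough way to see why the "twice the samples" intuition doesn't map straight onto framerate: SSAA runs the pixel shader for every sample, while MSAA shades roughly once per pixel but still stores color/depth for every sample. Here's a toy back-of-the-envelope sketch - the cost model is a deliberate simplification (it ignores texture cache, framebuffer compression, and extra shading on triangle edges), not how any driver actually accounts for it:

```python
# Toy cost model: shading work vs. per-sample storage for SSAA and MSAA.
# Only meant to show the rough shape of the trade-off, not real GPU costs.

def aa_costs(width, height, mode, samples):
    pixels = width * height
    if mode == "ssaa":
        shaded = pixels * samples   # every sample is fully shaded
    elif mode == "msaa":
        shaded = pixels             # shader runs roughly once per pixel
    else:
        raise ValueError("mode must be 'ssaa' or 'msaa'")
    stored = pixels * samples       # color/depth kept per sample either way
    return shaded, stored

ssaa4 = aa_costs(1920, 1080, "ssaa", 4)   # 4x the shading, 4x the storage
msaa8 = aa_costs(1920, 1080, "msaa", 8)   # 1x the shading, 8x the storage
print(ssaa4, msaa8)
```

So 8x MSAA can push more framebuffer traffic than 4x SSAA even though it shades far fewer samples; which one ends up faster depends on whether the scene is shader-bound or bandwidth-bound.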
2. My second question is a quick one: Why is GRID maxing out my GPU? Every time I look at Afterburner, the GPU usage shows about 90-100% while racing or showing a replay. My card gets pretty hot too. I have all the settings cranked up with 8X MSAA, but Crysis 1&2 on Very High are less intense...
Unfortunately, forcing settings through the CCC doesn't work reliably with every game. ME and Portal 2 clearly aren't actually supersampling if you're getting higher framerates than with 8x MSAA. For Mass Effect you can change the exe file name to ut3.exe, and that should allow you to enable supersampling.
As for why your GPU is not getting as much use in Crysis 1 and 2: with your low resolution and your Athlon II X2, it's possible you're hitting a CPU bottleneck in those games, so the GPU spends part of each frame waiting for the CPU to feed it data. That's why you may never see 100% usage from your 6870 there. Keep Task Manager open on the Performance tab and watch the CPU usage while playing. If both cores are constantly pegged at 100% in-game, your CPU isn't keeping up with your graphics card.
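That check can be boiled down to a quick rule of thumb. This is just a hypothetical helper, not a real tool - feed it the per-core CPU percentages from Task Manager and the GPU usage from Afterburner:

```python
# Hypothetical helper: classify a bottleneck from logged utilization readings.
# cpu_cores: per-core CPU usage percentages (e.g. from Task Manager)
# gpu: GPU usage percentage (e.g. from MSI Afterburner)

def classify_bottleneck(cpu_cores, gpu):
    if all(core >= 95 for core in cpu_cores) and gpu < 90:
        return "CPU-bound"    # CPU maxed out, GPU waiting for work
    if gpu >= 95:
        return "GPU-bound"    # GPU fully fed; the CPU has headroom
    return "inconclusive"     # neither pegged - check Vsync, frame caps, etc.

print(classify_bottleneck([100, 100], 70))  # CPU-bound
print(classify_bottleneck([60, 75], 98))    # GPU-bound
```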
As for GRID, do you have Vsync on? If not, turn it on; that should lower the GPU usage while still delivering the maximum framerate your display can show. If GRID does not have a Vsync option, you can force it with a program called D3DOverrider, which is part of the RivaTuner package.
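To put numbers on why Vsync lowers GPU usage: with Vsync on, the GPU only has to finish each frame within the refresh interval, then it sits idle until the next vblank. A simplified model (it assumes a constant frame time and simple double-buffered Vsync):

```python
# Simplified model of GPU utilization under Vsync: if the GPU renders a frame
# faster than the refresh interval, it idles until the next vblank.

def gpu_utilization_with_vsync(gpu_frame_ms, refresh_hz):
    frame_budget_ms = 1000.0 / refresh_hz
    return min(1.0, gpu_frame_ms / frame_budget_ms)

# A frame rendered in 10 ms on a 60 Hz display: ~0.6, i.e. ~60% usage
print(gpu_utilization_with_vsync(10.0, 60))
# A frame that takes longer than the 16.7 ms budget keeps the GPU saturated
print(gpu_utilization_with_vsync(20.0, 60))  # 1.0
```

Without Vsync the card just renders as many frames as it can, which is why GRID pins it at 90-100% and heats it up.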
Forcing AA through CCC does work for the games I'm talking about: ME1&2 and Portal 1&2. (Also CoD4 - I get 60-70 fps with 4x SSAA, but that's beside the point...) I can definitely tell a difference in the Portals and Mass Effects, not only in the reduction of jaggies but also in the framerate. Believe me, it looks beautiful with 4x SSAA - 8x is flawless, but the framerate drops too much to play. I'll try to post some screenshots this afternoon.
I know my Athlon II is bottlenecking me somewhat - it's constantly pegged at 100% during ME1&2, but in the other games it sits around 85-95%.
I'll try Vsync in GRID to see if that lowers the GPU usage.
Okay, well, I could only reproduce one of my examples in ME1, so I guess other unknown factors were at play. Maybe Vsync was throwing my original results off... or my computer was doing something else graphically intensive while I was playing.