9800 GTX outperforming 5850 Toxic?

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710
I upgraded to a Sapphire 5850 Toxic edition from an EVGA 9800 GTX this week and am stunned by what I am seeing. I can't figure out what's going on with the 5850's performance.

I ran 3DMark 05 at stock CPU speeds. The 9800 GTX scored 10k; the 5850 scored 12k.

I OC'd the CPU from 2.4 GHz to 3.1 GHz and scored 16k with the 5850.

In Bad Company 2, I actually noticed a DECREASE in performance with the 5850. I double-checked this by trying the 9800 GTX again. The game runs better with the 9800 GTX on low, medium, and high. I've tried multiple drivers on the 5850.

What I'm seeing in BC2 on medium:
9800 GTX: average of 42 FPS
5850: average of 35 FPS

The averages vary per map, but I made sure to compare the FPS on the same maps.

I've read multiple times that a Q6600 won't bottleneck any single card on the market, but given my experience I'm not convinced. I've also read that Nvidia cards are able to take a bigger load off the CPU.

Before anyone asks, both PCIe power connectors are plugged in.

Edit: 24" monitor @ 1920x1080. Lowering the resolution made only a very slight difference. Also tried DX9, 10, and 11; all results are with DX9.

System specs

Windows 7 64-bit
CPU: Q6600 @ 2.4 GHz
Memory: 4GB G.Skill DDR2-800
M/B: EVGA 680i
PSU: 600W Thermaltake
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710
Even with poorly optimized drivers, I wouldn't expect the 5850 to get beaten by the 9800 GTX. Am I wrong to think that?

1920x1080. Tried lower resolutions as well, with only a very slight difference.
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710


Interesting. I would doubt what you're saying because it sounds crazy, but RivaTuner showed no load change on the GPU (I thought the program just wasn't reading it right). Maybe this is it... I did try the 10.3 beta drivers from Guru3D, which made no change. This was a few days ago. Any ideas?
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710


I will check GPU-Z tonight when I get home.

I did not use driver sweeper. Will do that as well.

Will update on status when I'm back home.
 

dkapke

Distinguished
Jun 6, 2006
181
0
18,710
Actually, your problem is that a Q6600 at stock speeds simply will not feed the 5850 - it's a major bottleneck. Heck, my 5850 on an overclocked Q9650 (3.6GHz, 9x400) was still bottlenecked. The 5850 has a lot of headroom, and at 2.4GHz your Q6600 simply won't push it any harder than it pushed the 9800 GTX. I'll give you some examples from my spreadsheet of 3DMark06 scores. I realize it's synthetic, but it'll give you an idea of what you're running into.

A Q6600 at 2.4GHz with an 8800GT - 3d06 was 11,481
A Q6600 at 3.0GHz (9x333) with an 8800GT - 12,420 (about 8% faster for a 25% bump in clock - the 8800GT is more of a bottleneck)
A Q6600 at 2.4GHz with a 4870 - 12,286 (slower - despite the card clearly being faster - exactly what you're seeing...and this was in several games)
A Q6600 at 3.0GHz with the 4870 - 14,286 (an OC of the proc got me 16% faster speeds with the faster card)
A Q9650 at 3.6GHz with the 4870 - 16,072 (another 12.5%, and 31% faster than the Q6600 at stock speeds)
A Q9650 at 3.6GHz with the 5850 - 19,369 (nice little 20.5% bump, but not as much as I was expecting)
An i7 920 at 3.6GHz with the 5850 - 23,769 (almost 23%!!! - same clock speed, different processor).

What I'm saying is, if the Q6600 can't push a 4870, especially at stock speeds, it sure as heck isn't going to come anywhere near feeding a 5850, and yes...you may get lower speeds in some games/benchmarks due to the architecture of the card. Even if you can overclock the Q6600 to 3.0GHz (fairly easy with a good board) or 3.4GHz with good cooling and a great board, you're still not going to be able to feed a 5850. The most a Q6600 can feed at its highest overclock is a 5770 or 4870. Anything more than those two and you'll need a proc upgrade to see any difference.
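As a quick sanity check on those percentages, here's a minimal Python sketch (using only the 3DMark06 numbers quoted above - it's just arithmetic, not an official benchmark tool) that computes the relative scaling between configurations:

# Rough scaling check on the 3DMark06 scores quoted above.
scores = {
    "Q6600 2.4GHz + 8800GT": 11481,
    "Q6600 3.0GHz + 8800GT": 12420,
    "Q6600 2.4GHz + 4870": 12286,
    "Q6600 3.0GHz + 4870": 14286,
    "Q9650 3.6GHz + 4870": 16072,
    "Q9650 3.6GHz + 5850": 19369,
    "i7 920 3.6GHz + 5850": 23769,
}

def gain(old, new):
    # Percent improvement going from configuration 'old' to 'new'.
    return (scores[new] / scores[old] - 1) * 100

# CPU OC with the slower card: ~8% for a 25% clock bump (GPU is the limit)
print(f"{gain('Q6600 2.4GHz + 8800GT', 'Q6600 3.0GHz + 8800GT'):.1f}%")
# Same CPU OC with the 4870: ~16% (the CPU was holding the card back)
print(f"{gain('Q6600 2.4GHz + 4870', 'Q6600 3.0GHz + 4870'):.1f}%")
# Q9650 -> i7 920 at the same 3.6GHz with the 5850: ~23%
print(f"{gain('Q9650 3.6GHz + 5850', 'i7 920 3.6GHz + 5850'):.1f}%")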
 
Solution

rofl_my_waffle

Distinguished
Feb 20, 2010
972
0
19,160
That is right, your processor is a major bottleneck in Bad Company 2. It might not be in other games, but Bad Company 2 is very processor intensive. It runs Havok physics and tracks a huge map with up to 32 players.

You should be getting a way higher frame rate with a 5850. To give you an example, before I upgraded to a 5970 I was using a 4870. Even at 2560x1600 I was able to get 80 FPS on low-medium settings with no AA, but I am running a 3.8GHz i7 920.

A 5850 without a bottleneck would destroy Bad Company 2 without AA.

Also, the reason the 5850 performs worse than a 9800 when bottlenecked is that the 5850 is running DirectX 11.
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710


Cleaned out all the drivers and did a fresh install to 10.3b.



My original post stated I tried dx9, 10, and 11 with 9 giving the best performance. All my results are using dx9.



This appears to be the problem. GPU-Z shows 157 MHz core and 300 MHz memory during gameplay. Here's a portion of the log I just recorded:

2010-03-23 13:07:22 , 157.0 , 300.0
2010-03-23 13:07:23 , 157.0 , 300.0
2010-03-23 13:08:15 , 765.0 , 1125.0
2010-03-23 13:08:16 , 157.0 , 300.0
2010-03-23 13:08:17 , 157.0 , 300.0
2010-03-23 13:08:18 , 157.0 , 300.0

The ONE second that it bumps up to the highest speeds is when I alt-tabbed to Windows. The game crashed when I tried to go back in.
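For anyone who wants to check their own log the same way, here's a minimal Python sketch (assuming the log keeps this simple "timestamp , core , memory" layout; the file name is just GPU-Z's default and may differ on your system) that counts how many samples sit at idle clocks:

# Count how often the card sits at idle clocks in a GPU-Z sensor log.
# Assumes lines like: 2010-03-23 13:07:22 , 157.0 , 300.0
IDLE_CORE_MHZ = 200  # anything below this is treated as a 2D/idle clock

idle = total = 0
with open("GPU-Z Sensor Log.txt") as log:  # assumed default log file name
    for line in log:
        parts = [p.strip() for p in line.split(",")]
        if len(parts) < 3:
            continue  # skip headers and blank lines
        try:
            core_mhz = float(parts[1])
        except ValueError:
            continue
        total += 1
        if core_mhz < IDLE_CORE_MHZ:
            idle += 1

if total:
    print(f"{idle}/{total} samples ({100 * idle / total:.0f}%) at idle clocks")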

So I've tried driver sweeper and a fresh install with 10.3b. Where to go from here?

Thanks so much for everyone's input


 

rofl_my_waffle

Distinguished
Feb 20, 2010
972
0
19,160
Updating drivers might not help. Force the clock speeds instead; here's how to do it.

1. Open CCC

2. Unlock and Enable Overdrive.

3. Go to Options/Profiles/Profiles Manager. Create a new profile. Under composition make sure “ATI Overdrive” is checked. Save and Close

4. In windows go to: C:\Users\{yourusername}\AppData\Local\ATI\ACE\Profiles

5. Open the xml document with the name of the profile you just created (notepad is fine)

6. Change the core clock and memory clock values to something higher, like this:

<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="40000" />
  <Property name="Want_1" value="60000" />
  <Property name="Want_2" value="80000" />
</Feature>

<Feature name="MemoryClockTarget_0">
  <Property name="Want_0" value="80000" />
  <Property name="Want_1" value="95000" />
  <Property name="Want_2" value="120000" />
</Feature>



7. Save and close. Go back to CCC and activate the profile you just created.

Change the numbers to the clocks you want. Those are just my settings for the 5970.
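For what it's worth, the "value" numbers look like the clock in MHz multiplied by 100 (80000 = 800 MHz core, 120000 = 1200 MHz memory). That's an assumption based on the 5970 example above, so double-check it against your card's stock clocks before applying anything. A minimal sketch of the conversion:

# Convert desired clocks in MHz to CCC Overdrive profile values,
# assuming value = MHz * 100 (inferred from the 5970 example above).
def want_value(mhz):
    return int(mhz * 100)

# e.g. a 5850 Toxic's 765 MHz core / 1125 MHz memory would come out as:
print(want_value(765))   # 76500  -> CoreClockTarget_0 "Want" value
print(want_value(1125))  # 112500 -> MemoryClockTarget_0 "Want" value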
 


Yeah, I had a feeling. The funny thing is, I say this in half the goddamn threads, and everybody ignores me, only to figure out I'm right a week later.
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710



Manually changing the clock speeds seems...intimidating. What values would I put in if I wanted 765 MHz core and 1125 MHz memory?

Can't find that file location. There is no AppData or anything similar. Where else would the profile files be?

Here's a chat I had with EA


Karl: Hi, my name is Karl. How may I help you?
: hi Karl
: I am currently running a radeon 5850
: when i play bad company 2, the video card stays at idle speeds
Karl: okay
: it seems that bc2 is the only game where this happens
Karl: okay
Karl: In order to assist you further, please provide me this information from a program called dxdiag.exe.
Karl: let me check
: thanks
Karl: I've checked your dxdiag file and came to know that your graphic driver is compatible to play the game, but you are using windows 7 64 bits which are not compatible for playing any EA games
: had no idea. any idea of compatibility mode would help the issue?
Karl: if its possible then, try to play the game on 32 bits or on any other operating system. You can play the game but the performance is not guarantee. EA games are not tested on 64 bits
Karl: Please respond if we are connected to this session.
: its strange that people i know have the same OS and 3d card as me and it works fine
Karl: Since I am not receiving any response from you I have to end this session.
Karl has disconnected


Really? That's the answer they have? Seems like a way to get out of supporting another OS.
 
I see you were logging GPU-Z to a file, but did you also check the box for 'logging while in background'? I only ask because I see a fairly large gap in your data. It should be sampling every second, but there is a gap of nearly a minute.
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710


I feel like a noob now :)

The new log shows 100% load @ 765.0 / 1125.0 mhz in game. In this case, why would the 9800 GTX be performing better?
 

daedalus685

Distinguished
Nov 11, 2008
1,558
1
19,810


Take care to check that you are actually running at the same settings. I'm not sure how the game defaults with DX11, but make sure you are not running in DX11 and comparing the 9800 to that. It is entirely possible the game enables features by default that the 9800 does not support.

Many folks are claiming a CPU bottleneck... they are wrong, as that would decrease scaling, not performance (i.e. you would see no improvement, but you would not see a drop). A faster CPU is still nice for modern games, though, so I'd keep it OC'd.

There are a few possibilities but likely all revolve around software.
 


I LOL'd pretty hard at that.
 
Well, that is a harder question. I don't have BC2, so I don't really know how it should perform. However, a couple ideas/comments.

1.) The scores in the first post are 3DMark06 scores, right (it says 05)?

2.) Does OCing the processor help the game at all?

3.) What are the 3-D settings in Catalyst Control Center?

4.) Do you have any other games you can try?
 

flyinfinni

Distinguished
May 29, 2009
2,043
0
19,960
It really could be a CPU bottleneck in addition to something else, because if the settings are still the same you really shouldn't see a DROP in performance, at least not THAT much. What OC are you running on your CPU currently? You gained 4,000 points in 3DMark when you OC'd from 2.4 to 3.1 GHz - that sure is a lot of improvement, but check your GPU sub-scores and see how much just the GPU improved by. If the GPU improved by a ton, then you are probably CPU limited. Just gotta crank the OC on that CPU if that really is the case.
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710



1) Tests were done in 3DMark06

2) OC'ing the CPU from 2.4 GHz to 3.1 GHz resulted in a slight improvement of about 4 FPS.

3) CCC 3D settings:
Standard: Balanced
AA: Use application settings
AAMode: Performance
AF: Use application settings
AI: Standard
Mipmap: Quality
Smoothvision HD AA: Use application settings
Smoothvision HD Anisotropic Filtering: Use application settings
Vsync: Always off


4) Other games show roughly 50% performance increase - Tried Source engine (L4D2) and Eclipse engine (Dragon Age: Origins).



All info given is at 2.4 GHz unless stated otherwise. I ran 3DMark06 after formatting yesterday and got 12.6k. I'm considering buying a Q9x50 processor to see what happens. Not sure if I should do that or just spring for a new i7/mobo/memory (I'd really rather avoid this).

Edit: I checked a power consumption calculator and added in 50% capacitor aging. I've had the 600 watt PSU for 3 years, and my computer is never off even if I'm gone for a few days. The calc estimates power consumption of 524 watts. I wanted to bring this into the equation just in case.
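To spell out the headroom math (a rough sketch, assuming the calculator's 524 W figure already includes the 50% aging margin):

# Rough PSU headroom check, assuming the 524 W estimate already
# includes the 50% capacitor-aging margin from the calculator.
psu_watts = 600
estimated_draw = 524
headroom = psu_watts - estimated_draw
print(f"{headroom} W spare ({100 * headroom / psu_watts:.0f}% of capacity)")  # 76 W spare (13%)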
 

rndmavis

Distinguished
Mar 22, 2010
125
0
18,710
After logging CPU and GPU usage and doing some OC'ing, I've determined that there is a strong possibility that the CPU is bottlenecking. Let me share my findings.

My first GPU log shows the load on the 5850 tops out at 40%, usually hanging around 32%.

Here's the CPU usage during this test:

[Image: goldpony-1269493516-Perfmonbf2stock.gif - CPU usage at stock clocks]



The second set of logging I did was with the CPU OC'd to 3.2 GHz. The load on the 5850 maxed out at 44% and hovered around 38%.

OC'd CPU usage for this test:

[Image: goldpony-1269492666-bf23.gif - CPU usage with the CPU OC'd to 3.2 GHz]


The CPU usage does drop when it's OC'd. I don't know exactly how CPU usage scales with clock speed, but the GPU was able to do more work when the CPU was OC'd.

Is this solid enough to say that the CPU is the problem? Still, the question remains about my friend's PC, which runs a 4850 and a Q6600 with better performance than mine.

I also read that nvidia cards/drivers allow the GPU to handle more of the processing in games, while the same information said that Radeon stuff pushes more of the work to the CPU. Any truth to this?
 

flyinfinni

Distinguished
May 29, 2009
2,043
0
19,960
I would imagine your PSU should be fine - you've still got a little headroom even with 50% capacitor aging factored in, which is quite a bit. I doubt that is your problem.
With those tests, I'd say you are most likely CPU limited - your GPU is only hitting 44% usage at max? If the game simply didn't need more from the GPU (rather than the GPU being held back from delivering the graphical performance you want), I could understand that, but your GPU should be maxing out when you are throwing heavy graphics at it. With my i5 (@3.6 GHz) and 5750s in CrossFire (about equal to a 5850), I'm maxing both GPUs in most games I play that are not limited by a frame cap.
I wouldn't buy a new CPU yet - can you OC it any higher? Crank the OC on that as high as you can go and see if that does it. If not, THEN it's time to start looking for a new CPU.