GTX 460 768MB suffering huge FPS dip when going from DX9 to DX10/DX11

January 24, 2011 11:16:15 AM

Just curious, how much of a performance hit do you guys get when going up from DX9 to DX10/11?

In Crysis, DX9 gives 32.8 FPS while DX10 gives 28.85.
In Warhead, DX9 gives 48.31 FPS while DX10 gives 45.24.
In F1 2010, DX9 gives 45 FPS while DX11 gives 30 FPS.
In DiRT 2, DX9 gives 59 FPS while DX11 gives 47.8.
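
In percentage terms, those dips work out roughly as follows (a minimal sketch in plain Python, using only the FPS figures listed above):

```python
# Percentage drop from the DX9 result to the DX10/DX11 result for each game,
# using the figures quoted above.
benchmarks = {
    "Crysis (DX9 -> DX10)":  (32.8, 28.85),
    "Warhead (DX9 -> DX10)": (48.31, 45.24),
    "F1 2010 (DX9 -> DX11)": (45.0, 30.0),
    "DiRT 2 (DX9 -> DX11)":  (59.0, 47.8),
}

for name, (dx9_fps, dx1x_fps) in benchmarks.items():
    drop_pct = (dx9_fps - dx1x_fps) / dx9_fps * 100
    print(f"{name}: {drop_pct:.0f}% drop")
```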

Is this normal? If it isn't, what solution can I apply? Will reinstalling drivers help?

Here are my pertinent specs:

Intel Core 2 Duo E6550 @ 2.8 GHz
4GB DDR2-800 RAM
Inno3D GTX 460 768MB
January 24, 2011 11:45:24 AM

That is completely normal. DX10/11 is supposed to be faster than DX9 if done right, but devs don't implement it the way it was intended, so it ends up slower. Your FPS loss seems right to me.
January 24, 2011 1:20:11 PM

Most games still support XP, so you have a "base" DX9 path, with occasional branches for DX10/11. As a result, the DX9 path is the most optimized.
January 24, 2011 1:23:10 PM

So my huge dips are fine?

From what you're saying, upgrading the CPU will still cause dips like these, just at higher framerates?
January 24, 2011 4:26:01 PM

Probably.
January 24, 2011 4:39:22 PM

It's normal!
January 24, 2011 4:50:52 PM

Some games enable extra special effects when using DX10/11 because DX9 can't do them.

I would compare screenshots and see if there is any quality gain. If not, stick with DX9 since it's more performant.
January 24, 2011 5:03:08 PM

celpas said:
That is completely normal. DX10/11 is supposed to be faster than DX9 if done right, but devs don't implement it the way it was intended, so it ends up slower. Your FPS loss seems right to me.


This is a misunderstood concept.

DX10 is faster than DX9 when doing the same code.

DX11 is faster than DX10 when doing the same code.

Games written for DX10 usually include new visual improvements that DX9 cannot perform which will cost you some performance.

Games written for DX11 usually include new visual improvements that DX10 cannot do, so it also slows things down. Things like tessellation and ambient occlusion are added.

The general guideline most devs follow is not to write a game in DX10 or DX11 if you can get the same visuals out of DX9, even if it would be faster to write it in DX11. This is because more people can play it if it's written in DX9. The same goes for DX10 vs DX11.
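
To make that concrete, here is a hypothetical sketch of the trade-off; the per-frame millisecond costs below are made-up numbers for illustration, not measurements from any real engine:

```python
# Hypothetical frame-time budget: the newer API has lower overhead for the same
# work, but the extra effects it enables cost more than the overhead it saves.
# All millisecond values are illustrative assumptions.
BASE_SCENE_MS   = 18.0                                    # identical scene work on every path
API_OVERHEAD_MS = {"DX9": 4.0, "DX10": 3.2, "DX11": 3.0}  # same code, cheaper on newer APIs
EXTRA_FX_MS     = {"DX9": 0.0, "DX10": 2.5, "DX11": 5.0}  # tessellation, ambient occlusion, ...

for api in ("DX9", "DX10", "DX11"):
    frame_ms = BASE_SCENE_MS + API_OVERHEAD_MS[api] + EXTRA_FX_MS[api]
    print(f"{api}: {1000 / frame_ms:.1f} FPS")
```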
January 24, 2011 6:06:24 PM

^+1

I think many people still don't get this right. I remember that right before DX11 came out, many people thought DX11 games would still run faster than DX10/DX9 even with the new visual improvements added to the game (which obviously eat your performance if you think about it logically). ;)
January 24, 2011 6:38:00 PM

I wouldn't say what you are getting is "huge dips"; I'd say they're exactly what I would expect. As renz said, you're getting better visual quality, so of course performance is not going to remain the same.
January 25, 2011 8:28:59 AM

I guess I'll be sticking with DX9. I'd prefer consistently smooth framerates rather than eye candy that I won't be looking at when I'm shooting at aliens or hitting the apex.

Are there any games that look substantially better in DX10/DX11 than DX9? Crysis and F1/DiRT don't look particularly different.
January 25, 2011 9:28:41 AM

bystander said:
This is a misunderstood concept.

DX10 is faster than DX9 when doing the same code.

DX11 is faster than DX10 when doing the same code.

Games written for DX10 usually include new visual improvements that DX9 cannot perform which will cost you some performance.

Games written for DX11 usually include new visual improvements that DX10 cannot do, so it also slows things down. Things like tessellation and ambient occlusion are added.

The general guideline most devs follow is not to write a game in DX10 or DX11 if you can get the same visuals out of DX9, even if it would be faster to write it in DX11. This is because more people can play it if it's written in DX9. The same goes for DX10 vs DX11.

I still think that DX10 is not optimised well. Take Crysis Warhead for example: when I run it in DX9 with everything maxed, I get the performance of the DX10 Gamer settings (High). It also feels much more responsive than DX10. Comparing DX10 vs DX9 in the game, all I could find was that DX10 had slightly more draw distance and a brighter sun, but it took nearly 5 FPS away. The only games in which DX10 was actually done right were Metro 2033 and Just Cause 2. In Metro 2033 there was only a 2 FPS difference between DX9 and DX10, and the visual difference was enormous: the DX9 image was way too bright and less realistic, while DX10 looked darker and more realistic.
January 25, 2011 9:33:37 AM

jut703 said:
Just curious, how much of a performance hit do you guys get when going up from DX9 to DX10/11?

In Crysis, DX9 gives 32.8 FPS while DX10 gives 28.85.
In Warhead, DX9 gives 48.31 FPS while DX10 gives 45.24.
In F1 2010, DX9 gives 45 FPS while DX11 gives 30 FPS.
In DiRT 2, DX9 gives 59 FPS while DX11 gives 47.8.

Is this normal? If it isn't, what solution can I apply? Will reinstalling drivers help?

Here are my pertinent specs:

Intel Core 2 Duo E6550 @ 2.8 GHz
4GB DDR2-800 RAM
Inno3D GTX 460 768MB

BTW, what settings do you play Crysis Warhead at? Just interested because you are getting 45 FPS in the game.
January 25, 2011 11:16:52 AM

Those are controlled benchmark runs.

Both Crysis and Warhead are at 1920x1080, High Settings, 0xAA.

In Crysis I use the heavy Assault Harbor bench.
In Warhead I use the Frost bench which is the heaviest of the bunch but not as heavy as Assault Harbor.
January 25, 2011 1:11:07 PM

Your card is being heavily bottlenecked by your RAM and your processor, because I just ran the Crysis Assault Harbor benchmark and got 36 FPS average. My rig is:
Intel Core i7 920 @ 2.66 GHz
MSI GeForce GTX 260 1792MB (single card)
6GB DDR3-1600 RAM

1/25/2011 8:34:26 PM - Vista 64
Beginning Run #1 on Map-harbor, Demo-Assault_Harbor
DX9 1920x1080, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen
Demo Loops=1, Time Of Day= 5
Global Game Quality: High
==============================================================
TimeDemo Play Started , (Total Frames: 4100, Recorded Time: 132.23s)
!TimeDemo Run 0 Finished.
Play Time: 112.02s, Average FPS: 36.60
Min FPS: 24.77 at frame 2902, Max FPS: 55.44 at frame 1851
Average Tri/Sec: 12700486, Tri/Frame: 346998
Recorded/Played Tris ratio: 0.04
TimeDemo Play Ended, (1 Runs Performed)
==============================================================

Completed All Tests

<><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

1/25/2011 8:34:26 PM - Vista 64

Run #1- DX9 1920x1080 AA=No AA, 32 bit test, Quality: High ~~ Last Average FPS: 36.60

Try going to www.geforce.com and update your drivers.
January 25, 2011 9:20:15 PM

You can't compare your setup to his!?

And of course your system will run faster because of your components.


You have a GTX 260 with 1792MB??? He only has 768MB!

Different models will average different results, common sense there.
January 25, 2011 11:52:27 PM

celpas said:
I still think that DX10 is not optimised well. Take Crysis Warhead for example: when I run it in DX9 with everything maxed, I get the performance of the DX10 Gamer settings (High). It also feels much more responsive than DX10. Comparing DX10 vs DX9 in the game, all I could find was that DX10 had slightly more draw distance and a brighter sun, but it took nearly 5 FPS away. The only games in which DX10 was actually done right were Metro 2033 and Just Cause 2. In Metro 2033 there was only a 2 FPS difference between DX9 and DX10, and the visual difference was enormous: the DX9 image was way too bright and less realistic, while DX10 looked darker and more realistic.


Crysis in DX10 mode used new shading methods not available in DX9. It definitely added new features. However, those newer, more demanding features may not be worth the performance loss, and might not even be noticeable, but they are new techniques regardless. That is why it performs slower in DX10.

Metro 2033 was interesting to me. I was playing with some of the advanced features on the benchmark, like DoF, advanced PhysX and such, but could hardly tell a difference. Even going from high to very high, the quality difference was small, yet the performance impact was huge. Looking at the description field, the difference in a lot of areas was whether the game would approximate things or if it would calculate it out.

January 26, 2011 12:38:05 AM

accolite said:
You can't compare your setup to his!?

And of course your system will run faster because of your components.


You have a GTX 260 with 1792MB??? He only has 768MB!

Different models will average different results, common sense there.



He owns a GTX 460, which should run faster than a GTX 260 no matter what the other components are. He is running at 1920x1080, which is more of a GPU-bound resolution.
January 26, 2011 1:28:43 AM

celpas said:
He owns a GTX 460, which should run faster than a GTX 260 no matter what the other components are. He is running at 1920x1080, which is more of a GPU-bound resolution.

Yep, you are correct, a GTX 460 would outperform your GTX 260, and yes, I would say a few things are bottlenecking his PC. Maybe if he upgraded to a quad core, got more RAM and overclocked the RAM, he would get a little more out of it.
January 26, 2011 1:38:42 AM

Oh, my mistake, I was in a hurry and misread what you have and what the OP has; all this 260/460/560 is starting to blend together for me.



January 26, 2011 1:55:17 AM

There is also the possibility that VRAM is why my GTX 260 can be faster. Crysis, Metro 2033 and GTA 4 all gobble up a lot of VRAM, and he only has 768MB of VRAM, which is not sufficient for Crysis or Metro. Also, according to the graph the 460 768MB is 6% faster than the GTX 260.
January 26, 2011 2:15:28 AM

celpas said:
He owns a GTX 460, which should run faster than a GTX 260 no matter what the other components are.

Not when the components are an E6550 vs i7.
January 26, 2011 2:22:53 AM

Yeah, that list of cards needs updating; it's missing the GTX 570. And yeah, I know the GTX 560 is out, but it's too new to be posted in it. ;)
January 26, 2011 2:23:11 AM

jyjjy said:
Not when the components are an E6550 vs i7.

He will see a bottleneck in CPU-bound games like BFBC2 and GTA 4. Crysis and Metro 2033 don't utilise the CPU much at all. In fact, Metro 2033 showed no difference when moving from a Core 2 Duo to an i7:
http://www.legionhardware.com/articles_pages/metro_2033...

Besides, he is playing at 1920x1080 resolution, which is more GPU bound.
January 26, 2011 2:36:01 AM

But that's just a small list; let's be honest, most intense games need decent CPU power. :/ When I upgraded from my LGA 775 to an 1155 with a 9600 GT, I got a huge FPS increase in 80% of my games, and Crysis went from like 25 to 35 FPS, that's with the same settings and resolution on the 9600. I have upgraded to a GTX 570 since then, but even before that I still got the performance boost.
January 26, 2011 2:37:52 AM

celpas said:
He will see a bottleneck in CPU-bound games like BFBC2 and GTA 4. Crysis and Metro 2033 don't utilise the CPU much at all. In fact, Metro 2033 showed no difference when moving from a Core 2 Duo to an i7:
http://www.legionhardware.com/articles_pages/metro_2033...

Besides, he is playing at 1920x1080 resolution, which is more GPU bound.

Crysis does need a decent CPU. I'm not surprised a Core 2 Duo at 2.8GHz isn't quite enough for a GTX 460, even in that game.

@OP: Have you tried overclocking the GTX 460? It overclocks very well, often over 30%. You'll know which games are being bottlenecked by the CPU if the overclock doesn't get you significantly better results.
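
As a rough sketch of that test (the 0.5 scaling ratio below is an illustrative assumption, not a hard rule):

```python
# If a large GPU overclock buys only a small FPS gain, the CPU is likely the
# limiting factor. The 0.5 scaling ratio is an illustrative threshold.
def likely_cpu_bound(fps_stock: float, fps_oc: float, gpu_oc_pct: float) -> bool:
    fps_gain_pct = (fps_oc - fps_stock) / fps_stock * 100
    return fps_gain_pct < 0.5 * gpu_oc_pct

# Hypothetical example: a 30% overclock that only yields ~5% more FPS.
print(likely_cpu_bound(fps_stock=45.0, fps_oc=47.2, gpu_oc_pct=30))  # True -> CPU bound
```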
January 26, 2011 2:38:49 AM

forgot to include that DDR2 vs DDR3 >.> that also helped the JUMP in FPS
January 26, 2011 10:18:10 AM

My RAM is already overclocked; it's originally DDR2-667.

My video card is already overclocked, running it at 820/1000. About as fast as a stock GTX 460 1 GB, if not faster.

As you can see, Warhead is much less CPU dependent than the original Crysis. I gained a significant amount of FPS when overclocking it, even when I ran my GTX 460 at stock settings. Celpas, what are your Warhead benchmark scores? I would assume my 460 scores higher than your 260 there, whereas in Crysis you beat me due to my vastly inferior CPU.

In Crysis, my biggest FPS jump was when I overclocked from 2.33 to 2.8 GHz, even at 1080p.

I know 1920x1080 is supposed to be highly GPU bound, but as I have learned, you will have to have a decent CPU no matter what. I already felt that a GTS 450 would have been the most optimal upgrade from my old HD 3850 without bottlenecking my rig. However, a 450 isn't strong enough for 1920x1080 at High settings so I got the 460.
January 26, 2011 10:19:37 AM

bowzef, is there any online article you could link that shows DDR2 to DDR3 gives an FPS JUMP? I would say marginal gains, but definitely not a jump.

Oh yeah, I forgot to mention that I'm already running the latest drivers, 266.58.
January 26, 2011 10:37:10 AM

celpas said:
He will see a bottleneck in CPU-bound games like BFBC2 and GTA 4. Crysis and Metro 2033 don't utilise the CPU much at all. In fact, Metro 2033 showed no difference when moving from a Core 2 Duo to an i7:
http://www.legionhardware.com/arti [...] ide,9.html

Besides, he is playing at 1920x1080 resolution, which is more GPU bound.


The review at that link doesn't include a Core 2 Duo. Depending on the game, the difference between a C2D and an i7 is noticeable even @ 1080p.

F1 2010, which is on the OP's list, is very CPU bound.

http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-19.html

it also has a bug that locks GPU usage @ 60%.
January 26, 2011 11:18:40 AM

OK. I will run the Warhead benchmark and post the results here.
January 26, 2011 12:48:24 PM

Which was the Warhead benchmark you were running: the Frost flythrough or only Frost?
January 26, 2011 2:49:30 PM

Frost, not the flythrough. Gives more realistic FPS that way.
January 26, 2011 5:27:40 PM

jut703 said:
I already felt that a GTS 450 would have been the most optimal upgrade from my old HD 3850 without bottlenecking my rig. However, a 450 isn't strong enough for 1920x1080 at High settings so I got the 460.

I'd say the card you bought is about exactly right for your processor and resolution. There will always be a bottleneck of some sort, and there is no rule that says it's worse when it's the CPU. The card will be limited in some games by the CPU, but there are still many games where your CPU is adequate and the GTX 460 will be of great benefit over weaker cards.
January 26, 2011 7:46:24 PM

I'd overclock that CPU if possible.
January 26, 2011 9:50:56 PM

It is already OCed by 0.5GHz actually. Should be able to go higher though.
January 27, 2011 1:09:19 AM

OK. Seems as if your CPU was bottlenecking in Crysis, because I ran the Warhead benchmark and got 41 FPS average in DX10 and 45 in DX9. But still, try to replace your Core 2 Duo with a quad core, because the difference between a 260 and a 460 should be more than just 3-4 FPS.
January 27, 2011 1:17:06 AM

celpas said:
OK. Seems as if your CPU was bottlenecking in Crysis, because I ran the Warhead benchmark and got 41 FPS average in DX10 and 45 in DX9. But still, try to replace your Core 2 Duo with a quad core, because the difference between a 260 and a 460 should be more than just 3-4 FPS.


That's about exactly how much higher a 460 768MB should perform compared to a 260. Look at the chart above. The two only perform about 10% differently (49% vs 55%), and 4 FPS is about 10% of your score.
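
Working that out with the numbers in this thread (a quick sanity check, nothing more):

```python
# Relative-performance gap between the two cards vs the observed FPS gap.
card_gap_pct = (55 - 49) / 49 * 100   # chart figures quoted above -> ~12%
fps_gap_pct  = 4 / 41 * 100           # ~4 FPS on a ~41 FPS Warhead DX10 score -> ~10%
print(f"card gap ~{card_gap_pct:.0f}%, FPS gap ~{fps_gap_pct:.0f}%")
```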

Perhaps you were talking about another poster's upgrade...
January 27, 2011 5:29:08 AM

@bystander
My GTX 460 was overclocked to 875/1100 MHz with my 48 FPS. I admit I'm CPU bound. Perhaps if I had a GTX 260 I wouldn't even be able to break 40 FPS.

@jyjjy
If it's just about right, why can't I soundly outperform a GTX 260 at stock paired with an i7-920?

@psychosaysdie
I've overclocked it, and that's the highest it'd go without a voltage increase. I'm using a stock cooler and value RAM, so I can't push it that far.
January 27, 2011 7:30:06 AM

Video card's overclocked to 875/1100 MHz.
January 28, 2011 3:12:13 PM

jut703 said:
@jyjjy
If it's just about right, why can't I soundly outperform a GTX 260 at stock paired with an i7-920?

I assumed you were not only going to be playing that specific game. If so then I guess maybe you did get a card that is more powerful than your processor can utilize.
January 29, 2011 2:06:40 AM

jyjjy said:
I assumed you were not only going to be playing that specific game. If so then I guess maybe you did get a card that is more powerful than your processor can utilize.

Actually, a bottleneck is a problem which no upgrade can completely solve. My original processor in this rig was a Core 2 Quad Q8200, and when I upgraded to the i7 I saw a massive FPS boost in all games, which showed that my video card was not being utilised 100%. If I upgraded my processor to an i7 Extreme Edition, my FPS would probably match a GTX 460 1GB paired with a Core 2 Duo.
January 29, 2011 2:12:09 AM

celpas said:
Actually, a bottleneck is a problem which no upgrade can completely solve. My original processor in this rig was a Core 2 Quad Q8200, and when I upgraded to the i7 I saw a massive FPS boost in all games, which showed that my video card was not being utilised 100%. If I upgraded my processor to an i7 Extreme Edition, my FPS would probably match a GTX 460 1GB paired with a Core 2 Duo.


What resolution do you play at, and what FPS did you start at and end up at?
January 29, 2011 2:52:41 AM

bystander said:
What resolution do you play at, and what FPS did you start at and end up at?

I have 2 monitors, but I usually play on a 22-inch Samsung monitor at 1680x1050. In Crysis I was getting 26 FPS previously, increased to 30 on the i7. In GTA 4 I was getting 35 FPS, increased to 48 after the upgrade. In Metro 2033 I was getting an average of 30 on High settings, increased to 34.