
DirectX 10 shafted, Nvidia implementation dodgy?

June 3, 2007 8:37:41 PM

After receiving my first DX10 card, an Nvidia 8500 GT, this weekend, I'm more than a little surprised at the performance comparison between DX10 and DX9.

Now, I knew the card was going to be weak, but I was pleasantly surprised that after a bit of tweaking (overclocking) it performed pretty well in my games: STALKER, Oblivion, and Company of Heroes (DX9) at about 31 FPS (more or less max settings).

The DX9 note on CoH is important because of the recent patch, which I was eager to try when I found out about it today. After installing the DX10 patch I went to the options to benchmark and was surprised to see most graphics settings defaulting to low, so I thought I'd turn a few up and give it a try; it must be able to take it, since it can in DX9. Bam: 6 FPS. Fine, try the defaults: 20 FPS.

Well, this isn't right: at the defaults it looks terrible, everything is turned down, and the FPS is still low.

I had tried the DX10 SDK samples before, which had equally bad performance. I put that down to it being sample code, but now I'm wondering whether something more serious is wrong with either DX10 or Nvidia's current implementation.

I know the 8500 is low on shaders and that this will be cited as the cause, but then why the decent performance in shader-intensive DX9 games?
Hoping for some thoughts on this.
June 3, 2007 8:48:33 PM

It's definitely the card. I see DX10 on anything but the 8800 and the HD 2900 as nothing more than a way to up the price. Getting 30 FPS out of the card with everything near max isn't bad.

The DX10-specific parts of the drivers may still be young and need optimizing, if that's possible.
June 3, 2007 8:55:01 PM

I agree with Paul. It's your card. There's nothing wrong with it, it's just that it's underpowered for DX10.

I've been playing a couple DX10 things now (Lost Planet, CoH, and the nVidia DX10 demos) and they run relatively smoothly. The head demo really taxes my system... but the games work fine.

Sorry bud... but if you were looking for a good DX10 experience, the 8500 isn't the answer. Stick to DX9... there's still a lot of life left in DX9.
June 3, 2007 8:56:55 PM

Oh yeah, the G86 is a sweet chip when it comes to overclocking, and the temps on it are incredible.

Core 647, mem 656 (GDDR3)
Idle 50°C, load 60°C

The card was factory overclocked to 600/600.

I got this for £43 including the Google Checkout £10-off offer.

With the performance I was getting and the price, I was considering going SLI, hoping to get the equivalent of an 8600 GT 512MB for £86; not bad, I thought. But this DX10 performance has turned me way off.

Does this mean that Nvidia's implementation weighs heavily towards DX9?

I'd like to hear from some 8800 owners who have tried the CoH DX10 patch.
June 3, 2007 8:57:38 PM

You're making an apples-to-oranges comparison. When you run CoH in D3D10 mode you turn on a dozen new features that are impossible in D3D9. So yeah, of course it's going to suck. If it only gets 30 FPS in D3D9, and in D3D10 it has to do 3x the work to show the new visuals, you won't be happy.

It's nothing to do with the implementation. The 8800 has 128 shader units; the 8500 has 32 or 16 or something. It's due to lack of power. While developers could choose to make a low-end version of their D3D10 path, most won't bother, just as Relic didn't. The cards that would run D3D10 on low can just run the D3D9 version fine.
June 3, 2007 9:02:34 PM

Quote:
Does this mean that Nvidia's implementation weighs heavily towards DX9?


Really, there isn't much difference in pixel shaders between 2.0, 2.0a, 2.0b, 3.0, and Direct3D 10's 4.0; they just tend to get longer. Nvidia didn't mess anything up or make the card DX9-centric; they just cut so much power from the 8600 that it can't run the pixel shaders at decent frame rates unless you lower your resolution a lot, or shorten the shaders to 9.0 lengths.

D3D10 reduces CPU overhead; if you're GPU-bound, it does nothing for you.

Quote:
I'd like to hear from some 8800 owners who have tried the CoH DX10 patch.


http://www.legitreviews.com/article/507/1/ has 8800 and 2900 numbers.
June 3, 2007 9:02:46 PM

But doesn't it seem bizarre to have a DX10 card that actually performs worse in DX10? I wasn't expecting it to suddenly give me tons of eye candy, but at least close to the same amount. And trust me, it is nowhere near; indeed, the software rasterizer is nearly as fast (a small exaggeration, but not much).
June 3, 2007 9:03:19 PM

FYI... I just upgraded to the June 1st release of the ForceWare drivers and saw major improvements in the Lost Planet demo.
June 3, 2007 9:08:07 PM

I will just accept the performance; it's a cooler-running card than the last, so that's not so bad. Still, I am a bit disappointed, but it will let me play around with DX10 programming as well.

Oh, I should mention too that I'm impressed with Nvidia's new image quality; I never realised how bad it was before :) I came from a 6800 GS.
June 3, 2007 9:20:59 PM

If this is the way of it, does it mean that in practice the majority of the GeForce 8 series is effectively DX9 cards? You're better off with DX9 until its improved performance hits diminishing returns, and only then switching to DX10, which starts at about 8800 GTS level.
June 3, 2007 9:25:28 PM

Quote:
But doesn't it seem bizarre to have a DX10 card that actually performs worse in DX10? I wasn't expecting it to suddenly give me tons of eye candy, but at least close to the same amount. And trust me, it is nowhere near; indeed, the software rasterizer is nearly as fast (a small exaggeration, but not much).


I don't think you understood. Yes, your D3D10 card performs better in D3D9. It will perform even better in D3D8, and still better in D3D7! And best of all in 6!

Imagine a game that supports 10, 9, 8, 7, and even 6.

In 6 you just have one texture, no lightmaps. Looks wrong, but is fast since the 'pixel shader' is just one instruction to read the texture.

In 7 you have the texture plus a lightmap texture; the 'pixel shader' is three instructions: read texture A, read texture B, then multiply them. Slower than 6 and uses more memory.

In 8 you have pixel shader 1.1; it might read a third texture as a specular map and multiply it by the specular lighting calculated in the vertex shader. Slower and more memory than 7, again.

In 9, maybe a normal map and a gloss map are added and all the lighting moves per-pixel. Maybe a shadow map too, with the scene rendered twice (once for what you see and once to create the shadow map). Slower pixel performance, more memory, and more CPU overhead from rendering the scene twice.

Then 10 comes along and adds soft particles, improved shadow filtering, shadow maps for more lights, and other things. Slower again. More memory again.

What you're saying is like: OMG, the Nvidia GeForce3 sucks, it's faster in DX7 mode than when it uses pixel shader 1.1! OMG, the ATI 9700 sucks, it's faster in pixel shader 1.1 mode than in 2.0 mode! Why did they release a card that is faster when it does less work??

In CoH you've got Low (pixel shader 1.1), High (pixel shader 2.0), and Ultra (pixel shader 4.0/Direct3D 10) shader settings. Even if you set the other detail settings the same, you're still using more expensive shaders and hence need more power.
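
Roughly what that progression looks like in shader code, if it helps. This is just an illustrative HLSL sketch (made-up names, not Relic's actual shaders):

    // "DX7-level" material: base texture modulated by a lightmap.
    // Only a couple of instructions per pixel.
    sampler2D baseTex;
    sampler2D lightMap;

    float4 PS_OldStyle(float2 uv : TEXCOORD0, float2 lmUV : TEXCOORD1) : COLOR
    {
        return tex2D(baseTex, uv) * tex2D(lightMap, lmUV);
    }

    // "DX9-level" material: per-pixel lighting from a normal map.
    // Same surface, several times the per-pixel work.
    sampler2D normalMap;
    float3 lightDirTS;   // light direction in tangent space, set by the app

    float4 PS_NewStyle(float2 uv : TEXCOORD0) : COLOR
    {
        // Unpack the normal from [0,1] texture range to [-1,1].
        float3 n = normalize(tex2D(normalMap, uv).xyz * 2.0 - 1.0);
        float diff = saturate(dot(n, -lightDirTS));
        return tex2D(baseTex, uv) * diff;
    }

Ultra's 4.0 shaders pile shadow filtering, soft particles, and so on on top of that, so every pixel gets more expensive again.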
June 3, 2007 9:28:53 PM

You buy small you get small.
June 3, 2007 9:43:54 PM

Looks like it's a while until DirectX 10 is really for the masses, then.
June 4, 2007 1:10:50 AM

Quote:
Looks like it's a while until DirectX 10 is really for the masses, then.


I hear what you're saying about "Why market DX10 if it won't run DX10 titles."

DX10 doesn't have many releases now, and most are going to be geared towards enthusiasts who have been dying to open Pandora's DX10 box.

I wouldn't be too disappointed yet. Newer DX10 titles 'should' provide a way of getting DX9 performance with some DX10 bells and whistles, without a huge performance hit. The market wants crazy-ridiculous DX10 graphics, so that's what developers are focusing on right now.
June 4, 2007 10:02:03 AM

Yeah, Whizzard, that's kind of what I was getting at. I wasn't expecting new DX10 shading madness with uber FPS for nothing, but conversely I wasn't expecting DX10 to be so expensive in GPU processing power.

So if 16 shader units at 1200 MHz isn't acceptable, and I doubt 32 (8600) would be that much better, does it look like 96 (8800 GTS) will become the more realistic entry level for DX10?

I know you mentioned optimised code later on, Whizz, but I can't see any optimisation improving DX10 performance on these cards by that much, because as I said before, even some of the simple DX10 SDK samples run terribly; the instancing demo actually crashed the card at one point, and the motion blur one is a PowerPoint slideshow on a 486 :)
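
For anyone curious, the instancing technique those samples exercise boils down to something like this vertex shader; a minimal sketch from memory (the names here are invented), not the actual SDK sample code:

    // One draw call renders many copies of a mesh; each copy picks up
    // its own offset via the D3D10 system value SV_InstanceID.
    cbuffer PerFrame
    {
        row_major float4x4 viewProj;    // camera transform, set by the app
        float4 instanceOffsets[64];     // one world-space offset per instance
    };

    struct VSIn  { float3 pos : POSITION; uint id : SV_InstanceID; };
    struct VSOut { float4 pos : SV_Position; };

    VSOut VS(VSIn input)
    {
        VSOut o;
        // 64 meshes for the cost of a single draw call and one small buffer.
        float3 world = input.pos + instanceOffsets[input.id].xyz;
        o.pos = mul(float4(world, 1.0), viewProj);
        return o;
    }

Nothing exotic, which is why it's so surprising that a card sold as DX10 chokes on it.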
June 4, 2007 12:26:08 PM

Jaydee - it's a long post :) Anyway, I assume you meant it in relation either to the fact that there are no ground-up DX10 games, or possibly to the pathetic fill rate on these cards.

These are fair points, the fill rate for definite, but as for the software angle, I still think no amount of code tinkering is going to let these cards (including the 8600) run DX10 with anything like the eye candy of DX9 games. I'll mention the DX SDK samples again because, while the code may not be optimised, they are in essence only small snapshot programs of individual techniques, which one would have thought would be at least within the reach of even an entry-level card.

As a side note, if anyone in the know is reading: is fill rate affected by core and/or memory overclocks, or is there some other factor limiting it?
June 4, 2007 12:37:10 PM

I understand. If you read my posts, they went something like: I wouldn't recommend ANY DX10 midrange card, as they're choked, overpriced, and the software currently posing as "DX10" doesn't help; it hinders the performance of these too-weak, overpriced cards in DX10. DX9, however, is another matter, though for the same money you can get a much higher-performing DX9 card. I find that a shame, and I'm not liking the direction, or the sidestepping by these "DX10" games or "patched" games that are currently out. If these games were properly written in a more "full" DX10 mode, they would show increases without the manufacturer having to do a thing, only driver upgrading.
June 4, 2007 12:59:57 PM

Yeah, good points, and similar to my own train of thought regarding upgrading. I had only done so because I was interested in trying out some DX10 programming, and the software rasterizer was painful to use. I'm fairly pleased with the card barring the DX10 performance. The temperatures and power consumption are considerably lower than my old card's, and with the addition of GDDR3 on the XpertVision model at a price (£53) lower than most others, plus the Google Checkout £10-off offer, it seemed even sweeter. My old card was a flamethrower compared with this one, and as I mentioned before, Nvidia's new image quality is noticeably better than the 6 series I had before on the same settings.

But it just goes to show the nature of technology: a lot of promises are made long before they actually come true, or are indeed fake promises. It brings to mind the HDTV arena, where 'HD Ready' was touted as the next sliced bread of TV tech; then within a year or two it's, 'Oh, wait, sorry, did we say HD Ready? We meant Full HD is the next sliced bread.' 'Well, what about that HD Ready set?' 'Ah, not as good.' I still have an SD TV myself, and it's fine sliced bread to me :)
June 4, 2007 1:13:05 PM

Quote:

With the performance I was getting and the price, I was considering going SLI, hoping to get the equivalent of an 8600 GT 512MB for £86; not bad, I thought. But this DX10 performance has turned me way off.


FYI, the 8500 is missing the SLI bridge. Does anyone know if there's a workaround? Because if there isn't, you're not going to get your 8600 GT performance. I seem to remember that Nvidia wasn't going to have SLI on the cheapo cards, as SLI tends to be for people who have lots of money...
June 4, 2007 1:15:58 PM

No offense but what do you expect from a lower-end DX10 card?

There's the 8800 GTX/Ultra at the top, the 8800 GTS and 2900 XT near the top, the 8600 series in the middle, the 8500s right now at the bottom end of the mid-range, and finally the 8300 and 8400 as entry-level cards.
June 4, 2007 1:20:44 PM

Why would anyone buy an 8500 anyway.....
June 4, 2007 1:31:54 PM

Blake - the 8500 is SLI-ready, I believe; it just uses the PCIe bus for GPU-to-GPU communication, probably enough bandwidth considering they're not uber cards.

Oh, Blake, I am considering trying SLI on these because I can get another one for £43. If I do, I'll put up a post if you're interested; Ebuyer are out of stock at the moment, so I'll have to wait and see.

Hehe, guys, not all of us are going to go out and spend £200+ on graphics; hell, not even a decent-sized minority. And anyway, if DX10 is supposed to free up the CPU, but to get it you need to spend £200+ on graphics when a decent CPU can be had for around £100, then something strange is going on; it smacks a little of false economy to me :) Oh, and the 8600 is not that spectacularly better than the 8500, though I am taking into account the insane overclocking headroom in the G86; mine is about 30% up on stock and hasn't even flinched, temps stay well within limits, and there are no noticeable artifacts.
June 4, 2007 1:31:56 PM

Quote:
After receiving my first DX10 card, an Nvidia 8500 GT, this weekend, I'm more than a little surprised at the performance comparison between DX10 and DX9.

Now, I knew the card was going to be weak, but I was pleasantly surprised that after a bit of tweaking (overclocking) it performed pretty well in my games: STALKER, Oblivion, and Company of Heroes (DX9) at about 31 FPS (more or less max settings).

The DX9 note on CoH is important because of the recent patch, which I was eager to try when I found out about it today. After installing the DX10 patch I went to the options to benchmark and was surprised to see most graphics settings defaulting to low, so I thought I'd turn a few up and give it a try; it must be able to take it, since it can in DX9. Bam: 6 FPS. Fine, try the defaults: 20 FPS.

Well, this isn't right: at the defaults it looks terrible, everything is turned down, and the FPS is still low.

I had tried the DX10 SDK samples before, which had equally bad performance. I put that down to it being sample code, but now I'm wondering whether something more serious is wrong with either DX10 or Nvidia's current implementation.

I know the 8500 is low on shaders and that this will be cited as the cause, but then why the decent performance in shader-intensive DX9 games?
Hoping for some thoughts on this.
What did you expect when you bought the 8500 GT? Also, you're getting an average of 31 FPS in Oblivion at max settings? I didn't know Oblivion supported resolutions lower than 640x480. :lol: 
June 4, 2007 1:36:55 PM

Heyyou, sorry, the post may not be clear, but the 31 FPS was in CoH; 31 FPS near high settings would be some feat in Oblivion :) And don't take it as a definitive figure or anything; a few key things were turned down, 2x AA and the like, but it was on the highest shader setting (bar DX10). Oh, and it was at 1024x768.

People always like to pick on the little guy :) 
June 4, 2007 1:51:05 PM

Quote:
Heyyou, sorry, the post may not be clear, but the 31 FPS was in CoH; 31 FPS near high settings would be some feat in Oblivion :) And don't take it as a definitive figure or anything; a few key things were turned down, 2x AA and the like, but it was on the highest shader setting (bar DX10). Oh, and it was at 1024x768.

People always like to pick on the little guy :) 
I always like to give people crap when they say they play Oblivion at the max, because I can just barely pull it off with my 8800 GTX. :wink:

It's a shame that DirectX 10 performance isn't all it's cracked up to be, but as long as the card is doing what you want in DirectX 9, why does it matter? With only Lost Planet and Company of Heroes out so far, there's still a lot of life left in DirectX 9.
June 4, 2007 2:02:54 PM

I just want to play devil's advocate on behalf of the OP.

Most of the hype over DX10 wasn't eye candy; it was better performance: all this talk about less overhead and faster shaders. We, as general consumers, were led to believe that DX10-enabled apps would run BETTER at the same settings than under DX9.

So, in short, that's why "it matters" ;) 

I have an 8800 GTS, and I expected DX10 to both look better AND perform better. From what I've seen so far, DX10 looks the same as DX9 and performs worse. I understand that I, myself, made the argument about the direction developers are taking DX10, but I totally understand the disappointment with mid-range DX10 cards.

We were all told that DX10 would perform better than DX9 at the same settings, and it's simply not living up to the hype.
June 4, 2007 2:19:16 PM

Quote:
I always like to give people crap when they say they play Oblivion at the max, because I can just barely pull it off with my 8800 GTX. :wink:


Huh? My GTX doesn't even work properly and I play at 1920x1200 with 2xAA and QTP. I suppose it depends on what you consider acceptable fps (I get around 35 in thick forest). :?

On topic: it wasn't clear from the benchmarks whether both the DX9 and DX10 tests were carried out under Vista. Methinks the overhead of that OS would negate any DX10 performance gain.
June 4, 2007 2:24:06 PM

Oh yeah, Heyyou, the DX10 performance isn't really having too much of a bearing, as it's unlikely that game developers will be pushing DX10 code to the masses for at least another year. I'm thinking not until the GeForce 9 mid-range has been released, because that is going to be the only feasible DX10 critical-mass point. Otherwise, what's the point? Only people with 8800s are going to be able to run it, and that's a small slice of the pie.

Whizz, yeah, the old saying 'one step forward, two steps back' comes to mind. It will probably even up in the long run, and indeed it may be exaggerated by a little corporate greed. Nvidia at this point can claim a fully DX10-compliant range of cards, and without software, who's going to question it? :) They probably knew fine rightly that these cards would never run DX10 acceptably.

P.S. Not to say other companies wouldn't do the same. Don't want to start any fanboi rants here :) 
June 4, 2007 2:52:41 PM

Claig - both on Vista; I'm not in the habit of changing OSes just to try out a £43 graphics card :) The bench was pretty rough and ready; I was only trying to get a feel for what to expect from the DX10 API, so take it with an appropriate pinch of sodium chloride :)

As a side note, Vista was kind of forced on me, as no amount of fiddling would get my Opteron 170 recognised by XP Pro; I'm still not sure what was up there. Not too unhappy about it; Vista has been OK for me barring a few minor quibbles :)

But it looks like I will have to wait about a year for DX10 :( What was Vista all about again? Hehe, nah, there's more to it than that :) Right?
June 4, 2007 4:37:00 PM

Well it's like they showed in these benches, http://www.legitreviews.com/article/507/3/

Even the $800 uber 8800 Ultra can't run the game at 60 FPS at 12x10 with quality on, and that's not even with AA, a must (for me anyway). I wouldn't feel too bad that your dirt-cheap card can't do well when even the top dogs are struggling.
June 4, 2007 5:46:39 PM

Cheers for the link there, Bung. Well, it looks like DX10 is a GPU hog; I hope the CPU savings are worth it. A little tongue-in-cheek there; the added shader functionality should allow for some nifty stuff in the future, I'm sure. But it goes to show that DX10 isn't really going to truly arrive with the current graphics generation.

But having seen some screenies of CoH DX10, I'm wondering whether it was worth the effort; the only noticeable thing was the placement of numerous stones via instancing. Now, admittedly these are static screenies, so things like soft particles aren't really noticeable, and to be fair, I'm sure Relic are saving the real DX10 effort for 'Opposing Fronts', if that's what it's called.
June 5, 2007 1:15:01 AM

Quote:
I just want to play devil's advocate on behalf of the OP.

Most of the hype over DX10 wasn't eye candy; it was better performance: all this talk about less overhead and faster shaders. We, as general consumers, were led to believe that DX10-enabled apps would run BETTER at the same settings than under DX9.

So, in short, that's why "it matters" ;) 

I have an 8800 GTS, and I expected DX10 to both look better AND perform better. From what I've seen so far, DX10 looks the same as DX9 and performs worse. I understand that I, myself, made the argument about the direction developers are taking DX10, but I totally understand the disappointment with mid-range DX10 cards.

We were all told that DX10 would perform better than DX9 at the same settings, and it's simply not living up to the hype.


Your expectations are both valid and not.

Firstly, the DX10 'goal' of improved performance coupled with improved visuals as you describe is entirely possible and likely.

Your expectation that everything would be up and running immediately, or just a few months after DX10 and its hardware were released, is however entirely unrealistic.

The DX10 drivers from both Nvidia and ATI are nowhere close to mature.

Game developers are still working in DX9 and bolting on DX10 features (this sounds crude, but believe me, so is the actual work). With the games industry as it is, performance gains are just nowhere near a priority for these games at this point; increased visual clarity and realism are.

There's no magic switch in game/software development that suddenly makes everything DX10, and no magic switch that starts up the perfect DX10 GPU production line either.

1) Game development normally requires a minimum of 18 months, so we're talking true DX10-only games 12 months from now at the earliest.

2) DX10 hardware is also nowhere near mature, and it will take a while to harness its full capability while also keeping an eye on the future.

In respect of both points, as DX10 is Vista-only, all developers are still working on the basis of DX9. Sure, there will be some DX10 patches around and some games touted as DX10, but they are not coded as DX10-only.

Nvidia and ATI know this (maybe Nvidia has succeeded more than ATI) and have pitched their current hardware firmly at DX9 games, with enough horsepower to also make something of the early DX10 implementations - which are pretty much ugly hacks, nothing more.

Right now is not so much a crossroads as a multi-storey car park for buying a new graphics card. The good news is that if you're sensible, you won't miss out. But as with any new development, if you want to stay on top and at the higher end, you'll have to pay for it over the next 18-24 months with 2-3 upgrades.
June 5, 2007 1:20:47 AM

Quote:
On topic: it wasn't clear from the benchmarks whether both the DX9 and DX10 tests were carried out under Vista. Methinks the overhead of that OS would negate any DX10 performance gain.


You really need to see how Vista works before going on about overhead claims. The RAM usage on the desktop is not the same as gaming overhead.

Do not mistake this for poorer performance due to immature drivers.
June 5, 2007 4:07:22 PM

To be fair to Vista, I notice very little performance impact in games; there's probably a little, but nothing to write home about, IMHO. But I also have a system that would be considered more or less the recommended spec: a dual-core Opteron and 2 GB of RAM.

As a side note on Vista and gaming, I was immensely surprised to see that even 'Games for Windows' titles don't always follow Vista's privilege rules, e.g. LOTR Online wants to modify files in the Program Files folder, which is a big no-no for Vista; what a pain that was to solve.