
Nvidia: DirectX 11 Will Not Catalyze Sales of Graphics Cards

Last response: in Graphics & Displays
September 17, 2009 12:35:26 AM

http://xbitlabs.com/news/video/display/20090916140327_N...
DirectX 11 by itself is not going to be the defining reason to buy a new GPU

Anyone else tired of them telling us we don't need DX anymore? Or that PC gaming isn't worth buying their cards for?
Maybe their cards aren't worth it; maybe we should just stick with ATI?
September 17, 2009 12:38:02 AM

Why would Nvidia downplay the importance of DirectX11 when they're right on the verge of unleashing a whole new lineup based on DX11 that is going to lay waste to the competition?

Oh wait.
September 17, 2009 12:53:50 AM

They are right in a way, though: DX11 is no longer the defining reason people will buy new graphics cards.

DX11 has been playing second fiddle to Eyefinity for the past week. :D 

If Nvidia actually had either, they might have a leg to stand on, but they don't. They are shell-shocked, tbh. They must be, because nobody was expecting this from ATI.
September 17, 2009 1:10:33 AM

It's a marketing ploy. NVIDIA has said nothing about its next GPU; Nvidia may be late.

Remember when AMD beat Intel to 64-bit? Intel made similar statements, claiming 64-bit wasn't needed.
September 17, 2009 1:26:12 AM

I feel there is a pretty high chance of Nvidia abandoning DX11. If they do, they might as well just rebrand and die-shrink the G200, and that is exactly what I believe they will do.
September 17, 2009 1:27:53 AM

I think that Nvidia just plain sucks this time around, so I will be going ATI. Everyone remembers their first quad-SLI setups and all the problems those had, and then the solder issue (just cringe). Now they act like they are still cool and the same old tricks still work. It just sucks when you get one of your dream cards and it won't work because the PCI-E switch isn't compatible with most boards; even my XFX 780i has no luck. :pfff: 
September 17, 2009 3:06:05 AM

What's sad is that, according to them, adding DX features isn't truly of "value", unlike what they consider valuable, like PhysX.
They talk about people being willing to pay MORE for PhysX, yet they disown every DX version since DX9; even their DX10 support wasn't fully ready, they have nothing higher, and they're acting like it's second-rate.
Well, someone needs to tell them that frame rates and resolutions still count in PC gaming, and people aren't necessarily going to pay more for PhysX.
What I really don't understand is that DX11 will give a tremendous boost to things like frame rates, while PhysX takes a lot out of those same frame rates, yet they shun DX11.
I think they may have taken their brains out and are playing with them, heheh.
September 17, 2009 3:16:15 AM

2000: 3dfx. RIP, you will be missed. :cry: 
201x: Nvidia.
September 17, 2009 3:33:56 AM

leon2006 said:
It's a marketing ploy. NVIDIA has said nothing about its next GPU; Nvidia may be late.

Remember when AMD beat Intel to 64-bit? Intel made similar statements, claiming 64-bit wasn't needed.



Agreed. I believe they are simply talking down DX11 so that you might wait: if DX11 isn't that important, or even needed, why run out to buy the first card available when you can wait for Nvidia?

They are trying to stop fanboys from moving over, the type that would believe anything Nvidia said (and I'm not Nvidia-bashing; there are ATI fanboys who would believe ATI if they said the sky was vomit-colored). They can't afford to look like they're behind, and playing up the idea that DX11 isn't worth it makes it seem like their not releasing isn't a big deal.

September 17, 2009 3:42:35 AM

jennyh said:
They are right in a way, though: DX11 is no longer the defining reason people will buy new graphics cards.

DX11 has been playing second fiddle to Eyefinity for the past week. :D 

If Nvidia actually had either, they might have a leg to stand on, but they don't. They are shell-shocked, tbh. They must be, because nobody was expecting this from ATI.

Maybe for some people, but in all honesty, I don't care that much. Give me one excellent high quality monitor and good framerates rather than 6 mediocre monitors any day. Now, if I could afford 6 HD projectors (so that there would not be any borders between the monitors), I would care. That's a wee bit out of my price range though.

(Don't get me wrong - I'm looking forward to the news on the 5000 series, the "eyefinity" just isn't one of the reasons)
September 17, 2009 4:22:59 AM

I don't know about some people, but I think one of the biggest advantages to buying a new DX11-ready video card is the improvement it brings in ALL games, regardless of the DX version. A la the 8800GTX when it was released: for the first year or so it was adored because it crushed every game available, not because the CoH DX10 patch came out...

Best solution

September 17, 2009 5:15:33 AM

JAYDEEJOHN said:
What's sad is that, according to them, adding DX features isn't truly of "value", unlike what they consider valuable, like PhysX.
They talk about people being willing to pay MORE for PhysX, yet they disown every DX version since DX9...


Really, it's been since DX8.1, when they brought out a DX8 part (GF3) after ATi brought out a DX8.1 part before them (R8500). The more things change, the more they stay the same.

The only one they did hype was DX9.0c (aka SM3.0), and that had fewer options/titles in its GF6 generation than DX10.1 did.

My favourite part is that they think GPGPU will drive graphics card sales, as if most PC buyers use even 50% of their current PC's CPU power, let alone need a GPGPU co-processor.

They're grasping at straws. I'm interested in GPGPU/compute capabilities, probably more than most people, since I need them for work. However, I doubt even 1%, heck, even 0.1%, of the graphics card market thinks of GPGPU when they consider a new graphics card in store or online, and I don't think that will change no matter how much faster you can transcode videos. Most consumers have whatever transcoding tools came with their video card and don't know that other software would even work with their Sorrny, Punasonic, Samschlong or Crasio camcorder, let alone worry about their NiCon and LieCa cameras, which take JPEGs, not even RAW images, and those are currently best processed with CS4 and accelerated universally with OpenGL, not even OpenCL!! [:thegreatgrapeape:5]

It would be like saying DTS-HD will push the adoption of HDTVs this year. :pfff: 

Second, they speak of GPGPU and, as the article points out, they rely on older hardware and on pushing CUDA just when we finally have unified open standards that they only partially support. If they were serious, wouldn't they say "...and our upcoming processors, which will lead the way on all fronts, not just our own...", etc.?

Even the last little bit about it being about features (what do they think DX11 is?) and not just resolution and speed: doesn't that kill their G200 reuse strategy and hurt them against the competition? And doesn't it fly completely against the whole DX10 vs DX10.1 strategy they just went through, where the line was "hey, no one cares about more features, we're faster than those guys"? :sarcastic: 

Overall I get the feeling they didn't want to be on this analyst call, because they didn't have much positive or new to say; it's like watching someone from the sales team fill in on a presentation because the product engineers are stuck in traffic, or someone just told them the product demo broke. :ouch: 

That, or else it's basically lowering expectations to soften the blow of a big disappointment: hey, it won't be as fast as the other guys, but look at all this other stuff we have. BTW, have you heard of our proprietary Cg language compiler? No? Ummm, HEY, we also have Fairies and Mermaids too!?! :lol: 
September 17, 2009 5:37:25 AM

What is NVidia trying to play at here? A while ago, I read a thread which mentioned that they are going to take the GT200, rename it and sell it as the 300 series. Of course with a die shrink, and maybe a little "more" performance? Well, it's only what I read. Let's wait and see.

As for its claims about DirectX 11: if you think it's not of value, don't make it. Just go backwards and support up to DirectX 9. I'll wait and watch your market shrink much faster than your die shrink.
September 17, 2009 6:38:42 AM

alikum said:
What is NVidia trying to play at here? A while ago, I read a thread which mentioned that they are going to take the GT200, rename it and sell it as the 300 series. Of course with a die shrink, and maybe a little "more" performance? Well, it's only what I read. Let's wait and see.

As for its claims about DirectX 11: if you think it's not of value, don't make it. Just go backwards and support up to DirectX 9. I'll wait and watch your market shrink much faster than your die shrink.


At least they know how to shrink their market share better than their GPUs.
September 17, 2009 6:55:30 AM

This again confirms nVidia is very late with GT300, or that the new cards won't be powerful enough to dethrone the 5870/5870 X2. If GT300 were faster and launching in November, there is no way in hell nVidia would tell everyone "speed isn't important, DX11 doesn't matter", etc. Common sense.
September 17, 2009 11:48:42 AM

Please, isn't this what I've been arguing for over a year now? I can simply point to the releases of DX8 and 9 as proof that people buy cards based on price/performance, not on the level of DX support.

DX10 was the exception because of the awesomeness of the 8800GTX.
September 17, 2009 12:04:02 PM

Oh, that's right, DX10 was awesome because of G80.
Now, since nVidia sat on its lead and did huge numbers on sales and perf, where the hell is all their influence?
Did nVidia actually promote DX10?
If they did, they failed, and miserably.
No, nVidia has done nothing BUT release G80, has tried only to go the GPGPU path ever since, and has renamed and relived off G80 while never promoting DX10, and now sees no need to promote DX11 either.
nVidia fails as a leader here, and needs to be dethroned, simply for comments such as "we don't need more perf and better resolutions", as they don't represent the average gamer, nor his/her needs.
September 17, 2009 2:26:47 PM

gamerk316 said:
Please, isn't this what I've been arguing for over a year now? I can simply point to the releases of DX8 and 9 as proof that people buy cards based on price/performance, not on the level of DX support.


No, that wasn't what you were arguing, nor what other people were arguing in return. :pfff: 

You were arguing against the benefit of DX11 because you said devs wouldn't add it to games for years, since they would still focus on XP, and you went so far as to argue, incorrectly, that it was not possible for DX11 code to work on down-level hardware; and then we would have to correct you, over and over.

Other people pointed out that early comments by devs indicated they would add DX11 to games sooner, which there will be, as we can see from the launch-day hoopla.

Most of us also said (even earlier in this thread) that it wasn't just about DX11, because those first DX11 cards would also provide performance AS WELL as features, and near-term that alone would cause people to buy cards.


Quote:
DX10 was the exception because of the awesomeness of the 8800GTX.


Jebus, Fanboi much !?!

If DX10 was the reason for the GF8800GTX's success, then why did DX10 arrive long after the GF8800GTX, yet sales were highest at the beginning of its cycle, before the launch of DX10 and SP1 (even the GF8800U was attractive)? Most people buying them said they couldn't care less about DX10, because they felt the R600 would do it better (before anyone knew what the R600 was, or even what the final DX10 would look like, since it was revised); they just wanted the performance now, because these cards played games much better than even previous-generation Xfire and SLI solutions.

Seriously, you can try to revise history, and the history of your comments, but both are freely out there, quoted for posterity. :p 
September 17, 2009 3:10:25 PM

Yeah I was going to post that in the other thread, the one with the Iraqi information minister.

There's also the counterbalancing video from the author about HD5K pricing. Funnily enough, pricing is actually better than he says, but the G300's situation is worse than in that video, because as he said on B3D: "I wasn't aware of those when I was molesting the subtitles. I heard about the yields just today (CJ) and about the stock selling just yesterday."

The last part refers to Jensen, nV's CEO, selling shares the day after the ATi launch, ahead of this silly announcement about performance and such:
http://www.gurufocus.com/news.php?id=69204

That's more telling of nV's outlook of their own future really than anything else IMO.
September 17, 2009 3:19:42 PM

Yea, short term at least, it appears Jensen wants his money back, especially after the Stanford write-off, lol. Do both in the same year and they cancel each other out, while also selling high, to later buy low, heheh.
September 17, 2009 3:33:22 PM

G300 is going to be one sweet surprise !!!
September 17, 2009 4:33:58 PM

gamerk316 said:
Please, isn't this what I've been arguing for over a year now? I can simply point to the releases of DX8 and 9 as proof that people buy cards based on price/performance, not on the level of DX support.

DX10 was the exception because of the awesomeness of the 8800GTX.

Honestly, DX versions were always important to me. Let's say you have two cards with similar speeds; one is DX10, the other DX11. Do you really not care which to buy? I always prefer to have the newer tech, even if it's sometimes not immediately usable in games. If nVidia hadn't stagnated 10.1, we would all be enjoying ~15% extra AA performance in games, and that's peanuts compared to what DX11 brings (can't wait for tessellation, multi-threading, etc.).
September 17, 2009 4:53:58 PM

Quote:

You were arguing against the benefit of DX11 because you said they wouldn't add it to games for years because devs would still focus on XP, and you went so far as to argue incorrectly that it was not possible to work on down-level hardware; and then we would have to corect you, over and over.


Again, the 4000 series will not be able to execute DX11. Both M$ and ATI have confirmed this much. You're free to keep believing otherwise though. As for XP, every game still ships with DX9.0c as its base, with DX10+ functions called if available. The base is still DX9, and that's mostly because of XP at this point.

And BTW, there's a difference between installing the DX11 API on your machine and actually RUNNING the code in question, particularly where tessellation is concerned, even if a small subset of the API may be supported by certain lines of cards. Really, you quickly run into a situation like the anti-aliasing situation in Batman: AA.

http://www.bit-tech.net/bits/2008/09/17/directx-11-a-lo...
Quote:
Having spoken with both Microsoft's Kevin Gee and AMD's Richard Huddy, we managed to confirm that the Xbox 360's (and by extension the Radeon HD 2000, 3000 and 4000 series) tessellator is not compatible with DirectX 11, but the DX11 tessellator is a superset of what's already on the market. Another good thing is that the Radeon HD 4000 series tessellator did go through a few changes, giving developers access to the feature in DirectX 10 applications – this wasn't possible with the tessellation unit inside both the HD 2000 and 3000 series GPUs.


Quote:

Other people pointing out that early comments by devs indicate that they would add to games with DX11 sooner, which there will be as we can see from the launch day hoopla.


You're forgetting development time. That alone puts the delay at the 12-18 months I've been citing, and the same lag existed with previous versions of the API. I'd also point out that non-C-based engines will take significantly longer to recode than those developed in straight C, further adding to the delay.

Even DX9.0 took a good 12 months to take off (some people called it a failure :D ), and I don't see DX11 being any different whatsoever. You get one rush of games racing to be first (to drive sales), then a few games that use one or two features. Major development for the API takes 12-18 months post-release, same as always.

Quote:

Most of us also said (even earlier in this thread) that it wasn't just about DX11 because those first DX11 card would also provide performance AS WELL as features, and near term that alone would cause people to buy cards.


And I've said that IF I were to buy a DX11-capable card, it would be because of performance, not so much DX11 support. Never once have I said that double the performance for ~$300 was a bad deal.

I've never once argued against the 5000 series' performance (except when it was exaggerated, like a few people citing possible 3x claims...); I've argued that DX11 will take a good 12-18 months to take off, the same as every previous release of the DX API. I'd also point out that I STILL haven't seen a benchmark with tessellation enabled and DX11 code running in an actual gameplay environment...

I honestly think DX11 cards need to be powerful, because tessellation may be a major performance hog, which would change perceptions of the cards if true. We'll see what happens on that front, but I'd hoped to see a gameplay example of tessellation in action by now...
September 17, 2009 4:58:18 PM

So nVidia can't get a DX11 card out in time, and they're bashing it. Signs of a struggle, hm?

ATI is going to produce a single high-mid-level card (the 5890 being higher) that takes down nVidia's current best card, for $100 less. It'll have DX11 support and use comparatively little power.
GG, nVid.
September 17, 2009 5:53:41 PM

Nvidia waits for W7 and games ....
September 17, 2009 6:08:38 PM

It looks like nVidia can't make a DX11 card, so they come out with this cr@p that DX11 is not important.
September 17, 2009 6:33:07 PM

gamer, you argued 1.6x perf, said you were disappointed, etc., etc.
nVidia failed hugely to promote DX10, period. They simply don't care. Nor about DX10.1, and now it appears they don't care about DX11 either.
Your XP claims go against your claims of G80 doing so well: you bring up the lack of DX10 games while praising nVidia for doing DX10 so well?
Which is it?
Did nVidia fail hugely at promoting DX10, and not do so great, as you just said?
Or was it XP holding it back, as you've been saying in the past?
Or is it that nVidia is just bad at supporting any DX version lately, including DX10?
nVidia couldn't have done that well given the lack of DX10 games, so don't go there.
Saying G80's huge success coincided with DX10 is simply not true; it was G80's perf all along, and again, nVidia didn't do a thing to help DX10 along, and caused the infamous DX10/DX10.1 split through their lack of support from the beginning.
I could go on and on, and yet you still claim tessellation etc. can't be run on DX10 HW using the DX11 path...
I think you need to reread Huddy's statements and put them in better perspective.
All these things can be run, just not as well, that's all.
While some may create savings, using a lesser path may actually cost more resources; it all depends.
September 17, 2009 7:04:35 PM

I can't help but wonder what bearing (if any) the lack of an x86 licence might have had on the comments made in that article. More worrying still is the undertone of Nvidia pulling out of the discrete graphics card market altogether, something the ATi fans may celebrate at first, then regret later as prices remain high and innovation stagnates.
September 17, 2009 7:20:31 PM

I don't see them leaving the discrete market, unless they completely resize to go scientific and GPGPU only.
If they fail trying to do both, it's on them, but I'm thinking LRB won't be as great as people think, and it shouldn't drive them out of the GPGPU market, nor will the 5xxx series do the same either.
Tho, I am happy to see this:
http://www.tweaktown.com/news/13158/amd_and_pixelux_tea...

Well, you see, AMD in their usual fashion has been pushing an open platform; this is OpenCL (CUDA being nVidia's proprietary counterpart). Pixelux is now going to work with AMD to develop OpenCL acceleration for their DMM engine.
September 17, 2009 7:43:47 PM

I own Nvidia cards and a mobo with an Nvidia chipset, and I feel Nvidia has nothing now to compete with ATI. They will be DOOMED if they've got nothing.
September 17, 2009 7:47:51 PM

The stories of NV's management selling shares over the last few months have me fearing the worst. I hope I'm sensing gloom and doom where there isn't any, but then why would you sell loads of shares if you were about to pull the 'next big thing' out of your corporate butt?
September 17, 2009 9:26:49 PM

gamerk316 said:
Again, the 4000 series will not be able to execute DX11. Both M$ and ATI have confirmed this much. You're free to keep believing otherwise though.

No one ever said that, nor believed it. No one thinks DX11-only features are going to work on DX10/10.1 hardware unless they have the hardware for them, and this has been the case for every generation; but you said that DX11 would not allow code to work on down-level hardware at all, and that you needed a completely different path for each, from the ground up, as if it were as complicated as an XP path; all of which M$ always said was not the case, and I made that pretty clear to you in this thread:

http://www.tomshardware.com/forum/260003-33-real-direct...

The tessellator is not something anyone has thought differently about SINCE Gamefest 2008, when they finally explained more about what it was and whether or not it could in any way interact with the previous iterations.

You make up things about DX10 & DX11 and XP & Vista, and then redirect elsewhere to try and get out of it. The idea that you in any way called this, given your past mistakes, is laughable, as is the idea that we needed you to explain that the new DX11 cards would also be fast in DX9 & DX10, as if no one would buy them for that. :sarcastic: 

At best you blasted out every sort of opinion possible in the hope that something would stick, but major errors like the one I linked to above were the norm, so you're far from having called any of this. :pfff: 
September 17, 2009 11:17:24 PM

Quote:
Again, the 4000 series will not be able to execute DX11. Both M$ and ATI have confirmed this much. You're free to keep believing otherwise though. As for XP, every game still ships with DX9.0c as its base, with DX10+ functions called if available. The base is still DX9, and that's mostly because of XP at this point.

And BTW, there's a difference between installing the DX11 API on your machine and actually RUNNING the code in question, particularly where tessellation is concerned, even if a small subset of the API may be supported by certain lines of cards. Really, you quickly run into a situation like the anti-aliasing situation in Batman: AA.

http://www.bit-tech.net/bits/2008/09/17/directx-11-a-lo...
Having spoken with both Microsoft's Kevin Gee and AMD's Richard Huddy, we managed to confirm that the Xbox 360's (and by extension the Radeon HD 2000, 3000 and 4000 series) tessellator is not compatible with DirectX 11, but the DX11 tessellator is a superset of what's already on the market. Another good thing is that the Radeon HD 4000 series tessellator did go through a few changes, giving developers access to the feature in DirectX 10 applications – this wasn't possible with the tessellation unit inside both the HD 2000 and 3000 series GPUs.


So what? It only makes sense that the tessellator in the older cards is different from the new one and can't be used, or not used fully, but that doesn't mean the cards can't run DX11; it simply means they won't be able to do DX11 tessellation. In the same way, older Nvidia cards will support some features but not all.

To say that it wouldn't work at all simply shows a lack of understanding of how DX works.

Also, many DX11 games are already being made; devs have had access to the API for some time now, and 3-4 releases are already slated to come out within a few months of the cards.

The switch to DX11 will be much smoother than the one to 10/10.1.
September 17, 2009 11:45:22 PM

The switch to DX11 will be much smoother, because Nvidia is no longer a force acting against progress.
September 18, 2009 1:46:18 AM

I sure hope the DX11 transition is smoother. I need some excuse to buy one of these cards eventually.
September 18, 2009 4:50:29 AM

It will be smoother. One of the major reasons 10 didn't take off was that Vista was broken, and by the time Vista had been updated, patched and fixed up a bit, it was too late: DX11 had been announced as on its way, and so had 10.1.

Nvidia didn't make any 10.1 cards because they didn't want to waste the money, knowing DX11 was just around the corner, but ATI needed the boost, so they went ahead with it. Now we have a fixed, solid OS and a DX11 that is actually a game changer with some of its features, so it will be adopted much faster.
September 18, 2009 4:58:16 PM

Quote:
Vista was broken?

Have you not seen this? It gave me a little smile.
September 18, 2009 5:10:59 PM

nv is in deep trouble right now.

And it's funny to see JSH pulling a Hector Ruiz... a year and a half ago I saw articles about him everywhere. Anybody seen him since?
September 19, 2009 5:13:27 PM

Be afraid, nvidia, be very afraid :ouch: 

That's so funny; ATI is going to take a big chunk out of nVidia's sales for a few months.
September 19, 2009 5:20:51 PM

EVERYBODY!!! Haven't you noticed Nvidia spreading a lot of rumors against ATI ever since ATI released their specs? I think Nvidia is afraid they're gonna lose it. I mean, first Fud stated that GT300 will win over RV800, which is bullcrap, because on performance-per-watt Nvidia won't have a chance.

And now this bullcrap from Nvidia. If anybody has seen the DX11 demos by ATI: I believe it was amazing, and it's definitely why I wanna get an ATI GFX card. Your thoughts?
September 19, 2009 6:03:38 PM

They're gonna let loose the can of whoop-@$$, but if they lose it first, they can't.
September 19, 2009 8:10:07 PM

Nvidia is bullcrap in this generation.
February 7, 2010 1:49:34 AM

It's all about money. If you don't have it, stay with your current video card. Your GTX 200 series will play games until it quits working... unless it is a BFG. LMAO!