Nvidia: DirectX 11 Will Not Catalyze Sales of Graphics Cards


jennyh

Splendid
They are right in a way, though: DX11 is no longer the defining reason people will buy new graphics cards.

DX11 has been playing second fiddle to Eyefinity for the past week. :D

If Nvidia actually had either, they might have a leg to stand on, but they don't. They are shell-shocked, tbh. They must be, because nobody was expecting this from ATI.
 
I think that Nvidia just plain sucks this time around, and I will be going ATI. Everyone remembers their first quad-SLI setups and all the problems those had, then the solder issues (just cringe). Now they act like they are still cool and the same old tricks still work. It just sucks when you get one of your dream cards and it won't work because the PCIe switch isn't compatible with most boards; even my XFX 780i has no luck. :pfff:
 
What's sad is that, according to them, adding DX features isn't truly of "value," unlike what they consider valuable, such as PhysX.
They talk about people being willing to pay MORE for PhysX, yet they disown every DX model since DX9: even their DX10 wasn't fully ready, they have nothing higher, and they're acting like it's second rate.
Well, someone needs to tell them that frame rates and resolutions still count in PC gaming, and people aren't necessarily going to pay more for PhysX.
What I really don't understand is this: using DX11 will give a tremendous boost to things like frame rates, and PhysX takes a lot out of those same frame rates, yet they shun DX11.
I think they may have taken their brains out and are playing with them, heheh.
 

darkvine

Distinguished
Jun 18, 2009



Agreed. I believe they are simply trying to talk down DX11 so that you might wait: if DX11 isn't that important, or even needed, why run to the first card out when you can wait for Nvidia?

They are trying to stop fanboys from moving over, the type that would believe anything Nvidia said (and I am not Nvidia-bashing; there are ATI fanboys that would believe them if they said the sky was vomit colored). They can't afford to look like they are behind, and playing it off like DX11 isn't worth it makes it seem like them not releasing isn't a big deal.

 

Maybe for some people, but in all honesty, I don't care that much. Give me one excellent high-quality monitor and good frame rates rather than six mediocre monitors any day. Now, if I could afford six HD projectors (so that there would not be any borders between the monitors), I would care. That's a wee bit out of my price range, though.

(Don't get me wrong - I'm looking forward to the news on the 5000 series; Eyefinity just isn't one of the reasons.)
 

Annisman

Distinguished
May 5, 2007
I don't know about some people, but I think one of the biggest advantages of buying a new DX11-ready video card is the improvement it garners in ALL games, regardless of the DX version. A la the 8800GTX when it was released: for the first year or so it was adored because it crushed every game available, not because the CoH DX10 patch came out....
 


Really since DX8.1, when they stayed with a DX8 part (GF3) while ATi brought out a DX8.1 part (R8500). The more things change, the more they stay the same.

The only one they did hype was DX9.0c (aka SM3.0), and that had fewer options/titles in its GF6 generation than DX10.1 did.

My favourite part is that they think GPGPU will drive graphics card sales, as if most PC buyers ever use even 50% of their current PC's CPU power, let alone need a GPGPU co-processor.

They're grasping at straws. I'm interested in GPGPU/compute capabilities, probably more than most people, since I need them for work. However, I doubt even 1%, heck, even 0.1% of the graphics card market thinks of GPGPU when they consider a new graphics card in store/online, and I don't think that will change no matter how much faster you can transcode videos. Most consumers just use the tools that came with their video card to transcode and don't know that other software would even work with their Sorrny, Punasonic, Samschlong or Crasio camcorder, let alone worry about their NiCon and LieCa cameras taking JPEGs, not even RAW images, which currently are best processed with CS4 and accelerated universally with OpenGL, not even OpenCL!! [:thegreatgrapeape:5]

It would be like saying DTS-HD will push the adoption of HDTVs this year. :pfff:

Second, they speak of GPGPU, and like the article points out, they rely on older hardware and push CUDA just when we finally have unified open standards that they only have partial support for. If they were serious, wouldn't they say "...and our upcoming processors will lead the way on all fronts, not just our own..." etc.?

Even the last little bit about it being about features (what do they think DX11 is?) and not just resolution and speed, doesn't that kill their G200 reuse strategy and hurt them compared to the competition? Also, doesn't that fly completely against the whole DX10 vs DX10.1 strategy they just went through, where it was "hey, no one cares about more features, we're faster than those guys"? :sarcastic:

Overall I get the feeling they didn't want to be on this analyst call because they didn't have much positive or new to say; it's like watching someone from the sales team fill in on a presentation because the product engineers are stuck in traffic or just told them the product demo broke. :ouch:

That, or else it's basically lowering expectations to soften the blow of a big disappointment: hey, it won't be as fast as the other guys', but look at all this other stuff we have. BTW, have you heard of our proprietary Cg language compiler? No? Ummm, HEY, we also have Fairies and Mermaids too!?! :lol:
 

alikum

Distinguished
Nov 28, 2008
What is Nvidia trying to play at here? A while ago, I read a thread which mentioned that they are going to take GT200, rename it, and sell it as the 300 series. Of course with a die shrink, and maybe a little "more" performance? Well, it's only what I read. Let's wait and see.

About its claim on DirectX 11: if you think it's not of value, don't make it. Just go backwards and support only up to DirectX 9. I'll wait and see your market shrink much faster than your die shrink.
 

one-shot

Distinguished
Jan 13, 2006


At least they know how to shrink their market share better than their GPUs.
 

Harrisson

Distinguished
Jan 3, 2007
This again confirms nVidia is very late with GT300, or the new cards won't be powerful enough to dethrone the 5870/5870 X2. If GT300 were faster and launching in November, there is no way in hell nVidia would tell everyone "speed isn't important, DX11 doesn't matter," etc. Common sense.
 

Harrisson

Distinguished
Jan 3, 2007
BaghdadBob.jpg


PhysX is the future of gaming. We have DirectX 11 right where we want it... in its last thoughts.

:D

http://forum.beyond3d.com/showpost.php?p=1335994&postcount=476
 
Please, isn't this what I've been arguing for over a year now? I can simply point to the releases of DX8 and 9 as proof that people buy cards based on price/performance, and not for the level of DX support.

DX10 was the exception because of the awesomeness of the 8800GTX.
 
Oh, that's right, DX10 was awesome because of G80.
Now, since nVidia sat on its lead and did huge numbers on sales and perf, where the hell is all their influence?
Did nVidia actually promote DX10?
If they did, they failed, and miserably.
No, nVidia hasn't done anything BUT release G80; it has tried only to go the GPGPU path ever since, renaming and reliving off G80 after failing to promote DX10, and it sees no need to promote DX11 now either.
nVidia fails as a leader here and needs to be dethroned, simply for comments such as "we don't need more perf and better resolutions," as they don't represent the average gamer, nor his/her needs.
 


No, that wasn't what you were arguing, or what other people were arguing in return. :pfff:

You were arguing against the benefit of DX11 because you said they wouldn't add it to games for years since devs would still focus on XP, and you went so far as to argue, incorrectly, that it was not possible for it to work on down-level hardware; and then we would have to correct you, over and over.

Other people pointed out that early comments by devs indicated they would add DX11 to games sooner, and there will be such games, as we can see from the launch-day hoopla.

Most of us also said (even earlier in this thread) that it wasn't just about DX11, because those first DX11 cards would also provide performance AS WELL as features, and near term that alone would cause people to buy cards.


DX10 was the exception because of the awesomeness of the 8800GTX.

Jebus, Fanboi much !?!

If DX10 was the exception because of the GF8800GTX, then why did DX10 arrive long after the GF8800GTX, yet sales were highest at the beginning of its cycle, before the launch of DX10 and SP1 (back when even the GF8800U looked attractive)? Most people buying them said they couldn't care less about DX10 because they felt the R600 would do it better (before anyone knew what the R600 was, or even what the final DX10 would look like, since it was revised); they just wanted the performance right then, because the card could play games much better than even previous-generation Xfire and SLi solutions.

Seriously, you can try to revise history and the history of your comments, but both are freely out there, quoted for posterity. :p
 
Yeah, I was going to post that in the other thread, the one with the Iraqi information minister.

There's also the counterbalancing video from the author about HD5K pricing. Funny thing is, if anything, pricing is better than he says, while the G300's situation is worse than in that video, because as he said on B3D: "I wasn't aware of those when i was molesting the subtitles. I heard about the yields just today (CJ) and about the stock selling just yesterday."

The last part being Jensen, nV's CEO, selling shares the day after the ATi launch, ahead of this silly announcement about performance and such:
http://www.gurufocus.com/news.php?id=69204

That's more telling of nV's outlook on their own future than anything else, IMO.
 

Harrisson

Distinguished
Jan 3, 2007

Honestly, DX versions were always important to me. Let's say you have two cards with similar speeds: one is DX10, another is DX11. Do you really not care which to buy? I always prefer to have the newer tech, even if sometimes it's not immediately available in games. If nVidia hadn't stagnated 10.1, we would all be enjoying ~15% extra AA performance in games, and that's peanuts compared to what DX11 brings (can't wait for tessellation, multi-threading, etc.).
 
You were arguing against the benefit of DX11 because you said they wouldn't add it to games for years since devs would still focus on XP, and you went so far as to argue, incorrectly, that it was not possible for it to work on down-level hardware; and then we would have to correct you, over and over.

Again, the 4000 series will not be able to execute DX11. Both M$ and ATI have confirmed this much. You're free to keep believing it, though. As for XP, every game still ships with DX9.0c as its base, with DX10+ functions called if available. The base is still DX9, and that's mostly because of XP at this point.

And BTW, there's a difference between installing the DX11 API on your machine and actually RUNNING the code in question, particularly where tessellation is concerned; only a small subset of the API may be supported by certain lines of cards (see the sketch after the quote below). Really, you quickly run into a situation like the anti-aliasing situation in Batman: AA.

http://www.bit-tech.net/bits/2008/09/17/directx-11-a-look-at-what-s-coming/3
Having spoken with both Microsoft's Kevin Gee and AMD's Richard Huddy, we managed to confirm that the Xbox 360's (and by extension the Radeon HD 2000, 3000 and 4000 series) tessellator is not compatible with DirectX 11, but the DX11 tessellator is a superset of what's already on the market. Another good thing is that the Radeon HD 4000 series tessellator did go through a few changes, giving developers access to the feature in DirectX 10 applications – this wasn't possible with the tessellation unit inside both the HD 2000 and 3000 series GPUs.
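For what it's worth, the "superset" language above is exactly what DX11 feature levels expose: a game creates its device through the one D3D11 API even on older cards, and the runtime reports back the highest level the hardware can actually run; tessellation (hull/domain shaders) is only guaranteed at level 11_0. Here's a minimal, untested sketch of that check at device creation, using the standard D3D11CreateDevice call with error handling trimmed:

```cpp
#include <cstdio>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Request feature levels from highest to lowest; the runtime picks
    // the best one the installed GPU/driver pair can actually execute.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, // full DX11: hull/domain shaders (tessellation)
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  // "down-level" DX9-class hardware
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                  // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,               // no software rasterizer, no flags
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &got, &context);
    if (FAILED(hr)) {
        std::printf("No D3D11 runtime or capable device found.\n");
        return 1;
    }

    // Installing the DX11 API is not the same as running DX11 code:
    // an HD 4000-class card comes back as 10_1 here, so the tessellation
    // path stays disabled and a down-level rendering path is used instead.
    if (got >= D3D_FEATURE_LEVEL_11_0)
        std::printf("Feature level 11_0: tessellation path enabled.\n");
    else
        std::printf("Down-level hardware (0x%04x): using fallback path.\n",
                    static_cast<unsigned>(got));

    context->Release();
    device->Release();
    return 0;
}
```

That's also why the "DX9 base with DX10+ functions called if available" pattern works: one codebase, with the higher-level renderer switched on only when the reported feature level allows it.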

Other people pointed out that early comments by devs indicated they would add DX11 to games sooner, and there will be such games, as we can see from the launch-day hoopla.

You're forgetting development time. That alone puts the delay at the 12-18 months I've been citing, and the same has held with previous versions of the API. I'd also point out that non-C-based engines will take significantly longer to recode than those developed with straight C, further adding to the delay.

Even DX9.0 took a good 12 months to take off (some people called it a failure :D), and I don't see DX11 being any different whatsoever. You get one rush of games racing to be first (to drive sales), then a few games that use one or two features. Major development for the API takes 12-18 months post-release, same as always.

Most of us also said (even earlier in this thread) that it wasn't just about DX11, because those first DX11 cards would also provide performance AS WELL as features, and near term that alone would cause people to buy cards.

And I've said that IF I were to buy a DX11-capable card, it would be because of performance, not so much DX11 support. Never once have I said that double the performance for ~$300 was a bad deal.

I've never once argued against the 5000 series' performance (except when performance was exaggerated, like a few people citing possible 3x claims...); I've argued that DX11 will take a good 12-18 months to take off, the same as every previous release of the DX API. I'd also point out that I STILL haven't seen a benchmark with tessellation enabled and DX11 code running in an actual gameplay environment...

I honestly think that DX11 cards need to be powerful, because tessellation may be a major performance hog, which would change perceptions of the cards if true. We'll see what happens on that front, but I'd hoped to see a gameplay example of tessellation in action by now...