Nvidia DX 10.1

mpavao81

Distinguished
Feb 18, 2008
359
0
18,810
So the new 9800 series is out, and I use the term "new" loosely. My question is: when is Nvidia going to make a DX 10.1 card? I'm looking to buy a new card sometime in the next two months, but I want a DX10.1 card and I don't want to deal with ATI drivers anymore. To me it doesn't make sense to buy any of the current Nvidia cards that aren't DX 10.1 when DX 10.1 is coming soon and the current crop doesn't support it.
 

teh_boxzor

Distinguished
Aug 27, 2007
699
0
18,980
I think Nvidia and MS already said that DX10.1 will be directly backwards-compatible with DX10. All DX10.1 will do is force the "free" 4x AA.
 

bydesign

Distinguished
Nov 2, 2006
724
0
18,980
With DX10 hardly utilized, I don't think it really matters except for marketing purposes. Also, the 10.1 cards out there are far less effective, so it's even more pointless at the moment. Right now, and for this generation of cards, it's just hype. There is still no single card that delivers the performance required to run games at the high-end resolution of 1920x1200, where you need to average at least 60 fps. Even at 20"-22" resolutions this isn't possible, and until one or both of them can manage it, there is nothing to get worked up about.
 

teh_boxzor

Distinguished
Aug 27, 2007
699
0
18,980
^ Agreed. Most DX10 games are just patched DX9 games at the moment. Once true DX10 games come out, the R700 and possibly the GT200 (or whatever Nvidia calls it) will already be out.
 

randomizer

Champion
Moderator

And Microsoft will have already released DX11b :lol:
 

emp

Distinguished
Dec 15, 2004
2,593
0
20,780
If you don't want an ATI card and you don't want to get a DX10 Nvidia card, then you should look into one of these puppies right here: PCI-E 2.0, full DX10.1 support, and very energy efficient. (If you need more performance you can always add a second one for Multi-Chrome Technology!)

[Attached images: S3 Chrome 430 GT product banner, card photo, and 256MB performance chart]


You can buy them here:

https://s3gstore.s3graphics.com/cgi...mplate=PDGTemplates/FullNav/SearchResult.html
 

Heyyou27

Splendid
Jan 4, 2006
5,164
0
25,780
Actually, I don't believe it gives "free 4x AA", but applications and hardware have to support the feature in order to be DirectX 10.1 compliant.
 



It doesn't force "free 4xAA", it forces card makers to support 4xAA (and hey, guess what, both ATI and nV already support at least 8xAA with their current lineups).

And of course DX10.1 is backwards compatible with DX10, since DX10.1 is a superset. However, that still means DX10 cards can't use DX10.1 features if they're called for, so if there is ever a DX10.1-minimum-spec title (not likely anytime soon), the DX10 cards won't be able to run it in hardware.

All in all it's not a big concern; I wouldn't buy a DX10.1-compliant HD3400 over a GF8800GTS just for that.
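
If you want to see how that superset business plays out for an app, here's a rough C++ sketch (just an illustration, assuming the D3D 10.1 headers and the usual D3D10CreateDevice1 entry point): the app asks for feature level 10_1 first and falls back to 10_0 on today's DX10-only cards, at which point any 10.1-only features simply aren't available to it.

// Rough sketch only: try for a 10.1 device, fall back to 10.0 on DX10-only hardware.
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice()
{
    // Highest feature level first; a GF8/9 or HD2000 (non-10.1) part
    // will only succeed at 10_0, so anything 10.1-specific gets skipped.
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_FEATURE_LEVEL_10_0
    };

    ID3D10Device1* device = NULL;
    for (int i = 0; i < 2; ++i)
    {
        if (SUCCEEDED(D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                         NULL, 0, levels[i],
                                         D3D10_1_SDK_VERSION, &device)))
            break;  // got the best level the card/driver exposes
    }
    return device;  // NULL if neither level is available
}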
 

jkflipflop98

Distinguished


A lot of people are pulling this from the X360's Xenos GPU. It offloads the framebuffer to 10MB of EDRAM that's set up as a GPU daughter chip. The EDRAM controller can antialias the frame while it's being stored off the GPU. Hence "free" 4xAA: no performance hit, because the main GPU die is never impacted.
My 8800GT has seemingly free 8xAA in everything but Crysis and EQ2. :p
 

gpippas

Distinguished
May 11, 2007
463
0
18,790


Try Colin McRae: DiRT. You won't get free 8xAA in that if you turn the settings to Ultra.
 

mpavao81

Distinguished
Feb 18, 2008
359
0
18,810
Well, I was considering getting an 8800GTS 512MB OC, but when I placed an order for it the store called me up and told me they were sold out, it was discontinued, and they wouldn't be getting any more (which sucks). That got me thinking about DX10.1, since whatever card I get, I'm not going to upgrade again for at least 2 years. But I guess since DX10.1 support doesn't matter, I'll go with something else from Nvidia. The store I'm ordering from suggested I go with the 9800 GTX because it was only $40 more, but I've heard some mixed reviews on it: only slightly faster than the 8800GTS 512 but way louder. So I'm kinda stumped right now on what to get.
 

crazywheels

Distinguished
Jan 11, 2006
334
0
18,780
What is the big deal about 10.1 anyway? Is there that much of a difference? From what I've heard, 10.1 hasn't made me want to switch to ATI yet. Go nVidia!!
 

hayest

Distinguished
Mar 21, 2008
9
0
18,520
If I'm not mistaken, I read somewhere that DX10.1 has to do with tessellation, and the reason AMD has support for it is that their cards have had a tessellation unit ever since they created the 2900 XT, so it was just a small modification to complete compliance with DX10.1. Also, I believe tessellation, if utilized correctly, could provide much better graphics with a much lower demand on graphics card power.
 

zeuseason

Distinguished
Mar 27, 2008
32
0
18,530
I believe DX 10.1 has a hardware requirement for compatibility, and NV has not released a card with it yet. Not sure about ATI.
 

Aaaaaahhhhhhh.........VIA IS BACK!
 

resonance451

Distinguished
Feb 13, 2008
426
0
18,780
"Gamers shouldn't fret too much - 10.1 adds virtually nothing that they will care about and, more to the point, adds almost nothing that developers are likely to care about. The spec revision basically makes a number of things that are optional in DX10 compulsory under the new standard - such as 32-bit floating point filtering, as opposed to the 16-bit current. 4xAA is a compulsory standard to support in 10.1, whereas graphics vendors can pick and choose their anti-aliasing support currently. We suspect that the spec is likely to be ill-received. Not only does it require brand new hardware, immediately creating a minuscule sub-set of DX10 owners, but it also requires Vista SP1, and also requires developer implementation."
 
Nice cut and paste (without attribution [The InQ] :sarcastic: ). How about just reading the spec instead, as part of M$'s SIGGRAPH presentation:
http://www.microsoft.com/downloads/details.aspx?FamilyID=96cd28d5-4c15-475e-a2dc-1d37f67fa6cd&DisplayLang=en

Basically it returns much of what M$ removed to allow the GF8800 to be DX10 compliant, plus adds a little extra for image quality to give developers a few more tools in the shed.
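
Just to make the "compulsory 4xAA" bit concrete: right now, under plain DX10, an app has to ask the device whether 4x MSAA even exists for the format it wants, something like the rough sketch below (assuming you already have an ID3D10Device called device); as I read it, 10.1 just turns that answer into a guaranteed yes for the standard render-target formats.

// Under DX10 this query can legitimately come back with zero quality levels;
// DX10.1 makes 4x MSAA support mandatory, so the fallback path goes away.
UINT qualityLevels = 0;
HRESULT hr = device->CheckMultisampleQualityLevels(
    DXGI_FORMAT_R8G8B8A8_UNORM,  // typical back-buffer format
    4,                           // 4x MSAA
    &qualityLevels);

if (SUCCEEDED(hr) && qualityLevels > 0)
{
    // safe to create 4x MSAA render targets / swap chain for this format
}
else
{
    // DX10-era fallback: drop to a lower sample count or no AA at all
}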

It's funny, the article you... well... don't quote is another one of those "the sky is falling, DX10 / DX10.1, what do I do?" pieces, and instead of being pragmatic about it like Tom's, B3D, TechReport and ExtremeTech, the InQ follows the line that it's no good because it'll immediately ghetto-ize everyone. Right, like how DX9.0c did, eh!?! [:thegreatgrapeape:5]
 
This kind of thing prompted the responses from M$ about those fears, and they ended with a very important point about bringing DX10 to Vista, which still remains the biggest sticking point as to why it would never work under the old DX10 (now DX10.1) standard: virtualization.

http://www.next-gen.biz/index.php?option=com_content&task=view&id=6824&Itemid=2

Even Tim Sweeney from Epic picked that as his most anticipated part of the original DX10 spec (which was later pushed to DX10.1):

"I see DirectX 10's support for virtualized video memory and multitasking as the most exciting and forward-looking features. Though they're under-the-covers improvements, they'll help a great deal to bring graphics into the mainstream and increase the visual detail available in future games."

http://www.firingsquad.com/hardware/directx_10_graphics_preview/page8.asp

As much as it will be a long time before anything has DX10.1 as a minimum spec and rules out DX10 hardware, the benefits of DX10.1 features may be more in demand than you think, and may come sooner and offer more than expected.

PS: Of course, I still wouldn't buy hardware based on that alone.