
Nvidia Expected To Offer DirectX 10.1 GPU In Q1 2009

Source: Tom's Hardware US

Chicago (IL) - Nvidia has so far declined to say if and when it will support the DirectX 10.1 API in its GPUs, a technology that has been integrated into AMD’s Radeon cards for some time now. Roadmap information we stumbled across today offers a bit more clarity and suggests that the company’s next-generation desktop and notebook chips will support DirectX 10.1.

Image: Nvidia 9500GT (D9M)

DirectX 10.1 has been a confusing story for most of us, with no clear indication of which graphics card you should buy to get access to the best feature set. ATI Radeon cards, as well as S3’s, have supported DirectX 10.1 for a while now, but Nvidia remains silent about its future API plans - leaving the gaming market and its customers in uncertainty.
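For readers wondering how software actually distinguishes a 10.1 part from a 10.0 part: the Direct3D 10.1 runtime exposes a device-creation call, D3D10CreateDevice1, that takes an explicit feature level. Below is a minimal, untested C++ sketch (illustrative only, not from Nvidia or AMD) that requests feature level 10.1 and falls back to 10.0 - the same kind of probe a game can run at startup to decide whether its 10.1-only rendering paths are available.

```cpp
// Minimal sketch: probe for Direct3D 10.1 support at runtime.
// Requires the DirectX SDK headers/libraries and Windows Vista SP1 or later.
#include <d3d10_1.h>
#include <cstdio>

#pragma comment(lib, "d3d10_1.lib")

int main()
{
    ID3D10Device1* device = nullptr;

    // Ask for a hardware device at feature level 10.1 first.
    HRESULT hr = D3D10CreateDevice1(
        nullptr,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,
        nullptr,                        // no software rasterizer module
        0,                              // no creation flags
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_1_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr)) {
        std::printf("GPU exposes Direct3D 10.1\n");
    } else {
        // Fall back: the card is, at best, a plain DX10.0 part.
        hr = D3D10CreateDevice1(
            nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
            D3D10_FEATURE_LEVEL_10_0, D3D10_1_SDK_VERSION, &device);
        std::printf(SUCCEEDED(hr) ? "GPU is limited to Direct3D 10.0\n"
                                  : "No Direct3D 10 hardware device found\n");
    }

    if (device) device->Release();
    return 0;
}
```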

A presentation slide we received, but unfortunately cannot share in order to protect our source, clearly states that Nvidia will offer DirectX 10.1 support in its next-generation notebook GPUs, which are scheduled for a spring 2009 release. DirectX 10.1 is also likely to be offered in the next desktop GPU generation, which should debut either late in Q4 2008 or in Q1 2009, with a possible ramp throughout Q1 and Q2 of 2009.

So, what does that mean? Well, it depends on your view.

What we know for sure is that with Nvidia’s decision to support DX10.1, the rest of the industry will be embracing this API.

On the very high end, it may mean that you should think twice about spending $500 or more on a DX10.0 card. DX10.1 cards may be the better value proposition, if you want to run the latest games and don’t want to buy another $500 card six months from now. Nvidia’s new GPU generation, we hear, will also be 1.5 to 2 times faster than the current technology.

This decision may also have some implications for AMD. Realistically, AMD has a six-month advantage over Nvidia in terms of API support right now and appears to have competitive hardware in place as well. If AMD plays this game smart, it should be able to regain market share, as the 4800 series may be the more attractive technology for computer graphics at this time - at least for those of us on a limited budget.

On a side note, Nvidia will be switching to GDDR5 memory, most likely within 2008. As GDDR5 chips become more widely available, we expect the first Nvidia GDDR5 cards to hit the market in Q4.

18 Comments. This thread is closed for comments.
  • Anonymous, July 4, 2008 1:35 AM
    lol 1.5 - 2 times as fast as current technology?

    how nice that would be...

    I guess the gtx 280 was about a 50% increase over the 8800 gtx... but the only problem with the gtx 280 is it doesn't beat the 9800 gx2... Nvidia shot themselves in the foot... if they hadn't made the 9800 gx2... the gtx 200 series would be like OMG ITS SO MUCH BETTER... but since there's already a cheaper card that equals it and sometimes beats it... it's like wtf?
  • Anonymous, July 4, 2008 1:38 AM
    ^ what I forgot to say...

    it's looking even more likely that this architecture they claim gets 1.5-2 times the performance is actually a new architecture... while the rumors said the chip that would come out after the 9800 gtx was just another spin-off of the g80 core... well it looks to me like that's true... they really milked their g80 core to death lol
  • one-shot, July 4, 2008 5:20 AM
    After claiming the performance crown for so long, they are now following AMD's lead: DX10.1 and GDDR5. What else is new....
  • element, July 4, 2008 6:59 AM
    @thogrom the GTX 280 wasn't 50% more powerful than an 8800GTX; think around 20% at most
  • pulasky, July 4, 2008 8:28 AM
    Quote:
    will also be 1.5 to 2 times faster than the current technology.
    what a JOKE.
  • liemfukliang, July 4, 2008 1:35 PM
    The most reasonable claim is a bit faster than or equal to a GTX 280 GX2 :)).
  • Anonymous, July 4, 2008 2:07 PM
    the gtx 280 is about 30-50% faster than an 8800 gtx... look at benchmarks... as I said... the shot in the foot for nvidia was the 9800 gx2... if they hadn't had that... the gtx 280 would be viewed as all that
  • hannibal, July 4, 2008 5:36 PM
    The 9800 gx2 was a must because of the 3870x2... You should not blame Nvidia for that :-)

    I am also sure that there is more potential in the 280 when compared to the 8800gtx. It was just released, so it only has good driver support in a few games and the "most important" benchmarks.
    1.5-2 times faster is a lot to promise, but Nvidia has done it before. 50% to 100% more speed is not so bad. DX10.1 alone can offer 10 to 20%, so it's really not a bad thing.
    This is good news for all those 3870 owners, who may in the future see some DX10.1 support increase.
  • zarksentinel, July 5, 2008 2:14 AM
    1.5 to 2 times performance increase? just wait and see.
  • Heyyou27, July 5, 2008 3:45 AM
    I would really like to be able to play Crysis on Very High someday, but I'm not expecting much.
  • Lans, July 5, 2008 2:25 PM
    Hmm, DX10.1 in Q1 of 2009? Rumors had it that Nvidia was going straight to DX11, yet Nvidia must have been working on 10.1 for a while, so I guess Nvidia intentionally planted the idea? :-)
  • goonting, July 6, 2008 12:09 AM
    wish that 10.1 were enabled in most games... i just hate it when developers hold it back because they're in the grip of the N team
  • maximiza, July 6, 2008 12:42 AM
    If they had just put DX10.1 on the GTX 280 & 260, they wouldn't look like a bunch of lemmings.
  • iocedmyself, July 6, 2008 9:16 AM
    Bravo, in another 6-9 months nvidia will have managed to produce a 10.1-compliant card, what, a year and change after ATI? they jumped the gun on DX10 cards by a year... but hey, when you can force developers to disable 10.1 performance-enhancing features for your "the way it was meant to be overpaid" games, why bother? after all the gripe AMD/ATI got over the 2900xt release, i'm just loving this.

    The gtx 260 was a $400-450 card on release... the gtx 280 was what, $650-$700 and up? They released it 6 months late, but managed to snake their launch a few days ahead of the 4800 series. It's just beautiful.

    The $200 4850 is comparable in areas to the gtx 260... despite having 384mb less ram, and with both using GDDR3. So the "nvidia will lay waste to ati when GDDR5 is more widely available" defense doesn't fly. The fact that they could price their cards so high without at least gddr4 is obscene. The 4870 outdoes the gtx 260 in nearly every area, matches the gtx 280 in numerous games, and outdoes it in several more... at high resolutions, with AA enabled no less. With half the RAM, at 1/2 the cost of the initial gtx 280 price.

    All this without even an official driver release for the 4800 series yet. You cannot deny just how little effort they've put in, and how badly they've been price gouging. AMD/ATI aims for the mid-range market, because all in all that's a lot more sales than the high-end bracket, but ends up competing with the flagship competition.
  • V3NOM, July 6, 2008 9:49 AM
    name one game that uses DX 10.1 and i will be convinced of AMD/ATI's dominance...
  • element, July 6, 2008 12:22 PM
    V3NOM - Assassin's Creed. I use a Palit 8800GTX, still a great card, and i don't have to worry about this bullcrap. nvidia's slipping.
  • aznguy0028, July 7, 2008 7:26 AM
    uhh.... Assassin's Creed HAD DX10.1 but they pulled it at the last minute because of everyone's favorite company Nvidia. Good game to them for lagging the industry with their influence; now DX10.1 on their part and GDDR5. -yawn- ATI's been at DX10.1 for a while now. -_- gg.
  • techguy911, July 13, 2008 7:44 PM
    Those benchmarks of the gtx 200 series cards were done on beta drivers. i bought an evga gtx 280, and after downloading the newest driver from nvidia my fps in every game almost doubled; i really noticed an increase in performance.

    Therefore you can't rely on old benchmarks.