
DX11 to show up soon?

Last response: in Graphics & Displays
June 2, 2009 6:27:31 PM
June 2, 2009 6:40:40 PM

I'm sorry, I draw the line at being linked to Twitter :kaola: 

Mactronix
June 2, 2009 6:44:46 PM

OK, ATI is supposed to show some working DX11 cards tomorrow
June 2, 2009 6:54:44 PM

That's good to know. I trust we can rely on your good self to link us in to the reported findings, ready for when I get home from work?


Mactronix
June 2, 2009 7:21:08 PM

mactronix said:
I'm sorry, I draw the line at being linked to Twitter :kaola: 

Mactronix

Agreed, I really can't take twatters seriously.
June 2, 2009 8:20:27 PM

Found this smiley; this seems like an opportune moment.


[:mousemonkey:5]


Ta :) 

Mactronix
June 2, 2009 8:40:30 PM

This stinks, I was just about to buy a DX10 card.
June 2, 2009 8:41:10 PM

http://www.nordichardware.com/news,9394.html
Quote:
AMD shows DirectX 11 GPUs at Computex
Written by Andreas G 02 June 2009 18:19



AMD has a couple of DirectX 11 GPUs in the pipeline for this year. Among them we have the high-end RV870 and the mid-range RV840 (not sure about that name though), but of course there are more chips of the coming R8xx family. Anyhow, Computex is the place to go if you want to learn more about the coming hardware and according to a reliable source AMD has had a back room presentation where AMD had several DirectX 11 GPUs working and nearly ready.

Both we and the people there were quite shocked to learn how well the chips were working. It seems that most leaks have been fixed and that the chips are not far off. We have no launch date to share, but the source was confident we're talking a few months at most. TSMC and its problems of course play a role here, but things are apparently looking up for them too.

We will just have to wait and see, and most importantly keep our eyes open for the status updates of NVIDIA's coming G300 chip.


June 2, 2009 9:05:53 PM

First, I believe DX11 is supposed to be here soon. (is this supposed to work on Vista as well, or is this Win7 only?)

I'm not surprised AMD has working DX11 silicon already. DX11 isn't much more than DX10.1 with a few more things. AMD has had DX10.1 cards for quite a while, so tacking on the DX11 parts probably isn't that hard. They have also already made GDDR5 and 40nm cards, so again this isn't all that new for them. I wonder what Nvidia has, and did they really find a way to more than double the shader count?
June 2, 2009 9:12:56 PM

Yes it will work on Vista

Mactronix
June 2, 2009 10:00:50 PM

Rumors are running that ATI may come out with a full cross section, from top to bottom, for DX11.
Rumors on the nVidia side are somewhat more obscure, though; there are reportedly several DX11 cards in the pipeline, and it's said they're several to 3-4 months behind ATI.
As for specs on any cards, it's all speculation; nothing's really confirmed yet, or close. It may have a lot to do with ATI's surprise with their R700 family release, and the 480/800 shader thing, besides them just normally holding their cards tight to the vest. Also, LRB will be here eventually, and that's just another competitor to keep things from
June 3, 2009 4:11:01 AM

mactronix said:
Yes it will work on Vista

Mactronix


Link? After all, DX11 is a new API in Win7. We never saw DX10 in XP, so why would MS make such a massive SP for Vista?
June 3, 2009 4:25:19 AM

Also, Fud is almost crying, so it must be true.

http://www.fudzilla.com/content/view/14018/34/

AMD must be an absolute mile ahead of Nvidia on DX11. We could be talking about a 6-month lead or even more. What will be interesting to see is the price these come in at - will they be priced the same as the top-end Nvidias yet with double the performance?

Exciting times for AMD fanboys and girls everywhere. ;) 
June 3, 2009 4:26:18 AM

hunter315 said:
Unlike DX10, DX11 isn't supposed to be as much of a paradigm shift as DX10 was from DX9, so it likely won't be a massive SP.

According to this
http://blogs.msdn.com/ptaylor/archive/2008/07/28/gamefe...
most of DX11 will run fine on DX10 hardware, except for a few parts of it.


The hardware tessellation and SM5 are probably two of the most important parts, however, and those will need a DX11 card.
June 3, 2009 5:06:10 AM

mactronix said:
I'm sorry, I draw the line at being linked to Twitter :kaola: 

Mactronix


Just FYI, that's probably your BEST ongoing source for information on ATi, at least that which would put it in a good light.

Of course you could trust people in random forums; however, if you look at your driver release notes you might see a familiar name there, and now a familiar link.

Terry's been out front in the forums and public spaces since before the R9xxx series; he was primarily found at Rage3D and DriverHeaven, with the occasional B3D appearance, and recently AMD's own Game forum, but this was a quite logical and expected step for anyone who's dealt with him in the past. He's quite open and public, and he's been a good read and a source for hints, just like Wavey Dave.

I'm just saddened he listens to JACK, at least he has the good sense to balance that with the Mighty Q ! :sol: 
June 3, 2009 5:33:27 AM

Nice catch, I was surprised that Charlie didn't have an update already, since Terry linked directly to his aptly-named "SemiAccurate" site.

Time to bring back an old Smiley in honour of the find ...
June 3, 2009 5:44:12 AM



Hmm, let's see: 959M transistors on the 4870 compared to possibly 1.2B on this. The 3870 had 666M, so yep, that wafer is probably the top-end 58xx GPU, assuming the writer's math isn't way off.
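The transistor-count comparison above can be turned into a rough die-size estimate. This is a back-of-envelope sketch under my own assumptions (not from the thread): transistor density scales roughly with the inverse square of the process node, and the 55nm RV770 die is taken as a ~256 mm^2 reference point.

```python
# Back-of-envelope die-size estimate from the transistor counts quoted above.
# Assumptions (mine, not from the post): density scales ~ (ref_node/node)^2,
# and the HD 4870 (RV770) die is ~256 mm^2 at 55 nm.

RV770_TRANSISTORS = 959e6   # HD 4870 count, from the post above
RV770_AREA_MM2 = 256.0      # assumed reference die area at 55 nm
NEW_TRANSISTORS = 1.2e9     # speculated next-gen count from the post

def estimated_area(transistors, node_nm, ref_transistors=RV770_TRANSISTORS,
                   ref_area=RV770_AREA_MM2, ref_node_nm=55.0):
    """Scale area by transistor count and a (node/ref_node)^2 density gain."""
    density_gain = (ref_node_nm / node_nm) ** 2
    return ref_area * (transistors / ref_transistors) / density_gain

area_40nm = estimated_area(NEW_TRANSISTORS, 40.0)
print(f"~1.2B transistors at 40 nm -> roughly {area_40nm:.0f} mm^2")
```

Under these assumptions the speculated chip lands well under the RV770's area, which is only a sanity check on the numbers, not a prediction.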
June 3, 2009 5:49:53 AM

Quote:
The hardware tessellation and SM5 are probably two of the most important parts, however, and those will need a DX11 card.


As any AMD fanboy should know, AMD has had a tessellation engine in their cards since the 2xxx days. They first built one into the Xenos, and carried it over to every card since then. They only need to support SM5. It's AMD and Intel that need to add a bunch of stuff, seeing as they never supported DX10.1.

EDIT: As pointed out, the last sentence should read, "It's Nvidia and Intel that need to add a bunch of stuff, seeing as they never supported DX10.1."
June 3, 2009 5:57:26 AM

croc said:
Very different... Hard to believe as well, but now that I have several different links saying the same thing, I guess I'm going to have to believe...


Well, it is M$'s own links; they wouldn't publish that if there was a chance it wasn't going to happen. I think they learned their legal lesson after the whole 'Vista Ready, Capable, can display a Vista logo in 2D' fiasco.

Quote:
OT, I thought you were headed down to Kiwiland?


That's the plan for skiing next summer post Olympics: Argentina, Chile, Oz and NZ.
June 3, 2009 5:58:18 AM

4745454b said:
Quote:
The hardware tessellation and SM5 are probably two of the most important parts, however, and those will need a DX11 card.


As any AMD fanboy should know, AMD has had a tessellation engine in their cards since the 2xxx days. They first built one into the Xenos, and carried it over to every card since then. They only need to support SM5. It's AMD and Intel that need to add a bunch of stuff, seeing as they never supported DX10.1.


Yes, I did know that, but I felt it was fair to pretend it didn't exist, seeing as Nvidia doesn't have it and it never got used because of that anyway.
June 3, 2009 6:05:52 AM

4745454b said:

As any AMD fanboy should know, AMD has had a tessellation engine in their cards since the 2xxx days. They first built one into the Xenos, and carried it over to every card since then. They only need to support SM5. It's AMD and Intel that need to add a bunch of stuff, seeing as they never supported DX10.1.


I assume that last section is nV and Intel, not AMD.

Anywhooo, remember that the tessellation in DX11 is handled differently and can call upon a hull shader and a domain shader, both of which would be required in hardware to make it work. So it's slightly different, not a straight transition, and would require software emulation on an HD2-4K card to enable.

ATi/AMD gloss that over with their 'history of tessellation' bit, with the original TruForm included.

You'll probably also see a re-organization of the shaders for a more efficient compute-shader model, but a little less than the change nV is likely to make to make their shaders more efficient in a compute role.
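The hull/tessellator/domain flow described above can be sketched as a toy pipeline. This is a plain-Python stand-in for illustration only: the function names, the distance-based factor heuristic, and the patch layout are all my invention, not the Direct3D 11 API.

```python
# Toy sketch of the DX11 tessellation flow: a "hull shader" picks a
# tessellation factor per patch, the fixed-function tessellator generates
# parametric (u, v) sample points, and a "domain shader" turns each point
# into a vertex. Illustrative stand-ins only, not real D3D11 calls.

def hull_shader(patch, camera_distance):
    # Choose more subdivision for nearby patches (a common heuristic).
    return max(1, int(8 / max(camera_distance, 1.0)))

def tessellator(factor):
    # Emit (u, v) domain points over a triangle patch; count grows with factor.
    pts = []
    for i in range(factor + 1):
        for j in range(factor + 1 - i):
            pts.append((i / factor, j / factor))
    return pts

def domain_shader(patch, uv):
    # Interpolate the three control points barycentrically.
    (ax, ay), (bx, by), (cx, cy) = patch
    u, v = uv
    w = 1.0 - u - v
    return (w * ax + u * bx + v * cx, w * ay + u * by + v * cy)

patch = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
factor = hull_shader(patch, camera_distance=2.0)
verts = [domain_shader(patch, uv) for uv in tessellator(factor)]
print(f"tess factor {factor} -> {len(verts)} vertices from one 3-point patch")
```

The point of the split is visible even in the toy: the application only supplies the 3-point patch, while the extra vertices are generated on the fly between the two programmable stages.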
June 3, 2009 6:07:34 AM

Well, nVidia hasn't been aiming in that direction, as it's GPU related, and not GPGPU related, heheh
June 3, 2009 6:17:32 AM

So Nvidia are building a tessellator from scratch, going DX11 even though they skipped 10.1, sticking more GPGPU stuff on, and moving to GDDR5 while shrinking to 40nm. Does anybody actually believe this is possible? lol

The reality is, they might as well sit this round out. Maybe concentrate on fixing those laptops instead.
June 3, 2009 6:24:11 AM

Jennyh, remember we thought the same thing about the G80 vs R600, so while it looks daunting, you never know until the final product ships.
June 3, 2009 6:30:00 AM

Rumors are still running that G300 will be a monster: high performance, not only in GPGPU but in gaming as well. But it'll cost ya.
June 3, 2009 6:32:45 AM

Well, Nvidia is a big corp which should have a good R&D department, so I have good faith they will deliver on their refinement release of DX11 cards. Nvidia didn't skip DX10.1, but didn't want to change their processes for something that they knew wouldn't last that long anyway.

ATI always promotes themselves as the best through innovation, while Nvidia takes the approach of being the best through refinement and making what works better.

ATI can take that approach due to the fact that they aren't leading in Nvidia vs ATI, and they have a smaller processing facility which costs less to change their practices from one card to the next.

Edit: G80 vs R600 was rape cake T_T, I hope it doesn't turn out like that again. ATI did horribly vs Nvidia, making Nvidia charge *** loads for their premium product because ATI couldn't deliver, and made it worse with the G92 release.

Also, rape does not trigger the censor, lol
June 3, 2009 6:40:17 AM

Kinda to the point though, as DX11, tessellation, etc. are all brand new, and nVidia has screwed the pooch when going to new processes before too. I see jennyh's point, but at most it'll only take more time for nVidia.
June 3, 2009 7:06:46 AM

IzzyCraft said:
Well, Nvidia is a big corp which should have a good R&D department, so I have good faith they will deliver on their refinement release of DX11 cards.


They were a big company 'with a good R&D dept' when they made the FX series too. Doesn't mean much.

Quote:
Nvidia didn't skip DX10.1


Yes they did, and went so far as to downplay it, saying their reasoning was transistor budget.

Quote:
ATI always promotes themselves as the best through innovation, while Nvidia takes the approach of being the best through refinement and making what works better.


They each do that when it fits their own current situation; that wasn't the party line during the GF6800/X800 era, and nV didn't expect that during the R9700/FX era before it either.

Quote:
ATI can take that approach due to the fact that they aren't leading in Nvidia vs ATI, and they have a smaller processing facility which costs less to change their practices from one card to the next.


ATi and nV use the same facility, TSMC.

As for who's leading, nV loses money making and selling their GPUs, while ATi is AMD's main profit source, stemming the flow of red from the CPU division. So really, who leads whom right now?

It's also unlikely that the RV8xx series leaves itself open to a repeat of the G80 vs R600 situation, as it doesn't appear to have the second launching part, and it also doesn't have to worry about being on a different node than the competition: both will be using 40nm and should experience the same issues if there's a repeat of the problems that were experienced with 80nm HS.
June 3, 2009 7:39:33 AM

Quote:
Rumors are still running that G300 will be a monster: high performance, not only in GPGPU but in gaming as well. But it'll cost ya.


This actually bugs me big time. Nvidia STILL hasn't scaled the GTX chips down to the lower segments. They renamed the G92 chips and sold them as new. Now (rumor has it) they will make an EVEN BIGGER CHIP that supports DX11? If they couldn't scale down the GTX, how will they scale down the G300? Will they support DX11, but only for those who can pay their price? Will they support DX11 in the high end, and support the other segments 12-18 months later? This is seriously F'ed up, and I hope this isn't the case.
June 3, 2009 7:44:28 AM

Quote:
ATI always promotes themselves as the best through innovation, while Nvidia takes the approach of being the best through refinement and making what works better.


Except for the 6xxx series against the X8xx series. The 6xxx supported SM3, while the X8xx only supported SM2. At least AMD was big enough to admit they didn't support it that round, and that they would the next. How many rounds has Nvidia skipped DX10.1 now? (I guess two, as long as you don't count the 9xxx chips as new.)

Companies always put the best spin on things. If it suits the marketing dept to claim SM3 support before their competitors, then they'll do it. But if the competitor fires back with something else, well, then they'll claim some excuse. What matters to me is AMD said next round, and did it. Nvidia only seems to rename chips and put spin on things.

Edit: fixed quote tag.
June 3, 2009 9:15:23 AM

Well, it's speculated that G200 should shrink to 40nm for mid-range, with the good ol' G90s for entry/low end. A higher-clocked, much smaller G200 should outperform the R700 series, and be priced/margined well for them, if the 40nm problems are solved.
Speculation is, they (ATI) have their next gen ready, but due to the 40nm problems, they're trying to ramp production/inventory for release time, so there'll be no shortages like we see with the 4770.
June 3, 2009 10:26:25 AM

A 40nm G200 might outperform a 40nm R700, but will it outperform the bottom-end R800?

ATI don't just hold all the aces, they hold all the cards (no pun intended); they are in complete control of the market except for the very, very top. They will get that back soon of course with the 58xx and the 58xx X2. I will be disappointed if the single-GPU 58xx isn't at least as fast as a GTX 295, however.

If we fast forward 3 months, the GPU market could easily look like this:

5870 X2 an absolute mile ahead of anything else (possibly 2x faster than a GTX 295).
5870 in second place, tied with the GTX 295.

Now, assuming Nvidia shrinks the G200 and cuts prices as much as they can (don't forget the 448- and 512-bit buses), those GTX 285, GTX 275 and GTX 260 cards still have to be faster than the equivalent 5850s and 5830s. The difference again is, ATI will make money on these small chips while Nvidia keeps losing it.

The good old G90s will be crushed by 40nm 4870s and 4850s.

Nvidia need a miracle just to stay in the game next year. They are 3-6 months behind, and although I expect the G300 to be a real powerhouse, it could already be all over for them by the time it's released.
June 3, 2009 12:10:53 PM

HardOCP has some DX11 screenies up. As expected, nothing there wows me, and I wouldn't have known it was DX11 if the pics weren't labeled.
June 3, 2009 3:23:26 PM

gamerk316 said:
HardOCP has some DX11 screenies up. As expected, nothing there wows me, and I wouldn't have known it was DX11 if the pics weren't labeled.


But that's the same as DX8.1 vs DX9, DX9 vs DX10, etc.

A still screenshot does little to expose any differences, especially in things like efficiency, which is the name of the game with DX10.1 & DX11 - not dramatic visual differences like DX7/8 to DX8.1, where suddenly you had shiny water/surfaces.

How are you going to appreciate the benefits of tessellation, memory optimizations or better multi-core support in a 2D image?

You could emulate almost any static visual in DX10 using DX8.1, so screenies of a sandbox aren't really going to do much. Even something like HDR or specular lighting wouldn't be truly appreciable until you had video, and a fleshed-out scene with backgrounds and other objects to interact with.

I'm going to reserve judgment until there's motion video of DX10/10.1 alongside DX11 trying to achieve the same final output.

Also remember this is early hardware with next to no driver development, so I'm not expecting much for a while, other than developers doing a lot of crass demo stuff at first.
June 3, 2009 5:02:34 PM

What bothers me is that tessellation should be immediately obvious if used. And their tessellation demo ran at 14 FPS...

I think DX11 might be another incremental update to DX9...
June 3, 2009 9:25:22 PM

No, not really. Think of tessellation taking a standard 70,000-polygon model running at 2-3 FPS, and instead turning it into a 5,000-polygon tessellated model that looks the same and runs at 14 FPS. It's a massive improvement, but you wouldn't SEE a difference in a screenshot, and wouldn't know 14 FPS is 'good'.

If tessellation works properly you shouldn't be able to tell the difference between a high-poly model and a low-poly tessellated model, so a still wouldn't tell you much, and FPS alone wouldn't either, unless, like I said, you had another card running DX9/10/10.1 beside it to compare.

One thing is for sure, it's far from an incremental update to DX9; it's long since been a complete split in how things are done, from a threading, memory and feature perspective.

But also like I said, you can make DX8.1 look just like something rendered in DX11 in a still shot (the same way a software/CPU render can look like DX11 output); the tough thing is to make them look similar in full-motion video.
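The polygon numbers above work out with simple arithmetic. The sketch below assumes (my assumption, not stated in the post) a uniform scheme where each tessellation level splits every triangle into 4, so the GPU stores only the small base mesh and amplifies it on-chip.

```python
# Arithmetic behind the 5,000-vs-70,000 polygon example above.
# Assumption (mine): uniform 1-to-4 subdivision per tessellation level,
# so the stored vertex data stays at the base-mesh size.

def amplified_triangles(base_triangles, levels):
    """Triangle count after `levels` rounds of 1-to-4 subdivision."""
    return base_triangles * 4 ** levels

base = 5_000
for levels in range(3):
    print(f"{levels} levels: {amplified_triangles(base, levels):,} triangles")
```

Two levels of subdivision already exceed the 70,000-triangle model quoted above (5,000 * 16 = 80,000), while the application still only ships 5,000 triangles to the card.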
June 3, 2009 10:44:55 PM

I'm not sure what he means by it either, but it's probably a loose 'can't do that', i.e. it can be done, but it would take ages to process on a CPU compared to a GPU.
June 3, 2009 11:18:31 PM

Not sure how it's broken down, but maybe using the compute shaders vs the CPU it requires only a single pass, and thus the latency is slashed to the point where the strain on either or both the CPU and GPU would make it workable, versus reducing the amount or suffering severe FPS lags.
June 3, 2009 11:25:56 PM

The compute shaders are really interesting, actually. I've only just started to learn about the tech in DX11, but it looks pretty awesome, on paper at least.
June 4, 2009 5:50:11 AM

SS, I think he meant that computing thoughts (fear, direction choice, grouping, etc.) for each character, and having them all interact, would be something a single CPU couldn't do in 'real time' with so many characters on screen at the same time. It really depends on all the features/emotions/choices that are accounted for by that model.

Now just give me better bots in UT ! :sol: 
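The reason per-character AI maps well onto compute shaders, as discussed above, is that each agent's update reads shared state and writes only its own slot, so a GPU can run one thread per agent. Here is a minimal CPU stand-in in plain Python; the "flee the threat" rule, the radius, and the step size are all invented for illustration.

```python
# Minimal sketch of a data-parallel per-agent AI update: every agent's new
# state depends only on the previous frame's shared state, so each update
# could be one GPU thread. Rules and constants here are illustrative only.

def update_agent(i, agents, threat):
    x, y = agents[i]
    tx, ty = threat
    dx, dy = x - tx, y - ty
    dist2 = dx * dx + dy * dy
    if dist2 < 25.0:            # within radius 5 of the threat: flee
        scale = 0.1
        return (x + dx * scale, y + dy * scale)
    return (x, y)               # otherwise idle

agents = [(float(i), 0.0) for i in range(10)]
threat = (0.0, 0.0)
# A compute shader would dispatch these updates in parallel, one per thread;
# here a comprehension stands in for the dispatch.
next_agents = [update_agent(i, agents, threat) for i in range(len(agents))]
print(next_agents[1], next_agents[9])
```

The key property is that no update mutates `agents` in place: double-buffering the state is what makes the per-agent work order-independent and therefore parallelizable.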
June 4, 2009 12:47:57 PM

What if Nvidia has got something up its sleeve that's going to make us fall flat (flat broke, that is)?
http://www.hexus.net/content/item.php?item=18666

It may be highly unlikely, but who knows: maybe AMD has a very good offer and they are going to dump ATI (that would be crazy, at least from what we can see), because if they keep ATI in the new deal it would turn into a monopoly that is going to drive prices up, up and away (and it may actually hurt them more than help them if they keep ATI). This offer could be plausible given that Nvidia hasn't shown much in GPUs lately (not that ATI has shown too much, but at least it's a whole lot more), and that Nvidia needs a way to enter the CPU business without being sued or something by Intel: by owning x86-64, Nvidia would nearly force Intel to allow them to use the x86 license that AMD has (the last processors Intel manufactured which did not use AMD's x86-64 design were early versions of the desktop Pentium 4 "Prescott", introduced in February 2004, and mobile Intel Core, introduced in January 2006).

But back to DX11: it may be something of a great deal that ATI has managed to strike; see http://www.youtube.com/watch?v=ghazN5L7Ncw and http://www.youtube.com/watch?v=-eTngR6M37Q . I believe that the first games to use DX11 are going to be released by the four companies that are in the video (gee, ain't I a Sherlock), but seriously, the release of AvP from Rebellion (early 2010) and the fact that Phenomic wants to introduce DX11 in BattleForge show that we might even have some games with DX11 when the new HD 5870 gets released. I hope they get some companies that release highly rated games to work on implementing DX11; they kind of need it after the very few games that had DX10.1, some of which are barely known.
Edit: Also check this link about DX11 and the head in the second video. http://www.pcgameshardware.com/aid,686395/AMD-shows-Dir...