
A lot of Nvidia rumors. Looks like ATI will have a DX11 card before Nvidia!

Tags:
  • Graphics Cards
  • Nvidia
  • ATI
  • Radeon
  • Graphics
  • Product
October 28, 2008 2:41:16 PM

Checking The Rumor Feed http://rumorfeed.blogspot.com/2008/10/ex-nvidia-employee-tells-all-news-on.html as well as some other sites, I noticed that only the GT300 is currently confirmed to have DX11. Since the Radeon 5X generation will all have DX11, this could finally end my Nvidia fanboyism. They were first with DX10.1, and now they will be first with DX11. I am still ticked off that my higher-priced GTX280 is slower than my friend's lower-priced 4870X2.

What the heck is Nvidia doing lately?

What do you guys think about their cards as of late? I am pretty disappointed.


October 28, 2008 2:56:21 PM

ATI is the right choice in terms of performance and price for Windows.

Nvidia still reigns with Linux due to poor ATI Linux drivers.
October 28, 2008 3:15:44 PM

Now, is anyone else surprised that ATI will beat NVIDIA to DX11? Because I sure as H3!! wasn't.
October 28, 2008 3:29:32 PM

I just assumed ATi would make it to DX11 first. DX10.1 was about half the battle.
October 28, 2008 3:30:30 PM

Remind me again what was so great about DX10? And DX10.1? Yes, I have a DX10 capable GPU and I have a DX10 capable OS... but to the best of my knowledge I don't own a single DX10 capable game. The newest game I bought was AoC (what a let-down that was!) and although it was SUPPOSED to support DX10, alas it did not.

So hooray for ATI pushing the hardware envelope while software publishers bring up the rear with the software side of the equation. I guess DX11 will be really awesome if you want to watch a 3DMark08 session loop endlessly... otherwise... enh!
October 28, 2008 3:34:02 PM

So the GT206, GT212, and GT216 are still sticking to DX10 or will they at least go for 10.1 support?
October 28, 2008 3:37:17 PM

At the moment, I think Nvidia's cards are better for F@H, but otherwise, ATI has got the market cornered with price/performance. Nvidia needs to fix a couple of its chip problems and get with it about implementing DX10.1, DX11, or whatever. Otherwise, it will get left behind by ATI, in my opinion.

As for games that use DX10 or better, that's going to take some time. The game developers have been slow on the uptake, and that's disappointing. Still waiting for Starcraft 2 at the moment. Probably be a long wait though. Maybe by this time next year it will be out.
October 28, 2008 3:40:41 PM

Just because DX11 will be supported first doesn't mean the cards will be able to run it well...
so in that sense, it doesn't really matter who's coming out with it first. But I do commend ATI for pushing the technological envelope from one generation to the next for a few gens now....

Hopefully they stick to their marketing plans, keep the cards in the low-to-mid range, and concentrate less on the upper-class cards... I think that would bring more incentive to buy PC games =]
October 28, 2008 3:45:35 PM

homerdog said:
So the GT206, GT212, and GT216 are still sticking to DX10 or will they at least go for 10.1 support?



They're sticking with DX10 till GT300.
October 28, 2008 4:03:34 PM

The initial half-decent DX10 cards were expensive when newly released, and the affordable cards like the 8600GTS and 2600XT were way too underpowered for any DX10 gaming use. If we look now, DX10 hasn't been adopted in many games, but at least there are plenty of affordable and capable cards available.

I don't know for sure about DX11, but I can see this happening again.
October 28, 2008 4:04:42 PM

I don't know about anyone else, but I'm still using DirectX 9 and have no plans to upgrade to Vista just to take advantage of DX10, let alone worrying about DX11.
October 28, 2008 4:28:33 PM

There are a ton of games with DX10 support:
Assassin's Creed (which had DX10.1 but was patched back to DX10 due to pressure from Nvidia), BioShock, Crysis, Far Cry 2 (which actually has DX10.1) and Lord of the Rings Online. There are a lot more, but those are the ones I own.

After seeing these games in DX10, I simply will not go back to 9! Also, Far Cry 2 looks a bit better in DX10.1 and runs a bit smoother on my friend's machine than it does with DX10 on mine. The only difference between our PCs is that I bought the GTX280 at launch and he was smart enough to wait for the 4870X2.
October 28, 2008 5:06:55 PM

tehlexinator said:

Also, Far Cry 2 looks a bit better in DX10.1 and runs a bit smoother on my friend's machine than it does with DX10 on mine. The only difference between our PCs is that I bought the GTX280 at launch and he was smart enough to wait for the 4870X2.


Nvidia hardware can use DX10.1 in Far Cry 2; look at the thread that I posted. It makes sense really, considering Far Cry 2 was developed on Nvidia hardware.

http://www.tomshardware.co.uk/forum/256322-15-nvidia-dx...

The reason it's smoother on your friend's machine is down to the 4870X2 being a faster card than your 280.
October 28, 2008 5:19:28 PM

My friend owns a 4870 X2; he gets higher frames than one of my 280s, but his minimum framerate is downright terrible, and he has a $1000 CPU while I have a $250 CPU... sigh.... That's why I hate X2 cards and why I got rid of my GX2; the minimum frame rate issue is just too much of a pain in the bum.

I remember in Source with the GX2 (even worse with Quad), frame rates dipped into the 20s @ 1280x1024 for no reason.

Meh.
October 28, 2008 5:24:31 PM

tehlexinator said:
After seeing these games in DX10, I simply will not go back to 9!


I would say that is mostly the placebo effect, or Crytek's trick of dumbing down the settings for DX9 to make DX10 look like it offers something DX9 doesn't.

I am excited about DX11, but I won't be shocked if they find a way to flub it up like they did with DX10: ship a DX11a that is 1/3 of what DX11 is supposed to be, then later release a DX11.1. It will be to everyone's benefit if NV supports DX11 ASAP. A lot of games go through TWIMTBP....

BTW... how did FC2 end up being DX10.1 and part of TWIMTBP? Do we owe NV's pseudo-support for DX10.1 to this? I can't find that article... forgot where it was...

Ah
http://en.expreview.com/2008/10/21/nvidia-can-use-the-d...
October 28, 2008 5:58:05 PM

tehlexinator said:

What the heck is Nvidia doing lately?

What do you guys think about their cards as of late? I am pretty disappointed.


They're fine for now, but if the G300 (I hate people using GT when it was the G200, not GT200, like we were saying way back in February) doesn't arrive until November 2009, that's a long time out. That might just be around the time of an economic recovery if there are good government stimuli in the various economies over the next year, but really that's a return to the 18-month refresh cycle, and also a bit late if the competition launches in the summer months. I have a feeling that both will depend a lot on the new games, and IMO AMD had better get its act together on this developer involvement thing, or else it won't matter what sucks and what doesn't. But like the GF8's advantage over the HD2K, having capable DX11 hardware out there for developers prior to nV, and potentially prior to Intel, gives AMD a little bit of influence on setting the standards for DX11 development. It's not huge, but it is a benefit.

For right now though I think nV is treading water. They can't jump to either DX10.1 or DX11 without a major architecture change, so there's no point, and really it's all a question of cost of production IMO: their current product's lower yields per wafer hurt their performance-per-$ cost, and if they have to compete against another company that can bring close to similar performance at a lower cost, then there is pricing pressure. To me that's the biggest concern for them right now, so get a cost-reducing refresh out there ASAP. As for their future products, it doesn't look good, but we've had similar negative outlooks on things like the G80, and that turned out to be a triumph for them, so while it doesn't look great, it might not be as bad as people are thinking.

evongugg said:
ATI is the right choice in terms of performance and price for Windows.

Nvidia still reigns with Linux due to poor ATI Linux drivers.


Which is like saying my graphics card is great in 97% of situations, but in the other 3% the competition is great. Not a big deal, and irrelevant to this thread, since DX is a Windows-focused API anyway; so it's a pretty flimsy strawman for the topic at hand. :heink:

Linux is also a small niche, and while there's a difference, it's an area where people expect to have to tweak the drivers, and right now ATi at least is improving; depending on who you talk to, their open-driver strategy is preferred to nV's more closed system. But really it's like discussing multi-monitor support: it only matters to a small number of people that one would be better than the other, and they will go and seek out those solutions. However, being good at it doesn't compensate for problems in the other areas, because neither Linux users nor multi-monitor or even Apple users are enough of a concern to be the driving force for GPU design. They'll be considered as gravy, but nowhere near a make-or-break concern the way other factors are.
October 28, 2008 7:12:33 PM

speedbird said:
Nvidia hardware can use DX10.1 in Far Cry 2,...


Not really. What it can do is use a workaround to take advantage of a feature that is found by default in DX10.1; similar, but not quite the same thing.

It's similar to other possible workarounds, but it's not actually DX10.1.
October 28, 2008 8:56:00 PM

TheGreatGrapeApe said:
Not really. What it can do is use a workaround to take advantage of a feature that is found by default in DX10.1; similar, but not quite the same thing.

It's similar to other possible workarounds, but it's not actually DX10.1.


Yeah, I know that, but I probably should have stated in my last post that it was a workaround within DX10 rather than the full 10.1 spec.

Logically, ATI were never going to have anything extra over Nvidia, because the game was developed on Nvidia hardware.
October 28, 2008 9:37:11 PM

I really don't think it will matter, as nV and ATI will probably both have their cards out before DX11 actually comes out.
October 28, 2008 9:57:58 PM

TheGreatGrapeApe said:
(I hate people using GT when it was the G200, not GT200, like we were saying way back in February)

I used to think the same thing, but -everyone- calls it GT200 so I figured what the hell.

As for Far Cry 2, it uses a DX10.1 feature (multisampled Z-buffers, I think) that just happens to exist on NVIDIA DX10 hardware but is not exposed by the DX10 API. So NVIDIA enabled it via an extension.

www.bit-tech.net/news/2008/10/22/nvidia-gpus-support-dx...
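For readers who want that distinction made concrete, the sketch below shows the D3D10.1 side of the feature: a multisampled depth buffer created so it can also be bound as a shader resource and read per-sample, which plain D3D10 disallows (hence the extra resolve pass, or a vendor driver extension on NVIDIA DX10 parts). This is a minimal illustration rather than anything from the linked article: the helper name is made up, error handling is omitted, and an existing ID3D10Device1 is assumed.

```cpp
// Minimal sketch: a 4x MSAA depth buffer that is also a shader resource.
// Legal under Direct3D 10.1; plain Direct3D 10 rejects this combination.
#include <d3d10_1.h>

// Hypothetical helper; assumes `device` is an already-created ID3D10Device1*.
ID3D10ShaderResourceView1* CreateMsaaDepthSRV(ID3D10Device1* device,
                                              UINT width, UINT height)
{
    // Typeless format so the same texture can serve as both a
    // depth-stencil target and a shader resource.
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    desc.SampleDesc.Count = 4;   // 4x MSAA
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_DEPTH_STENCIL |
                            D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* depthTex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &depthTex);

    // View onto the depth bits; a shader can then read raw samples with
    // Texture2DMS<float>.Load(pixel, sampleIndex) instead of forcing a
    // resolve to a single-sample copy first.
    D3D10_SHADER_RESOURCE_VIEW_DESC1 srvDesc = {};
    srvDesc.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvDesc.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;

    ID3D10ShaderResourceView1* srv = nullptr;
    device->CreateShaderResourceView1(depthTex, &srvDesc, &srv);
    return srv;
}
```

Reading the depth buffer directly like this saves a whole extra depth pass when anti-aliasing is on, which is presumably where the smoother DX10.1 performance people report in Far Cry 2 comes from.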
October 28, 2008 10:50:22 PM

Wonder how much these DX11-ready cards from ATI will cost, seeing as their current cards are already a steal.
October 29, 2008 12:16:58 AM

rodney_ws said:
Remind me again what was so great about DX10? And DX10.1? Yes, I have a DX10 capable GPU and I have a DX10 capable OS... but to the best of my knowledge I don't own a single DX10 capable game. The newest game I bought was AoC (what a let-down that was!) and although it was SUPPOSED to support DX10, alas it did not.

So hooray for ATI pushing the hardware envelope while software publishers bring up the rear with the software side of the equation. I guess DX11 will be really awesome if you want to watch a 3DMark08 session loop endlessly... otherwise... enh!


There are games that came out DX9 but have DX10 patches. Yes, I understand your frustration. Games take years to develop, and we'll probably only see DX11 games in a couple of years. Until then, patches for DX10 will get better. As most developers are paid (or bribed with hardware) to participate in Nvidia's TWIMTBP program, you won't see DX10.1 support in most games that arrive as DX10 over the next year.

That doesn't make Nvidia a better choice, or ATI a worthless GPU investment come June 2009. I'm sure optimizations for DX11 have some impact on DX9 gameplay, or patched DX10 gameplay. After all, the newer cards will be a die shrink at the very least.

IMHO, it's Nvidia that's held back the implementation of DX10 in games because they felt that Vista wasn't necessary. Plus, Microsoft screwed up the roll out of Vista. It's great with SP1 and I'm having fun playing the DX10 patched version of LOTRO, but that really doesn't look much different than my wife's DX9 version under XP. The water looks a bit better, but the difference isn't as great as it was between DX8.1 and DX9.

speedbird said:
The initial half-decent DX10 cards were expensive when newly released, and the affordable cards like the 8600GTS and 2600XT were way too underpowered for any DX10 gaming use. If we look now, DX10 hasn't been adopted in many games, but at least there are plenty of affordable and capable cards available.

I don't know for sure about DX11, but I can see this happening again.


Those cards suffered from being marketed towards HTPC and video editing. ATI fixed the problems with the next generation, the 3850, and is doing even better with the 4670 and 4830. Nvidia fixed their marketing-niche problems with the 9600 cards. I can't see them making the 2600/8600GT mistake again, as they didn't make it in the last generation. The mainstream cards are capable of gaming at 1280 resolutions, sometimes even with AA and AF enabled.
October 29, 2008 3:13:55 AM

BTW, the person that said ATI has no Linux support should have a look round the web. ATI's support for Linux is a lot better than Nvidia's; it's all to do with a love affair between AMD and Linux.
October 29, 2008 3:29:46 AM

The root of DX10's problems is its incompatibility with XP. Developers don't make games based entirely on DX10 because they'd lose the half of their market that still plays on Windows XP, so they base games on DX9 and just add a few DX10 effects. If Microsoft let it work on XP, we'd see a lot more DX10 games.
October 29, 2008 6:20:06 AM

lol i hate it even more when people call it 2x0 GTX instead of GTX 2x0 -.-'

Still, will the visual increases be worth the performance hit? I personally thought DX10 wasn't worth it over DX9... we will see.
October 29, 2008 7:19:29 AM

makotech222 said:
The root of DX10's problems is its incompatibility with XP. Developers don't make games based entirely on DX10 because they'd lose the half of their market that still plays on Windows XP, so they base games on DX9 and just add a few DX10 effects. If Microsoft let it work on XP, we'd see a lot more DX10 games.


Yep, that about hits the nail on the head as far as I am concerned. I don't think DX10 can just be enabled on XP though; I have heard that there is no reason for it not to work, but others have said it can't and have given reasons. Right or not, I believe the 'it can't' version. Doesn't matter if it's games or apps or hardware; the bottom line is, as you say, that the biggest user base is XP and so that's where the companies aim.
Perversely, this is exactly what M$ were trying to avoid with their ham-fisted, heavy-handed approach to the new API. I'm sure they could have initially made it work as an update on XP, but they wanted a new revenue stream, and unfortunately for them the basket they chose to put their eggs in (DX10) wasn't the killer deal-sealer some idiot at M$ thought it would be.
That alone has to tell you that DX10 was meant to be so much more at release; however, due to pressure from Nvidia it was dumbed down to suit the cards they had, and now Vista and DX10 are about where they should have been to start with.

Mactronix
October 29, 2008 10:52:26 AM

yipsl said:
IMHO, it's Nvidia that's held back the implementation of DX10 in games because they felt that Vista wasn't necessary. Plus, Microsoft screwed up the roll out of Vista. It's great with SP1 and I'm having fun playing the DX10 patched version of LOTRO, but that really doesn't look much different than my wife's DX9 version under XP. The water looks a bit better, but the difference isn't as great as it was between DX8.1 and DX9.


The biggest difference between DX9 and DX10 in LOTRO is the shadow quality. DX10 shadows are soft-edged and more dynamic than their DX9 counterparts. Compare tree shadows in the Shire in the daytime and note the differences between DX9 and 10. I like a nice shadow. It should perhaps be noted that the shadows under DX10 in LOTRO are NOT unachievable under DX9 in other games, but presumably they are either easier to achieve or more efficient under DX10 than DX9...
October 29, 2008 11:37:51 AM

DX10/10.1 actually improves performance if implemented properly. Just look at Far Cry 2 or Flight Sim. Well, maybe not Flight Sim so much, since they didn't get to use the geometry shader like they wanted to, but definitely Far Cry 2.

BTW, since I know I'll hear it: Crysis Warhead in DX9 mode does perform better, but it also misses a key feature that makes the game look so special: object motion blur.
October 29, 2008 11:40:36 AM

yipsl said:
Plus, Microsoft screwed up the roll out of Vista. It's great with SP1 and I'm having fun playing the DX10 patched version of LOTRO, but that really doesn't look much different than my wife's DX9 version under XP. The water looks a bit better, but the difference isn't as great as it was between DX8.1 and DX9.


That is what I thought about Crysis before I tried the very high quality DX9 trick. I was thinking... yeah, the water looks more fluid in DX10, and the light scattering through the trees must be a DX10 feature as well. I bought into it until I found that ramping up the DX9 settings in the files gets you EXACTLY the same effects. I am wondering why they did this, and if this happens in other titles in an attempt to sell DX10 as visually superior.

DX10.1 has shader model 4.1 and all, but I don't see where it makes a huge difference. Maybe it isn't being fully utilized yet... I dunno. IMO DX10.1 comes down to just being more efficient. If you think the shadows from the trees in the distance look more realistic and are an amazing improvement over DX9, then fine. Like what you are saying though, and I agree, nothing visually pops out as being mind-blowing. I think what is impressive though, as we saw briefly with AC, is that DX10.1 can improve FPS drastically over DX10.
October 29, 2008 12:13:19 PM

SpinachEater said:
That is what I thought about Crysis before I tried the very high quality DX9 trick. I was thinking... yeah, the water looks more fluid in DX10, and the light scattering through the trees must be a DX10 feature as well. I bought into it until I found that ramping up the DX9 settings in the files gets you EXACTLY the same effects. I am wondering why they did this, and if this happens in other titles in an attempt to sell DX10 as visually superior.

DX10.1 has shader model 4.1 and all, but I don't see where it makes a huge difference. Maybe it isn't being fully utilized yet... I dunno. IMO DX10.1 comes down to just being more efficient. If you think the shadows from the trees in the distance look more realistic and are an amazing improvement over DX9, then fine. Like what you are saying though, and I agree, nothing visually pops out as being mind-blowing. I think what is impressive though, as we saw briefly with AC, is that DX10.1 can improve FPS drastically over DX10.


Shadows in LOTRO weren't just distant shadows; standing at the foot of the tree it made a massive difference. Instead of a static shadow and a slightly breezy tree, you had a shadow that moved with the tree, and the edges of the shadow were far more "natural" looking. It wasn't just trees either; virtually every static object's shadow was far better. To me personally shadows are a big deal: they help give a "feel" for a hot sunny day, and they really help emphasize the effects of weather etc. To me the shadows alone were worth taking a performance hit for. I would post screenshots, but my subscription's been down for a while, until the new expansion comes out.

In Crysis, have you tried benchmarking the DX9 very high hack vs DX10 very high? From what I remember, DX10 very high in Vista seemed to perform better than hacked very high under DX9 and XP (that was with an 8800GTX, E6600 and 2GB of RAM); however, I didn't record Fraps results between the installations, and it's been a while since I went to Vista; my rigs have seen a lot of upgrades since then. I seem to remember that using the very high hack under XP I did get "god rays" through trees, just not as good as they were under DX10.

I do think though that Crysis showed just how great DX9 graphics can be. But from what I remember, the very high hack under DX9 performed far worse than the same level of effects under DX10.

I think overall we still haven't seen a DX10-only game; until we do, and until DX10 becomes the de facto standard, we won't really know what it can do! Just compare DX9 Crysis to KOTOR: both under DX9, but at opposite ends of its life span. As far as I'm concerned, at the moment we are still only seeing DX9 games with DX10 used for bells and whistles. I suspect it's more used to make more efficient use of effects than to add effects that weren't previously available.
October 29, 2008 1:37:41 PM

SpinachEater said:
That is what I thought about Crysis before I tried the very high quality DX9 trick. I was thinking... yeah, the water looks more fluid in DX10, and the light scattering through the trees must be a DX10 feature as well. I bought into it until I found that ramping up the DX9 settings in the files gets you EXACTLY the same effects. I am wondering why they did this, and if this happens in other titles in an attempt to sell DX10 as visually superior.

No, you do not get all the effects with the DX9 very high tweaks. You don't even get all the effects in Warhead on DX9 enthusiast settings.
SpinachEater said:
DX10.1 has shader model 4.1 and all, but I don't see where it makes a huge difference. Maybe it isn't being fully utilized yet... I dunno. IMO DX10.1 comes down to just being more efficient. If you think the shadows from the trees in the distance look more realistic and are an amazing improvement over DX9, then fine. Like what you are saying though, and I agree, nothing visually pops out as being mind-blowing. I think what is impressive though, as we saw briefly with AC, is that DX10.1 can improve FPS drastically over DX10.

Right, DX10 (DX10.1 even more so) is all about efficiency. This efficiency advantage can be used to increase performance or add more effects with the same performance. And of course there are some effects that simply cannot be implemented properly under DX9, object based motion blur in Crysis being one example.

I doubt we'll see DX10 truly shine until an engine is built natively on DX10. Which may never happen since most engines need to run on the DX9+ consoles and the next gen consoles will likely be DX11+.

This thread has gone OT but I like this discussion.
October 29, 2008 2:05:35 PM

homerdog said:
I doubt we'll see DX10 truly shine until an engine is built natively on DX10. Which may never happen since most engines need to run on the DX9+ consoles and the next gen consoles will likely be DX11+.

This thread has gone OT but I like this discussion.


That I do agree with. I do think there's a big chance that DX10 may be killed off by DX11 before it ever gets a chance to show its full potential.

One of the great things DX10 and Vista DID bring us was the performance leap from the 7 series to the 8 series by Nvidia. I believe the performance achieved was a direct result of the card being designed for DX10... It's just a shame it was followed by an 18-month lull in progress. However, despite the GPU being designed for DX10 and Vista, DX9 and XP saw a huge benefit from the new architecture :D 

Kind of on the original topic: I don't see a card supporting a "future" DX version as much of a selling point. I bought the 8800GTX at launch not for its DX10 compatibility but for its performance under DX9. I've bought a 4870 not because I believe 10.1 is ever going to catch on, but because it provides a good performance boost under DX9 and the few DX10-patched titles out there at a very modest price. I will buy a DX11 card when a) its DX9/10 performance warrants an upgrade over my existing card, or b) there's a title out that benefits significantly in performance or visuals from DX11. I tend to buy for today's needs, as technology moves too fast and a lot of possible future technologies wither on the vine...
October 29, 2008 3:26:51 PM

dtq said:
Shadows in LOTRO weren't just distant shadows; standing at the foot of the tree it made a massive difference. Instead of a static shadow and a slightly breezy tree, you had a shadow that moved with the tree, and the edges of the shadow were far more "natural" looking. It wasn't just trees either; virtually every static object's shadow was far better. To me personally shadows are a big deal: they help give a "feel" for a hot sunny day, and they really help emphasize the effects of weather etc. To me the shadows alone were worth taking a performance hit for. I would post screenshots, but my subscription's been down for a while, until the new expansion comes out.

In Crysis, have you tried benchmarking the DX9 very high hack vs DX10 very high? From what I remember, DX10 very high in Vista seemed to perform better than hacked very high under DX9 and XP (that was with an 8800GTX, E6600 and 2GB of RAM); however, I didn't record Fraps results between the installations, and it's been a while since I went to Vista; my rigs have seen a lot of upgrades since then. I seem to remember that using the very high hack under XP I did get "god rays" through trees, just not as good as they were under DX10.

I do think though that Crysis showed just how great DX9 graphics can be. But from what I remember, the very high hack under DX9 performed far worse than the same level of effects under DX10.


I haven't played LOTRO so I can't really comment on those graphics. What I was alluding to is that maybe those shadows are achievable in DX9 but just aren't implemented. Different strokes for different folks... I don't pay too much attention to shadows. For me, if they are there and don't look like Lego blocks, they are good enough and I stop looking at the edges. When I play FC2 tonight though, I am going to look at the shadows more closely in DX9 and in DX10.1; I am curious if there is a large variation between the two.

I haven't benchmarked Crysis in Vista and XP to compare exact numbers, but in general I found playing in Vista to be sluggish. I would get that elastic type of mouse movement, where in XP it would be crisp and snappy when turning or flipping through guns. It also seems like the FPS take a nosedive more frequently than they do in XP for me. Again... I don't really have numbers to back that up, but I prefer to play Crysis in XP with the settings alteration.

I think the other problem (and the main reason DX10 turns into a flame war all of the time [:mousemonkey:2] ) is that judging quality on the level that we are talking about... light rays looking better in DX10 than in DX9... is first a subjective measurement that comes down to splitting hairs, and second, we have different shader models at hand (3.0 and 4.0). What I was talking about is something drastic and eye-popping, like the light rays being in DX10 and not in DX9. By omitting them from the DX9 settings, it misleads people into thinking that the light rays are only achievable with DX10.



homerdog said:
No, you do not get all the effects with the DX9 very high tweaks. You don't even get all the effects in Warhead on DX9 enthusiast settings.


Yeah yeah, you don't get all of the effects EXACTLY...one reason being you have different shader models at work (3.0 vs 4.0)...so logically it should be different, which = not EXACTLY, and hopefully better. My bad in saying EXACTLY in all caps :non: 

What I was trying to get at were things like having the water look like one fluid body of liquid vs looking flat with wave animations on top, or having light scatter through the treetops vs not having it at all. In the Crysis CVarGroup files you can alter the water settings to make it behave in a more realistic, "DX10" fashion. It isn't a DX10-only feature... it is adjustable in DX9 but hidden, so it appears as if it is a DX10-only feature. I feel like a parrot repeating myself, so I will stop harping on this idea. Bawk bawk. Ah, the disclaimer... this is all IMHO. :D 
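For anyone curious how the trick being described actually worked: Crysis keeps its quality presets as plain-text CVar files (under Game\Config\CVarGroups), and the usual approach was either to copy the Very High values into the High profile or to force individual cvars from an autoexec.cfg in the game folder. A rough sketch of such an autoexec.cfg follows; the cvar names are from memory and vary between patches, so treat them as illustrative and verify them against your own CVarGroups files. Roughly: con_restricted unlocks the protected cvars, r_SunShafts is the "god rays", e_water_ocean_fft is the full ocean-wave simulation, and r_MotionBlur covers the blur settings.

```ini
con_restricted = 0
r_SunShafts = 1
e_water_ocean_fft = 1
r_MotionBlur = 2
```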
October 29, 2008 5:01:39 PM

The argument of whether something brings value to the table is all in the eye of the beer holder.

Some people prefer better graphics and are willing to sacrifice resolution or fps to achieve that, other people prefer higher fps and are willing to sacrifice for that.

However I still haven't seen anyone explain to me how having more tools at a lower cost and better performance is a bad thing.

If you were trading DX11 for 50% more SPUs, fine, it makes sense to say don't bother with DX11 or whatever; but if it's essentially taking the space of what would likely be less than 5% of the die, or even just empty space, then why not?
October 29, 2008 7:36:21 PM

SpinachEater said:
Yeah yeah, you don't get all of the effects EXACTLY...one reason being you have different shader models at work (3.0 vs 4.0)...so logically it should be different, which = not EXACTLY, and hopefully better. My bad in saying EXACTLY in all caps :non: 

Object-based motion blur cannot be properly implemented under DX9 (not enough vertex registers), and IMO that is the single most awesome and revolutionary effect Crysis brings to the table. Just to set the record straight, this is not the typical Halo 3 style motion blur that only works when the camera moves, but a uniform motion blur that acts on every object in the scene all the time. I suppose in concept it's similar to temporal anti-aliasing.
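For readers unfamiliar with the technique being described: per-object motion blur is normally done by rendering every object's screen-space velocity into a separate buffer, then smearing the colour image along those vectors in a post pass. The C++ below is a minimal CPU reference of that blur pass over plain arrays, just to show the idea; in the real thing this loop runs as a pixel shader, and the hard DX9 limitation is in generating the per-object velocities, not in the blur itself.

```cpp
// Minimal CPU reference for velocity-buffer motion blur: for every pixel,
// average colour samples taken backwards along that pixel's screen-space
// velocity. In a game this loop is a pixel shader over GPU render targets.
#include <vector>
#include <algorithm>
#include <cmath>

struct Vec2  { float x, y; };
struct Color { float r, g, b; };

// `color` and `velocity` are width*height buffers; velocity is in
// pixels-per-frame, as written by the geometry pass.
std::vector<Color> MotionBlur(const std::vector<Color>& color,
                              const std::vector<Vec2>& velocity,
                              int width, int height, int taps = 8)
{
    std::vector<Color> out(color.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const Vec2 v = velocity[y * width + x];
            Color acc = {0.0f, 0.0f, 0.0f};
            // Step backwards along the motion vector and accumulate taps.
            for (int i = 0; i < taps; ++i) {
                float t = float(i) / float(taps);
                int sx = std::clamp(int(std::lround(x - v.x * t)), 0, width - 1);
                int sy = std::clamp(int(std::lround(y - v.y * t)), 0, height - 1);
                const Color& c = color[sy * width + sx];
                acc.r += c.r; acc.g += c.g; acc.b += c.b;
            }
            out[y * width + x] = { acc.r / taps, acc.g / taps, acc.b / taps };
        }
    }
    return out;
}
```

Camera-only blur (the Halo 3 style mentioned above) is the special case where every pixel's velocity is derived from the camera transform alone; the per-object version needs real velocities for moving and skinned geometry, which is where the DX9 vertex-register limits mentioned above come into play.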
October 30, 2008 6:56:48 AM

Since posting about the trees in LOTRO I've got an email from Codemasters: I've got 3 days of free play starting today, so I will actually be able to take some screenshots to illustrate the differences. :D 

It's pure chance timing that I got the email now; it wasn't related to posting here. They are just trying to get ex-subscribers back in time for the expansion.
October 30, 2008 12:05:17 PM

homerdog said:
Object-based motion blur cannot be properly implemented under DX9 (not enough vertex registers), and IMO that is the single most awesome and revolutionary effect Crysis brings to the table. Just to set the record straight, this is not the typical Halo 3 style motion blur that only works when the camera moves, but a uniform motion blur that acts on every object in the scene all the time. I suppose in concept it's similar to temporal anti-aliasing.


Exactly. DX9 simply can't do a lot of the stuff that DX10 can, and 10.1 is all about making it easier to do said stuff.
