
NVidia and DX11

Tags:
  • Graphics Cards
  • Nvidia
  • Graphics
August 13, 2009 4:45:38 AM

There are rumors starting to emerge that nVidia currently won't have any high-end DX11 solution.
This is all speculation, and we'll know how it turns out when they release their cards.
The rumor goes as follows:
Since they won't have a high-end DX11 card, they'll only be releasing low to mid-range solutions at low price points.
This, in effect, will discourage devs from adding DX11 paths to new games, since the low-end cards obviously won't help much, and even ATI's high end may not do a lot, as is often seen with new DX releases.
These rumors come from the East, but from nVidia people, not ATI people.
I hope these rumors are nothing more than that, just rumors.
I'm not trying to flame, start wars, or anything of the kind, but given what nVidia has shown in the past when their cards weren't ready (ala the original DX10), and more recently with the removal of DX10.1 from Assassin's Creed and the general shunning of DX10.1, which has been out for quite some time, I find these rumors plausible. Thoughts?

August 13, 2009 4:52:29 AM

Well, looks like I'm going ATI for sure this round.
August 13, 2009 4:55:20 AM

I highly doubt they will do this, for one major reason. They missed out on DX10.1, and while there wasn't a huge number of games for it (mostly due to the fact that DX11 was around the corner), they were still seen as 'behind the times' in some way. If they don't step up, they will lose the people who are undecided on who to go with, because ATI will look more current and future-proof.

Also, they won't discourage too many devs, simply because the all-powerful Microsoft is laying down the law for DX11: they need it to boost sales of Windows 7, and they need content for that. Even Nvidia doesn't have the power to sway the pull devs will feel from Windows, because they know if they don't come out with content, people will think it's a dud and not buy it later on.

ATI is catching up in sales and in card performance; Nvidia can't afford to give people another excuse to get ATI cards, and DX11 is a big one in the hardcore market, since as we all know, people like us want the latest and greatest in our cases.
August 13, 2009 5:01:39 AM

lol @ this thread. For a company that relies mainly on brute force, you'd actually believe nVidia won't do high end?
August 13, 2009 5:07:15 AM

Problem with that is, the all-powerful M$ relented originally on DX10, thus the need for DX10.1 to begin with, as nVidia's cards weren't capable. And still aren't. They called back some of the original DX10 spec because of nVidia's lack of hardware ability.
And we all saw how M$ touted DX10 and Vista together.
Also, there's never been so slow an uptake of a new DX model before this happened, and it went hand in hand: they removed the best performance-enhancing ability of DX10 by stripping out what is now the DX10.1 "one less pass" option when using 4xAA.
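For anyone wondering what that "one less pass" actually is: as I understand it, DX10.1 lets you create a 4xAA depth buffer that the shaders can read directly, so a deferred engine doesn't have to spend an extra pass writing depth out to a separate texture just to get AA working. Rough sketch of the resource setup (my own untested code, D3D10.1, not from any shipping game):

Code:
// Hypothetical D3D10.1 sketch: create a 4x multisampled depth buffer that can
// also be bound as a shader resource, so per-sample depth can be read back
// without an extra depth-only render pass (the DX10.1 "one less pass" idea).
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* dev, UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;   // typeless, so it can be viewed both ways
    td.SampleDesc.Count = 4;                  // the 4xAA case everyone keeps citing
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    HRESULT hr = dev->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    D3D10_DEPTH_STENCIL_VIEW_DESC dsvd = {};
    dsvd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(*tex, &dsvd, dsv);
    if (FAILED(hr)) return hr;

    D3D10_SHADER_RESOURCE_VIEW_DESC srvd = {};
    srvd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return dev->CreateShaderResourceView(*tex, &srvd, srv);
}

As far as I know, on a 10.0-only part you can't bind a multisampled depth surface as a shader resource like that, which is where the extra pass (and the AA performance hit) comes from.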
August 13, 2009 5:08:46 AM

It's not that they won't do high end, it's that they'll do it in their own time, and that's been a problem, as the other things I've already pointed out have already happened.
August 13, 2009 5:26:56 AM

JAYDEEJOHN said:
And still aren't. They called back some of the original DX10 spec because of nVidia's lack of hardware ability.


Yeah, but since then ATI has had a jump start. If Nvidia says no now, Microsoft will simply say "to hell with you, these other guys are already on board and making them." They have too much riding on DX11 to not get hardware support from someone, even if it means relying on ATI to deliver an amazing lineup.
August 13, 2009 5:31:32 AM

http://bbs.chiphell.com/viewthread.php?tid=51645&extra=...
http://forum.beyond3d.com/showthread.php?p=1320832#post...
http://forum.beyond3d.com/showthread.php?p=1320818#post...
http://www.anandtech.com/showdoc.aspx?i=3320&p=8
http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimi...
The links above are thoughts, decisions made, actions taken, and other evidence and ideas of how things have been, and possibly where they're heading. I hope not.
But each link shows certain things that amount to manipulation one way or another, which I'm sure is feeding this current rumor.
August 13, 2009 5:53:27 AM

In Taylor's blog, it was clear that the ATI shaders were better at doing DX10 (as it stood at the time, before they removed the DX10.1 part) than nVidia's, so they ended up taking the lowest path, thus hurting ATI.
It was plain to see: without AA, the 2900 kept up pretty well with the G80; once you applied AA, however, the performance nosedived, since it didn't have the one-less-pass option included.
So, yes, you could say M$ this time is going to tell nVidia it's either sink or swim, but what of the game devs?
Look at my link regarding Batman: Arkham Asylum. There, they've convinced the devs to not include AA usage at all for ATI cards, let alone the recent elimination of PhysX running on anything other than an nVidia card, which also includes the Ageia cards.

There's been tons of rumor about the G300 simply not being able to hit the timing of the DX11/W7 release, and possibly even the end of the year. How are the game devs going to model their games with only mid-class compliant HW from nVidia? Is it the M$ story all over again?
How much attention or word has come from nVidia regarding DX11?
I can give the Siggraph paper links, where everything from GPGPU to DX11 and LRB was discussed, with everything pointing towards DX11, even on consoles, but again, where is nVidia's push?
http://developer.nvidia.com/object/siggraph-2009.html
Here are nVidia's Siggraph papers. I've looked on that page and couldn't find DX11 mentioned on it at all.
August 13, 2009 7:31:44 AM

Just wondering, is there no AA at all for ATI cards in Batman: Arkham Asylum? I saw in the demo that only nVidia cards could do it, but will this be a final release decision? That would be too low.
August 13, 2009 7:48:37 AM

I'm sure ATI will have them put it in; if not, I agree.
But the point is, why wasn't it there to begin with? Does anyone ever complain about ATI-sponsored games in which nVidia cards can't use full functions? No? Then why here?

In my links, it shows how ATI typically advises game devs: they suggest DX10 compliance for the whole game first, then later may suggest DX10.1 abilities as well.
nVidia and the Batman: Arkham devs used only the nVidia path, which can even be used with a workaround that renames the card with the nVidia name.
To me at least, even if it does get "fixed", they've already sunk that low, both nVidia and the game devs.
I'm sorry, I am coming down hard on nVidia here, but to be totally honest, I'm sick of this backwards cr@p, and I actually mean it when I say I'm hoping for an early G300 surprise, so maybe both ATI and nVidia can work in tandem getting DX11 off the ground.
The earlier we see it, the better our games will be, and PC gaming needs the boost. Even though you may be in business just for the money, and you may be in heavy competition, taking care of the larger picture should be in everyone's focus, including ATI's and nVidia's.
August 13, 2009 8:13:29 AM

And how exactly was DX10 transformed? Did the original have exactly the same coding? Or was the coding changed because of the later adoption? If you have links, I'd love to see them; maybe I'm wrong, but it seems feasible to me.
Also, you have to take timing into account. What HW changes could ATI make while scrambling to release the 2900, which ran hot on TSMC's 80nm and couldn't reach its projected clocks? Their HW resolve for the one-less-pass was affected by the change, and only minor tweaks could be adopted, which in the end may have eliminated the one-less-pass option as well.
Lots to consider here, especially taking into account all the scenarios.
August 13, 2009 8:16:57 AM

"So now let’s discuss how our DX10 plan has evolved.



In Aug and Sept, as we lacked DX10 hardware at all (much less production quality hw) we realized we couldn’t simul-ship with Vista in Jan 2007 since we believed we needed at least 6 months with production quality hw. Getting early hw in Oct, and production hw at the G80 launch in early Nov meant at a minimum May 2007for release of DX10 support; given a 6 month schedule and a perfect world.



However, as FSX Launch occurred Oct 17 and we began to get feedback from the community we realized we needed a service pack to address performance issues and a few other issues. So we started a dialog with the community and made a commitment to delivering a service release. The work on SP1 and DX10 is being performed by the same team of people (the Graphics and Terrain team) and thus delivering SP1 has delayed DX10.



Given the state of the NV drivers for the G80 and that ATI hasn’t released their hw yet; it’s hard to see how this is really a bad plan. We really want to see final ATI hw and production quality NV and ATI drivers before we ship our DX10 support. Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."
http://blogs.msdn.com/ptaylor/archive/2007/03/03/optimi...
So, according to the DX10 devs at M$, the 2900 did AA better than nVidia's cards. What happened?

PS: This is also the belief of our missing fellow at large, TGGA.
August 13, 2009 8:25:18 AM

Btw, I would like to point out (although it's a bit off-topic) that games like Batman are incredibly affected by Nvidia and the 'The Way It's Meant To Be Played' slogan. Not only is Nvidia PhysX touted, there are literally things missing if you don't turn PhysX on (i.e., don't own an Nvidia card), some of which have nothing to do with physics at all. For example, with PhysX off there are no hanging banners in some rooms at all, not even static ones. With PhysX on, there are banners hanging up and 'moving realistically'.

PhysX is great and all for those who can use it, but why downgrade the overall game for those that don't own Nvidia?

This was just one example; there are many others in Batman, and in other TWIMTBP games out there.
August 13, 2009 8:31:38 AM

Yeah, the steam just doesn't exist either. Looks to me like they took the steam out of me ever wanting to buy it, heheh.
August 13, 2009 8:40:18 AM

What did service pack 1 introduce regarding DX in Vista?
August 13, 2009 9:11:30 AM

Without AA, the 2900 held its own.
And read what I posted: the same team was working on SP1, which had to be changed because the HW wasn't ready, so they took the lesser path, which worked on nVidia cards and hampered ATI cards. In the interim, the 2900 was already too hard-baked, running late on a crappy, leaky node, and had its original design shifted right out from underneath it by this change in DX; adding it later wasn't the same thing, and/or AMD was caught off guard too late in the process to make further changes. That's why I asked for links, as this is the common belief as to what happened, and I'm surprised you didn't know this.
Look up the latest tests, 2900 vs G80 with no AA, and you'll see what I mean.
August 13, 2009 9:38:18 AM

"Early tests on ATI hw show their geometry shader unit is much more performant than the GS unit on the NV hw. That could influence our feature plan."
What else? The shader model was delayed in how it was to be done, and later they couldn't get this "more performant" behavior back after SP1 arrived. They had it before the change.
August 13, 2009 9:42:06 AM

I could give you a link from TPU discussing this with EastCoastHandle, this very thing, AA and all, if I can find the old link.
I'm really surprised at this.
August 13, 2009 10:02:16 AM

Sorry SS, can't find it. Anyways, that's what the folks there think; TGGA does, as do I.
But, back on topic: if this comes to pass, it's a kind of leveraging I'm just no longer capable of being OK with, at all.
I hate fanboyism, and yes, I've always admitted to preferring the red over the green, but I've bought both and enjoyed both. If this does happen, I'd hate to become something I hate; of course, soon there's always Intel, heheh.
August 13, 2009 11:47:44 AM

JAYDEEJOHN said:
Also, there's never been so slow an uptake of a new DX model before this happened, and it went hand in hand: they removed the best performance-enhancing ability of DX10 by stripping out what is now the DX10.1 "one less pass" option when using 4xAA.


Actually, DX10 uptake has been surprisingly fast when you consider both the XP factor and how long it took to break DX9 in...
August 13, 2009 11:51:54 AM

Annisman said:
Btw, I would like to point out (although it's a bit off-topic) that games like Batman are incredibly affected by Nvidia and the 'The Way It's Meant To Be Played' slogan. Not only is Nvidia PhysX touted, there are literally things missing if you don't turn PhysX on (i.e., don't own an Nvidia card), some of which have nothing to do with physics at all. For example, with PhysX off there are no hanging banners in some rooms at all, not even static ones. With PhysX on, there are banners hanging up and 'moving realistically'.

PhysX is great and all for those who can use it, but why downgrade the overall game for those that don't own Nvidia?

This was just one example; there are many others in Batman, and in other TWIMTBP games out there.


In the Batman case, I haven't done the demo, but I would assume those banners (even static ones) are affected in some way by the PhysX API. And again, nothing is preventing ATI from adopting PhysX. And once you consider how PhysX has already been ported to all the consoles, only ATI is preventing the unified physics API from being a reality.

I find it odd: now PhysX is actually being used properly, and people are complaining? Go figure; that API can't seem to win...
August 13, 2009 12:51:57 PM

This must have something to do with TSMC's leaky process; they must be having a hard time getting such a big chip out the door on such a leaky process. Well, this seems like good news for ATI, and they need it more than nVidia does.
August 13, 2009 1:56:22 PM

gamerk316 said:
In the Batman case, I haven't done the demo, but I would assume those banners (even static ones) are affected in some way by the PhysX API. And again, nothing is preventing ATI from adopting PhysX. And once you consider how PhysX has already been ported to all the consoles, only ATI is preventing the unified physics API from being a reality.

I find it odd: now PhysX is actually being used properly, and people are complaining? Go figure; that API can't seem to win...




They had PhysX working and it was shut down. Then the ability to use a PhysX card alongside an ATI card was shut down. ATI already adopted it, but it's Nvidia that is not allowing it.
August 13, 2009 4:22:26 PM

gamerk316 said:
In the Batman case, I haven't done the demo, but I would assume those banners (even static ones) are affected in some way by the PhysX API. And again, nothing is preventing ATI from adopting PhysX. And once you consider how PhysX has already been ported to all the consoles, only ATI is preventing the unified physics API from being a reality.

I find it odd: now PhysX is actually being used properly, and people are complaining? Go figure; that API can't seem to win...


But really, we have to get rid of the objects completely? We can't even keep them there to do nothing; they take them completely out of the game? It's the same for fog and steam in Batman, completely gone for ATI. It's a PR move for PhysX, 100%, and it affects everyone without an Nvidia card. I mean, there's no way my three 1GB 4870s can't process the fog or banners at all. Obviously they could handle it, but they have left it out.

Also, on consoles: the current consoles have what, 9 - 12 CPU cores I think? The Xbox has like 7 or 9, and obviously the Cell is pretty beefy. My point is, while the average computer user is still on a dual core, and SOME are on quads, the consoles can actually use a core or two just for physics, because don't tell me they use their puny gfx cards to do the PhysX.
August 13, 2009 4:46:05 PM

^ The Xbox has 3 cores @ 3.2GHz, and the PS3 Cell @ 3.2GHz has 7 SPEs and 1 PPE.
August 13, 2009 4:57:31 PM

PS3 - Cell Processor (8 cores = 1 disabled + 1 for OS + 6 for developers)
XBOX 360 - IBM Xenon with 3 cores
Wii - dual core? (not easy to find), though it has a separate PPU
August 13, 2009 5:13:51 PM

LOL, 7 - 9 cores on the Xbox 360? Sure.... lol, you pay like 400 bucks for a 6-core Opteron and they fit a 9-core into a $200 console. :D

Not mocking you or anything, but I'm glad some people make me laugh when I'm bored.
August 13, 2009 5:31:56 PM

Annisman said:
But really, we have to get rid of the objects completely? We can't even keep them there to do nothing; they take them completely out of the game? It's the same for fog and steam in Batman, completely gone for ATI. It's a PR move for PhysX, 100%, and it affects everyone without an Nvidia card. I mean, there's no way my three 1GB 4870s can't process the fog or banners at all. Obviously they could handle it, but they have left it out.

Also, on consoles: the current consoles have what, 9 - 12 CPU cores I think? The Xbox has like 7 or 9, and obviously the Cell is pretty beefy. My point is, while the average computer user is still on a dual core, and SOME are on quads, the consoles can actually use a core or two just for physics, because don't tell me they use their puny gfx cards to do the PhysX.


They did what I have begged devs to do since PhysX came out: make a separate engine path for the objects that PhysX interacts with. By using PhysX as the base engine for those objects, you don't have to worry about people who lack PhysX support.

Basically, previous releases had a separate engine to do the effects if PhysX was not present, which limits the time devs have to work on PhysX effects. The devs for Batman went the other route and made all of those advanced effects rely entirely on PhysX. No PhysX = no PhysX objects.

How is ATI not supporting PhysX any different from NVIDIA not supporting 10.1? To me, it's the same in the end. Choose which feature you want more, and pick the card that best suits you.
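Just to make that distinction concrete (toy code of my own, with made-up names, not anything from the actual game), here's roughly the difference between giving a decorative prop a non-PhysX fallback and making it exist only through the PhysX layer; with the second approach the dev only ever authors the PhysX version, which is exactly why the banners and fog just vanish when PhysX isn't there:

Code:
// Toy sketch, my own names: two ways a dev can spawn a decorative prop.
// A) fallback path: the banner still exists without PhysX, it just doesn't move
// B) PhysX-only path: no PhysX runtime = the object is never created at all
#include <iostream>
#include <vector>

struct Prop { const char* name; bool simulated; };

// stand-in for a real runtime check (GeForce present / PhysX installed)
bool PhysXAvailable() { return false; }

// A) separate fallback engine: static banner for everyone, cloth sim for PhysX users
void SpawnBannerWithFallback(std::vector<Prop>& scene) {
    scene.push_back({"banner", PhysXAvailable()});
}

// B) PhysX as the base engine for the effect: the prop only exists through PhysX
void SpawnBannerPhysXOnly(std::vector<Prop>& scene) {
    if (!PhysXAvailable()) return;   // non-PhysX user gets an empty wall
    scene.push_back({"banner", true});
}

int main() {
    std::vector<Prop> fallbackScene, physxOnlyScene;
    SpawnBannerWithFallback(fallbackScene);
    SpawnBannerPhysXOnly(physxOnlyScene);
    std::cout << "fallback scene props: " << fallbackScene.size()
              << ", PhysX-only scene props: " << physxOnlyScene.size() << "\n";
}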
August 13, 2009 5:34:09 PM

mindless728 said:
PS3 - Cell Processor (8 cores = 1 disabled + 1 for OS + 6 for developers)
XBOX 360 - IBM Xenon with 3 cores
Wii - dual core? (not easy to find), though it has a separate PPU


To me, it comes down to architecture (bus width, registers, instruction sets, and storage) more than CPU horsepower, and the 360 wins that battle hands down.
August 13, 2009 6:04:13 PM

Nice try gamer, but your "facts" about ATI having to adopt it run out of gas when everyone knows what gfx card resides in an Xbox.
August 13, 2009 7:21:06 PM

I guess if all the nVidia claims are to be taken at all seriously, they'd have to dump the Xbox as well?
Or could it be that we're seeing, again, that what's good for nVidia, and nVidia alone, is what they're doing here?
Anyone else tired of this? Want to see early DX11 adoption?
August 13, 2009 7:21:34 PM

???Meaning what exactly? The 360 GPU was developed by ATI. I fail to see what you're trying to say.

I'm merely saying that, based on developer usage, PhysX will be seeing more immediate use than DX10.1, or possibly even DX11 (which will carry the standard 2-3 year period before adoption, assuming XP is gone within that timespan). It's been ported to the consoles, which aids its usage for PC gaming (minimal code changes to fit the PC PhysX API). Meanwhile, the new Havok implementation ATI is working on has yet to even be finished, much less released for use (correct me if I'm wrong; I haven't been able to find a thing about it recently).

Therefore, ATI can either abandon PhysX, which is quickly becoming the standard physics API, or they can port the API to run on their cards. Even you can't deny that NVIDIA offered to help ATI with the porting process immediately after they released the API, which I note would have offset NVIDIA's advantage in that area. If NVIDIA is so evil, why did they offer to help with the porting, knowing they would lose PhysX exclusivity in the process?

NVIDIA chose to ignore 10.1, and ATI chose to ignore PhysX. The difference is that NVIDIA is starting to implement 10.1 (even if it's only because it's a subset of 11), while ATI is flat out ignoring PhysX. If you can justify buying ATI for 10.1 support, you can justify buying NVIDIA for PhysX support; that's all.
August 13, 2009 7:56:30 PM

No, either nVidia can drop the Xbox, or you can stop with the "standard" claims, period.
It's allowed on ATI HW when they so choose, so why not on the desktop? If they're really trying to become a "standard", why this move? Maybe because it makes them exclusive? And what does being exclusive have to do with becoming a standard?
August 13, 2009 8:53:31 PM

Yes, if you want something to become a standard, you open it up for DX like ATI did with their tessellation. PhysX was offered to ATI over the Internet (through news sites); if that's how nVidia does business, you can't blame ATI for not taking them up on their offer.
August 13, 2009 9:19:09 PM

First of all, Nvidia bought PhysX... so it's not the fault of ATI.

Second, if it's true that PhysX is important in Batman AA, it would mean the PS3 version will be vastly superior, since the Xbox has an ATI GPU and the PS3 an Nvidia GPU.

I wonder if DX11 won't change that whole game. Maybe that's the reason why AMD jumped on the bandwagon so fast.
August 13, 2009 9:29:50 PM

Having AA is totally separate from PhysX. The game devs, while working with nVidia, only used the nVidia path for it in their demo, which is unlike ATI's approach when working with devs, where they make sure it works on all HW first, then go after their particular things such as DX10.1, as shown in my link.
None of this applies to the consoles at all, which have ATI parts in them, so it's being done by nVidia and aimed solely at the desktop.
August 13, 2009 10:12:18 PM

Who says it does?
August 13, 2009 10:21:05 PM

Oh wait, nevermind
August 13, 2009 10:36:21 PM

Quote:
DX is a standard; PhysX is not and never will be. Saying it is becoming one is a joke.

DirectX is a standard? Tell that to Intel with their Larrabee.
August 13, 2009 10:49:55 PM

That's a totally different approach, and isn't really apples to apples, as it is here with nVidia and ATI.
LRB is "trying" to persuade devs to use the x86 approach while also conforming to the traditional DX standard.
PhysX is no more a standard than Havok is.
August 14, 2009 2:13:08 AM

JAYDEEJOHN said:
Having AA is totally separate from PhysX. The game devs, while working with nVidia, only used the nVidia path for it in their demo, which is unlike ATI's approach when working with devs, where they make sure it works on all HW first, then go after their particular things such as DX10.1, as shown in my link.
None of this applies to the consoles at all, which have ATI parts in them, so it's being done by nVidia and aimed solely at the desktop.


Arkham Asylum... not Anti-Aliasing...

PhysX is nothing more than software integrated tightly with Nvidia's GPU architecture. My point is that the console versions, which use 2 different GPUs, would show 2 totally different graphical renderings if that were the case.

The Xbox 360 would be heavily handicapped by this move... and you know what... it's the contrary. The Xbox version is considered to have the better graphics even with the lack of PhysX support. And yeah, the Xbox version is lacking PhysX support; it was stated in a thread on GameSpot.

I know ATI cards can run PhysX, but Nvidia owns the rights.

August 14, 2009 2:26:28 AM

Certainly they do, and that's why it's not a standard.
"Question:

Quote:
[Configuration]
BasedOn=..\BmGame\Config\DefaultEngine.ini

[SystemSettings]
Fullscreen=True
UseVsync=True
AllowD3D10=False

Will BAA come with D3D10 support?
Is it possible that there will be D3D10 MSAA for all, like in GoW?
There are some interesting settings in BaseEngine (...\batman arkham asylum - demo\Engine\Config) like

Code:
[Engine.ISVHacks]
bInitializeShadersOnDemand=False
DisableATITextureFilterOptimizationChecks=True
UseMinimalNVIDIADriverShaderOptimization=True
PumpWindowMessagesWhenRenderThreadStalled=False
Or

Code:
[SystemSettings]
; NOTE THAT ANY ITEMS IN THIS SECTION WILL AFFECT ALL PLATFORMS!!!

...
AllowD3D10=False
...
bEnableVSMShadows=False

..."
http://forum.beyond3d.com/showthread.php?t=54786
Interesting: ATI texture filter optimization checks disabled, and no DX10..... yet.
August 14, 2009 2:39:57 AM

OK, just imagine yourself as someone who's made this game and is finalizing the demo for the general public, advertising the project you've spent tons of money on, and as many hours.
OK, now, for what reason would you NOT include AA in your demo? Oh, but it is in the demo? Oh, but ONLY for nVidia. Oh, it's an ID block? Well, we'll backpedal, get hold of ATI and say we're working on it, 'cause we got caught.
The question still remains: what made them make this decision?
August 14, 2009 9:25:18 AM

JAYDEEJOHN said:
OK, just imagine yourself as someone who's made this game and is finalizing the demo for the general public, advertising the project you've spent tons of money on, and as many hours.
OK, now, for what reason would you NOT include AA in your demo? Oh, but it is in the demo? Oh, but ONLY for nVidia. Oh, it's an ID block? Well, we'll backpedal, get hold of ATI and say we're working on it, 'cause we got caught.
The question still remains: what made them make this decision?



It's total BS in my opinion. It's like Ford and Toyota both selling you the same car, but Toyota owns the rights to include a steering wheel with their car, and Ford doesn't.

It's one thing to deny people PhysX acceleration or code, which I very much agree Nvidia has every right to do. But it is entirely another thing to not let ATI customers enable AA, or have static objects, or fog, etc., which have NOTHING to do with PhysX. It's cheap, it's dirty, and honestly it makes me wish I had an SLI-enabled motherboard, because I would probably jump on the bandwagon. But dammit, it's just so dirty of them.
August 14, 2009 9:27:19 AM

Hey, got a great idea here: the next TWIMTBP title from Nvidia should disable resolutions above 800x600 for ATI card users.....
August 14, 2009 11:50:49 AM

Annisman said:
It's total BS in my opinion. It's like Ford and Toyota both selling you the same car, but Toyota owns the rights to include a steering wheel with their car, and Ford doesn't.

It's one thing to deny people PhysX acceleration or code, which I very much agree Nvidia has every right to do. But it is entirely another thing to not let ATI customers enable AA, or have static objects, or fog, etc., which have NOTHING to do with PhysX. It's cheap, it's dirty, and honestly it makes me wish I had an SLI-enabled motherboard, because I would probably jump on the bandwagon. But dammit, it's just so dirty of them.


But the point is, if the interaction with those objects is done with the PhysX library, and not the game engine, then it's impossible to have them exist without the presence of PhysX. That's how I've been begging devs to use PhysX for over a year now, as creating objects only for the PhysX API is the best way to demonstrate its power and reward those capable of running it.

As for AA, it's important to note that NVIDIA and ATI render AA differently, so there's a chance there was an issue with how ATI cards were performing AA, and rather than delay the game, they decided to release and patch the issue later.

ATI is free to jump on the bandwagon whenever they want. Maybe this will finally be the title that forces the issue...

Also, you don't need an SLI mobo to use 2 GeForce cards, one doing PhysX and one doing the rendering...
August 14, 2009 12:58:39 PM

Meh! PhysX doesn't even factor into my decision when purchasing a video card; most of the games that feature PhysX aren't that good (Assassin's Creed, for example), and as an engine I much prefer the Havok model found in games such as the Half-Life 2 series.

As for Nvidia not releasing a high-end DX11 card, I find that idea preposterous. It's a given that they will have a high-end card for DX11, as surrendering their dominance in the discrete video card market makes no sense.
August 14, 2009 1:13:41 PM

JeanLuc said:
Meh! PhysX doesn't even factor into my decision when purchasing a video card; most of the games that feature PhysX aren't that good (Assassin's Creed, for example), and as an engine I much prefer the Havok model found in games such as the Half-Life 2 series.

As for Nvidia not releasing a high-end DX11 card, I find that idea preposterous. It's a given that they will have a high-end card for DX11, as surrendering their dominance in the discrete video card market makes no sense.


I think the assumption/rumor here is not that they will simply not make a high-end card, but that their coming high-end cards will not feature DX11 support, and that the DX11 support will come a bit (a while?) later. Whether it is because they cannot make one atm, or they don't want to bother, remains to be seen.

Though I do agree, I find it unlikely they would give ATI such a huge head start. It wouldn't be good for anyone who plays games.

As for PhysX... it will become an open standard, you know... maybe when they make it open. Wasn't it a term of nVidia's original offer to ATI that ATI would have to give nVidia all of the plans and specs of their cards in order to be able to use PhysX?