When will nVidia get it?

September 5, 2009 2:38:06 PM

OK, it's been mentioned before and talked about some, but here's a little clarification.
Batman: Arkham Asylum, the newest touted TWIMTBP game from nVidia, has PhysX in it, and nVidia has just recently locked anything that isn't nVidia out of using PhysX.
That's exclusivity by exclusion, not just being better.
Now, people have asked, why doesn't ATI just license PhysX and pay the fees?
Well, you have to use CUDA first to use PhysX, instead of OpenCL, so it's proprietary even just to get to use the license.
Now, we'd all like to actually have working, real, make-a-difference-in-the-game physics in our games, but requiring all this is just too much.
So then we have to ask, "why doesn't ATI just do this? They've nothing to lose but a license fee," and here's where it's simply much more than that.
Batman: AA has a code lockout in the game, and not just for PhysX but for the use of AA, so if you don't own an nVidia card, you can't use AA.
OK, so surely it must be that ATI hasn't come to the table for this game and set the devs straight on how to make AA work on their cards, you may ask?

Here's the problem. In the code from Batman, you'll find this line:

if (System.GPU.IsMadeByNvidia() == false)
    GameOptions.AASupport = false;

Now, who do you think wanted this put there? And again, ask yourself: should ATI, or Intel in the future, have anything to do with nVidia's proprietary hoodwinking?
If nVidia ever truly wanted PhysX and CUDA to be a universally used set of code, used through licensing, it would never do this.
We don't see this being done with Havok, which is owned by Intel, as they're playing fair.

As for me, I'm tired of nVidia and their antics. I know some people say it shouldn't matter, just get the best card you can, but this is a lawless scenario we've been given by nVidia, and I think it's time they pay us back.
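
For the record, nobody outside the devs has seen the shipping source, so that snippet is a paraphrase. But in DX9 a vendor check like that usually boils down to reading the adapter's VendorId; here's a rough sketch (the function and option names are made up):

#include <d3d9.h>

// Hypothetical sketch of a vendor gate, the kind of thing being described above.
// 0x10DE is nVidia's PCI vendor ID, 0x1002 is ATI's.
bool IsNvidiaAdapter(IDirect3D9* pD3D)
{
    D3DADAPTER_IDENTIFIER9 id;
    if (FAILED(pD3D->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    return id.VendorId == 0x10DE;
}

// ...somewhere in the game's options setup (names invented for illustration):
// if (!IsNvidiaAdapter(pD3D))
//     gameOptions.aaSupport = false;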

September 5, 2009 3:39:14 PM

Isn't that kind of thing illegal?
It certainly sounds like it could spark a lawsuit for anti-competitive practices here in the EU.
And what about antitrust laws in the US?
September 5, 2009 3:41:40 PM

Well, Nvidia lost over $200 million in Q1 of this year and still hasn't settled the outstanding lawsuits over the faulty 8000-series graphics chipsets, so it's going to stay bad for a while before it gets better.
September 5, 2009 3:52:30 PM

Just because that code is in the game doesn't necessarily mean that AA would work fine on ATI cards without it. They could simply know that attempting to use AA with an ATI card in this game causes issues, so they check for it. Of course, that may not be the case, and even if it is, being willing to delay the game to add PhysX but not bothering to make something like AA work with an ATI card is pretty lame.
September 5, 2009 3:52:35 PM

What's worse is, the Xbox uses an ATI GPU, and AA works just fine on it.
September 5, 2009 3:57:53 PM

Tell that to Xbox owners. Sorry, it doesn't work like that, because using a workaround you can re-enable AA on ATI cards, no problems.
September 5, 2009 3:59:41 PM

There's no excuse here. None. Zip. Nada.
nVidia is screwing ATI card users in this game, period.
September 5, 2009 4:21:21 PM

Quote:
Also, to those that say ATI should be courting devs, that is the wrong way round; devs should be making sure things work on as much hardware as possible so as not to lose sales.


I think they should. Sponsorship deals to counter nVidia's marketing (along with the "best played with Intel" blah blah) are a pretty smart move. Can someone throw a rock at AMD for not doing so?

One thing I'll never get, though, is how AMD's sponsorship deal with Scuderia Ferrari helps consumers and developers alike. Anyone? Apart from filling Kimi Raikkonen's pockets.
September 5, 2009 6:47:03 PM

JAYDEEJOHN said:
Batman: AA has a code lockout in the game, and not just for PhysX but for the use of AA, so if you don't own an nVidia card, you can't use AA.

Here's the problem. In the code from Batman, you'll find this line:

if (System.GPU.IsMadeByNvidia() == false)
    GameOptions.AASupport = false;

Now, who do you think wanted this put there?

Or that line just stops the 'Nvidia(TM) Multisampling AA' option from being displayed in the settings menu, and you'll have to do AA through CCC.
September 5, 2009 7:20:20 PM

If you think PhysX is being shut out, why not mess with the code a little...

if (System.GPU.IsMadeByNvidia() == true)
    GameOptions.AASupport = true;

Or sometimes a simple // in front of certain lines will comment out that part of the code.

I haven't noticed anything so far as far as PhysX/Havok physics not working on either brand of card. I wouldn't go out of my way for the game mentioned above, though, so I have no way of testing that.

Also, even though somebody once mentioned that it wouldn't work on Vista, you CAN run an ATI card for main graphics and an Nvidia card for physics without problems (install the PhysX driver only... but Vista may find it for you). I've done it on a Gigabyte board.

I can understand the gripe, because if it's true it will affect all players eventually. But the way things are going, Nvidia should walk a tightrope because they're heading in a downward spiral. My next machine will be AMD (6-core) and ATI graphics exclusively... sometime in late 2010. That should set me for a few years unless I get bored... AHEM.
September 5, 2009 9:06:17 PM

swifty_morgan said:
why not mess with the code a little...


I'm pretty sure the game isn't going to be open source.
September 6, 2009 2:50:57 AM

Mousemonkey said:
Or that line just stops the 'Nvidia(TM) Multisampling AA' option from being displayed in the settings menu, and you'll have to do AA through CCC.

Not even possible. You have to use a workaround, as I've said.
Essentially, you have to trick the game into thinking it's an nVidia card.
If people find this OK, maybe AMD should just start locking out nVidia cards in their CPU code, or, if nVidia pulls this when LRB arrives, which they will if this particular thing doesn't get changed, Intel should just lock them out too. Seems fair.
nVidia on VIA, anyone?
September 6, 2009 2:59:22 AM

I've got an nVidia card, but this kinda makes me want to start buying ATI.
September 6, 2009 3:11:57 AM

Think of it this way.
If anyone thinks what nVidia is doing here is OK and legit, wait till Intel arrives and sees how it's done.
Who do you think has more money for exclusive influence?
I don't care if nVidia created PhysX and CUDA and made AA happen in this game; they'll lose this game big time to Intel, if that's how they want to play it.
I bet before Intel arrives with LRB, this kind of crap comes to an end, and good riddance.
September 6, 2009 4:50:24 AM

I know it could mean higher prices from less competition, but I hope Intel EATS NVIDIA ALIVE. Let's see here: Intel dwarfs Nvidia and ATI, for the past 3-4 years they have had by far the best enthusiast CPUs, and they are on a win streak right now that cannot be stopped.

Also, remember that PhysX and AA are not the only things getting the shaft here for ATI users: when PhysX is 'not available', actual visual parts of the game go missing, like fog, certain cloth objects, tiles, etc. You're telling me that my Core i7 at 4.0GHz can't and shouldn't be able to do some physics processing on its own? Even if the performance hit is larger?

Nvidia is crippling other people's hardware. Does anybody here think people are going to say, "Gee, I can't use AA or PhysX, and parts of the visuals are missing, let me fork over $600 to Nvidia so I can have that!"?

No way, people are going to get pissed.

It's like buying a car from Chevy that gets 30 MPG, and Ford has found a clever way to make your car get only 20 MPG.

As consumers, when we buy products we expect a full return. When titles come out without AA or widescreen support etc., we are NOT happy about it and we let the developers know. The hardware companies are supposed to be helping the software companies make games that people can run, and run well. When I purchase two 5870s in September-October for Crossfire, it will literally chew this Batman game to pieces. It's just too bad I'll be getting 100+ FPS AND HAVE JAGGIES ALL OVER THE PLACE AND PARTS OF THE VISUALS MISSING!!!!

Can you even imagine if a game like Crysis did something like this, or Far Cry 2? Some of us pay thousands of dollars every year on our computers for the very best gaming experience. When a hardware company deliberately takes that AWAY from us, there should be an uproar. PhysX is fine, good for Nvidia, good for their users, but disabling AA? Really?


This is 2009 and this BS needs to stop.
September 6, 2009 5:01:12 AM

Maybe they'll come up with a NOIDFORNVIDIALIKEASSASSINSCREEDSMSAA patch?
September 6, 2009 5:14:33 AM

Wrong, Jaydee. Remember, people: ATI and NVIDIA perform AA calculations differently, so there are two separate methods that need to be implemented. Also remember, the Unreal Engine, which Arkham uses, does NOT natively support AA, so it needs to be added in.

What you are really complaining about is that extra features were added to the engine for NVIDIA, and no extra features were added for ATI. Nothing has been 'disabled', as the Unreal Engine cannot natively do AA to begin with!

Now, if this were an engine that supported AA, you would have a point. But the fact is, NVIDIA works more closely with devs than ATI does (for whatever reason; that's a whole other discussion), so it's no shock NVIDIA ends up with more features added in.
September 6, 2009 5:18:03 AM

gamerk316 said:
What you are really complaining about is that extra features were added to the engine for NVIDIA, and no extra features were added for ATI. Nothing has been 'disabled', as the Unreal Engine cannot natively do AA to begin with!


Why do you make excuses for them? Why?

It might not be 'disabling', so to speak, but think of it this way.

Your parents give your brother $1000 and tell him to go have a blast.

You don't get anything... ah, they never took money FROM you... but how does that feel???
September 6, 2009 5:40:54 AM

gamerk316 said:
ATI and NVIDIA perform AA calculations differently, so there are two separate methods that need to be implemented. Also remember, the Unreal Engine, which Arkham uses, does NOT natively support AA, so it needs to be added in.

You know, what you say might hold water IF:
- it didn't work at all on ATI cards WITHOUT the workaround, but alas, it does;
- it didn't work on consoles as well, even those with those "different AA methods", like the Xbox, which, ummm, has what kind of GPU in it?

I think anyone defending this exposes themselves somewhat.
Counter-arguments go:
- since nVidia hasn't brought anything to the table with PC interrupts, then maybe both Intel and AMD should do ID checks and just boot nVidia for not adding anything there as well?


It does work on ATI cards, that's the point, and that's a weak suggestion by you.
But all you have to do on PC anyway, since it works on ATI parts in consoles, is work around the ID check.
September 6, 2009 5:47:09 AM

Remember the title of this thread?
nVidia just doesn't get it.
They come off as the bad guy here, plain and simple.
What would have happened if they'd promoted doing this for all users, much like PhysX is supposedly for as well?
No, they make claims, then they're greedy.
If they'd just done this for the benefit of gaming in general, great.
Doesn't this go against the nVidia conferences to PROMOTE the usage of GPUs?
Hasn't everything they've done, whether it be no DX10.1, the Assassin's Creed debacle, PhysX, and now this, gone exactly against that promotion of GPU usage?
Fail for nVidia here.
September 6, 2009 5:50:41 AM

BioShock did AA, on both brands, under DX10.
September 6, 2009 6:01:01 AM

To me, a leader doesn't have to do petty things such as this, and if nVidia is supposed to be the graphics leader, no wonder it's in the state it's in. This gets old real fast, and this isn't like getting caught with a performance failure.
No, it's a deliberate attempt at exclusivity by the supposed leader in graphics.
A leader doesn't seek theirs first and everything else later, because a true leader will just get theirs regardless.
Haven't we learned that more cooperation is a much better scenario in these economic times, and what the "me first" mentality has shown can happen? I don't want to go too far off topic here, but it does apply.
PC gaming needs every boost it can get, not this crap.
September 6, 2009 6:06:56 AM

And here I thought piracy and DRM were what's keeping PC gaming down... now Nvidia wants to take a crack at it too, I guess.
September 6, 2009 4:22:30 PM

JAYDEEJOHN said:
It does work on ATI cards, that's the point. But all you have to do on PC anyway, since it works on ATI parts in consoles, is work around the ID check.

Here's an idea: has anyone emailed the makers of Arkham Asylum to ask why AA is not possible on ATi hardware?

I don't like to bash other people/corporations without hearing what they have to say about it first. Maybe they have a valid reason (lazy QA, no budget for QA, haven't tested thoroughly with ATi configs, etc.), but at least give them the chance to explain themselves. After they reply, if the reply is BS, then you have every right to bash them.


* I'd bet you'd be singing a different tune if you were the developer who put in that line, knew there was a possibility of AA going wrong on ATi, and then read such comments here.
September 6, 2009 5:21:26 PM

amnotanoobie said:
Here's an idea: has anyone emailed the makers of Arkham Asylum to ask why AA is not possible on ATi hardware?

And what if we get a response like before, on Assassin's Creed?
"The performance gains seen by players who are currently playing AC with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."
http://www.anandtech.com/showdoc.aspx?i=3320&p=6
That's exactly what DX10.1 does: it removes that pass, hence the performance gains.
What makes me, you, or anyone else think that the response would be different this time?
There are still a lot of people who simply don't believe the AC thing went down as it did, so what's a statement going to prove?
I'm of the notion that some people will believe what they want to believe, if they're so inclined, regardless of the facts.
Just like the DX10 debacle with nVidia and MS, some people don't understand that DX10.1 was actually part of the original DX10 but was delayed.
Some people still think we've never landed on the moon, so no, these "proofs" have done little in the past to actually prove anything, as I said, because some are either too greedy, too thick, or too much of a blinded fan.
September 6, 2009 6:17:06 PM

Jaydee is the proverbial voice of absolute truth, reason, and common sense in the Graphics and Displays section of Tom's Hardware.
September 6, 2009 6:35:54 PM

Have any ATi owners confirmed the lack of AA in BAA yet?
September 6, 2009 7:58:20 PM

Regarding the AC scandal, I've found a pretty simple workaround: don't install the 1.02 patch. It's that simple.
September 6, 2009 9:25:23 PM

Very bad play, nVidia. I was always a fan of nVidia, but now they are just playing unfair. I always backed nVidia in the past, but now it seems like I was on the wrong board... nVidia has disappointed me by doing this. I want all the manufacturers in the market to play fair and compete, because that's good for the ultimate customers like me. Yes, it's true Nvidia works closely with devs, and that's why they get a bit more support in games than ATI; that's acceptable. But after AC, now Batman and that code??? That's not acceptable any way you look at it... I can accept Nvidia getting somewhat more FPS with a similar card than ATI because they pay money to devs, it's that simple, but this kind of thing is not... At least they know now they have lost one loyal customer who bought 14 nVidia cards and convinced many other people to buy nVidia cards. It's the right time to say goodbye.
September 7, 2009 1:22:58 AM

Quote:
You know, what you say might hold water IF it didn't work at all on ATI cards WITHOUT the workaround, but alas, it does; and if it didn't work on consoles as well, like the Xbox, which, ummm, has what kind of GPU in it?


Making another mistake, JD. As Mass Effect, GoW, and AA3 all proved, you can force AA in Unreal games (supersampling?), but to varying degrees of success with varying performance hits. AA is simply not officially supported in DX9 mode for the Unreal Engine and requires extra work to put in. Yes, you can force it, but that method is not supported for either vendor.

Your argument that features have been removed is hopelessly flawed this time, JayDee. The game was developed with NVIDIA in mind (PhysX), and just like PhysX, AA was a feature that had to be added in. And unfortunately, ATI and NVIDIA render AA differently, so the AA method that was added is, like PhysX, a feature only for NVIDIA users.



September 7, 2009 4:42:08 AM

So, what about DX10 then? You can argue all you want; these so-called problems aren't being seen on ATI cards.
You're sounding like the devs for AC here:
"The performance gains seen by players who are currently playing AC with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."
September 7, 2009 3:05:35 PM

This is just too fishy. Having a line to exclude ATI cards specifically would at least suggest they did some testing and found it doesn't work on ATI cards. But they exclude every non-Nvidia card. That's too fishy.

It's like Windows refusing to install any browser other than IE because IE is "safe". Safe? Who are you kidding?

This is wrong. People buy graphics cards from three companies. It's not a lot. They follow basic rules. Not adding AA, and removing an option at no cost to anyone, is just what makes Nvidia the bad guy here.

They didn't want their old 8800 GTX cards to die, so they never supported 10.1. Now they can't be on time with DX11, they have a lot of problems, and they try to stay above water with these kinds of cheap tricks. I'm sorry, but this is why all the companies before Nvidia and ATI left the market.

Consumers aren't happy with this kind of BS. They don't want to pay over the odds for a brand. They want good stuff that works. If they want a license, fine. But this is just... well, Jay said it all.
September 7, 2009 3:43:17 PM

gamerk316 said:
Your argument that features have been removed is hopelessly flawed this time, JayDee. The game was developed with NVIDIA in mind (PhysX), and just like PhysX, AA was a feature that had to be added in. And unfortunately, ATI and NVIDIA render AA differently, so the AA method that was added is, like PhysX, a feature only for NVIDIA users.


Even if that is correct, are you telling me that is all right?!

OK, how would you feel if there were an alternative fuel that was cheap, environmentally sound, and added a lot of horsepower to your engine, but those benefits only worked in GM vehicles, not Ford? There would be a huge uproar over that, and there should be for this too.

A new ATI slogan: TWTGWMBPBNFU, or "The way the game was meant to be played before nVidia f&%ked it up".
September 8, 2009 1:35:01 AM

^^ I never said it was all right. But the fact remains: AA in the Unreal Engine needs to be specifically added in, and in this case the devs of Arkham worked more closely with NVIDIA, so NVIDIA card owners get the benefit. Nothing has been "removed"; AA just never got officially added for ATI users to begin with.

As I've said before, it comes down to time and money, and in this case, considering how much more closely companies work with NVIDIA, it's no shock NVIDIA cards get extra benefits from the devs. Whether that's right or wrong is an entirely different debate.
September 8, 2009 2:23:04 AM

DirectX 9 supports AA directly... where did you get the idea it didn't?

There is a struct you pass in C++ to create a D3D device:

struct D3DPRESENT_PARAMETERS

and two of its members are (as well as back buffers etc.):
D3DMULTISAMPLE_TYPE MultiSampleType;
DWORD MultiSampleQuality;

Which we should all recognise as the items for AA. You don't need to do different coding for ATi vs nVidia in DX9+; in fact it's very little coding at all...

If the Unreal engine doesn't natively support this then it's an oversight... however I suspect you are quite off base and that it actually DOES, and it's more that the developers generally don't add an option to set it in their video options screen (as all interfaces are done by the end dev, obviously).
September 8, 2009 2:26:18 AM

Oh, and just to be clear:

You set the D3DMULTISAMPLE_TYPE member to D3DMULTISAMPLE_NONMASKABLE
and then set MultiSampleQuality to your desired AA quality (checking the device support first so you don't crash the app, of course! heh).

It's 0-based, as most stuff is in coding... so

MultiSampleQuality = 3; // set to 4x AA

That's really all there is to it... and it's DX9 C++ code... Managed DirectX in C# can do the exact same thing with just as little code.
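
Putting the two posts above together, here's a minimal, untested sketch (hWnd and the back buffer format are just placeholders) of creating a DX9 device with nonmaskable multisampling switched on:

#include <d3d9.h>

// Minimal sketch: query multisample support, then create the device with it enabled.
IDirect3DDevice9* CreateDeviceWithAA(IDirect3D9* pD3D, HWND hWnd)
{
    DWORD qualityLevels = 0;
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;   // required for multisampling
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;
    pp.hDeviceWindow    = hWnd;

    // Check device support first so you don't crash the app
    if (SUCCEEDED(pD3D->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            pp.BackBufferFormat, TRUE, D3DMULTISAMPLE_NONMASKABLE, &qualityLevels))
        && qualityLevels > 0)
    {
        pp.MultiSampleType    = D3DMULTISAMPLE_NONMASKABLE;
        pp.MultiSampleQuality = (qualityLevels > 3) ? 3 : qualityLevels - 1; // ~4x AA
    }

    IDirect3DDevice9* pDevice = NULL;
    if (FAILED(pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
            D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &pDevice)))
        return NULL;

    pDevice->SetRenderState(D3DRS_MULTISAMPLEANTIALIAS, TRUE); // the actual "switch"
    return pDevice;
}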
September 8, 2009 3:32:17 AM

I even went so far as to look up the code for the different types of extended AA, just in case THAT'S what you were on about (transparency AA for nVidia and adaptive AA for ATi).

EDIT: Remember, ALL cards support the standard AA type anyway; this is just to use their respective extended versions instead.

Here is a rough paste of code from nVidia's source:

Using nVidia Transparency AA
------------------------------------

g_bATOCSupport = (pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, 0, D3DRTYPE_SURFACE, (D3DFORMAT)MAKEFOURCC('A', 'T', 'O', 'C')) == S_OK);
// here we can add another line to check for ATi adaptive AA
// Of course defaulting to standard AA with one more line...

....

D3DFORMAT AAFmt = D3DFMT_UNKNOWN;
....

// if statement for AA type... 2 lines of code each here
AAFmt = (D3DFORMAT)MAKEFOURCC('A', 'T', 'O', 'C');
fAlphaScale = (float)(g_HUD.GetSlider(IDC_ALPHASCALE)->GetValue() / 1000.0);
// else 2 more
// else default 2

....

g_pRenderer->RenderAlphaTested(AAFmt, g_HUD.GetCheckBox(IDC_ALPHATEST)->GetChecked(), g_HUD.GetCheckBox(IDC_ALPHABLENDING)->GetChecked(), fAlphaScale, myTime, bCloseUp);


So that's about six-odd lines of extra code to support *all* AA types: standard, nVidia, ATi... do you really think this is a time/cost issue? Most of the code we devs re-use, and especially the 3D engines we buy, already supports all this; so you would generally have to delete code to stop ATI AA from working (no one in their right mind recodes everything from scratch each time on such huge projects).

(code directly from a nVidia source code example from their site: http://developer.nvidia.com/object/transparency_aa.html)
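
For balance, the ATi side of that check (the adaptive AA the comment above alludes to) is a similar FOURCC hack for alpha-to-coverage. From memory, so double-check the FOURCCs against AMD's DX9 docs, it looks roughly like this (pd3d and pDevice being the usual interface pointers):

// Rough sketch, from memory: ATi exposes alpha-to-coverage through a FOURCC
// hack ('A2M1' to enable, 'A2M0' to disable), much like nVidia's 'ATOC'.
BOOL g_bATIA2MSupport = (pd3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8, 0, D3DRTYPE_SURFACE,
        (D3DFORMAT)MAKEFOURCC('A', '2', 'M', '1')) == S_OK);

// ...later, when rendering alpha-tested geometry:
if (g_bATIA2MSupport)
    pDevice->SetRenderState(D3DRS_POINTSIZE, MAKEFOURCC('A', '2', 'M', '1')); // on
// and MAKEFOURCC('A', '2', 'M', '0') to turn it back off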


And the last issue is DX9 and AA+HDR: how the card-supported HDR setting does not work with AA enabled... however, this is a problem for coders to solve and affects ALL cards. You'll notice Half-Life 2 solved this through some coding (a custom pixel shader) which actually allows cards without HDR support to display HDR scenes... nice :)  This would likely be the reason Unreal Engine 3.0 does not display AA (on BOTH nVidia and ATI), as it turns off AA when you enable HDR.
EDIT: just to be clear, as I didn't mention it specifically: this issue only affected older video cards; new ones handle it fine.

So, in conclusion...

There is no reason at all for nVidia to be running AA and ATi not. You do not code the AA routines yourself; you simply switch them on.

So this is terribly fishy and sounds like a payout from nVidia.

We might see it working in a patch soon, perhaps with some excuse claiming it was a bug or some such.
September 8, 2009 7:42:49 AM

Yeah, using AA on ATI in this game may be harmful to your health, j/k.

If the devs omitted this for ATI only, this'll be a first; if it's nVidia influence, it's certainly not a first.
September 8, 2009 1:43:01 PM

Quote:
And the last issue is DX9 and AA+HDR... this would likely be the reason Unreal Engine 3.0 does not display AA (on BOTH nVidia and ATI), as it turns off AA when you enable HDR.
EDIT: this issue only affected older video cards; new ones handle it fine.


But you ran into the issue that needs to be examined: "this issue only affected older video cards; new ones handle it fine".

Here's the problem: you now have to manually maintain a list of which cards support HDR + AA in DX9 mode and which ones do not, which is NOT how devs want to code (imagine the hell if a driver update fixes the issue...). Hence why this feature was likely removed from the Unreal Engine entirely, as HDR + AA in DX9 mode is not natively supported on all capable cards.

You see, if you add AA as an option for the engine, you need to disable AA if HDR is enabled, or create a custom shader to deal with the problem. The Unreal guys took another route and removed AA support entirely. Now, what I assume happened here is that the devs for Arkham created a custom pixel shader that allows both HDR and AA on all NVIDIA DX9 cards. And what happens if this customized shader has problems on ATI hardware?

You see the issue? It's all conjecture. But the fact remains: AA is not native to the Unreal 3 engine, so nothing has been removed for ATI users. And considering how NVIDIA and devs have long been close, it's no shock NVIDIA users get some extra benefit in a large majority of games.
September 9, 2009 9:21:26 AM

gamerk316 said:
Here's the problem: you now have to manually maintain a list of which cards support HDR + AA in DX9 mode and which ones do not, which is NOT how devs want to code. ... You see, if you add AA as an option for the engine, you need to disable AA if HDR is enabled, or create a custom shader to deal with the problem.


You assume WAY too much :p  where in the world do you get all this?

<snip>

Writing a custom shader works on both ATi and nVidia, so claiming there is an issue if they did so isn't going to cut it.

The math is the same; there is no want or need to code two completely separate shaders for each... in fact you DON'T code completely different 3D engines/shaders/maps/models etc. for each manufacturer's cards! ... that would be HELL *cry* The whole point of DX is to manage it all as a single API... that's the point of it! It boggles the mind to think of having to code shaders for each manufacturer... my poor brain.

The dev doesn't have to manage anything; that's the POINT of an engine. If you had to do all this yourself you wouldn't buy one, would you ;) 

Secondly,

"You see, if you add AA as an option for the engine, you need to disable AA if HDR is enabled, or create a custom shader to deal with the problem"

No, you do not. Do you do much coding? Doesn't show :)  You do this:
You enable HDR.
You query support of HDR+AA; if yes, enable AA.
Else... do nothing, OR set a flag for no AA if needed, OR use a custom shader (one that works on BOTH without needing to write two completely different versions).

Coding hassle? No... dev code? Maybe one custom shader... that's it, if they really want to care. Most don't :)  I wouldn't (and don't) bother with it.
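
In raw D3D9 terms that "query support of HDR+AA" step is just two caps checks; something roughly like this, assuming an FP16 (D3DFMT_A16B16G16R16F) render target for the HDR path:

// Rough sketch of the "query HDR+AA support" step described above.
bool SupportsHdrPlusAA(IDirect3D9* pD3D)
{
    // Can we render HDR to an FP16 surface at all?
    if (FAILED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE,
            D3DFMT_A16B16G16R16F)))
        return false;

    // Can that FP16 surface also be multisampled (here: 4x)?
    return SUCCEEDED(pD3D->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_A16B16G16R16F, TRUE, D3DMULTISAMPLE_4_SAMPLES, NULL));
}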


EDIT: I found an interview finally :)  It stated that DX9 Unreal Engine 3.0 doesn't support it but DX10 Unreal Engine 3.0 does.

So it seems more and more likely that it's either:

a) A shader was implemented that simply was not applied for ATi people (no reason for this)
b) There is a DX10 exe of Batman: AA and the ATi people simply weren't able to run it? (This happens sometimes, but I don't own the game to tell. E.g. Hellgate: London has two versions and AoC does too; your support decides which exe loads.)
c) Something else we haven't thought of?

You've still yet to say anything that comes close to convincing me this is anything but the dev/nVidia swapping some cash about (unless it's b, lol).
September 9, 2009 11:55:16 AM

Hehe, well, there is no malice, bad intent or bad feelings here... we are just discussing and sharing ideas... so it's all good from where I stand :) 

And I'm happy to be proven wrong; it's happened before! None of us are perfect and none of us know it all... least of all me... but I'll make him work for it ;) 
September 9, 2009 12:30:00 PM

Quote:

The math is the same; there is no want or need to code two completely separate shaders for each... in fact you DON'T code completely different 3D engines/shaders/maps/models etc. for each manufacturer's cards!


But we're not talking about DX; we're talking about two separate AA methods that are not natively supported by the engine in question. Sure, the DX standard is fully supported by both vendors' cards, but if the engine itself doesn't have feature 'x', that support goes to waste.

Quote:

The dev doesn't have to manage anything; that's the POINT of an engine. If you had to do all this yourself you wouldn't buy one, would you ;) 


Devs actually have to manage quite a bit. Engines provide a base set of abilities, but also many limitations that need to be worked around. It's up to the devs to make the most of whatever the engine provides.

Quote:

Secondly,

"You see, if you add AA as an option for the engine, you need to disable AA if HDR is enabled, or create a custom shader to deal with the problem"

No, you do not. Do you do much coding? Doesn't show :)  You do this:
You enable HDR.
You query support of HDR+AA; if yes, enable AA.
Else... do nothing, OR set a flag for no AA if needed, OR use a custom shader.


Except the engine itself lacks AA support in DX9 mode. It's not a switch the devs can simply turn on; it's simply not there, period. Yes, it makes the most sense to simply un-grey AA when HDR is off in DX9 mode (or when the card supports both at once), but AA itself is not present by default in the DX9 mode of the Unreal Engine. It's a limitation of the engine itself, not an implementation issue.


Quote:

EDIT: I found an interview finally :)  It stated that DX9 Unreal Engine 3.0 doesn't support it but DX10 Unreal Engine 3.0 does.

So it seems more and more likely that it's either:

a) A shader was implemented that simply was not applied for ATi people (no reason for this)
b) There is a DX10 exe of Batman: AA and the ATi people simply weren't able to run it?
c) Something else we haven't thought of?


Probably something we didn't think of, in all honesty. There's no separate EXE for DX10, so it's not b).


My point is a simple one: the Unreal Engine lacks native AA support in DX9 mode, period. Other Unreal games (GoW, Mass Effect, etc.) lack AA as well, remember? There's no magic switch that suddenly makes it available for use, so it has to be added by the developers.

As both NVIDIA and ATI use different methods to achieve the same effect, two different techniques would have to be implemented to support each one (one method would not work for the other). In this case, due in part to working closely with the devs, NVIDIA's AA got implemented and ATI's did not (explaining the lack of ATI AA in DX9 mode).

I have less of an answer for the lack of ATI AA in DX10 mode, which theoretically supports AA in the Unreal Engine. Maybe the mismatch of DX9 + DX10 causes a compatibility problem with AA in the engine? (e.g. DX9 shaders and DX10 HDR)
September 9, 2009 1:10:50 PM

Then it wouldn't work either way in DX10, and yet it does, with flags, and it works with workarounds on ATI.
September 9, 2009 2:16:02 PM

"But we're not talking about DX"

Yes we are; it's the interface to the driver you HAVE to talk to (I've made basic D3D apps, obviously):
{ATi or nVidia Driver} <-> {DirectX} <-> {Unreal Engine 3.0} <-> {Dev App}

"Devs have to manage quite a bit actually. Engines provide a base set of abilities, but also many limitations that need to be worked around. Its up to the devs to make the most of whatever the engine provides. "

Devs only manage the code of running the game, unless there is something specific to build that isn't supported. Yes, Unreal Engine 3.0 provides almost everything you need, as is listed here:
http://www.unrealtechnology.com/features.php?ref=techno...
and for rendering here:
http://www.unrealtechnology.com/features.php?ref=render...

So I'm not sure this comment gives any information of any kind? You just said what I did: it does stuff, and what it doesn't do you have to do yourself? Same as with every piece of software on the planet. This in no way proves or helps any side of the discussion.

"Except the engine itself lacks AA support in DX9 mode. Its not a switch the devs can simply turn on"

This is where you are directly incorrect. If you have access to the D3D device (which every engine i know of exposes it for the NEEDED ability to check support of specific flags and other extended options) then you simply run the AA flag switch on the UE3 exposes D3D device... once again a few lines of code. I've pasted them in previous posts.

The fact UE3 doesn't let you use AA in DX9 with HDR is not some huge mass of code that was left out and unimplemented. My first few posts prove the code to switch on AA is rediculously trivial for ALL types of AA, i think we can agree on that? i got the code direct from nvidia even! what else could i possibly show you to prove it other than the actual code to do it? in C++ even. we should be able to see thats a given.

"As both NVIDIA and ATI use different methods to achieve the same effect"

Not techincally true, MSAA is support as a single standard on both. They both have standard MSAA support and each has an extended AA set.

"As both NVIDIA and ATI use different methods to achieve the same effect, two different techniques would have to be implemented to support each one (one method would not work for the other"

This leads me to believe you havn't read the code i pasted at all. I proved as a 'fact' as you put it... that ALL types of AA (Standard and nVidia/ATi extended) are turned on by a switch... thats how its done... it IS a switch you just flick on ;)  the drivers themselves and the cards do the hard work....not your code.


The orignal issue was nVidia working with AA and ATi not working with AA... this means its been switched on or a shader was implemented.

and for 1) If you can switch it on for one, you can switch it on for the other...this is a 'fact'

and 2) If you write a custom HDR shader so AA can be left on and switch it on... you can do it with ATi without writing 2 full custom versions... this is also a 'fact'

Maybe thats where you have it wrong? You do not write a shader to perform AA you write a shader for HDR, the engine then sets the D3D Device to flag/init as just AA with no HDR support and executes a custom shader for the HDR to add it back in. (again if the engine says no i wont let you set that with my standardly exposes methods... you set them yourself with the D3DDevice)

Even if you wrote a shader for AA and left the HDR instead you would have to write the routines yourself however you wanted and it would be compatible with BOTH ATi / nVidia and it would not be any of the 3 listed... it would be your OWN special AA routine. BatMan AA: AA heh.. you could make similar to MSAA... or similar to one of the other extended methods... or it could be a new really nice effect not available anywhere :) 
EDIT: just fyi the reason this is not the preffered option is because drivers and hardware are very well tuned to how standard/extended AA is performed and rewriting these is going to perform.... less than perfectly? :)  to put it mildly.

So i re-ask you... :)  what exactly do you think is going on here? As a professional developer (which just means i do it for a living heh) i seriously find this fishy as all hell.

No doubt they will throw up some excuse to keep everyone happy... maybe with luck there will be a patch? Likly they just wont care and leave it i suspect tho :) 
September 9, 2009 2:23:59 PM

JAYDEEJOHN said:
Then it wouldn't work either way in DX10, and yet it does, with flags, and it works with workarounds on ATI.


Right! And the fact that simple workarounds make it work for ATi shows you it's a trivial thing to fix... these workarounds don't require you to recode UE3, do they ;)  The code I pasted and explained as best I could also shows, at the raw C++ level, that it's a trivial thing to fix without any engine of any kind (you might need to be a coder to know what I'm talking about exactly, but still, I kept it as simple as I could! :)  )

And even with an engine that doesn't support it, e.g. UE3, you can enable the flags anyway! So there is no reason that I can see, unless the devs think a few lines of code is "too much work and money"??

Heh, is this what development companies have come to? :) 

EDIT: I didn't mean to type 'you' there... oops, sorry, heh.
September 9, 2009 2:35:06 PM

Again I ask: have any ATi users tried this game, and if so, does AA work or not? Until then this is just an unsubstantiated load of cobblers.
September 9, 2009 3:48:44 PM



Nice, thanks for the link :) 

It does bring up a nice point, however: driver profiles for games. Perhaps ATi might indeed release a profile to force it working for this game, as the driver is the most central point of the process. Maybe they will hear the calls of the users and release a profile for it, rather than people having to go nuts to get it looking how they want.

Here's to hoping!

Now, Dragon Age... that's a game that had better be looking and performing nice... or I'm gonna be a sad panda.
September 9, 2009 4:44:55 PM

Mousemonkey said:
Again I ask: have any ATi users tried this game, and if so, does AA work or not? Until then this is just an unsubstantiated load of cobblers.


You could probably force AA on all textures through the drivers, but there are situations where that method leads to various display issues. Even then, the best you could probably force is multisampling, instead of transparency (or whatever ATI calls theirs) anti-aliasing, which is slightly slower for the same quality.

Personally, I wish they'd just use the HQ2x/HQ4x filters. Free to use and, frankly, better results than standard AA.
September 9, 2009 4:56:07 PM

Looks like ATi should have got their cheque book out and made a contribution to the devs' beer fund, but they didn't, so sod 'em.