When will nVidia get it?

OK, it's been mentioned before and talked about some, but here's a little clarification.
Batman: Arkham Asylum, currently the newest TWIMTBP game touted by nVidia, has PhysX in it, and they've just recently locked anything that isn't nVidia out of using PhysX.
That's exclusivity by exclusion, not just by being better.
Now, people have asked: why doesn't ATI just license PhysX and pay the fees, etc.?
Well, you have to be able to use CUDA first, instead of OpenCL, to use PhysX at all, so even getting to use the license is proprietary.
Now, we'd all like to have working, real, make-a-difference-in-the-game physics in our games, but requiring all this is just too much.
So then we have to ask, "Why doesn't ATI just do this? They've nothing to lose but a license fee," and here's where it's simply much more than that.
Batman: AA has a code lockout in the game, and not just for PhysX but for the use of AA (anti-aliasing): if you don't own an nVidia card, you can't use AA.
OK, so surely it must be that ATI hasn't come to the table for this game and set the devs straight on how to make AA work on their cards, you may ask?

Here's the problem. In the code from Batman, you'll find this check:

If (System.GPU.IsMadeByNvidia() == false)
GameOptions.AASupport = false;
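For anyone unsure what a check like the one quoted above amounts to, here's a minimal sketch of vendor-gated feature flags. All names here are hypothetical; this is an illustration of the pattern being described, not the actual game code.

```python
def build_game_options(gpu_vendor):
    """Hypothetical sketch of a vendor-gated options check."""
    options = {"physx": False, "aa_support": True}
    if gpu_vendor.strip().lower() == "nvidia":
        options["physx"] = True        # PhysX enabled only on nVidia hardware
    else:
        options["aa_support"] = False  # the lockout: AA switched off for everyone else
    return options

print(build_game_options("ATI"))     # AA disabled, no PhysX
print(build_game_options("nVidia"))  # both available
```

The point of the complaint is the `else` branch: the feature is turned off based on who made the card, not on whether the card can do the work.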

Now, who do you think wanted this put there? And again, ask yourself: should ATI, or Intel in the future, have anything to do with nVidia's proprietary hoodwinking?
If nVidia ever truly wanted PhysX and CUDA to be a universally used set of code, used through licensing, it would never do this.
We don't see this being done with Havok, which is owned by Intel, as they're playing fair.

As for me, I'm tired of nVidia and their antics. I know some people say it shouldn't matter, just get the best card you can, but this is a lawless scenario we've been given by nVidia, and I think it's time they paid us back.
 

JeanLuc

Distinguished
Oct 21, 2002
Well, Nvidia lost over $200 million in Q1 of this year and still hasn't settled the outstanding lawsuits over the faulty 8000-series graphics chipsets, so it's going to stay bad for a while before it gets better.
 
Just because that code is in the game doesn't necessarily mean that AA would work fine on ATI cards without it. They could simply know that attempting to use AA with an ATI card in the game causes issues, so they have a check for it. Of course, that may not be the case, and even if it is, being willing to delay the game to add PhysX but not bothering to make something like AA work with an ATI card is pretty lame.
 

wh3resmycar

Distinguished


i think they should. sponsorship deals to counter nvidia's marketing (along with "best played with intel" blah blah) are a pretty smart move. and can someone throw a rock @ amd for not doing so?

one thing i'll never get though, is how does AMD's sponsorship deal with Scuderia Ferrari help consumers and developers alike? anyone? apart from filling kimi raikkonen's pockets.
 

Or that line just stops the option 'Nvidia(TM) Multisampling AA' from being displayed in the settings menu, and you would have to do AA through CCC.
 
If you think PhysX is being shut down, why not mess with the code a little...

If (System.GPU.IsMadeByNvidia() == true)
GameOptions.AASupport = true;

Or sometimes a simple // in front of certain lines will null out that part of the code.
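To illustrate that last point (again with hypothetical names, and assuming you could even edit and rebuild the code, which you can't with a shipped binary), "nulling" a line with a comment marker just means the gate never runs:

```python
def build_game_options(gpu_vendor):
    """Same hypothetical check, with the vendor gate commented out."""
    options = {"aa_support": True}
    # if gpu_vendor.strip().lower() != "nvidia":   # gate "nulled" by commenting it out
    #     options["aa_support"] = False
    return options

print(build_game_options("ATI"))  # AA stays on regardless of vendor
```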

I haven't noticed anything so far as far as PhysX/Havok physics not working on either brand of card. I wouldn't go out of my way for the game mentioned above, though, so I have no way of testing that.

Also, even though somebody mentioned once that it wouldn't work on Vista, you CAN run an ATI card for main graphics and an Nvidia card for physics without problems. (Install the PhysX driver only.... but Vista may find it for you.) I've done it on a Gigabyte board.

I can understand the gripe, because if it's true it will affect all players eventually. But the way things are going, Nvidia should walk a tightrope, because they're heading into a downward spiral. My next machine will be AMD (6-core) and ATI graphics exclusively...... sometime in late 2010. That should set me for a few years unless I get bored...... AHEM.
 

Not even possible. You have to use a workaround, as I've said.
Essentially, you have to trick the game into thinking it's an nVidia card.
If people find this OK, maybe AMD should just start locking nVidia cards out in their CPU coding, or, if they pull this when LRB arrives (which they will, if this particular thing doesn't get changed), Intel should also just lock them out; seems fair.
nVidia on VIA, anyone?
 
Think of it this way.
If anyone thinks what nVidia is doing here is OK and legit, wait till Intel arrives and see how it's done.
Who do you think has more money for exclusive influence?
I don't care if nVidia created PhysX and CUDA and made AA happen in this game; they'll lose at this game big time to Intel, if that's how they want to play it.
I bet before Intel arrives with LRB, this kind of crap comes to an end, and good riddance.
 

Annisman

Distinguished
May 5, 2007
I know it could mean higher prices from less competition, but I hope Intel EATS NVIDIA ALIVE. Let's see here: Intel dwarfs Nvidia and ATI, for the past 3-4 years they have had by far the best enthusiast CPUs, and they are on a win streak right now and cannot be stopped.

Also, remember that PhysX and AA are not the only things getting the shaft here for ATI users; when PhysX is 'not available', actual visual parts of the game go missing, like fog, certain cloth objects, tiles, etc. You're telling me that my Core i7 at 4.0GHz can't, and shouldn't be able to, do some physics processing on its own?? Even if the performance hit is larger.

Nvidia is crippling other people's hardware. Does anybody here think that people are going to say, "Gee, I can't use AA or PhysX, and parts of the visuals are missing, let me fork over $600 to Nvidia so I can have that!"?

No way, people are going to get pissed.

It's like buying a car from Chevy that gets 30 MPG, except Ford has found a clever way to make your car get only 20 MPG.

As consumers, when we buy products we expect a full return. When titles come out without AA or widescreen support etc., we are NOT happy about it and we let the developers know; the hardware companies are supposed to be the ones helping the software companies make games that people can run, and run well. When I purchase two 5870s in Sept-October for Crossfire, it will literally chew this Batman game to pieces. It's just too bad I will be getting 100+ FPS AND HAVE JAGGIES ALL OVER THE PLACE AND PARTS OF THE VISUALS MISSING!!!!

Can you even imagine if a game like Crysis did something like this, or Far Cry 2??? Some of us pay thousands of dollars every year on our computers for the very best gaming experience. When a hardware company deliberately takes that AWAY from us, there should be an uproar. PhysX is fine, good for Nvidia, good for their users, but disabling AA??? Really?


This is 2009, and this BS needs to stop.
 
Wrong, Jaydee. Remember, people: ATI and NVIDIA perform AA calculations differently, so there are two separate methods that need to be implemented. Also remember that the Unreal Engine, which Arkham uses, does NOT natively support AA, so it has to be added in.

What you are really complaining about is that extra features were added to the engine for NVIDIA, and no extra features were added for ATI. Nothing has been 'disabled', as the Unreal Engine can't natively do AA to begin with!

Now, if this were an engine that supported AA, you would have a point. But the fact is, NVIDIA works more closely with devs than ATI does (for whatever reason, that's a whole other discussion), so it's no shock NVIDIA ends up with more features added in.
 

Annisman

Distinguished
May 5, 2007


Why do you make excuses for them, why?

It might not be 'disabling' so to speak, but think of it this way.

Your parents give your brother $1000 and tell him to go have a blast.

You don't get anything.... ah, they never took money FROM you..... but how does that feel???
 

You know, what you say might hold water IF:
:it didn't work at all on ATI cards WITHOUT the workaround, but alas, it does
:it didn't work on consoles as well, even those with those "different AA methods", like the Xbox, which, ummm, has what kind of GPU in it?

I think anyone defending this exposes themselves somewhat.
Counter-arguments go:
:since nVidia hasn't brought anything to the table with PC interrupts, then maybe both Intel and AMD should do ID checks and just boot nVidia for not adding anything there as well?


It does work on ATI cards; that's the point, and a weak suggestion by you.
But all you have to do on PC anyway, since it works on ATI parts on console, is work around the ID check.
 
Remember the title of this thread?
nVidia just doesn't get it.
They come off as the bad guy here, plain and simple.
What would've happened if they'd promoted doing this for all users, much like PhysX is supposedly for as well?
No, they make claims, then they're greedy.
If they'd just done this for the benefit of gaming in general, great.
Doesn't this go against the nVidia conferences that PROMOTE the usage of GPUs?
Hasn't everything they've done, whether it be no DX10.1, the Assassin's Creed debacle, PhysX, and now this, gone exactly against that promotion of GPU usage?
Fail for nVidia here.
 
To me, a leader doesn't have to do petty things such as this, and if nVidia is supposed to be the graphics leader, no wonder it's in the state it's in. This gets old real fast, and this isn't like getting caught with a perf failure.
No, it's a deliberate attempt at exclusivity by the supposed leader in graphics.
A leader doesn't seek theirs first and everything else later, because a true leader will just get theirs regardless.
Haven't we learned that more cooperation is a much better scenario in these economic times, and hasn't the "me first" mentality shown what can happen? I don't want to go too far off topic here, but it does apply.
PC gaming needs every boost it can get, not this crap.
 

amnotanoobie

Distinguished
Aug 27, 2006

Here's an idea: has anyone emailed the makers of Arkham Asylum to ask why AA is not possible on ATi hardware?

I don't like to bash other people/corps without hearing what they have to say about it first. Maybe they have a valid reason (lazy QA / no budget for QA / haven't tested thoroughly with ATi configs / etc.), but at least give them the chance to explain themselves. If they reply and the reply is BS, then you have every right to bash them.


* I'd bet you'd be singing a different tune if you were the developer who put in that line, knowing there's a possibility of AA going wrong with ATi, and then read comments like these here.
 

And what if we get a response like the one before on Assassin's Creed?:
"The performance gains seen by players who are currently playing AC with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."
http://www.anandtech.com/showdoc.aspx?i=3320&p=6
That's exactly what DX10.1 does: it removes that pass, hence the perf gains.
What makes me, you, or anyone else think the response would be any different this time?
There are still a lot of people who simply don't believe the AC thing went down as it did, so what's a statement going to prove?
I'm of the notion that some people will believe what they want to believe, if they're so inclined, regardless of the facts.
Just like the DX10 debacle with nVidia and M$: some people don't understand that DX10.1 was actually part of the original DX10, but was delayed.
Some people still think we've never landed on the moon, so no, these "proofs" have done little in the past to actually prove anything, because, as I said, some people are either too greedy, too thick, or too much of a blinded fan.