
Another TWIMTBP LowBlow

October 31, 2009 5:03:23 AM

Wow, OK, this has really, REALLY gone too far.

http://www.pcgameshardware.com/aid,698462/Borderlands-t...

Most nvidiots will now come and say that ATI should just have a TWIMTBP program of its own. But no, excuse me, this goes FAR beyond "compatibility optimizing".
This is REALLY a f****** low blow.

Come on, people, in what world can an 8800GT/9800GT beat an HD4850?? And not just beat it, it demolishes it: a 13 FPS gap between the two... and it even beats an HD4870?? Seriously, you've got to be kidding me. A GTX260 > HD5870?????? Come on, obvious trap is obvious.

We aren't talking about optimizing a game anymore. The engine is Unreal Engine 3 (version 2, or modified). Come on.
This only proves that nvidia has nothing to back it up (yes, nvidiots, I'm talking about Failmi).
The only ace it has left is buying off games and making them unplayable on ATI cards.
This is exactly why I will NEVER buy nvidia.

We aren't talking about more powerful cards anymore.
Nvidia is now playing seriously dirty; these are EXTREMELY low blows ("these" as in Batman: AA too).
At least they could have made the thing less obvious, but come on, a GTX260 owning an HD5870? It only shows that nvidia is REALLY hitting ATI with low blows.


October 31, 2009 5:33:32 AM

That's some serious whoopass.
October 31, 2009 6:15:01 AM

lol, what backstabbing? Did ATI agree to something with Nvidia and then get backstabbed over it? BTW, this is what people call the "business world" :lol:
October 31, 2009 6:47:05 AM

What is a joke? o.o
Sorry, I just forgot the 3, LOL.
Pure typo.
And yeah, I should change the "backstabbing"... sorry guys, but English ain't my native language, ya know XD
October 31, 2009 8:19:34 AM

I hope AMD won't resort to "TWIMTB-Paid" tactics to screw the opposition; it's not healthy for the customers.
October 31, 2009 8:22:57 AM

I smell sarcasm 8D
October 31, 2009 9:49:30 AM

LOL, just realized it has the same anti-aliasing problem as Batman: AA.
So THIS is nvidia's REAL answer to the HD5800? XD
October 31, 2009 1:37:00 PM

Maybe boycotting these games would help, or hacking them to enable AA on ATI video cards. :D

NVIDIA thinks it can stop people from buying the Radeon HD 5000 series with these anti-competitive practices. Somebody really needs to sue NVIDIA for this cowardly anti-competitive practice. It's like corruption: you pay game developers and tell them not to enable the AA option for ATI. :ouch:
October 31, 2009 2:12:49 PM

This is what you call a BIASED benchmark, for dummies and the like.

Dummies and the like? Yes. Nvidia can only show how good its video cards are to people who aren't smart enough to notice what it's doing in these matters (I'm not generalizing, though). Meanwhile, nVidia can do nothing but suffer and watch ATI's success with its 5000 series video cards; by the time the GT300 cards are released, IMO, nVidia will have a hard time recovering its huge losses in market share and, most of all, profits. And all the while it sticks to proprietary technologies like CUDA and PhysX, which will soon be obliterated as new games and applications increasingly support non-proprietary technologies that rival Nvidia's offerings and that developers and consumers alike find more beneficial.

-- I'm not saying this to favor ATi; only time will tell whether those GT300 video cards bring nVidia's glory back, but I doubt it. IMO, by the time nVidia releases a very high-end video card that rivals ATi's 5870, ATi will release its X2 5000 series cards, which will beat the high-end GT300s; and if nVidia releases dual-GPU cards to rival ATi's, ATi will slash prices to the point where nVidia can't compete, because of the high cost of manufacturing those GT300 cards and of developing, revising, and improving its proprietary technologies.
October 31, 2009 2:40:53 PM

This is...amazing.

I smell desperation among fanbois!
October 31, 2009 2:45:22 PM

bboynatural said:
LOL, just realized it has the same anti-aliasing problem as Batman: AA.


Well, suppose-eh-bleeh, AA doesn't work for nVidia in this title either, so it's not exactly the same, but it does look fishy. :heink:

Quote:
So THIS is nvidia's REAL answer to the HD5800? XD


Well... it's cheaper to pay to cripple the game for others than to build a competitive product. :whistle: 
October 31, 2009 2:56:38 PM

How close to the truth is this response from the page the OP linked, about some of the coding in TWIMTBP games? Crysis is used as the example. It's in the comments section on the article and was posted by a guest, so I can't credit the poster personally.


"asked AMD to help them with enabling AA on their cards "

Probably bullshit from them, it is more like "give us free high-end ATI hardwares or large sum of money, and we will enable AA for your cards"

Since they AMD did not help them: "Okay fine, we will disable some performance and functions when an ATI is detected, HA!!!"

Just a joke, but I have gone through (or uncovered) some games that have settings that do not give the best performance to ATI hardwares. Crysis, for example, has XML settings that if "corrected", gives more performance to ATI cards. It can be found in the NGOHQ website somewhere.

Anyone heard of this? Ape, you're probably one of the more informed people here: is there any truth to that, for Crysis or for other games such as Batman, etc.?
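If the claim is true, the mechanism wouldn't need to be anything exotic. Here's a hypothetical sketch of what it could amount to (all names invented for illustration; this is not actual Crysis or CryEngine code):

#include <cstdint>
#include <string>

// Real PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD.
constexpr uint32_t kVendorNvidia = 0x10DE;
constexpr uint32_t kVendorAti    = 0x1002;

// Hypothetical: choose which settings profile the engine loads by default.
// If the fallback profile leaves performance on the table for one vendor,
// "correcting" it by hand (as the NGOHQ post describes) would recover it.
std::string PickDefaultProfile(uint32_t vendorId) {
    if (vendorId == kVendorNvidia)
        return "config/profile_tuned.xml";       // vendor-tuned fast path
    return "config/profile_conservative.xml";    // everyone else, ATi included
}

Nothing in that sketch is sinister on its own; the question is whether the conservative defaults are accidental or paid for.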
October 31, 2009 3:34:12 PM

Here's the issue he's talking about, which does seem underhanded: what they are doing is mapping nV's AA to the lower-quality Quincunx AA instead of the equivalent full MSAA. It's a floptimization that makes nVidia players think they are getting the same AA level as ATi cards when they are actually getting less, unless they enable the 16xQ or higher option:

http://www.ngohq.com/skds-corner/14519-ati-struggle-in-...

You can achieve similar IQ between cards, but you just need to know what you're really clicking on.
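To make that concrete, here is a hypothetical sketch (invented names, not any game's actual code) of how a single menu setting can be bound to different hardware modes per vendor:

#include <cstdint>

// Hypothetical illustration of the Quincunx substitution described above.
// Quincunx is a real NVIDIA mode: 2x MSAA plus a blur filter, cheaper
// but softer than true multisampling.
enum class AaMode { Msaa8x, Quincunx };

AaMode ResolveAaSetting(uint32_t vendorId, int uiLevel) {
    const bool isNvidia = (vendorId == 0x10DE);  // PCI vendor ID for NVIDIA
    if (isNvidia && uiLevel == 8)
        return AaMode::Quincunx;  // menu still says "8x", IQ is lower
    return AaMode::Msaa8x;        // true 8x MSAA for everyone else
}

Same label in the menu, different image quality on screen; a benchmark that charts FPS at "8x AA" quietly rewards the cheaper mode.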

Optimizations are a tricky situation. Because the architectures are so different, the underlying development can be geared toward one method or another. The best-known case is Doom 3, where development was geared toward the nV architecture, going so far as to have an NV30 path that simply replaced the ARB2 path, errr... was folded into the ARB2 path, and thus the performance was greatly different. But if you tweaked how the cards handled Z, you got a huge boost on the ATi cards, which were being held up by a preferential method. For a same-generation counterpoint, look at HL2 and its choice of int16 and FP24 (instead of FP16/FP32, which would have had to run as FP24 on ATi hardware). Now, this didn't cripple nV cards outright, but because they handled FP24 as slowly as their FP32, and didn't get the usual FP16 speed doubling, they were noticeably slower; the alternatives were to drop IQ down to int16, internally dumb precision down to FP16, or do what they finally did, which was mix FP32 and int16. In both cases it was the best choice for the developer's preferred hardware and wasn't meant to hurt the opposition, but that's what it ended up doing, and no amount of renaming would fix that.
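For reference, the numbers behind that precision trade-off (mantissa widths from the published formats: FP16 = s10e5, ATi R3xx's FP24 = s16e7, FP32 = s23e8):

#include <cmath>
#include <cstdio>

int main() {
    // Machine epsilon = 2^-mantissa_bits: the smallest relative step
    // between adjacent representable values in each shader format.
    const struct { const char* name; int mantissaBits; } fmts[] = {
        { "FP16", 10 },  // s10e5
        { "FP24", 16 },  // s16e7, the ATi R3xx pixel pipeline format
        { "FP32", 23 },  // s23e8, IEEE single precision
    };
    for (const auto& f : fmts)
        printf("%s: epsilon = 2^-%d = %.3g\n",
               f.name, f.mantissaBits, std::ldexp(1.0, -f.mantissaBits));
    return 0;
}

That prints roughly 9.77e-04, 1.53e-05, and 1.19e-07: FP24 is 64x finer than FP16 and 128x coarser than FP32, which is why FP24 was a comfortable floor for ATi while nV paid full FP32 cost without getting FP16's speed doubling.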
However, this is more an issue of game/hardware detection. While most stories are about disabling optimizations or floptimizations (like FartCry.exe), this is a case where the software maker detects a competitor's card and disables features, which is hard to defend if the feature works on the competitor's cards. EIDOS claims it's 'untested', but it doesn't seem too hard to test what most gamers have essentially been doing anyway. It also raises the question of whether a game is compliant if it intentionally disables features for other IHVs while still claiming full DX compliance.
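The detection half of that is trivial and uses a documented API. A minimal Direct3D 9 sketch; the API calls are real, and only the branch at the end is the hypothetical (and contested) part:

#include <cstdio>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main() {
    // Ask D3D9 who made the primary display adapter.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD.
        printf("GPU: %s (vendor 0x%04lX)\n", id.Description,
               (unsigned long)id.VendorId);
        if (id.VendorId == 0x1002) {
            // A title could branch here, e.g. grey out its AA option;
            // that is exactly the behaviour being criticised in this thread.
        }
    }
    d3d->Release();
    return 0;
}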

It's different when ATi, nV or S3 do it themselves, as they should do things specific to their hardware; but games should be built to standards for all, without vendor-specific optimizations and restrictions built into them.
October 31, 2009 3:41:06 PM

And you KNOW this, do you? And you're not jumping to conclusions?

Nvidia OFFERS devs engineers to help OPTIMIZE games for their cards and drivers... ATI does NOT.

It's like building an engine for a car and expecting it to run perfectly right away... which it won't. It needs tweaks and mods, and then it runs perfectly; without those tweaks it would run like a dog.

It's the same for games... the game engine gets the tweaks and mods needed to run on certain hardware, which Nvidia has done. ATI didn't... what results would you expect?
October 31, 2009 3:57:41 PM

Looks to me like Nvidia also offers devs lots of $ to sabotage its competition.
AA is one of the most basic features; it would run fine on ATI cards without the need for any developer 'optimizations'!

You really are a naive Nvidiot!
October 31, 2009 4:06:36 PM

smoggy12345 said:
And you KNOW this, do you? And you're not jumping to conclusions?


Who are you talking to and about what? 'Cause you're about to put your foot in it.

Quote:
Nvidia OFFERS devs engineers to help OPTIMIZE games for their cards and drivers... ATI does NOT.


Spoken like an ignorant F'in fanboi. ATi does; its program is much less effective (evident in how commonly people assume there is no counterpart to TWIMTBP), but it does offer help and hardware for game development. Use Google to edjumakate yourself and get in the game, you knob!
October 31, 2009 4:31:55 PM

I have been playing Borderlands for a considerable number of hours with DirectX 10 enabled in the .ini file (along with other high-quality settings not available in-game) on my HD 4890, and I'll be damned if I can see any frame stutter at 1920x1080. Maybe it's because I also disabled the TWIMTBP splash screen? :D
October 31, 2009 7:16:36 PM

smoggy12345 said:
And you KNOW this, do you? And you're not jumping to conclusions?

Nvidia OFFERS devs engineers to help OPTIMIZE games for their cards and drivers... ATI does NOT.


Yes, but take a look at the logic attributed to Batman: Arkham Asylum, for instance:

if (deviceId == VENDOR_ATI)   // PCI vendor 0x1002
{
    AA = disabled;            // the in-game AA option goes away
}



What does this have to do with optimizing games for NVIDIA cards and drivers??? :o

This clearly points to an anti-competitive practice by NVIDIA's TWIMTBP program. The check is not necessary at all, and it is just like Assassin's Creed's controversial patch that removed DirectX 10.1 to put ATI at a disadvantage. It is silly to believe that removing DirectX 10.1 would optimize the game better; actually, it's the opposite: DX 10.1 is what gave Assassin's Creed its extra speed. So why remove DX 10.1 from Assassin's Creed, and why disable the AA option when the user has an ATI video card???? :ouch:

I don't mind NVIDIA paying game developers to optimize games, but what they did here is different; it's an anti-competitive practice. :o

Can we now conclude that NVIDIA is engaging in anti-competitive practices against ATI??? :heink:
October 31, 2009 8:05:40 PM

Surely the easiest solution would be for ATi to put some money on the table to have its 'Get In The Game' logo splashed across our screens a lot more often; then it could have those games optimised for its cards whilst locking out the opposition.
October 31, 2009 8:25:29 PM

Quote:
I think maybe hiring someone to stick a gun to the game devs' heads and tell them not to be greedy, lazy arseholes would be cheaper.

True, just a lot less legal.
October 31, 2009 8:36:55 PM

Techno-boy said:

Can we now conclude that NVIDIA is engaging in anti-competitive practices against ATI??? :heink:


IMO, yes. Any way you look at it, it seems that some game developers favor Nvidia for the extra money in return for performance favors, which is not a good thing to do. The only reason I think nvidia is doing this sort of thing is to maintain its profits and market share by showing off how good its video cards are, when we already know it is giving some game developers money to make its products look superior to ATi's, when actually it's the opposite.

Quote:
AC was claimed to have artifacts that they couldn't be arsed to fix, so they decided to remove it [DX10.1] completely, although some people never saw them.


True, dude; nvidia has its own dirty way of manipulating results, and its own clean getaway excuses to back it up. Clever, but not that clever, because what has it got right now? ATi's 5000 series success slammed in nvidia's face big time. As time goes by, all nvidia can do is announce blah-blah-blah bullshit, publish 'superior' features of its GT300 video cards that no one knows are true or just plain bullshit, and pay some online tech sites to run biased benchies making the competition's cards look inferior. And even if those features are real, they can also be achieved with ATi's non-proprietary technologies.

-- Perhaps nVidia doesn't know the saying "Bullshit will get you to the top, but it won't keep you there long"...
November 1, 2009 12:40:55 AM

smoggy12345 said:
Nvidia OFFERS devs engineers to help OPTIMIZE games for their cards and drivers... ATI does NOT.

Optimisations can only go so far. You cannot bring an older generation well ahead of a clearly superior newer one with just a few game tweaks for your own hardware. Would you still say this is merely good optimisation if they threw in a 7900GS and it beat the HD5850?

November 1, 2009 1:32:59 AM

Seriously, that smoggy guy killed it.
I didn't even feel like answering this time.

We're not talking about optimization here, sir; we're talking about sabotage.
Get over it: your company is the most anti-competitive one on the whole planet.
November 1, 2009 8:43:33 PM

Hey, what, no one has anything to say?
That's the biggest low blow in the history of mankind, people O.O
November 1, 2009 9:12:42 PM

It's called business, and if I were running Nv I would be a lot more draconian, I can assure you.
November 1, 2009 9:57:47 PM

So are the AMD/ATI fanboys going to boycott the 3 or 4 great new games a year? Is it surprising the 9800 is competitive when ATI's great, unbelievable new 5700 series cards are truly sideways upgrades that offer no extra performance, except for vaporware DX11, which does not exist in any form? When someone chimes in with power savings and lower heat, I'm going to LOL milk out of my nose. Yes, over the course of a year, this card will save you 57 cents. I'M GOING TO UPGRADE ~~!!!!
November 1, 2009 10:22:34 PM

notty22 said:
So are the AMD/ATI fanboys going to boycott the 3 or 4 great new games a year? Is it surprising the 9800 is competitive when ATI's great, unbelievable new 5700 series cards are truly sideways upgrades that offer no extra performance, except for vaporware DX11, which does not exist in any form? When someone chimes in with power savings and lower heat, I'm going to LOL milk out of my nose. Yes, over the course of a year, this card will save you 57 cents. I'M GOING TO UPGRADE ~~!!!!


There, there (*pats on the back*), things will be OK (maybe) with your beloved Nvidia ;)
November 1, 2009 10:34:44 PM

notty22 said:
So are the AMD/ATI fanboys going to boycott the 3 or 4 great new games a year? Is it surprising the 9800 is competitive when ATI's great, unbelievable new 5700 series cards are truly sideways upgrades that offer no extra performance, except for vaporware DX11, which does not exist in any form? When someone chimes in with power savings and lower heat, I'm going to LOL milk out of my nose. Yes, over the course of a year, this card will save you 57 cents. I'M GOING TO UPGRADE ~~!!!!

Your point might have been minutely valid if the 5700 series were being discussed; however, since we are talking about the 5800 series, your post is meaningless.
November 1, 2009 10:40:21 PM

Wake up. I believe ATI and NVIDIA as a whole were being discussed, and the OP's opening salvo spoke of an nvidia 9800, a 95-dollar card, versus what? You're discussing a 5800 series card that starts at $289. It's pretty sad when this board so obviously has an AMD fanboy MOD. lol
So keep your digs to yourself.
November 1, 2009 10:42:33 PM

notty22 said:
So are the AMD/ATI fanboys going to boycott the 3 or 4 great new games a year? Is it surprising the 9800 is competitive when ATI's great, unbelievable new 5700 series cards are truly sideways upgrades that offer no extra performance, except for vaporware DX11, which does not exist in any form? When someone chimes in with power savings and lower heat, I'm going to LOL milk out of my nose. Yes, over the course of a year, this card will save you 57 cents. I'M GOING TO UPGRADE ~~!!!!


Euh... I don't think you understand.
This is IN NO WAY proof of nvidia's overall technology.
It only PROVES that nvidia is PAYING game companies to screw ATI cards.
It proves that nvidia has no REAL weapon against the awesome new 5800 series.
Because if it did, why would it waste even more money trying to make old cards look awesome against ATI's latest tech?
And this isn't permanent, buddy. It has happened quite a few times before, when nvidia was pissed at ATI and had nothing to answer with. It generally gets sorted after one or two driver updates (one or two months), and then the REAL benchies will show that nvidia is nothing but a b!tch company.

And about the HD5770: are you kidding me? You come whining about a card that gets 100% scaling in dual-GPU (versus ~70% for the HD4870 it replaces), plus Eyefinity support, DX11 AND ATI Stream?
What's nvidia's best-known marketing technique? Isn't it renaming its cards? Why would I buy the 9800 when it's nothing but an 8800, with NO "vaporware" upgrade? This is what I hate about nvidiots: so much ego just because they have "the best performance" (also known as the best sabotage) for $200 more.
Just to add, the HD5770 delivers, on its LAUNCH driver, the same performance as the HD4870 on its LATEST driver. So keep it down a bit; ATI's driver updates are 1000x better than nvidia's (version 9.8 vs 195).
AND the price of the HD5770 will drop as soon as the HD4870 is EOL, which means pretty soon. So to you, being uncompetitive and rebranding the same card is OK, but saving some money until the old line is EOL, WHILE delivering some REAL upgrades (again, tested on the launch driver), is OMFG THE WORST BLASPHEMY IN THE HISTORY OF MANKIND.

You make me sick.
November 1, 2009 10:47:37 PM

notty22 said:
Wake up. I believe ATI and NVIDIA as a whole were being discussed, and the OP's opening salvo spoke of an nvidia 9800, a 95-dollar card, versus what? You're discussing a 5800 series card that starts at $289. It's pretty sad when this board so obviously has an AMD fanboy MOD. lol
So keep your digs to yourself.


The mods you're talking about are always against me whenever I open a thread defending AMD, so shut it right now.
The fact that this time they're taking ATI's side only proves that nvidia really was a big b!tch this time.
Face it: you're the only one saying nvidia is right. Just looking at your avatar, I can see what type of idiot you are.
November 1, 2009 10:49:23 PM

The gaming industry is losing money hand over fist right now. No one has the money to do complex engineering on gamer cards to cater to the top-percentile PC enthusiast. It's just business. That's why consoles are selling by the millions and games are being developed for them and ported to the PC. How many games a year truly stress hardware? Two, three? When did Crysis come out? There is no company conspiracy.
http://www.pcworld.com/article/174017/the_sorry_state_o...

If I had the dough, I would buy a 5850 right now myself.
November 1, 2009 10:51:24 PM

notty22 said:
It's pretty sad when this board so obviously has an AMD fanboy MOD. lol
So keep your digs to yourself.

Yes, that is why I have a GTX275. Oh whoops, I didn't mean to shut you down a second time, honest.
November 1, 2009 10:55:37 PM

notty22 said:
The gaming industry is losing money hand over fist right now. No one has the money to do complex engineering on gamer cards to cater to the top-percentile PC enthusiast. It's just business. That's why consoles are selling by the millions and games are being developed for them and ported to the PC. How many games a year truly stress hardware? Two, three? When did Crysis come out? There is no company conspiracy.
http://www.pcworld.com/article/174017/the_sorry_state_o...

If I had the dough, I would buy a 5850 right now myself.


What is with people thinking games are made for consoles and THEN ported to the PC? They're made for the PC and THEN ported to the CONSOLES. Just because they're released for the consoles earlier does not mean they're made for them. I hate to keep using this example, but Borderlands is a prime one: it came out for Xbox 360 a week earlier even though that version was the port. It costs $60 USD on the Xbox and about $40 on the PC (less if you get a four-pack on Steam). Can you guess why they do this?
November 1, 2009 10:55:48 PM

bboynatural said:
The mods you're talking about are always against me whenever I open a thread defending AMD, so shut it right now.
The fact that this time they're taking ATI's side only proves that nvidia really was a big b!tch this time.
Face it: you're the only one saying nvidia is right. Just looking at your avatar, I can see what type of idiot you are.

I take issue with you attacking other members, which is an offence that will get you banned, and if you don't like it then read the rules before you post your rant.
November 1, 2009 10:58:06 PM

I think he edited that last part in; I don't think it was there before I posted.
November 1, 2009 10:58:44 PM

Oh, here we go again.
Yes, and when all three other members are insulting me and talking nonsense, you accept it?
WTF was the point of that reply anyway? Did I commit blasphemy again?
November 1, 2009 11:22:55 PM

Not even going to bother with the rest of the thread.
