Interview with AMD DevRel

Last response: in Graphics & Displays
October 16, 2010 12:33:39 PM

http://www.kitguru.net/components/graphic-cards/ironlaw...

After reading that interview, I don't think either company really cares much about gamers, to be honest, but he did point out some valid facts about Nvidia perhaps being "too" competitive. I'm sure an Nvidia DevRel could point out just as many cases of ATI being counterproductive and hurting gamers, though honestly I can't really think of any. That is likely because in 2011-2012 most games will be made with Radeon in mind, so we will have to see how they handle that privilege; most of this interview was calling out Nvidia for abusing its lead.


October 16, 2010 1:01:55 PM

Interesting. He has a point, but I don't think that's illegal. Also, Adobe disabled all the ATI cards and all the cheaper Nvidia ones (with Nvidia's agreement) in their CS engines, forcing users to spend a lot on Nvidia's workstation cards. Adobe should be held accountable for that.

There's plenty of locked software out there. Not being able to use an application on your hardware is one thing, but messing with your hardware itself (HDCP) is far more outrageous.

October 16, 2010 1:28:02 PM

“This highlights perfectly the fundamental difference in thinking between AMD and nVidia. We would never do that”, said Huddy. “nVidia needs to learn that you should always put the gamer’s experience ahead of your own ego. Issues like the deliberate and unnecessary reduction in image quality seen in the Batman Arkham Asylum situation, shows that nVidia is willing to single out half the market and nobble their experience. That’s just not right. You should never harm PC gamers just to make yourself look good”.

Well, if this is their approach, and we know that this is how it turned out with nVidia, and we also know that for the most part ATI has had an "everyone is welcome" policy, as has AMD's CPU approach, and mostly Intel's as well, then that leaves nVidia standing alone in this regard.