More dirty tricks from Nvidia?

http://hardocp.com/news/2010/10/20/benchmark_wars

Quote:
It has come to our attention that you may have received an early build of a benchmark based on the upcoming Ubisoft title H.A.W.X. 2. I'm sure you are fully aware that the timing of this benchmark is not coincidental and is an attempt by our competitor to negatively influence your reviews of the AMD Radeon™ HD 6800 series products.


Quote:
AMD has demonstrated to Ubisoft tessellation performance improvements that benefit all GPUs, but the developer has chosen not to implement them in the preview benchmark.


Quote:
Is Ubisoft in NVIDIA's pocket and pushing technology while ignoring the company with the largest DX11 market share?
  1. This sort of thing has been happening for a very long time... :)
  2. Seems similar to the Dirt 2 benchmark that was released last year, in which SLI didn't work, but no one complained about that; perhaps because it was an ATi sponsored title? Either way, if it's an early build then, just like the Final Fantasy benchmark, why be surprised if it's not working properly?
  3. Mousemonkey said:
    Seems similar to the Dirt 2 benchmark that was released last year, in which SLI didn't work, but no one complained about that; perhaps because it was an ATi sponsored title? Either way, if it's an early build then, just like the Final Fantasy benchmark, why be surprised if it's not working properly?


    Crossfire didn't work with the Dirt 2 benchmark either.
  4. eyefinity said:
    Crossfire didn't work with the Dirt 2 benchmark either.

    So this is just another broken benchmark; what's the big deal?
  5. Mousemonkey said:
    So this is just another broken benchmark; what's the big deal?



    I think it's a big deal if AMD offers to help Ubisoft and they refuse, and then Nvidia tries to pressure review sites into using HAWX 2 (a benchmark, not the full game, mind you).

    It will be interesting to see which sites use it; that way we'll know for sure which of those are in Nvidia's pockets too.
  6. It's no secret that Ubisoft and Nvidia have always worked closely together; remember Ghost Recon 2 and the PhysX level?
  7. I remember Assassin's Creed mysteriously losing its DX10.1. That was Ubisoft, right?

    The sad thing is, Nvidia would rather throw millions at bribing games companies instead of making better graphics cards. This stuff is just going to get worse over the coming years.
  8. eyefinity said:
    I remember Assassin's Creed mysteriously losing its DX10.1. That was Ubisoft, right?

    The sad thing is, Nvidia would rather throw millions at bribing games companies instead of making better graphics cards. This stuff is just going to get worse over the coming years.

    According to an article published at the time, "no money changed hands", but if you have evidence to the contrary then please post it.

    http://techreport.com/discussions.x/14707
  9. Mousemonkey said:
    According to an article published at the time, "no money changed hands", but if you have evidence to the contrary then please post it.

    http://techreport.com/discussions.x/14707


    Somebody commented on that part.

    Quote:
    No money changed hands? Only fools will be convinced. Ever heard of value-in-kind sponsorships? No cash may be transferred, but those kinds of sponsors reap benefits in the 6-figure ballpark at least!


    Nvidia can pull this crap all they want; just be sure that any website using this HAWX 2 demo benchmark is also in Nvidia's pockets.
  10. This is what corporate business is all about: trying to one-up the other company no matter the cost so as to sell more products. ATI and NVidia have been doing this for quite some time and I don't expect it to stop.
  11. eyefinity said:
    The sad thing is, Nvidia would rather throw millions at bribing games companies instead of making better graphics cards.

    Nvidia makes the single best graphics card on the planet.
  12. There was another mini controversy with HAWX 1: it featured DX10.1, which only AMD had. So it's another fantasy that Nvidia is the bad guy here; AMD needs to work on their product or just make a reviewer guide, lol
  13. 17seconds said:
    Nvidia makes the single best graphics card on the planet.


    Haha. Good one. :lol:
  14. notty22 said:
    There was another mini controversy with HAWX 1: it featured DX10.1, which only AMD had. So it's another fantasy that Nvidia is the bad guy here; AMD needs to work on their product or just make a reviewer guide, lol


    AMD has the better technology, and they know Ubisoft is bent as hell, so they keep making their games better for them, just to see how low they and Nvidia will both stoop.
  15. AMD make games now?
  16. @ eyefinity, this is just business. Tactics....

    Nvidia are trying to get more custom by making their products look superior to AMD's; I'd be surprised if they didn't try these types of tactics.

    Underhanded? IMO, yes. Good business sense? Yes.
  17. I found this interesting link from last week.

    http://www.kitguru.net/components/graphic-cards/ironlaw/nvidia-offers-the-constant-smell-of-burning-bridges-says-amd/

    Quote:
    “I’m confused as to why you’d want to upset people. Historically, with nVidia, there’s the constant smell of burning bridges. Over time, they seem to damage relationships everywhere and, in the long term, it hampers them”


    Quote:
    How many of these top studios does AMD work with ? “All of them. My team is scattered across the globe and, between us, we’re fluent in more than 20 human languages. Any developer that will impact the market, regardless of where they are based or how large a company they are, will get help from AMD”.


    Quote:
    “We’re speaking with every development studio in the world that’s likely to create a piece of software that makes it into the charts. All of them are telling us the same thing. nVidia is pushing a single message and that’s tessellation”


    Quote:
    “Tessellation is about enriching detail, and that’s a good thing, but nVidia is pushing to get as much tessellation as possible into everything”.


    Quote:
    "You should never harm PC gamers just to make yourself look good”.


    Wise words.
  18. eyefinity said:
    I found this interesting link from last week.

    Quote:
    “We’re speaking with every development studio in the world that’s likely to create a piece of software that makes it into the charts. All of them are telling us the same thing. nVidia is pushing a single message and that’s tessellation”


    Quote:
    “Tessellation is about enriching detail, and that’s a good thing, but nVidia is pushing to get as much tessellation as possible into everything”.


    Quote:
    "You should never harm PC gamers just to make yourself look good”.


    Wise words.

    So improved graphics is a bad thing?
    "Tessellation will allow much higher quality rendering and animation at fairly low GPU compute cost. The generic rule here is the more tessellation, the slower the GPU gets, yet since there's now dedicated core logic for it on the GPU, it's fast and can boost your detail massively, thus giving an impression of sharpness and much finer quality."
    http://www.guru3d.com/article/geforce-gtx-470-480-review/10
  19. I'm sure I read something recently about ATI/AMD messing with something in the drivers that Nvidia drivers didn't support.
    It was a lowering of some setting or other, I can't remember exactly. Anyone know what it was?
    Anyway, they hacked the Nvidia drivers and it seems that in some cases they are getting an advantage by doing whatever it is they are doing. Damn, wish I could find it.
    Maybe this is Nvidia trying to even things up a bit?

    Mactronix :)
  20. 17seconds said:
    So improved graphics is a bad thing?
    "Tessellation will allow much higher quality rendering and animation at fairly low GPU compute cost. The generic rule here is the more tessellation, the slower the GPU gets, yet since there's now dedicated core logic for it on the GPU, it's fast and can boost your detail massively, thus giving an impression of sharpness and much finer quality."
    http://www.guru3d.com/article/geforce-gtx-470-480-review/10



    Quote:
    “Overall, the polygon [Triangle - Ed] size should be around 8-10 pixels in a good gaming environment”, said Huddy. “You also have to allow for the fact that everyone’s hardware works in quads. Both nVidia and AMD use a 2×2 grid of pixels, which are always processed as a group. To be intelligent, a triangle needs to be more than 4 pixels big for tessellation to make sense”.

    Interestingly enough, but why are we being told this? “With artificial tests like Stone Giant, which was paid for by nVidia, tessellation can be done down to the single pixel level. Even though that pixel can’t be broken away from the 3 other pixels in its quad. Doing additional processing for each pixel in a group of 4 and then throwing 75% of that work away is just sad”.


    You see what Nvidia is doing? Fermi, for want of a better word, "over-tessellates". It is capable of doing more tessellation than is needed. You cannot tell the difference between 4-pixel tessellation and single-pixel tessellation.

    What this means is, Nvidia is deliberately slowing down their own cards because it slows down AMD cards more. Why? Because AMD cards don't tessellate at such tiny levels, because there is no need.

    This is why we see Fermi "winning" at extreme tessellation even though there is practically no visible difference between medium and extreme. It slows down AMD cards a lot, and Fermi a bit less, but still more than is needed.
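
    To put rough numbers on the quad argument above, here's a minimal back-of-the-envelope sketch (my own illustration, not from the article, assuming every covered pixel forces its whole 2x2 quad to be shaded):

    # Illustrative only: shading waste from 2x2 quads, per Huddy's description.
    QUAD_SIZE = 4  # pixels per quad

    def wasted_fraction(covered_pixels_in_quad):
        """Fraction of shaded pixels that are thrown-away helper pixels."""
        covered = min(covered_pixels_in_quad, QUAD_SIZE)
        return (QUAD_SIZE - covered) / QUAD_SIZE

    for covered in (1, 2, 3, 4):
        print(covered, "of 4 pixels covered ->",
              format(wasted_fraction(covered), ".0%"), "of the work wasted")
    # 1 of 4 pixels covered -> 75% wasted: the single-pixel-triangle case from the quote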
  21. mactronix said:
    I'm sure I read something recently about ATI/AMD messing with something in the drivers that Nvidia drivers didn't support.
    It was a lowering of some setting or other, I can't remember exactly. Anyone know what it was?
    Anyway, they hacked the Nvidia drivers and it seems that in some cases they are getting an advantage by doing whatever it is they are doing. Damn, wish I could find it.
    Maybe this is Nvidia trying to even things up a bit?

    Mactronix :)


    You're talking about AMD's so-called "cheat", according to Nvidia.

    Read this.

    http://www.atomicmpc.com.au/Feature/232215,ati-cheating-benchmarks-and-degrading-game-quality-says-nvidia.aspx/1

    Quote:
    Given that in their own documents, NVIDIA indicates that the R11G11B10 format "offers the same dynamic range as FP16 but at half the storage" it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative.



    Quote:
    Atomic: Why is it considered incorrect to test with Catalyst AI enabled at the launch of the GTS450, considering it has been included in AMD drivers for over five years?


    I wonder why. Dirty tricks are all Nvidia can do these days.
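
    For anyone wondering what that R11G11B10 quote is actually about, here's a minimal sketch (my own illustration, using the standard bit layouts of these formats, not anything from the article):

    # Illustrative only: per-pixel storage of two HDR render target formats.
    # FP16 channels are sign/exponent/mantissa = 1/5/10 bits; R11G11B10
    # drops the sign bit and shrinks the mantissa, keeping 5 exponent bits.
    fp16_rgba_bits = 4 * (1 + 5 + 10)             # 64 bits per pixel (RGBA16F)
    r11g11b10_bits = (5 + 6) + (5 + 6) + (5 + 5)  # 32 bits per pixel

    print("FP16 RGBA:", fp16_rgba_bits, "bits/pixel")
    print("R11G11B10:", r11g11b10_bits, "bits/pixel (half the storage)")
    # Both keep 5 exponent bits per channel, so the representable range
    # ("dynamic range") is the same; R11G11B10 gives up the sign bit and
    # mantissa precision, which is where any image quality argument lies.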
  22. When are you going to jump off AMD's manhood and look at how the world really is? Competition breeds this kind of stuff, from BOTH SIDES. I'm pretty sure ATI spewed that their DX11 cards were so great because of tessellation, but now that nVidia has upped the bar and can handle it better, it is deemed "unnecessary"? Get over yourself...
  23. 96Firebird said:
    When are you going to jump off AMD's manhood and look at how the world really is? Competition breeds this kind of stuff, from BOTH SIDES. I'm pretty sure ATI spewed that their DX11 cards were so great because of tessellation, but now that nVidia has upped the bar and can handle it better, it is deemed "unnecessary"? Get over yourself...

    Yeah, I would have to say all this hype over the 6800 series is really all about AMD trying to catch up to Nvidia in tessellation performance, in the popularity of the GTX460, and in reclaiming the top performance crown from the GTX480. I think we will just keep going back and forth with each new release.
  24. Yep, that's the one.
    Basically Nvidia sent a pack out with its cards pointing out that these games would run better on ATI hardware due to the FP16 demotion. They made a big thing out of nothing and now the boot is on the other foot. This happens all the time; nothing new to see here people, move along.
    Personally I think both teams should send out technical demos of what the cards can do on fully optimised code.
    I'm not trying to defend either party here, just saying it's the same old tit for tat we have been seeing for years, and instead of bitching about it they should each take a set of games, the same games hopefully, and fully optimise a system to get the most out of them.
    They should run highest quality tests as well as best performance tests. Optimise/Floptimise to your heart's content and let people decide what they want.

    Mactronix :)
  25. AMD/Nvidia, in the end, does it really matter? Figure out how much $ you want to spend on a video card, pick the card(s) that will give you the best value/performance for the games you play (based on what your budget is), buy that card. It really is that simple...

    If you will only buy Nvidia or AMD then you are only lessening your choices.
  26. eyefinity said:
    I found this interesting link from last week.

    http://www.kitguru.net/components/graphic-cards/ironlaw/nvidia-offers-the-constant-smell-of-burning-bridges-says-amd/

    Quote:
    “I’m confused as to why you’d want to upset people. Historically, with nVidia, there’s the constant smell of burning bridges. Over time, they seem to damage relationships everywhere and, in the long term, it hampers them”


    Quote:
    How many of these top studios does AMD work with ? “All of them. My team is scattered across the globe and, between us, we’re fluent in more than 20 human languages. Any developer that will impact the market, regardless of where they are based or how large a company they are, will get help from AMD”.


    Quote:
    “We’re speaking with every development studio in the world that’s likely to create a piece of software that makes it into the charts. All of them are telling us the same thing. nVidia is pushing a single message and that’s tessellation”


    Quote:
    “Tessellation is about enriching detail, and that’s a good thing, but nVidia is pushing to get as much tessellation as possible into everything”.


    Quote:
    "You should never harm PC gamers just to make yourself look good”.


    Wise words.


    R. Huddy has been on the anti-Nvidia trip ever since his employment with them ended; I for one don't give a rat's what he says as it must be BS. Does he not say elsewhere in that interview that all games developed until 2012 will be developed on ATi hardware?
    Quote:
    “With one exception, every game released through to 2012 will have been developed on AMD’s Radeon”
    And if that's the case then H.A.W.X. 2 must be one of those games, so why the noise about "dirty tricks"?
  27. Mousemonkey said:
    R. Huddy has been on the anti-Nvidia trip ever since his employment with them ended; I for one don't give a rat's what he says as it must be BS. Does he not say elsewhere in that interview that all games developed until 2012 will be developed on ATi hardware?
    Quote:
    “With one exception, every game released through to 2012 will have been developed on AMD’s Radeon”
    And if that's the case then H.A.W.X. 2 must be one of those games, so why the noise about "dirty tricks"?


    Maybe HAWX 2 was the game? :bounce:

    Games can also be developed on AMD hardware and still be subject to Nvidia's dirty tricks, btw, and most people who aren't single-minded would be able to figure that out for themselves.

    Developed doesn't mean uniquely developed, or proprietarily developed. That's Nvidia's expertise. And Apple's too. AMD believes in open development and open standards.
  28. A game developer releases a new working benchmark build of a sequel game.
    There are two major discrete video card makers.
    They both send emails.
    One is to alert people that there is a new DX11 game benchmark.
    The other is to insist AMD's new cards not be tested.
    Who sounds more dirty?

    AMD is going to release a driver that will 'fix' performance in this benchmark.
    Do you know how they will do that?
    Most likely through subtle image loss:
    adaptive tessellation, low-level implementation for 'far away objects'.
  29. eyefinity said:
    Maybe HAWX 2 was the game? :bounce:

    Games can also be developed on AMD hardware and still be subject to Nvidia's dirty tricks, btw, and most people who aren't single-minded would be able to figure that out for themselves.

    Developed doesn't mean uniquely developed, or proprietarily developed. That's Nvidia's expertise. And Apple's too. AMD believes in open development and open standards.


    I think you'll find it was Crysis 2; the rest is just the ramblings of a conspiracy theorist who would believe that everyone is out to get AMD.
  30. Mousemonkey said:
    I think you'll find it was Crysis 2; the rest is just the ramblings of a conspiracy theorist who would believe that everyone is out to get AMD.


    I thought Nvidia didn't buy games?
  31. notty22 said:
    A game developer releases a new working benchmark build of a sequel game.
    There are two major discrete video card makers.
    They both send emails.
    One is to alert people that there is a new DX11 game benchmark.
    The other is to insist AMD's new cards not be tested.
    Who sounds more dirty?

    AMD is going to release a driver that will 'fix' performance in this benchmark.
    Do you know how they will do that?
    Most likely through subtle image loss:
    adaptive tessellation, low-level implementation for 'far away objects'.
    http://img192.imageshack.us/img192/2006/170805bw2q2b1fwstjgbgt.jpg


    http://www.rage3d.com/board/showpost.php?p=1336398700&postcount=1376

    Quote:
    The issue here is HAWX 2 is using a very high level of tessellation, creating very small polys - 6 pixels, in fact (per NV email). This is below the optimal efficiency threshold for GPUs that process four quads of pixel groupings - i.e. NVIDIA and AMD current designs. You know quad quads better as 16px/clock, in specifications.

    So when you tessellate down below 16 pixels, you create a scenario where the rasterizer ('pixel processor') has to reprocess pixels because it's got a new poly in it, despite the fact it's already been processed. This stalls the rasterizer pipeline and causes problems.

    This would be fine and down to 'architecture implementation of specification' if it weren't for the fact that you can't actually tell the difference as tessellation factors go up, especially if you have an HD display.


    Proof from the experts that Nvidia is shafting YOU, because it helps to shaft AMD more.

    You lose, AMD buyers lose. Everybody loses except lying, cheating Nvidia. THE WAY IT'S MEANT TO BE PLAYED
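
    To put rough numbers on that quote, here's a minimal sketch (my own simplified model, assuming the rasterizer handles one triangle per clock and emits at most 16 pixels per clock; real hardware is more complicated):

    # Illustrative only: utilisation of a 16 px/clock rasterizer as
    # triangles shrink below the four-quad (16 pixel) grouping.
    RASTER_WIDTH = 16

    def utilisation(pixels_per_triangle):
        """Fraction of the rasterizer's per-clock width doing useful work."""
        return min(pixels_per_triangle, RASTER_WIDTH) / RASTER_WIDTH

    for size in (64, 16, 10, 6, 1):
        print(size, "px triangles ->",
              format(utilisation(size), ".0%"), "raster utilisation")
    # The ~6-pixel polys from the NV email land at ~38%: most of the
    # rasterizer's throughput stalls, on both vendors' designs.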
  32. eyefinity said:
    I thought Nvidia didn't buy games?

    They bought marketing rights, just like AMD have bought for every other game developed until 2012 according to Richard Huddy.
    Quote:
    What Nvidia will get for its buck is marketing rights
  33. Mousemonkey said:
    They bought marketing rights, just like AMD have bought for every other game developed until 2012 according to Richard Huddy.


    Where did he say that?
  34. eyefinity said:
    Where did he say that?

    Quote:
    “With one exception, every game released through to 2012 will have been developed on AMD’s Radeon”
  35. Mousemonkey said:
    Quote:
    “With one exception, every game released through to 2012 will have been developed on AMD’s Radeon”


    I don't see where it says AMD bought marketing rights for every other game until 2012.

    You realise that most software companies use AMD hardware because Fermi was 9 months late, right? :lol: Or maybe you think they should have kept making games on their GTX 285s instead of the much faster 5870?
  36. eyefinity said:
    I don't see where it says AMD bought marketing rights for every other game until 2012.

    You realise that most software companies use AMD hardware because Fermi was 9 months late, right? :lol: Or maybe you think they should have kept making games on their GTX 285s instead of the much faster 5870?

    Then you must be blind to the AMD logo appearing on the splash screens of MOH and BB2, not to mention Dirt2 and AvP? Or did you think marketing means something else?
  37. Mousemonkey said:
    Then you must be blind to the AMD logo appearing on the splash screens of MOH and BB2, not to mention Dirt2 and AvP? Or did you think marketing means something else?


    Shall we both count up the number of games that have AMD and Nvidia logos? Or would you just like to abandon that laughable theory before we both waste a lot of time getting to the obvious conclusion that Nvidia "spends" a lot more on that side of things?
  38. eyefinity said:
    Shall we both count up the number of games that have AMD and Nvidia logos? Or would you just like to abandon that laughable theory before we both waste a lot of time getting to the obvious conclusion that Nvidia "spends" a lot more on that side of things?

    R. Huddy is the one saying that all games with the exception of one have the backing of AMD, something that you don't want to agree with, so why the sudden call for a truce? Do you feel you are losing your argument? :whistle:
  39. Mousemonkey said:
    R. Huddy is the one saying that all games with the exception of one have the backing of AMD, something that you don't want to agree with, so why the sudden call for a truce? Do you feel you are losing your argument? :whistle:


    Truce? What a warped sense of humour you have.

    Just to humour you I'll let you count up the games that have AMD or ATI logos taking up loading time, then I'll do the same with Nvidia ones (until I've "beaten" your score).

    You still don't get the difference between Nvidia bribery and AMD hardware do you? Software developers use superior AMD hardware, because it's...better. They aren't going to say no to Nvidia bribe money just because of that.

    I think all of us realise this is what Huddy meant. :whistle:
  40. eyefinity said:
    Truce? What a warped sense of humour you have.

    Just to humour you I'll let you count up the games that have AMD or ATI logos taking up loading time, then I'll do the same with Nvidia ones (until I've "beaten" your score).

    You still don't get the difference between Nvidia bribery and AMD hardware do you? Software developers use superior AMD hardware, because it's...better. They aren't going to say no to Nvidia bribe money just because of that.

    I think all of us realise this is what Huddy meant. :whistle:

    [:mousemonkey:5]
  41. FLAME ON! :lol: Anyway, it's good business by Nvidia, bad morals, but come on, what company has good morals? The way I read it, Huddy said "You should never harm PC gamers, unless you won't get caught or it makes you more money". It's how the world of business works. Nvidia is behind on hardware, so they use the money they have to buy out software. Stem the win AMD has achieved. Granted, it does seem to me the new way of doing things is better, 16-pixel polygons and such, so if Nvidia is making a real effort to block it so that things are inefficient and their raw tessellating power pushes it through, then I'd be a bit pissed. Same if AMD did something to block Nvidia from games or kill off PhysX. I'm all for good business, as long as it doesn't get in the way of good progress.
  42. ares1214 said:
    FLAME ON! :lol: Anyway, it's good business by Nvidia, bad morals, but come on, what company has good morals? The way I read it, Huddy said "You should never harm PC gamers, unless you won't get caught or it makes you more money". It's how the world of business works. Nvidia is behind on hardware, so they use the money they have to buy out software. Stem the win AMD has achieved. Granted, it does seem to me the new way of doing things is better, 16-pixel polygons and such, so if Nvidia is making a real effort to block it so that things are inefficient and their raw tessellating power pushes it through, then I'd be a bit pissed. Same if AMD did something to block Nvidia from games or kill off PhysX. I'm all for good business, as long as it doesn't get in the way of good progress.

    What are you going on about?
    Nvidia is NOT BLOCKING ANYTHING, facepalm; it's AMD who is behind in hardware on this issue. AMD does not like the way tessellation was implemented in the benchmark.
    Nvidia did not make it; they are just tickled pink that it exposes AMD weaknesses.
    It's up to you or me or the game tester to decide its relevance.
  43. I think some folk are just getting upset that their favoured company hasn't been putting any money into games development until now and that by crying foul it will somehow make up for it! :lol:
  44. notty22 said:
    What are you going on about?
    Nvidia is NOT BLOCKING ANYTHING, facepalm; it's AMD who is behind in hardware on this issue. AMD does not like the way tessellation was implemented in the benchmark.
    Nvidia did not make it; they are just tickled pink that it exposes AMD weaknesses.
    It's up to you or me or the game tester to decide its relevance.


    Well, aren't we testy today :pfff: Now you show me, how many games support DX11? Tessellation? Is the tessellation efficient? Does it seem like AMD might be able to improve on it? Did you see the "ifs" I put in there?

    Quote:
    FLAME ON! :lol: Anyway, it's good business by Nvidia, bad morals, but come on, what company has good morals? The way I read it, Huddy said "You should never harm PC gamers, unless you won't get caught or it makes you more money". It's how the world of business works. Nvidia is behind on hardware, so they use the money they have to buy out software. Stem the win AMD has achieved. Granted, it does seem to me the new way of doing things is better, 16-pixel polygons and such, so if Nvidia is making a real effort to block it so that things are inefficient and their raw tessellating power pushes it through, then I'd be a bit pissed. Same if AMD did something to block Nvidia from games or kill off PhysX. I'm all for good business, as long as it doesn't get in the way of good progress.


    I'm not saying Nvidia is; I'm saying if they do eventually.
  45. Mousemonkey said:
    I think some folk are just getting upset that their favoured company hasn't been putting any money into games development until now and that by crying foul it will somehow make up for it! :lol:


    That shouldn't be directed at me... :non: In any event, I don't have a problem with Nvidia pouring a lot of money into SW to make up for a pretty bad year in HW, even if they sabotage AMD a bit, but what I don't want to see is something that's far more efficient being wasted in the event Nvidia tries to block it to keep the advantage with their cards, even though everyone is losing.
  46. ares1214 said:
    Well, aren't we testy today :pfff: Now you show me, how many games support DX11? Tessellation? Is the tessellation efficient? Does it seem like AMD might be able to improve on it? Did you see the "ifs" I put in there?

    Quote:
    FLAME ON! :lol: Anyway, it's good business by Nvidia, bad morals, but come on, what company has good morals? The way I read it, Huddy said "You should never harm PC gamers, unless you won't get caught or it makes you more money". It's how the world of business works. Nvidia is behind on hardware, so they use the money they have to buy out software. Stem the win AMD has achieved. Granted, it does seem to me the new way of doing things is better, 16-pixel polygons and such, so if Nvidia is making a real effort to block it so that things are inefficient and their raw tessellating power pushes it through, then I'd be a bit pissed. Same if AMD did something to block Nvidia from games or kill off PhysX. I'm all for good business, as long as it doesn't get in the way of good progress.

    I'm not saying Nvidia is; I'm saying if they do eventually.

    The only reason AMD is behind in the tessellation game is because they chose to go with a dedicated chip rather than hold on to see what the software actually needed; it was a crap shoot and they lost because their chip was not powerful enough to do what was asked of it. Nvidia waited until the software demanded something and went for an approach that so far seems to be working. Whose fault is that?
  47. ares1214 said:
    That shouldn't be directed at me... :non: In any event, I don't have a problem with Nvidia pouring a lot of money into SW to make up for a pretty bad year in HW, even if they sabotage AMD a bit, but what I don't want to see is something that's far more efficient being wasted in the event Nvidia tries to block it to keep the advantage with their cards, even though everyone is losing.

    If I meant to direct that at you then I would have quoted you but as I didn't it's just your paranoia getting the better of you mate.
  48. Mousemonkey said:
    The only reason AMD is behind in the tessellation game is because they chose to go with a dedicated chip rather than hold on to see what the software actually needed; it was a crap shoot and they lost because their chip was not powerful enough to do what was asked of it. Nvidia waited until the software demanded something and went for an approach that so far seems to be working. Whose fault is that?


    I'm not blaming anybody; however, even though tessellation was more or less an AMD brainchild, they released it first and, like you said, had no idea what to do. Nvidia had 6 months to make changes. That's no one's "fault", just a perk of being later, I suppose. We also have to consider Nvidia is jumping the gun on many things. 2% of Britons PLAN on buying a 3D TV soon; PLAN, not even own. And they are playing that market up. Very few games support DX11, or DX11 well. Tessellation isn't widely used nor fully developed, IMO. Sure, they get to be better in those, but to what effect? If you ask me, even if AMD still didn't have 3D I wouldn't consider it a con. The fact they are coming up with these things later is just a result of either waiting until the markets matured, which they still really haven't, or just releasing earlier. Now AMD knows what's expected, and knows how to fix things with tessellation efficiency. Great. I don't care if Nvidia pays to code games for them, even if they make them less playable on AMD cards, but if they get in the way of serious progress just to hold everybody back, then I'd start to care.
  49. ares1214 said:
    I'm not blaming anybody; however, even though tessellation was more or less an AMD brainchild, they released it first and, like you said, had no idea what to do. Nvidia had 6 months to make changes. That's no one's "fault", just a perk of being later, I suppose. We also have to consider Nvidia is jumping the gun on many things. 2% of Britons PLAN on buying a 3D TV soon; PLAN, not even own. And they are playing that market up. Very few games support DX11, or DX11 well. Tessellation isn't widely used nor fully developed, IMO. Sure, they get to be better in those, but to what effect? If you ask me, even if AMD still didn't have 3D I wouldn't consider it a con. The fact they are coming up with these things later is just a result of either waiting until the markets matured, which they still really haven't, or just releasing earlier. Now AMD knows what's expected, and knows how to fix things with tessellation efficiency. Great. I don't care if Nvidia pays to code games for them, even if they make them less playable on AMD cards, but if they get in the way of serious progress just to hold everybody back, then I'd start to care.

    You keep on with this "six months late" thing like they were changing things in that time; they weren't. Once the design is laid down it takes eighteen months to two years to go from design to working silicon, and the six months that you like to keep on about was lost due to non-working silicon, not a redesign. What part of that do you not want to understand? :heink: