More dx10 benches from Anandtech

Last response: in Graphics & Displays
July 5, 2007 4:51:01 PM

Real World DirectX 10 Performance: It Ain't Pretty

Good read and some interesting points being made. They pull no punches pointing out that the current mid range cards will hold back game development and new features.


July 5, 2007 5:43:12 PM

From reading the article, we won't see more detailed (DX10) games unless the graphics cards improve. I guess it will take a couple of years to see true DX10 games. For now it's all about DX9.
July 5, 2007 6:14:20 PM

Hopefully the 9800GTX is able to make the Direct X10 versions of each game a lot more playable. :?
July 5, 2007 8:17:26 PM

9800GTX, what a dreamer haha.
July 5, 2007 8:56:31 PM

strangestranger wrote:

''A) reply to the right person

B) please explain why you said that, nvidia will have to come out with something, what are they going to call their next card or refresh?''

Please explain part A again, huh, I'm confused now.

I said that because I felt like it, I was bored, making a joke. Maybe the number 9800 sounded too exciting. Anyway, did it sound so painful to say ''what a dreamer''? It's probably true, but you don't have to take it so far as to say ''explain yourself'', OK?

Well, ask nvidia what they'll call the card, they have their forums.

Having 4000+ posts doesn't mean you can dig in with me, OK pal.

PS. I can tell you I'm joking and bored, so please never take me seriously.
July 5, 2007 9:35:05 PM

Overall I think it's nice to see DX10 developments that actually make games look better. I also like to see things that challenge current high-end graphics cards and put ATI/Nvidia benchmarks on the same page (physically and figuratively). As far as DX10 being a "worse" performer than DX9, I wonder how much of it has to do with Vista. Since I can see DX9 games running worse in Vista than XP, I'd also attribute some of the DX10 performance loss to Vista. Although it's not yet possible to run DX10 on XP, that only makes the fight for DX10 on XP more crucial.
July 5, 2007 10:30:09 PM

These benchmarks are a start.
I still think the games tested are "patched" DX10.
If it works on the Xbox 360, then it must be patched.
Maybe I don't know what I'm talking about, but so far I see very little difference between the DX9 and DX10 versions.
From what little I know, Crysis and 'DX10 only' games might better show what DX10 can do.
I also expect the 2900XT to bench much faster on the next-gen DX10 games. But by then the 2900XT will be obsolete anyway.. :?
July 5, 2007 10:41:19 PM

Quote:
Hopefully the 9800GTX is able to make the Direct X10 versions of each game a lot more playable. :?

The "current" DX10 games more playable?
I don't think they even use hardware-accelerated physics. PPUs aren't even usable yet. Hardware and software have a LONG way to go.
So even the 9800GTX-XXX will choke when games start to require PPUs and other specialized hardware.
But yes, I expect the next-gen Nv cards to handle the environment I described better.
July 5, 2007 10:45:13 PM

You should re-read the article: they are bypassing Nvidia's hardware solution. AMD never had a solution for this, in other words doesn't support it. I would be upset about this, and any gamer should be. That's like having AC in a car that gets disabled when driving on Florida roads.

Both Lost Planet and Call of Juarez are bad examples; unfortunately we only have three titles at the moment to really test against. Perhaps in the fall we will have Crysis and some others. For now it's clear that you need the best hardware to even attempt high-res DX10.

For me Vista 64 offers enough advantages over XP to deal with a 10-15% performance hit. What I'm not happy with is that Ultimate and Business differ in performance, with Ultimate being noticeably slower at everything.
July 5, 2007 11:02:02 PM

Just curious, what happened to the COH AA results for the low-end cards?

It amazes me that Anand always does that, selectively omitting results from testing. :roll:

Even if it's 2fps or 0.5fps, that's still some info that'd be nice to have.
July 5, 2007 11:17:30 PM

Quote:
You should re-read the article: they are bypassing Nvidia's hardware solution. AMD never had a solution for this, in other words doesn't support it. I would be upset about this, and any gamer should be. That's like having AC in a car that gets disabled when driving on Florida roads.


You should re-read the article, or better yet read the actual reply from Techland. Techland didn't 'bypass' nVidia's hardware; they did THEIR implementation of AA under DX guidelines, just like MS added FP16 HDR+AA using their own method to help nV's last generation. Techland is doing MSAA through the shader, which is supported in DX10 and REQUIRED in DX10.1 (I love how Anand missed the part about it being supported in D3D, as if Techland made up their own method on their own). So nVidia's complaint has no merit for something that is standard to DirectX. Sure, they'd love to use their own method (and can force it if they want to badly enough), but their complaint is really that someone chose to do it a different way than they expected, which would be like ATi telling Id they should've changed their z-sampling and shadow techniques because they don't complement ATi's cards.

And your comparison to AC doesn't hold, since the output isn't the same. If your goal was cooling the car, the coolest car would be achieved by method Y, which both can do but one does better than the other, versus another option that doesn't achieve the same effect but is available to only one vehicle and happens to be more efficient for it.

nVidia's complaint would be like arguing that they should be allowed to use partial-precision again like the FX era simply because they can do it faster.

As long as it's supported in DX then either IHV can ask nicely for their benefits to be put in, but if they're complaining about what amounts to a level playing field where both companies have to do what's asked of them under the same DX10 guidelines and implementation, then their complaint is pretty hollow.
July 5, 2007 11:30:02 PM

TY, that's what I thought I got the first time too, saved me from rereading 8O So is their timeline in line with what you know, say, taking in the info of the gigaflop from nVidia (if true), at that rate of advancement? 1+ years?
July 5, 2007 11:47:33 PM

I'm not sure what you mean.

Are you talking about the timeframe for the Teraflop G9x, or was it something else?

Sorry, trying to read at work.
July 6, 2007 12:52:51 AM

I'm talking about this: "Move forward to the release of the Call of Juarez benchmark we currently have for testing, and now we have a more interesting situation on our hands. Techland decided to implement something they call "HDR Correct" antialiasing. This feature is designed to properly blend polygon edges in cases with very high contrast due to HDR lighting. Using a straight average or even a "gamma corrected" blend of MSAA samples can result in artifacts in extreme cases when paired with HDR.

The real caveat here is that doing HDR correct AA requires custom MSAA resolve. AMD hardware must always necessarily perform AA resolves in the shader hardware (as the R/RV6xx line lack dedicated MSAA resolve hardware in their render backends), so this isn't a big deal for them. NVIDIA's MSAA hardware, on the other hand, is bypassed. The ability of DX10 to allow individual MSAA samples to be read back is used to perform the custom AA resolve work. This incurs quite a large performance hit for what NVIDIA says is little to no image quality gain. Unfortunately, we are unable to compare the two methods ourselves, as we don't have the version of the benchmark that actually ran using NVIDIA's MSAA hardware."
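To make the quoted point concrete, here's a rough numerical sketch of why the order of resolve and tone-mapping matters with HDR. This is only an illustration using a simple Reinhard tone-map of my own choosing, not Techland's actual "HDR Correct" resolve code:

```python
# Rough sketch of why MSAA resolve order matters with HDR
# (illustration only, not Techland's actual "HDR Correct" AA code).

def reinhard(x):
    """Simple Reinhard tone-map: maps [0, inf) HDR values into [0, 1)."""
    return x / (1.0 + x)

# Four MSAA samples at a polygon edge: two dark, two extremely bright (HDR).
samples = [0.05, 0.05, 40.0, 40.0]

# Naive resolve: average the raw HDR samples, then tone-map the result.
naive = reinhard(sum(samples) / len(samples))

# "HDR correct" style resolve: tone-map each sample first, then average.
correct = sum(reinhard(s) for s in samples) / len(samples)

print(round(naive, 3))    # 0.952 -- edge pixel comes out almost fully bright
print(round(correct, 3))  # 0.512 -- a visible 50/50 blend of dark and bright
```

Averaging the raw HDR samples lets the two very bright samples swamp the edge pixel, while resolving per-sample keeps the blend balanced; presumably that's the kind of artifact the custom resolve is meant to avoid, and doing it per-sample is exactly what requires bypassing fixed-function MSAA resolve hardware.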

I'm not sure what I might be misinterpreting here. Taking Techland's explanation, I think it would be safe to say it has as much merit as Nvidia's argument, in that they are both biased.

As for my analogy being flawed, I disagree as well: the best solution is the one that keeps me the coolest with the equipment in my car, particularly when it's the far more popular model. It doesn't seem that this method would adversely impact the other model. The goal for software developers is to optimise their code to work best with the hardware available. The game itself is mediocre at best, so in the big picture this isn't a big deal. The better developers will make the most of the environment they have to work with, regardless of what the reference states is possible. That's what optimisation is all about. I deal with this with the developers in my organization all the time. In the real world, being an idealist isn't as practical as being a realist.
July 6, 2007 1:34:41 AM

Quote:
I'm talking about this...


Oh, I know exactly which section you're talking about; we've discussed this issue before when it was first brought up. Here's a better version of Techland's reply:
http://www.theinquirer.net/default.aspx?article=40401

Be sure to read the middle part with the e-mail, where Techland mentions that BOTH IHVs requested shader-based AA support in DX10.

And like I said, I think you need to absorb what's being said by both sides (and what's being omitted by Anand), and then look at M$'s DX10 standards (optional in D3D10.0, mandatory in D3D10.1); check MSDN, or look for the Google WinHEC or XNA presentations. Then tell me how Techland/COJ using a method supported by M$/D3D is wrong, and how nVidia has any more reason to complain than ATi would about Doom3 because Id didn't use more standard methods of Z-culling and shadow mapping that would put the R300/R420 performance in a better light. nVidia can say 'this benchmark doesn't show our product's GENERAL performance in DX10', but Techland's choice to use an accepted, DX-supported method of AA to achieve an already acknowledged different effect doesn't show them 'bypassing' nV hardware to favour anyone, just using a method that doesn't exploit nV's extra hardware. That would be like ATi/AMD complaining that Techland or someone didn't employ the option to use tessellation, and therefore they're biased against ATi/AMD.

Quote:
I'm not sure what I might be misinterpreting here. Taking Techland's explanation, I think it would be safe to say it has as much merit as Nvidia's argument, in that they are both biased.


However, it's Techland's game, not nVidia's. So for nVidia to tell someone how to make their game would be like saying Techland has the right to tell nVidia they need to add tessellation to their hardware. It's not nV's place to comment on it other than to say that the game isn't indicative of their performance elsewhere. Instead nV attacked Techland as if they were specifically trying to damage nV.

Quote:
As for my analogy being flawed, I disagree as well: the best solution is the one that keeps me the coolest with the equipment in my car, particularly when it's the far more popular model. It doesn't seem that this method would adversely impact the other model.


But it's not the COOLEST. Both have the option to give you the coolest (which would be HDR-correct AA), so both can give you AC; one can just give you lower-penalty cooling that only gets you part of the way to the target temperature. That's the difference. You pretend the AC is factory-fitted on only one, while shader-based AA support is the true factory-required option (equivalent to DX support), and the convertible or whatever else would be the option nV might have, which may get cooler in some situations but sucks in the rain. That's closer to the options we're talking about.

Quote:
The goal for software developers is to optimise their code to work best with the hardware available.


No, the goal of the devs is to make the best game/experience possible for their gamers with the hardware and standards set out before them.

Quote:
The game itself is mediocre at best, so in the big picture this isn't a big deal.


That's a BS cop-out on your part, and it's irrelevant. I don't like the game (or hate it), and I don't like a lot of games, but I don't say, well, they suck anyway, so they shouldn't have any merit. I think AOE games suck; it's just a personal feeling, I don't like that style of game. But I wouldn't say that M$ using software-based HDR+AA in AOE3 to achieve the effect on a GF7 and X1K should be discounted because I think the game sucks.

Quote:
The better developers will make the most of the environment they have to work with, regardless of what the reference states is possible.


Sometimes, if they have the resources and the ends justify the means. If they don't like the output, then it's a hack job done specifically to optimize for one platform over another. BOTH can do shader-based AA, so why would they need to optimize for one over the other? Once again this comes back to a similar argument as to why they didn't use tessellation-based displacement mapping instead of parallax occlusion mapping, since it's a much better option if the resources are there (which they are, exclusively on the HD2K series). It makes no sense for that argument, and it makes no sense for them to have to add a lesser implementation of hardware-based AA just because one company wants it.

Quote:
That's what optimisation is all about. I deal with this with the developers in my organization all the time. In the real world, being an idealist isn't as practical as being a realist.


Being a realist doesn't involve making a single lesser path for one IHV specifically because they don't like their benchmark results. This isn't like the HL2 or SM3.0 issues, where nV or ATi/AMD couldn't run the standard path and it was necessary just to make the game playable; here we're talking about an e-penis issue over a freakin' benchmark, not playability.

What nV is essentially asking for is a floptimization whose only benefit is a benchmark result at the cost of IQ, not an equal playing field. So as someone who works with developers, you tell me how you justify those man-hours. Remember, nV can still add those floptimizations on their own if they desire it so much; the only thing is, they know that path puts the spotlight on them, whereas Techland acquiescing to nV's request means they can say the floptimization is part of the game. Which is essentially the strategy of ATi's Get in the Game and nV's The Way It's Meant to Be Played.

The only thing I agree with nV in their original criticism would be the shadow quality issue, but that's a personal preference.
July 6, 2007 11:32:10 AM

The funny thing here is that, after all this discussion as to whether or not Techland purposely tried to gimp the Nvidia hardware, they still do relatively well in the benches.

ATI obviously had some big tie-in with these guys. Months ago, before r600's release, they posted the screenshot from the game, and then it was their big demo at r600's release. I'd be a fool to think all this could happen without an effort made to have the demo reflect well on ATI's new hardware.

I also realize that the title has a TWIMTBP stamp, but this whole program has really lost any special meaning over the years. I'm guessing that at this point the TWIMTBP stamp is tagged on in exchange for money rather than any real co-development.

All that said Nvidia should just suck it up. Their hardware runs the same demo as ATI.

The two questions I'd have are:
-Will Techland optimize the game for Nvidia's hardware AA when they actually have the full game patched to dx10?
-If no to the above can Nvidia force it through drivers?

Even though the game gets bad reviews I'd play it just for the scenery if it looks like the flythrough demo.
July 6, 2007 12:05:16 PM

Quote:
-Will Techland optimize the game for Nvidia's hardware AA when they actually have the full game patched to dx10?
I truly hope not. Not that I'm against nVidia; more that I'm FOR tech advancement. I'm more let down about the DX10 dilemma than the 2900 launch. Here we sit with an O$ from M$ that'll do DX10, and the midrange cards are holding it all back? Should we even be arguing whether nVidia or ATI should be dragging their feet towards DX10? Like you said, man up, let the tech grow.
July 6, 2007 2:55:17 PM

Don't you guys wonder if it's not just the hardware, but actually more Vista being the performance culprit?
July 6, 2007 3:44:44 PM

No. Vista has more overhead, but the impact is not really that great. My 6-month-old computer on Vista outperforms my 2-year-old computer on XP. I'm guessing some future OS will bring my current computer to its knees, but it won't matter because my future computer will have more power. This is the nature of things. Consider cars: the average daily driver probably weighs about 500 lbs more than 20 years ago, but we just up the horsepower.

Aside from that rant here is what Vista did to my performance on the same system/settings.
3dmark 06 xp-11827 http://service.futuremark.com/compare?3dm06=1888828
3dmark 06 vista-11519 http://service.futuremark.com/compare?3dm06=2034070
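A quick check of what those two scores imply (my arithmetic, not from the linked results):

```python
# The 3DMark06 scores quoted above: same system and settings, XP vs Vista.
xp_score = 11827
vista_score = 11519

# Percentage hit from switching to Vista.
hit_pct = (xp_score - vista_score) / xp_score * 100
print(f"Vista costs about {hit_pct:.1f}% in 3DMark06")  # about 2.6%
```

So on this system the Vista overhead in 3DMark06 is under 3%, well below the 10-15% figure mentioned earlier in the thread.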

I'm not letting Microsoft off the hook, though. DX10 is their baby and it is being implemented terribly. Even Microsoft ended up pushing back DX10 for FSX. They now spin it that they never had a deadline, but last year it was widely reported that the DX10 patch would be released at the same time as Vista. It is obvious that the industry (MS included) is having trouble making this transition, and Microsoft needs to assume the heaviest share of that fault.
Both ATI and Nvidia have delivered hardware. We can argue that the hardware is not up to DX10, but what is expected? ATI delivered a card with 700 million transistors before the manufacturing process was really ready. Do we need 1400 million transistors, teraflop processing and 1 gig of memory bandwidth to make DX10 work?
July 6, 2007 4:15:57 PM

I think one very important point is being minimized in this discussion: the economics of DX10. If it is too cumbersome or works properly only on very expensive GPUs, it will be slow going waiting for properly coded DX10-only games. I am quite sure developers like to be paid, and sales drive their compensation. The enthusiast community just isn't big enough to support DX10 by itself. The buyers of prebuilts with on-board graphics or low-performance GPUs are where the big money is. I expect any successful DX10 game will have to be playable in some form on whatever is being sold with its level of graphics, or very close to it.
July 6, 2007 4:19:20 PM

These are first-generation DX10 cards. Don't you guys remember what happened to the first gen of DX9 cards? They could barely handle anything, and the 9800pro was the best thing out for them. I am happy with my x1900xt and XP Pro.
July 6, 2007 4:41:30 PM

Quote:
The funny thing here is that, after all this discussion as to whether or not Techland purposely tried to gimp the Nvidia hardware, they still do relatively well in the benches.


Which raises the question: why are they complaining so much?

Quote:
ATI obviously had some big tie-in with these guys.


Yet the game was also part of the TWIMTBP program with nV. So the company is tied into both, and if anything more so to nV.

Quote:
Months ago, before r600's release, they posted the screenshot from the game, and then it was their big demo at r600's release. I'd be a fool to think all this could happen without an effort made to have the demo reflect well on ATI's new hardware.


And if ATi/AMD is so close to Techland, why at the HD2900XT's launch was nVidia able to provide people with an updated demo to benchmark that AMD hadn't even seen yet?

Quote:
I also realize that the title has a TWIMTBP stamp, but this whole program has really lost any special meaning over the years. I'm guessing that at this point the TWIMTBP stamp is tagged on in exchange for money rather than any real co-development.


So you're saying nVidia exchanges money and no co-development, while nVidia says the exact opposite, that no money changes hands, just support. And based on the replacement demo nVidia had access to for updating their drivers, which AMD hadn't seen, I'd say your guess likely doesn't mirror that relationship. My guess is that it's more like the DX9 situation, where nVidia was closely involved, then didn't like the direction things were going, and now cries foul.

Quote:
The two questions I'd have are:
-Will Techland optimize the game for Nvidia's hardware AA when they actually have the full game patched to dx10?
-If no to the above can Nvidia force it through drivers?


I think they (Techland) would've been more inclined to do so if nV had asked nicely instead of criticizing their use of shader-based AA. Who knows, they may launch with it once it's official. The other option is nVidia allowing you to force it in drivers, similar to other situations like HDR in Oblivion, which is likely the way to go if there's little/no benefit to Techland in doing this for just one IHV.

Quote:
Even though the game gets bad reviews I'd play it just for the scenery if it looks like the flythrough demo.


Yeah, I'm not interested in the game, but there are a lot of people who are. RobSli was one of those people; he used to rave about the game when it first came out.

So I know it's got a strong following, but I'm more interested in the technical aspects than actually playing it (happy with my Oblivion and UT2K4 existence 8) )
July 6, 2007 4:55:09 PM

I played the Direct X9 version of the game and was unimpressed as not only did it perform pretty poorly on my 8800GTX, there wasn't anything being rendered that I felt like I hadn't seen before. The coolest parts of the Direct X10 version of the game are in the benchmark. Grape, do you think Crysis will use a shader based AA approach as well?
July 6, 2007 6:19:17 PM

Quote:
i had and do still have a 9700pro. it is dx9 and handled early stuff pretty damn well. of course, if you looked at the mid-range 9 series and how well they did, it might give you an idea why people shouldn't expect miracles from the 8xxx and 2xxx series mid-range.

Were you a first-gen nvidia user back then or something :lol: 

I was hoping the 2900XT was the next 9700Pro. Oh well.
July 6, 2007 6:34:37 PM

Quote:
I played the Direct X9 version of the game and was unimpressed as not only did it perform pretty poorly on my 8800GTX, there wasn't anything being rendered that I felt like I hadn't seen before.


Yeah, that was my impression, and between spending the very VERY limited time I have playing COJ or finishing some Oblivion quests, Oblivion wins every time. I have boxes of unopened games like SeriousSam2 because of their inability to pull me away from Oblivion. Someday I may change, but I think that will only be once they release the replacements for the things that feed my FPS needs: UT2K4 replaced by UT3 and Crysis. Right now it's 80+% Oblivion and ~20% UT2K4, but I suspect once those titles come out it'll flip to 30% or less Oblivion for 2-3 months, and then back again to more of a 50% and rising situation.

Quote:
The coolest parts of the Direct X10 version of the game are in the benchmark.


Yeah some of the stuff is neat and it's interesting to see some of the split-screen demos.

Quote:
Grape, do you think Crysis will use a shader based AA approach as well?


Dunno, but I would guess that yes, it will at least be an option. Remember we're talking about in-game AA, and some games don't even have in-game AA. My guess is that Crysis will give you options for both, but it's so early it's hard to tell. The problem with the shader-based approach comes when the scene gets awesomely complex and there's already a heavy shader burden, which is exactly what we know about Crysis, with its geometry loads and the amount of effects they could throw in. So I wouldn't be surprised if they go both routes. But I also wouldn't be surprised if, even with hardware AA, most people end up skipping AA in favour of playing at higher resolutions.
July 6, 2007 7:47:52 PM

Makes you wonder if waiting for a few more months before purchasing a new video card is the way to go at this time. Naw!!!! I like upgrading. It beats the heck out of smoking. See, I traded one expensive habit for a more expensive habit. Only with this habit, I have something to show for it. And I do like computers. Anyways, I might wait for a bit just to see what happens. Later.

Dahak

AMD X2 5600+ @ 2.8ghz(stock)
M2N32-SLI DELUXE MB
2 GIGS DDR2 800 RAM
THERMALTAKE 850WATT PSU
7950 GX2 560/1450
ACER 22IN. LCD
SMILIDON RAIDMAX GAMING CASE
80GIG/250gig SATA2 HD's
XP MCE
3DMARK05 14644
July 6, 2007 8:16:52 PM

Quote:
I think one very important point is being minimized in this discussion...that being the economics of DX10. If it is too cumbersome or requires only very expensive GPUs to work properly; it will be slow going waiting for properly DX10 only coded games, etc. I'm am quite sure developers like to be paid and sales drive their compensation. The enthusiast community just isn't big enough to support DX10 by itself. The buyers of prebuilt with on-board graphics or low performance GPUs are where the big money is. I expect any successful DX10 game will have to be playable in some form on what ever is being sold with its level graphics or very close to it.


Yeah! No one wants to discuss the reality that throwing more at the GPU, and expecting more realism at high resolutions, is turning out to be a pipe dream for now. Not to mention that the thrust is also toward doing a lot more at the pixel level than the vertex level (pixel shader versus vertex shader), and now we have the geometry shader.

We need a $1000+ SLI setup to get playable frame rates with the new features enabled. Meanwhile, the Wii is selling like hot cakes, which, in my nowhere near humble enough opinion, means very few people want to pay big bucks just for better graphics, and better graphics do not make a game more fun to play.
July 6, 2007 9:07:37 PM

I'm not saying Vista has everything to do with it, but you'd have to admit that DX9 games on XP have higher fps than they do on Vista. So you could assume that Vista has a part to play in some of the suckage. More like squeezing the wound, but not being the wound itself.

Next-gen cards should pretty much rock, though, as long as the heat output and energy consumption aren't much higher than they are right now... hopefully they aren't any bigger than the current ones either.
July 7, 2007 5:10:29 AM

Ape, you need to keep things in context; the game sucking is in no way a cop-out. It's simply pointing out a fact, and that we need more games from more developers to paint a more accurate picture of the state of the hardware. Crysis, which will likely be optimized, will be a better gauge.

As for it being part of the standard for DX10.1, that doesn't matter; we are working with DX10 at the moment, and just barely at that. Which is interesting considering that CoJ has clipping issues, which one would think would be more important than a few issues with high-contrast shadows...

Also, the second-gen cards will likely be more about improving performance through brute force. We aren't likely to see real innovation until gen three or four. So it's incumbent on developers to get the most out of the hardware that's available today or in the near future. DirectX 10 or 10.1 defines the parameters. AMD and Nvidia have created the hardware, which further refines what is possible. We are served best when game developers optimise for what is actually available. I agree with you that they shouldn't be pandering to either side. However, I personally would either butter the knife on both sides or lean towards the market and/or performance leader. In this case that's Nvidia at the moment. Now, they may lose the performance lead in a couple of months, but they own the market share and will likely have the performance crown by the end of the year.

I'm a fair weather fan in the hardware race because I'm looking out for me. I have my sports teams that I support through thick and thin, Go Buc's!

We the consumers benefit when developers cooperate with the hardware vendors they are at the mercy of. As a student of the industry, I know that you understand how much effort it takes to develop a new hardware architecture. With that in mind, they should be trying to serve us.
July 7, 2007 9:55:14 PM

Quote:
Ape, you need to keep things in context; the game sucking is in no way a cop-out. It's simply pointing out a fact, and that we need more games from more developers to paint a more accurate picture of the state of the hardware. Crysis, which will likely be optimized, will be a better gauge.


I agree with that, but your using it in defence of nVidia's complaint didn't jibe.

Quote:
As for it being part of the standard for DX10.1, that doesn't matter; we are working with DX10 at the moment, and just barely at that.


It's a requirement of D3D10.1, but still a supported feature of D3D10.0, so really it's in the standard, and nVidia complaining about it would be like ATi crying 'no fair' about SM3.0 features. They can comment on its usefulness, but to say that Techland is doing something wrong, like they and their followers say, is just wrong. As for issues, there are still bugs in HL2 and D3 and the original FartCry, so I'm not surprised that COJ, like every other game, has a bug or three. But remember, a bug is different from not adding a feature.

Quote:
Also, the second-gen cards will likely be more about improving performance through brute force. We aren't likely to see real innovation until gen three or four.


Actually, it depends on what a Gen2 card is; the G90 was always seen as a refresh, while the G100 and R700 would be Gen2. So I suspect the G90 and R670/680 will be just the addition of units here and there (nV on the front end, AMD on the back end), but also some re-working, with nV fixing some internal efficiency issues and ATi improving on, say, ratios and likely the scheduler IMO. So we may see some nice improvements, but it's still early to tell.

Quote:
So it's incumbent on developers to get the most out of the hardware that's available today or in the near future.


Which really seems to be done best when there's true replacement of techniques to do things, and that will come with later games. Right now there are still a lot of areas that could be improved, and simply cranking up the load is only the lame side of the equation for these early cards; the mid-range especially could benefit more from improved efficiency and a lighter CPU load on some systems, making them much more playable.

Quote:
DirectX 10 or 10.1 defines the parameters. AMD and Nvidia have created the hardware, which further refines what is possible. We are served best when game developers optimise for what is actually available.


AMD and nV don't refine what's possible, they LIMIT what's possible. M$ defines and refines what's possible. The only time the IHVs refine what's possible is when they go BEYOND the specs in the playbook (both optional and required). It might not always be practical, but think about what we're talking about here: M$ has standardized a different method of AA in their API, and now it's nV that wants to limit that implementation. They're not refining it and saying 'this programmable method is better than that one'; they are saying stick with the old method, which may be flawed, but which they can do faster.

Quote:
I agree with you that they shouldn't be pandering to either side. However, I personally would either butter the knife on both sides or lean towards the market and/or performance leader.


Which is a step backwards. That argument, applied to the FX generation, would've meant DX8.1 forever until nV finally got the GF6 out. Why would that kind of thinking be good instead of pushing forward? There are options in the games, and like I said, nV can force their own AA as is required for AA+HDR in Oblivion, so explain to me why going backwards makes sense. It limits the options, not improves them.

Quote:
In this case that's Nvidea at the moment. Now they may lose the performance lead in couple of months but they own the market share and will likely have the performance crown by the end of the year.


But that's irrelevant to Techland and to the discussion of features. What you want is for the game to be playable on the widest variety of platforms, which it is. Adding a hardware AA solution just because nV wants it is irrelevant to the argument, since basically the biggest market will be the mid-range, and they won't be using AA at all because they're going to be drowning in these titles, and will be in DX9 mode where they can use their hardware and shader based solutions. So really, if you wanna argue the weight of the market, then DX10 should be stripped down to make the mid/low end crowd feel good about themselves because they can play a 'DX10' title. It's the same argument you're making of playing down to the lowest common denominator, which, if Techland did, would be really low because of how much the GF8600 and HD2600 suck compared to their bigger brothers.
It's funny because this is an OPTIONAL feature, AA, and really not about 'playable' and more about benchmarking BS. If nV and AMD spent half as much effort developing their drivers as they do talking about the benchmarks, both companies might have far better performance instead of far better PR guys.

Quote:
We the consumers benefit when the developer cooperate with hardware vendor that they are at the mercy of. As a student of the industry I know that you understand how much effort it takes to develop new hardware architecture. With that in mind they should be trying to serve us.


I think they are, in the sense that those who talk about AA and such are those who care about IQ, and if one method is more 'IQ correct' in their eyes, why would they go another route simply to increase benchmark scores? If people don't want the IQ, and just want big resolutions, high AA and high framerates, then they have the DX9 option to drop down to. The DX10 option should be the pinnacle with the MAX IQ; it may be unplayable at max settings on current hardware, but it also gives you the ability to increase settings as the systems/technology grow.
July 7, 2007 10:05:34 PM

To sum up a bit and split a longer post...

I don't think there's a defense for complaining about things that are already part of Direct3D, regardless of whether someone wants another method; if it's in there and the developer has a logical reason with no evidence against it, then there are no grounds for a complaint against the developer. If this were such an issue, then nV should come out with slides showing Techland's statements are incorrect and that their method doesn't achieve what they say it does. Until then, I'll side with the developer and not an IHV, because neither nV nor ATi/AMD has a spotless history of respecting the developer's vision or game players' needs. Past non-optional floptimizations show that to the IHV it's more about the framerate than the quality or experience, and to think that nV or ATi/AMD has our interests in mind more than their own is IMO naive. Techland has one desire: its customers in THIS game on as many platforms as possible (including Intel and the others). nV and ATi have their customers, but would destroy this game if changing their drivers for Crysis or UT3 meant significantly more sales, even if it broke things in COJ.

I'm sure ATi would argue Quack was to benefit their users, and to some extent it did (things that would improve later drivers), same with their non-optional AF tweaks that later turned into the optional Cat AI; and I'm sure nV would argue their floptimizations (like in FartCry) and shader replacement were for the benefit of their users, who otherwise would've been slower than the competition and unplayable. But it doesn't mean those options are better than doing things the way the devs want, unless it achieves the same goal, and Techland in this case says it doesn't.

It's different if it looks like someone went out of their way to favour someone, but I don't see that here. If anything it looks like Techland picked the method both companies support, and based on their reply to nV, that BOTH companies asked for.

In a year's time Techland will still be supporting this game; will AMD or nV care anymore? Not likely, unless it stays a benchmark that sells cards. That, to me, says more about who I trust in this or any similar fight.
September 3, 2007 8:28:32 PM

Is it me, or was that article biased? I have a slower CPU, but get much higher frames than the 2900XT Anand mentioned... seems odd, doesn't it ;)  ... I guess they like Nvidia more ;) 
September 3, 2007 9:20:14 PM

The article is dated July 5th, 2007, so new drivers and fixes might be one reason you're seeing such a difference.