
Far Cry patch 1.2 performance: major boost for nV

July 2, 2004 5:19:39 AM

Well, the wait is over for SM 3.0 tests.

<A HREF="http://www.anandtech.com/video/showdoc.html?i=2102&p=1" target="_new">http://www.anandtech.com/video/showdoc.html?i=2102&p=1&...;/A>

All the cards are tested this time; ATi does fairly well, but not well enough.


SM 3.0 only brings benefits of 3-10% in most Far Cry levels, which isn't bad, but the drivers alone show that the GT matches the XT PE.

In scenes that use more than one light there is a huge performance increase; nVNews reported something like a 30% increase in those scenes, which is reasonable given the reduced number of passes.
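To put the pass-count point in rough numbers, here's a back-of-the-envelope sketch in Python (the object and light counts are made up for illustration, and this is only the general idea of folding per-light passes into one pass, not CryTek's actual renderer):

def multipass_draws(num_objects, num_lights):
    # SM 2.0-style additive lighting: each lit object gets re-drawn once per light
    return num_objects * num_lights

def singlepass_draws(num_objects, num_lights):
    # SM 3.0-style path: one draw per object, looping over the lights inside the shader
    return num_objects

print(multipass_draws(200, 3))   # 600 draws per frame
print(singlepass_draws(200, 3))  # 200 draws per frame

Fewer passes means less geometry and state-change overhead per frame, which is why the gain shows up mainly in the multi-light indoor scenes.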

Now we just have to see how other games respond to DX9.0c and the new drivers.
July 2, 2004 6:27:57 AM

Your title and your content don't match: the title says Patch 1.2, your content is about SM3.0, and your conclusion is different from Anand's.

<font color=blue>"Both of our custom benchmarks show ATI cards leading without anisotropic filtering and antialiasing enabled, with NVIDIA taking over when the options are enabled. We didn't see much improvement from the new SM3.0 path in our benchmarks either."</font color=blue>

Of course nV's handpicked benchmarks showed what they're selling. Hmm, like I'd trust nV to decide which benchmarks to run. Sure, 'cause they'd never optimize for a specific benchmark/path, would they?

There are some nice-looking improvements, but nothing spectacular, and nVNews quoting numbers is about as trustworthy as Rage3D's numbers.

I'll wait for a deeper look with their own tests from people like [H] or Digit-Life; after the FartCry fiasco, I'll wait until someone dissects every layer.

Looks like a nice addition (hey, it doesn't cause any apparent drawbacks according to this review), but not the be-all and end-all it was sold as. The review says it best with the following two statements:

<font color=blue>"Even some of the benchmarks with which NVIDIA supplied us showed that the new rendering path in FarCry isn't a magic bullet that increases performance across the board through the entire game."</font color=blue>

and

<font color=blue>"It remains to be seen whether or not SM3.0 offer a significant reduction in complexity for developers attempting to implement this advanced functionality in their engines, as that will be where the battle surrounding SM3.0 will be won or lost. "</font color=blue>

That last one of course will be the biggest item, and really the games that AREN'T TWIMTBP titles will show what unbiased developers do; nV may entice the developers in their stable enough that, regardless of effort, SM3.0 will likely find its way into any major TWIMTBP game.

Still need to see more than just FarCry before calling it a must-have feature.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 2, 2004 6:47:05 AM

I don't really care about SM3 that much. I'm just pumped that the GT is layin' the smackdown on the X800 XT and Pro in AA/AF-enabled tests. It rocks!

"This means that you can play over a network, just not with each other."-PC GAMER review
July 2, 2004 1:24:29 PM

TheGreatGrapeApe:
Is there or is there not an improvement from SM3.0 over SM2.0, and is this achieved without "optimizations" (read: cheating)?
That's basically all that matters to me :) 
That and the overall performance, where I have to say the eye-to-eye initial performance has tilted more in favour of Nvidia, not only via SM3.0 but also the new SLI :) 
Make no mistake, both cards rock, but at the moment, should I upgrade, I'd go Nvidia...

Terracide - Brand is nothing, performance everything!

Don't pretend - BE!
July 2, 2004 1:57:40 PM

This is a preview of what's to come with Doom 3 and Half-Life 2; both games are even more shader-intensive than Far Cry. Finally nV put their 2 billion dollars a year of profit to good use.
July 2, 2004 4:38:57 PM

TheGreatGrapeApe

I dub thee SourGrapes

Balls, said the Queen if I had them I would be king!
July 2, 2004 5:10:27 PM

<A HREF="http://www.techreport.com/etc/2004q3/farcry/index.x?pg=..." target="_new">Techreport's Review</A>
Quote:
Version 1.2 of Far Cry will apparently come with four built-in demos for benchmarking. Those demos take place on the four levels mentioned in the NVIDIA presentation. Rather than use those pre-recorded demos, however, we elected to record five of our own—one on each of the four levels NVIDIA mentioned, and one on the "Control" level. The demos are downloadable via a link below.

I believe this review more than Anandtech's simply because they did not use the demos Nvidia handed them, but benchmarked their own demos on those same levels.
July 2, 2004 5:25:20 PM

Keep looking at that TechReport test; depending on the level, it shows how much faster the nV cards are. So the bottom line is that in overall performance the nV cards are faster by a lot, not in one or two specific tests but on overall average.

The Ultra kept up with the XT PE and the GT keeps up with the XT; the end results are the same as Anandtech's review. Overall ATi's and nV's cards are very close, but nV's cards are 100 bucks less for the same performance. And most of nV's board partners ship overclocked cards, so where does that leave ATi?

<A HREF="http://www.xbitlabs.com/articles/video/display/farcry30..." target="_new">http://www.xbitlabs.com/articles/video/display/farcry30...;/A>

Benches with minimum frame rates too; nV leads on almost all counts. Unfortunately these don't have benches with AA and AF, but Anandtech and TechReport show improvement in those departments.
July 2, 2004 7:11:04 PM

The main thing to look at is open benchmarks. If it only shows performance increases in small, select portions of the game, then it's limited in its impact on performance. If the grass looks better and the lighting effects look better at no penalty to performance, or at an increase in performance, then you have some serious improvements. Right now the test base is limited, but the indications are positive, just not detailed, and really, compared to other similar results on patch 1.1 and SM2.0, there is little change (considering that the benefits without AA/AF aren't as dramatic as the changes with AA/AF, it's interesting to see exactly what is going on there). I'll wait 'til there's someone looking at this other than a site that seems to get a lot of nV 'advanced looks'.

I'll wait for [H], Digit, Extremetech, and even Lars to take a look. As DX9.0c isn't even available yet to the general public, and neither is the patch, I'm in no rush to judge or to applaud.

Don't get me wrong, it looks promising, but so have many other performance improvements in the past.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 2, 2004 7:57:33 PM

Quote:
This is a preview of what's to come with Doom 3 and Half-Life 2; both games are even more shader-intensive than Far Cry.

Well, as D]|[ is OGL and not DX, it should be performing near max now. From the 2.0 conference, there were no major additions from nV to suggest that waiting for 2.0 over 1.5 will make a difference, except for ATI, which did have two new additions, one of which should mirror nV's shadow-defining extension. HL2 (or a similar game) will be the true test for me as it is not part of the TWIMTBP program, and the FartCry-type floptimizations shouldn't be found there. No doubt there is improvement, but the question is why those improvements are greatest with AA/AF; that's a little more interesting, considering the increases are close to 50-80% in some cases. Now that's worth checking into. I'll wait for [H], ExtremeTech, B3D, and Digit to do their typical in-depth reviews before deciding. It does look good on the surface, but I've seen far too many floptimizations of this type before to simply trust it at face value.

Quote:
Finally nV put their 2 billion dollars a year of profit to good use.

$2 billion a year? In what currency? Not US dollars, that's for sure. They didn't even break $2 billion in revenue last year, let alone profit. Last year's earnings were under $100 million, net income was $74 million, and the trailing 12 months has been $76 million so far. Unless you were using some other currency like yen, or don't see the difference between revenue and profit.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 2, 2004 7:59:33 PM

I dub thee Mr. Bobbit, really more appropriate.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 2, 2004 8:58:38 PM

Quote:
Keep looking at that TechReport test,

As I do, it doesn't look the same as the Anand review.

Quote:
nV's cards are 100 bucks less for the same performance.

The GT may be cheaper, but the Ultra is not cheaper than the X800 XT, nor the XT PE. The Gigabyte GA-R80X256V (an XT-PE) is available on PriceWatch for $511 ($499 + $12 S/H) versus the cheapest GF6800U at $540 ($540 + $0), which is far harder to find. So your statement is false; if anything the XT-PE is cheaper, and thus wins that criterion.
The true card to watch, of course, is the GF6800GT, which is a good deal, although still rather rare.

Still waiting for [H], Digit, B3D, and ExtremeTech; Xbit does a good job, but unfortunately they don't bother to look beyond the benchies.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 2, 2004 10:37:43 PM

You're on a roll. :smile:

Most surprising to me is NV winning AA/AF now, especially when the FX cards crawled at those settings.

ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 2, 2004 10:52:10 PM

Yeah their implementation is much better.

One MAJOR advantage of the GF6800 over the FX line is the use of a rotated grid for AA, and lower-quality AF (similar to ATI's, which was theoretically lower, but in real life most people preferred ATI's R3xx method over the FX's). Now that nV has adopted those two methods they are far better than their old FX counterparts. While supersampling AA can (theoretically) offer better AA, it really comes at a huge performance price. As to why there is suddenly an increase of upwards of 50-80% over the previous methods using those same techniques in FarCry with patch 1.1/SM2.0, that is what I'd like to see, especially when without AA/AF there are single-digit, or very low double-digit, increases in most places. That's what I'm interested in seeing explained.
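For anyone wondering what the rotated-grid point looks like in practice, here's a small illustrative Python snippet; the 4x sample offsets are typical textbook patterns, not necessarily the exact positions either vendor's hardware uses:

ordered_grid = [(-0.25, -0.25), (0.25, -0.25), (-0.25, 0.25), (0.25, 0.25)]
rotated_grid = [(-0.375, -0.125), (0.125, -0.375), (0.375, 0.125), (-0.125, 0.375)]

def distinct_coverage(samples):
    # How many distinct horizontal and vertical positions the samples cover inside a pixel;
    # more distinct positions means more gradient steps on near-horizontal/vertical edges.
    xs = {x for x, _ in samples}
    ys = {y for _, y in samples}
    return len(xs), len(ys)

print(distinct_coverage(ordered_grid))  # (2, 2)
print(distinct_coverage(rotated_grid))  # (4, 4)

Same sample count, but the rotated pattern spreads its samples over four x and four y positions instead of two, which is where the better edge quality comes from.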


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 12:14:25 AM

Well, the GT matches the XT PE in most of the benches, so how can you compare the Ultra to the XT PE in price vs. performance?

What's there to look at beyond the benches? They already went over the IQ, which is now very similar on both cards; everyone knows this. Anandtech, TechReport, and Xbit all said this, as did Guru3D and Beyond3D. If ya still have doubts and don't trust 5 independent benchmarkers, might as well buy the cards and test for yourself.

Now for the other question that remains: is there anything else in the nV cores that will increase performance? *hint*

Oh yes, now AF works on shaders in SM 3.0, so guess what, the IQ is actually better on nV cards :) . I wouldn't be surprised if they do the tests in the coming weeks and find this out too! If ya want to say that's not true, Grape, be my guest, cause this is a given.

And can drivers pull out more performance for nV? Just have to see about this *hint*

Well, if ya don't see the similarities in the performance boosts then you should really start looking into the render pipelines and how they work again. Although the numbers aren't as high as Anandtech's, the ratios of the performance increases are the same, and the same goes for X-bit's. Also you can't deny the fact that the Ultra is now faster, and with a small overclock like BFG does, the GTs and Ultras are even faster.

You can keep talking about one-frame-rate differences all you want; that's not going to change the fact that I was right three weeks back. You were talking about speculation and benchmarks, and now they are out; try to deny it all you want, but this is the truth. You wanted proof, you got it. I don't have to say any more. Now you ramble on as if it's not true even though the benches are out. So why kid yourself? I should say you are a fanboy (jk :) ) or at least an nV skeptic; yeah, the FX line left a bad taste in everyone's mouth, but things are about to change. And there are a lot of things I know about the GF 6 line that haven't been activated yet *hint*.

When I state something this solidly, I've already done the tests. And now the truth comes out. Don't be so skeptical next time. What good does it do me whether people buy nV or ATi? I couldn't care less; we get the same deals from both sides. Better for us if both are there, we get better deals.

I said this a while ago: this is the tip of the iceberg.

My engine tests show a 30% or greater performance difference comparing the X800 to the GF 6 line when heavy shaders are in use. It's not that we optimized for the GF 6 line; it's just that they are much better with shaders now, and it's not just SM 3.0 either. Everything with 2.0 is faster, even single-pass shaders. Remember, I'm still making my engine backward compatible, so I can't do without 2.0 or even 1.1 support; we are using them as fallbacks just in case a card doesn't have 3.0 or is not fast enough to do 2.0.
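As a rough sketch of the fallback idea described above (the function name, version numbers and the speed check are hypothetical, not taken from the poster's actual engine):

def pick_shader_path(max_shader_model, fast_sm2):
    # Choose the highest path the card can run well, falling back as needed.
    if max_shader_model >= 3.0:
        return "SM 3.0"   # e.g. GF 6 series: single-pass lighting, dynamic branching
    if max_shader_model >= 2.0 and fast_sm2:
        return "SM 2.0"   # e.g. R3xx/R4xx-class cards
    return "SM 1.1"       # legacy fallback for older or slower hardware

print(pick_shader_path(3.0, True))   # SM 3.0
print(pick_shader_path(2.0, True))   # SM 2.0
print(pick_shader_path(2.0, False))  # SM 1.1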

And another one where the Ultra is faster without SM 3.0.

<A HREF="http://www.ixbt-labs.com/articles2/gffx/nv40-3-p2.html" target="_new">http://www.ixbt-labs.com/articles2/gffx/nv40-3-p2.html&...;/A> Keep in mind there is not 1.2 patch for Far Cry either.

And a custom test from Hexus:

<A HREF="http://www.hexus.net/content/reviews/review.php?dXJsX3J..." target="_new">http://www.hexus.net/content/reviews/review.php?dXJsX3J...;/A>
July 3, 2004 2:35:22 AM

What is up with Patch 1.2 and the ATI cards? Looking at your last link, the X800s lose a lot of performance with the new patch. The best scores of all at 1600x1200 4X/8X are the X800 XT PE's with 1.1. But with the new patch, NV almost catches up and ATI drops back. What gives?


ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 2:57:29 AM

Quote:
Well, the GT matches the XT PE in most of the benches

This is the Quake 3 argument. Yes, at 1024x768 the GT is very capable of matching the XT; however, crank it up to 16x12 and suddenly it lags behind. Just like a GF4 Ti can outperform an FX5900 or R9800 in low-res tests.

Quote:
so how can you compare the Ultra to the XT PE in price vs. performance?

Because considering the small difference in performance in real terms outside of AA/AF, they are close enough. If the AA/AF result turns out to be global (which doesn't match many of the current benchies), then that's something else to consider, but it is only one game so far. However, I need not compare, as you made the statement that <i>"Overall ATi's and nV's cards are very close, but nV's cards are 100 bucks less for the same performance."</i> So either you're talking about the GT and the ULTRA, or you're misleading people about the performance of the GF6800 if it is to be the second card in the "nV's cards" category. So which is it? I have no argument that the GT is a great deal, but don't try and pretend that I chose the cards to compare.

Quote:
What's there to look at beyond the benches? They already went over the IQ, which is now very similar on both cards; everyone knows this. Anandtech, TechReport, and Xbit all said this, as did Guru3D and Beyond3D.

Well, what was there to look at after the initial GF6800 and X800 reviews? I guess NO ONE found any irregularities, right? How did EVERYONE miss them if they checked as hard as you say people do? These initial benchies get minimal attention to detail. They look for bugs, like the one for the X800 in the Anand review and the one for the GF6800 in the Xbit review. They don't grind through the IQ for days yet, if they ever even do. The reviewers I listed take the time, if not now then later, to look at the tests/IQ in detail. Anand basically admitted they didn't do all the testing they wanted, since they used the nV-supplied benchies and picked some of their own that didn't potentially stress the advantages of the patch and access to SM3.0.

Quote:
If ya still have doubts and don't trust 5 independent benchmarkers, might as well buy the cards and test for yourself.

Well, I'll leave the testing to people I KNOW can do a better job than I, and who are also paid to do it. They also have better tools at their disposal, so I trust their final words. I'm sure that everyone should've trusted those initial FX reviews too, eh? The ones that said they were tops. I'll wait. I admit there's significant improvement, but outside of the AA/AF, it's not as phenomenal as you predicted. And BTW, where's that performance improvement for the FX series you spoke of? Everything so far shows the opposite.

Quote:
And can drivers pull out more performance for nV? Just have to see about this *hint*

Sure they can, we've seen it before, but can they do it without adding other issues? Now that's the question. Can ATI pull out faster speeds with their drivers? Sure they can, but at what cost? I'll wait for TRUE IQ tests, not just a random sampling of screenies taken while someone rushes to meet their release deadline. And as Xbit hints, there's obviously more headroom in the ATIs too; their final remark is quite telling: <font color=blue><i>For an unknown reason the RADEON X800-series graphics products’ performance slightly dropped in FarCry version 1.2 compared to the version 1.1. The reasons for the drop are not clear, but may indicate that ATI also has some speed headroom with its latest family of graphics processors and the final judgement is yet to be made…</i></font color=blue> And that very last line is exactly where I stand on it.

Quote:
Also you can't deny the fact that the Ultra is now faster, and with a small overclock like BFG does, the GTs and Ultras are even faster.

Yes, they are faster for the most part, in one game. But even then the XT does have its victories, which isn't the global thrashing you predicted.

The funny thing is that the standard results aren't that far off from those that Digit-Life got with the <A HREF="http://www.digit-life.com/articles2/gffx/nv40-3-p3.html..." target="_new">1.1 patch and SM2.0</A>, so I wouldn't say it's THAT impressive. The AA/AF results are the impressive ones, but don't try and convince me that we shouldn't question those scores considering both companies' recent activities in this area, and THAT TOO is a GIVEN.

Quote:
You can keep talking about one-frame-rate differences all you want; that's not going to change the fact that I was right three weeks back,

Global 30% then? Just by drivers alone then? An increase for the FX then? Nah, haven't seen that yet. Sure, a little bit of this and a little bit of that has brought them up, but so far it's still just one game. Show me some other stellar improvement like you promised.

Quote:
You were talking about speculation and benchmarks, and now they are out; try to deny it all you want, but this is the truth.

The truth doesn't match your promises except in certain areas; the global increase you spoke of never materialized.

Quote:
yeah, the FX line left a bad taste in everyone's mouth, but things are about to change.

No one is saying they are the FX line, nor that things aren't different; however, the bill of goods that nV and yourself have tried to sell still hasn't materialized. It's got better performance, but nothing so much as to make people say, gee, damn, NOW that's efficient. Bring me a non-TWIMTBP game that actually optimizes for more than the less than 1% of cards out there, and then you'd have a more convincing argument. So far, it's doing well in a game that it should do well in, since they've optimized specifically for it. Show me equal improvements in a game like Tomb Raider AOD where they currently struggle, and then you'd have something. This isn't some kind of groundbreaking, earth-shattering performance difference like the HL2 initial numbers; this is a few frames' difference, and not very different from those previous Digit-Life benchmark results. The AA/AF is impressive, and if it does hold true, then that's something, but as a demo of a pure, raw SM3.0 advantage it's very unimpressive. I expect D]|[ to provide a much larger gap than this. So really, for all the lead-up and hoopla, not that impressive. Sure there are benefits, but nowhere near what was advertised.

Quote:
Don't be so skeptical next time.

Yeah, that'll be the day. Without so much as a 3DMark (still having trouble posting them?), what did you have to offer? And still the end result wasn't as good as you said. If it were, then these results <A HREF="http://www.techreport.com/etc/2004q3/farcry/index.x?pg=..." target="_new">TechReport1</A> and <A HREF="http://www.techreport.com/etc/2004q3/farcry/index.x?pg=..." target="_new">TechReport2</A> wouldn't be so close, despite what has become an obvious drop in performance for the ATIs due to the patch.

Quote:
Remember, I'm still making my engine backward compatible, so I can't do without 2.0 or even 1.1 support; we are using them as fallbacks just in case a card doesn't have 3.0 or is not fast enough to do 2.0.

Which is exactly what CryTek didn't do in this case. This entire patch was meant for basically 1% of card owners. FX users got hosed simply to highlight this card. Considering the number of people who own GF6800s, it seems quite the slap in the face to old users. As for improvements, I wonder how many FX users will decide not to install the patch simply so they can keep their performance numbers at still-reasonable levels. You wonder why I question and doubt; it's simply because CryTek has been a willing PR participant from the start, trumpeting PS3.0 support in Patch 1.1, which you said wasn't there, so either it's a lie or simply a rushed feature that wouldn't work. Either way, that kind of 'effort' makes me a little sceptical to say the least. It's surprising too that the patch usually hurts the X800's performance, whereas prior to the patch things were rather fine. Even in the Hexus results the difference between nV post-patch/SM3 and ATI pre-patch runs at closer to less than 10%. It's surprising how without the patch the X800XT at high res + AA/AF did better than the GF6800U with the updates, yet 'patch' the X800 and suddenly it's struggling. That seems to be the case a lot of the time.

Quote:
And another one where the Ultra is faster without SM 3.0.

Yes, see above, but linked to the proper page; like I said, as an improvement over THAT, it's not impressive. So how much is SM3.0 and how much was driver version 61.34?

The main thing is that the GF6800s do very well, but does the improvement come anywhere near the promotion? I don't think so, and most of the reviewers so far tend to agree.

The future may offer far better tests, and like I said before, likely games that are built AROUND SM3.0 will show bigger differences IMO. So far, the differences are somewhat limited, even if they are something that looks good in PR print ads.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 3:01:10 AM

CryTek's not 'In The Game' so they are showing you 'The Way It's Meant To Be Played', even on ATI cards.

ATI gets a patch they don't <b>need</b> which slows down performance and <A HREF="http://images.anandtech.com/reviews/video/nvidia/farcry..." target="_new">F's up the IQ</A>. Yeah, I wouldn't plan on downloading that patch if I owned FarCry.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 5:06:41 AM

I wish we also had some benchmarks done with FRAPS rather than demos, to include all the in-game AI and physics. ;) 

Anandtech updated their review with corrected numbers.
It was indeed AA not being applied to the 6800s, because it was enabled through the control panel.

The X800 XT PE (SM2.0) does not lose once to the 6800U (SM3.0) with 4xAA/8xAF enabled. It shows a pretty good lead in some tests.

The X800 XT PE and 6800U are basically tied with no AA/AF in the first 2 tests.
SM3.0 shows a good improvement for the 6800U, and it wins most of the Nvidia demos where the single-pass lighting path comes in handy (with no AA/AF).

<P ID="edit"><FONT SIZE=-1><EM>Edited by piccoro on 07/03/04 01:08 AM.</EM></FONT></P>
July 3, 2004 5:10:02 AM

Ah, 3 weeks ago the drivers were different, or did you forget?

All your arguments were based on old drivers. Two days ago you were harping on IQ while showing old pictures of Far Cry from before the 1.1 patch with the 61.11/61.12 drivers.

Actually, performance changed for ATi cards in Far Cry, so it's probably due to a bug, not something CryTek did; they are partners with both ATi and nV. Anandtech pointed out there were bugs at the texture level for ATi cards with patch 1.2, and if you remember, I stated that bugs at the texture level cause pixel overdraw, thus slowing down performance. And if AF is used, this performance drop becomes more pronounced.

There was an overall 30% increase with DX9.0c, SM 3.0 and the new drivers, which I stated two and a half weeks back. Another thing you seem to have forgotten :) .

Remember the facts. I remember them very well. I know what you said, and I don't forget what I said. You might want to go back and look at those threads hehe.

This is what I said.

The drivers gave a 15-20% increase and DX9.0c and SM 3.0 gave 10-15%. So guess what, that all came true :) .

There were stellar improvements in Tomb Raider and Painkiller, not to mention other DX9 games like UT 2004, where the GF 6 now leads too.

The GF 6 line beats ATi in Painkiller now also.

It also beats them in the Half-Life 2 beta.

Where was Gabe Newell's 40% ATi lead?
July 3, 2004 8:03:12 AM

Yes, let's remember your statements.

Let's see, my personal favourite:

<font color=green>"test it the fx line with the new drivers they are faster much faster with dx9c too. And the shader quality still not as good as ATi, but better. Also you can now turn of AF optimizations on the fx line aswell with .40 and up. "</font color=green>

Hmmm, that didn't seem to happen, now did it? If anything, tests so far have shown negative results for the FX line. Perhaps they need to fix the run-time compiler again.

Now on to the original 30%: it wasn't the combination of DX9.0c and other items, it was the drivers ALONE that you stated caused the increase. The exact words:

<font color=green>"Well those were the old driver tests, the new drivers with the 30% boost it will take care of the pro and xt :) "</font color=green>

Not DX9.0c + a patch or any other variable, simply the above statement, which still hasn't come to pass on its own even with the 61.72 drivers.

It's funny that overall even the difference between the old and the new cannot be fully established, since for the most part everyone is using different benchmarking runs, and often different setups. You did notice, I'm sure, that most of the reviewers have increased their CPU power since their previous reviews. You weren't fooled by this into simply subtracting one score from the other again?

And not to forget this one of course;

<font color=green>"This is a bit different 61.11 already boosted the performance 20% on lower reses, higher reses 30%. (far cry as an example)"</font color=green>

Hmm, so that initial statement was of a 30% boost from the 61.11s alone, not from the 61.45s + SM3.0 + Patch 1.2.

So in effect a ~70% increase, eh, since the later boost started after the 61.34 drivers? Or were you just throwing everything together because you didn't know what caused what?

Quote:
There was an overall 30% increase with DX9.0c, SM 3.0 and the new drivers, which I stated two and a half weeks back. Another thing you seem to have forgotten :) .

No, I remember it well; you DIDN'T state A+B+C, simply that A alone did it, that B would do it too, and hold on to your socks for C!

All things together, sure, 30% is reasonable; it's not that great an improvement. But you were selling each part as the improvement, and when it didn't come to pass you simply said it was going to happen later. And once, when you thought it had come to pass, you didn't notice the CPU differences, and once again said, maybe next time. And interestingly, here's another error on your part:

Quote:
The drivers gave a 15-20% increase and DX9.0c and SM 3.0 gave 10-15%.

Which gives us a compounded model of 27% to 38% (1.15 × 1.10 ≈ 1.27 at the low end, 1.20 × 1.15 = 1.38 at the high end), which doesn't add up to your statements, and overall sometimes doesn't even add up to the reality of the increase.

Quote:
This is what I said.

No, not quite.

Quote:
I don't forget what I said

But you do <b>forget</b>. You forget that your statements never started with the holy trinity of improvements; simply that one aspect, the drivers, brought all the performance, and that SM3.0 and later the patch would then do more. 30% from a whole whack of drivers, an nV-centric patch, and an update to DX (which still hasn't truly arrived yet); well, sure, that's far more believable than the magic drivers you were selling.

Quote:
You might want to go back and look at those threads hehe.

I did, hehe, and it doesn't bear out your statements any more now than your supposed proof of links to faster CPUs did then.

Now on to something out of context...

Quote:
Where was Gabe Newell's 40% ATi lead?

It's still there with the same cards that had that problem, the FX line, and so far it doesn't look like any new magic fixes are going to help them either. So far everything has shown the opposite.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 8:24:44 AM

I just thought we'd revisit another one of your statements;

<font color=green>I wouldn't be surprised if they do the tests in the coming weeks and find this out too! If ya want to say that's not true, Grape, be my guest, cause this is a given.</font color=green>

Once again, just accept it because it's 'a given', is that it, simply because our 'betters' tell us so? Well, let's see what Piccoro was mentioning above:

<font color=blue>UPDATE: It has recently come to our attention that our 4xAA/8xAF benchmark numbers for NVIDIA 6800 series cards were incorrect when this article was first published. The control panel was used to set the antialiasing level, which doesn't work with FarCry unless set specifically in the FarCry profile (which was not done here). We appologize for the error, and have updated our graphs and analysis accordingly.</font color=blue>

I guess if they HAD looked at the IQ they might have found a difference between NO AA and some? No?!? A card running AA versus a card that isn't sounds to me like a serious floptimization.

This just PROVES that they didn't really look in depth, since any screen comparison would have shown the difference. A rush to get the review to market? Hmm, wonder why a person might want to wait and SEE?

So did you wish to revise your original statement, or leave the brilliant "When I state something this solidly, I've already done the tests", which means either you didn't properly enable AA yourself, or you didn't bother to question results that deviated from your own because they fit your purpose better.

As I stated before about TechReport's review, "As I do, it doesn't look the same as the Anand review."

And even with 5 good review sites out, it appears that healthy skepticism is a good thing. Perhaps you'd like to rethink how you go about validating your reviews/statements.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 8:27:04 AM

Thank you, Sir!

Had you not pointed that out I likely never would have revisited that review, and a pretty valuable piece of information, which it obviously was wise to question, would have gone unnoticed and unchallenged.

I have always relied on the kindness of strangers. :lol: 


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 3, 2004 1:21:26 PM

Yeah, LOL thx for pointing that out. That totally explains the AA/AF improvements. :lol:  Oh well, the 6800Ultra still beats the X800 Pro anyway, and the GT beats the..., uh, the..., uh, the... Well it beats the 6800 anyway. :wink:


ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt (Edited by pauldh on 07/03/04 09:22 AM.)
July 3, 2004 1:25:57 PM

Grape, I predict a 60% improvement in AA/AF on the X800's. Just get the new beta drivers and then disable AA/AF and you get a huge fps increase. Just you wait and see. hehe


ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 1:48:34 PM

Take a look at all the reviews and add up the numbers; the 6800 is still in the lead :) . ATi does better where the shaders aren't as intense (which is pointed out), AKA the outdoor levels; nV does better in the indoor levels, much better with AA and AF too. So add the numbers up and the 6800 is now faster.

When you reported that there was something strange in Anandtech's numbers and someone showed TechReport, I said this same thing. Add the numbers up from Xbit, TechReport, Hexus, any of them (remember Hexus was a custom bench, not one nV or CryTek uses).

So how can you still argue when everything you said was BS from 3 weeks back?


Also, the banding on the FX line has been taken care of; just checked.
<P ID="edit"><FONT SIZE=-1><EM>Edited by entium on 07/03/04 04:41 PM.</EM></FONT></P>
July 3, 2004 1:57:44 PM

Why the hell are ya still talking about the FX line, which I've stated quite a few times didn't have good quality? These drivers are probably not going to affect that. I wasn't even referring to the FX line. So you're talking BS again, which I thought we cleared up?

Show me where I said the FX line got a global increase from the new drivers.


From 61.11 to 61.34 some of the benchmark gains are more than 40%. Again, take a look at some of the old benchmarks and compare them to the newer benchmark numbers.

(new benchmark - old benchmark) / old benchmark

JK
I wasn't throwing crap together, man. Look at how the older drivers were; wasn't the Ultra getting 20-30 with AA and AF at 4x/8x at 1600x1200 on the older drivers? What is a 30% boost on that, 30-40 now? So where is your math?

And if you look at it this way: it was, let's say, 25 fps and now it's doing 55 fps, so guess what, there was more than a 100% improvement, not the 75% you got from wherever. If there was an initial 30% improvement and then a 45% improvement, you can't just add the percentages up and say it's 75%; math 101, the base number changed, you have to do the percentages as a whole.

Seems like you don't know how percentages work.

Example: let's say the frame rate is 30 and it goes up to 40.

40-30=a change of 10

10/30 *100= 33%

So now where are we at again? 50-55, I think, is what the nV 6800 Ultra is pulling, depending on the level.

So let's say it started at 35, shall we, with the 61.11 (confirmed this too at FiringSquad just to make sure)
<A HREF="http://firingsquad.com/hardware/ati_radeon_x800/page25...." target="_new">http://firingsquad.com/hardware/ati_radeon_x800/page25....;/A>

drivers, so now where are we: 55 - 35 = 20

20/35*100= roughly 58% increase
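For reference, the same arithmetic as a quick Python check (the fps figures are the round numbers used in the example above, not exact benchmark results):

def pct_increase(old, new):
    # Relative increase, in percent, measured against the old baseline.
    return (new - old) / old * 100

print(pct_increase(30, 40))      # ~33.3%: the 30 -> 40 example
print(pct_increase(35, 55))      # ~57.1%: the 35 -> 55 Far Cry example
# Successive gains multiply rather than add: +30% then +45% compounds to ~88.5%, not 75%.
print((1.30 * 1.45 - 1) * 100)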

Were my numbers wrong?

Really should check your math before ya speak; I already did.

I don't like spelling things out. It's simple arithmetic, you should do it. It's not hard; might take ya a couple of minutes or so.

If the CPU isn't a limitation, which it isn't in Far Cry when you have high res and AA and AF, then the test results should be very similar in any case, as long as the graphics card and driver sets are the same.


Here is another custom demo

<A HREF="http://firingsquad.com/hardware/far_cry_sm30/page5.asp" target="_new">http://firingsquad.com/hardware/far_cry_sm30/page5.asp&...;/A>

Unfortunately the X800 XT is not in this batch, but you can see that at 1600x1200 with 4x/8x AA/AF the GT is at 42, the Ultra is at 46, and the Ultra Extreme is at 50, where the ATi Pro is at 38 and the XT PE is at 51.

Well, the XT is a fair bit slower than the XT PE, so it's safe to say it's around 46.

Now let's go back and recap different games.

ATi loses all OpenGL games, ATi loses in the Half-Life 2 beta demo tests, in Far Cry each card set now shares leads, Painkiller is close on both cards, UT 2004 is all nV; well, that doesn't leave much for ATi to say they have the performance crown, does it?

Here is a nice game recap for ya: <A HREF="http://www.digit-life.com/articles2/gffx/nv40-3-p4.html..." target="_new">http://www.digit-life.com/articles2/gffx/nv40-3-p4.html...</A>

And also Guru3D:

<A HREF="http://guru3d.com/article/Videocards/135/" target="_new">http://guru3d.com/article/Videocards/135/&lt;/A>
July 3, 2004 10:32:34 PM

The conclusion to your Digit-Life review states that the Ultra loses to the XT:

"Total:

Leadtek GeForce 6800 Ultra vs. ATI RADEON X800 XT: a loss
Leadtek GeForce 6800 vs. ATI RADEON 9800 XT: a victory
Leadtek GeForce 6800 vs. NVIDIA GeForce FX 5950 Ultra: a victory"

And your linked Guru3D article has had an update done which includes the X800XT and a faster test system. They now state this:
"The faster test-system proved that performance wise things will shift in advantage for the x800 XT series over the Ultra."

Seems there is no way to claim a real winner even with all the new drivers and patches. NV has caught up. The GT looks very good. But NV isn't faster. When stressed the most, the XT is still a hair in the lead.

<A HREF="http://www.guru3d.com/article/content/136/9/" target="_new">http://www.guru3d.com/article/content/136/9/&lt;/A>




ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 10:58:04 PM

Well, there were no AA and AF on these either; the only game that was a clear victory for the XT was Splinter Cell, and that's a DX8 game. In all games with AA and AF the CPU isn't a bottleneck, and I would imagine you wouldn't be playing one of these games on these cards without those. Plus no DX9.0c and SM 3.0 drivers.
July 3, 2004 11:01:22 PM

The GT looks like a nice card, especially when it drops to the under $300 range some day. Should be a very popular card indeed.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 11:18:32 PM

Yeah it does; err, wasn't there an online store that had a 30% discount on it? Think it was for PNY cards.
July 3, 2004 11:26:18 PM

Me, I like to see [H]'s max playable settings, and 1600x1200 both with and without AA/AF. Why bother with 1024x768 on these monster cards unless, like in [H]'s tests, that is max playable? Otherwise the king of 1600x1200 4X/8X is what I'd call the fastest card overall. (Although on a 19" CRT I prefer to play at 1280x1024 myself, so I do look at those tests too.)

Going by this, in UT2004 at 1600x1200 4X/8X the XT beats the Ultra, Wolfenstein goes to the Ultra, Far Cry to the XT, Splinter Cell to the XT, 3DMark03 and Aquamark to the XT. Stressed the most, at 1600x1200 and in the AA/AF charts, it is the XT that won 5 out of 6 tests. They are both almost identical, but the XT did edge out the Ultra in all but Wolfenstein. And Wolfenstein is ancient. So the XT only gets 142 FPS at 1600x1200 4X/8X; no big deal. NV is higher, but this game doesn't make the cards break a sweat. I still say no clear winner. I'm just not sure how you come to the conclusion the Ultra is faster than the XT. NV looks very good, but so does ATI. I'd personally say right now the 6800GT and the XT Platinum are the two best new options. But like others, I will wait to fully decide: more sites doing tests on retail cards and scrutinizing IQ for cheats. Sad, but that is what it has come to, to see if there will be a clear winner. Either way, unlike last round, it seems to me that neither company lost this round.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 11:30:00 PM

That is a killer card at 30% off MSRP. I would have ordered one if I saw that card for $280-$300. If GTs sell that cheap, what will the plain 6800 sell for?


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 3, 2004 11:37:34 PM

This is how I see it: the 61.45 drivers are faster than the 61.34 drivers (a few bug fixes), and later tests, by T-break I think (I don't remember which benchmarker tested this), showed that. I didn't see many changes after the 61.45 driver set.

Hopefully nV fixes their woes with AA and AF, cause that's really hurting them.


I'll see if I can find that PNY link for ya after dinner :) 
July 4, 2004 12:13:25 AM

Ah, CompUSA had the 30% off all PNY cards, and of course none of the 6800s are available online or in a store near me. I guess I missed that one last Sunday. I'd have bought a GT for $280 + tax for sure.


ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 4, 2004 9:55:14 AM

Quote:
So add the numbers up and the 6800 is now faster.

I can see that the 6800 is basically within the margin of error of the X800. I said they're close, you say there's something special to differentiate the speeds. You talk about complex shaders, yet the difference is still minimal.

Quote:
When you reported that there was something strange in Anandtech's numbers and someone showed TechReport, I said this same thing.

Right, you said the numbers aren't as high but the ratio is the same. Sure, that's doubt. The ratio/% increase is still what we're talking about, and it's not there.

Quote:
So how can you still argue when everything you said was BS from 3 weeks back?

Because you said the drivers would do it alone, and you were wrong; then you said to wait, the next drivers would do it, and you were wrong. Finally, after about 6 updates plus CryTek's NV40-only boosts (at the cost of other players), you finally get the increase you said would come from the drivers alone. Talk about BS, man, you spread it pretty thick. And you wonder why people question your numbers, yet we can't question numbers that look fishy, which you defend while asking me why I doubt them, or more accurately, <i>"Don't be so skeptical next time"</i>, which shows you were wrong again, because once again reality didn't match the initial hype.

Quote:
Why the hell are ya still talking about the FX line,



'Cause you said that the drivers and DX9.0c would help it too, and you were wrong about that and FarCry too. It's just an example of you throwing a lot out there and hoping something would stick.

Quote:
which I've stated quite a few times didn't have good quality? These drivers are probably not going to affect that. I wasn't even referring to the FX line.

The quote's just a few posts back; if you want I can re-highlight it for you, but since your memory is SOOoooo good I probably don't have to, right?

You talk about the 40% difference, which is in direct reference to the last generation. You keep asking why I bring it up; simple, because you keep mentioning it.

Quote:
And if you look at it this way: it was, let's say, 25 fps and now it's doing 55 fps

Let's say you cobble together a test that is the same and that shows your improvement. You talk about the Anand and FiringSquad benchies, and while they do use the same basic setup (except Anand has a faster drive), the benchmarks are different. I know you have a problem with consistency, but you should at least be able to see that different benchies give different results. You think one result from one benchmark can be compared to a totally different benchmark? Seems like you don't know about something called internal validity.

Quote:
If the CPU isn't a limitation, which it isn't in Far Cry when you have high res and AA and AF, then the test results should be very similar in any case, as long as the graphics card and driver sets are the same.

As long as the benchmarks are the same, which they aren't. Seriously, if you need to resort to that kind of cheating to try and validate your statements, that's pretty cheesy. Don't worry about your arithmetic; I'm sure you had your calculator out churning numbers to look for something to support your statements. Too bad your methods are shot, so your math adds up to squat.

Quote:
Here is another custom demo

It's obvious you just don't know what 'CUSTOM' means. As in not the same as others.

Quote:
Unfortunately the X800 XT is not in this batch

Why unfortunate? Unlike the GF6800 Ultra Extreme Supa-Doopa, the X800 XT Platinum IS available; heck, I even showed you how it costs less than the regular ULTRA, so what happened to your $100 difference mattering, etc.? Now you want to compare to the lesser XT, just because it suits your purposes?

Quote:
ATi loses all OpenGL games, ATi loses in the Half-Life 2 beta demo tests

Unless AA/AF is on; then it's a toss-up.


Quote:
Painkiller is close on both cards,

Except when you turn on AA/AF; sure they're close, but anything you could call a win anywhere else is definitely a win there.

Quote:
UT 2004 is all nV,

Yet UT2K3 is all ATI in the Digit-Life review you posted, and there's no UT2K4. I KNOW from experience that the UT2K4 numbers are close; however, in the review you posted the UT2K3 numbers aren't close.

Quote:
well, that doesn't leave much for ATi to say they have the performance crown, does it?

Except for the previous page, where they outperform in Tomb Raider and Unreal II. In reality it doesn't leave much for EITHER to claim the crown, especially if the Digit-Life review is with just an XT and not an XT Platinum. It really would've helped to see the speeds of the XT, and its overclock, not just the Ultra's overclock, especially considering the price advantage.

I have always said the GT is a great performer, but when it comes to the TOP cards, the two are tied, and since you wanted to mention money, then for the top slot the XT has to win based on price AND availability.

Currently the two best buys to me remain the GF6800GT and the X600XT (if you don't include the R9800 Pro), and the top slots are both wastes of money, with the XT being the better deal between the two e-penis money grabbers. Like Paul points out, the differences are so close it only takes a system upgrade to shift things around.

However, back to the subject at hand. You said the drivers alone would offer a global improvement, and you were wrong. You offered this up as proof of a MAJOR coup, and you were wrong; you said I should trust the reviewers or else do the tests myself, and once again you, like Anand's testing methods, were wrong. You talk about how <i>"ATi does fairly well, but not well enough."</i>, when in fact they do pretty well for a game/test/patch/site geared to benefit and showcase nV. If this is what you have as evidence of the MUST-have advantage you keep jawing about, this is not it. So far this shows simply that for the foreseeable future, there's not much out there to differentiate the two cards. I DO expect that to change with new games, but even then who knows; if the differences are this mild, then it will have less impact than we thought. I still stand behind the choice of the GT, but just like with the FX5900XT, that too could change if the price ranges shift around. You mistakenly (no surprise there) mention nV's 2 billion a year in profit that they can spend; well, neither company wants to burn money, but of the two, ATI has more profit margin to burn, with a previous 12-month profit of $165 mil (~$90 mil difference), on a money-losing marketing fight, which wouldn't benefit either company IMO. So shifting the non-Platinum XT and Pro down a price level once there are actually GF6800s out there to compete against would make things more difficult to call. Until then, however, there is one clear choice IMO, as long as you can find it.

So far you've been more wrong than right, and it's not impressive when the only thing you do get right and try to focus on is your basic math skillz.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 4, 2004 9:57:49 AM

The plain-Jane GF6800 may simply wind up as a mainly OEM thing short term, where just the moniker GF6800 will garner buyers. The GT can sell at $250-350 and leave the sub-$250 market to the GF6800 in a month or two (right now ~$300 for a GF6800 is too much IMO, when a GT is only a few bones more). The main issue will be what to do with the old FXs and even the PCX5700 series. Once again the mid-range cards may suffer because of the low price of the lower-upper-end card.

The GT will probably remain the best buy for the near term, and I find it strange that someone feels they have to attack the X800 series to try and prop up a card that right now beats them all dollar for dollar IMO. Sure that may change later, but for now, the GT is the FX5900XT of the last generation, and if nV can move enough to market, they should sell very well.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 4, 2004 2:30:29 PM

Grape, you want BS? Every time you said that my numbers were wrong, well, they are right now. And you can deny it as much as you want. 58%, actually, I just pulled out the calculator: 57.1...; I approximated. So you tell me, I said 40% from drivers all the way around? Break down the drivers and DX9.0c, so what is the breakdown? Did you forget the 100 percent improvements in Tomb Raider and Painkiller?

And the roughly 35% increase in UT 2004? So why do you argue, when the numbers are hitting you smack in the face and everything you say has no meaning or context that gets even close to what the benchmarkers have seen with the different driver sets?

BTW, I never said that the drivers and DX9.0c would help the FX line. The drivers did help the FX line a little, only with image quality: it removed the banding. But there was really no major speed gain. So again, I think I was unsure whether the banding was removed or not, that's what I said, and it was removed.

Dude, you don't have Far Cry. You can make your own custom demos; it's not hard, just hit a button and walk around. That's what those two other sites did.

So now you tell me: I showed ya the benchmarks, and you still deny it. So what the hell is wrong with you?

I think you are too thick-skulled. By the way, don't you use percentages when you use filters in Photoshop for editing? You really should take a course in algebra 1 or 2 just to freshen up.


Also, do you think AA and AF are a driver issue? There is a very good chance they are, because it's the drivers that regulate AA and AF, not the cards. So you think nV won't improve those in the next few driver sets? I'm having a conference call with nV Tuesday, and this is what we will be talking about.
July 4, 2004 8:13:39 PM

Quote:
Grape, you want BS?

Nope, and that's exactly why I point out your BS in the hope it would end, but obviously it won't, so why bother?

Quote:
Every time you said that my numbers were wrong, well, they are right now.

A sprinkling here and there, and as so many sites have shown, the improvements are mostly from Patch 1.2, not the drivers, unlike you said. So after some driver improvement, now you want to claim that your global 30% was referring to drivers + DX9.0c + Patch 1.2 + SM3.0, which it wasn't; you stated one thing, and still haven't backed it up. And you were talking about everything getting a boost, not just one or two things, and about all these boosts from the drivers destroying the X800 series. Neither of those things has happened. Things are better, but they are still close, and it's not a sweep like you predicted.

While you are familiar with how to make a custom demo, you aren't familiar with the idea that one custom demo cannot be compared to a DIFFERENT custom demo, and all you've done so far is post unrelated benchmarks, some even with different setups; you did that last time you were trying to point out driver improvement (at least then you realized your folly of different CPUs), so nothing's changed there. Oooh, thanks for the tip about custom benchies, btw, never would have guessed; good thing we have you around to explain it.

Quote:
BTW, I never said that the drivers and DX9.0c would help the FX line.

Sure you did, here are your words again. Seems like you said EXACTLY that; are you lying, or did you FORGET, like you said you don't?

<font color=green>"test it the fx line with the new drivers they are faster much faster with dx9c too. And the shader quality still not as good as ATi, but better. Also you can now turn of AF optimizations on the fx line aswell with .40 and up. "</font color=green>

I see the words faster, and MUCH FASTER, and DX9c. So you want to deny that you said it, because it proves that you'll say anything in the hopes that it may come true.

Quote:
So again, I think I was unsure whether the banding was removed or not, that's what I said, and it was removed.

Nope, what you said is quoted above, and it's not about banding, it's about speed. You can deny it all you want, but that's exactly what you said.

Quote:
Also, do you think AA and AF are a driver issue? There is a very good chance they are,...

BLAH Blah blah, whatever dude. Once again, tomorrow, tomorrow, they'll do it tomorrow. Whatever dude, and if they don't you'll simply deny that you ever said they would.

BTW, what's a percentage man?

You're so smart, please 'splain to everyone exactly how to figure it out. I'm sure it would be nice to know what the percentage of your right to wrong answers is, cause right now my calculator doesn't have scientific notation that goes to minus triple digits. So as x approaches zero it equals entium, is that it?


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 4, 2004 9:34:46 PM

Pointless rambling; I proved what I've seen.

And now, just to make ya look stupid, I did the math for ya too.

If ya want to see more numbers, I already showed ya how the percentage thing works: a 57.1% increase with drivers and DX9.0c.

I said what, 40%?

Theirs was higher than mine. Now if ya want to say I lied or misrepresented numbers, I suggest you shut up, cause I didn't; the benchmarkers proved that. The Grape Ape in the cartoon was thick-skulled just like you; I see the reason you picked the name.

CPUs? BS, man; if it's CPU-limited then yes, it will be a factor. I don't think you know [-peep-] about computers just because of that fact. When AA and AF are on, the CPU is no longer the limit.

All ya do is read reviews; take a step in my shoes and see what I see. I already see what you see, but I'm telling you it's wrong! And stop disrupting posts with your lunatic badgering that I lied, even after I showed you it was right. What I've been saying has been coming true. You really think I didn't know about the reg hack for ATi? Why did I post it? Because I wanted someone else to find it. I guess ya just don't look deep enough for your information. That's the problem. You will always be looking in from the outside. It's not hard to really step into this and see it from the inside. That's why Beyond3D is better for an overall graphics forum; a lot more people there understand the differences just like I do, and see the difference in a judgemental way. You don't, and never will, because you're thick-skulled, or could you just be blind?

I can point out so many facts to you, which I already have, but it won't change the fact that you have your own paradigms.

So stay here, be happy, and learn nothing else. I don't think you have the capacity to learn anything else, because you are too inflexible. And stop saying that I lied or misled someone, because I will personally tell you that you are more misleading at times, which I have already started correcting you on.

The global improvements come from the drivers and dx9c.
SM 3.0 will only be effective if the shader was written properly. But dx9c does have a minor performance benefit just because it's optimized more; it actually helped ATi's cards too in some tests.

It is a sweep before AA and AF are added, right?

Just wait and see; I'm sure that will be taken care of soon. It's not like AA and AF are hardware limited; that's all done in drivers, something that is very tangible.

Do you see why I don't like you? Because you think you are too good for everyone, when you really don't know anything (the why factor). Well, tell ya what, I don't think you even know who you really are, because you can't even remember the things you say. You try to prove a point by misrepresenting me and you are the one that's actually talking crap. Even after I showed you the numbers you are talking crap. My numbers have been shown perfectly. So if you can't accept that, then just be quiet and sit there, because you have no right to tell me that my numbers were misleading when many benchmarks have shown that they were right. Only when AA and AF are on does the performance drop, and that will soon be fixed. And please, question me again, because you will be shown to be wrong again.

Btw, the first time I posted the performance gains was in a thread that was deleted because of your hostile posting, so I suggest you shut up, because you are irritating me right now by saying the numbers I stated were wrong. The thread pertained to the GT outperforming the pro and closing in on the xt.

Maybe you should learn from the Beyond3D guys what's really going on? I don't think you have the capacity to; prove me wrong. Show me why the 58% performance difference in Far Cry, the 100% performance increase in Painkiller, the 100% performance increase in Tomb Raider, and the 35% performance increase in Unreal Tournament 2004 are so different from my overall 40%. In general, that's all dx9 games. I'm not talking about dx8; the shader model in dx8 is too different to get any performance increase.

If you are so good at why the benchmarks are why they are tell me why this is so? I want you to really understand this because otherwise you will keep talking nonsense without really knowing what you are talking about.
July 4, 2004 10:25:46 PM

*Sticks the patented Anal-Alert™ sticker on this thread*

I hope you guys have an ergonomic keyboard...

<font color=blue>The day <font color=green>Microsoft</font color=green> will make something that doesn't suck is the day they'll start making vacuum cleaners.</font color=blue>
July 4, 2004 11:45:25 PM

Anyhow, back to the topic: when did I say the patch was going to be released for Far Cry? Is it going to be released? Or was that too a figment of my imagination, like the 40% boost on all dx9 games from drivers 61.11 to 61.34 and above?

And if you remember correctly, grape, you said the dates I gave weren't a reality, so you were wrong there too, even though I specifically stated I knew when they were coming out.
a b U Graphics card
July 5, 2004 12:15:16 AM

You really are in love with NV this round, aren't you? :smile: I still like the GT, but I can't join in and claim a sweep like you, because if you buy one of these cards, you want to use AA/AF and high resolutions. And right now ATI is beating NV at those settings. I hope you are right and they do fix it in their drivers. But they better be smart and actually fix it, instead of butchering IQ like last round. To me the X800XTpe is still the power champ right now, because when stressing the GPU the most, it is the winner. I don't argue that NV might change this, and if they do, good for them. But bragging about $500 cards that are so scarce, and drivers that aren't available to the public, and disregarding that they are behind with AA and AF, just doesn't = a big victory in my book. Not yet anyway.




ABIT IS7, P4 2.6C, 1GB Corsair XMS 4000 Pro Series, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
July 5, 2004 12:50:02 AM

I hear ya. Well, the GF 6 line has the best of both worlds; hopefully they take care of the AA and AF issues. I don't like making nV-specific optimizations to get the performance equal to ATi, and I know that's what this next conference call will be about; it's a royal pain in the ass.

Well, Grape seems to distrust my information, which I get first hand, and I tell ya guys as much as I can. Not that I'm misleading anyone; if I was, these benchmarks wouldn't have come out the way they did, and he still thinks I'm lying even with the proof. It's like ya have to blow up a stick of dynamite in Grape's head to get him to understand anything.

What the whole discussion boils down to is that Grape doubts just about everything, even when the numbers are there. It's so ridiculous that he tries to pick on something he doesn't have any idea about, or doesn't even seem to want to do the math or research to figure out. I know he won't be able to answer why the performance boosts are higher than what I originally stated.

And this is why I got a 40% increase and the benchmarkers got more: I didn't do standard benchmarks. I just play the game and a program in the background picks up the FPS numbers every second in Far Cry and UT 2004, not fly-bys or pre-made demos. I specifically chose Far Cry and UT because one uses heavy shaders and one doesn't, and that would have covered the gamut of dx9 games.
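(As a side note on that methodology: averaging per-second FPS samples from a log after the fact is simple enough. A minimal Python sketch; the file names and the one-reading-per-line log format are hypothetical, not the actual logging tool described above, and the logs are assumed non-empty. The driver versions are the ones mentioned in this thread.)

# Average a log of per-second FPS readings captured during live gameplay.
def average_fps(path):
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    return sum(samples) / len(samples)

# Hypothetical log files, one run per driver version.
old_avg = average_fps("farcry_61.11.log")
new_avg = average_fps("farcry_61.34.log")
print(round(old_avg, 1), round(new_avg, 1), round((new_avg - old_avg) / old_avg * 100, 1))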

In Tomb Raider and Painkiller, this is what I have heard (never played the games myself): the slowdown was that Tomb Raider's engine and Painkiller's engine didn't recognize the GF 6 line as dx9 cards; they were recognized as dx7 cards. This was also a problem with Far Cry before, and that's why shaders were off and there were texture abnormalities. So it's a good assumption that it was not just a single game which had this bug.

AA and AF are weird. I don't understand this, because the calculation amounts should be relatively painless when it comes to non-shader games on the GF 6 line.

But from what I've been seeing with my engine (and hopefully I don't need to write code around this problem), when AA and AF are used, ATi's cards use a better optimization technique where both are done at the same time, while with nV's cards first the polygons are filtered with AA and then the textures are done with AF. This is a guess, so don't hold me to it, but that's what seems to be happening.

I don't think this is a hardware issue, because it's more of an optimization technique, which I can do in software with my engine.
a b U Graphics card
July 5, 2004 5:51:22 AM

Quote:
Pointless rambling; I proved what I've seen.

You didn't prove anything you said. You said the increase would come from drivers alone; it never happened. You said that DX9.0c would improve the FX line; it never happened.

You now want to revise and deny what you said and shift the focus with some juvenile math demonstration.

Quote:
Now if ya can say I lied or misrepresented numbers, I suggest you shut up, cause I didn't;

Prove otherwise, I've provided enough evidence of what you said and none of it has happened yet. You pretend it was about drivers + dx9.0c + patch + SM3.0, but that's not what you said, and even your FX comments didn't come to pass.

Quote:
I don't think you know [-peep-] about computers just because of that fact.

Oh, this coming from the same person who had to have this fact pointed out to him by me when I was helping someone with their UT2k4 problem. So thanks for always pointing out what I've already demonstrated. And I know a lot about computers; my Speak and Spell does just fine.

Quote:
All ya do is read reviews

I do more than that, and actually I provide proof, unlike you. The nice thing about getting additional info from reviews is that it means I have evidence (even 3rd party) to back up my statements; you, on the other hand, have the promise of something tomorrow, or later, or whenever the figures add up, that is if they EVER do. Then you'll rework your statements to match the final results.

Quote:
That's why Beyond3D is better for an overall graphics forum; a lot more people there understand the differences just like I do, and see the difference in a judgemental way.

Actually B3D would be better because more people would call you on your BS and FUD. Make these same statements at B3D and you'd get shot down every time.

Quote:
I can point out so many facts to you, which I already have, but it won't change the fact that you have your own paradigms.

You haven't pointed out one fact. In reality, this whole thread is a testament to you pointing to something and it turning out to be a shadow of the truth. Guess your experience didn't help you notice the error in Anand's review before the rest of us. Yes, you're special that way. As for different paradigms, yeah, that's right: I'll stick with truth and evidence over your predictions, FUD, and fallback tactics.

The thing is, you still don't provide one shred of evidence to back up your statements. I've highlighted your statements for you; how about simply proving them? Remember, your statements didn't involve the holy trinity, just the drivers for the GF6800 series, and just the drivers and DX9.0C for the FX line. So prove the statements, and not with two different benchmarks with different setups.

Quote:
Do you see why I don't like you? Because you think you are too good for everyone,

Yeah, that's it, sure. Persecution complex. You're the one lording it over everyone about how special you are and how we should all bow to your wisdom. You don't like me because, unlike everyone else you know, I don't roll over and let you act like the messiah. Whatever dude, I don't care if you like me or not. You know why many don't like you? Simply because you are you, and you won't ever change that.

Quote:
Well, tell ya what, I don't think you even know who you really are, because you can't even remember the things you say.

Sure, this coming from the person who avoids his own quoted words. I also find it funny that you think remembering one's own words is the key to knowing oneself; I guess you are truly lost according to your own guidelines. Personally I think it's more complex than that, but I'm sure that's all it is to you.

Quote:
You try to prove a point by misrepresenting me and you are the one that's actually talking crap.

Direct quotes. I let you hang yourself; it's far easier that way, and I don't have any extra work to do.

Quote:
So if you can't accept that, then just be quiet and sit there, because you have no right to tell me that my numbers were misleading when many benchmarks have shown that they were right.

You sit there, keep <b>quiet</b>, and simply post some proof of your statements.

Quote:
And please, question me again, because you will be shown to be wrong again.

Sure, empty threat; you haven't shown much of anything so far.

Quote:
Btw, the first time I posted the performance gains was in a thread that was deleted because of your hostile posting

Actually it wasn't my postings that would have had it removed. If anything it was the complaints about YOUR postings.

Quote:
so I suggest you shut up, because you are irritating me right now by saying the numbers I stated were wrong.

I suggest you prove any of the statements you made that I've quoted here. Either put up or shut up. Seriously, you whine and whine; who cares if the thread was deleted? Do you cut and paste so much that you can't provide proof again because you've lost your source?

Quote:
Maybe you should learn from the Beyond3D guys what's really going on?

I know more than you think, and the other thing is I know far more B3D guys than you ever will. So thanks for the tip, but like so much of what you post, it's outdated, or else pure fiction.

Quote:
I don't think you have the capacity to; prove me wrong.

I have proved you wrong; in fact, I have done it many times. What you haven't done yet is prove yourself right, other than AFTER the fact with revised statements that don't match what was said.

Quote:
If you are so good at why the benchmarks are why they are tell me why this is so?

You'd actually have to write that in english first.

Quote:
I want you to really understand this because otherwise you will keep talking nonsense without really knowing what you are talking about.

Once again it would need to be in english first.

Do what you do when you did what you did?

First work on the english skillz, then work on the ability to post valid benchmarks. Don't try to hide behind your desire to show off your minimal math skills; instead, show me a valid published benchmark that spans the driver launches and actually backs up your point. Don't let the raw numbers confuse you, try and stick with the basics, like internal or even external validity. Keep going on tangents like B3D, knowing who you are, life, the universe, and everything, just as long as you keep avoiding the task at hand.

Since you have nothing else to back up your statements, there's nothing more to be said, other than the simple fact that you can't back up your original statements and that you try to redirect when confronted with your own words.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 5, 2004 6:56:52 AM

Look grape, you are totally ignorant; apparently you are skirting my question: why was my performance increase 40% on all dx9 games, while the reviewers got 40% or more on most of the other game benchmarks?

And the fx line: I was never talking about them. How many times must we go over that again? You must be a fruit loop, or it goes in one ear and out the other. (Selective memory again, perhaps?)

Why is it that every time there is a discrepancy between us, you ignore the question that I pose and I am left to answer it myself (is it because you are unable or unwilling to put your foot in your mouth?).

Notwithstanding, let me list a few questions I posed that you have failed to answer...

Example 1) Architectural differences and 32-bit performance between the two graphics cards from nV and ATi.

Example 2) Driver optimizations and magic drivers, and why the performance increases with different types of optimizations.

Example 3) Differences between 24-bit and 32-bit precision.

Example 4) Benchmark differences, with my benchmarks showing 40% and other benchmarkers getting over 40% (up to 2x) or around 40%, depending on the game.

___________________________________________________________

You know,

I really must ask,

Do you have the x800 pro or xt?.......

Do you also have the 6800 GT?.......

What the hell do you have?

How can YOU discredit my results if YOU can't do the comparison YOURSELF?

Quote:
The nice thing about getting additional info from reviews is that it means I have evidence (even 3rd party) to back up my statements; you, on the other hand, have the promise of something tomorrow, or later, or whenever the figures add up, that is if they EVER do. Then you'll rework your statements to match the final results.

Right, spewing out other people's work and not doing your own tests.

I look forward to a sensible response to my original questions, although perhaps I may be wasting my time, as it's obvious you can't even understand my questions, let alone answer them.


"skillz" are you black, why are trying to be someone you are not? seems to me thats where the root of your issues are, how about a wiger?
a b U Graphics card
July 5, 2004 12:45:20 PM

Quote:
Look grape, you are totally ignorant; apparently you are skirting my question

Right, now it's me, not you. Of course, it must be someone else. You didn't even show 40% performance gains; you showed two unrelated links and compared them to no baseline.

Quote:
And the fx line: I was never talking about them.

Sure you were, this is your quote, yet again: <font color=green>"Test the fx line with the new drivers; they are faster, much faster, with dx9c too. And the shader quality is still not as good as ATi's, but better."</font color=green>

and since you want to avoid your fx comments, just back up the GF6800 statements;

<font color=green>"This is a bit different 61.11 already boosted the performance 20% on lower reses, higher reses 30%. (far cry as an example)"</font color=green>

also

<font color=green>"Well those were the old driver tests, the new drivers with the 30% boost it will take care of the pro and xt :) "</font color=green> And that was AFTER the 61.11 claim, and NOT the holy trinity you keep refering to, just the drivers alone.

Stop trying to skirt the question, and provide proof of those statements which YOU made. Prove that and I will answer your stupid offshoot questions meant to distract people from your statements. Seriously, FP32 vs FP24, oooh, that'll be tough; well, no tougher than knowing the difference between $2 billion a year in revenue and in profit (especially for a '<i>CEO</i>' like yourself).

As for the magic drivers, that's exactly what we are talking about here. You still can't support your magic driver statement, and now you amend your statement to include every option under the sun.

Quote:
How can YOU discredit my results if YOU can't do the comparison YOURSELF?

Simple: all I have to do is ask you for proof of your original statements, and it stops you cold. You say you have the GF6800 and the X800, yet you're unable to post 3DMarks; is that because you're so computer savvy? Even the newest nOOb can figure out how to do that within the first few days of being here. It's pretty simple to discredit you: all I have to do is ask for proof, and you are left with redirects and avoidance.

Quote:
Right, spewing out other people's work and not doing your own tests.

Actually, I have posted my own tests. So far you haven't provided one shred of evidence that you've tested anything; you yourself rely on other sites, and you make errors even in cutting and pasting (oops, wrong CPU; oops, didn't notice the AA/AF figures were wrong; oops, didn't notice the benchmarks are completely different and thereby not comparable). And this whole time you say you have both cards, so post the results. I've posted mine numerous times, and thereby anyone here has been able to see what I've said I've tested, because I've posted proof; you've posted nothing but other people's work. So follow your own guidelines, and prove yourself instead of screwing up other people's reviews to try and justify your mistakes or lies.

Quote:
I look forward to a sensible response to my original questions, although perhaps I may be wasting my time, as it's obvious you can't even understand my questions, let alone answer them.

Whatever. You provide me with the proof and your answers are all ready to go. And unlike you with your statements, everyone else knows that I have the answers to your questions; heck, we've discussed it about 20 times since the FX came out to defend the throne against the R300. So bring out your proof, and then we'll rehash the past.

You know, I really must ask: do you have any proof of this fictitious company or these cards? It's one thing to talk a big game, but as you have no proof, really, why should anyone believe you, especially with your poor track record so far? Just had to ask.

I'll leave you with this: bring me the PROOF of your three statements and I will give you the answers, not after weeks of stalling and avoiding like you've done, but as soon as I've seen the proof. It's that simple, now do it!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
July 5, 2004 3:19:25 PM

Grape, tell ya what: come down to Binghamton, NY. I also own about 50% of the bars in this town.

Please have a drink me!

Because you still haven't answered my question, and there was a 40% increase all around, and more for some dx9 games. If that doesn't get through your greasy head, then you're hopeless.

Show me the link with the fx line. I did a search for that quote and it only pulls up your statements. Twisting the facts again, or just plain lying?

OK, a 30% increase with a 30% increase ends up at 65% or so, and the benchmarkers got 57.1%?

Was there a big difference between my numbers and theirs for Far Cry?

Also, the new drivers without dx9c will actually slow down the gf 6 line. Which I also stated, and which you seem to have forgotten. Selective memory?

Do you know how a c-corp splits profits at the end of the year? I didn't think so.

Do you know there are 10 different types of corporate structures in the US?

Do you know the tax benefits of not showing profits for a C-corp versus other types of corporation?

Do you know what type of corporation nV is?

Hmm, I didn't think so; again talking about something you know nothing of.

I suggest you stay on topic.


Please answer my questions, because all you can do is avoid them, and that shows your true nature.
a b U Graphics card
July 5, 2004 9:27:54 PM

Quote:
Grape, tell ya what: come down to Binghamton, NY. I also own about 50% of the bars in this town.

Right, now I've gotta fly there, cause there's nothing you could provide us here?

Quote:
Please have a drink me!

I assume that's an offer to have a drink; you buy the plane tickets and I'll buy the drinks.

Quote:
Show me the link with the fx line.

<A HREF="http://forumz.tomshardware.com/hardware/modules.php?nam..." target="_new">I thought you said it was I who had trouble remembering my own words !</A>

Quote:
OK, a 30% increase with a 30% increase ends up at 65% or so

OK, now you mention my math skills: +30% x +30% (130% x 130%) equals a 69% increase EXACTLY, so if anything it ends up at 70% or so. So 57.1 percent, while close to 60%, is substantially further from 70%, and those increases aren't global; there was no previous mention of other conditions.
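(Spelled out, since this is just compounding: two successive gains multiply rather than add. A minimal Python sketch of that arithmetic, nothing more.)

# Two successive performance gains compound multiplicatively.
gain_a, gain_b = 0.30, 0.30
combined = (1 + gain_a) * (1 + gain_b) - 1
print(round(combined * 100, 1))   # 69.0, i.e. +30% twice is +69%, not +60%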

Quote:
Also, the new drivers without dx9c will actually slow down the gf 6 line.



That may be so, but it was not mentioned at the time of the original driver statement, and it's not the biggest impact on performance. The biggest impact on performance is the patch itself, which was not part of the original statement.

Quote:
Do you know how a c-corp splits profits at the end of the year? I didn't think so.

I know far more about business than you think, and the term c-corp is not a global one; it's specific to the US and its accounting practices. From what I've been able to gather, a C-corp is a for-profit company with broad legal/liability protections for its shareholders, and actually they can split profit in more ways than just dividends, but I assume you're talking about as a non-employee, so that would be the principal method; of course, its major drawback is the higher tax rate (double taxation). An S-corp has far more restrictions (including no non-resident aliens, which would preclude me and many people here, no financial institutions, and no class B or non-voting shares), but its primary benefit is no double taxation. Then of course you have limited liability, and not-for-profit. As for the fifth, who cares; it's not relevant to your statement, once again a pointless sidetrack.
However, there is still no magic structure that can turn $2 billion/year in revenue into $2 billion/year in profit unless there are no associated costs, period. Unless you're talking about floptimized US accounting practices, and that's one thing that nV and ATI shared before corporate governance standards, yet none of their reports mentions even half that figure.

Quote:
Hmm, I didn't think so; again talking about something you know nothing of.

You try and show me how you make $2 billion in profit from $1.8 billion in revenue. Unless you SELL nVidia, lock, stock, and two smoking barrels, you won't see that kind of money.

You keep trying to redirect the issue, but the reality is this was another area where you made a HUGE mistake. Use all the accounting examples you want, but your numbers once again don't add up.

Quote:
I suggest you stay on topic.

I did. Simple: provide PROOF. I was simply pointing out another thing that you'd be unable to prove: $2 billion in profit per annum.

Quote:
Please answer my questions, because all you can do is avoid them, and that shows your true nature.


Seriously, now if that isn't the pot calling the kettle black. Just prove the selected items, and then I'll answer your questions, no problem. Heck, I'll even answer your questions tonight after my evening is over. It's not complicated, but I don't have time while at work; I just noticed this when I got back from a meeting. You just work on the proof. I have no problem discussing FP32/24/16 and FX12 on the ATI/nV architectures, comparing conversion versus number of pipes and the need for FP16/32 versus FP24; been there, done that.
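(For reference on the FP16/FP24/FP32 point: the practical difference is mantissa width, commonly cited as 10, 16, and 23 bits respectively. A minimal Python sketch of the relative precision each implies; this is an illustration based on those commonly cited figures, not a spec quote, and FX12 fixed point is left out.)

# Machine epsilon (spacing of representable values near 1.0) for each shader float format,
# based on the commonly cited mantissa widths.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}
for name, mantissa_bits in formats.items():
    print(name, 2.0 ** -mantissa_bits)
# FP16 ~9.8e-04, FP24 ~1.5e-05, FP32 ~1.2e-07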

You just supply the PROOF I asked for though, or else you'll be sitting there with nothing to show for yourself, as I expected.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: 
!