The Crap I started at Futuremark.

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
I started a thread in the Futuremark forums a few days ago that got closed; there was a lot of fighting involved (go figure). It was about the cheating in 3DMark: I had posted the findings of Unwinder (the RivaTuner dude) and it wasn't taken too well. I just found another neat little thread going on over there that I wanted to share with ya :smile:
<A HREF="http://discuss.futuremark.com/forum/showflat.pl?Cat=&Board=techdisplayadapters&Number=2396896&page=0&view=collapsed&sb=5&o=0&fpart=1" target="_new">http://discuss.futuremark.com/forum/showflat.pl?Cat=&Board=techdisplayadapters&Number=2396896&page=0&view=collapsed&sb=5&o=0&fpart=1</A>

3DMark 03 = 4,140
<A HREF="http://service.futuremark.com/compare?2k3=897633" target="_new">http://service.futuremark.com/compare?2k3=897633</A>
<font color=red>AthlonXP 2100+/Radeon 9500Pro</font color=red>
<font color=red>Folding for Beyond 3D</font color=red>
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Quote from Unwinder:
All the results posted by GeneticWeapon (Ti4600 with and without the NVAntiDetector script, and R9500 PRO with and without the ATIAntiDetector script) are real. It looks like GeneticWeapon found our 3DMark detection discussion on the www.iXBT.com forum (http://forum.ixbt.com/0010/044212-30.html). If you need technical details about the detection techniques in both the Detonator and the Catalyst, you can find everything there (the discussion is in Russian).

Summarizing:
I really did reverse-engineer both the Detonator and the Catalyst and found application detection mechanisms in each of the drivers. I really did create the scripts to prevent the drivers from detecting D3D applications (the scripts block pixel/vertex shader and command line/window text checksum calculation in the Detonator, and texture pattern/pixel shader code detection in the Catalyst).
Blocking the application detection code caused a dramatic performance drop in 3DMark2001/Nature on both the NV (74->42 FPS) and ATI (66->42 FPS) boards. IQ in this test changed on both systems. Do I have to continue?
The NVAntiDetector script also caused a significant performance drop in other D3D benchmarks (e.g. UT2003), and the 3DMark2003 score on NV35 dropped even more than with the 330 patch (this info is from my tester and I cannot confirm it because I don't have an NV35 sample).
A review containing details and benchmarks is being prepared for publication on Digit-Life now.

Conclusion:
My trust in NVIDIA's and ATI's PR is almost zero now. Both of them seem to use the same benchmark 'optimization' techniques, but NVIDIA promotes it as 'application-specific optimizations' while ATI simply tried to appear innocent; both have been fooling us for a long time. 3DMark2001/Nature was the de facto standard for estimating PS performance, but both IHVs show distorted benchmark results by altering the rendering code. And it's very sad.
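For illustration, the application detection Unwinder describes can be thought of as the driver checksumming things the application hands it (shader code, window text, texture patterns) and switching to a cheaper code path on a match, while an anti-detect script simply keeps the checksums from ever matching. A minimal Python sketch of the idea; the names, hashes, and shader strings are hypothetical, not the actual Detonator/Catalyst logic:

import hashlib

# Hypothetical fingerprint table a driver might keep for known benchmarks.
# In reality these would be checksums of shader bytecode, window text, etc.
KNOWN_APP_SIGNATURES = {
    "3dmark_nature": hashlib.md5(b"ps_1_1 ...nature water shader...").hexdigest(),
}

def detect_application(shader_source: bytes) -> bool:
    """Return True if the submitted shader matches a known benchmark signature."""
    return hashlib.md5(shader_source).hexdigest() in KNOWN_APP_SIGNATURES.values()

def pick_code_path(shader_source: bytes) -> str:
    """Swap in a cheaper replacement only when the application is recognized."""
    if detect_application(shader_source):
        return "replacement shader (lower workload, possibly lower IQ)"
    return "generic shader (what every unrecognized application gets)"

# The benchmark's real shader is recognized and gets the special path...
print(pick_code_path(b"ps_1_1 ...nature water shader..."))
# ...while an anti-detect script that perturbs the bytes breaks the checksum,
# so the driver falls back to the generic path and the score drops.
print(pick_code_path(b"ps_1_1 ...nature water shader... /*padded*/"))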
Cheating in UT2003 too? ...Hmmmm.
ATI and Nvidia suck, suck, suck, suck.


 

vacs

Distinguished
Aug 30, 2002
239
0
18,680
Yeah, it's pretty sad what NVIDIA and ATI have been hiding the last few years. I bet all those "optimizations" have been going on for many more years than we are currently aware of.

If only those optimizations didn't change IQ, all would be fine. But it seems that isn't the case. Shame on NVIDIA for praising their cheats as legal, and shame on ATI for playing innocent.

If the Matrox Parhelia weren't such a bad performer, I would buy one immediately.

Hopefully both companies will stop this BS with their next-generation cards (NV40 and R420). It would be about time!
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
If the Matrox Parhelia weren't such a bad performer, I would buy one immediately.
I would too, my friend... and I'm being totally serious.



 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
The thread is getting funny; browse through it.
You guys haven't seen flamewars and severe hatred like this place can deliver.

 

jaythaman

Distinguished
Oct 13, 2002
1,613
0
19,780
I see you still haven't been banned there. What the hell are you doing, dammit? ;)

I feel the need... The need for weed! :tongue: Jay Kay
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Lol....I found out tonight that NV News banned me for vulgarity. Can you imagine that?
I told someone there I was going to pee in their butt.
Their forums are HEAVILY moderated, unlike this fuucking place.

 

CaveInfiltrator

Distinguished
May 2, 2003
29
0
18,530
So true. I hate it when a forum gets so moderated that you can't even talk because you MIGHT say something. It's a place to talk! So you should be able to talk! Now, I don't mind bots that auto-pick out words, but... whatever.

And if ATI and Nvidia have been cheating in 03, why not in 2K1? Why is it so hard to believe? It makes sense... LOGIC!

And Matrox makes pretty good cards, don't they? But I thought they tended to make cards more for the video editing industry than for gamers... (no research, just what I think). And hey! I bet that Matrox card is only a poor performer due to a LACK of "optimizations"! *j/k j/k j/k* ;o)

peace.......

OSI, ISO, ITU-T, IETF, IEEE, WEP, 802.1X IPSec, ISAKMP, ATM, FR, 802.3, X.25, 802.11, 802.5, MPLS, RIP, OSPF...
Welcome to the acronym industry.
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
Seriously, it's time to switch sides. Who wants to go to the heaven of Matrox!!! LONG LIVE SIS AND MATROX!!!! AND INTEL EXTREME GRAPHICS 2!!! Join me, ATI and nVidia fanboys, it's no longer safe on those sides!!!

Proud Owner of the Block Heater
120% MATROX/INTEL EXTREME GRAPHICS Fanboy
 

LtBlue14

Distinguished
Sep 18, 2002
900
0
18,980
I'm surprised this wasn't uncovered sooner!!

And yeah, I love these unmoderated forums =)

<A HREF="http://www.planettribes.com/allyourbase/ayb2.swf" target="_new">411 UR 84$3 R 8310N6 2 U$</A>
 

eden

Champion
Hilarious how many have now switched to a neutral stance, lol!

So was it confirmed that UT2003 had image quality drops with these opts?

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
Sorry guys, but I have to say something; this has been bothering me for a bit here.

I have been keeping up on these recent ATI and nVidia "cheating" issues, and I would like to share my views on the whole ordeal.

Starting off with the Futuremark "cheating" that occurred, more so with the FXs than the R3xxs: this so-called "cheating" doesn't really seem like cheating to me; let me tell you why I see things this way.

First off, let's look at the benchmarks themselves. They are designed to show the "base power" of the GPUs, if I am not mistaken: non-optimized base strength. Thus they show you the performance you would expect from a given manufacturer's card (generalizing, since there are more than two companies that make video semiconductors).

With that in mind, I come back to a statement I made a while back about the retraction of Futuremark's accusations that nVidia was cheating: the silicon vs. silicon, or engineering vs. engineering, one.

So what's the big deal, right? We allow special optimizations for CPUs (if we didn't, Intel wouldn't be selling CPUs), so why not for GPUs? Now, some folks pointed out that the drivers were checking which software was loading up, then running "special" premade execution paths to let the card lower the workload so the performance would be higher.

Is that such a bad thing??? Does it affect the output in any negative way??? Does it cause any unwanted results such as instability or corruption??? From what I have seen and read, it doesn't, other than pissing off the ATI guys that they aren't top dog anymore. Which is typical: AMD guys still can't understand that Intel has bested the mighty XP. They will be back on top, or damn well close, when the Athlon 64 is out.

But that's neither here nor there; this is about GPU-specific optimizations to increase performance on a specific GPU. We allow it for CPUs but not for GPUs; sorry to say, that sounds like a double standard. Folks will argue that the drivers looked for specific scenes in the benchmarks and changed the execution paths for a predetermined, never-altered image path. I'm sorry, but I see that as fair game. IMO, if the software developer can't anticipate this, then what are we supposed to do? It's like the guys that had those insanely high scores in 3DMark01: they turned off all the textures and ran with wireframes to boost their scores. That should have been a warning bell to Futuremark to better protect the software's execution and make it secure. Allowing the drivers to detect the benchmark and thus "enhance" performance falls on them.

It's bad that the card companies did the "enhancements", but it's still not entirely their fault. Futuremark should have caught this while they were still in beta, not in release 330. It should have been caught and dealt with then.

This is why I frown upon synthetic benchmarking, since all it's showing is that you can run a controller strip on your computer, nothing else. That's what baffles me about why so many people are being huffy-puffy. Christ, no one has even stated exactly what these "cheats" really were, other than that they were cheats used to "enhance" performance. It's just frustrating to read so many folks bad-mouthing nVidia over this when ATI did it too, but no one says anything about that. A double-standard debate is what this really comes down to. So the companies took shortcuts; so bloody what? All that matters is that your end experience is better, and that's all that matters to me.

Now, I have a huge problem with this recent stuff about nVidia "cheating" in games. This is utterly stupid, and anyone that's following it needs to take a serious look at what's being said and shown. These so-called "cheat" detection snap-ins people are using to find "cheats" in the drivers of the ATI and nVidia cards: to me, it looks like they aren't disabling what they say they are. It looks like they are disabling advanced calls and execution paths that are real performance enhancements. Folks will say, "But it's detecting games and running different execution paths." Well, yeah: every game is different; different textures, poly counts, fill rates, and shader volumes, to name the obvious. These optimizations are there to boost and bolster the engine's features to the maximum on the cards; why anyone would call this cheating is beyond me. It's just like saying the P4 shouldn't be allowed the SSE and SSE2 extensions because they aren't base processor power. Which we all know means jack when comparing new and advanced architectures such as the Pentium 4 or the current Opteron (well, not so much). It's all about the engineering: one company finds a better way to deal with texture compression, another finds a way to boost poly counts; they are fair game regardless. If not, it's rock vs. rock, and that's pretty well pointless since each is designed differently.

Well, I think that about does it for my rant; I just wanted to share my views on the current GPU issue.

-Jeremy

:evil: Busting Sh@t Up!!! http://service.futuremark.com/compare?2k1=5341387 :evil:
:evil: Busting More Sh@t Up!!! http://service.futuremark.com/compare?pcm=1060900 :evil:
 
Spud, simply put, optimizations don't show the raw power, and when benchmarking you want an idea of the BASE from which you can build your optimizations. Sure, optimize for games all you want; the problem is that the game coders get paid to cheat one company or the other out of the full benefit their cards can bring.

Perhaps (now, this is an example, don't get all worked up about how bad the card is) a card like the Parhelia was a better base performer, but the reason the GF4 or the 8500/9500s were handing it its hat was because the games people were playing had special IQ-dropping code that just didn't run on the Matrox card? That would mean you could have a much better card for next year's games in the Matrox, but you've been duped into buying the less powerful cards because they cranked the benchmarks.

Most people here (and elsewhere) use the benchmark as a tribal call, a penis-size show; but that's the complete opposite of what it's best at, which is giving you an idea of whether a card HAS the potential to be good and worth your money. Most people use the bench for two things: one is a stability check for overclocking (a good use); the other is to brag, see the demos, and then trash other people because they are lowly 'X' users. However, it would be most helpful ONLY without optimizations, so you could see what the card really has. Then maybe more people would buy the better card.

Cheating and selling an inferior product is like fudging the MPG rating of a car: yeah, most people NEVER see the same ratings the cars get, but they are all fairly tested on the same scale, so you can make an unbiased judgement. That's all I ask for here, and that's what I think MOST people (who aren't fanbois of one camp or the other) want.

Optimize the crap out of games (hopefully with little extras for most card makers) to make them play better on more/all cards. But leave the tools alone.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! RED GREEN :tongue: GA to SK
 
Speaking of options, it appears that the P10 will never come out. Creative's recent release of the 9800 Pro (I saw a review earlier, and the news today in Tom's News) means there is likely not going to be a competing product for gamers from Creative's 3Dlabs department. I think that's a VERY bad thing. Without new blood, and with all this crap, it's not looking good for the next generation of cards.



 

JimmyDean

Distinguished
Mar 17, 2003
326
0
18,780
LOL. I want me one of those Matrox Paraphanelias! They sound so cool! And so expensive too, they must rox0r

Remember kids, if you see a downed power line, suck on the end, candy comes out!
 

spud

Distinguished
Feb 17, 2001
3,406
0
20,780
But the raw power isn't what you are paying for. What you are paying for is how they move 0s and 1s, or in this case textures, polys, and fill rates. Say product A has higher fill rates than product B, but product B has a higher poly count, and you want the product with the high poly count; that advantage only shows up when the optimizations are on. Product B loses 60% with no optimizations, so product A surpasses it by 7%.

How does that make sense? If these optimizations allow for higher performance across the board, then why the hell not use 'em???

If not, why buy a P4 if you care about raw base FPU? That 16-bit double-pumped FPU sure stands up well against the three 32-bit FPUs on the Athlon. That's the way the manufacturer intended the product to work; if not, you aren't running it the way they meant, and that doesn't show the true performance of anything. It just shows who follows the MS DirectX base core feature design specs.

"The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements). The "gfxinfo" command will dump relevant information about the functioning renderer modes and optimizations. At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."
Carmack himself says that Nvidia's FX line will be faster in Doom III. This man has huge relevance because he's down in the source code, seeing the ticks it takes for code execution and how many texture passes he can do with each card's hardware.
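To make the quoted point concrete: the engine ships several back-end render paths and picks one per card (or lets you force one with the cvar), which is exactly why a raw FPS number only means something if you know which path produced it. A rough, hypothetical Python sketch of that kind of selection; the path names follow the quote, but the table and descriptions are illustrative, not id Software's actual code:

# Hypothetical render-path table, loosely modeled on the r_renderer cvar above.
RENDER_PATHS = {
    "ARB2": "generic path, highest quality, full-precision fragment programs",
    "NV30": "NVIDIA-specific path, lower precision where allowed, fastest on the FX",
    "R200": "older ATI path, fewer passes, a bit faster but slightly lower quality",
}

def choose_render_path(cvar_value: str) -> str:
    """Honor an explicit r_renderer-style cvar; otherwise fall back to the generic ARB2 path."""
    return cvar_value if cvar_value in RENDER_PATHS else "ARB2"

# Forcing a vendor path vs. leaving the default changes both speed and image quality.
for forced in ["NV30", "", "R200"]:
    path = choose_render_path(forced)
    print(f"r_renderer={forced or '(default)'} -> {path}: {RENDER_PATHS[path]}")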

Whatever you guys wanna call them, these "cheats" or "enhancements" or "optimizations" are giving us better frames per second, better special effects, and higher quality images. Which is fine by me, because that's what I am going to pay for.

-Jeremy

 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
How can you ignore the awesome power of S3's DELTACHROME!!!????

LOL oh yeah baby hahahha

that made me laugh


and GW saying "pee in their butts" hahhaha that made me laugh too

-------

<A HREF="http://www.quake3world.com/ubb/Forum1/HTML/001355.html" target="_new">*I hate thug gangstas*</A>
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
:smile: I try not to say that anymore...

 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Grape, thank you for that excellent, and almost equally long, post directed at spud. I just woke up (it's 5 PM) and can hardly see straight.
This forum stuff is killing me. (I'm an active member of at least 8, I think.)

 

eden

Champion
Spud, I'll repeat what I said before about this:
- 3DMark is designed for one thing only: to show how far a card can be stressed in the event of absolutely crap optimization. This is important for several reasons, namely the presence of monkey programmers like the ones that did the graphics for games like Comanche 4 and Splinter Cell. I hope they got fired, because they made games that a 4 GHz Pentium 4 can't even pump 80 FPS out of, with the best of graphics cards.
That is 3DMark's best use: to show you how much your card can withstand. Can I believe that one day, if I played a DX 8.1 game that looked like Battle of Proxycon but was badly programmed, I would get about 5 FPS, like I do with my Ti200 in that test?
YES! Because, as nVidia explained, the test uses a very taxing and useless number of light sources, among other things. It is a credible test for such reasons, BECAUSE IT SHOWS YOU WHAT THE WORST CAN BE.
And as long as graphics programming like that of Comanche 4 and Splinter Cell keeps pumping itself into our market, we'll need such programs.
I strongly believe that no optimizations are needed in this benchmark, because, as I said, it is there to test the rawest of raw performance. And yes, that is possible when you have bad programmers.
- The reason, as I and many others have said several times, that we have gripes about these optimizations or cheats is that some of them LOWER your image quality (while you claim they give you higher). And those are CHEAT-class codes.
Now, you did tell me in our conversation today that it's not always noticeable. I agreed; however, I will say it again: with clear water-effect degradation in 3DMark03's Nature test on some GeForce FXs (it COULD be driver bugs, of course), and some other things, it can be noticeable, and I would not live with a $500 card not even properly displaying my textures.
Quote from spud: "Carmack himself says that Nvidia's FX line will be faster in Doom III. This man has huge relevance because he's down in the source code, seeing the ticks it takes for code execution and how many texture passes he can do with each card's hardware."
I strongly believe in Carmack and his credibility. But here the issue is nVidia's drivers. If they are using cheats to make Carmack think they work so well, then it's a whole new story. Judging image quality between cards in Doom III should become even easier due to the sharpness and advancement of detail.

In response to your SSE2-on-Pentium-4 point, the issue is that those optimizations buy speed, not visual effects. That means a card could have an optimization integrated to load faster or render faster (as the Radeon 9700 and the GeForce FX 5800 have been capable of, live rendering). But when the optimization is visual, it HAS to stay consistent. If the Pentium 4 rendered the scenes rather than the GPU, and SSE2 was included, but Intel actually used a trick to lower FP precision to 64-bit from the 80-bit extended format (or whatever SSE2 actually uses), could we still say it's a fair trick? Not if the visuals really end up worse (yay for software rendering!).
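To put a number on that precision point: double precision keeps a 53-bit significand while the x87 extended format keeps 64 bits, so the same arithmetic can round to different results. A tiny sketch, assuming a platform where numpy's longdouble maps to the 80-bit x87 extended type (true on typical x86 Linux builds, not on Windows):

import numpy as np

# 2**-60 is below double precision's resolution around 1.0 (53-bit significand),
# but still representable when added to 1.0 in the 80-bit extended format.
tiny = 2.0 ** -60

as_double = np.float64(1.0) + np.float64(tiny)
as_extended = np.longdouble(1.0) + np.longdouble(tiny)

print(as_double == 1.0)    # True: the increment is rounded away in 64-bit doubles
print(as_extended == 1.0)  # False where longdouble is x87 extended: the bit survives
print(np.finfo(np.float64).nmant, np.finfo(np.longdouble).nmant)  # stored mantissa bits, e.g. 52 vs 63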

Spud, I hope you can see our POV. I believe what I said is fairly representative of a good portion of the disappointed people here. We do not want image quality drops, and we do look at 3DMark as the test that shows what cards can do if left on a desert island with no tools but their own body (or circuits, lol).

 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
I'm still battling it out over at Futuremark; you guys would be proud of me (holding my cool).
<A HREF="http://discuss.futuremark.com/forum/showflat.pl?Cat=&Board=techdisplayadapters&Number=2396896&page=0&view=collapsed&sb=5&o=0&fpart=1" target="_new">http://discuss.futuremark.com/forum/showflat.pl?Cat=&Board=techdisplayadapters&Number=2396896&page=0&view=collapsed&sb=5&o=0&fpart=1</A>

 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
ya.... keep your cool there, if u can't hold it anymore bring him here and Phial will take care of him hehehehehhehe................ (gang RAPE!!!)
