Futuremark useless, 3DMark waste of time

WHQL doesn't mean diddly-squat. It's so ridiculous. Nvidia, questionable drivers? Yeah, whatever. Since when does WHQL imply anything? ATI has been releasing WHQL drivers for years. Is that supposed to mean they were bug-free? Hell, half their certified drivers have fundamental problems. Just take a peek at the last benchmark between the Radeon and the GeForce FX: a couple of benchmarks couldn't even run because of ATI driver issues. In fact, when was the last time every game / benchmark ran perfectly on an ATI card? Can anyone say "never"? Futuremark's just a total waste of time these days -- a bunch of politically groveling Microsoft drones. I feel for Nvidia and ATI, having to stomach so much stupidity in the benchmark world.

-- Chaos is the better order.
  1. WHQL means something since none of Nvidia's beta drivers are worth a damn. They're increasing their 3dmark score without increasing performance.

    I have had bad luck with their beta drivers. I stick to the WHQL's

    <font color=red>GOD</font color=red> <font color=blue>BLESS</font color=blue> <font color=red>AMERICA</font color=red>
  2. But yeah....3dmark isn't worth much.

    <font color=red>GOD</font color=red> <font color=blue>BLESS</font color=blue> <font color=red>AMERICA</font color=red>
  3. Sounds like someone is bitter.

    Just because you don't understand how or why a benchmark works, doesn't make it useless, just <i>NOT RIGHT FOR YOU</i>.

    The fact that we wouldn't have known about the PS1.4 vs 1.1-1.3 (and 16-bit vs 32-bit) issues for a while without 3DMK03 proves that it has some 'overall' worth to everyone.
    Sure it has its limitations and flaws, but then again, show me one benchmark that DOESN'T.
    If you don't like 3DMK03, fine, don't use it; but don't whine about it and then go off on a rant about drivers, when Futuremark has shown, and you pretty much re-confirm, that drivers don't make a better card; they just make the card work differently -- in Nvidia's case faster, at the price of quality, simply to score on benchmarks. And harping about WHQL is just plain ignorant. You don't respect the certification? Fine, but 'tartarhus approved' means nothing to people in here. Some people wait for the newest drivers with great anticipation; for those looking for stability, though, WHQL means this is as stable as it gets, regardless of whether some artifacts/bugs remain.

    Personally, the more benchmarks the better. Perhaps I'll only understand half of them, and use less than 1/10 of those, but the more the merrier for me; no one forces you to run a benchmark, that's your own doing.

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK
  4. I agree 100% with Grape...if you don't like it, don't use it. I wish there were a hundred more 3DMarks out there...

    (Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
  5. And also....all of my games run perfectly on my ATI card.....I wish I could say "what a stupid post", but that wouldn't be very nice.

    (Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
  6. I agree. Benchmarks aren't just about getting a "great score". They're a tool used to troubleshoot, assess a card's performance, and test a card's hardware features. It looks like 3dmark caught Nvidia with its pants down this time.

    The aim of military training is not just to prepare men for battle, but to make them long for it. <A HREF="http://forums.btvillarin.com/index.php?act=ST&f=41&t=327&s" target="_new"><b>MY SYSTEM</b></A>
  7. hm.. is this post worth a comment? actually, none.

    except possibly one: no problems with any game or benchmark or anything on any ati card, except the documented ones on the ati page.
    if there are problems, it's the rest of the hw, not the ati card.

    WHQL certified drivers are a standard, like ISO standards, like any other standard. stuff that follows a standard works according to that standard.
    in the case of WHQL this means:
    full feature set
    no cheating
    proven quality (image quality, precision checks, etc..)

    ati drivers are stable, provide the full feature set of dx9, for example, don't cheat anywhere, and are proven to meet the quality needs of dx9.

    nvidia drivers don't. they are not stable, are full of cheats and hacks, don't show the quality they should, and are still slow as hell in certain situations.

    if you are not stupid, you get that. else, please, buy a gfFX 5800, try to use your lovely hack-drivers, and feel cool while you heat up your room. but never, _NEVER_ cry because a game has bugs. _NEVER_ cry that it doesn't run as fast, or look as good, as on your friend's ati card.

    yes, ati cards are not 1337 hacker cards. they are stable, reliable, high-quality and high-performance cards.

    i know enough developers who stand behind this, including john carmack, currently coding doom3. he always stood behind nvidia, but that's definitely starting to break.
    even he has to say ati drivers are getting very good (and that was before the real gfFX fun started), and ati is doing very, very much to make them perfect.

    about 3dmark03: it's a great benchmark, and it shows very well how nvidia has tried to cheat and lie to gamers for years. that's why all nvidia cards simply suck in those tests. they lack very useful features that are actually defined in the standards.

    even carmack says the "pixelshading" of gf3/gf4 is crap. he has to work with it, and thanks to the brute-force power of those cards, he can partially hide that it's crap.

    3dmark03 definitely shows how much more technologically advanced ati cards were, even back in the days of the radeon8500..

    nvidia sucks. they lie like any american money-making thing.. (and they're even from texas, too, no? :D)

    "take a look around" - limp bizkit

  8. Dave...your posts are always so good!

    (Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
  9. "3DMark waste of time"
    Actually i find the 3DMarks (both 2001 and 03) to be very informative; i use them both to test my o/c'd 9700's stability. With these utilities i can get maximum performance, and when i go over the edge i instantly see in certain benchmarks what's going wrong, and lower the speed to compensate till i find a perfect balance :)
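    The trial-and-error loop described above (push the overclock until something glitches, then back off) can be sketched roughly like this. `run_benchmark_at` is a made-up stand-in for actually launching the benchmark at a given core clock, which in reality you do by hand through the driver panel:

```python
# Hypothetical sketch of the overclock-stability hunt described above.
# run_benchmark_at is a made-up callback: True means the benchmark ran
# clean at that clock, False means artifacts or a crash appeared.

def find_stable_clock(start_mhz, step_mhz, run_benchmark_at):
    """Step the clock up until a run fails, then return the last good clock."""
    clock = start_mhz
    while run_benchmark_at(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy stand-in: pretend anything at or below 360 MHz is artifact-free.
print(find_stable_clock(325, 5, lambda mhz: mhz <= 360))  # -> 360
```

    In practice each "run" is a full benchmark loop, so a coarse step first and a fine step afterwards saves a lot of time.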
  10. I personally love it any time a person uses "ya, whatever" in an argument as if it somehow adds to their case. Talk about bringing out the big guns.


    Ray Charles is my co-pilot
  11. 3DMark03 (and 01?) sucks because it doesn't show me how my card is going to perform in games; it only shows brute force. if nvidia makes ut2k3 run better via drivers, so be it; at least it's <i>running better</i>, and that's all i'm concerned about.
    i don't use it because it doesn't show game performance.

    and i don't want hundreds of benchmarking programs floating around, because you'd never know which ones to believe; you can't keep track of them all or know what their strengths and weaknesses are. bad suggestion.
    and as for carmack saying nvidia's ps1.3/4 abilities suck and he had to work around them: that's bad, and i don't respect nvidia for designing crappy pixel-shading hardware.

    <A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=120&dPage=1" target="_new">WinXP tweak guide</A>
    <A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=145&dPage=1" target="_new">WinXP tweak guide 2</A><P ID="edit"><FONT SIZE=-1><EM>Edited by LtBlue14 on 03/26/03 03:27 PM.</EM></FONT></P>
  12. the 3DMark programs are 1337; to hell with all who diss them.

    Wake up, idiots: 3dmark2003 is not final, there will be patches!

    FX goes down on 9800.

    Now enough with the fighting, I hope none of you become world leaders, because we will always be in war if you are.

    "Picture yourself in a boat on a river, with tangerine trees, and marmalade skies. Somebody calls you, you answer quite slowly, a girl with kaleidoscope eyes. Cellophane flowers of yellow and green, towering over your he-ead, look for the girl with the sun in her eyes and she's gone... LUCY IN THE SKY, WITH DIAMONDS. LUCY IN THE SKY WITH DIAMONDS. (repeat) AAAAAHHHHHH! Follow her down to a bridge by a fountain, where rocking horse people, eat marshmallow pies. Everyone smiles as you drift past the flowers, that grow so incredibly high. Newspaper taxis appear on the shore, waiting to take you awa-ay, climb in the back with your head in the clouds and you're gone...
    Picture yourself on a train in a station, with plasticine porters, with looking glass ties. Suddenly someone appears by the turnstile, the girl with kaleidoscope eyes.
    CHORUS x2 and end."

    lets all listen to the beatles and get happy ok?

    Long live ATI.
  13. Willamette....you don't have any right to be calling ANYBODY an idiot. Post after post of yours that I read, I have come to the absolute conclusion that you are childish, incoherent, irresponsible, and all-around brain-dead....I would hate to see the working condition of your computer. Please shut your mouth.

    (Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
  14. um... ok.
    i love you too :)
    you must not have gotten your daily dose of 60s/70s hippie music, oh well.
    i have been a LITTLE unnecessarily critical of some people in the forum (n00bs), but only a couple of times.
    oh, and my computer works perfectly fine, so i don't know why you mentioned that.

    Long live ATI.
  15. you guys aren't making sense

    3dmark03 is used to *compare* hardware. performance is going to be lower in it than in games, because 3dmark03 is made to stay valid for a while to come,

    so that hardware released 6 months from now can still be used with it.

    3dmark03 is a great program. just because you guys don't get your 12k doesn't mean it isn't. get real and grow up, ffs

    AND 3dmark2000 and 2001 ran PERFECTLY on my radeon 7200 and 8500 with certified, and most beta, drivers. i only changed drivers to tweak performance. so yeah, you don't know what you're talking about, man
  16. and i laugh at the people who were dumb enough to go with Nvidia and are getting half the scores in 3dmark03 compared to people with a 9500pro (when 9500pros are cheaper than a TI4400, LOL! 9700pros sell for 65 bucks less than TI4600s in the local shop.. another LOL)
  17. You're missing the point. 3Dmk03 doesn't show you exactly how it will work in one specific game, because games are optimized for as many cards as possible to make sure they can sell as much as possible. 3D03 has ALWAYS been forward-looking, an attempt to stress cards in a way that future games COULD stress cards. It's a good tool to see the majority of features of new DX releases, which not all games use. Heck, not all of DX7's or DX8's features are in current games. The thing is to know how it works, and then know how to use it.
    I also 'respectfully' disagree with you about the number of benchmarks. The more the better; the quality ones will float to the top and become the gold standard. With only 2 or 3 benchmarks out there, how do we know they aren't optimized for just one product or programming code? As long as we can figure out or get info on how the benchmarks work, then WE choose whether to use them or not. Just because one person doesn't have a use for a benchmark doesn't make it useless, just of no use to them.
    Anywhoo, I find it funny that people criticize a benchmark. It may/does have flaws, but what doesn't? As long as you know the most about them, you can discount that particular issue. It's like languages: I wish I knew more. Even if I never used Frisian in real life, knowing it would be cool, just like Spanish, another 'dead' language. :tongue:

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK
  18. Many of you miss the point entirely. First off, WHQL doesn't mean diddly-squat. Anyone who thinks it does should really look deeper into what WHQL certification entails. Secondly, people claiming that Nvidia "cheats" on benchmarks should really look into how DX8 and DX9 work. I don't mean by downloading it, or by reading the hype, but by developing a grasp of how games are going to run in the future. When you understand how DX9 works, then you'll realize how utterly uninformative 3Dmark truly is. If you want a real benchmark, run the bloody game you're interested in -- that's a real benchmark that has meaning.

    I don't understand everyone's adherence to a benchmark that doesn't predict game speed. What the hell is it for? It's like telling me how fast my car would be in a parallel universe. Whoopidee doo. Who cares. Games are the only benchmark that count. No one sits for 4 hours playing 3Dmark.

    -- Chaos is the better order.<P ID="edit"><FONT SIZE=-1><EM>Edited by tartarhus on 03/27/03 01:45 AM.</EM></FONT></P>
  19. You obviously miss EVERYONE's response entirely.
    If you're worried about a specific game, then run ITS benchmark; but those aren't predictors of how your card will perform in other games either. Serious Sam or Jedi Knight performance doesn't tell you how well your card will do in QuakeIII, Morrowind, Aquanox, and a few others; they all use different parts of the DX7, 8, 9 & OpenGL tool boxes to achieve the effects they are trying to produce. It appears you don't understand the concept of optimization either. What 3Dmark might tell you is that your card indeed IS good enough to play X or Y on a hardware level, but the driver/game optimizations are holding it back, perhaps leading some people to look for optimized drivers for this or that game or application.
    Why don't you share your EXACT problems with where 3Dmk03 and DX9 diverge, and then also show us another DX9 benchmarking utility out there, let alone game/gamemark.
    If you're going to argue that it doesn't fully use all of DX9's features, don't bother; it's a start, and it may be only 20% DX9, but that's better than 0%, which is all other comers offer. Aquanox's new bench isn't fully released, so we don't even have THAT yet, except in beta form.

    As for playing 3Dmk03 for 4 hours, you once again miss the boat. It's not meant to be run for 4 hours a day; it's a benchmark! Do you run PcMark, Sysmark, or Spec's Viewperf for hours on end, back to back to back? If you do, you really need to BUY a game, and try THAT.
    Perhaps you don't know how to use 3Dmk03, or perhaps you're simply a fan of Nvidia trying to bad-mouth Futuremark to defend your fanboy allegiances. In either case you probably should have simply downloaded a ripped copy of Daredevil or the Doom]I[ alpha. In the future, learn more about a bench before downloading it; that will likely save your time, and keep us from hearing more whining.
    Also, don't go to the hardware store and buy a Torx driver and then complain because you can't use it on your Phillips-head screws. I can see it now: 'MAN, Torx suck! They don't tighten/loosen Phillips-head screws! Stupid Torx! Damn those guys at Fuller/Stanley!'

    Now where's my Robertson screwdriver? Gotta go paint the living room.

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK
  20. That's the whole goddamn point. 3DMark doesn't show you driver optimizations for games. It's not a good predictor of game speed, and in the future it will be even less informative as optimizations become the norm. So then what the hell is it supposed to show me? The fact that Futuremark is playing buddy-buddy with Microsoft, a company that wields politics and not open standards, shows you what they are all about.

    The fact that ATI and Nvidia must waste their time trying to appease these idiots apparently keeps you happy at least. Then you can go about touting how high your benchmarks are, when they don't mean a damn thing.

    -- Chaos is the better order.
  21. That post touched my heart, deep deep down where my heart resides.

    I agree with what dave said completely and would just like to add a couple of things.
    First off, why does Nvidia keep adding 'special' features to their cards that don't follow the standards everyone else uses? The example that comes to mind is their OpenGL and Direct3D: they created their own subversions of these so that they can make drivers that excel in specific situations; however, hardly anyone has ever optimized the software they write to use Nvidia's standards. If Nvidia would design things to run the same way as the rest of the market, a lot of people would be happy, and we would see that Nvidia's cards only perform well because they use 'alternative' methods to get the performance.

    <b>Just because I like AMD or Intel more at a given time because of one product compared to another does not make me a fan boy; it makes me a person who is able to make a decision for myself.</b>
  22. tartarhus, please learn to use your brain (download UseBrain(tm) ver 2.31.46, it has a lot of bugfixes :D)

    no, really. i know enough developers who can definitely support my arguments. you don't know [-peep-].

    "take a look around" - limp bizkit

  23. oh, and i know of enough "unofficial" dx9 benches, and the gf cards simply suck everywhere.

    and working with them _IS_ crap. definitely. after the first WOAH GF3 HAS PIXELSHADERS moment, within a short time there were just tons of questions "how to do this, how to do that", and in the end the world had to realize that those "pixelshaders" are actually just extended multitexturing, with some predefined effects like for the bumpmapped water and such. not programmable, not _NEW_ (matrox and ati had those, more or less, in their cards for _YEARS_!!!)..

    about the unofficial dx9 benches?

    every programmer out there can code up a bench for you. just download the dx9 sdk, learn to program, come back in a few years when you've learned it, and then code up a simple dx9 pixelshader. and voilà, the gfFX is slower.

    once you've gotten there, you'll know the differences between ps1.3 and ps1.4, too, and realize how sucky the gf3 and gf4 actually are..

    anyways. i was nice to you. i'm not nice anymore. you've shown that you don't read, don't listen, don't think.

    and WHQL is great. i'm happy to have a card with performance, 100% stability (my pc runs for months, with gaming, dvd watching, inet, programming, etc -- everything i want; no need to shut down at any time except the occasional reboot for an installation), and quality. image quality, that is.

    never want to move back. WHQL guarantees this. if you don't know it, learn it. WHQL is not much, and there could be more standards, i agree. but it definitely means more than nvidia can provide.

    "take a look around" - limp bizkit

  24. happy:D

    one thing to add: nvidia can't provide fully compliant hw, and because they have such a wide market share, quite a few useful features always go unused, simply because they are not in nvidia's hw. clip planes are the most prominent example. projective texturing on the gf2 is another (it took _LOOOOOOOOOONG_ till we found out they cheated us there.. years after the release of the gf2 we found out they don't actually have a real projection matrix in their hw, only a 4x3 matrix, not 4x4. they forgot the projection part. if you accidentally used it (as you normally did if you wanted to project), you kicked yourself out of the full hw t&l path. that sucked. and nobody knew.. everyone was wondering why their code got so slow.. till nvidia stated.. oops.. we .... "forgot" ..... that...)
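    For the curious, the projection-matrix complaint boils down to simple linear algebra. This toy sketch (mine, not anything from any actual driver) shows why an affine-only "4x3" matrix can't do projective texturing: perspective needs the fourth matrix row so that w depends on depth, and the later divide by w is what produces the foreshortening:

```python
# Toy illustration (not driver code): a full 4x4 matrix vs. an
# affine-only "4x3" one whose bottom row is fixed at (0, 0, 0, 1).

def transform(m, p):
    """Apply a 4x4 row-major matrix to (x, y, z, 1), then divide by w."""
    x, y, z = p
    out = [m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(4)]
    return [c / out[3] for c in out[:3]]   # perspective divide

# Full projection: the bottom row copies z into w.
projective = [[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 1, 0]]   # w = z

# Affine-only hardware: w is always 1, so the divide does nothing.
affine = [[1, 0, 0, 0],
          [0, 1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1]]   # w = 1

far_point = (1, 1, 10)
print(transform(projective, far_point))  # x, y shrink with depth
print(transform(affine, far_point))      # no foreshortening at all
```

    With the full matrix the far point lands at x = y = 0.1; with the affine-only one it stays at x = y = 1 no matter how deep it is, so projected textures can't converge the way a projector's light cone does.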

    "take a look around" - limp bizkit

  25. Here is the point. You have no way of benchmarking for games that have not come out yet. 3DMark tests a card's ability to use the different standards. In the case of DX9, it tests their ability to utilize those features. I think it is a valuable tool for doing so. You can't rate a card's performance capabilities for future games by using existing games, because existing games do not use those new features. If a chip maker optimizes a driver for a benchmark, then they are cheating. They are deliberately trying to use a benchmark to lie about their card's performance. Futuremark, in an effort to preserve the integrity of the results, is trying to keep chipmakers from doing this. WHQL drivers are generally not optimized for the benchmark, so they provide a better indication of performance. The number that 3Dmark comes up with is just a number that is used for comparison purposes; the number in and of itself doesn't mean a whole lot.

    So here is the question: how else do you compare performance, tartarhus? You seem to know better than a good amount of the industry, right? I am assuming that you have some sort of grand idea on how to compare performance. Obviously we can't go by clock speed or memory size, and using existing games is not an indication of how the card will perform with future games. I don't think you have a reasonable answer to that question any more than you have a reasonable basis for your complaint.


    Ray Charles is my co-pilot
  26. That is the thing: we are all talking to a wall. Dave has tried to deliver the hard truth and the guy selfishly ignores it.

    Methinks we shouldn't bother, and should go help someone looking for a new card instead.

    This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
  27. i have a question about the inefficiencies in the benchmark; i'll quote some complaints from nVidia:

    they show sample code, then say
    The portion of this algorithm labeled "Skin Object in Vertex Shader" is doing the exact same skinning calculation over and over for each object. In a scene with five lights, for example, each object gets re-skinned 11 times. This inefficiency is further amplified by the bloated algorithm that is used for stencil extrusion calculation. Rather than using the Doom method, 3DMark03 uses an approach that adds six times the number of vertices required for the extrusion. In our five light example, this is the equivalent of skinning each object 36 times! No game would ever do this.

    This approach creates such a serious bottleneck in the vertex portion of the graphics pipeline that the remainder of the graphics engine (texturing, pixel programs, raster operations, etc.) never gets an opportunity to stretch its legs.

    how true is this? and why do drivers fix it? (with regard to the supposedly over-swamped graphics pipeline)
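    For what it's worth, nVidia's figures are at least internally consistent. Assuming one base pass plus, per light, one lighting pass and one shadow-extrusion pass whose geometry carries six times the vertices (my reading of the quoted complaint, not Futuremark's documented renderer), the arithmetic checks out:

```python
# Back-of-envelope check of the numbers in nVidia's complaint, under the
# assumed pass structure: 1 base pass, then per light one lighting pass
# and one stencil-extrusion pass with 6x the vertices.

def skinning_passes(lights):
    """How many times each object gets re-skinned: base + 2 per light."""
    return 1 + 2 * lights

def skinning_work(lights, extrusion_factor=6):
    """Total cost in 'equivalent single skins', weighting extruded passes."""
    normal = 1 + lights                   # base pass + one lighting pass each
    shadow = lights * extrusion_factor    # extrusion passes at 6x vertices
    return normal + shadow

print(skinning_passes(5))  # the quoted "re-skinned 11 times"
print(skinning_work(5))    # the quoted "skinning each object 36 times"
```

    Whether that vertex load actually starves the rest of the pipeline is the part the quote asserts rather than demonstrates.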

    <A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=120&dPage=1" target="_new">WinXP tweak guide</A>
    <A HREF="http://www.tweaktown.com/document.php?dType=guide&dId=145&dPage=1" target="_new">WinXP tweak guide 2</A>
  28. well, nvidia might have done it again: they released new drivers that got the card back up to rank in 3dmark03. and although it is a benchmarking program, it can give us an idea of what games will be like in the future, given that game developers use these DX9 standards. most developers won't use all of DX9, but they will take chunks that can be added to make things pretty. so technically, yes and no, it does/n't show the future of gaming. at least that's what i see from my point of view...

    "What kind of idiot are you?"
    "I don't know, what kinds are there?"
  29. I'm still waiting for them to release WHQL-certified drivers at all. It's been 5 months and numerous questionable releases. Many couldn't even get the 40.72 drivers to work and had to go back to the 3x.xx WHQL drivers.

    <font color=red>GOD</font color=red> <font color=blue>BLESS</font color=blue> <font color=red>AMERICA</font color=red>
  30. Is the FX actually out on the market, officially?
    I have yet to find it in any store in Canada.

    This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
  31. The drivers that boosted the score are questionable. There is talk of low Z-buffer and FP precision, and low image quality. It is rather questionable how the score suddenly quadruples in PS2.0 code. No driver, including the Detonator XP, can do that with optimization code alone.

    Additionally, some individual DX9 shader tests outside of 3dMark03 revealed extremely horrid DX9 performance. Dunno if they are to be believed; they were on HardOCP and someone had posted them here.

    This post is brought to you by Eden, on a Via Eden, in the garden of Eden. :smile:
  32. The 41.13 drivers seem better than the 40.72. They come with the nForce driver package.

    Submit your opinion <A HREF="http://forumz.tomshardware.com/community/modules.php?name=Forums&file=viewtopic&p=28537#28537" target="_new"> Should Tom Fire Omid? </A>
  33. In regards to Defective,
    Yes, you're right. Trying to benchmark the future performance of a video card presents an enormous number of challenges, especially when DX9 is factored in. And I understand what Futuremark is trying to do. It's just that their software lacks the sophistication employed by modern gaming companies like id Software. When id came out and said that ATI's latest Radeon really rocked, the world dropped everything and listened. That's because everyone knows that id Software really knows what the hell they are talking about, probably more than anyone else in the industry.

    The way games are written and optimized has gotten to the point that general-purpose 3Dmarks simply fail to capture what's going on. If you really want to know, you're gonna have to run benchmarks for individual games. Most popular 3D games are written by the badasses of the programming world, the graphics experts, the gurus. When you benchmark their games, you can clearly see how well a card shines.

    What really pissed me off was that after Futuremark made a mistake, instead of fixing it, they went blaming the world for it. The slap in the face was the WHQL crap, which was just meant to keep business suits interested in the quality of their benchmark. Their moves seem to be purely political these days.

    -- Chaos is the better order.<P ID="edit"><FONT SIZE=-1><EM>Edited by tartarhus on 03/27/03 11:51 PM.</EM></FONT></P>
    I don't go around showing my benchmarks; triple digits in 2001, and not running at all in 03 right now, are not things I like to brag about. So that's not my focus.
    What does it show me? It shows me how my card stacks up against other cards, and gives me an idea of how much Nvidia's 'how it's suppozed ta be played' program screws over my card to benefit Nvidia cards, with those incestuous pacts (which are in games like 1942, UT2003, STALKER, and a few others [including the upcoming Tron 2.0]).
    What this bench shows me is what many people have always thought: Nvidia is stacking the deck for their inferior technology by having game developers optimize their games at the expense of other cards. Optimize for ALL.
    I want an unbiased benchmark. ATI and Nvidia may spend time optimizing for the benchmark, but it appears that only Nvidia so far has improved their score at the cost of all else (quality and stability in other games).
    I don't want a predictor of specific games; I want an unbiased, unoptimized test of the technology that game engines are based on. Perhaps with an unbiased test we will see more people giving the proper props to cards like the 8500 series, and then more gamers will realize that Nvidia's way isn't the only way, and developers will optimize for ALL cards, instead of just the way NVIDIA (or, for that matter, ATI) wants you to play. If we want competition in the graphics market, we can't have it inhibited or halted by behind-the-scenes optimizations that favour just one party.
    Don't think of 3Dmk03 as a predictor of ALL games, but as something that gives you an idea of how SOME games/applications COULD perform, and how they could do better if there weren't preferential optimization.
    Should a benchmark favour one product over another? No, of course not; although the results may favour one over the other.
    As for complaining about DX9 and MS: I am no fan of MS, however they are the standard most games base their engines on. We may not like it, but until Linux and others become the GAME OS of choice, there is no choice.

    P.S. People listen to ID / Carmack because THEY will base their games on HIS engine, so of course they listen, they have to.

    P.P.S. What is this mistake of Futuremark's you speak of? I haven't seen ANY major ones, only little things that will be tweaked in later versions. You speak in generalities; how about some concrete arguments?

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK<P ID="edit"><FONT SIZE=-1><EM>Edited by TheGreatGrapeApe on 03/27/03 10:51 PM.</EM></FONT></P>
  35. That's not Futuremark's problem. The multiple-passes problem is an Nvidia problem. ATI's cards don't have that problem; heck, even the Matrox cards don't do it that way (not that that helps their performance/scores much).
    Nvidia's complaints were even addressed here on Tom's, and shown to be simply poor Nvidia implementation of DX standards and methods.

    If you want an article that refutes it, here is a link to the specific page on Beyond3D that addresses this issue:

    <A HREF="http://www.beyond3d.com/articles/3dmark03/post/index.php?p=2" target="_new">http://www.beyond3d.com/articles/3dmark03/post/index.php?p=2</A>

    Once again, Nvidia is complaining that Futuremark doesn't favour their cards at the expense of all other cards.

    The sad thing is that Nvidia's argument doesn't affect the FX but their older boards, and shows them to be poor designs when compared to others. They aren't really complaining about the FX, or even the Gf4; they're trying to defend why the Gf3 line does so poorly when up against cards like the 8500.
    Once again, Nvidia trying to get people to optimize for their cards, to cover their shortcomings.
    The FX is the best card out there now, but that's the QUADRO FX, and they don't seem to have problems with synthetic benchmarks when the QFX kicks the FireGL. So their complaints about 3dMk03 ring hollow to me.

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK
  36. I read an interview, I can't remember where, in which an Nvidia rep said they were expecting a release of the 50-series drivers in a week or so. He was being coy, but that was the hint. So you may see another series of drivers, likely not certified first, but likely an FX-level driver -- the start of the 'official' FX-level set.

    - You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <font color=red>RED</font color=red> <font color=green>GREEN</font color=green> :tongue: GA to SK
  37. I take comfort in knowing that there are so many blank slates out there to keep marketing guys like me employed. Apparently, I'm due for a raise. But should I be frustrated that it's so hard to erase someone else's work, or encouraged by how soundly one's work stays put? It's a dilemma, that's for sure.

    -- Chaos is the better order.<P ID="edit"><FONT SIZE=-1><EM>Edited by tartarhus on 03/28/03 02:11 AM.</EM></FONT></P>
  38. For all of the FX bashing I've done, now that things have settled, I have a teeny bit of admiration toward Nvidia for managing to squeeze all that performance out of a 128-bit memory bus...even if they used DDR2 to accomplish it...

    (Crashman) "That pic..Are you sane?)...<A HREF="http://www.lochel.com/THGC/html/Genetic_Weapon.html" target="_new">http://www.lochel.com/THGC/html/Genetic_Weapon.html</A>
  39. i do agree about the questionable drivers; as a matter of fact the discussion is going on in Futuremark's forum, where they posted pics, and there IS a difference between the drivers. It appears they're still doing the FP16 thing, although i don't know if it will be a big deal; it looks like Doom ]|[ will be using FP16 anyway (from the chat). Either way, i hope MS pimp-slaps nVidia down and doesn't let the bully push around the giant -_-

    "What kind of idiot are you?"
    "I don't know, what kinds are there?"