different graphics cards make no difference in 3d apps and tests like pro cdrs, as long as u have a geforce (mx, sdr, whatever) that is. i am getting a 3.4 fps average in the geometry2.max benchmark on my athlon 1.2 ghz. i tried my mate's elsa gloria 2 in the same machine (was thinking of buying one myself) and got 3.5-3.6 @ its best..... real speed booster huh? i think not.

u have got to realise that pushing 200,000 polygons around isn't really what them geforce cards are made for. they are made for GAMING and pushing lots of textures around, and, with the geforce 3, cooler lighting effects and stuff like that.

i desperately tried to get tomshardware and sharkyextreme to do a test of geforce vs other alternatives / amd vs intel / nt vs w2k in 3ds max performance, but no luck.

what will increase fps display with complex geometry in 3ds max is as fast a cpu as possible. that's THE only thing that's going to make a difference. my fps in geometry2.max went from 2.3 with a duron 700 to 3.4 with an athlon 1.2 ghz.
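just to show what i mean, here's a quick back-of-the-envelope check (a sketch in python; the only inputs are the fps and clock figures quoted in this post, nothing else):

```python
# quick check of the cpu scaling claim above (figures as quoted in this post)
fps_duron, mhz_duron = 2.3, 700        # duron 700 in geometry2.max
fps_athlon, mhz_athlon = 3.4, 1200     # athlon 1.2 ghz in geometry2.max

clock_gain = (mhz_athlon - mhz_duron) / mhz_duron * 100   # extra clock, in %
fps_gain = (fps_athlon - fps_duron) / fps_duron * 100     # extra fps, in %

print(f"clock: +{clock_gain:.0f}%  fps: +{fps_gain:.0f}%")
```

so fps follows the cpu clock, though a bit less than linearly (also remember this crosses from a duron to an athlon, so it isn't a pure clock comparison).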

my advice for the performance hungry budget max artist: get an abit kt7a raid mobo with an athlon 1.2 ghz and a geforce (whatever model), upgrade to 1.5 ghz whenever that becomes possible, and keep running SDRAM (3ds max and 3d graphics the way we do it isn't at all sensitive to memory bandwidth; we need fpu power).

unfortunately, most 3d artists think upgrading their geforce sdr (for instance) to a geforce 2 ultra will increase 3ds max fps display performance...
they are in for a surprise, i'm afraid... unless they're into a lot of gaming, that is..... hey, games will rock on the more expensive geforce...

greetings from stockholm
pardon my sad english

patrik forsberg
  1. check out the quadro 2 for 3d boosts in design

    where did all my money go, ahh... [-peep-], forgot that i lived in london
  2. quadros are hardly better than the geforces they came from. I just want to see what the radeon will put out in max.
  3. THAT i agree with (quadros not much better than geforces). i have personally tested an elsa gloria 2 and compared it to an old geforce sdr card. NOT MUCH DIFFERENCE; u would be shocked, especially considering the price difference.

    @ work we use geforces, fire gls and a wildcat.
    the latter 2 are REALLY good, and come highly recommended from me. but any geforce is a good budget choice.
    focus on getting a fast cpu instead. an athlon 1.2 will do.
  4. I am upgrading right now: I have a p3 500, 256mb pc100, and a tnt2. I want to replace them with a tbird 900, 256mb pc133, and a geforce 2, unless the radeon can perform close to the geforce 2 in max. I'll try to run the cpu at a 150mhz fsb, but if I can't reach that I'll run at 133mhz.

    Unfortunately, the tnt2 has got to be replaced so the cpu speed has to go down.
  5. yeah, u should replace the tnt card with a geforce. an mx card is a good choice, as is the athlon 900. it wouldn't matter if u upgraded to a geforce 2 ultra (fps display wise), and the performance gain between an athlon 900 and a 1200 isn't THAT big. the truth is: neither the athlon 900 NOR the 1200 can cope with a really heavy scene like the geometry2.max benchmark scene....

    u really have to work smart. use low-poly substitute models (max 4's new multi-res modifier is perfect) while animating, etc.

    good luck
  6. So in 3ds max the FPU is the biggest factor, right? So then a Radeon or a GeForce will perform about the same, right?
  7. this doesn't make sense to me. show me some numbers proving this. i believe the quadro2 doesn't outperform a geforce2 gts because they are manufactured using very similar if not exactly the same parts and the same assembly process. read up on modifying your geforce2 into a quadro2 for more info on this (<A HREF="" target="_new"></A>); the only difference is a "switch" turned on in the quadro2. the quadro2 was not designed to be (much) faster than a normal geforce2 but to have more features available. also, geometry2.max is only one benchmark you've shown this to be true in, for a particular case that TOM has already pointed out, and for only one model of geforce. if you could show that all these varieties of cards scored similarly on other benchmarks like lighting and textures, i might start believing you, but then speed isn't always what pro cards are about (sometimes it's higher quality at the same fps). if you have any benchmark results for the other varieties of geforce cards, please share them. right now i don't believe you.

    also, if 3ds is so fpu intensive and not bandwidth sensitive, how come the p4 (strength of the processor = high bandwidth :: weakness = fpu) wiped the table clean with the firegl2 in the geometry benchmark? i agree that 3ds is VERY fpu sensitive, but doesn't this show that it is also bandwidth sensitive, at least in certain circumstances?

    <A HREF="" target="_new"></A>

    also, according to the fpu theory, why do the quadro, firegl and wildcat perform so differently, and what makes the firegl perform so much better in geometry than the geforce and quadro? is it just an illusion that the wildcat cleans up and has the fastest fps in all the benchmarks? and according to this theory, why even have a 3d graphics card for 3d modeling at all?

    in response to upping your processor speed: you increased it by over 70% but only saw a performance gain of 47% (in the benchmark you were looking at), and that's going from a duron to an athlon (not athlon -> athlon). a 700MHz duron goes for about $45 while a 1.2GHz athlon is $222: a 4.9:1 athlon to duron price ratio. a geforce2 64MB is ~$240 while a firegl2 is ~$1000: a ratio of 4.2:1. the performance gain according to the link above, for say the 1.2GHz athlon, is 225% (compared to 47% for the faster cpu; to get the same increase from the cpu alone you would need a 3.6GHz processor). so you get a huge increase in performance for a comparatively smaller price difference. graphics cards DO make a difference. you just need to know which ones do what you need them to.

    just realized i kind of screwed up and used geometry1 results for the last part, but the point remains the same, since TOM's geometry1 results for the geforce2 and quadro were even more dismal than soziopat's for geo2.
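    to make the price/performance comparison above concrete, here's a small sketch (python; the prices and gain percentages are the figures i quoted above, not authoritative numbers):

    ```python
    # price vs performance, using the figures quoted above (early-2001 prices, usd)
    duron_700_price, athlon_1200_price = 45, 222    # cpu street prices
    geforce2_price, firegl2_price = 240, 1000       # geforce2 64mb vs firegl2

    cpu_price_ratio = athlon_1200_price / duron_700_price   # athlon : duron
    gpu_price_ratio = firegl2_price / geforce2_price        # firegl2 : geforce2

    cpu_perf_gain = 47    # % faster, duron 700 -> athlon 1.2, in the benchmark
    gpu_perf_gain = 225   # % faster, firegl2 vs geforce2, per the linked results

    # performance gained per unit of price multiplier, as a rough value metric
    print(f"cpu: {cpu_price_ratio:.1f}:1 price for {cpu_perf_gain}% gain")
    print(f"gpu: {gpu_price_ratio:.1f}:1 price for {gpu_perf_gain}% gain")
    ```

    so for roughly the same price multiplier, the pro card buys a much bigger jump, at least in this particular benchmark.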

  8. hmmm a lot to answer to have i....

    first of all, quality isn't your main issue when u work on a scene with 200,000 polygons or more, like i do almost every day, i promise u that. i prefer quality to go into the renderings. just kinda happy rotating the view after 10-15 seconds of work (high-end huh?). yeah, sure, u try to work with low poly substitutions etc, but it's inevitable that u're gonna have to play with the complete scene from time to time.
    this here is the single issue 3d artists freak out on.
    see, textures and lights etc any geforce handles really well. thank god for the geforce. any model.

    secondly: my original thread here was actually posted as an opinion concerning some people wanting to benchmark different geforce cards against each other.
    according to my experience, i don't notice ANY difference working on a geforce sdr or a quadro card in real life.
    @ least not where it matters, like with heavy scenes with tons of polys.
    that is the main point and experience i want to share with the 3d community. don't spend your hard earned cash wrong!
    and that is why everyone @ work decided against the quadro card. not much difference (or could it be they're gaming after i've left... hmmm, will have to look into that).

    i haven't got all that much time to do serious pro benchmarks @ work, which is a pure production environment where we are always supposed to have finished our work yesterday. that's why i've tried to get tomshardware/sharky and others to do some serious benchmarking of different geforce and other alternative video cards, different os's, and platforms (amd, intel) in 3d apps like 3ds max. no luck.

    the reason i'm referring to geometry2.max is that it'll bring your pc down to its knees, and it ALMOST represents the kind of geometry 3d artists like myself have to cope with every day.
    it's one of my favorites. forget synthetic benchmarks and theoretical opinions without real world experience. just run this here benchmark and see where your system will get u.

    the overclocked pentium 4 with the fire gl2 scores well.
    better than the fire gl2 with an athlon.
    could it be they managed to optimise them drivers better for the pentium 4s? especially since them athlons smoke any intel while rendering? and the athlon beat the pentium 4 in geometry1 with a geforce 2 gts?

    cause if it's the case that bandwidth matters for viewport display fps, how come ddr systems don't do better than sdram systems in 3d? or the geforce ultra doesn't smoke a crappy mx card in the 3ds max geometry2.max or pro cdrs tests? read tom's latest test on geforce 2 scaling, plus the pro cdrs and lightscape tests, which represent what us sad 3d artists are going insane over:
    high polygon count.

    actually, let me quote dear mr tom on this one concerning the pro cdrs test, if u are into this more theoretical benchmark stuff:

    "It looks like ProCDRS is extremely performance intensive. Using the GeForce 2 Ultra will not enable much more performance than a GeForce 2 MX is able to provide. Just a faster processor will speed up ProCDRS. "

    quite frankly, my main point is this:
    don't go buy a geforce 2 ultra thinking it'll speed up your heavy scenes in 3ds max. don't expect the quadro to help u out either.
    there are high-end cards available if u can pay.
    otherwise, a crappy geforce mx will do just as well in 3ds max. u'll need to adapt your workflow to use low poly proxies etc always ANYWAY. i doubt the geforce 3 will do much better with heavy scenes either (though i sure hope so). any geforce is good for textures and lights; 25 or 50 fps won't matter much there.

    whether u believe me or not is not my main issue here.
    i just want to share my experience with other 3d artists.
    come on; we need @ least 15 fps in the geometry2.max benchmark scene, not 3.4-3.5 or so. jesus christ, i should have forced u into dragging my 870,000 polygon scene around them viewports on my pc here in stockholm for a few hours. lucky for u u aren't closer to me... u would also freak out....
    go insane.... cursing nvidia.... amd.... intel.... and everyone else.

    take my advice or don´t.
    believe me or not.

    now i have to go drag that scene around some...
    good luck with your systems anyway
    and happy animating!
  9. i really recommend u buy a geforce over the radeon if u want to run 3ds max. any model (second hand sdr/ddr, newer mx etc) will do fine.
    the geforce is better than the radeon with textures and lights, and at poly pushing (right now that is). spend your money on a fast cpu instead; u won't regret taking my advice on this.

    i'm not saying graphics cards DON'T make a difference; they do. what i'm saying is that which geforce card u get doesn't matter that much, as long as u get ONE of them. in 3ds max that is.

    of course high end cards are available which will smoke the geforce, quadro and even the fire gl2/3.
    of course u can game faster and in higher resolutions with a geforce 2 ultra.
    if u got the cash, hey, go spend it all.
    if you are on a budget, get a geforce mx or similar.
    they'll do just as fine in max as a geforce 2 gts will.
    textures and light animation are fast; 40 or 50 fps don't matter here.
    when it comes to heavy scenes: neither the mx nor the ultra will cope any more. that's a promise for u...

    the point i'm trying to make only concerns different geforce cards and/or a faster cpu IN 3DS MAX. NOT games.
    or other synthetic benchmarks or fictive tests.

    happy animating and good luck if u decide to go down the max route. the world will be your oyster.

    this here wonderful thread has made me realise that
    they seem to be gaming @ work, after i've left that is. will have to get into that..... maybe some network malfunctioning after 1800 hours will sort it out...
  10. Thanks soziopat, I agree that in Windows 2000 an nvidia chipset may mean fewer headaches for professional work than what ATI is dishing out at present. When it comes to textures and lights in high polygon scenes, the Radeon really starts to shine because it is not wasting as much time texturing the unseen polygons from your view point. In this case, as the complexity goes up, the Radeon performs better and better relative to the GF2. As for straight untextured polygon rendering, the GF2 would clearly be the winner in all cases as far as I can see, which would be the most important in modelling. I use TrueSpace5, which really is not in the same league as 3ds max, but in my case the Radeon performs superbly. I really enjoy your professional experience and comments; hope to keep reading your fine posts.
  11. strange, very strange...

    "akuna mutata" braza... :wink:
  12. Thanks soziopat! I've been waiting for someone with experience in max and various cards. Since I am buying my system to run max as well as my money will allow, I know your suggestions are dead on. When the geforce 3 is out, the geforce 2 mx will go down in price, so I'll be able to get an even better cpu to start out with before I overclock it. I curse the day I bought my P3 500, a month before amd released the athlon. Ah well, lesson learned. You've been a major help. I actually expected newer cards to fare way better in geo2 than my tnt2, but I guess it is just too much for quite some time yet. The most important thing is that the vid card can handle the lights and medium sized polys. I'm still a novice with animation; I need to sell my legs and one arm before I can afford to go to animation school. Even going 10 grand into debt isn't enough. The geforce 2 will also run games well, which really isn't a good thing in my case since animation is more of a hobby. It's hard to stay focused when you have no one to teach or push you.
  13. some tests i stumbled upon while early-morning-coffee-drinking-surfing. they test graphics card viewport display with maya; the results should be comparable in any 3d app.

    unfortunately, the tests don't seem to incorporate the radeon. i think you will find that a faster cpu gives better performance than the same graphics card with a slower cpu.

    not a lot of tests out there for people like us, so i thought i should share them with u guys.

    happy animating.
  14. i am self-taught. i feel i need to give u some advice when it comes to landing a job in the 3d world. i know how tough it is.

    uninstall all games (for a while @ least). don't be afraid of slow pcs. you will always have too slow a pc...
    first of all, mess around with your 3d app some. do some basic modelling, texturing and animation tutorials just to get a grasp on stuff, tools, workflow. then decide upon a project animation, set out a week by week schedule, and finish your work plan every week, by sunday at the latest. if you are too lazy to finish before sunday: force yourself into staying in front of the pc until you do finish. as pure punishment.

    this will make u get your work done. and this way u will focus on, say, modelling for a week or a few weeks, and gain in-depth knowledge within one area at a time.

    yeah, it's a hard road to travel, self-education.
    but this is ultimately how it's going to end anyway; even if u would attend classes or whatever, the truth is: u will have to teach yourself anyway.

    today, with the internet, one can get good feedback and a lot of help from the 3d community. there are forums where u can post images of work in progress and get feedback from all over the world (!). questions can be asked in tons of forums.
    the 3d community is very helpful. there are tutorials everywhere.

    what's most important: hang in there and stay with it. most people can't and won't learn enough to spit out models and animations; make sure u will.

    good luck.
    u know u can do it.
    now go do.
  15. I've been doing it for a little more than a year now, and the games have made it so I don't know all my stuff right now. My worst fear is modeling; it is hard for me to do. I know the basics of texturing, but I've mostly stuck with character animation. I love to just animate anything a person would do, but the last large project I had a schedule for was last year in grade 12. Back then I didn't know anything about modeling, so modeling took about 9/10 of the total time I spent on it. I spent 3 months, about 8 hours a day, animating the final for the class. Having to go to school for 6 hrs cut into the time I spent on it, but every minute when I got home went into working on that damn thing. I was kinda disappointed with it, even though it got me an A, because I could see where I crammed to get things done and where it could have been so much better. I know now that I want to do this, and I will be doing it a hell of a lot more now too. Could you please post some links to good discussion boards? The only place I went was the discreet forum, when I couldn't figure something out.

    One last thing I need to do is get away from those plugins! I spend way too much time fooling with them, so I don't hone my skills in the areas that count.

    Thx for all your help, you helped me spend my small amount of money wisely and you've inspired me! As long as I have enough discipline and the right help when I need it I won't have to spend thousands of dollars for school, although everyone seems to hire from that school so it is a good in for getting a job.
  16. (community message boards) there is a good in-depth tutorial about subdivision modeling (w/ meshsmooth) that u should print out, under quick links. in the message forum u can even post your own work and get feedback... it requires that u can upload your images onto a server somewhere, though, and add a link to the image. read more @ their site (massive resources: community, tutorials, free stuff)

    just to get u started some.

    hey: focus on animation, will ya! nothing wrong with that! tons of job openings for animators-only; even in small town stockholm there are! but sure, your portfolio should contain an organic modeled object or so. the rest of them objects u could just rip off 3d cafe or someplace else.

    one guy who landed a job as an animator @ my work animated really simple stuff like boxes, toothbrushes etc, nothing fancy, but hey: they were so funny i still laugh just thinking about them! character animation isn't about 200,000 poly objects. it's about the ability to bring the simplest thing to life!

    good luck!
    this discussion should probably continue somewhere else now... like in a 3d forum... sorry bout this mr tom sir...
  17. Thx for the links. One thing I'd like to ask is whether or not to go for a duron, since the fpu is the same. Will the extra cache of the tbird be needed in max at all?

    The reason I ask is that I have to replace my 256mb of pc100 ram with pc133, and I want to go above 256mb. The only problem is that I'm on a small budget, so going duron would save me lots of money.
  18. the duron works just as well as the athlon @ the same clock speed, in max. only the duron isn't available in 1.2-1.3 ghz yet, but other than that....
    the duron 850/geforce mx is probably the most value you'll ever get for your money!
  19. discovered another spot on the web where u can upload your own images and get some... critique!
    easier upload too: just upload from your HD like a regular attachment

    i know i will. post work in progress that is. see ya there?
  20. cool, I posted my work there too, my id is m_kelder of course :) thx again, and again, you've been a great help!
  21. here are the settings for optimal open gl performance running geforce for 3ds max:

    under nt right click desktop, properties, geforce, additional properties, open gl settings:

    "enable buffer region extension" ON
    "allow the dual planes extension to use local video memory" ON
    "use fast linear-mipmap-linear filtering" ON
    the rest of the settings in that window should be OFF

    "default color depth for textures" should be: "use desktop color depth"

    "buffer flipping mode" should be "auto-select"
    "vertical sync" should be "always off"

    finally "use up to 5mb (or whatever) of system memory for textures in pci mode" should be changed to "0" (zero).

    run the geforce in 16-bit mode.
    for gaming purposes u might wanna turn fast linear-mipmap-linear OFF to get better quality textures, but personally i have grown to kinda like them noisy gritty dirty textures...

    ahhh. love dirty noisy 3d.