3ds max graphics

I have a strange scenario that I would like explained:

At school my class is using 3ds max 7 to learn 3D modeling and animation. At home I do the same thing and work on some projects of my own. What's strange is that my gaming computer renders faster, displays more, and generally works better than the graphics workstation at my school.
Here are the basic specs:
Home: E6600, 7600GT, 2 GB RAM
School: P4 3 GHz, Quadro FX 1000, 1 GB RAM

I know it isn't the RAM, because mine was faster even a few months ago, before I upgraded. Is the 7600GT better suited to 3D modeling than the Quadro? Hmmm... it really isn't a big deal, it just got me thinking.
  1. OK, from what I have read around here, this is what I think the difference is. The actual rendering is more CPU-dependent than GPU-dependent, so since you have a faster CPU at home, that would speed up your render times, I believe. The GPU only has a big effect during the actual modeling and designing, i.e. pre-render, where moving large models around takes a fair bit of GPU power to display on screen, even though they tend to be wireframes. That's my take on it; I could be totally off base, and I'm sure someone will correct me later. I used to do a lot of 3D graphics on my current rig about 8 years ago with 3D Studio MAX.
  2. Yeah, I had the CPU part figured out, but that still doesn't explain why the actual modeling/animating work before rendering was also a lot smoother on my rig. Maybe the gaming and workstation cards overlap at a certain point, and that is where the 7600GT stands?
  3. I would guess it has to do with the extra gig of RAM. And I believe that workstation-class and consumer-class video cards are pretty much the same, except the workstation cards are set up to work better in OpenGL than in DirectX. They usually get more RAM as well, but otherwise they're similar to a standard gaming card, so I would assume a good gaming card would perform just as well in OpenGL as a workstation card. I have little to no knowledge of the actual differences between the two, though. I think there was a topic started some time ago about the differences between a workstation card and a consumer gaming card; you might try searching for it, since it could provide more insight than my lowly self. LOL
  4. Exactly.

    The Quadro isn't used during rendering, so only the processor matters, and the E6600 is far more powerful than the P4 (especially for 3dsmax, since it can fully use dual-core CPUs). My school also has P4 3.0GHz machines, and mental ray renders on them are 2.5x - 3x slower than on my E6300.

    As for the comparison between the FX1000 and your 7600GT, I don't really know which one should work best... but the FX1000 is an "entry-level" Quadro. Anyway, are you sure the MAXtreme driver is correctly installed on your school's computer? 3dsmax can fully use Quadros only with this driver installed and with the display set to OpenGL mode (a Quadro is almost useless in DirectX mode... at least until 3dsmax 9).
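    The dual-core advantage described above comes down to how offline renderers (the scanline renderer, mental ray) work: they split a frame into independent buckets that can be computed in parallel while the GPU sits idle. Here is a rough, hypothetical sketch in plain Python (not anything from 3ds max; the per-bucket workload is made up) of why two cores chew through the same bucket list roughly twice as fast as one:

```python
from multiprocessing import Pool

def render_bucket(bucket_id):
    # Stand-in for tracing one tile of the frame: pure CPU work,
    # no GPU involvement, independent of every other bucket.
    acc = 0
    for i in range(200_000):
        acc += (bucket_id * i) % 7
    return acc

def render_frame(num_buckets=16, workers=2):
    # workers=2 models a dual-core chip like the E6600 processing two
    # buckets at once; workers=1 models the single-core P4. The result
    # is identical either way; only the wall-clock time differs.
    with Pool(processes=workers) as pool:
        return sum(pool.map(render_bucket, range(num_buckets)))

if __name__ == "__main__":
    print(render_frame())
```

    Since the buckets share nothing, the speedup is close to linear in core count, which is consistent with the 2.5x - 3x gap reported above once the E6600's higher per-core speed is added on top of its second core.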
  5. I'd say that being a school computer networked with who knows how many students would slow things down a bit. I can say that on my own network, when I'm using 3ds max 7 and a buddy is downloading from my computer, things get somewhat slower.
  6. Well, I did a little searching on the web and found more info about the difference between workstation and gaming video cards. Basically the GPU is the same on both, but on workstation cards the rest of the setup is geared towards workstation graphics: they are usually clocked slower for stability, and some have features built into the card itself that work together with the graphics program to improve certain functions, like programmable shaders and other optimizations. I would think a good gaming video card would perform fairly well in those programs, but the workstation card will still work better. The biggest difference between your school PC and the one at home is that your home system, GPU excluded, is considerably faster with the C2D chip and more RAM, so it makes up for the difference in graphics cards. As long as you have a good set of OpenGL drivers (which, along with stability, is a big part of what you pay for with a workstation card), a consumer graphics card should perform well in 3D apps. But then again, the GPU only really counts pre-render, unless you are trying to do real-time rendering. The workstation GPU will make the creation and modeling faster, but your home rig is faster (excluding GPU) overall, which makes up for the difference.
  7. (EDIT: Didn't see the part describing the card; the FX1000 is a very weak model, based on the FX5800/NV30. It's not much of a competitor for the GF7600GT, which has more memory bandwidth, more vertex power, more texture power; it's more powerful all around.)

    You need to find out which Quadro it is; it could be an old Quadro DCC, a Quadro4-series card, or an FX500, none of which has impressive strength for accelerating pre-render graphics. It'd be like putting an X1600 up against a FireGL 8800 (R8500-based): no match in raw power.

    Also, some versions of 3DSMax just favour a card with strong vertex/geometry power, and depending on your settings they will also favour one setup over another.

    Depending on the model, that Quadro at school could be a weak FX-based Quadro; against the geometry and texture power of the GF7600 it'd be unable to keep up in many ways. I'd be surprised if it were a high-end Quadro FX 4xxx- or 5xxx-series card in those school machines being beaten, but anything in the 500-1500 range would be easily understandable, and even the Quadro FX 3000 series would likely struggle against a GF7600GT.

    Some apps are affected a lot by specialized drivers, but 3DSMax is a little less so (they make a difference within the same architecture, but not as much across generations). There is some boost, especially under the OGL version of 3DSMax, but it wouldn't be enough to lift the truly low end over a solid gaming card in 3DSMax.

    Anyway, you might want to check which model of Quadro it is at school. I have a feeling it's something with a very unimpressive core, and likely low memory bandwidth too, by the sounds of it.
  8. Ahh Grape Ape now this is the kind of person we need posting in here. Bow to the wise one. 8) :lol: I envy your breadth of knowledge always a good read when Grape Ape jumps in.