Solved

Quadro K4000 + GeForce GTX 780: how to pair them?

I am looking to build a workstation for 3ds Max modeling, animation, and rendering.

The Quadro K4000 {http://www.nvidia.com/object/quadro-desktop-gpus.html} is meant to give me an edge while modeling in the viewport, which uses OpenGL, as does V-Ray RT (Real Time).

But for the final image renders, the cards with more CUDA cores seem to produce much faster render times, according to this article:
www.tomshardware.com/reviews/best-workstation-graphics-card,3493-18.html
especially point 18, "Iray Renderer + 3dsMax Results".

I saw a LinusTechTips video where he paired a Quadro K4000 with a GeForce GTX 780 {http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-780}.

I always thought you couldn't do that, but he doesn't look like an amateur. Can someone please clarify whether this setup would work, and how? What would be used for what?
Would the Quadro handle the OpenGL tasks while the GTX handles the rendering?

I realize I might not have explained myself well; if anyone needs clarification, please ask.

Here's the link to the video where he builds the system:
http://www.youtube.com/watch?v=PhkJLF3oyI8&feature=player_detailpage

It's a lengthy video, so you can just jump to [40:00] where he starts installing the GPUs, or to [04:03] where he shows the components of the build.

Looking forward to your input.
I would really appreciate your help.
  1. kicoverz said:
    [quotes the original post in full]


    It depends on which CPU you have: anything above a 3770K/4770K needs a dedicated card, and Quadros tend to heat up badly, so separating the workloads would help your 780 (your choice). But with a 4770K, the onboard GPU will suffice for the display, giving you the option of deciding in 3ds Max which GPU to dedicate to rendering and which to the viewport.

    With a 3930K you would already need a dedicated card, and could perhaps offload rendering to the CPU.
    You cannot SLI them, but you can assign which GPU you want to offload your rendering to.

    Only workstation cards and Xeons support ECC; GTX and Radeon cards don't, nor do Intel's consumer parts.

    You would need two Quadros to SLI, but without ECC and the CUDA cores you would suffer severe ray-tracing issues.
  2. @DrBackwater

    Thank you for your feedback, you make very good points.

    I actually haven't decided on the CPU yet. I am torn between a 4930K (to benefit from the OC potential, at the expense of ECC RAM) and dual Xeon E5-2630s {http://ark.intel.com/products/64593/}, at the expense of the ability to overclock; however, I would gain 12 extra threads to help speed up render times, even though they only run at 2.3 GHz to 2.8 GHz (with Turbo). [Of course, they cost more; almost twice the 4930K.]

    BUT

    If I were going to go with the Xeons: the Quadro K4000 doesn't support ECC, so I was going to choose a single Quadro K5000 instead of having two; but of course, the Xeon option is much more expensive.

    So, briefly, and with regard to this thread:

    The CPU is a 4930K, and the MB is an Asus P9X79 WS-E.

    You said "you cannot sli them but you can assign what gpu you want to offload your rendering."
    I believe this is my question: how do I do that?

    and

    "You would need 2 quadros to sli but you would suffer severe raytracing issues with out ecc and cuda cores."
    Are you saying that if I use a 4930K, I wouldn't be able to utilize the CUDA cores of the GPU?

    I want the Quadro to handle the OpenGL tasks, while the GTX handles the CUDA tasks with its massive 2304 cores!
    The question is: how do I do that, and would it work the way I am planning?

    "Depending what cpu you have as anything above 3770k/4770k needs a dedicated card and quadros heat real bad so seperating them would improve (your choice) but if 4770k then your onboard apu will suffice giving you options in 3d max what your dedicated card will be for view point."

    So, with a 4930K, the Quadro would be the one hooked to the monitor (the main dedicated GPU); but I didn't understand the second part of your statement, "giving you options in 3d max what your dedicated card will be for view point". Could you please clarify what you mean? What options?

    Again, thank you for your help :)
  3. kicoverz said:
    [quotes the previous post in full]


    You could build an FX-8350 system, an 8-core AMD CPU on socket AM3+; cheaper, too. You could use the K4000 there to drive the viewport for all your designs.

    THEN

    build a single-CPU Intel Xeon system on socket 1150 that includes the GTX 780, and let that do all the render-farming. It works both ways, but you'll then have two PCs running (one with the K4000, the other with the GTX 780). You can also go down the 4930K path; a nice CPU, though not worth it in my view. The same goes for the 5930K, soon to be released.


    3ds Max will tell you (in its options) which GPU you have for rendering. As a secondary solution, it's not to say you can't use the K4000 for rendering in dedicated mode; it's just slow. And ECC isn't something you would care about unless vertices and vectors need to be calculated to the extreme; it amounts to the slow-and-steady approach.

    All Quadro cards are ECC-certified, as are AMD FirePros (but no AMD card can use Iray).


    Dual-CPU setups are seriously expensive; the motherboards are $400 minimum. The cheapest dual boards are socket 1366, but good luck building one; parts are hard to find.
  4. Dr.Back, you have been very helpful; I really appreciate your feedback.
    I would have marked you as Best Answer from the first post, but I'd like to benefit from your expertise just a little further :)

    I have never used AMD or gone that way; I have no experience with them. Your suggested solution does sound a lot cheaper, but I am not sure about the overall performance: the FX-8350 ranks 59th, while the i7-4930K ranks 8th, according to this popular benchmark website:

    {http://www.cpubenchmark.net/high_end_cpus.html}

    The i7-4930K even beats the Xeon processors (not surprisingly); but as you mentioned, Xeons support ECC and are more stable; they are workstation-oriented rather than consumer gaming/workstation parts.

    You do sound like a professional, and I would like to draw on your expertise a little more, if you don't mind.

    My budget is around $3,500-4,000.

    My final core build is leaning toward the following:

    MB: Asus P9X79 WS E

    CPU: i7-4930k

    RAM: 32 GB (for starters), up to 64 GB (in the near future)

    GPU: Quadro K4000 (for OpenGL computation) + one GeForce GTX 780 Ti (for CUDA), plus an additional GTX in the future to SLI with the first one

    An SSD for caching and a WD 1TB for storage (to be upgraded to RAID 0 for the SSDs and RAID 1 for the HDDs in the future, when I can afford it).

    This is a freelancer's workstation, so it will be a money maker. I do not intend to use it for gaming, nor am I interested in looks. All I need is a smooth workflow during modeling (the Quadro) and fast rendering to finish up (the GTX).

    The primary software is 3ds Max (plus Mudbox and ZBrush), and the renderer is V-Ray.

    To sum it up in a single question:

    Can I configure my GPUs the way I described, Quadro + GTX? And how? Do I simply put them in the slots and that's it? Will one cancel out the other?

    Here's a link to how the V-Ray engine works, in case details are needed:
    http://help.chaosgroup.com/vray/help/rt100/render_gpu.htm

    If I understood the link correctly, I believe I can actually choose which GPU I want to use for a specific task (RT or rendering).
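    For reference, here's a rough sketch of how per-application GPU selection can look from the software side. This is illustrative only, not V-Ray's actual API: it parses the listing format that `nvidia-smi -L` prints (the sample shown is made up), and notes the standard `CUDA_VISIBLE_DEVICES` environment variable that CUDA applications honor; whether a given V-Ray build respects it depends on the version.

    ```python
    import re

    def list_gpus(smi_output: str):
        """Parse `nvidia-smi -L` style output into (index, name) pairs."""
        gpus = []
        for line in smi_output.splitlines():
            m = re.match(r"GPU (\d+): (.+?) \(UUID:", line)
            if m:
                gpus.append((int(m.group(1)), m.group(2)))
        return gpus

    # Made-up sample of what `nvidia-smi -L` prints on a two-GPU system:
    sample = (
        "GPU 0: Quadro K4000 (UUID: GPU-xxxxxxxx)\n"
        "GPU 1: GeForce GTX 780 (UUID: GPU-yyyyyyyy)\n"
    )
    gpus = list_gpus(sample)

    # A CUDA renderer can then be restricted to the GTX alone by hiding the
    # Quadro from CUDA before launching the app, e.g.:
    #   CUDA_VISIBLE_DEVICES=1 <renderer>
    ```

    Inside 3ds Max itself, V-Ray RT exposes its own device list in its render settings, so the environment variable is only one of the levers.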
  5. Best answer
    kicoverz said:
    [quotes the previous post in full]



    I cannot tell you how to spend your money, only advise you on the paths you can take (and yes, we all love leftover money).

    Now, as I've mentioned before, Intel Xeon CPUs are expensive for a reason: they are tailored toward the business sector, and carry a large premium over Intel's consumer CPUs.

    Two Intel Xeon platforms that are extremely fast are the socket 1366 platform and the socket 2011 platform.

    http://processors.findthebest.com/d/p/LGA-.-1366

    These Xeons' clock speeds are fast (though 1366 is an outdated architecture), with added costs that are astronomical, especially when you branch toward socket 2011, where Xeons are ridiculously expensive.

    http://ark.intel.com/products/75279/


    Now, is it a good investment? Maybe; if your business can afford it and time is extremely critical, then yes.

    But it would simply blow your budget entirely.

    Now, you asked whether the FX-8350 is any good compared to the 4930K. They're hard to compare, as AMD does not have anything higher than 8 cores; but then, neither does Intel.

    The 3930K has 6 cores (2 more are soldered off, for a total of 8 on the die), but its 6 physical cores plus 6 logical cores result in what is in theory a 12-core CPU (Hyper-Threading).

    AMD has a genuine 8-core CPU, but its clock speeds are much slower; Intel's overall advantage is per-core clock speed.

    Intel deliberately soldered off the 2 extra cores on the 3930K because they thought it beneficial to focus on the business side of things.

    But comparing the cost of AMD (FX-8350) to Intel (3930K) is pointless; the AMD price point is insanely cheap, and the FX-8350 is a capable multitasker. Yes, it might take an extra 10 minutes or longer on a render, but are you actually going to sit at your desk while it renders? Some scenes can take a long time to render (excluding Pixar Studios; they have magical offloading technology).

    But at around $200, the AMD CPU is pretty good value.

    If you're not using the 3930K/4930K for anything other than gaming and browsing the web, then you're wasting your money and time, as the FX-8350 serves the same purpose while being cheaper, for a reason.

    Xeons are purpose-built machines, some extreme (the elite PC enthusiast with the highest god complex will take this approach). Lower-class Xeons below that line simply serve server-side duties: hosting, calculations, perhaps modified bitcoin miners, rendering, research labs, genetics divisions.

    Even AMD has its own Xeon-class CPUs (Opteron); they can also be cheaper, but in dual-CPU configurations they too become astronomical.
    Yes, any premium motherboard will suffice; it doesn't necessarily have to be a workstation motherboard, just good build quality.

    How sure are you that your projects are going to exceed 32 GB of memory? (Yes, dual Xeons support 128 GB of RAM.)

    But to answer your question while excluding the history lesson:

    Would I recommend Intel Xeons? Only if you're serious about future endeavours. You're already serious and have a path paved out in your head, so hold that Xeon thought for later, when you're confident in your skills.

    The 4930K will suffice.

    When rendering, 3ds Max will use either the CPU or the GPU, given the choice. If you've selected the Quadro instead, you'll have ECC enabled for correction, plus the 3 GB on board, including the shared memory from your RAM (the K in K4000 denotes the improved Kepler architecture).

    And again, if you use the 780, it can have stability issues where the render might not come out as well, because it's clocked too fast (while Quadro clock speeds are lower).
    It's risky on long renders with GTX cards (that's where error-correcting memory comes in) compared with the slower-clocked cards.
    You could look up a tutorial on how to turn your 780 into a Quadro; the only downside is that you still won't have ECC on that card, but the clock speeds will then be much slower, with the added driver support.

    The 780 IS the Titan, and the Titan is a Quadro K6000; NVIDIA separated the markets years ago.

    Remember: all NVIDIA cards are built around the OpenGL language, while AMD cards are built around OpenCL.

    Quadro = Open Graphics Language. Pros: Iray and CUDA technology support all Autodesk products. Cons: it's only on NVIDIA cards.
    FirePro = Open Computing Language. Pros: open, supports 80 percent of Autodesk products while being 20-35 percent cheaper than Quadro. Cons: no Iray, and limited to the 3ds Max extensions that don't fall under NVIDIA's CUDA lingo.
    CPU = both OpenCL and OpenGL.
  6. Sorry to dig this thread up again, but I'm having the same problem, and I'm so glad I came across this thread.
    My current GPU is a GTX 780; it handles 3ds Max modeling, animation, and rendering pretty well, but lately I've gotten some projects with heavy scenes and lots of particles, to the point that I can barely navigate the viewport.
    So I was thinking about getting a K4000, which is within my budget. But I don't know whether the two can live in the same PC. I did research, and there are lots of answers, both yes and no: some say installing both the GeForce and Quadro drivers is fine; others say they will conflict; even NVIDIA advises against it.
    This is my current spec:
    i7 4770
    asus z87-A
    16 GB RAM
    GTX 780
    650w PSU

    I really need a professional opinion on what I should do.
    I really appreciate your time, guys; thanks very much.
  7. BigZero said:
    [quotes the previous post in full]



    Yes, they will work.
    They're both graphics cards; some just do things differently than others.

    My little Quadro FX 4800 is a beast and runs BioShock Infinite fast, but it runs hotter than the sun.

    So yes, it will run physics and viewports, but I wouldn't recommend rendering with it, as it's slow; dead slow.

    (Quadros for viewports and the GTX for rendering, that is, if ECC isn't important. If it is, then you'll need a K5000-series card, which can handle reasonable rendering with correction as well as the viewport, if your budget can afford it.)

    Error-correction code mainly applies to architectural design and critical situations that need extreme attention.
  8. HERE'S YOUR FINAL ANSWER ON THE QUADRO+GTX QUESTION:

    DON'T DO IT.

    DO NOT trust any of the other posters here, as they have not tried it and therefore DO NOT know definitively what they are talking about. I, on the other hand, went ahead and built one of these machines, deciding to take the risk, fully wanting a payoff in Max and V-Ray. The result was NOT PRETTY. Trust me when I say you do not want to try to run two completely different graphics cards in one single rig. Worst idea I've ever had, and to be perfectly candid, Linus needs to get slapped across the face for ever having made that video. It's possible that his techies, with 10+ years of tinkering experience, have some back-alley way of hacking their machines to make it work for them; but if that isn't you, I would advise you not to even think about going there, and I'm about to show you why.

    In November, the day after the Linus video came out, I had been planning a self-build, wanting it for 3ds Max, V-Ray, Mental Ray, Revit, AutoCAD, Maya, Premiere Pro, After Effects, Photoshop, SketchUp, and a host of others I am quite versed in, being in the architecture field. So I decided to go with the Linus build, and here is my parts list, more or less:

    Asus P9X79-E WS, i7-4930K, Noctua NH-D14 CPU heatsink (don't buy), Quadro K4000, GTX 770 WindForce, AX850 PSU, an SSD for apps, two 1 TB drives in RAID 1 for storage, and 32 GB RAM (non-ECC).

    Three things:

    1.) The P9X79-E WS is overpriced garbage. Read the reviews on Newegg and take them to heart, because they are all true. Go with a Sabertooth board; they are much more reliable. My P9X79-E WS had all sorts of errors, and I ultimately had to RMA it after my friend (who works in IT for a major university and has his master's in CS) helped me determine that some fan-speed errors were the result of bad fan headers. Not a well-made board at all; those like Linus who say otherwise are full of it; just read the Newegg reviews to find the truth. One final thought with this board, and this build in general: go all workstation, or all/mostly consumer. Drop this wishy-washy "oh well, it does both" garbage. No, it doesn't; you're just too cheap to go WS, and you don't need it anyway, so go consumer grade.

    2.) Everything seemed to work initially, but one of the GPUs had to be uninstalled in short order due to massive driver conflicts. THESE CARDS USE DIFFERENT DRIVERS, and there is no way around that. You WILL have nightmares trying to solve that issue, when really all you need is a machine that works for 3ds Max and V-Ray. Don't waste your time.

    3.) The Quadros use double floating-point precision. It's all the card you will need; forget the idea that you need more than that. If you are in Max and V-Ray, it will be more than enough card for you. My latest design is three city blocks detailed out to the fourth floor, and it still zooms around no problem. V-Ray isn't going to use CUDA cores like you think: on my rig it mostly uses the CPU, and the GPU is mainly keeping my polys up while modeling. It's just the way the program works. The only place I am running into a need for more GPU is in video editing. So I am on here trying to find out if I can SLI two Quadros; but if not, no big deal, because I don't do that much editing.

    So, those were the issues I ran into, here was my solution:

    Sabertooth X79 mobo, i7-4930K, Quadro K4000, 32 GB RAM (non-ECC), H100i, and I upgraded my storage to four WD RE4 1 TB drives running in RAID 10.

    I haven't looked back. To whoever said Quadro cards run hot: you are foolish! My card has NEVER gone above 22 degrees C, even while running OCCT for 24 hours straight. It is the coolest card out there. This machine is a beast and whisper quiet. It uses the 4930K to crush renders at speeds that would blow your mind, and my modeling is getting ridiculous; I am even being sloppy about my technique, and my rig hasn't broken a sweat yet. Like I said, the only time I had issues was in a little video-editing session, and it wasn't even that serious. I even play some games (I love Assassin's Creed; it's my stress relief).

    So take my advice or don't, but just know you are better off sticking with either consumer grade or WS grade in one machine. THIS IS FROM EXPERIENCE, so don't pay attention to the words of "experts" who haven't actually tried this and aren't running Max and V-Ray. BTW, don't waste your money on WS-grade parts unless your plan is to run your machine LITERALLY 24/7, 365. Otherwise don't bother. I hope my words of experience were helpful.
  9. rockthedesigner said:
    [quotes the previous post in full]

    I have one in my system as we speak, no problems: an FX 4800 and a GTX 770. I should upgrade the GTX, though, but overall no problems.
    Autodesk has confirmed that even Revit or Maya will not be supported on gaming cards; they're not certified cards.
    Quadros use double floating-point precision to calculate (the math of) everything in space, and the more vertices that make up each polygon, the larger the scene eventually becomes.
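    For what the single- vs double-precision point is worth, the difference is easy to see in a toy example (plain Python, nothing V-Ray-specific; `to_float32` just rounds a value to 32-bit precision the way a single-precision pipeline would, so repeated additions accumulate more rounding error than the 64-bit version):

    ```python
    import struct

    def to_float32(x: float) -> float:
        """Round a 64-bit Python float to the nearest 32-bit float."""
        return struct.unpack('f', struct.pack('f', x))[0]

    def accumulate(value: float, n: int, single: bool) -> float:
        """Add `value` to a running total n times, optionally rounding the
        total to 32-bit precision after every addition."""
        total = 0.0
        for _ in range(n):
            total += value
            if single:
                total = to_float32(total)
        return total

    n = 1_000_000
    double_total = accumulate(0.0001, n, single=False)  # stays very close to 100.0
    single_total = accumulate(0.0001, n, single=True)   # drifts noticeably further away
    ```

    The same effect is why long accumulations (GI solutions, physics sums) care about precision, while a quick preview render can get away with less.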


    You can pair two Quadros, depending on the model, but overall you're better off with a K5000 or a FirePro card. As for someone who wishes to add a GTX, Radeon, or FirePro: it isn't simply about making some monstrosity, but about having a balance.

    You stated your intentions as an architect; therefore a Quadro is necessary, given every calculation involved, even from V-Ray (irradiance-map lighting). However accurate the viewport may eventually need to be, a workstation card is what's recommended.

    Most games these days are made up of 2D and 3D techniques; but in engineering, the complexity isn't so forgiving (you could get away with it in architectural design). The complexity engineers need when modeling a bridge with every rivet, down to the complexity of the lighting, isn't possible with normal graphics cards, and I think Linus knows that all too well.

    People come here to ask if they can use a GTX and a Quadro in one system. Yes, they can. Is it advised for an architect or an engineer? No.

    As a games designer, or a developer of some form, yes.


    In regards to CUDA and the CPU:

    V-Ray Advanced uses brute force to literally calculate ray by ray and bounce by bounce of light for the whole GI solution. These very small "problems" are a waste for the long, complicated compute threads of modern CPUs: the CPU is "done" with each one very fast, but it has to wait for the next problem to queue up, and you end up calculating 8 or 12 or 24 at a time (depending on the threads your CPU(s) have) out of hundreds of thousands or millions of bounces. The massive parallelism built into the hundreds or thousands of simple compute units (a.k.a. CUDA cores, shaders, etc.) in GPUs is very efficient at calculating these exact problems; you get to calculate one bounce per shader, achieving decent rendering speeds.
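    That CPU-vs-GPU contrast can be sketched in a toy example. This is purely illustrative: Python threads stand in for GPU shader cores, and `shade_bounce` is a made-up stub rather than real ray math. The point is only that each bounce is independent of the others, so the whole batch can be mapped out at once instead of queued one by one:

    ```python
    import random
    from concurrent.futures import ThreadPoolExecutor

    def shade_bounce(seed: int) -> float:
        """One tiny, independent 'problem': the contribution of a single
        light bounce (stubbed as a deterministic pseudo-random value)."""
        return random.Random(seed).uniform(0.0, 1.0)

    bounces = range(2_000)  # a real GI solution has millions of these

    # Serial: one long queue of tiny jobs; the worker finishes each one
    # almost instantly and spends its time moving on to the next.
    serial = sum(shade_bounce(b) for b in bounces)

    # Parallel: since no bounce depends on another, they can all be mapped
    # out at once; threads stand in here for the thousands of CUDA cores /
    # shaders that each evaluate one bounce.
    with ThreadPoolExecutor(max_workers=8) as pool:
        parallel = sum(pool.map(shade_bounce, bounces))
    ```

    Both paths produce the same total; only the scheduling differs, which is exactly why this workload shape rewards a card with thousands of simple cores.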


    GTX 780 Ti > K6000 > GTX Titan > GTX 780 >> GTX 770/680 > K5000 > GTX 670 > GTX 760 >> GTX 660 >> K4000.
    A K4000 is slow for GPU rendering, and a K2000 is nearly useless for it (too slow). You can use them, sure, but the compute power per dollar is horrible. I am mentioning NVIDIA cards only, simply because the current V-Ray RT GPU (2.x) is horribly optimized for AMD cards (despite the latter probably being much better at compute than NVIDIA).
  10. DrBackwater said:
    rockthedesigner said:
    HERE'S YOUR FINAL ANSWER ON THE QUADRO+GTX QUESTION:

    DON'T DO IT.

    DO NOT trust any of these other posters on here, as they have not tried it and therefore DO NOT know definitively what they are talking about. I, on the other hand, went ahead and built one of these machines, deciding to take the risk, fully wanting a payoff in MAX and VRAY. The result was NOT PRETTY. Trust me when I say you do not want to try to run two completely different graphics cards in one single rig. Worst idea I've ever had, and to be perfectly candid, Linus needs to get slapped across the face for ever having made that video. It's possible that his techies with 10+ years of tinkering experience have some back-alley way of hacking their machine to make it work for them, but if this isn't you, don't even think about going there, and I'm about to show you why.

    In November, the day after the Linus video came out, I had been planning a self-build for use with 3ds Max, Vray, Mental Ray, Revit, AutoCAD, Maya, Premiere Pro, After Effects, Photoshop, SketchUp, and a host of others I am quite versed in, being in the architecture field. So I decided to go with the Linus build, and here is my parts list, more or less:

    Asus P9 X79-E WS, i7 4930K, Noctua NH-D14 CPU heatsink (don't buy), Quadro K4000, GTX 770 Windforce, AX850 PSU, an SSD for apps, two 1 TB drives in RAID 1 for storage, and 32 GB RAM (consumer grade, not ECC).

    Three things:

    1.) The P9 X79-E WS is overpriced garbage. Read the reviews on Newegg and take them to heart, because they are all true. Go with a Sabertooth board; they are much more reliable. My P9 X79-E WS board had all sorts of errors, and I ultimately had to RMA it because my friend, who works in IT for a major university and has his master's in CS, helped me determine that some fan-speed errors were the result of bad fan headers. Not a well-made board at all; those like Linus who say so are full of it. Just read the Newegg reviews to find the truth. One final thought with this board and this build in general: go all workstation or all/mostly consumer. Drop this wishy-washy garbage of "oh well, it does both". No it doesn't; you're just too cheap to go WS and don't need it anyway, so go consumer grade.

    2.) Everything seemed to work initially, but one of the GPUs had to be uninstalled in short order due to massive driver conflicts. THESE CARDS USE DIFFERENT DRIVERS!!! And there is no way around that. You WILL have nightmares trying to solve that issue when really all you need is a machine that works for 3ds Max and Vray. Don't waste your time.

    3.) The Quadros use double-precision floating point. It's all the card you will need; forget the idea that you need more than that. If you are in MAX and Vray it will be more than enough card for you. My latest design is three city blocks detailed out to the fourth floor, and it still zooms around no problem. Vray isn't going to use CUDA cores like you think; on my rig it uses mostly CPU, and the GPU is mainly keeping my polys up while modeling. It's just the way the program works. The only place I am running into a need for more GPU is video editing. So I am on here trying to find out if I can SLI two Quadros, but if not, no big deal, because I don't do that much editing.

    So, those were the issues I ran into; here was my solution:

    Sabertooth X79 mobo, i7 4930K, Quadro K4000, 32 GB RAM (consumer grade, not ECC), H100i, and I upgraded my storage to four WD RE4 1 TB drives running in RAID 10.

    I haven't looked back. To whoever said Quadro cards run hot: you are foolish! My card has NEVER gone above 22 degrees C, even while running OCCT for 24 hours straight. It is the coolest card out there. This machine is a beast and whisper quiet. It uses the 4930K to crush renders at speeds that would blow your mind, and my modeling is getting ridiculous; I am even being sloppy about my technique, and my rig hasn't broken a sweat yet. Like I said, the only time I had issues was in a little video editing session, and it wasn't even that serious. I even play some games (I love Assassin's Creed; it's my stress relief).

    So take my advice or don't, but just know you are better off sticking with either consumer grade or WS grade in one machine. THIS IS FROM EXPERIENCE, so don't pay attention to the words of "experts" who haven't actually tried this and aren't running MAX and Vray. BTW, don't waste your money on WS-grade parts unless your plan is to run your machine LITERALLY 24/7, 365. Otherwise don't bother. Hope my words of experience were helpful.

    I have one in my system as we speak, no problems: an FX 4800 and a GTX 770. I should upgrade the GTX though, but overall no problems.
    Autodesk have confirmed that even Revit or Maya will not work properly on gaming cards; they're not certified cards.
    Quadros use double-precision floating point to calculate (math) everything in space, and the more vertices that make up one polygon, the larger the scene eventually becomes.


    You can pair two Quadros depending on the model, but overall you're better off with a K5000 or a FirePro card. As for someone who wishes to add a GTX or Radeon, the point isn't simply to make some monstrosity but to have a balance.
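    The vertex-count point above can be made concrete with a back-of-the-envelope estimate. The 32-bytes-per-vertex layout below is an assumption (a typical position + normal + UV layout), not a figure from any specific application:

```python
# Assumed layout: position (3 floats) + normal (3 floats) + UV (2 floats),
# 4 bytes each = 32 bytes per vertex. Real applications vary.
BYTES_PER_VERTEX = 32

def scene_vertex_memory_mb(num_vertices):
    """Estimated vertex-buffer memory for a scene, in megabytes."""
    return num_vertices * BYTES_PER_VERTEX / (1024 ** 2)

# A 10-million-vertex architectural scene needs roughly 305 MB of vertex
# data alone, before textures, indices, or per-frame buffers.
print(round(scene_vertex_memory_mb(10_000_000)))  # → 305
```

    Numbers like this are why scene size, not just CUDA core count, decides whether a 3 GB card stays comfortable in the viewport.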




    DrBackwater:

    That's all well and good that you have figured out how to run two cards in your system, but you still haven't explained how you deal with the issue of conflicting drivers. Yes, we all know how awesome having lots of CUDA cores is in theory, but the whole point of getting a professional-grade card such as a K4000 has a lot to do with the drivers developed for those cards, as well as the double-precision floating-point capability, which I have already found to be invaluable. Render times are less of an issue if I can't produce an accurate model in a timely fashion. As I stated before, when I tried a build with separate cards just as Linus showed, the problem was that the drivers conflicted for the different cards and ultimately caused my system to become unstable. I think that is the main issue kicoverz needs a solution to before he tries something that most hardware and software companies expressly warn against doing.
  11. rockthedesigner said:


    You just stated what I previously said before, lol.

    Quadros have exclusive driver support that GTX cards do not offer. Every person who buys them already knows this :P, or should by now. But without those drivers, these Quadro cards are essentially GTX cards with ECC enabled, that's all.

    You must switch the GTX card off (disable it) rather than running the two in parallel, or the drivers will conflict. As I said before: assign a task to the GTX later on, not while the Quadro is calculating vertices. Dedicate one card to one thing and the other card to rendering.

    Quadros are designed for these situations, but that's not to say you cannot use a GTX; it's just not recommended.


    Just decide which card you'll use when you need to. Right now my Quadro is turned off; not only does that save power, but I also use it only when I need it.

    Same for the GTX: I use it when I need it, not both simultaneously.
    Same for a workstation audio card.
  12. Yeah, Dr. Back,

    I don't mean to be a nag, but I still don't see how you've "solved" anything. For me, and I'm sure for a lot of others out there, you're going to have to do better than telling me to "turn off" a graphics card. That's not nearly specific enough. In fact, I'm pretty sure you are missing my point completely. Drivers are something you install on your operating system, and it takes a minute or two to do so, and I have never seen a setting in Windows for turning off a driver, but then again, I am not an expert. Like many on here, I am at the level of building my own PC by watching a YouTube video or two. So you're going to have to describe how that is done, because I have also never seen an off switch on a graphics card, and taking one out requires a few steps, including turning off your PC, opening it up, taking a screw or two out, and unplugging the card from the PCI slot.

    But even if you do that, I still don't see a solution for the system instability I experienced when trying to run two separate drivers simultaneously in one operating system. Operating systems are not designed to handle two separate sets of code telling them to do the same thing in different ways, and I have seen firsthand how unstable Windows becomes when that happens. So maybe you are running both cards with one driver, but again, I'm not sure how that is possible, due to the fact that drivers are coded to recognize specific cards.

    I'm sorry if I seem uninformed or naggy, but telling me to simply "turn off" a graphics card misses my point, doesn't answer my question, and therefore IMHO does not answer the question of the original thread. Therefore I believe my answer is the more relevant one, since it assumes a level of computer literacy similar to mine, which is not at a novice level, but not expert either.
  13. rockthedesigner said:



    So you're trying to distinguish what I know compared to what you know, with nothing but rants about your issues. As advised, go buy a K5000; it will adequately do what you need, rendering and viewport, for $1700. What, too much money?

    You are serious, right?

    What people have to understand is that this isn't a magic bullet. (Yes, you have a graphics card and therefore you want to get value from it, but it's not Autodesk's job to certify every card for every situation. If it bothers you that much, go use Blender; it's free and works reasonably well with most graphics cards, though you will not have the added extensions Autodesk offers that support Nvidia workstation cards.) All a GTX card is going to offer you is speed and time.

    However, if you wish to utilize your GTX card for Autodesk products, then do so:

    1. Open the Start menu in the bottom-left corner (depending on your operating system).
    2. Right-click Computer and open Device Manager.
    3. Find the display adapter and right-click it, opening a small window; under the Driver tab you get options to disable, uninstall, and so on. Given that we cannot pair Quadro and GeForce cards under different drivers, we can instead assign a task to the GTX; after all, its job is to speed the process up.
    Warning: you'll need to back up your current drivers if you do uninstall them. If issues arise, we may have to uninstall the drivers, preferably the GTX card's (if we wish to work in a design environment); a driver sweeper should do:
    http://driver-sweeper.en.softonic.com/
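    A lighter-weight alternative to uninstalling drivers, for CUDA-based renderers specifically (V-Ray RT GPU, iray), is to restrict which GPUs the renderer can see. CUDA applications generally honor the `CUDA_VISIBLE_DEVICES` environment variable; the device index below is an assumption for illustration, so check your own ordering with `nvidia-smi -L` first. A minimal sketch:

```shell
# Sketch: expose only one GPU (index 1, assumed here to be the GeForce) to
# CUDA applications such as a GPU renderer, leaving device 0 (assumed to be
# the Quadro) free to drive the viewport. Indices follow CUDA enumeration,
# not physical slot order.
export CUDA_VISIBLE_DEVICES=1
echo "CUDA apps launched from this shell see device(s): $CUDA_VISIBLE_DEVICES"
```

    On Windows the equivalent is setting the same variable in System Properties, or in a batch file that launches 3ds Max, so the viewport card stays out of the renderer's device list.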

    From what I can tell, you've experienced issues with Windows; therefore you need a Xeon machine running a Quadro, and with that you would have no problems. They're not cheap, and only a certain number of Xeon builds will actually do games, at an insane price.

    But we are talking about stability, right? ECC RAM, an ECC motherboard, ECC GPU memory, an ECC-enabled CPU: a complete workhorse that runs Windows efficiently in any situation.

    Oh, okay, it's the drivers? Well then uninstall them; after all, it's the Quadro we want, not the GTX. And when you're ready to use the GTX, assign it a task.
  14. If it seems like ranting, I apologize for that. My goal is not to come on here and ruffle feathers for the sake of ruffling feathers, or for the sake of having some battle of wits or PC-IQ measuring contest. I will only comment on something I have close personal experience with.

    My goal was to simply get more information out there about this topic, because the experience I had was initially negative due to a lack of information.

    I have a K4000 and I am actually very happy with it. They are better cards than some people make them out to be. Given my personal workflow, I am happy with my setup and would encourage anyone with a similar workflow to do the same: mostly consumer grade, with a WS-grade card for better viewport navigation. I use mostly CPU power for rendering and haven't had any complaints yet, because render times have more to do with workflow and render settings IMHO. Vray RT works OK for a specific type of workflow, but no one I know uses it. And for that matter, I am part of a large community of Vray/3ds Max/AutoCAD users, and to your credit, Dr. Back, people use WS and consumer-grade cards alike. You are right, and I don't think I ever argued that you couldn't choose either one; I merely feel choosing one over the other is a wise choice for the majority of designers, based on average PC knowledge. I also agree that going full workstation with ECC and Xeon and all the bells and whistles would be awesome, if you can afford it.

    I am glad that my "rants" finally got someone to be a little more specific about how you actually operate a system using two different cards, so I would say thank you Dr. Back for attempting to shed some light on that, because I think a lot of people will benefit from that information. I wasn't ever questioning your general knowledge, it is just a pet peeve of mine when people advocate a method that is controversial and then don't follow up with any details on how to do it without causing more trouble than it was worth.

    So thank you for finally providing us with more detail. You are awesome for doing that.
  15. I'm just going to throw this into the frying pan. I know there are a lot of Quadro users, and those who're using Quadros for getting a job in an industry, a hobby, or study.

    But if it's game related, or design related:

    I have listed a few engines for people who may be interested in another path. (Not everything starts and ends with 3ds Max; personally, I don't think you need a Quadro either, and I think Blender is a unique tool. You can also transfer OBJ files over to Blender; food for thought.)

    The free indie developer path.

    Take it with a grain of salt:

    Torque 3D: great community, and free too. You must know C++, but you get access to core features internally in C++.

    Crystal Space: steep learning curve, with linking libraries and rewriting the library for the engine; physics engines included, all default shaders.

    Unity: no core-feature access; you'll need to learn the scripting language, but C++ isn't necessary. Gentle learning curve.

    Unreal Engine: there's a new monthly licence fee now for the commercial version.

    CryEngine: there's a new monthly licence fee now for the commercial version. Trivial learning curve.

    3d space: a full-featured engine with no programming; just link all the logic parameters and functions for shading, with impressive DirectX 11 effects. Free, but you must learn a new language that's similar to JavaScript (hell, they're all similar to JS).


    Just import your file formats into these engines. Sadly, low-level APIs (like Crystal Space or Ogre 3D) will need a rewritten extension to accept your complex matrix of vertices and polygons, baked or not.

    You may like http://www.garagegames.com/products/purelight/workflow.htm when working with lighting. As I said, take it with a grain of salt.
  16. I would fully agree that most of those programs are perfect for HOBBY needs. In fact, some of them have the advantage of being free, which is a major plus, as paying for AutoCAD software can be painful.

    However...

    If you are a professional architectural designer, speaking from years of learning the hard way, these are the programs you need to learn, because there is nothing close to them in terms of actual working documents and real-world business applications. If you need to make money in the architecture profession, do not waste time with any programs besides the following:

    1.) Revit.
    2.) 3ds Max + Vray
    3.) Adobe Master Collection
    4.) AutoCAD

    See the following for a second opinion and for decent hardware discussion (although I personally have had better luck with my quadro 4000 than every single one of my colleagues, including the master renderer teacher whom I learned all my hottest skills from): http://www.palaciosdesign.com/articles/the-revit-workstation---hardware-recommendations

    It's fine to go cheap on hardware, especially if you are gaming, but I personally do not play games. I do work.

    If that speaks to you, here is some motivation: https://www.youtube.com/watch?v=Sk56VxaeqEQ

    Get on it people.
  17. "If your computer is not instantaneous, it is too slow and there is an opportunity for improvement that your competition will exploit."

    ~ Mark Palacios

    100% agree.

    http://www.palaciosdesign.com/articles/the-revit-workstation---hardware-recommendations
  18. rockthedesigner said:



    Actually, those programs are very complex. They do need Visual Studio to fully incorporate interesting things, and yes, the Express edition is free. I think everyone in design should learn some form of coding (scripting plus design makes you versatile, though C++ is for purists), be it:

    Python
    JavaScript
    MAXScript
    C
    C#
    C++
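    As a tiny, hypothetical illustration (plain Python, not the 3ds Max or MAXScript API) of why scripting pays off in design work, a few lines can generate geometry that would be tedious to model by hand:

```python
# Hypothetical helper, for illustration only: parametric profile of a
# straight staircase. Changing `steps`, `tread`, or `rise` regenerates
# the whole profile instantly, which is the point of scripting.
def staircase_points(steps, tread=0.3, rise=0.18):
    """Return the (x, z) corner points of a stair profile in metres."""
    pts = [(0.0, 0.0)]
    for _ in range(steps):
        x, z = pts[-1]
        pts.append((x, z + rise))          # up one riser
        pts.append((x + tread, z + rise))  # across one tread
    return pts

profile = staircase_points(3)
assert len(profile) == 7  # 1 start point + 2 points per step
```

    The same pattern in MAXScript, or a Python script inside a DCC tool, would emit real scene geometry instead of bare coordinates.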


    And for motivation, all you need is this: https://www.youtube.com/watch?v=w5tWYmIOWGk

    :)
  19. https://www.youtube.com/watch?v=pWdd6_ZxX8c

    The complexity is the point. I want to be able to access both sides of my world, the technical and the picturesque. The complexity also allows an endless amount of functionality, and I don't know which Visual Studio we are referring to, but I've never needed it. I also know some basic C, C++, etc., but have yet to actually use them for design. When you can do everything faster and more efficiently in one program with no creative limitations, making any shape and giving it any texture, why would you need anything else? It's called a workflow for a reason: because it works. C++ is pointless; if you are a professional architect, don't waste your time with it. You don't need it.

    Using any other program besides Revit, for that matter, is like trying to build a house with a hammer and nails. You will probably get a great result, but the guy with a nail gun is going to finish 100 houses in the time it takes you to do one.

    Food for thought.
  20. rockthedesigner said:
    https://www.youtube.com/watch?v=pWdd6_ZxX8c

    The complexity is the point. I want to be able to access both sides of my world, the technical and the picturesque. But also the complexity allows an endless amount of functionality, and I don't know which visual studio we are referring to but ive never needed it. I also know some basic C, C++ ect, but have yet to actually use them for design. When you can do everything faster and more efficiently in one program with no creative limitations, making any shape and giving it any texture, why would you need anything else. Its called a workflow for a reason, because it works. C++ is pointless, if you are an architect professional don't waste your time with it, you don't need it.

    Using any other program besides Revit, for that matter, is like trying to build a house with a hammer and nails. You will probably get a great result, but the guy with a nail gun is going to finish 100 houses in the time it takes you to do one.

    Food for thought.


    No, not C++. I said it's for purists, people who know data structures; what you want is scripting.
    C++ is more core-oriented: it's sloppy, messy, leaky, and deep-rooted, and must be recompiled and rerun each time until something works, and errors become prevalent all the time.
    Scripting is more versatile: it works on the outside, has more flexibility, and you can make changes instantly on the run. With great things come limitations too, but we seek the good stuff.

    Recreating dynamic water fluids, calculated physics, light refraction, more precise animation: a virtual resume.

    Anyhow, maybe this video will enlighten you: http://www.youtube.com/watch?v=0EyYUKLk_qQ

    Remember, you can learn all the skills for that one cause, but a design is just a design. You can build the best car in the world, but without the best driver, all you have is static art bound for no cause. Those who favour art are those who pursue the same interest in a small, specific community of artists.

    Imagine that as your resume.
  21. I have a REALLY long way to go, but this is the bar I'd like to one day reach:

    http://www.alexhogrefe.com/blog/2013/7/31/the-third-and-the-seventh-by-alex-roman.html

    From what I understand, it was all 3ds Max + V-Ray. I've heard it was built from a SketchUp model as well, which is crazy.
  22. DrBackwater said:
    rockthedesigner said:
    Yeah, Dr. Back,

    I don't mean to be a nag, but I still don't see how you've "solved" anything. For me, and I'm sure for a lot of others out there, you're going to have to do better than telling me to "turn off" a graphics card. That's not nearly specific enough; in fact, I'm pretty sure you are missing my point completely.

    Drivers are something you install on your operating system, and it takes a minute or two to do so, and I have never seen a setting in Windows for turning off a driver. But then again, I am not an expert; like many on here, I am at the level of building my own PC by watching a YouTube video or two. So you're going to have to describe how that is done, because I have also never seen an off switch on a graphics card, and taking one out requires a few steps, including turning off your PC, opening it up, taking a screw or two out, and unplugging the card from the PCI slot.

    But even if you do that, I still don't see a solution for the system instability I experienced when trying to run two separate drivers simultaneously in one operating system. Operating systems are not designed to handle two separate sets of code telling them to do the same thing in different ways, and I have seen first-hand how unstable Windows becomes when that happens. So maybe you are running both cards with one driver, but again, I'm not sure how that is possible, because drivers are coded to recognize specific cards.

    I'm sorry if I seem uninformed or naggy, but telling me to simply "turn off" a graphics card misses my point, doesn't answer my question, and therefore, IMHO, does not answer the question of the original thread. Therefore I believe my answer is the more relevant one, since it assumes a level of computer literacy similar to mine, which is not at a novice level, but not expert either.



    So you're trying to distinguish what I know compared to what you know, with nothing but rants, given your issues. As advised, go buy a K5000; it will adequately do what you need, rendering and viewport. For $1700, what, too much money?

    You are serious, right?

    What people have to understand is, this isn't a magic bullet. (Yes, you have a graphics card, and therefore you wish to get value from it; but it's not Autodesk's job to certify every card as capable in all situations. If it bothers you that much, go use Blender: it's free and works reasonably well with most graphics cards, but you will not have the added extensions that Autodesk offers, which support NVIDIA workstation cards.) All a GTX card is going to offer you is speed and time.

    However, if you wish to utilize your GTX card for Autodesk products, then do so:

    1. Click the Windows button in the bottom-left corner (this varies with your operating system).
    2. Right-click Computer and open Device Manager.
    3. Find your card under Display adapters and right-click it to open a small Properties window, then select the Driver tab. There you will have options such as Update, Roll Back, Uninstall, and so on. Given that we cannot pair Quadro and GeForce cards under different drivers, we can instead assign a task to the GTX; after all, its job is to speed the process up.

    Warning: back up your current drivers before you uninstall anything. If issues arise, we may have to uninstall the drivers (preferably the GTX card's, if we wish to work in a design environment); a driver sweeper should do:
    http://driver-sweeper.en.softonic.com/

    From what I can tell, you've experienced issues with Windows; therefore you need a Xeon machine running a Quadro, and with that you would have no problems. They're not cheap, and only certain Xeon builds will actually do games, at an insane price.

    But we are talking about stability, right? ECC RAM, an ECC-capable motherboard, ECC GPU memory, an ECC-enabled CPU: a complete workhorse that runs Windows efficiently under any situation.

    Oh, okay, it's the drivers? Well then, uninstall them; after all, it's the Quadro we seek, not the GTX. And when you're ready to use the GTX, assign it a task.





    Hi folks.

    It's good to finally see accurate information about the endless Quadro vs. GTX battle. Thanks a lot.
    You guys talk mainly about rendering, but I was wondering what the end result would be for animation? It is hard to tell whether the CPU or the GPU will be used to make your character / scene animation move faster.

    Here is an example of what I'm talking about (the full conference talk covers lighting/rendering as well).
    Apparently this machine runs with a K6000. That being said, we could think the GPU is the key, but it's really thanks to their software, "Presto", that they are able to use the full power of the Quadro K6000.

    But what would it look like with a GTX 780 Ti, animation-wise (Maya or 3ds Max)? Are Quadros only good at showing vertex shaders, etc., or do they increase the fluidity of the animation in the viewport?

    I'm also looking at having both a GTX 780 Ti and a Quadro K4000 (or a K5000 if I feel it's worth it), but as my only concern is animation, I'm kind of lost in the middle of this war.

    Thanks