GTX 560Ti vs GTX 680 CUDA

Last response: in Graphics & Displays
March 23, 2012 1:18:51 PM

Hi,
I'm kind of a newbie with GPUs.
I'd like to build a rig around a 3930K for 3ds Max and V-Ray RT rendering.
Considering the budget, my intention was to get a GTX 560 Ti 2GB (384 CUDA cores).

Looking purely at the numbers for the GTX 680: will its 1536 CUDA cores mean it's simply 4x faster? (384 x 4 = 1536)
Or is this some sort of marketing trap for people like me? :-)
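To show what I mean, here's my naive back-of-envelope in Python (peak throughput = cores x shader clock x 2 FLOPs per cycle, using the published reference clocks; I realize real rendering speed depends on far more than this number):

```python
# Naive peak-throughput comparison. Clock figures are the published
# reference specs; real rendering performance depends on the renderer,
# drivers, and architecture, not just this raw number.

def peak_gflops(cuda_cores, shader_clock_mhz):
    """Theoretical single-precision GFLOPS: cores * clock * 2 ops (FMA)."""
    return cuda_cores * shader_clock_mhz * 2 / 1000.0

gtx_560ti = peak_gflops(384, 1645)   # GTX 560 Ti: 384 cores @ 1645 MHz shader clock
gtx_680   = peak_gflops(1536, 1006)  # GTX 680: 1536 cores @ 1006 MHz (no hot clock)

print(f"GTX 560 Ti: {gtx_560ti:.0f} GFLOPS")
print(f"GTX 680:    {gtx_680:.0f} GFLOPS")
print(f"Ratio: {gtx_680 / gtx_560ti:.2f}x, not 4x, despite 4x the cores")
```

So on paper it is roughly 2.4x, not 4x, because the 680's cores run at a much lower clock.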

Thanks
M


March 23, 2012 6:32:23 PM

Don't go with a 680 just yet.
V-Ray RT doesn't use CUDA, it uses OpenCL. Previous NVIDIA cards were good enough for OpenCL rendering, but the 680 seems to have extremely low DirectCompute scores. It looks like they want to create a distinction between the GTX and Quadro cards.
If you need OpenCL for V-Ray RT, go with AMD; they're the best at the moment for any OpenCL work. If you want CUDA for iray, then NVIDIA is your only option.
April 19, 2012 9:26:45 PM

Hi Tremolo,

I want to buy a video card for the same reason; I want to use V-Ray and probably 3ds Max's built-in renderer.

There are a few things you should know:

1. Video cards work on "real-time" rendering, that is, V-Ray RT and 3ds Max's Nitrous viewport; the "normal" render is still done by the CPU.

2. Video cards help speed up navigation while you model, so even with really big render scenes or huge poly counts everything stays fluid and stable.

3. GPU rendering is greatly dependent on program compatibility and drivers, so until the GTX 680 receives support from V-Ray and 3ds Max (if they ever get attention on that front), you will not know whether it is worthwhile for 3D rendering over the cheaper GTX 580.

4. The GTX 680's CUDA cores are not equivalent to the GTX 580's CUDA cores, so in this case the big numbers are probably just marketing, as you say.

5. The main problem with the GTX 680 seems to be the dual GPU processor, which makes it incompatible with many current renderers; if this is "fixed", in theory it might make the GTX 680 worthwhile for 3D rendering.


Suggestions: wait a bit, if you can, until the dust settles. If you cannot wait, just grab the GTX 580 and start rendering like crazy. That way, even if it turns out the GTX 680 overtakes the GTX 580, you will already have enjoyed yours enough not to mind the new improvements, plus you are 100% certain it is a card that will work for you.
April 20, 2012 1:33:36 AM

@vaell why not start your own thread?
Anyway, I'll respond to the original thread and your questions. NVIDIA's Quadros have more stability and accuracy when it comes to these things. For example, in AutoCAD your models (or designs) in the viewport might be inaccurate and a 'little off' compared to the same scene on a machine using a Quadro. Quadros also have special drivers that improve stability and boost performance in software such as AutoCAD, 3ds Max, Maya, etc.
Now, if you're going to be using a renderer that supports CUDA, you're probably better off with one of the GeForce cards that has a large number of cores. The Quadros are expensive, but if money is not a problem then by all means go for one of that line.
April 20, 2012 10:42:55 AM

@carlosb: I have a similar discussion in my thread about building my rig.

Like carlosb said, Quadros are mostly renowned for their stability and compatibility, not for raw power. It is like a medical instrument: you do not pay for raw power, you pay for the assurance that it will work 100% correctly every time.

I believe that anyone who has to ask whether a Quadro is worth it will probably be more than fine with a CUDA-enabled video card,

like the GTX 580 (top of the line currently usable for V-Ray) or the GTX 680 (absolute top of the line currently working with the Octane renderer).
April 20, 2012 7:52:08 PM

vaell said:
@carlosb: I have a similar discussion in my thread about building my rig.

Like carlosb said, Quadros are mostly renowned for their stability and compatibility, not for raw power. It is like a medical instrument: you do not pay for raw power, you pay for the assurance that it will work 100% correctly every time.

I believe that anyone who has to ask whether a Quadro is worth it will probably be more than fine with a CUDA-enabled video card,

like the GTX 580 (top of the line currently usable for V-Ray) or the GTX 680 (absolute top of the line currently working with the Octane renderer).


Oh... sorry, I misunderstood and thought you were asking a question. That explains the answer on your part. @petersollberg, listen to vaell, as he probably has more experience than me in these things.
May 30, 2012 2:22:17 AM

Tremolo_45 said:
Don't go with a 680 just yet.
V-Ray RT doesn't use CUDA, it uses OpenCL. Previous NVIDIA cards were good enough for OpenCL rendering, but the 680 seems to have extremely low DirectCompute scores. It looks like they want to create a distinction between the GTX and Quadro cards.
If you need OpenCL for V-Ray RT, go with AMD; they're the best at the moment for any OpenCL work. If you want CUDA for iray, then NVIDIA is your only option.

??? What are you saying, brother? Actually the 680 has very high, solid scores (if you know what that means) in real-life applications. AMD? I just bought a 7970 and I still have problems with the drivers, not to mention the rendering problems, flickering, etc. I have been giving AMD graphics cards chances, buying them always believing that something solid would come up... wrong! The same happened with the HD 5850, and now with the 7970. Man, that's a lot of money! So I'm going back to NVIDIA. I just bought an ASUS GTX 680 and, as I expected, it is very solid working with After Effects and Vegas 11; when rendering with Sony Vegas 11 I no longer get that annoying pink flickering. So shame on you, AMD; NVIDIA GeForce rules the world, and that's a fact, not a matter of opinion!!!
June 29, 2012 11:04:25 AM

Has anyone done any tests with the 680 on 3D real-time rendering yet? V-Ray or iray?
I plan to get a dual GTX 680 rig and I mainly do 3D graphics, but I have read a lot of mixed reviews and opinions. It seems that the 1000+ CUDA cores of the 680 are in fact downclocked, and the performance is actually similar to or even lower than its predecessors...
Really confused here...
Thanks
June 29, 2012 12:21:01 PM

Quote:
4. The GTX 680's CUDA cores are not equivalent to the GTX 580's CUDA cores, so in this case the big numbers are probably just marketing, as you say.


They are not marketing crap. They would only be marketing crap for someone assuming "a bigger number must be better". The way NVIDIA implements CUDA cores in Kepler and Fermi is different, but it is not the same as the scenario of 1GB vs 2GB of RAM on a weaker card. No one is going to say: hey, that many CUDA cores are useless for your needs, go for a card with fewer cores.

Quote:
5. The main problem with the GTX 680 seems to be the dual GPU processor, which makes it incompatible with many current renderers; if this is "fixed", in theory it might make the GTX 680 worthwhile for 3D rendering.


What do you mean by "dual GPU"? AFAIK only a single GK104 is used in the GTX 680. Did you mistake the GTX 690 for the GTX 680?

bigg71 said:
Has anyone done any tests with the 680 on 3D real-time rendering yet? V-Ray or iray?
I plan to get a dual GTX 680 rig and I mainly do 3D graphics, but I have read a lot of mixed reviews and opinions. It seems that the 1000+ CUDA cores of the 680 are in fact downclocked, and the performance is actually similar to or even lower than its predecessors...
Really confused here...
Thanks


They were not downclocked, but with the Kepler generation NVIDIA ditched the hot clock for the shader frequency. Yes, the new cores are weaker than the previous generation's CUDA cores, but NVIDIA overcomes the diminished per-core performance by putting many more CUDA cores on the chip. If you need compute performance, then the GTX 680 might not be for you; the card was designed with gaming performance in mind, and that is what NVIDIA delivers with the GTX 680.
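To put some rough numbers on the hot-clock change (published reference clocks, theoretical peak only; real renderer performance is a different story):

```python
# Sketch of the hot-clock change: on Fermi (GTX 580) the shaders ran at
# twice the core clock; on Kepler (GTX 680) they run at the core clock,
# so each CUDA core does less work per second and NVIDIA compensates
# with ~3x the core count. Clocks below are the reference specs.

def peak_gflops(cores, shader_mhz):
    # theoretical single-precision throughput: cores * clock * 2 FLOPs (FMA)
    return cores * shader_mhz * 2 / 1000.0

gtx_580 = peak_gflops(512, 772 * 2)   # Fermi: 772 MHz core, shaders hot-clocked 2x
gtx_680 = peak_gflops(1536, 1006)     # Kepler: shaders at the 1006 MHz core clock

per_core_580 = gtx_580 / 512          # ~3.1 GFLOPS per core
per_core_680 = gtx_680 / 1536         # ~2.0 GFLOPS per core: weaker individually

print(f"GTX 580: {gtx_580:.0f} GFLOPS total, {per_core_580:.2f} per core")
print(f"GTX 680: {gtx_680:.0f} GFLOPS total, {per_core_680:.2f} per core")
```

The 680 still wins on total theoretical throughput; the point is that comparing core counts across Fermi and Kepler is apples to oranges.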