
Sony Vegas Pro 12 Rendering with GTX 670

December 7, 2012 5:25:16 PM

Hi guys,

I built a new rig a month ago and I've had a blast with gaming and 3D modelling on it so far, but recently I've been fairly disappointed with my video rendering performance.

My specs are:

i7 3770k
16gb Memory
GTX 670


I've been trying to render videos in Sony Vegas Pro 12 using CUDA or OpenCL, but selecting either option before rendering makes no difference compared to rendering with the CPU only.
Also, OpenCL does not seem to be recognized in Vegas, even though GPU-Z clearly states that it is installed and available.

A 10 min video takes me 13-14 mins to render.

It's not just the render time: if I render videos longer than 10 mins, pixelated distortions appear at intervals somewhere through the video.

I am highly confident that either option should be making massive differences since I have a friend who uses:

i5 2400
hd 6770
8gb ram

... and using OpenCL, he renders videos a third quicker than their actual length (a 10 min video renders in 7 mins), and flawlessly.
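To put that gap in concrete terms, here's a rough back-of-the-envelope (times taken from the posts above; my 13-14 min render is approximated as 13.5):

```python
# Render speed expressed as (video length) / (render time):
# values above 1.0 mean faster than realtime.

def throughput(video_min, render_min):
    """Realtime multiplier: minutes of video rendered per minute of wall time."""
    return video_min / render_min

mine = throughput(10, 13.5)   # 10 min video, ~13-14 min render
friend = throughput(10, 7)    # his 10 min video renders in 7 min

print(f"mine:   {mine:.2f}x realtime")
print(f"friend: {friend:.2f}x realtime")
print(f"overall gap: ~{friend / mine:.1f}x")
```

So his HD 6770 setup is pushing out frames nearly twice as fast overall, despite the weaker hardware on paper.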

We render videos with the EXACT same settings:

Recorded:

30fps @ 1280x720 AVI in Dxtory



Render Settings:

MainConcept AVC/AAC - Internet 720p - 29.970fps





The only settings difference between us is that I allow 11GB of RAM usage in the program.


I don't really understand it, to be honest. Is Vegas not up to date in supporting newer cards?
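One quick way to double-check that an OpenCL runtime is visible to the OS at all, independently of Vegas or GPU-Z, is a minimal standard-library Python check (on Windows this searches for OpenCL.dll, on Linux for libOpenCL.so):

```python
# Checks whether the OS can locate an OpenCL ICD loader library.
# This says nothing about whether Vegas will actually use it,
# only that a runtime is installed and discoverable.
from ctypes.util import find_library

def opencl_runtime_present():
    """True if the loader can locate an OpenCL library on this system."""
    return find_library("OpenCL") is not None

print("OpenCL runtime found:", opencl_runtime_present())
```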
COLGeek
December 7, 2012 5:42:59 PM

Since you have a Nvidia card, you should use the CUDA rendering options (OpenCL for AMD/ATI).
December 7, 2012 5:43:00 PM

December 7, 2012 5:56:50 PM

It's the latest one

306.97
COLGeek
December 7, 2012 6:10:06 PM

What make/model of power supply are you using? Also, how are your system temperatures at idle and under load (like when rendering)?
December 7, 2012 6:22:08 PM

COLGeek said:
What make/model of power supply are you using? Also, how are your system temperatures at idle and under load (like when rendering)?


Antec High Current Gamer 620W

~32c idle
~39c Rendering

I know that under a heavy load it will hit ~56-60c (this is with my games)
December 7, 2012 6:24:20 PM

Mousemonkey said:
310.70 is the latest, I've no idea if it will help though.

http://www.guru3d.com/files_details/geforce_310_70_whql...



Hmm, I will check it out. Nvidia's site told me my GPU had the latest driver version installed, so that's why I assumed so.

Thanks a lot =)
COLGeek
December 7, 2012 10:21:02 PM

How hot is the GPU (video card) getting?
December 10, 2012 11:47:19 AM

COLGeek said:
How hot is the GPU (video card) getting?


Sorry, I didn't mention: the temps I posted were the GPU temps.

I can check CPU temps tonight if necessary.
COLGeek
December 10, 2012 1:08:41 PM

Was trying to determine if the GPU was overheating and then under-clocking itself. That does not appear to be the case.

Does Vegas have a "CUDA Only" rendering option in the templates provided?
December 10, 2012 1:36:29 PM

COLGeek said:
Was trying to determine if the GPU was overheating and then under-clocking itself. That does not appear to be the case.

Does Vegas have a "CUDA Only" rendering option in the templates provided?


I wasn't able to find anything to force CUDA only, just "Use CUDA if available".

I also checked in Vegas and it says that CUDA is available to use.
COLGeek
December 10, 2012 3:30:04 PM

Since you are using the current drivers (and, I assume, Windows updates), there must be a configuration issue within Vegas.

Did you install Vegas before or after you installed your 670?
December 10, 2012 4:25:27 PM

COLGeek said:
Since you are using the current drivers (and, I assume, Windows updates), there must be a configuration issue within Vegas.

Did you install Vegas before or after you installed your 670?



Vegas was installed afterwards
COLGeek
December 10, 2012 4:34:53 PM

I am at a complete loss. Has Sony support been helpful at all?
December 11, 2012 9:44:03 AM

COLGeek said:
I am at a complete loss. Has Sony support been helpful at all?



I am too. I really don't want to have to contact Sony and sit through an hour and a half walking through what I've already tried, but if I must, then so be it.

I've been looking around, and apparently a lot of others aren't satisfied with the 6xx series for video editing; supposedly the last generation of cards (GTX 5xx and HD 6xxx) works optimally. I'm not sure now whether this is only Vegas' problem, though: I tried Premiere CS6 and that seemed to take twice as long to render.

I also just read, while typing this out, that the 6xx series has apparently been stripped of DirectCompute and GPGPU features and aimed more at gaming :( 
December 11, 2012 7:41:48 PM

I have the same issue:
In Sony Movie Studio 12 with an HD 5670, it renders fastest with OpenCL.
With CPU only it renders 2-3 times longer (i5-2400).
After I upgraded to a GTX 650, CUDA rendering is like CPU only, and the GPU sits at 15-20% usage.
I've also installed an HD 7750 and it's weird: sometimes it uses CPU only (100%), sometimes CPU 70-75% and GPU 30% (always with the OpenCL option selected).
COLGeek
December 11, 2012 8:52:57 PM

Edmundoh said:
I am too. I really don't want to have to contact Sony and sit through an hour and a half walking through what I've already tried, but if I must, then so be it.

I've been looking around, and apparently a lot of others aren't satisfied with the 6xx series for video editing; supposedly the last generation of cards (GTX 5xx and HD 6xxx) works optimally. I'm not sure now whether this is only Vegas' problem, though: I tried Premiere CS6 and that seemed to take twice as long to render.

I also just read, while typing this out, that the 6xx series has apparently been stripped of DirectCompute and GPGPU features and aimed more at gaming :( 

There is no doubt that the GTX 6xx series is intended for gaming. The Quadro and AMD/ATI FirePro series of GPUs, of course, are optimized for CUDA/OpenCL workloads.
Sunius
December 11, 2012 9:04:29 PM

Did you check GPU usage (whether it's being used at all) during the render?
December 11, 2012 10:45:39 PM

Sunius said:
Did you check GPU usage (whether it's being used at all) during the render?


It does seem that the CPU is still primarily used when rendering, but as I posted earlier, those were my GPU temperatures, so I guess it's being used a bit. I don't really know how else to check my GPU usage apart from HWMonitor.

I did a re-render to double-check my results, and this time I got CPU readings too.

GPU:
Idle 37c
Rendering 41c

CPU:
Idle 33c
Rendering 60c


Here's a screenshot just to be clearer: http://i.imgur.com/K9dUn.png

December 12, 2012 12:30:05 AM

Edmundoh said:
It does seem that the CPU is still primarily used when rendering, but as I posted earlier, those were my GPU temperatures, so I guess it's being used a bit. I don't really know how else to check my GPU usage apart from HWMonitor.

http://i.imgur.com/K9dUn.png


You can install MSI Afterburner or something similar; it shows GPU utilization and more, and it works regardless of your graphics card brand.
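Besides Afterburner, NVIDIA's nvidia-smi tool (installed with the driver) can also report utilization. A small sketch that polls it, with the CSV parsing split out so it can be exercised without a GPU (the query flags are standard nvidia-smi options):

```python
# Polls GPU utilization via nvidia-smi's CSV query interface.
import subprocess

CMD = ["nvidia-smi",
       "--query-gpu=utilization.gpu",
       "--format=csv,noheader,nounits"]

def parse_utilization(csv_text):
    """Parse nvidia-smi CSV output (one line per GPU) into percentages."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def gpu_utilization():
    """Return current GPU utilization in percent, one entry per GPU."""
    out = subprocess.run(CMD, capture_output=True, text=True, check=True)
    return parse_utilization(out.stdout)

if __name__ == "__main__":
    # Sample of what one GPU at 17 % parses to:
    print(parse_utilization("17\n"))
```

Run `gpu_utilization()` in a loop during a render: if it hovers near zero, Vegas isn't touching the card.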
December 12, 2012 12:42:33 AM

COLGeek said:
There is no doubt that the GTX 6xx series is intended for gaming. The Quadro and AMD/ATI FirePro series of GPUs, of course, are optimized for CUDA/OpenCL workloads.


No offence, but that misses the point. It's about using CUDA/OpenCL or not. Do you think the HD 5670 is optimized for OpenCL and the HD 7750 is not? That the GTX 570 is optimized for CUDA and the GTX 6xx is not?
What about http://www.sonycreativesoftware.com/vegaspro/gpuacceler... ?

There is a problem with Sony Studio products and GPU acceleration: it works nicely, but not with all GPUs. What can I say; please file a support case here: https://www.custcenter.com/app/ask and maybe they will find a way to solve it. I've already bought two graphics cards to accelerate rendering, and neither works as well as my old HD 5670 does.
COLGeek
December 12, 2012 10:48:34 AM

nael said:
No offence, but that misses the point. It's about using CUDA/OpenCL or not. Do you think the HD 5670 is optimized for OpenCL and the HD 7750 is not? That the GTX 570 is optimized for CUDA and the GTX 6xx is not?
What about http://www.sonycreativesoftware.com/vegaspro/gpuacceler... ?

There is a problem with Sony Studio products and GPU acceleration: it works nicely, but not with all GPUs. What can I say; please file a support case here: https://www.custcenter.com/app/ask and maybe they will find a way to solve it. I've already bought two graphics cards to accelerate rendering, and neither works as well as my old HD 5670 does.

No offense taken. Gaming cards and pro cards are optimized for their intended uses. That is the point.

I agree that Sony has done a poor job here. But having the proper tool does help.
December 12, 2012 11:49:14 AM

COLGeek said:
No offense taken. Gaming cards and pro cards are optimized for their intended uses. That is the point.

I agree that Sony has done a poor job here. But having the proper tool does help.



Very, very true, but for a Quadro or FirePro to be a worthwhile investment for me, I would need to drop at least another £800+ on a new build, since I still want to be able to game. And apparently a 570 renders very well, faster than a Quadro 4000, so it's pretty self-explanatory why I thought a 670 would be the better choice.

I understand workstation GPUs are far better for high poly counts and multi-monitor real-time modelling, though.

Well, I guess that's really it, isn't it? There doesn't seem to be much I can do right now, to be honest, apart from waiting for updated support.

On the other hand, I was speculating about picking up a used 570/580 and putting it in my build for render usage only.
The problem is that might be a bit sketchy with a 620W PSU. Do you guys think it could be done?

December 12, 2012 12:01:04 PM

Obviously it's going to work; you'll even have watts to spare.
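To put rough numbers on the headroom question: the TDPs below are NVIDIA's and Intel's published figures, and the rest-of-system allowance is a guess, so treat this as a sketch, not a measurement.

```python
# Rough power budget for a GTX 670 + GTX 580 in one box on a 620 W PSU.
# Real draw under a render load is usually well below combined TDP.
PSU_WATTS = 620

draw = {
    "i7-3770K":        77,   # CPU TDP
    "GTX 670":        170,   # board power (TDP)
    "GTX 580":        244,   # board power (TDP)
    "board/RAM/disks": 75,   # rough allowance for everything else
}

total = sum(draw.values())
headroom = PSU_WATTS - total
print(f"estimated peak draw: {total} W, headroom: {headroom} W")
```

On paper it fits, but with only ~50 W of headroom at combined TDP, loading both cards fully at once would be cutting it close, which matches the caution a few posts down.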
December 12, 2012 12:13:11 PM

LagerLV said:
Obviously it's going to work; you'll even have watts to spare.


A 670 & 580 (let's say) together?
December 12, 2012 12:14:58 PM

nael said:
You can install MSI Afterburner or something like this and it shows GPU utilization and more. It works regardless of your graphic card brand.


Thanks dude! will try it out tonight
December 12, 2012 12:17:00 PM

Do you want them together in SLI? Is SLI even possible with two different GPUs? And if you do mean SLI, yes it can work, but you'd have to watch the wattage under load; that can really stress the PSU.
December 12, 2012 12:19:34 PM

Edmundoh said:

Well I guess that's really it isn't it? there doesn't seem to be much I can do right now to be honest apart from waiting for updated support.

You can file a support case and maybe, someday, they will fix it :) 
December 12, 2012 12:38:17 PM

nael said:
You can file a support case and maybe, someday, they will fix it :) 


By the time a GTX 8xx series comes out XD, jk.

Will do anyway. Hopefully there are many others with the same problem who will do the same.
December 12, 2012 12:42:51 PM

Edmundoh said:
670 & 580 (lets say) together ?

I think the best compromise is to (try first!) downgrade to a GTX580. You will lose some gaming performance but hopefully you will gain GPU acceleration in SONY products.
December 12, 2012 9:45:36 PM

nael said:
I think the best compromise is to (try first!) downgrade to a GTX580. You will lose some gaming performance but hopefully you will gain GPU acceleration in SONY products.



Well, yeah, I was just thinking of having both in one build (obviously not SLI). Who knows what the 7xx or HD 8xxx will add in their architecture (probably not that much of a boost, to be realistic).
COLGeek
December 13, 2012 3:22:15 PM

Have you tried the 670 + 580 config yet? Also, did you ever document any of this with Sony?
December 13, 2012 3:52:57 PM

From NVidia website: "Can I mix and match graphics cards that have different GPUs?
No. For example, an XXXGT cannot be paired with a XXXGTX in an SLI configuration."
COLGeek
December 13, 2012 4:29:42 PM

While you can't run the 670 + 580 in SLI, you can run them independently, much like using one as a dedicated PhysX processor for gaming.
December 14, 2012 7:52:54 AM

COLGeek said:
Have you tried the 670 + 580 config yet? Also, did you ever document any of this with Sony?



Where on the site can I put this forward to Sony, btw?

Anyway, I just managed to find a 580 and test it in my rig (I'm not sure if brand matters, but it's the MSI Twin Frozr II/OC).
So I switched out my 670 temporarily and ran this card.
I ran one render using the driver from the disc the card came with: no difference from a CPU render.
I made sure to update to the latest drivers; still no difference.
Re-installed Vegas: nothing.
(All settings tested: CPU only / Use CUDA if available / Use OpenCL if available.)

I still can't find an option to force CUDA.

I made sure to restart at each update.
I'm pretty lost here.

If it's any more useful information, my original video (.AVI) of 11 mins 6 seconds is 47GB.
I'm not quite sure about the rendered .mp4 size (not at my PC atm) but I think it's about ~400MB; I know a rendered 30 min video is just over 1GB.

Is this maybe considered quite heavy compression?
Is it possible to check/change the compression rate, or is that just bound to whatever type of video file it renders out?

I should probably try other render settings, like Sony .mp4 or something.

Really appreciate all this, guys. Thanks, even if we don't figure it out in the end :) 
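For what it's worth, the compression question above can be answered with rough arithmetic (file sizes as posted; decimal units assumed):

```python
# Sanity-checking the compression: an 11 min 6 s AVI of 47 GB
# rendered down to roughly 400 MB of MP4.
duration_s = 11 * 60 + 6          # 666 seconds

src_bits = 47e9 * 8               # 47 GB source capture
out_bits = 400e6 * 8              # ~400 MB rendered file

src_mbps = src_bits / duration_s / 1e6
out_mbps = out_bits / duration_s / 1e6

print(f"source bitrate:   {src_mbps:.0f} Mbps")
print(f"rendered bitrate: {out_mbps:.1f} Mbps")
print(f"compression:      ~{src_bits / out_bits:.0f}x")
```

A rendered bitrate around 5 Mbps is normal for a 720p "Internet" preset, so the output size is nothing unusual; the ~565 Mbps source just reflects how lightly Dxtory compresses the capture.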
December 14, 2012 8:09:00 AM

Another thing I just thought of: my BIOS is currently still the default version the board shipped with.

I heard somewhere that updating it wasn't necessary, but maybe that was for a different reason.
Could anything go wrong if I update the BIOS? My components haven't been overclocked yet, so I don't know whether updating the BIOS is only risky for people who have overclocked.

My motherboard is an ASRock Extreme6.

Edit: I'm pretty sure the version was 1.40; the latest is 2.40.
December 14, 2012 8:31:34 AM

They say driver 270.xx or later. I think there is no problem with the video card. Maybe you can try a GTX 570 :) ? As I've said, I've only managed to get GPU acceleration working with 1 out of 3 different GPUs.
December 14, 2012 9:57:45 AM

nael said:
They say driver 270.xx or later. I think there is no problem with the video card. Maybe you can try a GTX 570 :) ? As I've said, I've only managed to get GPU acceleration working with 1 out of 3 different GPUs.


Didn't think it was the video card, just the software. I might go out of my way just one more time to test a 570.

It's quite exciting testing out different cards, though; it gives me a much better perspective on the differences than looking at charts on review sites. That 580 was performing almost as well as my 670 in games like Far Cry 3 and Borderlands 2: an 8fps difference in MSI Kombustor.

Anyway, I guess Sony's website did show performance figures for the 570, and I don't see a 580, but man, I can't believe that could be the case. Even one of the older higher-end cards doesn't seem to be supported/optimised? That's pretty terrible, to be honest.
I would use Premiere Pro, but unfortunately it doesn't recognize any of the codecs used in Dxtory; maybe I should find some plug-ins.


January 12, 2013 5:41:18 PM

One month after I filed my case, Sony answered: "If you are experiencing issues that you think may be related to GPU-accelerated video processing, you can turn it off (in the Preferences dialog, on the Video tab, set "GPU-acceleration of video processing" to "Off" and restart the application). This should resolve the issue."
I will also try an HD 6670.
June 20, 2013 5:27:10 AM

I have had the very same problem with my GTX 670. However, I contacted Sony and they told me their CUDA and OpenCL integration has problems with Nvidia 600 and AMD 7000 series cards. This should be improved in Vegas 13, but at the moment there is no fix.

Hope this helps!
July 22, 2013 7:00:29 PM

To fix the pixelation of the video, check the two-pass option. It'll take twice as long, but it goes through and renders the video at full quality, in case the encoder skipped over and pixelated a little part of it :D 
August 5, 2013 1:58:52 AM

I've been tabulating Vegas 12 rendering speed for various processors and GPUs. The results have been interesting. When I get enough data in the table, I will post it.

I have created a simplified specific test and test file for this purpose.

I would greatly appreciate it if everyone measured their own performance and sent me a message or posted their results. Please see the topic here.

http://www.tomshardware.com/forum/id-1750719/tabulate-s...
December 17, 2013 3:23:44 AM

Just upgraded my Nvidia GTX 570 to an EVGA Nvidia 770 4GB GDDR5 and now I have NO CUDA support/rendering assistance in Movie Studio 12!!

May have to put the 570 back in as well!!
December 20, 2013 12:12:49 PM

I've done a lot of digging into this as I have the same experience with Vegas 12 and the 6XX series of NVidia cards.

The issue here is the architecture of the NVidia card. Prior generations had their CUDA cores exposed and accessible to Vegas, which is why there are fewer errors and difficulties with the 5xx series of NVidia cards. The newer 6xx and 7xx series cards have a slightly different architecture that leaves some of the cores inaccessible to Vegas. This means that while Vegas detects CUDA and states it can use the cores appropriately, it can't. There is a lot of idle speculation on the wire that this was done intentionally by NVidia to keep users of their Quadro series cards from switching to the far less expensive 6xx and 7xx series cards (usually an end-user cost disparity of $500-700 or more).

Basically the problem isn't your software or hardware setup, it is a base incompatibility between the newer CUDA architectures and the instructions that Vegas attempts to pass to them. I've been using my 5XX card when rendering as it renders quickly enough to not impact my workflow, but this may not be the case for all users.

If a few minutes of render time is make or break for you, I have to recommend a Quadro card, otherwise a down rev to the 5XX series might be the best option for you.
February 15, 2014 3:48:14 PM

According to the forums, the 600 series and above is disabled by Nvidia so it doesn't compete with their Quadro cards. When you used the GTX 570, did you enable the GPU in the Video tab in Preferences before rendering?
Options > Preferences > Video tab > "GPU acceleration of video processing": change it to your video card.

Download a utility called GPU-Z and you can view your GPU usage while using Vegas (or any other program, for that matter); then you'll have a better idea of what is and isn't working.

So you are using 11GB of RAM for Dynamic Preview? You should lower that significantly, to the default or even zero, to get a faster render. I also close the project and then reopen it to render, so there is nothing in the cache, and I close the video preview window, as that seems to help; my CPU usage reaches 90+% while rendering. With my GTX 670, I crash randomly while editing and rendering.