Hello. I do a lot of programming, both for school projects and for messing around with ideas. I mainly use Java, but increasingly I've begun using bindings to GPU APIs like OpenCL and OpenGL in my programs. Given the school's computers, this leaves me in a predicament where my code can only really run in real time (or at all) on my desktop at home. At the least, I could show a video demonstrating my applications whenever I have to present for a class, and it would also be nice to upload videos demonstrating success in anything outside of school.
I thus need a low-overhead video capture solution for recording my programs and games on my desktop at home.
My relevant system specs are as follows:
CPU: i7-3770K on a Z77 platform
GPU: GTX 690 (at times I may be driving both video cores asynchronously, which requires dedicating all 16 lanes to the card for an x8/x8 split rather than an x16/x16 or x8/x8 clone, so I can't spare lanes for a solution that relies on a PCIe capture card)
Memory: 16 GB (I doubt my applications will ever push total usage past 10 GB, so a capture solution with some memory overhead is OK)
Display configuration: I'm using the Mini DisplayPort output on my 690, converted to HDMI with a dongle, and plugged into a 1080p display (technically a Panasonic TV).
In terms of the video capture, I don't need anything super fancy. 1080p is a necessity, but 60 FPS is not; 30 is fine for my purposes. I also don't need raw capture; I'm okay with a little compression as long as it doesn't drastically detract from the visuals. And I don't see myself ever capturing video longer than 30 minutes, if that.
A little CPU or RAM overhead is okay, but I need virtually zero GPU overhead.
I've provided all these specs because I'm not very familiar with this area, and I don't know whether the ideal solution would be independent of the video-producing system or not.
Any suggestions or information would be appreciated.
I'll have to try it. If it works well, I'll pick your answer as the best.
It does appear to have some GPU overhead, although, from how it describes itself, it looks like it uses the dedicated hardware encoder (NVENC) built into the Kepler GPU rather than the unified shaders, which should be fine for my purposes.
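For what it's worth, if a standalone capture tool ever falls short, the same dedicated encoder can be reached from the command line with FFmpeg. This is a minimal sketch, not a tested recipe: it assumes a Windows FFmpeg build with NVENC support, and the bitrate and output name are just illustrative.

```shell
# Grab the Windows desktop at 1080p30 via GDI and hand the frames to
# h264_nvenc, so compression runs on the GPU's dedicated NVENC block
# instead of the unified shaders or the CPU.
# Assumes an FFmpeg build that includes the gdigrab input device and
# the h264_nvenc encoder.
ffmpeg -f gdigrab -framerate 30 -video_size 1920x1080 -i desktop \
       -c:v h264_nvenc -b:v 8M \
       capture.mp4
```

Note that the screen grab itself (gdigrab) still costs some CPU; only the encoding step is offloaded to the hardware block.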