I am planning on using FRAPS to record gameplay footage, ideally 1920×1080 @ 60 FPS, however I am willing to step down to 30 FPS if the HDD is not able to maintain that kind of data rate. To my understanding, full-size recording at 60 FPS in FRAPS would require 356 MB/s, and 178 MB/s at 30 FPS.
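For anyone wondering where those numbers come from, here is a quick sanity check, assuming uncompressed 24-bit RGB frames (3 bytes per pixel), which is roughly what raw capture amounts to:

```python
# Back-of-the-envelope check of the raw capture bandwidth,
# assuming uncompressed 24-bit RGB frames (3 bytes per pixel).
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 3

def bandwidth_mb_s(fps):
    # Bytes per second, converted to MB (1 MB = 2**20 bytes)
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * fps / 2**20

print(round(bandwidth_mb_s(60)))  # -> 356
print(round(bandwidth_mb_s(30)))  # -> 178
```

So the 356 MB/s and 178 MB/s figures check out for raw 1080p capture.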
From my understanding and looking at the benchmarks, the only drive that can even write sequentially at those speeds is the Samsung SSD 830 while connected to SATA III. The closest a mechanical HDD (excluding the VelociRaptor: somehow they are more expensive than SSDs around here) comes to those speeds is the Seagate Barracuda, at ~150 MB/s on average with a minimum of ~90 MB/s; all the benchmarks were done while it was connected to SATA III 6 Gbit/s. However, I do not have a SATA III port available for an additional HDD.
Here is the question: if I connect it to SATA II, would the performance be affected? Technically the SATA II cap is 300 MB/s, so it shouldn't bottleneck the drive from that perspective, but I am wondering if the older interface would cause a slowdown in a different way? Is it possible to get a RAID 0 configuration of at most 4 HDDs on SATA II with a data transfer rate of at least 356 MB/s?
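A rough way to size such an array, assuming RAID 0 throughput scales roughly linearly with the number of members (real arrays lose a bit to overhead) and using the Barracuda benchmark figures quoted above:

```python
import math

TARGET_MB_S = 356        # raw 1080p60 capture rate
WORST_CASE_DRIVE = 90    # Barracuda worst-case sequential write, MB/s
AVG_DRIVE = 150          # Barracuda average sequential write, MB/s

# RAID 0 throughput scales roughly with member count,
# so the required drive count is a ceiling division.
print(math.ceil(TARGET_MB_S / WORST_CASE_DRIVE))  # -> 4
print(math.ceil(TARGET_MB_S / AVG_DRIVE))         # -> 3
```

Note that each drive sits on its own SATA II port, so the 300 MB/s per-port cap is never exceeded by any single member; the aggregate only has to fit through the controller.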
A SATA III HDD should not be bottlenecked by a SATA II connection, since the interface can handle more than the drive can deliver, as you have already found.
The RAID question is answered here with a yes: http://en.wikipedia.org/wiki/RAID
Bandwidth isn't the issue; I can run FRAPS at 60 FPS at 1080p on a WD Blue with no issues. What are your specs? The game will also affect it depending on its CPU usage.
We seem to have a misunderstanding. Bandwidth is an issue, though of course it could be mitigated with different FRAPS settings like "Lossless RGB" and "Half-size". Just to be clear, when you say "I can run fraps at 60 fps at 1080p", do you run the game at 60 FPS while recording, or record at 60 FPS? The reason I ask is because you ask about specs, and to be honest specs are irrelevant in the scenario I described. You could be recording TF2 or BF3; you'd need a different amount of "horsepower" to run and record each of these games at 60 FPS, but you'd need the same amount of bandwidth to your HDD.
K1114, can you go into greater detail on your FRAPS recording settings, please? And if possible, I'd like to watch a sample of your recording to make an educated call on whether it will satisfy my needs or not.
Currently I am researching which HDD I should get as my recording drive; that's when I ran into the issue of not having SATA III ports left. Reading your responses got me thinking: on my Gigabyte Z77X-UD3 mobo, the first SATA III port holds the SSD used for the OS, and the second SATA III port holds the HDD for storage and games. I could connect my current HDD to a SATA II port, since I don't really need 6 Gbit/s on it, which would free up a SATA III port.
It comes down to several possible solutions: 1) step down in my demands, 2) get an SSD on the freed-up SATA III port to dump the raw recording on, 3) RAID 0 with multiple mechanical HDDs on SATA II ports.
Full-size 1080p; the game is at 60 FPS (vsync, and plenty of PC power to spare) and FRAPS is set to 60 FPS. Why specs matter is because the CPU is encoding (recording) while also being used by the game. Different games use different amounts of CPU power. If your CPU is too weak to encode and play, then your FPS will take a hit. What's the point of your HDD being able to handle the bandwidth if your PC can't? Now that you mention Z77, I'm assuming you're building new, so specs shouldn't be an issue. You must understand that people with older PCs will come here asking the same question as you are now, so I need to cover all my bases.
Lossless will increase bandwidth to the theoretical 356 MB/s of raw 1080p at 60 FPS. With lossless on, I cannot hold 30 FPS and was stuttering. From HDD monitoring programs, yes, my HDD is the cause. Lossless is unnecessary in most game recording situations, and I don't see a difference, although I'm not looking hard. Whatever codec FRAPS uses to compress is good enough for me, and I can hold 1080p at 60 FPS on my single HDD. If you want lossless, either an SSD or RAID HDDs would do.
*Just a PS: half-size reduces bandwidth, opposite to lossless, which increases it. You mentioned lossless and half-size as if they both reduced bandwidth.
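To put a number on the half-size option: halving both dimensions quarters the pixel count per frame, and so the raw bandwidth (same 3-bytes-per-pixel assumption as before):

```python
# Effect of halving both dimensions on raw capture bandwidth,
# assuming uncompressed 24-bit RGB frames (3 bytes per pixel).
def raw_mb_s(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps / 2**20

full = raw_mb_s(1920, 1080, 60)
half = raw_mb_s(960, 540, 60)

print(round(full))  # -> 356
print(round(half))  # -> 89
print(full / half)  # -> 4.0
```

So half-size at 60 FPS needs about the same bandwidth (~89 MB/s) as a single Barracuda's worst-case sequential write.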
Yes: "Lossless RGB" decreases bandwidth while OFF, and "Half-size" decreases it while ON. And you saying that you don't see the difference is exactly the issue; I do, not every time but more often than not, and it's mostly in how clear things are in the shadows. I guess I have to attribute it to having a high-def monitor with a 50,000,000:1 contrast ratio. I assume that if I were using a regular 5M:1 or 1M:1 monitor, I wouldn't tell them apart.
When I think more about it, I am not even sure that all codecs support lossless RGB. It looks really nice in raw format (well, at least for as long as the VLC buffer doesn't run out and I can watch it), but after encoding, will there even be a difference? On the flip side of that question: will the encoding compress the files enough while preserving lossless RGB quality?
Alrighty, seems like more testing is needed. First, I'll do a one-minute recording on the most pimped-out settings to my OS SSD; regardless of outcome, it should be able to provide enough bandwidth. Then I'll play with encoding and see if there is a noticeable difference. Then repeat the process on the HDD while running things off the SSD. I hope it won't take more than a couple of hours :S .
My initial thought is that either you are right, I am imagining a worst-case scenario, and it is a non-issue, or two Barracudas in RAID 0 would do the trick. Either way I am quite satisfied with that. Thank you for responding.
50m:1 is dynamic contrast; your eyes can't see past 10,000:1, and pretty much only an IPS monitor goes above 1,000:1 static. High-def just means 1080p, so I'm not sure why you mentioned that. To get a video down to a reasonable size, it will need to be compressed and lose quality. Raw is really only used while video editing. When compressing, the blacks and whites are the first to go, but you only ever really notice it when you look for it, assuming it's a good codec.
*I'm using an Asus VS248H-P, which is 50m:1 with all those other niceties. I'm sure if I looked, I would see it.
That's the catch: any cinematographer would say that 30 FPS is more than needed and your eyes can't see more than that, yet we all know that for games, people can tell the difference at higher FPS counts. I imagine the same goes for contrast; we might not be able to pinpoint what is wrong/different, but we'd feel the difference.
I bring high-def up because picture quality would suffer much more from stepping down to 720p from 1080p (on the same size screen) than from loss of color due to a lower contrast ratio.
The difference between the 30 FPS question and this contrast issue is that humans can actually tell the difference up to ~100 FPS for videos. We can see more than that, but it starts to get difficult to differentiate. FPS perception is logarithmic as well as situational; you can see a 1/300th-second flash of light (essentially 300 FPS). Whereas for contrast, we physically can't see past 10,000:1 static, 10m:1 dynamic. Your monitor can still hardly get to 1,000:1 static, and we can see more contrast in still images.
Cinematographers are fine with 30 FPS (movies are actually 24 FPS) because film has motion blur, so it seems smooth, whereas games most likely do not.
Hm, good point. I should check out one of those Samsung and BenQ screens that claim to have 3,000:1 and 5,000:1 static contrast and 20M:1 dynamic.
You are quite right that the human eye can adjust, both chemically and via iris movement. Just half an hour of adaptation would allow us to perceive higher contrast ratios of 1,000,000:1 and up to 10,000,000:1.
As engaging and entertaining as this discussion is, we are getting off topic.