I wasn't sure if I should post this in the overclocking section of the forums, but I looked there and it seemed like it was all CPU material. I've been playing World of Warcraft and Borderlands 2 with my new MSI GTX 770 OC TF, and the games run fairly well with a small boost (+50 MHz) to my core clock. At first the temps were kind of high, in the 60-75°C range, but I fixed that by using the power cables the card came with instead of the ones from my modular PSU that I originally set it up with. That dropped the temps to the 40-50°C range, but now they're back up to 55-65°C while playing these two games, without any changes to my OC settings, although I did install a few driver updates. Still, 55-65°C is pretty decent.

I got a bit bored today, though, downloaded Planetside 2, and MY GAWD... on max graphics I was seeing 70-75°C, peaking at 77°C, and the game wasn't very smooth, which I mostly blame on my processor (i7-2600 @ 3.4 GHz); maybe my 8 GB of RAM wasn't enough either.

Out of curiosity and boredom I started messing with my GPU OC settings in MSI Afterburner and downloaded Unigine Heaven Benchmark 4.0. With the card at factory defaults, extreme settings in the benchmark, and a 1600x900 monitor, my average FPS was 49.0 and my score was 1234. With +100 MHz on the core clock I got 50.2 FPS and a 1265 score. Then I started wondering about the memory clock, did some reading, and found that memory clock speed supposedly doesn't impact performance nearly as much as the core clock does. I raised the memory clock +100 MHz while keeping the core at +100 MHz: 50.8 FPS and a 1279 score, a very small increase. BUT THEN I went to +200 MHz on the memory clock and +125 MHz on the core, and got 53.6 FPS and a score of 1350. That seems like a pretty big jump compared to the +100 memory / +100 core run.
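To sanity-check whether that last run is really a big jump, here's a quick script that turns my Heaven numbers into percentage gains over the stock run (the run labels are just my own shorthand, and a single Heaven pass has some run-to-run noise, so treat these as rough):

```python
# Heaven 4.0 results at 1600x900, Extreme preset (the runs listed above)
runs = [
    ("stock",                49.0, 1234),
    ("+100 core",            50.2, 1265),
    ("+100 core / +100 mem", 50.8, 1279),
    ("+125 core / +200 mem", 53.6, 1350),
]

base_fps, base_score = runs[0][1], runs[0][2]
for label, fps, score in runs[1:]:
    # Percentage improvement of each overclocked run versus stock
    fps_gain = (fps - base_fps) / base_fps * 100
    score_gain = (score - base_score) / base_score * 100
    print(f"{label:22s} FPS +{fps_gain:.1f}%  score +{score_gain:.1f}%")
```

On my numbers the last run works out to roughly a 9% FPS gain over stock versus about 4% for the +100/+100 run, so the jump is real, but a couple of repeat passes would tell me how much of it is just run-to-run variance.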
Are these benchmark tests just not very accurate, or does tweaking the memory clock really help that much? Did I find my card's sweet spot? I know my tests weren't very organized and I should run more, but I'm also wondering when, if ever, I should tweak my core voltage. The OC guides I've read say to increase it when you start seeing graphical errors like black spots, etc., but raising it can also shorten the lifespan of your card. I plan on using this card for A WHILE, and on adding a second 770 in SLI down the road.

I'm getting a 1080p monitor soon, and I also plan to buy Battlefield 3 and play the crap out of it, ideally at 60 FPS. How much do the CPU and GPU each affect performance on a 1080p monitor vs 900p? Should I keep tweaking the clocks, or just settle on the +200 memory / +125 core setting? I also don't want to hurt my PSU; it seems to get VERY hot when the card hits 60°C+, and it's a Corsair AX850, so I'm not sure why it can't handle a single GTX 770 very well... I'm fairly new to all this, so any help/feedback is greatly appreciated.
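For the 900p-vs-1080p question, a crude back-of-the-envelope number is just the pixel-count ratio. This is only a rough worst-case for the extra GPU load (games don't scale perfectly with resolution, and CPU-bound scenes barely change at all), but it gives a ballpark:

```python
# Rough pixel-count comparison between my current monitor and 1080p.
w900, h900 = 1600, 900
w1080, h1080 = 1920, 1080

ratio = (w1080 * h1080) / (w900 * h900)
print(f"1080p pushes {ratio:.2f}x the pixels of 1600x900")

# Naive estimate: if the GPU is the sole bottleneck, FPS scales
# at worst inversely with pixel count.
heaven_fps_900p = 49.0  # my stock Heaven result above
print(f"worst-case estimate at 1080p: ~{heaven_fps_900p / ratio:.0f} FPS")
```

That's 1.44x the pixels, so in fully GPU-bound scenes you could lose up to roughly a third of your FPS; in practice it's usually less, and in CPU-heavy games (Planetside 2 is a known example) the resolution bump costs much less than that.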