Will a GTX 770 be good for 4K?

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530
I'm planning on getting a 4K monitor soon, and I'm wondering: will a Gigabyte GTX 770 3GB with an AMD FX-8350 and 16GB of RAM be enough for 4K? Note that I'm not going to be using 4K for gaming, just for streaming online videos in 4K on sites like YouTube. I don't plan to use 4K for anything but video streaming. Will it be sufficient?
 

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530


Do you know if this card is good for 1440p gaming? I don't do heavy gaming, just games like League of Legends and Hitman on medium settings.
 

Deus Gladiorum

Distinguished
Yea, League of Legends can probably hit 4K with decent frames (at least close to 60 if not above it) at maximum settings, no problem. On medium settings I think Hitman should be able to reach 1440p and give you smooth frames as well. According to this article, a GTX 680, which is essentially an underclocked GTX 770, can hit 31 fps on average at 1440p on high settings with 2x MSAA. So ideally, at 1440p with the right CPU, you can hit 60 fps on medium settings if you turn off AA (which becomes less and less necessary the higher your resolution gets).

However, that brings me to my second point: in the case of Hitman, your FX-8350 is not ideal. Even overclocked to 4.5 GHz it was only able to generate a 53 fps average according to that article. That might sound impressive, but that's extra money spent on a cooler for overclocking, and it still won't serve you anywhere near as well as an Intel CPU at stock clocks. AM3+ is a dead socket.

In fact, just this weekend I upgraded from an FX-6300 to an i5-4690k. If you have a Microcenter in your area, you can get an Intel motherboard and processor for a great price. I got an i5-4690k (usually $240 - $260) for $200, and an ASRock Z97 Extreme4 (usually $145) for only $100 (I don't know why it was $100; the price on their site listed it as $145, but when I added it to my cart for in-store pickup it just dropped to $100). So yea, go Intel if possible. If you don't want to overclock, try for an i5-4570, or I think the i5-4590 is coming out soon.
 
Solution

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530


I'm planning to upgrade it soon, but I want the monitor first. I don't expect to do much gaming on it, just LoL and some other games on Steam; other than that, I'm only going to use 4K for video streaming. By the way, do you know if 1440p upscales well on a 4K display?

 
Your 8350 is fine. We already know Intel has considerably stronger single cores, but we also know multi-core CPUs are finally getting the love they deserve. It's all about how many cores a game can utilize; that will ultimately decide Intel vs. AMD (also factoring in the performance-per-dollar ratio: I only paid $100 for my brand new FX 8320 at Microcenter. That's a tough deal to beat!)
 

Deus Gladiorum

Distinguished
Oh, I didn't know you already had the rig aside from the GTX 770. It's probably just cheaper to overclock then, but like I said, it still won't get you the same results as a stock-clocked Intel. As for upscaling... I'm never a big fan. I've never once seen an image upscaled in such a way that it looked as good as if it were displayed on a monitor at its native resolution. Ever. That being said, I could just be really picky and susceptible to a placebo effect; it wouldn't be the first time. I used to think that consoles displayed 30 fps better than PC for various reasons, until one day I did a side-by-side comparison of Skyrim locked at 30 fps on my PC against my 360 and found they were exactly the same.

However, chances are I'm right on this one, at least in this specific instance. 3840x2160 has exactly 2.25 times as many pixels as 2560x1440 (a 1.5x scale on each axis), so there are going to be issues upscaling. If you want to confirm whether that's true or whether I just suffer from terrible placebo effects, aim for a monitor that's near the size of your current (I assume 1080p) monitor so as to offset differences in pixels per inch. That might be hard unless your current 1080p monitor is rather large. Then compare both monitors at the same resolution and see if there's a noticeable difference in scaling.
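As a sanity check on that 2.25x figure, here's a quick sketch in Python (plain arithmetic, nothing assumed beyond the standard resolution numbers):

```python
def pixel_ratio(w1, h1, w2, h2):
    """Ratio of total pixel counts between two resolutions."""
    return (w2 * h2) / (w1 * h1)

# 1440p (2560x1440) -> 4K UHD (3840x2160)
area_ratio = pixel_ratio(2560, 1440, 3840, 2160)  # 2.25x the pixels
linear_scale = 3840 / 2560                        # 1.5x on each axis

print(area_ratio)    # 2.25
print(linear_scale)  # 1.5
```

The 1.5x linear factor is why 1440p can't upscale cleanly to 4K: every 2 source pixels have to cover 3 output pixels, so the scaler must interpolate. Compare that to 1080p -> 4K, which is an exact 2x per axis and can map each source pixel to a clean 2x2 block.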
 

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530



I currently have a horrible 1080p monitor by Acer that I can't even find in any store anymore. It's 27 inch, so I think it stacks up evenly in size against the 28-inch Asus 4K monitor I want to get. My monitor is pretty bad: 27 inch with a 6ms response time, while the one I want is 4K with a 1ms response time at 60Hz (same as mine). The only good thing about the one I have is that it's IPS, while the one I want is TN, but that doesn't make a big difference to me.
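For what it's worth, the pixel-density gap between those two panels can be estimated with the standard PPI formula (diagonal pixel count divided by diagonal size in inches; the 27"/28" sizes are just the figures from the posts above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    diagonal_px = math.sqrt(width_px**2 + height_px**2)
    return diagonal_px / diagonal_in

print(ppi(1920, 1080, 27))  # ~82 PPI for the 27" 1080p Acer
print(ppi(3840, 2160, 28))  # ~157 PPI for the 28" 4K Asus
```

So despite the nearly identical panel sizes, the 4K monitor packs almost twice the pixel density per axis, which is what makes the side-by-side scaling comparison suggested earlier in the thread meaningful.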

 

Deus Gladiorum

Distinguished
Wow, a 27-inch IPS is quite nice. Response times for IPS are always high, but the color quality is fantastic, and I'd probably take that over the slight ghosting due to high response times. Oh well, to each his own, I suppose. Still, I'd be really, really interested in seeing whether you can spot any scaling differences when you compare both monitors at 1080p. When you do get that monitor, would you mind posting your observations here? That'd be superb.

And not to start a war with Beezy, but AMD doesn't just have inferior single-core performance compared to Intel. Since Bulldozer, AMD has been giving their CPUs logical cores instead of physical cores, and these logical cores aren't true physical cores. On an AMD FX-8350 there are 4 'modules', each of which contains two integer (ALU) schedulers, one shared FPU scheduler, and a single shared L2 cache. AMD calls each module 2 cores, but in reality it's more like 2 cores sharing resources with each other. Because of this, AMD CPUs will typically get destroyed in anything FPU-heavy, and developer optimization can only go so far. I upgraded from my FX-6300 this weekend and the gains I've made are considerable in the few games I've tested. The argument has been made that because we're getting 8-core consoles now, PC ports will be better optimized for AMD, if not downright better on AMD. However, that isn't true. If you look at games like Watch Dogs or Battlefield 4, it's clear that Intel still remains on top. Just because a game is optimized for an 8-core console with a completely different OS doesn't mean that AMD is suddenly going to be comparable to Intel again. That's simply not true.
 

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530


Sure no problem, thanks for the help!
 

AznGOD

Honorable
Dec 31, 2013
406
0
10,960


More than enough to play League and Hitman on high settings at 1440p.
 

nehalpatel30

Reputable
Jul 16, 2014
32
0
4,530


Thanks. I do need to change my case before I get the monitor; the airflow in my current case is not the greatest.