How long do graphics cards last compared to consoles performance wise?

connor_6

Reputable
Oct 7, 2015
60
0
4,630
If I were to buy a graphics card similar in performance to a PS4, e.g. an R9 270 or 750 Ti, would it always be able to play games at least to the equivalent of a PS4 for the life of the console? Or do PC games become more demanding, so I would need to upgrade again in a couple of years to match the PS4 again? Thanks.
 
Solution

TheFluffyDog

Honorable
Oct 22, 2013
469
0
10,960
Those GPUs are both way faster than a PS4, and will last as long as you can stand not upgrading.

You must understand that computers are usually aimed at 1080p, 60 Hz, and high settings (medium for those hard-to-run games).

Whereas a console will accept an average frame rate of 35-45 FPS at 720p for the majority of titles. Nowadays we are seeing 1080p at 60 Hz from consoles, but it's still only on select games. Both of the cards you described can handle 1080p at 60 Hz in most of your favorite titles at medium-high settings (turn back some AA) and will outperform the PS4 across the board.

Remember you can always turn your settings down if you need a few more frames, or turn your settings up for pretty pictures.
 
PS4 specs don't change, or if they do, the change is very minor, so all games made for the PS4 will work decently on it. Computers, by contrast, are constantly getting newer and better parts, so games are designed with options to adjust settings up and down. You should be able to play most games on a computer with those GPUs, but you will have to lower settings to do so. If you want to play at higher settings for a longer period of time, I'd suggest getting a better GPU.
 
A console's graphics specs never change as far as I'm aware. The chips get die shrinks and the like every couple of years, but the actual specs are kept the same. The CPU can change a bit, but only in how much of it you can use and how you use what you get. The CPU's hardware doesn't change either.

Since most modern graphics cards are already vastly superior to the consoles, you can almost always play with better graphics than a console. Consoles use what they have more efficiently, but when you start off with double, triple, or more the graphics performance, the computer will win out anyway.
 
It's difficult to make that comparison, though PC graphics cards are way better. However, as far as features are concerned, you get more for your money using a console, as well as two free games every month with a Plus membership. On PC you have lots of free titles as well, not to mention the programs you can run on a PC that won't run on a console.

The main thing people don't seem to understand is that 30 FPS on a console is smoother than 30 FPS on a PC. On a console this is playable; on a PC it's not (for most people, anyway). Really, one could go on all day listing the pros and cons. In the end, the PS4 is far more convenient than a PC, but lacks a lot of things (not all) that people take advantage of, the main one being the ability to upgrade parts to maintain your desired framerate/resolution/settings in newer games. On consoles that option is nonexistent, but if you find a game you want to play, you know it'll play just fine.

With a 750 Ti, you won't be able to play future multi-platform games that are on both PS4 and PC while maintaining the framerate and graphics quality. This may change as they push DirectX 12, or so the hype around it makes it look. But for the past 10 years or so, a PC built to be equivalent to or slightly better than a console hasn't lasted nearly as long: perhaps a quarter of the console's lifespan at best, if we're talking about "matching" performance.

Compare my Xbox One (I'm aware it's behind the PS4): it's playable, and I have no complaints other than aliasing here and there. The PS4's scaler does a pretty bad job when it comes to contrast, but the PS4 has better hardware than the Xbox One that I own. I don't like when people compare the two, because they are targeted at different markets, but there are a ton of pros and cons for both PC and PS platforms.
 
That's not really accurate. The graphics cards do last just as long with matching performance; the games just get more intense, so you need to lower settings. The actual visuals are still better than a console's, because how a given settings level (say, medium) looks has improved, whereas the console has not. The previous-generation PS3 and Xbox 360 are great examples of this: sure, they lasted long, but the latest games on them don't look any better than games from around the time of their launch with similar performance.

Also, a console's 30 FPS does feel better than a PC's 30 FPS in my experience. The issue is whether the PC is holding a steady 30 FPS, and it's true that it might not be, but you have control over fixing that. The real advantage of consoles is, as part of what you said, convenience.
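The "steady 30 FPS" point comes down to frame pacing: an average frame rate hides how evenly the frames are delivered. A quick sketch of the arithmetic, using made-up frame-time numbers purely for illustration:

```python
# Two hypothetical runs, both averaging ~30 FPS (frame times in milliseconds).
# The "locked" run delivers every frame in ~33.3 ms, console-style; the
# "uneven" run averages the same but alternates fast and slow frames.
locked = [33.3] * 8
uneven = [20.0, 46.6, 20.0, 46.6, 20.0, 46.6, 20.0, 46.6]

def avg_fps(frame_times_ms):
    """Average FPS over the run: frames delivered / total seconds."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    """Longest single frame, which is what you actually perceive as a hitch."""
    return max(frame_times_ms)

print(round(avg_fps(locked)))   # 30
print(round(avg_fps(uneven)))   # 30 as well
print(worst_frame(locked))      # 33.3 ms -- smooth
print(worst_frame(uneven))      # 46.6 ms -- visible judder despite "30 FPS"
```

Both runs report the same average, but only the worst-frame number reveals the stutter, which is why a capped, consistent 30 FPS feels smoother than an uncapped one.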
 

TheFluffyDog

Honorable
Oct 22, 2013
469
0
10,960


One of the reasons consoles look so good for having so little "beef" with lower-end graphics output is that they are generally hooked up to TVs, which can further add to the image quality. If you build a PC, lock it at 30 FPS, hook it up to a TV, spend only as much as you need to get that, and then dial back the settings to match the console's, you could build the PC for something like $200.
 
I don't know about $200, but certainly cheap, around the console price range. Around $300 to $400 is reasonable for a PC built specifically to match the current-gen consoles, but that's the price for hardware only. You also need an operating system (granted, there are many ways to get Windows cheap or even free) and peripherals.
 
Ahh, but you get those on a PC if you set it up right. The console's advantage is that you don't need to set it up right; it does that for you.

I don't know what he's saying about TVs and picture quality. TVs tend to look worse than a good monitor, though you're expected to sit farther away, so you won't notice as much.
 

connor_6

Reputable
Oct 7, 2015
60
0
4,630
Thanks everyone for the replies. I am still uncertain, though: if I were to buy a GPU similar in performance to a PS4, would it remain similar in performance to the PS4 for the life of the console, e.g. at least 4 years? I understand that PC games become more demanding and I would need to lower quality to run them, but would that still keep it similar to the PS4, or worse? I have everything for my first build apart from a GPU, and because I have never really been big into PC gaming, I don't exactly know what to expect from GPUs or what level of detail I would be happy with. If I got one similar to PS4 graphics, I think I would be happy, as long as I didn't have to upgrade soon to keep it at that level. I also don't want to spend £200 on a 970 and feel it's a waste of money, knowing I'd be equally happy spending half that and getting perfectly playable performance, or vice versa.

I have actually been looking for a used 280X for around £100 with no luck at the minute, and since the 750 Ti is under £100 new, it makes me wonder if that would do; plus I would be happier going new. I can always sell and upgrade in a year or so if need be, I suppose?
 
I'm by no means trying to look like someone who doesn't have any clue what he's talking about, blindly defending consoles. But most arguments people make against consoles are just ridiculously inaccurate or invalid.

The whole "PC Master Race" thing people take too seriously, it seems. It's obvious if you own a gaming PC, which I do, that frametimes are all over the place. Because PC games aren't designed around the console standard, it will cost more to maintain performance compared to a console. How much more is irrelevant.

Most people using consoles are using them because they're convenient. If a PC gamer has no interest in ever using Photoshop or other programs, other than strictly gaming, streaming, or physical disc movies such as DVD or Blu-ray, those extra features mean absolutely nothing to that individual, regardless of how cool they are, when comparing to consoles.

So in the end, it's about needs. If someone wants to game on 3 displays, that's not going to happen on a console; that's a reason to consider a PC. If, however, that person just wants to play games on 1 display, is it really a big deal? In the end it's about needs, not hammering someone's purchase decision. It all seems silly to me. I didn't just state facts; I also shared my opinion, having used both platforms.
 


Changing settings around doesn't change the fact that a graphics card with higher performance than the console's graphics will still have stronger performance. If a game is more intense than the previous generation's, that's because it has higher visual quality (at least theoretically; very bad code happens every now and then), and dropping the settings to retain performance should keep the visual quality similar to a previous game at higher settings.

The GTX 970 is not competing with consoles; they are in two different leagues. If you're happy with the graphics quality that a console offers, then you have no reason to get the GTX 970 for gaming, because it is vastly overkill for you. Even the 280X is greatly superior to any console, and if you want relatively close performance, some GTX 950 cards are selling cheap, near the 750 Ti's price range (at least here in the USA; check your local prices), despite greatly superior performance.
 
Solution

connor_6

Reputable
Oct 7, 2015
60
0
4,630


Ok, thank you. I think I will go for a more budget entry card like the ones you suggested. At least if I do want more performance, I can always upgrade after a year or so. I am raging now because I knew I should have gone for that R9 380 2GB that was on offer for £115; the only reason I didn't was that I was getting worried about 2GB not being enough, but that's another story.
 

connor_6

Reputable
Oct 7, 2015
60
0
4,630


My laptop died, so I decided to build a work desktop capable of decent PC gaming instead of getting a PS4 or Xbox One. I need it for programming, photo editing/creating, etc. I have a fairly large backlog of PC games I would like to play, since I rarely take the time to do so, and would like to get caught up on them: games like SC2, Skyrim, Oblivion, Dark Souls 3 when it's out and the first two, Civ 5, Total War, Batman, Far Cry, etc. I have everything apart from the GPU, and I would probably be happy with console performance since it will only be used for a few hours of gaming. I haven't used or seen a gaming PC before, so I don't know what ultra is like. I am trying to build the cheapest, best bang-for-buck machine I can. I can get a GTX 670 off my friend for about £50, but that will be a few months away, although it's a good card. Otherwise I'm looking to get a good deal on something around £100, used or new, which really limits me to a 750 Ti, 270X, or 950.
 

connor_6

Reputable
Oct 7, 2015
60
0
4,630


Thanks. My friend's 670 is quite mid-range and is well under half the price a 960 or 380 would cost me, so I think getting that card is the best value? I'll just need something to last me a few months until then.
 

bluebob71

Honorable
Dec 2, 2013
55
0
10,630


You have no clue what you're talking about, and clearly you've never used a decent PC. You can lock your frame times at pretty much any value, 20/25/30, all the way to 144 or more if you want. 30 FPS is 30 FPS, unless you're talking about screen tearing, and there are many solutions for that lol. Silly console peasant thinking he knows what a PC is about :)





I say buy a GTX 970, or a 980 if you can afford it; definitely do it. That card will last you 5 years on high to maxed-out settings at 1080p without AA/MSAA.
 


The text marked in bold ("lock your frame times") is incorrect. You're confusing frame rate with frame time.



All the best!
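For what it's worth, the distinction is simple arithmetic: frame rate is frames per second, frame time is milliseconds per frame, and one is the reciprocal of the other. A minimal illustration:

```python
def frame_time_ms(fps):
    """Milliseconds each frame is on screen at a given steady frame rate."""
    return 1000.0 / fps

def fps_from_frame_time(ms):
    """Frame rate implied by a given per-frame render time."""
    return 1000.0 / ms

print(frame_time_ms(30))   # ~33.3 ms per frame
print(frame_time_ms(60))   # ~16.7 ms per frame
print(frame_time_ms(144))  # ~6.9 ms per frame

# A frame-rate cap fixes the *average* rate; it doesn't guarantee that
# every individual frame time actually hits that target.
```

So a "locked 30 FPS" means every frame should take 33.3 ms; when individual frame times swing around that value, the average is still 30 but the motion looks uneven.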

 

TheFluffyDog

Honorable
Oct 22, 2013
469
0
10,960


On a TV, the screen refreshes very differently from a monitor. Cheaper TVs won't help as much, but an expensive TV can do a lot. Plus, the backlighting systems on screens are designed to better match the content as well. For instance, even on a Vizio TV today, the refresh rate is anywhere from 60 to 120 Hz depending on the model. But the signals TVs receive are cable and movies: movies are 24 fps, and most TV shows don't exceed 35 fps, so why would a 60 Hz or 120 Hz TV make sense? Back in the day it didn't, but any decent TV today does frame interpolation, filling in the frame gaps and smoothing the image.
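The 24 fps-on-a-60 Hz-panel mismatch is why interpolation (or at least frame repetition) exists: 60 isn't evenly divisible by 24, so classic 3:2 pulldown shows alternate film frames for 3 and then 2 display refreshes. A sketch of that cadence (a simplified model for the arithmetic, not any TV's actual pipeline):

```python
def pulldown_32(film_frames):
    """Classic 3:2 pulldown: repeat film frames 3, 2, 3, 2, ... times so
    24 source frames fill exactly 60 display refreshes per second."""
    refreshes = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        refreshes.extend([frame] * repeats)
    return refreshes

# One second of film: 24 frames -> 12*3 + 12*2 = 60 refreshes.
out = pulldown_32(list(range(24)))
print(len(out))   # 60
print(out[:5])    # [0, 0, 0, 1, 1] -- the uneven 3-2 cadence
```

That uneven 3-2 cadence is the judder that motion interpolation replaces with newly synthesized in-between frames.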

The better the TV, the better the effects of frame interpolation and backlight adjustment. Monitors do this as well, but they rely solely on the GPU to formulate the image and then simply display it.

Another thing is that TVs won't tear, because they ensure the image in the frame buffer stays the same until the image is displayed and removed. On a monitor, if the frame buffer changes halfway through the refresh, you get a tear.

All of those things combined get you an astonishingly better picture from the same source content, but at the cost of input lag. On newer TVs, input lag is becoming less of an issue. And lastly, if you take a cheap PC, launch a game, and send it to a TV, it will look better immediately, but you may notice lag if you're a PC diehard.

Also, 4K upscaling. Once again, cheaper TVs don't do it as well, but it definitely sharpens the image.
 
120 Hz TVs are good for getting rid of telecine judder, and they also have faster pixel response times. As far as picture quality, no... a TV can't add detail to a game, what?

And there's a difference between frame repetition and interpolation. People just assume there are no 120 Hz TVs, when in fact there are 120 Hz and 240 Hz TVs out there, even 480 Hz on the higher end. Those are native, too.

240 Hz was required to finally beat 120 Hz passive 3D TVs using active 3D; that was its purpose. 120 Hz active 3D looks blurry compared to 120 Hz passive 3D.

Monitors are best for both PC and console gaming; a TV is more convenient due to its size-to-cost ratio.

True (native) 240 Hz TV: http://www.amazon.com/Samsung-UN60H7150-60-Inch-1080p-240Hz/dp/B00ID2HGK4