Do you REALLY need that much graphical horsepower?

This could be just me, but I've noticed a bit of a trend in benchmarks. It seems that just about every graphics setup at $200 and up (including SLI and CrossFire) handles nearly every game at maximum settings and delivers superb (or at least playable) framerates at almost every resolution (with the exception of Crysis and Flight Simulator X, of course).

I mean, if Tom's, Anand, Hot Hardware, X-bit, etc. are at all reputable sources on how capable this generation (and the generations to come) of graphics cards is, do you really even need to invest more than $400 in a graphics card anymore? I admit, I like my eye candy and I like high resolutions, but if we don't count 2560x1600 (I doubt the average gamer really wants to spend over $1k on a monitor), then everything at 1920x1200 and below has more or less been conquered in 90% of the games out there.
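For a sense of scale, here's a quick back-of-the-envelope sketch (my own illustration, not from any benchmark site) of the raw pixel counts behind those resolutions:

```python
# Back-of-the-envelope pixel counts for the resolutions discussed above.
# (An illustrative sketch; real GPU load also depends on AA, AF, shaders,
# and the game itself, so treat these numbers as rough scale only.)

def pixels(width, height):
    """Raw pixels the GPU has to fill per frame at a given resolution."""
    return width * height

resolutions = {
    "1280x1024": pixels(1280, 1024),
    "1680x1050": pixels(1680, 1050),
    "1920x1200": pixels(1920, 1200),
    "2560x1600": pixels(2560, 1600),
}

for name, count in resolutions.items():
    print(f"{name}: {count:,} pixels per frame")

# 2560x1600 pushes roughly 78% more pixels than 1920x1200,
# which is a big part of why it stays out of reach for most cards.
print(round(pixels(2560, 1600) / pixels(1920, 1200), 2))  # 1.78
```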

Sure, there are some games that are CPU bound or that still need just a tad more power (Supreme Commander, Mass Effect and World in Conflict come to mind), but even in those kinds of games you can get at least 30 FPS with nearly maxed-out details for $200-$400.

Then there are the games that can run on damn near anything and today deliver framerates in the hundreds (e.g. Prey, FEAR, etc.). These kinds of benchmarks are starting to be phased out, and there's really no need for them anymore, since they give a slightly misleading idea of what makes a good play experience. I'm sure practically all of you already know this, but most humans can't tell a difference in frame rate above 60 FPS. Not only that, but your visible FPS is also limited by your monitor's refresh rate (you can't show more frames per second than the number of times the screen refreshes per second, right?). I'm not sure how much refresh rates have improved in monitors recently, but I think the average is around 75 Hz, so the maximum FPS you're likely to ever see from your monitor in a real-world scenario is 75 FPS, which I think would make anyone happy.
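To put rough numbers on the refresh-rate point above, here's a small sketch (again my own illustration, not from any review) of the visible-frame cap and the per-frame time budget:

```python
# The monitor can't display more frames per second than it refreshes,
# so anything the GPU renders beyond the refresh rate is never seen.

def visible_fps(rendered_fps, refresh_hz):
    """FPS the player can actually see: capped by the monitor's refresh rate."""
    return min(rendered_fps, refresh_hz)

def frame_budget_ms(fps):
    """Time the GPU has to finish each frame, in milliseconds."""
    return 1000.0 / fps

# A card cranking out 200 FPS on a 75 Hz monitor still shows at most 75 FPS.
print(visible_fps(200, 75))   # 75
# Frame budgets: ~33.3 ms per frame at 30 FPS vs ~16.7 ms at 60 FPS.
print(round(frame_budget_ms(30), 1), round(frame_budget_ms(60), 1))
```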

So anyway, my point is, there seems to be a decreasing overall need for graphics setups over $400. Most of the powerful single-card setups (HD 4850, 4870, GTX 260) or SLI/CF setups of cheaper cards (8800 GTs, 3870s) get the job done in so many games, and the games that can't get enough power from those setups to run at maximum seem to be the minority.

I'll probably get replies like 'Yes, we know.' but hey I tried. :kaola:
  1. Umm, Crysis still won't play at its highest settings, and the new Crysis is even more demanding. I think games will only satisfy people's eye candy desires when they look as good as movies like Finding Nemo.
  2. I'll take your point a bit further.
    If all we wanted was quality in games, we wouldn't be assembling $2000-and-up systems: we would just buy a PS3 or Xbox 360 ($600 at most) and have a much better gaming experience than we have today on PCs.
    The problem is that it wouldn't make us happy.
    We enjoy the sensation of putting the system together, squeezing out every possible MHz via overclocking, switching GPUs and sharing our experiences in forums like this, in endless threads about the differences between SLI and CrossFire when AA is on...
    The question is: will we survive?
    The average user, like your mother or my sister, doesn't even game, and in terms of computing they prefer a Mac that, although lousy in terms of pure horsepower compared to the average PC, is all they need for office, internet and video/music.
    Perhaps, just like the dinosaurs, we'll all be extinct in the years to come.
  3. I tend to agree, but I am not a big PC gamer. Like galta said, I prefer to play games on my Xbox 360.

    I have an ATI 2600XT, which is considered ancient here, but for Photoshop and even 3D rendering it does the job without a problem. I will probably upgrade to a 4850, but I definitely will not spend more than $200, because it's just not worth it for me.

    But then again, there is always the thrill of pushing your PC to the limits :)
  4. Yes, I do need that much graphical power, actually.
  5. 30 FPS is FAR from being enough, no matter the game; there are dips in performance when more action hits the screen. To get a MINIMUM of 30 FPS in ANY given game at a decent resolution (say, 1600x1200 at minimum; most gamers have at least 19" monitors) with the good eye candy, you'll still need a huge CPU+GPU combo. Sure, you'll hit 90+ FPS in some scenarios, but when the all-out action/effects hit your screen, it should not go below 30 FPS (or one would hope so).

    Hell, I play AoC, and even that isn't close to being as demanding as Crysis, and even with my OC'd CPU and OC'd GTS 512, it will still dip as low as 15 FPS @ 1280x1024 without the FULL eye candy.

    There's also the whole "there's no reason to go beyond 30 FPS, the eye can't see more than that" thing... that's horsecrap! lol. Any real gamer can see the difference between a game that runs at 30 FPS (let's say it was 100% constant at 30 FPS) and one that runs at 60 FPS.

    We need the high-end CPUs and high-end cards; the games demand it, and we like our quality of play.
  6. I'm not saying I don't enjoy putting together a PC or OCing something, I'm just saying it seems like only a minority of games really need the graphical horsepower they try to sell us at the top tier. *coughGTX280cough*

    It just seems like the only people who really need the graphics power that comes with the ridiculously expensive setups (Tri-SLI GTX 280s, Teslas, FireGLs, etc.) are people who want to try to play Crysis at the best possible settings (or as close to them as possible) or people doing 3D modeling and other graphics-intensive work.
  7. @OP: Yeah, I would tend to agree with you. The best of the best graphics cards don't seem very reasonable for today's games. However, there are some people who believe that getting the best will keep their systems fresh for future generations of games. I personally think getting a mid-range card, then getting a second one for CrossFire/SLI as it ages, is the way to go.
  8. Of course we need the horsepower... there's nothing like the feeling of a couple raging 4870s between your legs...
  9. Until I can't tell the difference between what's on screen and what's real, then the answer to your question should be obvious.
  10. szwaba67 said:
    Of course we need the horsepower... there's nothing like the feeling of a couple raging 4870s between your legs...


    I never used to really believe people when they said they could tell the difference between 30fps and 60fps... but in certain games the difference is very noticeable. COD4 gets almost unplayable at low framerates; you need at least 50+ in order to make a head shot easily :D
  11. Well, let me rephrase. As Tom's has put it, 30 FPS is not that great for a fast-paced game (action, racing, FPS, etc.); 40 FPS is more the minimum for those. 30 FPS is more for slower games like RTSes. But even so, these minimums are easily attainable in almost any game with a decent CPU and a $200-$400 graphics setup.

    rodney_ws said:
    Until I can't tell the difference between what's on screen and what's real, then the answer to your question should be obvious.

    Well I guess that's that then. How could I have been so foolish? :kaola:
  12. The same can be said of CPUs. Do we really need 4 cores at over 3 GHz? When the next demanding game does come out, I want to at least have a chance to play it in a decent way. Look at all the people who said, when Oblivion came out, that they wouldn't play it till they got their upgrade; same with Crysis. If a truly killer game comes out and we all have 2600XTs, we will be hurting. I agree the day of the monster top-of-the-line GPU may be gone, but I at least want to be in the ballpark.
  13. I guess that's more what I'm getting at, then. Top-of-the-line stuff is more or less overkill and costs way too much. Mid to mid-high range is really where the good stuff is, and usually all you really need (for that generation).

    And 2600XTs are just barely mid-range, JDJ. :kaola:
  14. lasttarget said:
    Umm, Crysis still won't play at its highest settings, and the new Crysis is even more demanding. I think games will only satisfy people's eye candy desires when they look as good as movies like Finding Nemo.

    Crysis: Warhead will feature the same engine but with better optimization. Crytek knows they didn't properly optimize the engine when they released it, and many mid- to low-end systems suffered greatly because of it.

    If anything, it promises to be less demanding while still offering some of the same eye candy. Obviously you will be missing out if you don't have some high-quality hardware, but it won't kill you the way Crysis did.
  15. I think the real reason we go for the higher horsepower is for bragging rights. Take my 1988 Toyota Celica for example. It can go much faster than any speed limit set in the U.S.

    ...but for some reason I would prefer a 2009 Lamborghini Reventon.

    It's crazy, I know!

    Edit: yeah I screwed up the name of the Lamborghini - shows you where my wallet stands
  16. Ewwww, I hate 30 fps. 60 FTW. At 1920x1200 most games don't even run at max settings (with AA) on my 9800GTX...
  17. Do I *NEED* the horsepower of an overclocked GTX 280? No. Do I want it? Yes!
    The only people who might actually need the best GPU, like a 280, are those with, as you said, 2560x displays.

    Sure, the latest $200 GPUs and even last-gen stuff can play any game to some extent, often at better-than-average settings, on more common displays such as 1680x, like mine.
    However, when you do hit resolutions like 1920x, which isn't that rare these days, you might want the best GPU not just so you can play any game, but to add a little spice to it: max settings plus a tad of AA and lots of AF, which lesser cards might not allow you to use in more demanding games at that resolution.

    As for me and others gaming at 1680x and less, maybe these cards give us plenty of horsepower. However, since our resolutions are NOT 1920x or 2560x, we like to tack on even more AA along with the max settings to compensate. Take a 1680x res, throw on 8xAA, and the quality is just amazing; it can look as good as 1920x with 2xAA. Maybe not quite as good, but not far off.

    And I do play games like GRID, Oblivion, DiRT, Mass Effect and Crysis. Those games completely maxed out, with DX10 (when compatible) and maxed-out driver settings @ 1680x, will definitely choke a bit when you throw some 8xAA into the mix if you don't have one of the better GPUs out there.
    And if you can hit 60 fps with the settings above, what a smooth treat that is. For example, I'm playing GRID right now totally maxed out, 16xAF and 8xAA.
    This game looks absolutely stunning with those settings, and better yet, I'm always above 60 fps! Maybe not a huge difference from averaging 30 or 40 fps, but I think it is still noticeable to some extent. Just super smooth gameplay, never a single stutter.

    Crysis really chokes when maxed in DX10 with 8xAA on any single GPU out there.
    And, looking ahead, good upcoming games will only be more demanding, want more than 512MB of video RAM, and all that good stuff. So another thing is being prepared for the next year or so of games (depending on when you upgrade again).

    As for Crysis: DX10, 1680x, maxed game and driver settings pretty much do need a GTX 280 to play smoothly, because it seems to be the only GPU that can get you 30 fps at those settings, and even that takes a decent overclock.
    However, here, with over 700MHz core and 2500MHz memory, Crysis is actually playable with 4xAA thrown on top of that! Don't ask me why, but it is. And hot damn does it look awesome!
    No other single GPU will do that, period.

    So even though I can definitely play Crysis with an 8800GT or 4850, at decent settings even, there is still nothing like being able to play it maxed out in all its glory and then challenging it with some AA. To us it's like some monster we've been trying to tame for the last year! So when you finally get a GPU/CPU combo that can start to own Crysis, it's a great feeling! And yes, it is fun to get on forums and talk about it and hear about what other people are able to do with their GPUs.

    If you buy a new flagship card (either ATI or Nvidia), I think more times than not there is more to it than just powering some games. It's a hobby. It's fun to put together rigs and slap stuff like this in them. It's fun to OC and see how far you can push your hardware. And of course it's fun, even if you don't like a game that much, to be able to see these games in all their glory in ways most of the world isn't able to. I mean, what isn't fun about being able to run some of the best-looking graphics in the world?

    Lastly, it sure is nice to know your GPU can handle pretty much whatever you want to throw at it, rather than being annoyed when a game comes out that you can't play maxed out, if at all. You can kind of relax on that front, at least until the next line of "best" cards comes out, and I'm sure I will be all over those as well!

    Having said all of that, I'm not a huge PC gamer. You would think anybody with a GTX 280 would have to game 20 hours a day, right? Maybe some do, but I work a lot during the week and only have a few hours of downtime for anything. Some days I'll game for an hour of that, some I won't. On weekends I might get a couple of good gaming sessions in, but it's summer here, so I go out a little more.

    Also, I am a console gamer. I have a PS3 and a good pile of games for it. I think that is some great hardware as well. I have a nice 1080p TV, and it's really nice to console game, especially when you are low on time, since consoles are a little more hassle-free. And I'm sure when FF XIII comes out I might ignore my PC for a little while.
    But all that said, while I consider PC gaming something I enjoy in moderation, I'm a big fan of hardware and graphics, and it is definitely a bit of a hobby. Putting the rig together, doing the overclocks, getting some benchmarks and all that good stuff is half the fun for me.

    So yes, people do still WANT the best gpu's out there. Even if they don't NEED them.
  18. I think there are some marketing droids out there who define an "enthusiast" as someone with a lot more money than sense, who is eager to part with it for the sake of a bigger e-dong. I read occasional new-build posts here that seem to be in that category, although most people seem to at least know what a "budget" is. I can't help them; I have no clue as to where their value lies; I'd be hard pressed to spend much over $2K on a rig without making choices I'd know were silly. Ask me how to stretch $400 into something usable however, and I can probably suggest something helpful.
    For me, I can't imagine needing a CF/SLI setup for anything I'm likely to play/run, but 20 years ago the idea that a GPU would need a fan on it also seemed absurd to me. Things change. We'll see what happens.
  19. Man, I remember the good old days. Like a year after EverQuest came out, you could play with a $300 machine; now the games are MUCH more demanding and much faster paced, so when a new game comes out, it requires more powerful hardware. The people to blame here are the video card makers and the software companies that make the games. The video card companies are constantly going back and forth to see who is the bigger man on campus, and that is driven by the software companies making the games so demanding if you want to see good visual quality. It's a dirty game, y'all; keep your eyes on the riches!
  20. lol, I bought a Radeon 9550 Gamers Edition (the one with a lil fan on it) something like 2-3 years ago, and I've been riding it into the ground playing WoW at 1680x1050 at 35 fps. I once tried to OC it and made it to something like a 35-40% overclock with a tad bit of artifacts; how the card didn't explode I'll never know. The fan just broke down last week, though, and I replaced it with one I got off an old Pentium stock cooler.

    What I'm getting at is that not all gamers need the bleeding edge (though I totally plan to get it when Diablo 3 comes out)

    *Edit* Come to think of it, the more powerful computers become, the more important energy saving will become, as you really don't need an OC'd quad core and Tri-SLI 280s to check your email...
  21. I'm currently running an Athlon 64 3500+ with a GF3 Ti500 under Windows Vista, and it does great, actually. I only miss playing UT3.
  22. Currently running an E6600, 2 gigs, and an 8800GTS; outside of Crysis, I can play everything fine on my monitor @ 1680x1050.

    Do I need more horsepower? Definitely not. I probably will go with a 4870 when the time comes, though.

    Personally, I don't see the point of CrossFire or SLI setups; they come with too much variability and too many issues between games for the price of such a setup.

    If I get 30+ FPS in my RTS games and 60+ FPS in my shooters, that's all I need; more than that really isn't necessary. It's not like my LCD can respond fast enough to show the difference above 60 fps anyway...

    I believe Crysis is an exception in gaming anyway; I have to question the programming behind a game like that. Sure, it looks great, but the performance hit systems take to run it really makes me wonder whether it's a question of a lack of raw graphical horsepower or just inefficient programming.

    If you're running at or below 1920x1200 dual graphics are really just for bragging rights.
  23. Bottom line:
    The game developers rely on the hardware industry to push tech further.
    If we didn't have enthusiasts buying the latest and greatest graphics hardware, new games wouldn't advance graphically. Console games are very dependent (indirectly) on the advancement of PC hardware too. And if we didn't buy the hardware before we needed it, then the developers wouldn't push their games further each year.

    Note to robx46: 11 paragraphs is pushing it. If you want people to read your comments, make them shorter!
  24. What I don't understand is people wanting the GTX 280. It's overpriced, underperforming, and runs hot when cranked up. People say they want this card, and in one week we will see a new card that'll blow its doors off, at the same price too. Without these cards we obviously wouldn't have the 4850s or the 9800GTXs, so we depend on the people who want and buy those cards. I'm getting a 4870. Why? Because I know I'll be able to handle anything I throw at it for a good amount of time.
  25. ^^^ And if its doors are blown off, they'll lower its price to be competitive. There's nothing to wrap your head around here; the flagship is overpriced every generation, until competition forces its price down.
  26. The 280 and the 4870 are good cards, but they don't play Crysis on full with high AA, so you've got to take your hat off to ATI/Nvidia: not only do they want to sell you an underperforming card, they want you to buy two, and you still can't play Crysis on full with high AA.
  27. Then, of course, there's the 4850x2. Where's it going to end up?
  28. JAYDEEJOHN said:
    Then, of course, there's the 4850x2. Where's it going to end up?

    Probably $399. That would put one Radeon card at every $100 increment except $100.

    4850 - $199
    4870 - $299 (eventually)
    4850x2 - $399
    4870x2 - $499

  29. The 4850x2 will kill the GTX 280. There'd be no reason to charge $400 for the GDDR3 version, and a GDDR5 version at $400 will win out, using the CF sideport.
  30. I'm not going to bother with a lengthy post with examples and points.

    I'm simply going to say, "Who CARES !?!"

    Prices are what the market is willing to pay, from top to bottom; you can insert whatever performance figures you want in there. What's acceptable for some isn't for others, and what they spend their money on is no one else's business.

    And lastly, anyone who thinks a console enters this discussion is missing the point of the cards in question.
  31. Agreed, Ape. As long as there is a market for the product, companies will make it. As long as enough people are willing to buy at a certain price, companies will charge that price.
  32. TheGreatGrapeApe said:
    And lastly, anyone who thinks a console enters this discussion is missing the point of the cards in question.

    It matters to the tech that goes into the NEXT consoles. The groundwork for console GPUs is laid by enthusiast cards in the PC segment.
    When people buy the uber-expensive flagship graphics cards, they are helping pave the road to the next generation of consoles.
    If no one bought these cards, they would stop selling them, and advancement in graphics tech would grind to a halt.

    I brought this point up to counter the people who say, "Why buy a $600 GPU for a PC that costs thousands of dollars when I can buy a $400 console?" Well, my point is that those $400 consoles wouldn't exist if people didn't buy the expensive PC hardware.
  33. hixbot said:
    It matters to the tech that goes into the NEXT consoles.

    Which still doesn't matter to these cards. Next-gen consoles are 2010 at the earliest at this rate, and they will likely have near nothing of this generation in them (they might not even be rasterization-based anymore).

    I understand what you're getting at and agree with it somewhat; however, it's irrelevant to this discussion, other than that they *can* sometimes use their R&D money and IP to develop consoles. But that neither guarantees a good chip nor is a requirement.
  34. Indeed, I agree that the current actual cards won't directly fuel the next consoles, and we both see eye to eye on the development of consoles.
    If indeed the tech changes dramatically, we will see that first on PC systems; consoles will follow the trends. This specific generation of cards may not have much bearing on the next console tech, but it is part of the stepping stones along that path.

    I suppose where we disagree is what's relevant to this discussion. You seem to think this thread was created to discuss specifically the current cards and whether they are overpowered compared to the games of today. I saw it as a question about the purpose of overpowered cards (compared to the games) in all generations, which is more of a wide discussion topic.
    I wanted to point out that the overpowered cards of today allow game developers to improve graphics in the PC games of tomorrow, and eventually, console games.
    Sometimes the hardware may lead the software (overpowered hardware), but if we didn't buy the overpowered stuff, then advancement could grind to a halt (which can affect consoles indirectly).

    That's why, I think, we need that much graphical horsepower.
  35. Speaking of consoles following PC trends... if I remember right, the current generation of consoles (except maybe the Wii) have GPUs either based off of, or directly part of, Nvidia's GF7 series or ATI's X1000 series, and the graphics on them are the best we've ever seen and are starting to get ridiculously realistic.

    So just imagine the kind of graphics the next generation of consoles will put out given the advancements we've had since then. Like, what if the next-generation Xbox had a GTX 280 put in it, and game developers programmed exactly for that GTX 280? You could probably render all of WWII Europe and have it look as close to real life as computerized technology gets.

    Now remember Ape's comment that this generation likely won't have a spot in next-gen consoles. Then you really start to get a picture of just how much raw power could be harnessed by then.
  36. Quote:
    Do you REALLY need that much graphical horsepower?

    Short answer: YES.
    Long answer: no, with a "but".
  37. For me, it's not about whether I can get over 200 FPS; I usually run games at around 30-40 FPS, and that is OK for me.

    What I look for is the longevity of the card. It may be better to spend an extra $100 on a card that will last 1-2 years longer (performance-wise) than a cheaper card that will be bogged down sooner.

    For me it is always a cost vs. performance vs. longevity scenario.
  38. Exactly. To think that an 8800GT will last very long with the newer games is crazy. Buy a 4870 and your chances improve dramatically. Plus, you have all the eye candy you'll need with current games. That's the purpose. Some people want all the eye candy, some don't. I do, and I want it to last. No 8800GTs for me. This is a new generation that's come out, people; it happens all the time. Those 8800GTs are only mid-range cards nowadays. They'll still play all games, most with full eye candy, though some you simply can't max out. Where you fit, given your desires for your gaming experience/needs/wants, falls somewhere in between. I like eye candy, and I'll have something near the top, and by the time it's a mid-range card, it's time to buy again. That's me; that's what I like. To each his/her own.
  39. lameness said:

    I never used to really believe people when they said they could tell the difference between 30fps and 60fps... but in certain games the difference is very noticeable. COD4 gets almost unplayable at low framerates; you need at least 50+ in order to make a head shot easily :D

    There are many factors:
    1. How good your vision is.
    2. The game itself and its technologies. While playing Team Fortress 2, with its motion blurring and other stuff, even 20 FPS seems very playable, but when I'm playing something like CoD2 (CoD4 is coming tonight :D), 20 FPS is absolutely unplayable.
    3. What you are doing: if you are standing still, or just walking, 20 FPS is pretty smooth.

    Usually I can tell the difference between 30 and 60 FPS; your gun just seems to float across the screen @ 60. I am generally extremely pleased with 4xAA, 16xAF @ 1680x1050 at 40 FPS. I really don't care if it's rammed up against my monitor's refresh rate.

    I just ordered the HD 4870 (which is coming tomorrow [lol]). Yes, with an E8400 @ 3.6GHz, an HD 4850 or even a 9800GTX would have been absolutely fine at the above res, but I want my $310 of hard-earned cash to last a long time. I don't plan on upgrading this thing until at least 2010. Do you think this HD 4870 is gonna max out games even a year from now? Probably not. But it will probably handle the games a lot better than the $180 HD 4850 will.
    So the answer is yes.