
What is the point of such powerful hardware?

Tags:
  • Graphics Cards
  • Hardware
  • Graphics
April 7, 2010 9:51:22 AM

I have been a reader of Tom's Hardware since the Voodoo days. I've seen 3D gaming evolve from its infancy. The PC was the platform that made it all happen.

It has been great throughout the years and always a pleasure to read Tom's Hardware, but it's come to a point where I now question the software we are running on our PCs today.

Games over the last 2-3 years have simply not kept up with the hardware we now have. Developers are greedy; they would rather cater to the PC market and the console market at the same time. This has brought about a situation where even three-year-old hardware like the ancient 8800 GTX runs most of today's games at 1920x1080 easily (the reason being that the 8800 GTX is a lot more advanced than whatever is present in any console).

This raises an alarming question: do we really need upgrades anymore? DX10 didn't receive much support, and it looks to me like DX11 won't receive much support either. Until a new generation of consoles arrives, DX9 is here to stay.

If we conclude that current hardware is too powerful and we don't really need upgrades anymore, then shouldn't Nvidia and ATI be concerned that their primary market, PC graphics, will shrink significantly?

Any GPU they sell into a console won't be replaced for another seven years (since that is the console life span). So as a business model, is it fine for them that the small number of GPUs they do sell on consoles will also cause their PC market to become stagnant?


April 7, 2010 10:02:36 AM

Maybe that's why Nvidia has chosen this time to change the architecture of their GPUs?
April 7, 2010 10:02:54 AM

When onboard video reaches the level the 8800 GTX is at today, adding a video card will only be for a select niche market, just as sound cards are now.
April 7, 2010 10:04:35 AM

Mousemonkey said:
Maybe that's why Nvidia has chosen this time to change the architecture of their GPUs?

Really? All you have to contribute to a good question like that is an Nvidia plug? When does jennyh get her moderator tag?
April 7, 2010 10:08:30 AM

JofaMang said:
Really? All you have to contribute to a good question like that is an Nvidia plug? When does jennyh get her moderator tag?

Whenever you wish to give it to her, I guess. :lol:
April 7, 2010 10:33:36 AM

JofaMang said:
When onboard video reaches the level the 8800 GTX is at today, adding a video card will only be for a select niche market, just as sound cards are now.



So isn't that bad for Nvidia and ATI?

We still have a lot of room left before graphics become on par with reality.

By supporting consoles, both of them have ensured that graphics progress becomes stale.

The PS3 and 360 put together haven't sold more than 60 million units in four years. That's not a huge quantity of chips for either of these companies, considering it's been four years already.
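Put per-year numbers on that (a minimal sketch; the 60 million figure is the one above, and the even vendor split is purely an assumption):

```python
# Per-year arithmetic behind the "60 million consoles in 4 years" point.
# The unit figure is from the post; the even vendor split is an assumption.
console_units = 60_000_000
years = 4

per_year = console_units / years  # total console GPUs shipped per year
per_vendor = per_year / 2         # assumed even Nvidia/ATI split

print(f"~{per_year / 1e6:.0f}M console GPUs per year in total, "
      f"~{per_vendor / 1e6:.0f}M per vendor if split evenly")

# A PC installed base that upgrades every two years replaces half of
# itself annually, whereas a console GPU design ships once and is then
# locked in for the ~7-year console cycle.
```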

So why stall the market where customers were buying a GPU from them every two years? It just doesn't make sense.

Not only did they compromise the GPU market; for a company like AMD, which makes CPUs as well, it's worse. CPUs with four or more cores won't be bought by people who just surf the net or type Word documents. They would be used by demanding applications like games, and if games' requirements stop progressing, all of us might as well stop upgrading CPUs too.
April 7, 2010 10:43:02 AM

Well, I understand your concerns, but we are not quite there yet, I think.

For example, when Oblivion came out in 2006, it was considered a huge leap forward in graphics. Still, I could run it on my three-year-old Radeon 9800 Pro.

Granted, not with the highest settings, but it still ran fairly decently.

But yes, eventually we will get to the point where people no longer want graphics to improve. Why am I saying this? Well, I'm not a big fan of shooters myself, but many others are. Even so, I think that even the most hard-core shooter fan would feel sickened by super-realistic graphics, guts, blood and gore.

I think most of us want to be able to see that the games are just that - games.

In any case, the time will come when graphics will stop improving, either for the reason above or simply because they already look like reality. When we have reached that point, there will be no need for better gaming cards, will there?

But who knows, the architecture of computers and the ways they are used will probably have changed long before then...
April 7, 2010 10:45:54 AM

JofaMang said:
When onboard video reaches the level the 8800 GTX is at today, adding a video card will only be for a select niche market, just as sound cards are now.
Hopefully by then, the consoles will have surpassed that level of graphics capability too. But I think the day of onboard graphics reaching 8800 GTX levels is still rather far off. AMD didn't do badly with the integrated 790GX (HD 3300) and 785G (HD 4200), but those can't even compete with their entry-level discrete cards from last year.

I don't think it's really that companies can't develop games that push the hardware envelope. Look at Bad Company 2 and Metro 2033: both are graphics-intensive and require some hefty hardware to display them in all their glory at a playable frame rate. I think it's done far less often because developers choose not to do so, in order to gain a wider market for their games. They certainly don't want to alienate customers, as that would reduce their sales.

Most people expect their PCs to last a solid 2-3 years, minimum. The problem is, most of the PCs made by the major manufacturers (like HP, Dell, Gateway) are mainstream machines produced to double up as a workstation or home computer. They usually have decent CPUs, but are almost always seriously lacking in the GPU department. It should be easy to upgrade, right? Wrong. To keep costs down and maximize profits, these machines use power supplies that are only adequate for their original components. Try slapping a factory-clocked 1 GHz HD 4890 into a quad-core HP machine with a 250W PSU and see how far into the boot process it gets before dying. LOL
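A rough power budget shows why. This is a back-of-the-envelope sketch in which all the wattages are assumptions standing in for typical 2010-era peak draws, not measured figures:

```python
# Back-of-the-envelope PSU budget -- all wattages are assumptions
# standing in for typical 2010-era peak draws, not measured values.
components_watts = {
    "quad-core CPU": 95,
    "factory-OC HD 4890": 190,
    "motherboard + RAM": 50,
    "drives + fans": 25,
}
psu_rating_watts = 250  # the OEM supply in the example HP machine

total = sum(components_watts.values())
print(f"Estimated peak draw: {total} W against a {psu_rating_watts} W PSU")
if total > 0.8 * psu_rating_watts:  # cheap PSUs want ~20% headroom
    print("Over budget: expect a failed boot or shutdowns under load.")
```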

Game makers cover both ends of the market by allowing users to disable features and tone down the graphics intensity so the game is playable on less powerful hardware. The day they stop doing that will be the day they bite the hands that feed them. That, or it will be the day manufacturers stop producing old, outdated mainstream hardware in favor of more modern hardware.

Take a look on Newegg sometime at how many 8400GS video cards are still available. Or worse, look at how many FX5200 and 6200 cards are still for sale. Or Radeon 7000s, the ones that don't even support hardware T&L. Hardware T&L is what, DirectX 7 technology? It's ridiculous that this hardware is still around. But why is it? Because some people are still using 5-, 7-, even 10-year-old computers and just might need one of these dinosaurs someday. People can still buy these pieces of sh*t, but cards like the GTX 260 and HD 4890 are end-of-life??? What are they thinking?!
April 7, 2010 12:16:04 PM

Mousemonkey said:
Maybe that's why Nvidia has chosen this time to change the architecture of their GPUs?


Changed from big and slow to big, hot, noisy and slow :lol: 
April 7, 2010 12:34:19 PM

Wow! This thread is deep, man!!
April 7, 2010 1:19:50 PM

Thanks for the thread, kashif. These are exactly my concerns too, and have been for a long time.

I built my present system sometime around late May last year (with the exception of some peripherals, which were added later). I bought the 4870 after reading so many reviews; I spent like 3-4 months just researching the system! And I was totally delighted when I found that I could play my games at the highest settings with 4x AA @ 1920x1080, except Crysis (high, no AA, 1080p) and maybe 3-4 other games.
In all those games I used to get >60 FPS at those settings. Why? Because most games are still being made on Unreal Engine 3, that's why. And it's the most popular engine, with loads of games on consoles too.

Currently, I am playing BFBC2 and AC2 at 1920x1080, highest settings, 2x AA. In BFBC2 I average 50 FPS (30-65), while in AC2 I average 45 (30-75, sitting around 40 most of the time, which brings the average down). There is no game for which I had to lower the resolution to make it smoother. So even I don't understand why people would spend on a 5870, a 5970, or, worse, a 5970x2! The reasons are manifold:
1) They might want 16x AA at 1080p
2) Eyefinity! Meaning multiple monitors, which increases the effective resolution
3) 2560x1600 resolution with 4x or 8x AA (see the pixel-count sketch after this list)


And, most importantly
4) e-penis!
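To put rough numbers on reasons 2 and 3, here is a quick pixel-count sketch (shader cost doesn't scale perfectly linearly with pixel count, so treat the ratios as approximate):

```python
# Pixels per frame at common resolutions -- illustrative arithmetic only.
resolutions = {
    "1920x1080": 1920 * 1080,                    # ~2.07 MP
    "2560x1600": 2560 * 1600,                    # ~4.10 MP
    "Eyefinity 3x1920x1080": 3 * 1920 * 1080,    # ~6.22 MP
}

base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.1f}x the 1080p load)")
```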

I personally don't feel the need to upgrade right now. Hell, I haven't even overclocked my processor yet. lol


And jennyh, you are a chick! Damn me. Six years on this forum and I always thought you were a guy... So now we have at least two girls on the forum: amdfangirl and you. Feels good, man! Keep up the good work.
April 7, 2010 3:08:48 PM

hellraiser06 said:
Thanks for the thread, kashif. These are exactly my concerns too, and have been for a long time. [...]


Well, I made this thread at Tom's Hardware because, for me at least, this is where it all started: gaming on the PC. It's when I stopped playing on any console. So I know the majority of the community is oriented toward the PC as a gaming platform and probably shares these concerns.

For the sake of it, I shot Nvidia an email. Lol, I didn't really think I would get a reply out of them, but it's something I would like to share with you folks here:

I see Nvidia releasing their all-new flagship, the GTX 480.

I currently own a GTX 280 and wonder what the point of upgrading is, since the GTX 280 already runs all the games @ 1080p with max AA etc.

Games are now designed with consoles in mind. Hence they barely push even an 8800 GTX to its limits.

I seriously wonder why Nvidia or even ATI support consoles. Developers don't push graphics anymore like they used to. Isn't this bad for your core business, the PC graphics business? With people not needing upgrades anymore due to consolisation, why release more and more powerful GPUs when there is no incentive to buy them?



Reply from Nvidia:

Hello Kashif,

Thanks for contacting NVIDIA Customer Care.

NVIDIA designs GPUs for game consoles because the company wants to light every pixel in the world, regardless of the platform. Yes, lately game developers have been designing for consoles, which puts limits on PC games; in the past, PC games traditionally had many features and graphics well beyond a console. However, NVIDIA has always pushed developers to use newer graphics techniques and lately is trying to expand that to include game physics.

NVIDIA's core business used to be strictly PC gaming. However, high-end gaming GPUs are a very small portion of the company's income now. Most of the profits come from Quadro and now Tesla products, plus the revenue from lower-end GPU products and the soon-to-be-released phone and mobile products based on Tegra processors. The same GPU architecture used in these high-end, expensive Quadro workstation products is used in consumer GeForce products as well, so there are markets that are using them to the fullest extent, unlike the PC game market.

The traditional computing markets are changing, and NVIDIA is now more than just a gaming graphics chip company. I think the company has to do this to stay in business, keep innovating, and keep growing.



It makes one wonder: has the decrease in people upgrading their GPUs already started? And could the stagnation of graphics over the last 2-3 years have already played a part in it?
April 7, 2010 3:09:05 PM

Powerful hardware exists for various reasons.

Hardcore gamers who have the money will simply throw it at the fastest video cards they can afford every 6 to 12 months, because they want to set everything to max graphics quality while keeping high frame rates. While AMD and nVidia can make a nice premium off high-end cards, this represents a small segment of the market.

AMD and nVidia make the bulk of their money from the mainstream market segment, roughly the $100-$200 range. However, it's the high-end cards that bring these mainstream and lower-end cards to the general public's attention.

Both companies need to produce newer, faster cards to stay in business. It gives people a reason to upgrade and fills AMD's and nVidia's coffers. If they were to slow down the release of new video cards, people would have less incentive to upgrade. Less incentive means less money for AMD and nVidia. Less money means less R&D and, eventually, employee layoffs.

As current video cards age in a market, prices have to be reduced to keep those products attractive enough for people to buy. Sales will stagnate even if prices are reduced a little, because there is nothing new. That means lower profit margins, which is not good for the bottom line or investors. Introducing new cards allows them to keep selling cards with good profit margins.

I upgraded from an X1900XT 512MB to an HD 5850 'cause my X1900XT died.
April 7, 2010 3:49:02 PM

If your card averages 50 FPS but still hits lows of 35, there is room for improvement and you do not have the best system on the market. That's why better graphics cards keep coming out: there is always a game that pushes even the best cards to the limit.
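A toy example makes the averaging point concrete; the frame times below are invented purely to illustrate how an average hides dips:

```python
# Why average FPS hides stutter: a hypothetical frame-time log in ms.
frame_times_ms = [16, 17, 16, 18, 16, 35, 16, 17, 40, 16]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

print(f"Average: {avg_fps:.0f} fps, worst frame: {worst_fps:.0f} fps")
# Prints an average near 48 fps even though two frames dip to 25-29 fps,
# which is exactly the headroom an average alone never shows.
```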

And just my opinion, but games aren't becoming any more realistic; they are just becoming more artificially detailed, which is much different. If you change your settings from medium to ultra, your FPS might go from a 50 average to a 35 average, but is that because the characters look more like humans? No, it's because the details of the game become more intense.

To think the advancement of games is eventually going to yield a game where the characters are actual humans is naive. Stanley Kubrick's 2001: A Space Odyssey is a perfect metaphor for that naivety and over-anticipation. It's never going to happen.
April 7, 2010 4:00:38 PM

I for one have a 2560x1600 monitor, so a 5970 is necessary.
April 7, 2010 5:01:37 PM

jennyh said:
Changed from big and slow to big, hot, noisy and slow :lol: 


And just how much slower is a GTX 480 compared to an HD 5870?
April 7, 2010 5:07:33 PM

Mousemonkey said:
And just how much slower is a GTX 480 compared to an HD 5870?


Like, 100x slower, obviously.
April 7, 2010 8:23:43 PM

Mousemonkey said:
And just how much slower is a GTX 480 compared to an HD 5870?

Flagship card vs. flagship card: that's the 480 vs. the 5970. You can fairly compare the 480 and the 5870 head-to-head once there is a dual-GPU Fermi on the market. Since the 480 uses only slightly less power than a 5970, I think it is an apt comparison. Trying to compare the 5870 directly to the 480 is just an attempt to assuage your own disappointment in Nvidia. It's ok, we are here for you, you are not alone, it is not your fault. I am disappointed in Fermi too, since it is not really all that competitive, especially in light of the delays, so I hurt inside as well, due to the lack of ATI price drops. We don't have to be on the same team to both suffer from such a crappy gaming card release. Just remember: it is not your fault. It is not your fault. It is not your fault.

*manly hugs*
April 7, 2010 9:31:17 PM

I can somewhat understand what the OP is saying: that eventually the need for superior hardware will stop, not because of hardware limitations, but because software developers are unable or unwilling to put out software that is on par with the hardware.

I actually share this same perspective; however, I have seen it happening far earlier than the past 2-3 years. I have seen it happening since I was 13 (seven years ago). It has become evidently stagnant over the past three years.

Microsoft took a hit by introducing Vista to the public, fearing that if they didn't push it, they would never see the leap in hardware we have seen today. You may say Vista sucked, but the truth is that OEM manufacturers sucked by not keeping their hardware updated (chipsets incapable of the version of DX9 that allows for Aero, processors that ran hot and worked slowly, RAM that was outdated, slow, and insufficient in quantity).

Anyway, I digress, as that is not the case now. We have hit a point where the software industry is becoming stagnant due to the game market being dissolved into console gaming. Developers claim to be pushing graphical content to the maximum, but we all know that is complete garbage when much older GPUs are still able to keep up with the games. They claim to be including features never seen before, but we have already seen those features, just in a different color. They claim to be making revolutionary breakthroughs, yet those breakthroughs are years old.

I'm not trying to say that all or even most software developers are bad (plenty of other software fields are advancing), just most game developers.
I'm not even going to get into multi-platform games (PC/console). If you can't tell that PC games are being short-leashed by the need to fit them onto consoles, you must be gaming on a Mac (well, maybe not anymore; they're finally getting some games ported... to them...)
April 7, 2010 9:35:47 PM

AsAnAtheist said:
you must be gaming on a Mac (well, maybe not anymore; they're finally getting some games ported... to them...)


But you're still running those games on outdated hardware!