
Has the graphics industry slowed down lately?

Tags:
  • Graphics Cards
  • Graphics
January 13, 2012 4:42:09 PM

I just wanted to comment on a trend I've been noticing. It seems like the great price deflation and performance inflation in the graphics industry has been slowing down. At least in the midrange segment. (OK, even then, mostly NVIDIA.)

Here's the thing: I got a GeForce 9800GTX+ back in the winter of, like, '09. That was a card released in, what, June of '08? It was about $150 or something like that.

128 Cores
512MiB RAM
256-bit
~$150

This card has lasted me quite a while, but when looking at benchmarks, it seems dwarfed by the latest top-end cards.

But then I look back at the midrange. After, like, 3.5 years, $150 isn't getting you much further. I'm usually seeing this config for NVIDIA:

192 cores
1GiB RAM
128-bit
$120-$150

Wow...

Compare that to my previous upgrade: 6600GT to 9800GTX+. Both at the same price point at the time.

It seemed like there was a large spike between the 6 and 8 series (a 5-to-8 comparison is ridiculous). But after that, it seems like they started doing a bunch of rebranding of chips, adding a few efficiency measures here and there, and a die shrink, with what seems like little performance difference at the same price points.

I know there are a lot of variables, like AMD vs. NVIDIA, efficiency in DX10 and DX11 applications, power consumption, etc. Not to mention the fact that games haven't really pulled a Doom 3 or Crysis graphics jump since, well, Crysis.

I'm just wondering if I'm the only one noticing this.

I had originally projected that a midrange card every 3 years was a reasonable upgrade pattern. Now I'm looking, and the only things that really date my 9800GTX+ against the new midrange stuff are RAM and tessellation. Everything else is merely incremental.


January 13, 2012 4:59:48 PM

Jonathanese said:
I just wanted to comment on a trend I have been noticing. It seems like the great price deflation and power inflation in the graphics industry has been slowing down. [...]

For a sec I thought you meant AMD; Nvidia kinda always lags in that market. At least for a while.
January 13, 2012 5:02:29 PM

And kinda scratch what I just said: the 560 is actually just as good as AMD's 6870, so that's not too bad. The only area where AMD has Nvidia beat is the $130 market and below.
January 13, 2012 5:04:35 PM

To just answer your post: with every CUDA release comes a boost, though low-end cards may only improve a little compared to the high-end cards. Right now there isn't much on the market because at this time it's "radio silence" for both AMD and NVIDIA. Neither will release much info, obviously to build up anticipation for the newer cards. Which leaves many people questioning whether to SLI their cards, get Kepler, or whatever else is available... I myself like getting a midrange card, but you're actually right that even midrange only improves a little every release. Like my GTS 450 was about the same as the GTX 260, which was a high midrange card.
January 13, 2012 5:15:23 PM

Yes, the rate at which GPUs are increasing in speed/efficiency seems to be slowing down.

If you compare a high-end graphics card from 10 years ago and a high-end graphics card from now, cards now are literally 2-3x the size they were 10 years ago. Their processing power has increased exponentially, but so has their power usage (IMO this is not good).

I would like to see GPU makers start making smaller, more efficient GPUs.
January 13, 2012 5:17:47 PM

It's all just video card envy. Not that many games need a $400-$500 video card right now.
January 13, 2012 5:28:08 PM

For the record, not all cores are made equal, and they are not clocked the same.
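That point can be roughly quantified. A minimal sketch (the shader clocks below are the commonly cited reference-card values, and the 2-FLOPs-per-core-per-clock factor is a simplification that ignores architectural differences, like the G92's dual-issue MUL):

```python
# Rough peak-throughput sketch: core count alone doesn't tell the story.
# Clocks and the FLOPs-per-clock factor are assumptions, not measurements.
cards = {
    "9800GTX+ (128 cores)": (128, 1.836),  # shader clock in GHz
    "GTS 450  (192 cores)": (192, 1.566),
}

for name, (cores, clock_ghz) in cards.items():
    # cores x shader clock x 2 FLOPs per clock (one fused multiply-add)
    gflops = cores * clock_ghz * 2
    print(f"{name}: ~{gflops:.0f} GFLOPS peak")
```

On this crude estimate the gap between the two cards is much smaller than "128 vs 192 cores" suggests, because the newer card's cores are clocked lower.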
January 13, 2012 5:30:34 PM

Where low-end cards improve is in power consumption, making them available to more people as drop-in upgrades. AMD owns this segment, with the HD5670 and now HD6670. NVidia can't touch AMD under 75W; even going up to 150W, their 550Ti is a pale shadow of the HD6770 against which it is supposed to compete, never mind the HD6850 which also needs only one PCIE power connector.
The high end is a little different, with each company offering its own benefits; AMD has Eyefinity, and nVidia has CUDA and (right now) better 3D support. Both are fast enough for most people, and power isn't really an issue. I'm running a single GTX560Ti now, and at 1920x1080, don't believe I'll have any excuse to upgrade any time soon.
So, if development has slowed, it could be because there's little demand. Until 2560x1536 or similar displays become common (fat chance, given their price plus global economic conditions), I don't foresee a change at the high end. HardOCP was even calling the HD7970 overkill; how many will be sold?
If nVidia wants to sell more GPUs, they need to 1) make something to compete on power as well as performance at the low end (they do neither right now), and 2) make PhysX play nice if there's an AMD card in the system. Demand for PhysX is not great, but there are lots of gamers with AMD GPUs who wouldn't hesitate to drop in a $60-$80 nVidia card to add PhysX, just to have it.
January 15, 2012 10:15:57 PM

"Demand for PhysX is not great, but there are lots of gamers with AMD GPUs who wouldn't hesitate to drop in a $60-$80 nVidia card to add PhysX, just to have it."

Yes, exactly. I feel like if Nvidia works on their CUDA implementation in games, they have multiple audiences for their low-end segment: both mainstream AND high-end consumers.

I myself am running my 9800GTX+ alongside an 8400GT (I'm trying to get an 8600 going).

There is a major advantage in games that support PhysX and CUDA, which I find in these areas:
-You can reuse your old cards if they support CUDA, which keeps this from being an extra expense.
-PhysX in many games can use an excess of 256MB of overhead. A cheap 512MB-1GB card diverts that much memory away from your primary GPU or CPU.
-Older cards may be fairly weak in fluid physics or ultra-high-level physics, but nearly no games utilize all of that, making low-end cards perfect for the above advantages.

For instance, I play Unreal Tournament 3 at max settings at 2048x1152 while rarely dipping below my monitor's 60Hz refresh.

My dad gave me an old 8400 he found at a yard sale for 10 bucks, and I'm now able to add 4xAA with little or no performance hit, because the PhysX overhead isn't taking from my 9800GTX's frame buffer, leaving it available for higher resolution or anti-aliasing.
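The memory argument above is just arithmetic. A hypothetical sketch, using the 512MB frame buffer and the ~256MB PhysX overhead figure from the post (illustrative numbers, not measurements):

```python
# Hypothetical VRAM budget: offloading PhysX to a cheap second card.
primary_vram_mb = 512     # 9800GTX+ frame buffer (per the post)
physx_overhead_mb = 256   # overhead the post attributes to PhysX

# PhysX running on the main card eats into the rendering budget;
# a dedicated card keeps the whole frame buffer for rendering.
without_dedicated = primary_vram_mb - physx_overhead_mb
with_dedicated = primary_vram_mb

print(f"Rendering budget, PhysX on main card:      {without_dedicated} MB")
print(f"Rendering budget, PhysX on dedicated card: {with_dedicated} MB")
```

Doubling the effective rendering budget is exactly the headroom that makes extra AA or resolution possible on the same primary card.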


As for your earlier comments: yes, it certainly seems the graphics industry has caught up with, or surpassed, the needs of the artists for the present. We needed new game-changing tech like pixel shaders, deferred rendering, dynamic lighting with multiple shadows, and (more recently) applicable tessellation. Those seemed like major game changers, features that required some decent upgrades. Now we're seeing more skillful use of the same technologies, so a new card with a new architecture isn't really required.

Time to wait it out for Unreal 4 and CryEngine 4 and see if those guys can whip out something worth upgrading for.
January 16, 2012 10:09:18 AM

DelroyMonjo said:
It's all just video card envy. Not that many games need a $400-$500 video card right now.


Unless you play at 5760x1080. At that resolution a $400-$500 GPU will struggle at ultra settings.
January 16, 2012 2:49:10 PM

What fraction of a percent of gamers play at 5760x1080 though? Some surely do (and it looks like you're one of them), but the volume at the extreme high end is just too small. Casual games and console ports, two huge segments, need low-end GPUs. In fact, as rapidly-growing as the casual gaming market is, AMD's focus on their APUs is going to look brilliant in a couple years, even if they never again challenge Intel at the high end of the CPU market. It also doesn't matter to a lot of people now, but when energy costs rise (five or ten times), people aren't going to want 200W GPUs in their PCs, or in their kids' PCs.
July 3, 2012 6:59:13 PM

While this may not necessarily affect GPUs, let's not forget those new FFT algorithms MIT has been developing. Software is getting so freaking efficient, you can practically get an upgrade just by sticking with what you've got.
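A back-of-the-envelope illustration of how much an algorithm alone can matter: even the jump from a naive DFT to the classic FFT dwarfs any single hardware generation (the MIT sparse-FFT work improves further on the FFT for sparse signals; constants are ignored here, so these are proportional counts, not exact ones):

```python
import math

# Operation counts for transforming n samples (proportional, not exact).
n = 1_000_000

naive_dft_ops = n * n           # O(n^2) direct DFT
fft_ops = n * math.log2(n)      # O(n log n) classic FFT

speedup = naive_dft_ops / fft_ops
print(f"Naive DFT: ~{naive_dft_ops:.1e} ops")
print(f"FFT:       ~{fft_ops:.1e} ops")
print(f"Algorithmic speedup: ~{speedup:,.0f}x")
```

A five-figure speedup from the same silicon is the "upgrade by sticking with what you've got" effect in a nutshell.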

I've got a Core i5 3570K coming in from Cali. But until it arrives, I'm running a Celeron that's about 50% weaker than my previous E8400, and my desktop experience has hardly changed, except in bits of FL Studio and encoding.

In the same way, as graphics algorithms become more efficient, it's becoming harder for artists to develop works that fill that kind of power. And with showcases like Crysis, we see that we're getting closer to hitting a visual end-game. That is, realism is becoming less about processing power and more about the eye of the artist and engineer. With Crysis, we saw that DX9 was already capable of incredible realism; all they really had to do was give it the right information, instead of just MORE information.

So now, I think we're going to start seeing graphics cards level off in raw framerate performance and move towards incorporating new algorithms like those we're seeing in Unreal Engine 4.