
Is the PC gaming industry at a dead end?

June 27, 2009 7:44:26 AM

Folks, here's an interesting article about the benefits of 2 GB of graphics memory.

http://www.xbitlabs.com/articles/video/display/gainward...

"There is an opinion that PC gaming industry has come to a dead end" ... was one of the comments.

Is that true? What do you think is causing the lack of progress?

Graphics power ...?

CPU power ...?

Software houses catering to the masses at the lower end of the market ...?

I'm not a fan of Nvidia cards, so this isn't an advert ... I just thought the author's comments about the future of gaming were of interest.

/crysisexempt
June 27, 2009 9:29:03 AM

I don't think so. Developers know that hardcore gamers buy PCs, not consoles, for many reasons, so there is still that niche market for them to exploit.
June 27, 2009 12:55:23 PM

Lots of different changes have taken place, with more to come.

While we used to game at 11x7 or 12x10, which was the norm not long ago (12x10 is still the No. 1 resolution on Steam), we're now seeing GPUs that can often play at the current highest resolution of 25x16.

GPU performance growth is often as much as 100% from generation to generation, and there are a lot of shrinks yet to be had, as GPUs have only just caught up with CPUs in process nodes, and I've been hearing they're prepping for 16nm and below. GPUs usually move in half-node steps as well, but a full node shrink roughly doubles transistor density, so over a full node we really do see that 100% performance gain, which leaves a lot of performance yet to be had.

There's been talk of higher resolutions, but as with everything else, the process tech has to lead the way for that to happen. We're seeing OLEDs, 120Hz instead of 60Hz, etc., and as nodes get better and smaller, it'll carry over into the screens we use.

As for developing on consoles vs. PCs, here's the problem I have with anyone pointing the finger only at PCs: since each console is pretty much proprietary in use and development, games have to be ported from one to the other anyway. And since M$ has been doing a better job from DX10 onward of making the hardware solutions on GPUs less static, that also makes porting and development easier for PC games, and thus less proprietary. It's also why the general consensus is that, say, PhysX is really only pushed by diehard Nvidia fans who don't see all the negative aspects of it being so proprietary.

As for memory usage and the amounts needed, who cares? Is it pricing our GPUs out of reach? Really? Today, when we see more bang for the buck from our GPUs than we've ever seen?

Most PC gamers complain about games to come that WILL challenge their setups, not the other way around. With the introduction of DX11 and W7 and everything they bring to the gaming environment, the available memory usage and GPGPU acceleration will become the norm, and the average Joe will start seeing the benefits of having a GPU instead of a lame IGP in his PC. That should spur GPU sales, and most likely PC gaming as well, since we'll see more graphics power in the average rig. And we won't see Intel ignoring it as in the past, since with LRB coming they too will finally have a monetary interest in both GPGPU usage and PC gaming.
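
For anyone wondering what "GPGPU acceleration" actually looks like in practice, here's a minimal sketch in CUDA (Nvidia's GPGPU toolkit; DX11's DirectCompute is the vendor-neutral take on the same idea). The kernel and the sizes are made up purely for illustration:

#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: each GPU thread scales one element of the array.
// Data-parallel work like this (physics, post-processing, AI) is
// exactly what GPGPU is meant to offload from the CPU.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                       // 1M floats, ~4 MB
    float *d;
    cudaMalloc((void **)&d, n * sizeof(float));  // allocate on the GPU
    cudaMemset(d, 0, n * sizeof(float));

    // Launch enough 256-thread blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);
    cudaDeviceSynchronize();                     // wait for the GPU to finish

    cudaFree(d);
    printf("done\n");
    return 0;
}

The point isn't this particular snippet; it's that thousands of those threads run at once, which is why a GPU chews through this kind of work while a CPU plods along a few elements at a time.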

Here's a nice link on DX11 and what it'll mean for us:
http://www.legitreviews.com/article/1001/1/

And for the naysayers doubting DX11's early impact and the availability of games for it:
http://www.slideshare.net/repii/your-game-needs-direct3...

Back to his comments on GDDR. It's well known around here that it's been used as a selling point for lesser cards, and why he mentions top cards is a mystery, unless that's his way of not pointing a finger specifically at the reviewed product. That's my take, but why add claims like 25x16 will be the max, or 1 GB of GDDR is all we'll ever need? It sounds so prophetic, like the famous RAM quote attributed to Gates.
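
Just to put rough numbers on the resolution side of it (back-of-the-envelope, assuming 32-bit color): a single 2560x1600 framebuffer is 2560 x 1600 x 4 bytes, about 16 MB, so even double buffering plus a 32-bit depth buffer only comes to roughly 48 MB. It's textures and AA sample buffers that actually eat the VRAM, which is why "how much is enough" keeps moving as games change.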

It seems the writer has taken one bad idea, sticking too much memory on a card that can't and never will use it before it's a dinosaur, and transferred it onto PC gaming in general, treating that card as the norm, as if there won't be any improvements down the road. Bad article.
June 27, 2009 1:08:29 PM

Here's a prime example of where the article fails:
http://www.xbitlabs.com/articles/video/display/gainward...
Notice the difference between the 512MB and 1GB 4850 in GRID, even at lower resolutions? Now, since those cards are somewhat limited in ability, the 4870 would show better performance at higher resolutions, BUT somehow they didn't include a 4870 512MB vs. 1GB comparison on this game, which requires a lot of texturing, which the 4xxx cards are known for, and which plays right into his whole argument about GPUs actually being able to use larger amounts of memory. Again, fail.
June 27, 2009 6:50:33 PM

Reynod said:
Folks, here's an interesting article about the benefits of 2 GB of graphics memory.

http://www.xbitlabs.com/articles/video/display/gainward...

"There is an opinion that PC gaming industry has come to a dead end" ... was one of the comments.

Is that true? What do you think is causing the lack of progress?

Graphics power ...?

CPU power ...?

Software houses catering to the masses at the lower end of the market ...?

I'm not a fan of Nvidia cards, so this isn't an advert ... I just thought the author's comments about the future of gaming were of interest.

/crysisexempt



Nvidia's drivers have always been better than ATI's, so for that reason alone, Nvidia is the one for me.


But the thing is, video cards are so good now that they're approaching film-like quality.
It didn't help that DX10 wasn't that great a step up from DX9, since DX9 still had a lot left to give.
Class DX10 as the graphics version of MMX, if you like.

What should help is DX11 when it finally arrives.

PC games are at a turning point, and physics in PC games will be much more accurate than on consoles, which raises the bar higher.
Film-like graphics will arrive when developers have scans of actual faces and bodies to work with.

This will take a lot more processing power than we have, and it's the next bar to overcome, but at the moment they're up there with Pixar.

The trouble is it still takes a long time to produce a Pixar film, about 3 years minimum, and games take at least as long to perfect.

Consoles are a bit easier, as most games run at 480p and don't need a higher resolution to work with.
Only a few are 720p, whereas on PCs we've had HD for ages.

Consoles are in a bit of a stalemate too; the PS3 is good for at least another 3 years. It's just hard to program for.

We need innovative gameplay first. How many action CGI films have been absolute pap (the later Alien films, or Alien vs. Predator, for example)? Story first please, guys, then pretty pictures afterwards.

That's where the PC is at: it's been too busy giving us eye candy while the sweet turns sour, and as the recession drags on, so will that continue.

The lack of progress is down to one thing, Reynod: money.

It costs far too much now to produce the kind of game that makes having a PC worthwhile. Consoles are a quick fix and give a better return, as most games are developed cross-platform now; unfortunately the PC has to behave like a console because of this, and games just don't feel right on the PC platform. I mean, you can even plug in an Xbox 360 controller. Ugh!

The only ones that do feel right are written with the PC in mind - not ports built in an engine available on all platforms, compiled, and then made to work on a PC.

A shame, but true.

The PC is the best gaming platform ever - fact - if only games manufacturers would let it be.

Multiple configurations don't help either. Most PCs are configured differently, so catering for different setups is also hard to program for.

Roll on DX11, with its built-in physics support and other wonderful enhancements we can actually tell the difference with.


June 27, 2009 8:56:30 PM



I don't believe AMD deserves ATI. ATI rocks; shame about the company running it.

Hence, to me ATI will always be ATI, and it should be kept that way :)
June 27, 2009 9:38:52 PM

Gotta give 'em some credit though: they may have overpaid for ATI, but they were smart enough to buy them.
June 27, 2009 9:44:18 PM

JAYDEEJOHN said:
Gotta give 'em some credit though: they may have overpaid for ATI, but they were smart enough to buy them.



Shame they weren't smart enough to actually use ATI for nearly a year.
June 27, 2009 9:50:51 PM

True. The marketing ideas that came from ATI, plus a few other things regarding products and direction, weren't used at first, but now that they are, we're seeing an upsurge in products, ideas, and just plain ol' something the folks will buy!