
The future of graphics cards? Will there be GPUs a few years from now?

March 4, 2011 3:14:17 AM

So I often think about upgrading my graphics card to the 6970 2GB (about $350 as of this writing), but I'm not sure there is any point. Until something big changes, I don't see the need, as I can max out most games with my meager HD 5770.


But I was watching this demo from Intel showing off "KeyShot", a program that renders photo-realistic images.


http://www.youtube.com/watch?v=zbokPe4_-mY


http://www.youtube.com/watch?v=ianMNs12ITc



So in a few years will we still need GPUs? If Microsoft and Sony release new consoles in 2015, will everything happen on the CPU? And how long will it take for this type of photo-realism to show up in gaming? Right now the chip Intel is showing off can barely handle one photo-realistic frame, and I believe it's a rather expensive server chip.

Using Moore's law, how long will it take before consumers have this kind of power? And how long before this can be rendered in games (assuming Moore's law continues to hold)?



Also, has anyone seen these screenshots from GDC showing off Unreal Engine 3?

http://www.joystiq.com/screenshots/unreal-engine-3-gdc-...

March 4, 2011 3:23:38 AM

Moore's law doesn't relate to computing power; it says we double the number of transistors on a chip of a certain size about every 18 to 24 months (it was an observation when he made it; people started calling it a law later). We have on the order of a million times more transistors on chips these days than the old CPUs did, but they are far from a million times faster.
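As a rough back-of-the-envelope sketch of what that looks like in numbers (the transistor counts are approximate and the two chips are just convenient endpoints, nothing here is a benchmark):

import math

# Rough check of the transistor-doubling claim, not a performance measurement.
# Counts are approximate: the Intel 4004 (1971) had about 2,300 transistors;
# a 2011 GPU die like Cayman (HD 6970) has roughly 2.64 billion.
old_transistors = 2_300
new_transistors = 2_640_000_000
years = 2011 - 1971

growth = new_transistors / old_transistors      # roughly 1.1 million times
doublings = math.log2(growth)                   # about 20 doublings
months_per_doubling = years * 12 / doublings    # cadence implied by those numbers

print(f"growth: ~{growth:,.0f}x over {years} years")
print(f"~{doublings:.1f} doublings, one every ~{months_per_doubling:.0f} months")
# Roughly a million-fold increase in transistors at about one doubling every
# two years, while single-threaded speed has grown far less than that.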

I'm pretty sure GPUs will always be around, and so will CPUs; they are two very different concepts. A CPU takes one task, does lots of different things to it and then finishes with it, while a GPU takes hundreds of small, similar tasks and does a few similar things to each of them at the same time. CPUs suck at massively parallel tasks in the same way that GPUs are no good at massively serial tasks.
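A toy illustration of that difference, using NumPy's array operations as a stand-in for the "same small operation over a huge batch" style a GPU is built for (this is only an analogy; NumPy here still runs on the CPU, but the shape of the work is the point):

import numpy as np

# "CPU-style" work: a serial chain where each step depends on the previous
# result, so it cannot be spread across many workers.
def serial_task(n):
    x = 0.0
    for i in range(n):
        x = 0.5 * x + i      # every iteration needs the one before it
    return x

# "GPU-style" work: the same small operation applied to a big batch of
# independent values, which maps naturally onto thousands of tiny cores.
def parallel_task(values):
    return 0.5 * values + np.arange(len(values))   # one op over the whole array

print(serial_task(1_000_000))
print(parallel_task(np.zeros(1_000_000))[:5])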

It probably took them several weeks to render that single frame, which is totally useless in a gaming environment. The biggest challenge before we can get photorealistic images rendered rapidly is not to increase computing power, it's to improve ray tracing algorithms; they are extremely heavy compared to rasterizing, which is why they aren't often used. Even CGI movies only use ray tracing for certain portions of certain scenes.
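To give a feel for why ray tracing is so heavy, here is a stripped-down sketch of just the primary-ray loop: every pixel fires a ray that has to be tested against the scene geometry, and a real tracer adds bounces, shadow rays and many samples per pixel on top of this. The two spheres, resolution and camera setup below are made up purely for illustration.

import math

# Minimal ray-sphere test: the core operation a ray tracer repeats
# millions of times per frame (pixels x objects x bounces x samples).
def hit_sphere(origin, direction, center, radius):
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * a * c >= 0   # True if the ray meets the sphere

width, height = 320, 240
spheres = [((0.0, 0.0, -3.0), 1.0),     # made-up scene: two spheres
           ((1.5, 0.5, -4.0), 0.5)]

hits = 0
for y in range(height):
    for x in range(width):
        # one primary ray per pixel, aimed through an imaginary image plane
        direction = (x / width - 0.5, y / height - 0.5, -1.0)
        for center, radius in spheres:
            if hit_sphere((0.0, 0.0, 0.0), direction, center, radius):
                hits += 1
                break

# Even this toy does width * height * len(spheres) intersection tests with
# no bounces, shadows or anti-aliasing, which is the cost that makes
# real-time photorealistic ray tracing so hard.
print(f"{hits} of {width * height} primary rays hit something")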
March 4, 2011 3:26:17 AM

hunter315 said:
Moore's law doesn't relate to computing power; it says we double the number of transistors on a chip of a certain size about every 18 to 24 months (it was an observation when he made it; people started calling it a law later). We have on the order of a million times more transistors on chips these days than the old CPUs did, but they are far from a million times faster.

I'm pretty sure GPUs will always be around, and so will CPUs; they are two very different concepts. A CPU takes one task, does lots of different things to it and then finishes with it, while a GPU takes hundreds of small, similar tasks and does a few similar things to each of them at the same time. CPUs suck at massively parallel tasks in the same way that GPUs are no good at massively serial tasks.

It probably took them several weeks to render that single frame, which is totally useless in a gaming environment. The biggest challenge before we can get photorealistic images rendered rapidly is not to increase computing power, it's to improve ray tracing algorithms; they are extremely heavy compared to rasterizing, which is why they aren't often used. Even CGI movies only use ray tracing for certain portions of certain scenes.


my brain just melted....
March 4, 2011 3:28:02 AM

We will continue to have video cards for quite some time (unless Jack gets his way and video games get outlawed). Even if the GPUs found on current CPUs get faster, there are many games and resolutions that are beyond their reach. You will also continue to have people who want to buy cards so they can upgrade their tech as it comes out.
March 4, 2011 3:32:41 AM

alhanelem said:
my brain just melted....


haha sorry about that, sometimes my computer architecture and microcomputer systems classes rear their ugly heads and melt minds
March 4, 2011 3:33:57 AM

hunter315 said:
Moore's law doesn't relate to computing power; it says we double the number of transistors on a chip of a certain size about every 18 to 24 months (it was an observation when he made it; people started calling it a law later). We have on the order of a million times more transistors on chips these days than the old CPUs did, but they are far from a million times faster.

I'm pretty sure GPUs will always be around, and so will CPUs; they are two very different concepts. A CPU takes one task, does lots of different things to it and then finishes with it, while a GPU takes hundreds of small, similar tasks and does a few similar things to each of them at the same time. CPUs suck at massively parallel tasks in the same way that GPUs are no good at massively serial tasks.

It probably took them several weeks to render that single frame, which is totally useless in a gaming environment. The biggest challenge before we can get photorealistic images rendered rapidly is not to increase computing power, it's to improve ray tracing algorithms; they are extremely heavy compared to rasterizing, which is why they aren't often used. Even CGI movies only use ray tracing for certain portions of certain scenes.



Thanks for the reply. Do you work in the industry, or do you know someone who does? I figured there were some people who do 3D work on these forums, and the insight is great.

Let's say, for example, that everyone had an IBM Watson supercomputer. Would that be powerful enough to produce photo-realistic gaming?

And based on extrapolation, how long would it be before the average person has something as powerful as Watson?
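As a very rough extrapolation, and only if you treat performance as doubling on a Moore's-law-like schedule (which, as noted above, is a shaky assumption), you can sketch it like this; every figure here is a ballpark assumption, not a measurement:

import math

# Ballpark extrapolation: how long until a desktop matches Watson, if
# performance doubled every couple of years? All figures are rough guesses.
watson_tflops = 80.0         # IBM Watson, reported peak (approximate)
desktop_tflops = 0.1         # a 2011 quad-core desktop CPU, very roughly
doubling_period_years = 2.0  # assumed performance-doubling cadence

gap = watson_tflops / desktop_tflops
doublings_needed = math.log2(gap)
years_needed = doublings_needed * doubling_period_years

print(f"gap: ~{gap:.0f}x, i.e. ~{doublings_needed:.1f} doublings")
print(f"at one doubling every {doubling_period_years} years: ~{years_needed:.0f} years")
# Roughly 800x, so about 10 doublings, so on the order of 20 years,
# give or take a lot, since the doubling assumption itself is shaky.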
March 4, 2011 3:34:48 AM

Stereoscopic 3D and multi-display setups are the norm today, and they are pretty calculation-demanding. Considering the move to augmented reality over the next 20 years, I'd bet GPUs are here to stay.
March 4, 2011 3:37:58 AM

wh3resmycar said:
Stereoscopic 3D and multi-display setups are the norm today, and they are pretty calculation-demanding. Considering the move to augmented reality over the next 20 years, I'd bet GPUs are here to stay.


Maybe the future of gaming is augmented reality. Instead of fighting Scorpion on your screen, you just go to your local forest preserve and he shows up :lol: 


"get over here!"

But on a more serious note, it would really change gaming if our own real environments were used. Imagine a basketball game where you actually shoot and it keeps track of your shots: score as many buckets as you can in two minutes and then get posted on a leaderboard. This is possible with augmented reality.
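A tiny sketch of just the scoring and leaderboard side of that idea; the shot detection is the hard AR part and is faked here with a made-up list of events, so every name and number in this snippet is purely illustrative:

# Toy scoring/leaderboard logic for the AR basketball idea.
# Shot detection would come from the AR system; here it is a fake event list.
TIME_LIMIT = 120.0  # two-minute session, in seconds

def run_session(player, shot_events, leaderboard):
    """shot_events: (timestamp_seconds, made) pairs reported by the AR tracker."""
    buckets = sum(1 for t, made in shot_events if made and t <= TIME_LIMIT)
    leaderboard.append((buckets, player))
    leaderboard.sort(reverse=True)          # best scores first
    return buckets

leaderboard = []
run_session("john", [(5.0, True), (31.2, False), (64.9, True), (130.0, True)], leaderboard)
run_session("mac", [(12.4, True), (47.0, True), (88.3, True)], leaderboard)

for rank, (score, player) in enumerate(leaderboard, start=1):
    print(f"{rank}. {player}: {score} buckets")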
March 4, 2011 5:36:31 AM

Quote:
Imagine a basketball game where you actually shoot and it keeps track of your shots: score as many buckets as you can in two minutes and then get posted on a leaderboard. This is possible with augmented reality.


So gaming of the future is Chuck E Cheese?
March 4, 2011 6:02:30 AM

4745454b said:
Quote:
Imagine a basketball game where you actually shoot and it keeps track of your shots: score as many buckets as you can in two minutes and then get posted on a leaderboard. This is possible with augmented reality.


So gaming of the future is Chuck E Cheese?



It could also be a training tool; for example, through augmented reality it could trace the path to the basket that the ball is supposed to follow.

There are a bunch of other things you could do. In real-life games you could "set the ball on fire" when someone makes a lot of shots in a row.

After using augmented reality it will be hard to go back.
March 4, 2011 7:58:27 AM

That's an interesting twist on the topic there, John. Most of what you are suggesting is of course pretty much possible today; Microsoft has the Kinect, which could do the basketball thing now.
It doesn't take a massive leap to get to where you can use a Kinect-like setup to produce something realistic. Maybe not lifelike just yet (I see that as many, many years away), but certainly at an advanced PC game level graphically.

They already use a helmet setup for certain military training exercises. Graphically it looks like an Atari, but it's not really realism they are looking for; they can put someone in a situation where there is, say, a helicopter gunship attacking in an urban environment. It was shown on a TV program called The Gadget Show.

As far as GPUs staying around, I can't see midrange to enthusiast-level cards going anywhere soon, but I certainly don't see entry-level cards being here forever. As the Fusion CPUs get better there won't be the need for them.

Even that will take years though. You can still buy AGP discrete cards; the 4670 was the last card I have seen made for AGP, and I don't think they have made any 5-series cards for AGP.
So that's quite a few years they have bothered to produce what is realistically a card for a legacy interface. I figure it will take a similar amount of time for entry-level cards to die out, as there will still be people who just want to upgrade existing older units without upgrading the whole motherboard and CPU.
Like you said, it's not as if games demand much to play them these days.
I read an article where a magazine was interviewing some of the legends of the gaming industry, and the overall agreement was that interface and input are where the changes will be for the next few years, rather than graphical improvements.

Mactronix :) 
March 4, 2011 8:20:52 AM

It's odd if you think about it, but Sandy Bridge will be the death of low-end cards. Remember that Intel sells more "video cards" than AMD and Nvidia combined; they sell more IGP motherboards than AMD and Nvidia sell boards plus cards. Amazing if you think about it. The problem is that those IGPs are so low-end that they can't do much of anything gaming-wise. I had a neighbor about two years ago buy a new computer at BB. I made sure he bought one with a PCIe slot in case his kids wanted to game; because of this he bought one for ~$700. I don't know what IGP it had, but it couldn't play Halo. The first one. They had to turn every setting down just to get it close. Now that Intel has come out with a halfway decent IGP, I don't see the need for anyone to buy a low-end card except to upgrade those computers that are too old to take a SB.
March 4, 2011 8:44:07 AM

Well, to be fair, I don't really think it's at the stage where, if you want to game at all, you don't need a GPU. Sandy Bridge is probably the beginning, but it's not good enough on its own to mean you don't need a GPU.
I would think it's going to be an AMD chip that finally reaches a decent gaming standard for the entry-level gamer, for simple older games.
The GPU in Sandy Bridge is roughly equal to a 4550, hardly gaming material. Yes, I know it's better than what we have had and is already capable of playing some of these older, easier games at low resolutions, but it's not quite good enough just yet IMHO.

Mactronix :) 

March 4, 2011 8:56:07 AM

And as the owner of gaming cards for many years now, I would agree. SB is, however, the first GPU/IGP from Intel that is truly capable of playing games. And I'm sure that as they release newer "CPUs" they will only get faster and faster. This is the death of the low-end GPU.

It's also interesting that this puts a big hurt on Nvidia. Their chipset business is all but gone now, and IGPs and low-end cards will soon follow. They don't (currently) have a CPU to speak of, just their GPUs. Is it any wonder they are pushing CUDA and talking about ARM?
March 4, 2011 10:11:47 AM

There is also a concern I have that this trend will lead to the cost of CPUs increasing dramatically.
I can see AMD chips going close to matching the cost of Intel chips.
Producing mainstream and enthusiast cards will also get more expensive, as they won't have the ability to disable some shaders and call it a lower-end product anymore. This applies to the lower-end cards, of course.

Looking forward, I can see a scenario where they only produce the one chip in a couple of different flavors, say like the 6970/6950, disabling more shaders to make lower-end products.
You wouldn't need Barts at all, just lop off some extra shaders.
Well, maybe that's a bit too high up the range to go just yet, but I wouldn't have thought Cedar- or maybe Juniper-class cards will be viable for many more years.
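A rough sketch of that binning idea, just to make it concrete: the 24 and 22 SIMD counts are the real Cayman figures for the 6970/6950, while the extra lower tier and the example yields are pure invention.

# Toy model of salvaging one big die into several products by disabling
# faulty shader blocks. 24/22 SIMDs match the real 6970/6950; the
# hypothetical 20-SIMD entry tier below that is made up.
def bin_die(working_simds):
    if working_simds >= 24:
        return "HD 6970 (all 24 SIMDs enabled)"
    if working_simds >= 22:
        return "HD 6950 (22 SIMDs enabled, rest fused off)"
    if working_simds >= 20:
        return "hypothetical entry card (20 SIMDs enabled)"
    return "scrap / not sellable"

# Pretend yields from one wafer: how many SIMDs came out working per die.
for working in (24, 23, 22, 21, 19):
    print(f"{working} working SIMDs -> {bin_die(working)}")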

Mactronix :) 
March 4, 2011 12:15:13 PM

4745454b said:
It's odd if you think about it, but Sandy Bridge will be the death of low-end cards. Remember that Intel sells more "video cards" than AMD and Nvidia combined; they sell more IGP motherboards than AMD and Nvidia sell boards plus cards. Amazing if you think about it.

Nothing amazing about it. Sure, Intel sells more graphics chips, but how many of those graphics chips are actually being used to game is a different thing. A big chunk of those chips are used in an office environment.

It's not like your average accountant or banker actually needs a GPU-accelerated PowerPoint presentation to present his graphs.


March 4, 2011 2:26:44 PM

mactronix said:
That's an interesting twist on the topic there, John. Most of what you are suggesting is of course pretty much possible today; Microsoft has the Kinect, which could do the basketball thing now.
It doesn't take a massive leap to get to where you can use a Kinect-like setup to produce something realistic. Maybe not lifelike just yet (I see that as many, many years away), but certainly at an advanced PC game level graphically.

They already use a helmet setup for certain military training exercises. Graphically it looks like an Atari, but it's not really realism they are looking for; they can put someone in a situation where there is, say, a helicopter gunship attacking in an urban environment. It was shown on a TV program called The Gadget Show.

As far as GPUs staying around, I can't see midrange to enthusiast-level cards going anywhere soon, but I certainly don't see entry-level cards being here forever. As the Fusion CPUs get better there won't be the need for them.

Even that will take years though. You can still buy AGP discrete cards; the 4670 was the last card I have seen made for AGP, and I don't think they have made any 5-series cards for AGP.
So that's quite a few years they have bothered to produce what is realistically a card for a legacy interface. I figure it will take a similar amount of time for entry-level cards to die out, as there will still be people who just want to upgrade existing older units without upgrading the whole motherboard and CPU.
Like you said, it's not as if games demand much to play them these days.
I read an article where a magazine was interviewing some of the legends of the gaming industry, and the overall agreement was that interface and input are where the changes will be for the next few years, rather than graphical improvements.

Mactronix :) 



Interesting points there, and yes, games are not that demanding today; if you get a very high-end card it might just be to brag about your frame rate to other nerds =)

But based on that, I think we can imagine a big "refresh cycle" in graphics cards coming when the new consoles from Microsoft and Sony hit the market. Based on what I have now...

AMD 955 BE
8 GB of Corsair XMS3 DDR3
HD5770 1GB

you can see that it will be severely outdated in terms of power by the time new consoles come out. I expect new consoles by 2015, so they'll announce them sometime in the fall of 2014.
March 4, 2011 2:34:55 PM

wh3resmycar said:
Nothing amazing about it. Sure, Intel sells more graphics chips, but how many of those graphics chips are actually being used to game is a different thing. A big chunk of those chips are used in an office environment.

It's not like your average accountant or banker actually needs a GPU-accelerated PowerPoint presentation to present his graphs.



Which is a great thing, because hopefully with more processing power everywhere we could get a significant GUI change to the Windows platform.

Based on the processing power most people on this forum have, Windows Aero is just silly; they could make much prettier UIs... Did anyone see how fast and snappy the iPad 2 is? You can throw pictures around and manipulate them so quickly, and that's what "wows" consumers.

If we are to have significant changes, it means that chip makers like Intel/AMD will have to set some kind of standard. If you think about it, not much has changed in the GUI since the early Windows days... background, icons, windows... oh wait, now the min/max/close buttons light up when you hover over them :lol: 
March 4, 2011 2:42:05 PM

mactronix said:
There is also a concern I have that this trend will lead to the cost of CPUs increasing dramatically.
I can see AMD chips going close to matching the cost of Intel chips.
Producing mainstream and enthusiast cards will also get more expensive, as they won't have the ability to disable some shaders and call it a lower-end product anymore. This applies to the lower-end cards, of course.

Looking forward, I can see a scenario where they only produce the one chip in a couple of different flavors, say like the 6970/6950, disabling more shaders to make lower-end products.
You wouldn't need Barts at all, just lop off some extra shaders.
Well, maybe that's a bit too high up the range to go just yet, but I wouldn't have thought Cedar- or maybe Juniper-class cards will be viable for many more years.

Mactronix :) 


That's true, but you can't sell anything for more than someone is willing to pay for it so it should work itself out ;) 

Also, the amount of money people spend on their computers is very little when you take into account how much time they spend in front of them and how much they rely on them.

What makes high-end CPUs a hard sell for the average consumer is that they don't need all that processing power; that's why I think significant GUI changes need to be made to OSes to take advantage of more processing power from the CPU/GPU. In addition, low-level things like email and pictures are moving to the smartphone. There will probably be a time soon when you just take pics on your phone and, through WiDi, show them on your TV or computer monitor. With these low-level things being taken care of (think Motorola Atrix 4G, which lets you connect the phone to a "laptop" display), I see a shift away from the PC; the big companies either anticipate this and make changes to become players in those other markets, or they risk being left behind.
March 4, 2011 3:18:03 PM

Can't see a smartphone ever replacing the desktop, just like laptops and netbooks haven't. There is too much a desktop can do that a laptop (to a lesser extent), and certainly a netbook or phone, could never do.
I don't have a smartphone and won't ever have one. My phone does calls, texts, the web and photos if I want them. That covers what I want from a phone, and I don't have to pay up to £45 a month for the privilege.

Mactronix :) 
March 4, 2011 3:29:05 PM

I can see that once we reach the point of quantum tunneling with die shrinks, and if we find a new, more efficient medium (perhaps carbon nanotubes... but who knows if it'll be that), optical processing or some other future medium might make a GPU and CPU on the same chip (or whatever the unit is called by then) possible... but we are not there yet, and this will be many, many years in the future. And they'd have to have either amazing batteries or be extremely power-efficient to replace desktops... I'd say you're safe building desktops with discrete graphics for the foreseeable future, though (i.e., at least the next 5 years, probably closer to 15+).
March 4, 2011 10:25:57 PM

g00fysmiley said:
I can see that once we reach the point of quantum tunneling with die shrinks, and if we find a new, more efficient medium (perhaps carbon nanotubes... but who knows if it'll be that), optical processing or some other future medium might make a GPU and CPU on the same chip (or whatever the unit is called by then) possible... but we are not there yet, and this will be many, many years in the future. And they'd have to have either amazing batteries or be extremely power-efficient to replace desktops... I'd say you're safe building desktops with discrete graphics for the foreseeable future, though (i.e., at least the next 5 years, probably closer to 15+).


I don't know about 15 years, though; I think some big advancement in computing will come along by then.
March 5, 2011 5:40:19 AM

Quote:
Which is a great thing, because hopefully with more processing power everywhere we could get a significant GUI change to the Windows platform.


Some people don't give a flying F about how funky an OS's GUI looks. I want my OS to be functional and that's it. The same can't be said for video games, though.
March 5, 2011 5:54:10 AM

g00fysmiley said:
I can see that once we reach the point of quantum tunneling with die shrinks, and if we find a new, more efficient medium (perhaps carbon nanotubes... but who knows if it'll be that), optical processing or some other future medium might make a GPU and CPU on the same chip (or whatever the unit is called by then) possible... but we are not there yet, and this will be many, many years in the future. And they'd have to have either amazing batteries or be extremely power-efficient to replace desktops... I'd say you're safe building desktops with discrete graphics for the foreseeable future, though (i.e., at least the next 5 years, probably closer to 15+).



Sorry, I just can't see it. Seriously, a phone replacing a desktop?? Really?
No, what I think is more probable is television sets with all the processing power you need to download and play games, real games, right there on the TV. They could easily have a DVD or even Blu-ray burner built in. Now that I can see in 15 years' time. A phone that makes me think I don't need a PC? Not in my lifetime.

Oh, by the way, quantum tunneling is already an issue with some of the processes being tested ;) 

Mactronix :) 
March 5, 2011 7:46:41 AM

I think that in 2015 the HD 5970 4GB GDDR5 would be a mid-range card (right now it's the fastest) :) 
March 5, 2011 8:14:49 AM

Think about what the fastest card back in 2006 was. 8800GT? 7950GT? X1950XT? Something like that? I guess you could call some of those mid-range now. I think most of us wouldn't touch a five-year-old card.
March 5, 2011 10:04:42 AM

4745454b said:
Think about what the fastest card back in 2006 was. 8800GT? 7950GT? X1950XT? Something like that? I guess you could call some of those mid-range now. I think most of us wouldn't touch a five-year-old card.


Yeah, I remember the 8800GT. I wanted it so badly so I could play GTA. I'm glad I skipped all of that; now I have a 5770 with a fast enough processor that I can play anything.


But soon Sandy Bridge and Bulldozer will be out; if the economy stays OK and we don't all kill each other in a nuclear war, we have a lot to look forward to in terms of graphics.